NASA Technical Reports Server (NTRS)
Stokes, R. L.
1979-01-01
Tests performed to determine accuracy and efficiency of bus separators used in microprocessors are presented. Functional, AC parametric, and DC parametric tests were performed in a Tektronix S-3260 automated test system. All the devices passed the functional tests and yielded nominal values in the parametric test.
A unified framework for weighted parametric multiple test procedures.
Xi, Dong; Glimm, Ekkehard; Maurer, Willi; Bretz, Frank
2017-09-01
We describe a general framework for weighted parametric multiple test procedures based on the closure principle. We utilize general weighting strategies that can reflect complex study objectives and include many procedures in the literature as special cases. The proposed weighted parametric tests bridge the gap between rejection rules using either adjusted significance levels or adjusted p-values. This connection is made by allowing intersection hypotheses of the underlying closed test procedure to be tested at level smaller than α. This may be also necessary to take certain study situations into account. For such cases we introduce a subclass of exact α-level parametric tests that satisfy the consonance property. When the correlation is known only for certain subsets of the test statistics, a new procedure is proposed to fully utilize this knowledge within each subset. We illustrate the proposed weighted parametric tests using a clinical trial example and conduct a simulation study to investigate its operating characteristics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
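As an illustrative sketch only (the framework above is more general, e.g. it can exploit known correlations among test statistics), a closed test procedure that applies a weighted Bonferroni test to each intersection hypothesis can be coded as follows; the function name and interface are hypothetical:

```python
# Closed testing with weighted Bonferroni intersection tests (illustrative
# sketch; not the authors' full framework, which also handles correlations).
from itertools import combinations

def closed_weighted_bonferroni(pvals, weights, alpha=0.05):
    """Reject elementary hypothesis H_i iff every intersection hypothesis
    containing i is rejected by a weighted Bonferroni test at level alpha."""
    m = len(pvals)
    rejected = [True] * m
    for r in range(1, m + 1):
        for J in combinations(range(m), r):   # all intersection hypotheses
            w_sum = sum(weights[j] for j in J)
            # weighted Bonferroni: reject if any p_j <= alpha * w_j / w_sum
            if not any(pvals[j] <= alpha * weights[j] / w_sum for j in J):
                for j in J:                   # intersection not rejected:
                    rejected[j] = False       # no H_i in J can be rejected
    return rejected
```

With equal weights this reduces to a closed Bonferroni (Holm-type) procedure; unequal weights let the weighting strategy reflect differing hypothesis importance.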
A Note on the Assumption of Identical Distributions for Nonparametric Tests of Location
ERIC Educational Resources Information Center
Nordstokke, David W.; Colp, S. Mitchell
2018-01-01
Often, when testing for shift in location, researchers will utilize nonparametric statistical tests in place of their parametric counterparts when there is evidence or belief that the assumptions of the parametric test are not met (i.e., normally distributed dependent variables). An underlying and often unattended to assumption of nonparametric…
Parametric tests of a traction drive retrofitted to an automotive gas turbine
NASA Technical Reports Server (NTRS)
Rohn, D. A.; Lowenthal, S. H.; Anderson, N. E.
1980-01-01
The results of a test program to retrofit a high performance fixed ratio Nasvytis Multiroller Traction Drive in place of a helical gear set to a gas turbine engine are presented. Parametric tests up to a maximum engine power turbine speed of 45,500 rpm and to a power level of 11 kW were conducted. Comparisons were made to similar drives that were parametrically tested on a back-to-back test stand. The drive showed good compatibility with the gas turbine engine. Specific fuel consumption of the engine with the traction drive speed reducer installed was comparable to the original helical gearset equipped engine.
Electrical Characterization of the RCA CDP1822SD Random Access Memory, Volume 1, Appendix A
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
Electrical characterization tests were performed on 35 RCA CDP1822SD, 256-by-4-bit, CMOS, random access memories. The tests included three functional tests, AC and DC parametric tests, a series of schmoo plots, rise/fall time screening, and a data retention test. All tests were performed on an automated IC test system with temperatures controlled by a thermal airstream unit. All the functional tests, the data retention test, and the AC and DC parametric tests were performed at ambient temperatures of 25 C, -20 C, -55 C, 85 C, and 125 C. The schmoo plots were generated at ambient temperatures of 25 C, -55 C, and 125 C. The data retention test was performed at 25 C. Five devices failed one or more functional tests, and four of these devices failed to meet the expected limits of a number of AC parametric tests. Some of the schmoo plots indicated a small degree of interaction between parameters.
ERIC Educational Resources Information Center
Osler, James Edward
2014-01-01
This monograph provides an epistemological rationale for the design of a novel post hoc statistical measure called "Tri-Center Analysis". This new statistic is designed to analyze the post hoc outcomes of the Tri-Squared Test. In Tri-Center Analysis, trichotomous inferential parametric statistical measures are calculated from…
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method is also limited by small sample sizes. We used a pooled method in the nonparametric bootstrap test that may overcome the problem related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method, corresponding to parametric, nonparametric, and permutation tests, through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
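A minimal sketch of the pooled-resampling idea for two unpaired samples (an illustration under stated assumptions, not the authors' exact algorithm; `pooled_bootstrap_t_test` is a hypothetical name):

```python
# Two-sample bootstrap test with pooled resampling: pooling the samples
# enforces the null hypothesis of equal means before resampling.
import numpy as np
from scipy import stats

def pooled_bootstrap_t_test(x, y, n_boot=5000, seed=0):
    rng = np.random.default_rng(seed)
    t_obs = stats.ttest_ind(x, y, equal_var=False).statistic
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_boot):
        bx = rng.choice(pooled, size=len(x), replace=True)
        by = rng.choice(pooled, size=len(y), replace=True)
        t_b = stats.ttest_ind(bx, by, equal_var=False).statistic
        if abs(t_b) >= abs(t_obs):
            count += 1
    return (count + 1) / (n_boot + 1)   # add-one correction keeps p valid
```

The Welch statistic is recomputed on every resample, so no variance-homogeneity assumption is imposed.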
Pretest uncertainty analysis for chemical rocket engine tests
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.
1987-01-01
A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.
Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 5, Appendix D
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
The electrical characterization and qualification test results are presented for the RCA MWS 5001D random access memory. The tests included functional tests, AC and DC parametric tests, AC parametric worst-case pattern selection test, determination of worst-case transition for setup and hold times, and a series of schmoo plots. Average input high current, worst case input high current, output low current, and data setup time are some of the results presented.
Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 4, Appendix C
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
The electrical characterization and qualification test results are presented for the RCA MWS5001D random access memory. The tests included functional tests, AC and DC parametric tests, AC parametric worst-case pattern selection test, determination of worst-case transition for setup and hold times, and a series of schmoo plots. Statistical analysis data is supplied along with write pulse width, read cycle time, write cycle time, and chip enable time data.
Electrical Evaluation of RCA MWS5501D Random Access Memory, Volume 2, Appendix A
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
The electrical characterization and qualification test results are presented for the RCA MWS5001D random access memory. The tests included functional tests, AC and DC parametric tests, AC parametric worst-case pattern selection test, determination of worst-case transition for setup and hold times, and a series of schmoo plots. The address access time, address readout time, the data hold time, and the data setup time are some of the results surveyed.
Parametric analysis of ATM solar array.
NASA Technical Reports Server (NTRS)
Singh, B. K.; Adkisson, W. B.
1973-01-01
The paper discusses the methods used for the calculation of ATM solar array performance characteristics and provides the parametric analysis of solar panels used in SKYLAB. To predict the solar array performance under conditions other than test conditions, a mathematical model has been developed. Four computer programs have been used to convert the solar simulator test data to the parametric curves. The first performs module summations, the second determines average solar cell characteristics which will cause a mathematical model to generate a curve matching the test data, the third is a polynomial fit program which determines the polynomial equations for the solar cell characteristics versus temperature, and the fourth program uses the polynomial coefficients generated by the polynomial curve fit program to generate the parametric data.
SEC sensor parametric test and evaluation system
NASA Technical Reports Server (NTRS)
1978-01-01
This system provides the necessary automated hardware required to carry out, in conjunction with the existing 70 mm SEC television camera, the sensor evaluation tests which are described in detail. The Parametric Test Set (PTS) was completed and is used in a semiautomatic data acquisition and control mode to test the development of the 70 mm SEC sensor, WX 32193. Data analysis of raw data is performed on the Princeton IBM 360-91 computer.
Design, construction, operation, and evaluation of a prototype culm combustion boiler/heater unit
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Aciermo, J.; Richards, H.; Spindler, F.
1983-10-01
A process for utilizing anthracite culm in a fluidized bed combustion system was demonstrated by the design and construction of a prototype steam plant at Shamokin, PA, and operation of the plant for parametric tests and a nine-month extended durability test. The parametric tests evaluated turndown capability of the plant and established turndown techniques to be used to achieve best performance. Throughout the test program the fluidized bed boiler durability was excellent, showing very high resistance to corrosion and erosion. A series of 39 parametric tests was performed in order to demonstrate turndown capabilities of the atmospheric fluidized bed boiler burning anthracite culm. Four tests were performed with bituminous coal waste (called gob), which contains 4.8 to 5.5% sulfur. The heating value of both fuels is approximately 3000 Btu/lb and the ash content is approximately 70%. Combustion efficiency, boiler efficiency, and emissions of NOx and SO2 were also determined for the tests.
Location tests for biomarker studies: a comparison using simulations for the two-sample case.
Scheinhardt, M O; Ziegler, A
2013-01-01
Gene, protein, or metabolite expression levels are often non-normally distributed, heavy-tailed, and contain outliers. Standard statistical approaches may fail as location tests in this situation. In three Monte-Carlo simulation studies, we aimed at comparing the type I error levels and empirical power of standard location tests and three adaptive tests [O'Gorman, Can J Stat 1997; 25: 269-279; Keselman et al., Brit J Math Stat Psychol 2007; 60: 267-293; Szymczak et al., Stat Med 2013; 32: 524-537] for a wide range of distributions. We simulated two-sample scenarios using the g-and-k-distribution family to systematically vary tail length and skewness with identical and varying variability between groups. All tests kept the type I error level when groups did not vary in their variability. The standard non-parametric U-test performed well in all simulated scenarios. It was outperformed by the two non-parametric adaptive methods in case of heavy tails or large skewness. Most tests did not keep the type I error level for skewed data in the case of heterogeneous variances. The standard U-test was a powerful and robust location test for most of the simulated scenarios except for very heavy-tailed or heavily skewed data, and is thus to be recommended except in these cases. The non-parametric adaptive tests were powerful for both normal and non-normal distributions under sample variance homogeneity, but when sample variances differed, they did not keep the type I error level. The parametric adaptive test lacks power for skewed and heavy-tailed distributions.
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
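For illustration, a Welch-type test on trimmed means (Yuen's approach, one of the robust procedures discussed above) is available in SciPy through the `trim` argument of `stats.ttest_ind` (SciPy >= 1.7); the heavy-tailed data below are simulated for the example:

```python
import numpy as np
from scipy import stats

# Hypothetical heavy-tailed two-group data (Student t with 2 df).
rng = np.random.default_rng(1)
x = rng.standard_t(df=2, size=40)
y = rng.standard_t(df=2, size=40) + 1.0   # true shift of 1.0

# Classical Welch test on the untrimmed means.
welch = stats.ttest_ind(x, y, equal_var=False)

# Welch-type test on 20% trimmed means (Yuen's test), robust to heavy tails.
yuen = stats.ttest_ind(x, y, equal_var=False, trim=0.2)
```

Trimming discards the most extreme 20% of each tail before estimating location, which is why the two statistics differ on heavy-tailed data.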
kruX: matrix-based non-parametric eQTL discovery.
Qi, Jianlong; Asl, Hassan Foroughi; Björkegren, Johan; Michoel, Tom
2014-01-14
The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several millions of marker-trait combinations at once. KruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com.
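The matrix trick can be sketched in a few lines (a toy version that, unlike kruX, ignores tie corrections and missing data; names are illustrative):

```python
import numpy as np
from scipy import stats

def kw_many_traits(expr, genotype):
    """Kruskal-Wallis H for every trait (row of expr) against one marker,
    for all traits at once via a single matrix multiplication."""
    n = expr.shape[1]
    ranks = stats.rankdata(expr, axis=1)             # rank within each trait
    groups = np.unique(genotype)
    ind = (genotype[None, :] == groups[:, None]).T.astype(float)  # n x g
    counts = ind.sum(axis=0)                          # samples per genotype
    R = ranks @ ind                                   # traits x g rank sums
    return 12.0 / (n * (n + 1)) * ((R ** 2) / counts).sum(axis=1) - 3 * (n + 1)
```

Stacking thousands of traits as rows of `expr` turns the per-trait loop into one matrix product, which is the source of the speedup described above.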
Characteristics of stereo reproduction with parametric loudspeakers
NASA Astrophysics Data System (ADS)
Aoki, Shigeaki; Toba, Masayoshi; Tsujita, Norihisa
2012-05-01
A parametric loudspeaker utilizes the nonlinearity of a medium and is known as a super-directivity loudspeaker. The parametric loudspeaker is one of the prominent applications of nonlinear ultrasonics. So far, applications have been limited to monaural sound reproduction systems for public address in museums, stations, streets, etc. In this paper, we discuss the characteristics of stereo reproduction with two parametric loudspeakers by comparison with two ordinary dynamic loudspeakers. In subjective tests, three typical listening positions were selected to investigate the possibility of correct sound localization in a wide listening area. The binaural information was ILD (Interaural Level Difference) or ITD (Interaural Time Delay). Each parametric loudspeaker was an equilateral hexagon with inner and outer diameters of 99 and 112 mm, respectively. Signals were 500 Hz, 1 kHz, 2 kHz and 4 kHz pure tones and pink noise. Three young males listened to each test signal 10 times in each listening condition. Subjective test results showed that listeners at the three typical listening positions perceived correct sound localization of all signals using the parametric loudspeakers. Performance was similar to that with the ordinary dynamic loudspeakers, except for sinusoidal waves with ITD. It was determined that the parametric loudspeakers could eliminate the contradiction between the binaural cues ILD and ITD that occurs in stereo reproduction with ordinary dynamic loudspeakers, because the super directivity of the parametric loudspeaker suppresses crosstalk components.
Introduction to Permutation and Resampling-Based Hypothesis Tests
ERIC Educational Resources Information Center
LaFleur, Bonnie J.; Greevy, Robert A.
2009-01-01
A resampling-based method of inference--permutation tests--is often used when distributional assumptions are questionable or unmet. Not only are these methods useful for obvious departures from parametric assumptions (e.g., normality) and small sample sizes, but they are also more robust than their parametric counterparts in the presence of…
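A bare-bones two-sample permutation test on the difference of means (a generic sketch, not tied to the article's examples):

```python
# Two-sample permutation test: under H0 the group labels are exchangeable,
# so we shuffle the labels and recompute the difference of means.
import numpy as np

def permutation_mean_test(x, y, n_perm=10000, seed=0):
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    d_obs = abs(np.mean(x) - np.mean(y))
    n_x = len(x)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if abs(perm[:n_x].mean() - perm[n_x:].mean()) >= d_obs:
            count += 1
    return (count + 1) / (n_perm + 1)   # add-one keeps the p-value valid
```

Because the reference distribution is built from the data themselves, no normality assumption is needed; only exchangeability under the null.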
Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A
2015-05-01
Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
Coaxial Dump Ramjet Combustor Combustion Instabilities. Part I. Parametric Test Data.
1981-07-01
AFWAL-TR-81-2047, Part 1: Coaxial Dump Ramjet Combustor Combustion Instabilities, Part I - Parametric Test Data. Air Force Wright Aeronautical Laboratories, Wright-Patterson AFB. Interim report for the period February 1979 - March 1980.
Hutson, Alan D
2018-01-01
In this note, we develop a novel semi-parametric estimator of the survival curve that is comparable to the product-limit estimator under very relaxed assumptions. The estimator is based on a beta parametrization that warps the empirical distribution of the observed censored and uncensored data. The parameters are obtained using a pseudo-maximum-likelihood approach, adjusting the survival curve to account for the censored observations. In the univariate setting, the new estimator tends to better extend the range of the survival estimation given a high degree of censoring. However, the key feature of this paper is that we develop a new two-group semi-parametric exact permutation test for comparing survival curves that is generally superior to the classic log-rank and Wilcoxon tests and provides the best global power across a variety of alternatives. The new test is readily extended to the k-group setting. PMID:26988931
Study of parametric instability in gravitational wave detectors with silicon test masses
NASA Astrophysics Data System (ADS)
Zhang, Jue; Zhao, Chunnong; Ju, Li; Blair, David
2017-03-01
Parametric instability is an intrinsic risk in high power laser interferometer gravitational wave detectors, in which the optical cavity modes interact with the acoustic modes of the mirrors, leading to exponential growth of the acoustic vibration. In this paper, we investigate the potential parametric instability for a proposed next generation gravitational wave detector, the LIGO Voyager blue design, with cooled silicon test masses of size 45 cm in diameter and 55 cm in thickness. It is shown that there would be about two unstable modes per test mass at an arm cavity power of 3 MW, with the highest parametric gain of ∼76. While this is less than the predicted number of unstable modes for Advanced LIGO (∼40 modes with max gain of ∼32 at the designed operating power of 830 kW), the importance of developing suitable instability suppression schemes is emphasized.
Effects of cosmic rays on single event upsets
NASA Technical Reports Server (NTRS)
Venable, D. D.; Zajic, V.; Lowe, C. W.; Olidapupo, A.; Fogarty, T. N.
1989-01-01
Assistance was provided to the Brookhaven Single Event Upset (SEU) Test Facility. Computer codes were developed for fragmentation and secondary radiation affecting Very Large Scale Integration (VLSI) in space. A computer-controlled CV (HP4192) test was developed for Terman analysis. Also developed were high-speed parametric tests which are independent of operator judgment and a charge-pumping technique for measurement of D_it(E). The X-ray secondary effects and parametric degradation as a function of dose rate were simulated. The SPICE simulation of static RAMs with various resistor filters was tested.
Le Boedec, Kevin
2016-12-01
According to international guidelines, parametric methods must be chosen for RI construction when the sample size is small and the distribution is Gaussian. However, normality tests may not be accurate at small sample size. The purpose of the study was to evaluate normality test performance to properly identify samples extracted from a Gaussian population at small sample sizes, and assess the consequences on RI accuracy of applying parametric methods to samples that falsely identified the parent population as Gaussian. Samples of n = 60 and n = 30 values were randomly selected 100 times from simulated Gaussian, lognormal, and asymmetric populations of 10,000 values. The sensitivity and specificity of 4 normality tests were compared. Reference intervals were calculated using 6 different statistical methods from samples that falsely identified the parent population as Gaussian, and their accuracy was compared. Shapiro-Wilk and D'Agostino-Pearson tests were the best performing normality tests. However, their specificity was poor at sample size n = 30 (specificity for P < .05: .51 and .50, respectively). The best significance levels identified when n = 30 were 0.19 for the Shapiro-Wilk test and 0.18 for the D'Agostino-Pearson test. Using parametric methods on samples extracted from a lognormal population but falsely identified as Gaussian led to clinically relevant inaccuracies. At small sample size, normality tests may lead to erroneous use of parametric methods to build RI. Using nonparametric methods (or alternatively a Box-Cox transformation) on all samples regardless of their distribution, or adjusting the significance level of normality tests depending on sample size, would limit the risk of constructing inaccurate RI. © 2016 American Society for Veterinary Clinical Pathology.
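The specificity problem can be reproduced with a small simulation (a sketch under assumed distributions, not the study's exact protocol):

```python
# Small simulation of Shapiro-Wilk behaviour at n = 30: rejection rate on
# truly Gaussian samples (~alpha) versus lognormal samples (its power).
import numpy as np
from scipy import stats

def rejection_rate(sampler, n=30, n_sim=500, alpha=0.05, seed=0):
    """Fraction of simulated samples of size n that Shapiro-Wilk rejects."""
    rng = np.random.default_rng(seed)
    return sum(
        stats.shapiro(sampler(rng, n)).pvalue < alpha for _ in range(n_sim)
    ) / n_sim

gauss_rate = rejection_rate(lambda rng, n: rng.normal(size=n))
lognorm_rate = rejection_rate(lambda rng, n: rng.lognormal(sigma=1.0, size=n))
```

The gap between the two rates illustrates the point above: at n = 30 a non-significant normality test is weak evidence that the parent population is actually Gaussian.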
Comparison of parametric and bootstrap method in bioequivalence test.
Ahn, Byung-Jin; Yim, Dong-Seok
2009-10-01
The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods we performed repeated estimation of bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver.3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of normal distribution. However, in 2 of 3 resampled log (AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log (AUC) datasets. Currently, the 80-125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption.
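For illustration, a plain percentile bootstrap interval for the geometric mean ratio of paired AUCs can be sketched as follows (the study used BCa intervals and SAS; the percentile version and the variable names here are simplifying assumptions):

```python
# Percentile-bootstrap 90% CI for the geometric mean test/reference AUC
# ratio in paired subjects (illustrative; BCa intervals would adjust for
# bias and skewness of the bootstrap distribution).
import numpy as np

def bootstrap_ratio_ci(test_auc, ref_auc, n_boot=2000, seed=0):
    rng = np.random.default_rng(seed)
    log_ratio = np.log(test_auc) - np.log(ref_auc)   # within-subject ratios
    n = len(log_ratio)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)             # resample subjects
        boot[b] = log_ratio[idx].mean()
    lo, hi = np.exp(np.percentile(boot, [5, 95]))    # back to ratio scale
    return lo, hi
```

Bioequivalence would then be claimed if the whole interval falls inside 0.80-1.25.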
Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry
2004-06-01
This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests.
Parametric vs. non-parametric statistics of low resolution electromagnetic tomography (LORETA).
Thatcher, R W; North, D; Biver, C
2005-01-01
This study compared the relative statistical sensitivity of non-parametric and parametric statistics of 3-dimensional current sources as estimated by the EEG inverse solution Low Resolution Electromagnetic Tomography (LORETA). One would expect approximately 5% false positives (classification of a normal as abnormal) at the P < .025 level of probability (two-tailed test) and approximately 1% false positives at the P < .005 level. EEG digital samples (2-second intervals sampled at 128 Hz, 1 to 2 minutes eyes closed) from 43 normal adult subjects were imported into the Key Institute's LORETA program. We then used the Key Institute's cross-spectrum and the Key Institute's LORETA output files (*.lor) as the 2,394 gray matter pixel representation of 3-dimensional currents at different frequencies. The mean and standard deviation *.lor files were computed for each of the 2,394 gray matter pixels for each of the 43 subjects. Tests of Gaussianity and different transforms were computed in order to best approximate a normal distribution for each frequency and gray matter pixel. The relative sensitivity of parametric vs. non-parametric statistics was compared using a "leave-one-out" cross-validation method in which individual normal subjects were withdrawn and then statistically classified as being either normal or abnormal based on the remaining subjects. Log10 transforms approximated Gaussian distribution in the range of 95% to 99% accuracy. Parametric Z score tests at P < .05 cross-validation demonstrated an average misclassification rate of approximately 4.25%, and the range over the 2,394 gray matter pixels was 27.66% to 0.11%. At P < .01 parametric Z score cross-validation false positives were 0.26% and ranged from 6.65% to 0% false positives. The non-parametric Key Institute's t-max statistic at P < .05 had an average misclassification error rate of 7.64% and ranged from 43.37% to 0.04% false positives.
The nonparametric t-max at P < .01 had an average misclassification rate of 6.67% and ranged from 0% to 41.34% false positives over the 2,394 gray matter pixels for any cross-validated normal subject. In conclusion, an adequate approximation to a Gaussian distribution and high cross-validation accuracy can be achieved with the Key Institute's LORETA programs by using a log10 transform and parametric statistics, and parametric normative comparisons had lower false positive rates than the non-parametric tests.
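As a rough illustration of the parametric leave-one-out procedure described above, the sketch below z-scores each subject against the mean and standard deviation of the remaining subjects and counts the proportion flagged as abnormal. The data and dimensions are synthetic stand-ins, not the Key Institute's actual LORETA output.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: 43 "subjects" x 50 "pixels" of log10-transformed power
X = rng.normal(size=(43, 50))

def loo_false_positive_rate(X, z_crit=1.96):
    """Leave-one-out parametric classification: each subject is z-scored
    against the mean/SD of the remaining subjects; any |z| > z_crit at a
    pixel counts as a (false) abnormal classification for that pixel."""
    n, p = X.shape
    flags = np.zeros((n, p), dtype=bool)
    for i in range(n):
        rest = np.delete(X, i, axis=0)
        z = (X[i] - rest.mean(axis=0)) / rest.std(axis=0, ddof=1)
        flags[i] = np.abs(z) > z_crit
    return float(flags.mean())   # overall false-positive proportion

fpr = loo_false_positive_rate(X)   # roughly 5% by construction
```

Since all subjects are drawn from the reference distribution, every abnormal flag is a false positive, which is exactly the quantity the study's cross-validation tabulates per pixel.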
Problems of the design of low-noise input devices. [parametric amplifiers
NASA Technical Reports Server (NTRS)
Manokhin, V. M.; Nemlikher, Y. A.; Strukov, I. A.; Sharfov, Y. A.
1974-01-01
An analysis is given of the requirements placed on the elements of parametric centimeter-waveband amplifiers for achievement of minimal noise temperatures. A low-noise semiconductor parametric amplifier using germanium parametric diodes for a receiver operating in the 4 GHz band was developed and tested, confirming the possibility of satisfying all requirements.
Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E
2013-06-01
Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. 
© 2013, The International Biometric Society.
How to Compare Parametric and Nonparametric Person-Fit Statistics Using Real Data
ERIC Educational Resources Information Center
Sinharay, Sandip
2017-01-01
Person-fit assessment (PFA) is concerned with uncovering atypical test performance as reflected in the pattern of scores on individual items on a test. Existing person-fit statistics (PFSs) include both parametric and nonparametric statistics. Comparison of PFSs has been a popular research topic in PFA, but almost all comparisons have employed…
Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 1
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
Electrical characterization and qualification tests were performed on the RCA MWS5001D, 1024 by 1-bit, CMOS, random access memory. Characterization tests were performed on five devices. The tests included functional tests, an AC parametric worst-case pattern selection test, determination of the worst-case transition for setup and hold times, and a series of shmoo plots. The qualification tests were performed on 32 devices and included a 2000-hour burn-in with electrical tests performed at 0 hours and after 168, 1000, and 2000 hours of burn-in. The tests performed included functional tests and AC and DC parametric tests. All of the tests in the characterization phase, with the exception of the worst-case transition test, were performed at ambient temperatures of 25, -55, and 125 C. The worst-case transition test was performed at 25 C. The pre-burn-in electrical tests were performed at 25, -55, and 125 C. All burn-in endpoint tests were performed at 25, -40, -55, 85, and 125 C.
Towards an Empirically Based Parametric Explosion Spectral Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, S R; Walter, W R; Ruppert, S
2009-08-31
Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never before been tested. The focus of our work is on the local and regional distances (< 2000 km) and phases (Pn, Pg, Sn, Lg) necessary to see small explosions. We are developing a parametric model of the nuclear explosion seismic source spectrum that is compatible with the earthquake-based geometrical spreading and attenuation models developed using the Magnitude Distance Amplitude Correction (MDAC) techniques (Walter and Taylor, 2002). The explosion parametric model will be particularly important in regions without any prior explosion data for calibration. The model is being developed using the available body of seismic data at local and regional distances for past nuclear explosions at foreign and domestic test sites. Parametric modeling is a simple and practical approach for widespread monitoring applications, prior to the capability to carry out fully deterministic modeling. The achievable goal of our parametric model development is to be able to predict observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing. The relationship between the parametric equations and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source.
BROËT, PHILIPPE; TSODIKOV, ALEXANDER; DE RYCKE, YANN; MOREAU, THIERRY
2010-01-01
This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests. PMID:15293627
ERIC Educational Resources Information Center
Cui, Zhongmin; Kolen, Michael J.
2008-01-01
This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…
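The nonparametric bootstrap idea underlying one of the two methods compared above can be sketched generically: resample the observed data with replacement and take the standard deviation of the statistic across resamples as its standard error. The scores and sample size below are invented for illustration; the equating-specific machinery of the article is not reproduced.

```python
import numpy as np

def bootstrap_se(data, stat, n_boot=2000, seed=1):
    """Nonparametric bootstrap: resample the observed data with replacement
    and use the SD of the statistic across resamples as its standard error."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = [stat(data[rng.integers(0, n, n)]) for _ in range(n_boot)]
    return float(np.std(reps, ddof=1))

# Hypothetical test scores for one of the simulated sample sizes (n = 300)
scores = np.random.default_rng(0).normal(50, 10, size=300)
se_median = bootstrap_se(scores, np.median)
```

The parametric bootstrap differs only in the resampling step: instead of drawing from the observed data, one draws from a distribution fitted to it.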
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved.
The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models.
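A minimal sketch of the parametric-versus-non-parametric sample comparison that SOCR Analyses offers (t-test vs. Wilcoxon rank sum), implemented directly in Python rather than through the SOCR Java toolkit. The data are synthetic, and the rank-sum statistic uses the large-sample normal approximation without tie correction.

```python
import numpy as np
from math import sqrt

def welch_t(x, y):
    """Parametric two-sample comparison: Welch's t statistic."""
    nx, ny = len(x), len(y)
    return (x.mean() - y.mean()) / sqrt(x.var(ddof=1)/nx + y.var(ddof=1)/ny)

def rank_sum_z(x, y):
    """Non-parametric Wilcoxon rank-sum statistic, normal approximation
    (assumes continuous data, so ties are ignored)."""
    n, m = len(x), len(y)
    ranks = np.argsort(np.argsort(np.concatenate([x, y]))) + 1
    w = ranks[:n].sum()                       # rank sum of the first sample
    mu = n * (n + m + 1) / 2
    sigma = sqrt(n * m * (n + m + 1) / 12)
    return (w - mu) / sigma

rng = np.random.default_rng(2)
a = rng.normal(0.0, 1, 40)
b = rng.normal(0.8, 1, 40)   # shifted group
t_stat, z_stat = welch_t(a, b), rank_sum_z(a, b)
```

With a genuine location shift and roughly normal data, both statistics flag the difference; the parametric one is typically slightly more powerful when its assumptions hold.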
Maity, Arnab; Carroll, Raymond J; Mammen, Enno; Chatterjee, Nilanjan
2009-01-01
Motivated from the problem of testing for genetic effects on complex traits in the presence of gene-environment interaction, we develop score tests in general semiparametric regression problems that involves Tukey style 1 degree-of-freedom form of interaction between parametrically and non-parametrically modelled covariates. We find that the score test in this type of model, as recently developed by Chatterjee and co-workers in the fully parametric setting, is biased and requires undersmoothing to be valid in the presence of non-parametric components. Moreover, in the presence of repeated outcomes, the asymptotic distribution of the score test depends on the estimation of functions which are defined as solutions of integral equations, making implementation difficult and computationally taxing. We develop profiled score statistics which are unbiased and asymptotically efficient and can be performed by using standard bandwidth selection methods. In addition, to overcome the difficulty of solving functional equations, we give easy interpretations of the target functions, which in turn allow us to develop estimation procedures that can be easily implemented by using standard computational methods. We present simulation studies to evaluate type I error and power of the method proposed compared with a naive test that does not consider interaction. Finally, we illustrate our methodology by analysing data from a case-control study of colorectal adenoma that was designed to investigate the association between colorectal adenoma and the candidate gene NAT2 in relation to smoking history.
Development of suspended core soft glass fibers for far-detuned parametric conversion
NASA Astrophysics Data System (ADS)
Rampur, Anupamaa; Ciąćka, Piotr; Cimek, Jarosław; Kasztelanic, Rafał; Buczyński, Ryszard; Klimczak, Mariusz
2018-04-01
Light sources utilizing χ(2) parametric conversion combine high brightness with attractive operation wavelengths in the near and mid-infrared. In optical fibers, it is possible to use χ(3) degenerate four-wave mixing in order to obtain signal-to-idler frequency detuning of over 100 THz. We report on a test series of nonlinear soft glass suspended-core fibers intended for parametric conversion of 1000-1100 nm signal wavelengths available from an array of mature lasers into the near-to-mid-infrared range of 2700-3500 nm under pumping with an erbium sub-picosecond laser system. The presented discussion includes modelling of the fiber properties, details of their physical development and characterization, and experimental tests of parametric conversion.
Robust non-parametric one-sample tests for the analysis of recurrent events.
Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia
2010-12-30
One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.
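The core of the statistic described above can be sketched as a standardized distance between observed and expected event counts under a Poisson reference rate. The counts, follow-up times, and reference rate below are invented; the article's weighted, robust, and stratified versions are not reproduced.

```python
from math import sqrt, erf

def one_sample_event_test(event_counts, followup_years, ref_rate):
    """Standardized distance between observed and expected numbers of
    recurrent events under a Poisson reference rate (events/person-year).
    A sketch of the unweighted statistic only."""
    observed = sum(event_counts)
    expected = ref_rate * sum(followup_years)
    z = (observed - expected) / sqrt(expected)   # Poisson variance = mean
    # two-sided p-value from the standard normal approximation
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Hypothetical cohort: severe infections per child and follow-up in years
counts = [3, 0, 2, 5, 1, 4, 2, 3]
years = [2.0, 1.5, 2.0, 2.5, 1.0, 2.0, 1.5, 2.0]
z, p = one_sample_event_test(counts, years, ref_rate=0.8)
```

Here 20 events are observed against 11.6 expected, so the standardized distance exceeds 2 and the reference rate would be rejected at the 5% level.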
[The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].
Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel
2017-01-01
The statistical analysis can be divided in two main components: descriptive analysis and inferential analysis. An inference is to elaborate conclusions from the tests performed with the data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test in general poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
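The three aspects listed above (design, number of measurements, scale of measurement) can be caricatured as a small decision helper. The mapping below is a simplified illustration of common textbook choices, not an exhaustive recommendation.

```python
def choose_test(scale, groups, paired, normal):
    """Toy decision helper: pick a common statistical test from the
    measurement scale, number of groups, pairing, and normality.
    The mapping is illustrative, not exhaustive."""
    if scale == "nominal":
        return "Chi-square test"
    if scale == "ordinal" or not normal:
        # non-parametric branch: normality not required
        if groups == 2:
            return "Wilcoxon signed-rank test" if paired else "Mann-Whitney U test"
        return "Friedman test" if paired else "Kruskal-Wallis test"
    # interval/ratio scale with normally distributed data -> parametric
    if groups == 2:
        return "Paired t-test" if paired else "Independent-samples t-test"
    return "Repeated-measures ANOVA" if paired else "One-way ANOVA"

choice = choose_test(scale="interval", groups=2, paired=False, normal=True)
# -> "Independent-samples t-test"
```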
How to Evaluate Phase Differences between Trial Groups in Ongoing Electrophysiological Signals
VanRullen, Rufin
2016-01-01
A growing number of studies endeavor to reveal periodicities in sensory and cognitive functions, by comparing the distribution of ongoing (pre-stimulus) oscillatory phases between two (or more) trial groups reflecting distinct experimental outcomes. A systematic relation between the phase of spontaneous electrophysiological signals, before a stimulus is even presented, and the eventual result of sensory or cognitive processing for that stimulus, would be indicative of an intrinsic periodicity in the underlying neural process. Prior studies of phase-dependent perception have used a variety of analytical methods to measure and evaluate phase differences, and there is currently no established standard practice in this field. The present report intends to remediate this need, by systematically comparing the statistical power of various measures of “phase opposition” between two trial groups, in a number of real and simulated experimental situations. Seven measures were evaluated: one parametric test (circular Watson-Williams test), and three distinct measures of phase opposition (phase bifurcation index, phase opposition sum, and phase opposition product) combined with two procedures for non-parametric statistical testing (permutation, or a combination of z-score and permutation). While these are obviously not the only existing or conceivable measures, they have all been used in recent studies. All tested methods performed adequately on a previously published dataset (Busch et al., 2009). On a variety of artificially constructed datasets, no single measure was found to surpass all others, but instead the suitability of each measure was contingent on several experimental factors: the time, frequency, and depth of oscillatory phase modulation; the absolute and relative amplitudes of post-stimulus event-related potentials for the two trial groups; the absolute and relative trial numbers for the two groups; and the number of permutations used for non-parametric testing. 
The concurrent use of two phase opposition measures, the parametric Watson-Williams test and a non-parametric test based on summing inter-trial coherence values for the two trial groups, appears to provide the most satisfactory outcome in all situations tested. Matlab code is provided to automatically compute these phase opposition measures. PMID:27683543
Fagerland, Morten W; Sandvik, Leiv; Mowinckel, Petter
2011-04-13
The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for analysis and presentation of results for discrete numerical variables. Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that have both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
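A sketch of the Welch-type comparison recommended above, applied to two invented groups of event counts. For brevity it uses the normal quantile 1.96 in place of the exact t quantile with Welch-Satterthwaite degrees of freedom, so the interval is approximate.

```python
import numpy as np
from math import sqrt

def welch_diff_ci(x, y, z=1.96):
    """Difference between means with a Welch standard error (unequal
    variances) and an approximate 95% interval; z=1.96 stands in for
    the exact t quantile with Welch-Satterthwaite df."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = x.mean() - y.mean()
    se = sqrt(x.var(ddof=1)/len(x) + y.var(ddof=1)/len(y))
    return diff, (diff - z*se, diff + z*se)

# Invented data: number of events per individual, outcomes in {0,...,5}
grp1 = [0, 1, 1, 2, 0, 3, 1, 2, 2, 1, 0, 4]
grp2 = [0, 0, 1, 0, 2, 1, 0, 1, 0, 2, 1, 0]
diff, (lo, hi) = welch_diff_ci(grp1, grp2)   # diff = 0.75 events
```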
Numerical prediction of 3-D ejector flows
NASA Technical Reports Server (NTRS)
Roberts, D. W.; Paynter, G. C.
1979-01-01
The use of parametric flow analysis, rather than parametric scale testing, to support the design of an ejector system offers a number of potential advantages. The application of available 3-D flow analyses to the design of ejectors can be subdivided into several key elements: numerics, turbulence modeling, data handling and display, and testing in support of analysis development. Experimental and predicted jet exhaust flows for the Boeing 727 aircraft are examined.
ABALUCK, JASON
2017-01-01
We explore the in- and out-of-sample robustness of tests for choice inconsistencies based on parameter restrictions in parametric models, focusing on tests proposed by Ketcham, Kuminoff and Powers (KKP). We argue that their non-parametric alternatives are inherently conservative with respect to detecting mistakes. We then show that our parametric model is robust to KKP's suggested specification checks, and that comprehensive goodness of fit measures perform better with our model than the expected utility model. Finally, we explore the robustness of our 2011 results to alternative normative assumptions, highlighting the role of brand fixed effects and unobservable characteristics. PMID:29170561
Single-arm phase II trial design under parametric cure models.
Wu, Jianrong
2015-01-01
The current practice of designing single-arm phase II survival trials is limited under the exponential model. Trial design under the exponential model may not be appropriate when a portion of patients are cured. There is no literature available for designing single-arm phase II trials under the parametric cure model. In this paper, a test statistic is proposed, and a sample size formula is derived for designing single-arm phase II trials under a class of parametric cure models. Extensive simulations showed that the proposed test and sample size formula perform very well under different scenarios. Copyright © 2015 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giammichele, N.; Fontaine, G.; Brassard, P.
We present a prescription for parametrizing the chemical profile in the core of white dwarfs in light of the recent discovery that pulsation modes may sometimes be deeply confined in some cool pulsating white dwarfs. Such modes may be used as unique probes of the complicated chemical stratification that results from several processes that occurred in previous evolutionary phases of intermediate-mass stars. This effort is part of our ongoing quest for more credible and realistic seismic models of white dwarfs using static, parametrized equilibrium structures. Inspired by successful techniques developed in design optimization fields (such as aerodynamics), we exploit Akima splines for the tracing of the chemical profile of oxygen (carbon) in the core of a white dwarf model. A series of tests are then presented to better seize the precision and significance of the results that can be obtained in an asteroseismological context. We also show that the new parametrization passes an essential basic test, as it successfully reproduces the chemical stratification of a full evolutionary model.
NASA Astrophysics Data System (ADS)
Giammichele, N.; Charpinet, S.; Fontaine, G.; Brassard, P.
2017-01-01
We present a prescription for parametrizing the chemical profile in the core of white dwarfs in light of the recent discovery that pulsation modes may sometimes be deeply confined in some cool pulsating white dwarfs. Such modes may be used as unique probes of the complicated chemical stratification that results from several processes that occurred in previous evolutionary phases of intermediate-mass stars. This effort is part of our ongoing quest for more credible and realistic seismic models of white dwarfs using static, parametrized equilibrium structures. Inspired by successful techniques developed in design optimization fields (such as aerodynamics), we exploit Akima splines for the tracing of the chemical profile of oxygen (carbon) in the core of a white dwarf model. A series of tests are then presented to better seize the precision and significance of the results that can be obtained in an asteroseismological context. We also show that the new parametrization passes an essential basic test, as it successfully reproduces the chemical stratification of a full evolutionary model.
Two-sample tests and one-way MANOVA for multivariate biomarker data with nondetects.
Thulin, M
2016-09-10
Testing whether the mean vector of a multivariate set of biomarkers differs between several populations is an increasingly common problem in medical research. Biomarker data is often left censored because some measurements fall below the laboratory's detection limit. We investigate how such censoring affects multivariate two-sample and one-way multivariate analysis of variance tests. Type I error rates, power and robustness to increasing censoring are studied, under both normality and non-normality. Parametric tests are found to perform better than non-parametric alternatives, indicating that the current recommendations for analysis of censored multivariate data may have to be revised. Copyright © 2016 John Wiley & Sons, Ltd.
Comparison of four approaches to a rock facies classification problem
Dubois, M.K.; Bohling, Geoffrey C.; Chakrabarti, S.
2007-01-01
In this study, seven classifiers based on four different approaches were tested on a rock facies classification problem: classical parametric methods using Bayes' rule, and non-parametric methods using fuzzy logic, k-nearest neighbor, and a feed-forward, back-propagating artificial neural network. Determining the most effective classifier for geologic facies prediction in wells without cores in the Panoma gas field, in Southwest Kansas, was the objective. Study data include 3600 samples with known rock facies class (from core), with each sample having either four or five measured properties (wire-line log curves) and two derived geologic properties (geologic constraining variables). The sample set was divided into two subsets, one for training and one for testing the ability of the trained classifier to correctly assign classes. Artificial neural networks clearly outperformed all other classifiers and are effective tools for this particular classification problem. Classical parametric models were inadequate due to the nature of the predictor variables (high dimensional and not linearly correlated) and the feature space of the classes (overlapping). The other non-parametric methods tested, k-nearest neighbor and fuzzy logic, would need considerable improvement to match the neural network effectiveness, but further work, possibly combining certain aspects of the three non-parametric methods, may be justified. © 2006 Elsevier Ltd. All rights reserved.
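Of the four approaches compared, k-nearest neighbor is the simplest to sketch. The minimal classifier below (Euclidean distance, majority vote) is run on two synthetic clusters standing in for facies classes; the study's actual log-curve data and tuning are not reproduced.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Minimal k-nearest-neighbor classifier: Euclidean distance and
    majority vote. Real applications would standardize the predictors
    (here, log-curve values) before computing distances."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)     # distance to each sample
        nearest = y_train[np.argsort(d)[:k]]        # labels of k closest
        preds.append(np.bincount(nearest).argmax()) # majority vote
    return np.array(preds)

rng = np.random.default_rng(3)
# Two well-separated synthetic "facies" clusters in a 4-D log-curve space
X0 = rng.normal(0, 1, (60, 4))
X1 = rng.normal(3, 1, (60, 4))
X = np.vstack([X0, X1])
y = np.array([0] * 60 + [1] * 60)
# Alternate samples into training and test sets, then score the classifier
acc = (knn_predict(X[::2], y[::2], X[1::2], k=3) == y[1::2]).mean()
```

On overlapping classes in high dimensions, as the study found, this simple vote degrades quickly, which is where the neural network gained its advantage.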
NASA Technical Reports Server (NTRS)
Kovach, L. S.; Zdankiewicz, E. M.
1987-01-01
Vapor compression distillation technology for phase change recovery of potable water from wastewater has evolved as a technically mature approach for use aboard the Space Station. A program to parametrically test an advanced preprototype Vapor Compression Distillation Subsystem (VCDS) was completed during 1985 and 1986. In parallel with parametric testing, a hardware improvement program was initiated to test the feasibility of incorporating several key improvements into the advanced preprototype VCDS following initial parametric tests. Specific areas of improvement included long-life, self-lubricated bearings, a lightweight, highly-efficient compressor, and a long-life magnetic drive. With the exception of the self-lubricated bearings, these improvements are incorporated. The advanced preprototype VCDS was designed to reclaim 95 percent of the available wastewater at a nominal water recovery rate of 1.36 kg/h achieved at a solids concentration of 2.3 percent and 308 K condenser temperature. While this performance was maintained for the initial testing, a 300 percent improvement in water production rate with a corresponding lower specific energy was achieved following incorporation of the improvements. Testing involved the characterization of key VCDS performance factors as a function of recycle loop solids concentration, distillation unit temperature and fluids pump speed. The objective of this effort was to expand the VCDS data base to enable defining optimum performance characteristics for flight hardware development.
Ultrasonically Absorptive Coatings for Hypersonic Laminar Flow Control
2007-12-01
integrate UAC and TPS functions. To aid in the design of UAC with regular microstructure to be tested in the CUBRC LENS I tunnel, extensive parametric studies of the UAC laminar flow control performance were conducted, providing a solid foundation for large-scale demonstration of the UAC-LFC performance in the CUBRC LENS I tunnel as well as fabrication of ceramic UAC samples.
Ultrasonically Absorptive Coatings for Hypersonic Laminar Flow Control
2008-05-13
UAC and TPS functions. To aid in the design of UAC with regular microstructure to be tested in the CUBRC LENS I tunnel, parametric studies of the UAC-LFC performance were conducted, approaching the large-scale demonstration stage in the CUBRC LENS tunnel as well as fabrication of ceramic UAC samples integrated into TPS.
The use of analysis of variance procedures in biological studies
Williams, B.K.
1987-01-01
The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. 
The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models. PMID:21546994
The chi-square test of independence.
McHugh, Mary L
2013-01-01
The Chi-square statistic is a non-parametric (distribution-free) tool designed to analyze group differences when the dependent variable is measured at a nominal level. Like all non-parametric statistics, the Chi-square is robust with respect to the distribution of the data. Specifically, it does not require equality of variances among the study groups or homoscedasticity in the data. It permits evaluation of both dichotomous independent variables and of multiple group studies. Unlike many other non-parametric and some parametric statistics, the calculations needed to compute the Chi-square provide considerable information about how each of the groups performed in the study. This richness of detail allows the researcher to understand the results and thus to derive more detailed information from this statistic than from many others. The Chi-square is a significance statistic, and should be followed with a strength statistic. Cramer's V is the most common strength test used when a significant Chi-square result has been obtained. Advantages of the Chi-square include its robustness with respect to distribution of the data, its ease of computation, the detailed information that can be derived from the test, its use in studies for which parametric assumptions cannot be met, and its flexibility in handling data from both two-group and multiple-group studies. Limitations include its sample size requirements, difficulty of interpretation when there are large numbers of categories (20 or more) in the independent or dependent variables, and the tendency of Cramer's V to produce relatively low correlation measures, even for highly significant results.
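A minimal sketch of the Chi-square statistic and its follow-up strength measure, Cramer's V, for an r x c contingency table. The 2 x 2 counts are invented for illustration, and the p-value lookup against the chi-square distribution is omitted.

```python
import numpy as np
from math import sqrt

def chi_square_and_cramers_v(table):
    """Pearson chi-square statistic for an r x c contingency table,
    followed by Cramer's V as the strength measure."""
    table = np.asarray(table, float)
    n = table.sum()
    # Expected counts under independence: outer product of the margins / n
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    chi2 = ((table - expected) ** 2 / expected).sum()
    v = sqrt(chi2 / (n * (min(table.shape) - 1)))
    return float(chi2), float(v)

# Invented 2 x 2 example: group membership vs dichotomous outcome
obs = [[30, 10],
       [15, 25]]
chi2, v = chi_square_and_cramers_v(obs)   # chi2 ~ 11.43, V ~ 0.38
```

Reporting V alongside the significance test, as recommended above, distinguishes a large sample from a strong association.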
An Empirical Study of Eight Nonparametric Tests in Hierarchical Regression.
ERIC Educational Resources Information Center
Harwell, Michael; Serlin, Ronald C.
When normality does not hold, nonparametric tests represent an important data-analytic alternative to parametric tests. However, the use of nonparametric tests in educational research has been limited by the absence of easily performed tests for complex experimental designs and analyses, such as factorial designs and multiple regression analyses,…
An Item Response Theory Model for Test Bias.
ERIC Educational Resources Information Center
Shealy, Robin; Stout, William
This paper presents a conceptualization of test bias for standardized ability tests which is based on multidimensional, non-parametric, item response theory. An explanation of how individually-biased items can combine through a test score to produce test bias is provided. It is contended that bias, although expressed at the item level, should be…
Parametric Methods for Dynamic 11C-Phenytoin PET Studies.
Mansor, Syahir; Yaqub, Maqsood; Boellaard, Ronald; Froklage, Femke E; de Vries, Anke; Bakker, Esther D M; Voskuyl, Rob A; Eriksson, Jonas; Schwarte, Lothar A; Verbeek, Joost; Windhorst, Albert D; Lammertsma, Adriaan A
2017-03-01
In this study, the performance of various methods for generating quantitative parametric images of dynamic 11C-phenytoin PET studies was evaluated. Methods: Double-baseline 60-min dynamic 11C-phenytoin PET studies, including online arterial sampling, were acquired for 6 healthy subjects. Parametric images were generated using Logan plot analysis, a basis function method, and spectral analysis. Parametric distribution volume (VT) and influx rate (K1) were compared with those obtained from nonlinear regression analysis of time-activity curves. In addition, global and regional test-retest (TRT) variability was determined for parametric K1 and VT values. Results: Biases in VT observed with all parametric methods were less than 5%. For K1, spectral analysis showed a negative bias of 16%. The mean TRT variabilities of VT and K1 were less than 10% for all methods. Shortening the scan duration to 45 min provided similar VT and K1 with comparable TRT performance compared with 60-min data. Conclusion: Among the various parametric methods tested, the basis function method provided parametric VT and K1 values with the least bias compared with nonlinear regression data and showed TRT variabilities lower than 5%, also for smaller volume-of-interest sizes (i.e., higher noise levels) and shorter scan duration. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
The report gives results of parametric tests to evaluate the injection of powdered activated carbon to control volatile pollutants in municipal waste combustor (MWC) flue gas. The tests were conducted at a spray dryer absorber/electrostatic precipitator (SD/ESP)-equipped MWC in Camden...
Kerschbamer, Rudolf
2015-05-01
This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.
Tsamados, Michel; Feltham, Daniel; Petty, Alek; Schroeder, David; Flocco, Daniela
2015-10-13
We present a modelling study of processes controlling the summer melt of the Arctic sea ice cover. We perform a sensitivity study and focus our interest on the thermodynamics at the ice-atmosphere and ice-ocean interfaces. We use the Los Alamos community sea ice model CICE, and additionally implement and test three new parametrization schemes: (i) a prognostic mixed layer; (ii) a three equation boundary condition for the salt and heat flux at the ice-ocean interface; and (iii) a new lateral melt parametrization. Recent additions to the CICE model are also tested, including explicit melt ponds, a form drag parametrization and a halodynamic brine drainage scheme. The various sea ice parametrizations tested in this sensitivity study introduce a wide spread in the simulated sea ice characteristics. For each simulation, the total melt is decomposed into its surface, bottom and lateral melt components to assess the processes driving melt and how this varies regionally and temporally. Because this study quantifies the relative importance of several processes in driving the summer melt of sea ice, this work can serve as a guide for future research priorities. © 2015 The Author(s).
Nonparametric predictive inference for combining diagnostic tests with parametric copula
NASA Astrophysics Data System (ADS)
Muhammad, Noryanti; Coolen, F. P. A.; Coolen-Maturi, T.
2017-09-01
Measuring the accuracy of diagnostic tests is crucial in many application areas, including medicine and health care. The Receiver Operating Characteristic (ROC) curve is a popular statistical tool for describing the performance of diagnostic tests. The area under the ROC curve (AUC) is often used as a measure of the overall performance of the diagnostic test. In this paper, we are interested in developing strategies for combining test results in order to increase diagnostic accuracy. We introduce nonparametric predictive inference (NPI) for combining two diagnostic test results while modelling their dependence structure with a parametric copula. NPI is a frequentist statistical framework for inference on a future observation based on past data observations. NPI uses lower and upper probabilities to quantify uncertainty and is based on only a few modelling assumptions. A copula, in turn, is a well-known statistical concept for modelling dependence of random variables: a joint distribution function whose marginals are all uniformly distributed, which allows the dependence to be modelled separately from the marginal distributions. In this research, we estimate the copula density using a parametric method, namely maximum likelihood estimation (MLE). We investigate the performance of the proposed method on data sets from the literature and discuss the results to show how our method performs for different families of copulas. Finally, we briefly outline related challenges and opportunities for future research.
Zou, Kelly H; Resnic, Frederic S; Talos, Ion-Florin; Goldberg-Zimring, Daniel; Bhagwat, Jui G; Haker, Steven J; Kikinis, Ron; Jolesz, Ferenc A; Ohno-Machado, Lucila
2005-10-01
Medical classification accuracy studies often yield continuous data based on predictive models for treatment outcomes. A popular method for evaluating the performance of diagnostic tests is receiver operating characteristic (ROC) curve analysis. The main objective was to develop a global statistical hypothesis test for assessing the goodness-of-fit (GOF) of parametric ROC curves via the bootstrap. A simple log (or logit) transformation and a more flexible Box-Cox normality transformation were applied to data from two clinical studies: predicting complications following percutaneous coronary interventions (PCIs), and predicting image-guided neurosurgical resection results from tumor volume. We compared a non-parametric with a parametric binormal estimate of the underlying ROC curve. To construct such a GOF test, we used the non-parametric and parametric areas under the curve (AUCs) as the metrics, with a resulting p value reported. In the interventional cardiology example, logit and Box-Cox transformations of the predictive probabilities led to satisfactory AUCs (AUC=0.888; p=0.78, and AUC=0.888; p=0.73, respectively), while in the brain tumor resection example, log and Box-Cox transformations of the tumor size also led to satisfactory AUCs (AUC=0.898; p=0.61, and AUC=0.899; p=0.42, respectively). In contrast, significant departures from GOF were observed without applying any transformation prior to assuming a binormal model (AUC=0.766; p=0.004, and AUC=0.831; p=0.03, respectively). In both studies the p values suggested that transformations are important to consider before applying any binormal model to estimate the AUC. Our analyses also demonstrated and confirmed the predictive values of different classifiers for determining interventional complications following PCIs and resection outcomes in image-guided neurosurgery.
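A rough illustration of why such transformations matter: the sketch below (synthetic log-normal scores, not the study's data) compares the non-parametric Mann-Whitney AUC with a parametric binormal AUC computed on raw and on log-transformed scores. The binormal estimate only agrees with the non-parametric one once the data are near-normal:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
# Synthetic predictive scores: log-normal, so the binormal model
# is appropriate only after a log transformation.
neg = rng.lognormal(mean=0.0, sigma=1.0, size=500)   # controls
pos = rng.lognormal(mean=1.0, sigma=1.0, size=500)   # cases

def auc_nonparametric(x0, x1):
    # Mann-Whitney estimate of P(case score > control score);
    # invariant under monotone transformations.
    wins = (x1[:, None] > x0[None, :]).sum() + 0.5 * (x1[:, None] == x0[None, :]).sum()
    return wins / (len(x0) * len(x1))

def auc_binormal(x0, x1):
    # Binormal AUC = Phi((mu1 - mu0) / sqrt(s0^2 + s1^2)),
    # valid only if both classes are (roughly) normal.
    return norm.cdf((x1.mean() - x0.mean()) /
                    np.sqrt(x0.var(ddof=1) + x1.var(ddof=1)))

raw_np = auc_nonparametric(neg, pos)
raw_bn = auc_binormal(neg, pos)
log_bn = auc_binormal(np.log(neg), np.log(pos))

print(f"nonparametric AUC      : {raw_np:.3f}")
print(f"binormal AUC, raw data : {raw_bn:.3f}")
print(f"binormal AUC, log data : {log_bn:.3f}")
```

On the raw (skewed) data the binormal AUC is biased low; after the log transform it tracks the non-parametric estimate, mirroring the GOF pattern reported above.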
Thoracic Injury Risk Curves for Rib Deflections of the SID-IIs Build Level D.
Irwin, Annette L; Crawford, Greg; Gorman, David; Wang, Sikui; Mertz, Harold J
2016-11-01
Injury risk curves for SID-IIs thorax and abdomen rib deflections proposed for future NCAP side impact evaluations were developed from tests conducted with the SID-IIs FRG. Since the floating rib guide is known to reduce the magnitude of the peak rib deflections, injury risk curves developed from SID-IIs FRG data are not appropriate for use with SID-IIs build level D. PMHS injury data from three series of sled tests and one series of whole-body drop tests are paired with thoracic rib deflections from equivalent tests with SID-IIs build level D. Where possible, the rib deflections of SID-IIs build level D were scaled to adjust for differences in impact velocity between the PMHS and SID-IIs tests. Injury risk curves developed by the Mertz-Weber modified median rank method are presented and compared to risk curves developed by other parametric and non-parametric methods.
Experimental Characterization of Gas Turbine Emissions at Simulated Flight Altitude Conditions
NASA Technical Reports Server (NTRS)
Howard, R. P.; Wormhoudt, J. C.; Whitefield, P. D.
1996-01-01
NASA's Atmospheric Effects of Aviation Project (AEAP) is developing a scientific basis for assessment of the atmospheric impact of subsonic and supersonic aviation. A primary goal is to assist assessments of United Nations scientific organizations and, hence, consideration of emissions standards by the International Civil Aviation Organization (ICAO). Engine tests have been conducted at AEDC to fulfill the needs of the AEAP. The purpose of these tests is to obtain a comprehensive database to be used for supplying critical information to the atmospheric research community. It includes: (1) simulated sea-level-static test data as well as simulated altitude data; and (2) intrusive (extractive probe) data as well as non-intrusive (optical techniques) data. A commercial-type bypass engine with aviation fuel was used in this test series. The test matrix was set by parametrically selecting the temperature, pressure, and flow rate at sea-level-static and different altitude conditions to obtain a parametric set of data.
The Environmental Technology Verification report discusses the technology and performance of a gaseous-emissions monitoring system for large, natural-gas-fired internal combustion engines. The device tested is the Parametric Emissions Monitoring System (PEMS) manufactured by ANR ...
Toward an Empirically-Based Parametric Explosion Spectral Model
2011-09-01
...Site (NNSS, formerly the Nevada Test Site) with data from explosions at the Semipalatinsk Test Site recorded at the Borovoye Geophysical Observatory (BRV). The BRV data archive ...explosions at the Semipalatinsk Test Site of the former Soviet Union (Figure 4). As an example, we plot the regional phase spectra of one of
Out-of-core Evaluations of Uranium Nitride-fueled Converters
NASA Technical Reports Server (NTRS)
Shimada, K.
1972-01-01
Two uranium nitride fueled converters were tested parametrically for their initial characterization and are currently being life-tested out of core. The test methods employed for the parametric and diagnostic measurements during the life tests, together with the test results, are presented. One converter with a rhenium emitter had an initial output power density of 6.9 W/sq cm at a black body emitter temperature of 1900 K. The power density remained unchanged for the first 1000 hr of the life test but degraded by nearly 50% during the following 1000 hr. Electrode work function measurements indicated that the uranium fuel was diffusing out of the 0.635-mm emitter clad. The other converter with a tungsten emitter had an initial output power density of 2.2 W/sq cm at 1900 K and a power density of 3.9 W/sq cm at 4300 hr. The power density suddenly degraded within 20 hr to practically zero output at 4735 hr.
Determination of Acreage Thermal Protection Foam Loss From Ice and Foam Impacts
NASA Technical Reports Server (NTRS)
Carney, Kelly S.; Lawrence, Charles
2015-01-01
A parametric study was conducted to establish Thermal Protection System (TPS) loss from foam and ice impact conditions similar to what might occur on the Space Launch System. This study was based upon the large amount of testing and analysis that was conducted with both ice and foam debris impacts on TPS acreage foam for the Space Shuttle Project External Tank. Test verified material models and modeling techniques that resulted from Space Shuttle related testing were utilized for this parametric study. Parameters varied include projectile mass, impact velocity and impact angle (5 degree and 10 degree impacts). The amount of TPS acreage foam loss as a result of the various impact conditions is presented.
NASA Astrophysics Data System (ADS)
Machiwal, Deepesh; Kumar, Sanjay; Dayal, Devi
2016-05-01
This study aimed at characterization of rainfall dynamics in a hot arid region of Gujarat, India by employing time-series modeling techniques and a sustainability approach. Five characteristics, i.e., normality, stationarity, homogeneity, presence/absence of trend, and persistence of the 34-year (1980-2013) annual rainfall time series of ten stations were identified/detected by applying multiple parametric and non-parametric statistical tests. Furthermore, the study involves the novelty of proposing a sustainability concept for evaluating rainfall time series and demonstrated the concept, for the first time, by identifying the most sustainable rainfall series following a reliability (Ry), resilience (Re), and vulnerability (Vy) approach. Box-whisker plots, normal probability plots, and histograms indicated that the annual rainfall of Mandvi and Dayapar stations is relatively more positively skewed and non-normal compared with that of other stations, which is due to the presence of severe outliers and extremes. Results of the Shapiro-Wilk test and Lilliefors test revealed that the annual rainfall series of all stations deviated significantly from the normal distribution. Two parametric t tests and the non-parametric Mann-Whitney test indicated significant non-stationarity in the annual rainfall of Rapar station, where the rainfall was also found to be non-homogeneous based on the results of four parametric homogeneity tests. Four trend tests indicated significantly increasing rainfall trends at Rapar and Gandhidham stations. The autocorrelation analysis suggested the presence of statistically significant persistence in the rainfall series of Bhachau (3-year time lag), Mundra (1- and 9-year time lags), Nakhatrana (9-year time lag), and Rapar (3- and 4-year time lags). 
Results of the sustainability approach indicated that the annual rainfall of Mundra and Naliya stations (Ry = 0.50 and 0.44; Re = 0.47 and 0.47; Vy = 0.49 and 0.46, respectively) is the most sustainable and dependable compared with that of other stations. The highest values of the sustainability index at Mundra (0.120) and Naliya (0.112) stations confirmed the earlier findings of the Ry-Re-Vy approach. In general, annual rainfall of the study area is less reliable, less resilient, and moderately vulnerable, which emphasizes the need for developing suitable strategies for managing water resources of the area on a sustainable basis. Finally, it is recommended that multiple statistical tests (at least two) should be used in time-series modeling for making reliable decisions. Moreover, the methodology and findings of the sustainability concept in rainfall time series can easily be adopted in other arid regions of the world.
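A minimal sketch of the kind of test battery applied above, run on a hypothetical 34-year annual rainfall series (the station data are not reproduced here): Shapiro-Wilk for normality, Mann-Whitney on the two halves as a crude check of stationarity in location, and Kendall's tau against time, the rank correlation underlying the Mann-Kendall trend test:

```python
import numpy as np
from scipy.stats import shapiro, mannwhitneyu, kendalltau

rng = np.random.default_rng(42)
years = np.arange(1980, 2014)            # a 34-year series, as in the study
# Hypothetical annual rainfall (mm): an upward trend plus skewed (gamma) noise.
rain = 300.0 + 5.0 * (years - 1980) + rng.gamma(shape=2.0, scale=25.0, size=years.size)

# 1. Normality: Shapiro-Wilk.
w_stat, p_norm = shapiro(rain)

# 2. Stationarity in location: Mann-Whitney U on the first vs. second half.
half = years.size // 2
u_stat, p_stat = mannwhitneyu(rain[:half], rain[half:])

# 3. Trend: Kendall's tau of rainfall against time.
tau, p_trend = kendalltau(years, rain)

print(f"Shapiro-Wilk p = {p_norm:.3f}")
print(f"Mann-Whitney (halves) p = {p_stat:.4f}")
print(f"Kendall tau = {tau:.2f}, p = {p_trend:.5f}")
```

With a built-in trend, both the half-series comparison and Kendall's tau flag the series, which is the pattern reported for the Rapar and Gandhidham stations.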
Sarkar, Rajarshi
2013-07-01
The validity of renal function tests as a diagnostic tool depends substantially on the Biological Reference Interval (BRI) of urea. Establishment of the BRI of urea is difficult, partly because the exclusion criteria for selection of reference data are quite rigid and partly due to compartmentalization considerations regarding the age and sex of the reference individuals. Moreover, construction of the Biological Reference Curve (BRC) of urea is imperative to highlight the partitioning requirements. This a priori study examines data collected by measuring serum urea of 3202 age- and sex-matched individuals, aged between 1 and 80 years, by a kinetic UV Urease/GLDH method on a Roche Cobas 6000 auto-analyzer. A Mann-Whitney U test of the reference data confirmed the partitioning requirement by both age and sex. Further statistical analysis revealed the incompatibility of the data with a proposed parametric model. Hence the data were analysed non-parametrically. The BRI was found to be identical for both sexes until the 2nd decade, and the BRI for males increased progressively from the 6th decade onwards. Four non-parametric models were postulated for construction of the BRC: Gaussian kernel, double kernel, local mean, and local constant, of which the last one generated the best-fitting curves. Clinical decision making should become easier and the diagnostic implications of renal function tests more meaningful if this BRI is followed and the BRC is used as a desktop tool in conjunction with similar data for serum creatinine.
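At its core, the non-parametric approach to a reference interval amounts to taking empirical percentiles of the reference data within a partition. A minimal sketch, on hypothetical right-skewed urea values rather than the study's data:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical serum urea values (mg/dL) for one age/sex partition;
# right-skewed, as biochemical analytes often are. Not the study data.
urea = rng.lognormal(mean=3.3, sigma=0.25, size=400)

# Non-parametric BRI: the central 95% of the reference distribution,
# i.e. the 2.5th and 97.5th empirical percentiles.
lower, upper = np.percentile(urea, [2.5, 97.5])
print(f"non-parametric BRI: {lower:.1f} - {upper:.1f} mg/dL")
```

No distributional model is assumed, which is exactly why the approach survives when a parametric model is rejected, as it was here.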
Hock, Sia Chong; Constance, Neo Xue Rui; Wah, Chan Lai
2012-01-01
Pharmaceutical products are generally subjected to end-product batch testing as a means of quality control. Due to the inherent limitations of conventional batch testing, this is not the most ideal approach for determining the pharmaceutical quality of the finished dosage form. In the case of terminally sterilized parenteral products, the limitations of conventional batch testing have been successfully addressed with the application of parametric release (the release of a product based on control of process parameters instead of batch sterility testing at the end of the manufacturing process). Consequently, there has been an increasing interest in applying parametric release to other pharmaceutical dosage forms, beyond terminally sterilized parenteral products. For parametric release to be possible, manufacturers must be capable of designing quality into the product, monitoring the manufacturing processes, and controlling the quality of intermediates and finished products in real-time. Process analytical technology (PAT) has been thought to be capable of contributing to these prerequisites. It is believed that the appropriate use of PAT tools can eventually lead to the possibility of real-time release of other pharmaceutical dosage forms, by-passing the need for end-product batch testing. Hence, this literature review attempts to present the basic principles of PAT, introduce the various PAT tools that are currently available, present their recent applications to pharmaceutical processing, and explain the potential benefits that PAT can bring to conventional ways of processing and quality assurance of pharmaceutical products. Last but not least, current regulations governing the use of PAT and the manufacturing challenges associated with PAT implementation are also discussed.
Resonant dampers for parametric instabilities in gravitational wave detectors
NASA Astrophysics Data System (ADS)
Gras, S.; Fritschel, P.; Barsotti, L.; Evans, M.
2015-10-01
Advanced gravitational wave interferometric detectors will operate at their design sensitivity with nearly 1 MW of laser power stored in the arm cavities. Such large power may lead to the uncontrolled growth of acoustic modes in the test masses due to the transfer of optical energy to the mechanical modes of the arm cavity mirrors. These parametric instabilities have the potential to significantly compromise detector performance and control. Here we present the design of "acoustic mode dampers" that use the piezoelectric effect to reduce the coupling of optical to mechanical energy. Experimental measurements carried out on an Advanced LIGO-like test mass have shown a tenfold reduction in the amplitude of several mechanical modes, suggesting that this technique can greatly mitigate the impact of parametric instabilities in advanced detectors.
NASA Astrophysics Data System (ADS)
Linden, Sebastian; Virey, Jean-Marc
2008-07-01
We test the robustness and flexibility of the Chevallier-Polarski-Linder (CPL) parametrization of the dark energy equation of state, w(z) = w0 + wa z/(1+z), in recovering a four-parameter steplike fiducial model. We constrain the parameter space region of the underlying fiducial model where the CPL parametrization offers a reliable reconstruction. It turns out that non-negligible biases leak into the results for recent (z<2.5) rapid transitions, but that CPL yields a good reconstruction in all other cases. The presented analysis is performed with supernova Ia data as forecast for a space mission like SNAP/JDEM, combined with future expectations for the cosmic microwave background shift parameter R and the baryonic acoustic oscillation parameter A.
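A toy version of this reconstruction test can be sketched by fitting the CPL form directly to samples of a steplike w(z). The paper instead fits to simulated supernova observables, and the step parameters below are illustrative, but the qualitative result matches: a slow transition is recovered well, a recent rapid one leaves a non-negligible residual:

```python
import numpy as np
from scipy.optimize import curve_fit

def w_cpl(z, w0, wa):
    # CPL equation of state: w(z) = w0 + wa * z / (1 + z)
    return w0 + wa * z / (1.0 + z)

def w_step(z, wi, wf, zt, dz):
    # Hypothetical 4-parameter steplike fiducial model: w runs from
    # wf at low z to wi at high z, transition of width dz around zt.
    return wf + 0.5 * (wi - wf) * (1.0 + np.tanh((z - zt) / dz))

z = np.linspace(0.0, 1.7, 200)   # SN-like redshift range

def max_cpl_residual(zt, dz):
    w_true = w_step(z, -0.6, -1.0, zt, dz)
    (w0, wa), _ = curve_fit(w_cpl, z, w_true, p0=(-1.0, 0.0))
    return np.max(np.abs(w_cpl(z, w0, wa) - w_true))

resid_rapid = max_cpl_residual(zt=1.0, dz=0.05)  # recent, rapid transition
resid_slow = max_cpl_residual(zt=0.5, dz=1.5)    # slow transition
print(f"max |w_CPL - w_true|: rapid = {resid_rapid:.3f}, slow = {resid_slow:.3f}")
```

Since CPL is linear in (w0, wa), the fit is a plain least-squares problem; the residual measures how much of the fiducial shape the two-parameter form simply cannot express.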
Parametric, nonparametric and parametric modelling of a chaotic circuit time series
NASA Astrophysics Data System (ADS)
Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.
2000-09-01
The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and model output are caused by an inappropriate model or by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and, if they are rejected, to suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit where we obtain an extremely accurate reconstruction of the observed attractor.
Parametric Modeling for Fluid Systems
NASA Technical Reports Server (NTRS)
Pizarro, Yaritzmar Rosario; Martinez, Jonathan
2013-01-01
Fluid Systems involves different projects that require parametric modeling, i.e., a model that maintains consistent relationships between elements as it is manipulated. One of these projects is the Neo Liquid Propellant Testbed, which is part of Rocket U. As part of Rocket U (Rocket University), engineers at NASA's Kennedy Space Center in Florida have the opportunity to develop critical flight skills as they design, build, and launch high-powered rockets. To build the Neo testbed, hardware from the Space Shuttle Program was repurposed. Modeling for Neo included fittings, valves, frames, and tubing, among others. These models help in the review process, to make sure regulations are being followed. Another fluid systems project that required modeling is Plant Habitat's TCUI test project. Plant Habitat is a plan to develop a large growth chamber to learn the effects of long-duration microgravity exposure on plants in space. Work for this project included the design and modeling of a duct vent for flow testing. Parametric modeling for these projects was done using Creo Parametric 2.0.
EMISSION TEST REPORT, OMSS FIELD TEST ON CARBON INJECTION FOR MERCURY CONTROL
The report discusses results of a parametric evaluation of powdered activated carbon for control of mercury (Hg) emissions from a municipal waste combustor (MWC) equipped with a lime spray dryer absorber/fabric filter (SD/FF). The primary test objectives were to evaluate the effe...
Sample Size Determination for One- and Two-Sample Trimmed Mean Tests
ERIC Educational Resources Information Center
Luh, Wei-Ming; Olejnik, Stephen; Guo, Jiin-Huarng
2008-01-01
Formulas to determine the necessary sample sizes for parametric tests of group comparisons are available from several sources and appropriate when population distributions are normal. However, in the context of nonnormal population distributions, researchers recommend Yuen's trimmed mean test, but formulas to determine sample sizes have not been…
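For the test itself (as opposed to the sample-size formulas the abstract addresses), Yuen's trimmed mean test is available in recent SciPy via the trim argument of ttest_ind (SciPy >= 1.7). A quick sketch on synthetic heavy-tailed samples, which are the setting where the trimmed test is recommended:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
# Heavy-tailed (Student t, df=3) samples, illustrative only:
# the kind of nonnormal data for which Yuen's test is advised.
a = rng.standard_t(df=3, size=40)
b = rng.standard_t(df=3, size=40) + 1.0   # shifted by 1

# Yuen's test: Welch-type t statistic on 20%-trimmed means.
t_yuen, p_yuen = ttest_ind(a, b, equal_var=False, trim=0.2)

# Ordinary Welch t-test on the untrimmed means, for comparison.
t_std, p_std = ttest_ind(a, b, equal_var=False)

print(f"Yuen 20% trimmed: t = {t_yuen:.2f}, p = {p_yuen:.4f}")
print(f"Welch (untrimmed): t = {t_std:.2f}, p = {p_std:.4f}")
```

Trimming discards the most extreme 20% in each tail before computing the means and the Winsorized variances, which stabilizes the statistic under heavy tails.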
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating the reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Non-parametric and distribution-free techniques may offer a new and accurate way of modeling failure times and other project data to provide earlier and more accurate estimates of system reliability.
A non-parametric consistency test of the ΛCDM model with Planck CMB data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aghamousa, Amir; Shafieloo, Arman; Hamann, Jan, E-mail: amir@aghamousa.com, E-mail: jan.hamann@unsw.edu.au, E-mail: shafieloo@kasi.re.kr
Non-parametric reconstruction methods, such as Gaussian process (GP) regression, provide a model-independent way of estimating an underlying function and its uncertainty from noisy data. We demonstrate how GP reconstruction can be used as a consistency test between a given data set and a specific model by looking for structures in the residuals of the data with respect to the model's best-fit. Applying this formalism to the Planck temperature and polarisation power spectrum measurements, we test their global consistency with the predictions of the base ΛCDM model. Our results do not show any serious inconsistencies, lending further support to the interpretation of the base ΛCDM model as cosmology's gold standard.
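A toy version of the idea, with a hand-rolled GP in NumPy rather than the authors' pipeline, and synthetic residuals in place of the Planck band powers: if the model fits, the residuals are structureless noise and the GP posterior mean stays near zero; a coherent feature hidden in the residuals is reconstructed and signals tension.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for "data minus model best-fit" residuals.
x = np.linspace(0.0, 10.0, 60)
noise = 0.3
resid_ok = rng.normal(0.0, noise, x.size)                            # consistent
resid_bad = resid_ok + 0.6 * np.exp(-0.5 * ((x - 5.0) / 0.8) ** 2)   # hidden bump

def gp_mean(x, y, sigma_n, amp=1.0, ell=1.0):
    # GP regression with an RBF kernel: posterior mean at the inputs,
    # m = K (K + sigma_n^2 I)^-1 y.
    K = amp**2 * np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2)
    return K @ np.linalg.solve(K + sigma_n**2 * np.eye(x.size), y)

# Crude consistency statistic: the amount of smooth structure the GP
# reconstructs from the residuals, relative to the noise level.
stat_ok = np.max(np.abs(gp_mean(x, resid_ok, noise))) / noise
stat_bad = np.max(np.abs(gp_mean(x, resid_bad, noise))) / noise
print(f"consistent residuals: {stat_ok:.2f}   residuals with feature: {stat_bad:.2f}")
```

The statistic here is deliberately simple; the paper calibrates its test against simulations, but the contrast between the two cases is the same mechanism.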
Nonparametric tests for equality of psychometric functions.
García-Pérez, Miguel A; Núñez-Antón, Vicente
2017-12-07
Many empirical studies measure psychometric functions (curves describing how observers' performance varies with stimulus magnitude) because these functions capture the effects of experimental conditions. To assess these effects, parametric curves are often fitted to the data and comparisons are carried out by testing for equality of mean parameter estimates across conditions. This approach is parametric and, thus, vulnerable to violations of the implied assumptions. Furthermore, testing for equality of means of parameters may be misleading: Psychometric functions may vary meaningfully across conditions on an observer-by-observer basis with no effect on the mean values of the estimated parameters. Alternative approaches to assess equality of psychometric functions per se are thus needed. This paper compares three nonparametric tests that are applicable in all situations of interest: The existing generalized Mantel-Haenszel test, a generalization of the Berry-Mielke test that was developed here, and a split variant of the generalized Mantel-Haenszel test also developed here. Their statistical properties (accuracy and power) are studied via simulation and the results show that all tests are indistinguishable as to accuracy but they differ non-uniformly as to power. Empirical use of the tests is illustrated via analyses of published data sets and practical recommendations are given. The computer code in MATLAB and R to conduct these tests is available as Electronic Supplemental Material.
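Of the three tests compared above, the generalized (Cochran-)Mantel-Haenszel statistic is the simplest to sketch. The version below uses no continuity correction and hypothetical correct/incorrect counts for two conditions at four stimulus levels (the strata); it is a minimal NumPy implementation, not the authors' code:

```python
import numpy as np
from scipy.stats import chi2

def mantel_haenszel(tables):
    # Cochran-Mantel-Haenszel chi-square (1 df, no continuity
    # correction) for a stack of 2x2 tables, one per stratum.
    t = np.asarray(tables, dtype=float)            # shape (K, 2, 2)
    n = t.sum(axis=(1, 2))                         # stratum totals
    row1 = t[:, 0, :].sum(axis=1)                  # first-row margins
    col1 = t[:, :, 0].sum(axis=1)                  # first-column margins
    a = t[:, 0, 0]                                 # observed (1,1) cells
    e = row1 * col1 / n                            # expected (1,1) cells
    var = row1 * (n - row1) * col1 * (n - col1) / (n**2 * (n - 1))
    stat = (a - e).sum()**2 / var.sum()
    return stat, chi2.sf(stat, df=1)

# Hypothetical data: rows = condition A/B, columns = correct/incorrect,
# one 2x2 table per stimulus level.
tables = [[[ 5, 15], [ 4, 16]],
          [[ 9, 11], [12,  8]],
          [[14,  6], [18,  2]],
          [[19,  1], [20,  0]]]
stat, p = mantel_haenszel(tables)
print(f"GMH chi-square = {stat:.3f}, p = {p:.3f}")
```

Stratifying by stimulus level is what lets the test compare whole psychometric functions rather than pooled proportions.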
Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif
2014-12-01
A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were collectively analyzed in Uludag University in Bursa using Abbott reagents and analyzer. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with the RIs by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences of reference values among seven regions were significant in none of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria excluding individuals with BMI >28 kg/m2. Ranges of RIs by non-parametric method were wider than those by parametric method especially for those analytes affected by BMI. With the lack of regional differences and the well-standardized status of test results, the RIs derived from this nationwide study can be used for the entire Turkish population.
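The two derivations compared above can be sketched as follows. The skewed sample is hypothetical, not the Turkish study data, and the Box-Cox step is the plain SciPy transform rather than the modified formula the study employed:

```python
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

rng = np.random.default_rng(11)
# Hypothetical right-skewed analyte values (e.g. triglycerides).
x = rng.lognormal(mean=0.2, sigma=0.5, size=1000)

# Parametric RI: Box-Cox transform to near-normality, take
# mean +/- 1.96 SD, then back-transform to the original scale.
y, lam = boxcox(x)
mu, sd = y.mean(), y.std(ddof=1)
lo_p = inv_boxcox(mu - 1.96 * sd, lam)
hi_p = inv_boxcox(mu + 1.96 * sd, lam)

# Non-parametric RI: the 2.5th / 97.5th empirical percentiles.
lo_n, hi_n = np.percentile(x, [2.5, 97.5])

print(f"parametric RI    : {lo_p:.2f} - {hi_p:.2f}")
print(f"non-parametric RI: {lo_n:.2f} - {hi_n:.2f}")
```

When the transformed data are close to normal, the two intervals agree well; for analytes with residual skew (e.g. the BMI-affected ones above), the percentile-based interval tends to be wider, as the study reports.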
A Nonparametric Geostatistical Method For Estimating Species Importance
Andrew J. Lister; Rachel Riemann; Michael Hoppus
2001-01-01
Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non-normal distributions violate the assumptions of analyses in which test statistics are...
Test data from small solid propellant rocket motor plume measurements (FA-21)
NASA Technical Reports Server (NTRS)
Hair, L. M.; Somers, R. E.
1976-01-01
A program is described for obtaining a reliable, parametric set of measurements in the exhaust plumes of solid propellant rocket motors. Plume measurements included pressures, temperatures, forces, heat transfer rates, particle sampling, and high-speed movies. Approximately 210,000 digital data points and 15,000 movie frames were acquired. Measurements were made at points in the plumes via rake-mounted probes, and on the surface of a large plate impinged by the exhaust plume. Parametric variations were made in pressure altitude, propellant aluminum loading, impinged plate incidence angle and distance from nozzle exit to plate or rake. Reliability was incorporated by continual use of repeat runs. The test setup of the various hardware items is described along with an account of test procedures. Test results and data accuracy are discussed. Format of the data presentation is detailed. Complete data are included in the appendix.
Inhibition of Orthopaedic Implant Infections by Immunomodulatory Effects of Host Defense Peptides
2014-12-01
significance was determined by t-tests or by one-way analysis of variance (ANOVA) followed by Bonferroni post hoc tests in experiments with multiple...groups. Non-parametric Mann-Whitney tests, Kruskal-Wallis ANOVA followed by Newman-Keuls post hoc tests, or van Elteren's two-way tests were applied to...in D, and black symbols in A), statistical analysis was by one-way ANOVA followed by Bonferroni versus control post hoc tests. Otherwise, statistical
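The parametric/nonparametric pairings named in this excerpt (t-test vs. Mann-Whitney for two groups, ANOVA vs. Kruskal-Wallis for several) can be sketched on synthetic data; the post hoc corrections are omitted here, and all group means and sizes are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical measurements: the treated group is clearly shifted.
control = rng.normal(10.0, 2.0, size=50)
treated = rng.normal(13.0, 2.0, size=50)
extra = rng.normal(10.5, 2.0, size=50)

# Two groups: parametric t-test, or Mann-Whitney when normality is doubtful.
t_stat, t_p = stats.ttest_ind(control, treated)
u_stat, u_p = stats.mannwhitneyu(control, treated)

# More than two groups: one-way ANOVA, or its rank-based Kruskal-Wallis analogue.
f_stat, f_p = stats.f_oneway(control, treated, extra)
h_stat, h_p = stats.kruskal(control, treated, extra)
```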
A Cartesian parametrization for the numerical analysis of material instability
Mota, Alejandro; Chen, Qiushi; Foulk, III, James W.; ...
2016-02-25
We examine four parametrizations of the unit sphere in the context of material stability analysis by means of the singularity of the acoustic tensor. We then propose a Cartesian parametrization for vectors that lie in a cube of side length two and use these vectors in lieu of unit normals to test for the loss of the ellipticity condition. This parametrization is then used to construct a tensor akin to the acoustic tensor. It is shown that both of these tensors become singular at the same time and in the same planes in the presence of a material instability. Furthermore, the performance of the Cartesian parametrization is compared against the other parametrizations, with the results of these comparisons showing that in general, the Cartesian parametrization is more robust and more numerically efficient than the others.
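A toy version of the ellipticity check can be sketched as follows, assuming an isotropic solid with hypothetical Lamé constants, for which the acoustic tensor is A(n) = (λ+μ) n⊗n + μI and loss of ellipticity never occurs, so the determinant scanned over all directions stays bounded away from zero.

```python
import numpy as np

# Hypothetical isotropic Lamé constants; ellipticity holds as long as the
# acoustic tensor A(n) = (lam + mu) n⊗n + mu I remains nonsingular.
lam, mu = 1.0, 0.5

def acoustic_det(n):
    n = n / np.linalg.norm(n)
    A = (lam + mu) * np.outer(n, n) + mu * np.eye(3)
    return np.linalg.det(A)

# Cartesian parametrization: sample points on the faces of the cube
# [-1, 1]^3 and use them in lieu of unit normals.
dets = []
u = np.linspace(-1.0, 1.0, 21)
for a in u:
    for b in u:
        for face in ([1.0, a, b], [a, 1.0, b], [a, b, 1.0]):
            dets.append(acoustic_det(np.array(face)))

# For a stable (elliptic) material the determinant never vanishes;
# for isotropy it equals (lam + 2 mu) * mu**2 in every direction.
min_det = min(dets)
```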
Zhang, Kai; Cao, Libo; Wang, Yulong; Hwang, Eunjoo; Reed, Matthew P; Forman, Jason; Hu, Jingwen
2017-10-01
Field data analyses have shown that obesity significantly increases the occupant injury risks in motor vehicle crashes, but the injury assessment tools for people with obesity are largely lacking. The objectives of this study were to use a mesh morphing method to rapidly generate parametric finite element models with a wide range of obesity levels and to evaluate their biofidelity against impact tests using postmortem human subjects (PMHS). Frontal crash tests using three PMHS seated in a vehicle rear seat compartment with body mass index (BMI) from 24 to 40 kg/m² were selected. To develop the human models matching the PMHS geometry, statistical models of external body shape, rib cage, pelvis, and femur were applied to predict the target geometry using age, sex, stature, and BMI. A mesh morphing method based on radial basis functions was used to rapidly morph a baseline human model into the target geometry. The model-predicted body excursions and injury measures were compared to the PMHS tests. Comparisons of occupant kinematics and injury measures between the tests and simulations showed reasonable correlations across the wide range of BMI levels. The parametric human models have the capability to account for the obesity effects on the occupant impact responses and injury risks. © 2017 The Obesity Society.
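The radial-basis-function morphing step can be sketched with scipy's RBFInterpolator: landmark displacements from a baseline to a target geometry are interpolated smoothly onto all mesh nodes. The landmarks, the uniform 1.2× target scaling, and the mesh nodes below are all hypothetical stand-ins for the study's statistical geometry targets.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical baseline landmarks and their positions on the target geometry.
baseline_landmarks = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.],
                               [0., 0., 1.], [1., 1., 1.]])
target_landmarks = baseline_landmarks * 1.2  # e.g. a larger occupant

# RBF interpolant of the landmark displacement field.
displacement = RBFInterpolator(baseline_landmarks,
                               target_landmarks - baseline_landmarks,
                               kernel='thin_plate_spline')

# Morph every node of the baseline mesh by the interpolated displacement.
mesh_nodes = np.array([[0.5, 0.5, 0.5], [0.25, 0.1, 0.9]])
morphed = mesh_nodes + displacement(mesh_nodes)
```

Because the thin-plate-spline interpolant includes a degree-1 polynomial term, a purely affine target (like the uniform scaling here) is reproduced exactly at every node.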
Aryal, Madhava P; Nagaraja, Tavarekere N; Brown, Stephen L; Lu, Mei; Bagher-Ebadian, Hassan; Ding, Guangliang; Panda, Swayamprava; Keenan, Kelly; Cabral, Glauber; Mikkelsen, Tom; Ewing, James R
2014-10-01
The distribution of dynamic contrast-enhanced MRI (DCE-MRI) parametric estimates in a rat U251 glioma model was analyzed. Using Magnevist as contrast agent (CA), 17 nude rats implanted with U251 cerebral glioma were studied by DCE-MRI twice in a 24 h interval. A data-driven analysis selected one of three models to estimate either (1) plasma volume (vp), (2) vp and forward volume transfer constant (K(trans)) or (3) vp, K(trans) and interstitial volume fraction (ve), constituting Models 1, 2 and 3, respectively. CA distribution volume (VD) was estimated in Model 3 regions by Logan plots. Regions of interest (ROIs) were selected by model. In the Model 3 ROI, descriptors of parameter distributions--mean, median, variance and skewness--were calculated and compared between the two time points for repeatability. All distributions of parametric estimates in Model 3 ROIs were positively skewed. Test-retest differences between population summaries for any parameter were not significant (p ≥ 0.10; Wilcoxon signed-rank and paired t tests). These and similar measures of parametric distribution and test-retest variance from other tumor models can be used to inform the choice of biomarkers that best summarize tumor status and treatment effects. Copyright © 2014 John Wiley & Sons, Ltd.
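The distribution summaries and paired test-retest comparisons used in the study can be sketched as follows, on synthetic positively skewed "estimates" for 17 subjects measured twice; all numbers are hypothetical, not the DCE-MRI data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic, positively skewed parameter estimates for 17 subjects,
# measured twice 24 h apart with small multiplicative test-retest noise.
day1 = rng.lognormal(mean=-2.0, sigma=0.5, size=17)
day2 = day1 * rng.normal(1.0, 0.05, size=17)

# Descriptors of the parameter distribution within the ROI.
summary = {"mean": day1.mean(), "median": np.median(day1),
           "variance": day1.var(ddof=1), "skewness": stats.skew(day1)}

# Test-retest repeatability: Wilcoxon signed-rank and paired t-test.
w_stat, w_p = stats.wilcoxon(day1, day2)
t_stat, t_p = stats.ttest_rel(day1, day2)
```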
A Nonparametric K-Sample Test for Equality of Slopes.
ERIC Educational Resources Information Center
Penfield, Douglas A.; Koffler, Stephen L.
1986-01-01
The development of a nonparametric K-sample test for equality of slopes using Puri's generalized L statistic is presented. The test is recommended when the assumptions underlying the parametric model are violated. This procedure replaces original data with either ranks (for data with heavy tails) or normal scores (for data with light tails).…
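The two score substitutions described above can be sketched directly: ranks for heavy-tailed data, and van der Waerden-type normal scores (here via the rank/(n+1) quantile convention) for light-tailed data. The sample values are invented.

```python
import numpy as np
from scipy import stats

x = np.array([3.1, 12.7, 5.4, 4.8, 250.0, 6.2])  # heavy-tailed sample

# Ranks: the recommended replacement for data with heavy tails.
ranks = stats.rankdata(x)

# Normal (van der Waerden) scores: recommended for data with light tails.
n = len(x)
normal_scores = stats.norm.ppf(ranks / (n + 1))
```

A rank-based test statistic computed on either set of scores is then distribution-free under the null hypothesis.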
Performance of DIMTEST-and NOHARM-Based Statistics for Testing Unidimensionality
ERIC Educational Resources Information Center
Finch, Holmes; Habing, Brian
2007-01-01
This Monte Carlo study compares the ability of the parametric bootstrap version of DIMTEST with three goodness-of-fit tests calculated from a fitted NOHARM model to detect violations of the assumption of unidimensionality in testing data. The effectiveness of the procedures was evaluated for different numbers of items, numbers of examinees,…
A Wind-Tunnel Parametric Investigation of Tiltrotor Whirl-Flutter Stability Boundaries
NASA Technical Reports Server (NTRS)
Piatak, David J.; Kvaternik, Raymond G.; Nixon, Mark W.; Langston, Chester W.; Singleton, Jeffrey D.; Bennett, Richard L.; Brown, Ross K.
2001-01-01
A wind-tunnel investigation of tiltrotor whirl-flutter stability boundaries has been conducted on a 1/5-size semispan tiltrotor model known as the Wing and Rotor Aeroelastic Test System (WRATS) in the NASA-Langley Transonic Dynamics Tunnel as part of a joint NASA/Army/Bell Helicopter Textron, Inc. (BHTI) research program. The model was first developed by BHTI as part of the JVX (V-22) research and development program in the 1980s and was recently modified to incorporate a hydraulically-actuated swashplate control system for use in active controls research. The modifications have changed the model's pylon mass properties sufficiently to warrant testing to re-establish its baseline stability boundaries. A parametric investigation of the effect of rotor design variables on stability was also conducted. The model was tested in both the on-downstop and off-downstop configurations, at cruise flight and hover rotor rotational speeds, and in both air and heavy gas (R-134a) test mediums. Heavy gas testing was conducted to quantify Mach number compressibility effects on tiltrotor stability. Experimental baseline stability boundaries in air are presented with comparisons to results from parametric variations of rotor pitch-flap coupling and control system stiffness. Increasing the rotor pitch-flap coupling (δ3 more negative) was found to have a destabilizing effect, while a reduction in control system stiffness was found to have little effect on whirl-flutter stability. Results indicate that testing in R-134a, and thus matching full-scale tip Mach number, has a destabilizing effect, which demonstrates that whirl-flutter stability boundaries in air are unconservative.
NASA Technical Reports Server (NTRS)
Perry, Boyd, III; Dunn, H. J.; Sandford, Maynard C.
1988-01-01
Nominal roll control laws were designed, implemented, and tested on an aeroelastically-scaled free-to-roll wind-tunnel model of an advanced fighter configuration. The tests were performed in the NASA Langley Transonic Dynamics Tunnel. A parametric study of the nominal roll control system was conducted. This parametric study determined possible control system gain variations which yielded identical closed-loop stability (roll mode pole location) and identical roll response but different maximum control-surface deflections. Comparison of analytical predictions with wind-tunnel results was generally very good.
Diode step stress program, JANTX1N5614
NASA Technical Reports Server (NTRS)
1978-01-01
The reliability of switching diode JANTX1N5614 was tested. The effect of power/temperature step stress on the diode was determined. Control sample units were maintained for verification of the electrical parametric testing. Results are reported.
NASA Technical Reports Server (NTRS)
Tornabene, Robert
2005-01-01
In pulse detonation engines, the potential exists for gas pulses from the combustor to travel upstream and adversely affect the inlet performance of the engine. In order to determine the effect of these high frequency pulses on the inlet performance, an air pulsation valve was developed to provide air pulses downstream of a supersonic parametric inlet test section. The purpose of this report is to document the design and characterization tests that were performed on a pulsation valve that was tested at the NASA Glenn Research Center 1x1 Supersonic Wind Tunnel (SWT) test facility. The high air flow pulsation valve design philosophy and analyses performed are discussed and characterization test results are presented. The pulsation valve model was devised based on the concept of using a free spinning ball valve driven from a variable speed electric motor to generate air flow pulses at preset frequencies. In order to deliver the proper flow rate, the flow port was contoured to maximize flow rate and minimize pressure drop. To obtain sharp pressure spikes the valve flow port was designed to be as narrow as possible to minimize port dwell time.
NASA Technical Reports Server (NTRS)
Washburn, David A.; Rumbaugh, Duane M.
1992-01-01
Nonhuman primates provide useful models for studying a variety of medical, biological, and behavioral topics. Four years of joystick-based automated testing of monkeys using the Language Research Center's Computerized Test System (LRC-CTS) are examined to derive hints and principles for comparable testing with other species - including humans. The results of multiple parametric studies are reviewed, and reliability data are presented to reveal the surprises and pitfalls associated with video-task testing of performance.
NASA Technical Reports Server (NTRS)
Buchholz, R. E.; Gamble, M.
1972-01-01
This test was run as a continuation of a prior investigation of aerodynamic performance and static stability tests for a parametric space shuttle launch vehicle. The purposes of this test were: (1) to obtain a more complete set of data in the transonic flight region, (2) to investigate new H-0 tank nose shapes and tank diameters, (3) to obtain control effectiveness data for the orbiter at 0 degree incidence and with a smaller diameter H-0 tank, and (4) to determine the effects of varying solid rocket motor-to-H-0 tank gap size. Experimental data were obtained for angles of attack from -10 to +10 degrees and for angles of sideslip from +10 to -10 degrees at Mach numbers ranging from 0.6 to 4.96.
A Nonparametric Approach to Estimate Classification Accuracy and Consistency
ERIC Educational Resources Information Center
Lathrop, Quinn N.; Cheng, Ying
2014-01-01
When cut scores for classifications occur on the total score scale, popular methods for estimating classification accuracy (CA) and classification consistency (CC) require assumptions about a parametric form of the test scores or about a parametric response model, such as item response theory (IRT). This article develops an approach to estimate CA…
Parametric tests of a 40-Ah bipolar nickel-hydrogen battery
NASA Technical Reports Server (NTRS)
Cataldo, R. L.
1986-01-01
A series of tests were performed to characterize battery performance relating to certain operating parameters which include charge current, discharge current, temperature, and pressure. The parameters were varied to confirm battery design concepts and to determine optimal operating conditions.
ERIC Educational Resources Information Center
Thompson, Bruce; Melancon, Janet G.
Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…
ERIC Educational Resources Information Center
Stephens, Torrance; Braithwaite, Harold; Johnson, Larry; Harris, Catrell; Katkowsky, Steven; Troutman, Adewale
2008-01-01
Objective: To examine impact of CVD risk reduction intervention for African-American men in the Atlanta Empowerment Zone (AEZ) designed to target anger management. Design: Wilcoxon Signed-Rank Test was employed as a non-parametric alternative to the t-test for independent samples. This test was employed because the data used in this analysis…
Transistor step stress program for JANTX2N4150
NASA Technical Reports Server (NTRS)
1979-01-01
Reliability analysis of the transistor JANTX2N4150 manufactured by General Semiconductor and Transitron is reported. The discrete devices were subjected to power and temperature step stress tests and then to electrical tests after completing the power/temperature step stress point. Control sample units were maintained for verification of the electrical parametric testing. Results are presented.
Royston, Patrick; Parmar, Mahesh K B
2014-08-07
Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology in real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that when a treatment effect either increases or decreases over time, the joint test can outperform the logrank test in the presence of both patterns of non-proportional hazards. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for the design of the trial retains the tools familiar in the standard methodology based on the logrank test, and extends it to incorporate a joint test of the null hypothesis with power against non-proportional hazards. 
For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
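The familiar logrank design calculation that the proposed method retains can be sketched with Schoenfeld's approximation for the required number of events; the hazard ratio below is illustrative, and the paper's joint test then adds power against non-proportional hazards on top of this design.

```python
from math import log
from scipy.stats import norm

def logrank_events(hr, alpha=0.05, power=0.90, allocation=0.5):
    """Events needed by a two-sided logrank test to detect hazard ratio
    `hr` at the given significance level and power (Schoenfeld's formula)."""
    z_a = norm.ppf(1 - alpha / 2)  # two-sided significance quantile
    z_b = norm.ppf(power)          # power quantile
    return (z_a + z_b) ** 2 / (allocation * (1 - allocation) * log(hr) ** 2)

# Example: detecting HR = 0.75 with 1:1 allocation needs roughly 508 events.
events = logrank_events(hr=0.75)
```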
Vexler, Albert; Tanajian, Hovig; Hutson, Alan D
In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
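Method 1 above, the classical Monte Carlo p-value, can be sketched generically; the toy statistic here (absolute mean of a standard-normal sample) is purely illustrative and unrelated to the density-based empirical likelihood ratio computed by the package.

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_pvalue(t_obs, simulate_null_stat, n_sim=2000):
    """Classical Monte Carlo p-value: the rank of the observed statistic
    among statistics simulated under the null hypothesis."""
    sims = np.array([simulate_null_stat() for _ in range(n_sim)])
    return (1 + np.sum(sims >= t_obs)) / (n_sim + 1)

# Toy statistic: |mean| of a standard-normal sample of size 20.
def null_stat():
    return abs(rng.standard_normal(20).mean())

p = mc_pvalue(t_obs=0.0, simulate_null_stat=null_stat)          # never extreme
p_extreme = mc_pvalue(t_obs=10.0, simulate_null_stat=null_stat)  # always extreme
```

The "+1" in numerator and denominator makes the p-value exact in the sense that it can never be zero, which is the convention the hybrid tabulated/Monte Carlo technique refines.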
Lee, Soomin; Katsuura, Tetsuo; Shimomura, Yoshihiro
2011-01-01
In recent years, a new type of speaker called the parametric speaker has been used to generate highly directional sound, and these speakers are now commercially available. In our previous study, we verified that the burden of the parametric speaker was lower than that of the general speaker for endocrine functions. However, nothing has yet been demonstrated about the effects of the shorter distance than 2.6 m between parametric speakers and the human body. Therefore, we investigated the distance effect on endocrinological function and subjective evaluation. Nine male subjects participated in this study. They completed three consecutive sessions: a 20-min quiet period as a baseline, a 30-min mental task period with general speakers or parametric speakers, and a 20-min recovery period. We measured salivary cortisol and chromogranin A (CgA) concentrations. Furthermore, subjects took the Kwansei-gakuin Sleepiness Scale (KSS) test before and after the task and also a sound quality evaluation test after it. Four experiments, one with a speaker condition (general speaker and parametric speaker), the other with a distance condition (0.3 m and 1.0 m), were conducted, respectively, at the same time of day on separate days. We used three-way repeated measures ANOVA (speaker factor × distance factor × time factor) to examine the effects of the parametric speaker. We found that the endocrinological functions were not significantly different between the speaker condition and the distance condition. The results also showed that the physiological burdens increased with progress in time independent of the speaker condition and distance condition.
Feng, Dai; Cortese, Giuliana; Baumgartner, Richard
2017-12-01
The receiver operating characteristic (ROC) curve is frequently used as a measure of accuracy of continuous markers in diagnostic tests. The area under the ROC curve (AUC) is arguably the most widely used summary index for the ROC curve. Although the small sample size scenario is common in medical tests, a comprehensive study of small sample size properties of various methods for the construction of the confidence/credible interval (CI) for the AUC has been by and large missing in the literature. In this paper, we describe and compare 29 non-parametric and parametric methods for the construction of the CI for the AUC when the number of available observations is small. The methods considered include not only those that have been widely adopted, but also those that have been less frequently mentioned or, to our knowledge, never applied to the AUC context. To compare different methods, we carried out a simulation study with data generated from binormal models with equal and unequal variances and from exponential models with various parameters and with equal and unequal small sample sizes. We found that the larger the true AUC value and the smaller the sample size, the larger the discrepancy among the results of different approaches. When the model is correctly specified, the parametric approaches tend to outperform the non-parametric ones. Moreover, in the non-parametric domain, we found that a method based on the Mann-Whitney statistic is in general superior to the others. We further elucidate potential issues and provide possible solutions, along with general guidance on the CI construction for the AUC when the sample size is small. Finally, we illustrate the utility of different methods through real life examples.
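The Mann-Whitney connection mentioned above gives one of the simplest constructions among those compared: the empirical AUC is the normalized U statistic, and the Hanley-McNeil variance yields a Wald-type CI. The marker values below are invented small samples, and this is only one of the paper's 29 methods.

```python
import numpy as np
from scipy import stats

# Hypothetical small-sample diagnostic marker values.
healthy = np.array([0.8, 1.1, 1.3, 1.7, 2.0, 2.2])
diseased = np.array([1.9, 2.4, 2.8, 3.1, 3.5])

# The empirical AUC equals the normalized Mann-Whitney U statistic.
u, _ = stats.mannwhitneyu(diseased, healthy, alternative='greater')
m, n = len(diseased), len(healthy)
auc = u / (m * n)

# Hanley-McNeil standard error and a Wald-type 95% CI, clipped to [0, 1].
q1 = auc / (2 - auc)
q2 = 2 * auc ** 2 / (1 + auc)
se = np.sqrt((auc * (1 - auc) + (m - 1) * (q1 - auc ** 2)
              + (n - 1) * (q2 - auc ** 2)) / (m * n))
ci = (max(0.0, auc - 1.96 * se), min(1.0, auc + 1.96 * se))
```

With a high true AUC and samples this small, the Wald interval hits the upper bound, which is exactly the small-sample distortion the paper investigates.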
Once-through integral system (OTIS): Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gloudemans, J R
1986-09-01
A scaled experimental facility, designated the once-through integral system (OTIS), was used to acquire post-small break loss-of-coolant accident (SBLOCA) data for benchmarking system codes. OTIS was also used to investigate the application of the Abnormal Transient Operating Guidelines (ATOG) used in the Babcock and Wilcox (B and W) designed nuclear steam supply system (NSSS) during the course of an SBLOCA. OTIS was a single-loop facility with a plant-to-model power scale factor of 1686. OTIS maintained the key elevations, approximate component volumes, and loop flow resistances, and simulated the major component phenomena of a B and W raised-loop nuclear plant. A test matrix consisting of 15 tests divided into four categories was performed. The largest group contained 10 tests and was defined to parametrically obtain an extensive set of plant-typical experimental data for code benchmarking. Parameters such as leak size, leak location, and high-pressure injection (HPI) shut-off head were individually varied. The remaining categories were specified to study the impact of the ATOGs (2 tests), to note the effect of guard heater operation on observed phenomena (2 tests), and to provide a data set for comparison with previous test experience (1 test). A summary of the test results and a detailed discussion of Test 220100 is presented. Test 220100 was the nominal or reference test for the parametric studies. This test was performed with a scaled 10-cm² leak located in the cold leg suction piping.
Observed changes in relative humidity and dew point temperature in coastal regions of Iran
NASA Astrophysics Data System (ADS)
Hosseinzadeh Talaee, P.; Sabziparvar, A. A.; Tabari, Hossein
2012-12-01
The analysis of trends in hydroclimatic parameters and the assessment of their statistical significance have recently received considerable attention in efforts to clarify whether or not the climate is demonstrably changing. In the current study, the parametric linear regression and nonparametric Mann-Kendall tests were applied to detect annual and seasonal trends in the relative humidity (RH) and dew point temperature (Tdew) time series at ten coastal weather stations in Iran during 1966-2005. The serial structure of the data was considered, and significant serial correlations were eliminated using the trend-free pre-whitening method. The results showed that annual RH increased by 1.03 and 0.28 %/decade at the northern and southern coastal regions of the country, respectively, while annual Tdew increased by 0.29 and 0.15°C per decade at the northern and southern regions, respectively. Significant trends were frequent in the Tdew series, but they were observed in only 2 of the 50 RH series. The results showed that the difference between the results of the parametric and nonparametric tests was small, although the parametric test detected larger significant trends in the RH and Tdew time series. Furthermore, the differences between the results of the trend tests were not related to the normality of the statistical distribution.
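The nonparametric side of the comparison can be sketched with a minimal Mann-Kendall trend test (no-ties normal approximation, without the trend-free pre-whitening step used in the study); the annual series below is synthetic.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test: statistic S and two-sided p-value from the
    normal approximation with continuity correction (no ties assumed)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * stats.norm.sf(abs(z))

# Synthetic annual relative-humidity anomalies with an upward trend.
years = np.arange(1966, 2006)
rh = 0.1 * (years - 1966) + np.sin(years)  # trend plus fluctuation
s, p = mann_kendall(rh)
```

A positive S with a small p-value indicates a significant increasing trend, mirroring the Tdew results reported above.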
The influence of intraocular pressure and air jet pressure on corneal contactless tonometry tests.
Simonini, Irene; Pandolfi, Anna
2016-05-01
The air puff is a dynamic contactless tonometer test used in ophthalmology clinical practice to assess the biomechanical properties of the human cornea and the intraocular pressure due to the filling fluids of the eye. The test is controversial, since the dynamic response of the cornea is governed by the interaction of several factors which cannot be discerned within a single measurement. In this study we describe a numerical model of the air puff test, and perform a parametric analysis of the major action parameters (jet pressure and intraocular pressure) to assess their relevance to the mechanical response of a patient-specific cornea. The particular cornea considered here had been treated with laser reprofiling to correct myopia, and the parametric study has been conducted on both the preoperative and postoperative geometries. The material properties of the cornea were obtained by means of an identification procedure that compares the static biomechanical response of preoperative and postoperative corneas under the physiological IOP. The parametric study on the intraocular pressure suggests that the displacement of the cornea's apex can be a reliable indicator for tonometry, while the study on the air jet pressure predicts the outcomes of two or more distinct measurements on the same cornea, which can be used in inverse procedures to estimate the material properties of the tissue. Copyright © 2015 Elsevier Ltd. All rights reserved.
Serum and Plasma Metabolomic Biomarkers for Lung Cancer.
Kumar, Nishith; Shahjaman, Md; Mollah, Md Nurul Haque; Islam, S M Shahinul; Hoque, Md Aminul
2017-01-01
In drug discovery and the early prediction of lung cancer, metabolomic biomarker detection is very important. The mortality rate can be decreased if cancer is predicted at an earlier stage. Current diagnostic techniques for lung cancer are not prognostic. However, if we know which metabolites show considerably different intensity levels between cancer subjects and control subjects, it becomes easier to diagnose the disease early as well as to discover drugs. Therefore, in this paper we identified the influential plasma and serum blood sample metabolites for lung cancer, along with the biomarkers that will be helpful for early disease prediction as well as for drug discovery. To identify the influential metabolites, we considered a parametric and a nonparametric test, namely Student's t-test as the parametric and the Kruskal-Wallis test as the non-parametric test. We also categorized the up-regulated and down-regulated metabolites by the heatmap plot and identified the biomarkers by a support vector machine (SVM) classifier and pathway analysis. From our analysis, we obtained 27 influential (p-value<0.05) metabolites from the plasma sample and 13 influential (p-value<0.05) metabolites from the serum sample. According to the importance plot from the SVM classifier, pathway analysis and correlation network analysis, we declared 4 metabolites (taurine, aspartic acid, glutamine and pyruvic acid) as plasma biomarkers and 3 metabolites (aspartic acid, taurine and inosine) as serum biomarkers.
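The screening step, running a parametric and a nonparametric test per metabolite and keeping those with p < 0.05, can be sketched on synthetic intensities in which only the first of five metabolites truly differs; the group sizes, shift, and "taurine-like" label are all hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical metabolite intensities: 20 controls vs 20 cancer subjects,
# 5 metabolites of which only the first is truly shifted.
control = rng.normal(0.0, 1.0, size=(20, 5))
cancer = rng.normal(0.0, 1.0, size=(20, 5))
cancer[:, 0] += 2.0  # a "taurine"-like shift (invented)

influential = []
for j in range(5):
    _, t_p = stats.ttest_ind(control[:, j], cancer[:, j])  # parametric
    _, k_p = stats.kruskal(control[:, j], cancer[:, j])    # nonparametric
    if t_p < 0.05 and k_p < 0.05:
        influential.append(j)
```

In the study, the metabolites surviving this screen were then ranked by the SVM importance plot and pathway analysis before being declared biomarkers.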
Loring, David W; Larrabee, Glenn J
2006-06-01
The Halstead-Reitan Battery has been instrumental in the development of neuropsychological practice in the United States. Although Reitan administered both the Wechsler-Bellevue Intelligence Scale and Halstead's test battery when evaluating Halstead's theory of biologic intelligence, the relative sensitivity of each test battery to brain damage continues to be an area of controversy. Because Reitan did not perform direct parametric analysis to contrast group performances, we reanalyze Reitan's original validation data from both Halstead (Reitan, 1955) and Wechsler batteries (Reitan, 1959a) and calculate effect sizes and probability levels using traditional parametric approaches. Eight of the 10 tests comprising Halstead's original Impairment Index, as well as the Impairment Index itself, statistically differentiated patients with unequivocal brain damage from controls. In addition, 13 of 14 Wechsler measures including Full-Scale IQ also differed statistically between groups (Brain Damage Full-Scale IQ = 96.2; Control Group Full Scale IQ = 112.6). We suggest that differences in the statistical properties of each battery (e.g., raw scores vs. standardized scores) likely contribute to classification characteristics including test sensitivity and specificity.
Fabrication and Testing of Microfluidic Optomechanical Oscillators
Han, Kewen; Kim, Kyu Hyun; Kim, Junhwan; Lee, Wonsuk; Liu, Jing; Fan, Xudong; Carmon, Tal; Bahl, Gaurav
2014-01-01
Cavity optomechanics experiments that parametrically couple phonon modes and photon modes have been investigated in various optical systems, including microresonators. However, because of the increased acoustic radiative losses during direct liquid immersion of optomechanical devices, almost all published optomechanical experiments have been performed in the solid phase. This paper discusses a recently introduced hollow microfluidic optomechanical resonator. Detailed methodology is provided to fabricate these ultra-high-Q microfluidic resonators, perform optomechanical testing, and measure radiation pressure-driven breathing-mode and SBS-driven whispering-gallery-mode parametric vibrations. By confining liquids inside the capillary resonator, high mechanical and optical quality factors are simultaneously maintained. PMID:24962013
Modelling road accident blackspots data with the discrete generalized Pareto distribution.
Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María
2014-10-01
This study shows how road traffic network events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from the Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and a particular case of the previous model). To that end, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis, and rth-order moments; applied two estimation methods for their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: the Chi-square test and the discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that these probabilistic models can be useful to describe the road accident blackspot datasets analyzed. Copyright © 2014 Elsevier Ltd. All rights reserved.
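The discrete Lomax construction used above (the two-parameter special case) can be sketched by differencing the continuous Lomax survival function. The sketch below uses made-up parameter values and synthetic counts in place of the DGT data, and fits the two parameters by maximum likelihood:

```python
import numpy as np
from scipy.optimize import minimize

# Survival function of the continuous Lomax distribution; discretizing it
# via P(X = k) = S(k) - S(k + 1) yields the discrete Lomax pmf used below.
def lomax_sf(x, alpha, sigma):
    return (1.0 + x / sigma) ** (-alpha)

def discrete_lomax_pmf(k, alpha, sigma):
    return lomax_sf(k, alpha, sigma) - lomax_sf(k + 1, alpha, sigma)

def fit_discrete_lomax(counts):
    """Maximum-likelihood fit of (alpha, sigma) to a sample of counts."""
    counts = np.asarray(counts, dtype=float)
    def nll(theta):
        alpha, sigma = np.exp(theta)   # optimize on the log scale to keep params > 0
        p = discrete_lomax_pmf(counts, alpha, sigma)
        return -np.sum(np.log(np.maximum(p, 1e-300)))
    res = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
    return np.exp(res.x)

rng = np.random.default_rng(0)
# Synthetic "crash count" data: floor of continuous Lomax draws (inverse-CDF sampling)
alpha_true, sigma_true = 2.5, 3.0
u = rng.uniform(size=2000)
sample = np.floor(sigma_true * (u ** (-1.0 / alpha_true) - 1.0)).astype(int)
alpha_hat, sigma_hat = fit_discrete_lomax(sample)
```

The floor of a continuous Lomax draw has exactly the discrete Lomax law above, which makes the synthetic data consistent with the model being fitted.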
Soblosky, J S; Colgin, L L; Chorney-Lane, D; Davidson, J F; Carey, M E
1997-12-30
Hindlimb and forelimb deficits in rats caused by sensorimotor cortex lesions are frequently tested by using the narrow flat beam (hindlimb), the narrow pegged beam (hindlimb and forelimb) or the grid-walking (forelimb) tests. Although these are excellent tests, the narrow flat beam generates non-parametric data, which precludes the use of more powerful parametric statistical analyses. All these tests can be difficult to score if the rat is moving rapidly. Foot misplacements, especially on the grid-walking test, are indicative of an ongoing deficit, but have not previously been reliably and accurately described and quantified. In this paper we present an easy-to-construct and easy-to-use horizontal ladder-beam with a camera system on rails which can be used to evaluate both hindlimb and forelimb deficits in a single test. By slow-motion videotape playback we were able to quantify and demonstrate foot misplacements which go beyond the recovery period usually seen using more conventional measures (i.e. footslips and footfaults). This convenient system provides a rapid and reliable method for recording and evaluating rat performance on any type of beam and may be useful for measuring sensorimotor recovery following brain injury.
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloff, H. R.
1977-01-01
A preliminary assessment of vibroacoustic test plan optimization for free-flyer STS payloads is presented, and the effect of the number of missions on alternate test plans for Spacelab sortie payloads is also examined. The component vibration failure probability and the number of components in the housekeeping subassemblies are provided. Decision models are used to evaluate the cost effectiveness of seven alternate test plans using protoflight hardware.
Parametric evaluation of the cost effectiveness of Shuttle payload vibroacoustic test plans
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloff, H. R.; Keegan, W. B.; Young, J. P.
1978-01-01
Consideration is given to alternate vibroacoustic test plans for sortie and free flyer Shuttle payloads. Statistical decision models for nine test plans provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology is a major step toward the development of a useful tool for the quantitative tailoring of vibroacoustic test programs to sortie and free flyer payloads. A broader application of the methodology is now possible by the use of the OCTAVE computer code.
NASA Astrophysics Data System (ADS)
Dhitareka, P. H.; Firman, H.; Rusyati, L.
2018-05-01
This research compares science virtual and paper-based tests in measuring grade 7 students' critical thinking based on multiple intelligences and gender. A quasi-experimental method with a within-subjects design was used in order to obtain the data. The population of this research was all seventh-grade students in ten classes of one public secondary school in Bandung. Seventy-one students from two randomly selected classes became the sample. The data were obtained through 28 questions on the topic of living things and environmental sustainability, constructed based on the eight critical thinking elements proposed by Inch; the questions were then administered as both a science virtual test and a paper-based test. The data were analysed using a paired-samples t test when the data were parametric and the Wilcoxon signed ranks test when the data were non-parametric. In the general comparison, the p-value for the comparison of the science virtual and paper-based tests' scores is 0.506, indicating no significant difference between the science virtual and paper-based tests based on the tests' scores. The results are further supported by the students' attitude score of 3.15 on a scale from 1 to 4, indicating that they have positive attitudes towards the Science Virtual Test.
Low noise parametric amplifiers for radio astronomy observations at 18-21 cm wavelength
NASA Technical Reports Server (NTRS)
Kanevskiy, B. Z.; Veselov, V. M.; Strukov, I. A.; Etkin, V. S.
1974-01-01
The principal characteristics and use of SHF parametric amplifiers for radiometer input devices are explored. Balanced parametric amplifiers (BPA) are considered as the SHF signal amplifiers, allowing production of the amplifier circuit without a special filter to achieve decoupling. Formulas to calculate the basic parameters of a BPA are given. A modulator based on coaxial lines is discussed as the input element of the SHF. Results of laboratory tests of the receiver section and long-term stability studies of the SHF sector are presented.
ERIC Educational Resources Information Center
Steinhauer, H. M.
2012-01-01
Engineering graphics has historically been viewed as a challenging course to teach as students struggle to grasp and understand the fundamental concepts and then to master their proper application. The emergence of stable, fast, affordable 3D parametric modeling platforms such as CATIA, Pro-E, and AutoCAD while providing several pedagogical…
A Compilation of Hazard and Test Data for Pyrotechnic Compositions
1980-10-01
heated. These changes may be related to dehydration, decomposition, crystalline transition, melting, boiling, vaporization, polymerization, oxidation...123 180 ± 66 162 ± 16 506 ± 169 447 ± 199 448 ± 159 Decomposition temperature °C 277 ± 102 561 ± 135 205 ± 75 182 ± 24 550 ± 168 505 ± 224 517 ± 153...of compatibility or classification. The following tests are included in the parametric tests: 1. Autoignition Temperature 2. Decomposition
The Importance of Practice in the Development of Statistics.
1983-01-01
NRC Technical Summary Report #2471: The Importance of Practice in the Development of Statistics...component analysis, bioassay, limits for a ratio, quality control, sampling inspection, non-parametric tests, transformation theory, ARIMA time series...models, sequential tests, cumulative sum charts, data analysis plotting techniques, and a resolution of the Bayes-frequentist controversy. It appears
Transistor step stress testing program for JANTX2N2905A
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the transistor JANTX2N2905A manufactured by Texas Instruments and Motorola is reported. A total of 48 samples from each manufacturer was submitted to the process outlined. In addition, two control sample units were maintained for verification of the electrical parametric testing. All test samples were subjected to the electrical tests outlined in Table 2 after completing the prior power/temperature step stress point.
iconoclastic. Even at N=1024 these departures were quite appreciable at the testing tails, being greatest for chi-square and least for Z, and becoming worse in all cases at increasingly extreme tail areas. (Author)
A Simulation Comparison of Parametric and Nonparametric Dimensionality Detection Procedures
ERIC Educational Resources Information Center
Mroch, Andrew A.; Bolt, Daniel M.
2006-01-01
Recently, nonparametric methods have been proposed that provide a dimensionally based description of test structure for tests with dichotomous items. Because such methods are based on different notions of dimensionality than are assumed when using a psychometric model, it remains unclear whether these procedures might lead to a different…
NASA Technical Reports Server (NTRS)
Reveley, W. F.; Nuccio, P. P.
1975-01-01
Potable water for the Space Station Prototype life support system is generated by the vapor compression technique of vacuum distillation. A description of a complete three-man modular vapor compression water renovation loop that was built and tested is presented; included are all of the pumps, tankage, chemical post-treatment, instrumentation, and controls necessary to make the loop representative of an automatic, self-monitoring, null gravity system. The design rationale is given and the evolved configuration is described. Presented next are the results of an extensive parametric test during which distilled water was generated from urine and urinal flush water with concentration of solids in the evaporating liquid increasing progressively to 60 percent. Water quality, quantity and production rate are shown together with measured energy consumption rate in terms of watt-hours per kilogram of distilled water produced.
Packaging Technology for SiC High Temperature Circuits Operable up to 500 Degrees Centigrade
NASA Technical Reports Server (NTRS)
Chen, Lian-Yu
2002-01-01
New high-temperature, low-power 8-pin packages have been fabricated using a commercial fabrication service. These packages are made of aluminum nitride and 96 percent alumina with Au metallization. The new design of these packages provides the chips inside with EM shielding. Wirebond geometry control has been achieved for precise mechanical tests. Au wirebond samples with a 45-degree heel angle have been tested using a wireloop test module. The geometry control improves the consistency of measurement of the wireloop breaking point. Also reported is a parametric study of the thermomechanical reliability of a Au thick-film based SiC die-attach assembly, conducted using nonlinear finite element analysis (FEA) to optimize the die-attach thermomechanical performance for operation at temperatures from room temperature to 500 degrees Centigrade. This parametric study centered on material selection, structure design, and process control.
Thermal Testing and Analysis of an Efficient High-Temperature Multi-Screen Internal Insulation
NASA Technical Reports Server (NTRS)
Weiland, Stefan; Handrick, Karin; Daryabeigi, Kamran
2007-01-01
Conventional multi-layer insulations exhibit excellent insulation performance, but they are limited to the temperature range with which their components, reflective foils and spacer materials, are compatible. For high-temperature applications, the internal multi-screen insulation (IMI) has been developed, which utilizes unique ceramic material technology to produce reflective screens with high-temperature stability. For analytical insulation sizing, a parametric material model is developed that includes the main contributors to heat flow, which are radiation and conduction. The adaptation of model parameters based on effective steady-state thermal conductivity measurements performed at NASA Langley Research Center (LaRC) allows for extrapolation to arbitrary stack configurations and temperature ranges beyond the ones that were covered in the conductivity measurements. Experimental validation of the parametric material model was performed during the thermal qualification test of the X-38 Chin-panel, where test results and predictions showed good agreement.
Local tests of gravitation with Gaia observations of Solar System Objects
NASA Astrophysics Data System (ADS)
Hees, Aurélien; Le Poncin-Lafitte, Christophe; Hestroffer, Daniel; David, Pedro
2018-04-01
In this proceeding, we show how observations of Solar System Objects with Gaia can be used to test General Relativity and to constrain modified gravitational theories. The high number of Solar System objects observed and the variety of their orbital parameters, combined with the impressive astrometric accuracy, will allow us to perform local tests of General Relativity. In this communication, we present a preliminary sensitivity study of the Gaia observations on dynamical parameters such as the Sun's quadrupolar moment and on various extensions to general relativity such as the parametrized post-Newtonian parameters, the fifth force formalism, and a violation of Lorentz symmetry parametrized by the Standard-Model Extension framework. We take into account the time sequences and the geometry of the observations that are particular to Gaia for its nominal mission (5 years) and for an extended mission (10 years).
Assessing noninferiority in a three-arm trial using the Bayesian approach.
Ghosh, Pulak; Nathoo, Farouk; Gönen, Mithat; Tiwari, Ram C
2011-07-10
Non-inferiority trials, which aim to demonstrate that a test product is not worse than a competitor by more than a pre-specified small amount, are of great importance to the pharmaceutical community. As a result, methodology for designing and analyzing such trials is required, and developing new methods for such analysis is an important area of statistical research. The three-arm trial consists of a placebo, a reference and an experimental treatment, and simultaneously tests the superiority of the reference over the placebo along with comparing this reference to an experimental treatment. In this paper, we consider the analysis of non-inferiority trials using Bayesian methods which incorporate both parametric as well as semi-parametric models. The resulting testing approach is both flexible and robust. The benefit of the proposed Bayesian methods is assessed via simulation, based on a study examining home-based blood pressure interventions. Copyright © 2011 John Wiley & Sons, Ltd.
On the efficacy of procedures to normalize Ex-Gaussian distributions.
Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío
2014-01-01
Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed; hence, it is acknowledged by many that the normality assumption is not met for such data. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than elimination methods in normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are in normalizing such data. Specifically, transformation with parameter lambda = -1 leads to the best results.
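A minimal illustration of the lambda = -1 (reciprocal) transformation on simulated Ex-Gaussian reaction times; the distribution parameters here are arbitrary choices, not those of the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated reaction times (seconds): Ex-Gaussian = Normal + Exponential
rt = rng.normal(0.4, 0.05, 5000) + rng.exponential(0.2, 5000)

# Power transformation with lambda = -1 (reciprocal); the sign flip preserves
# the ordering of observations, as in the Box-Cox family.
rt_recip = -1.0 / rt

skew_before = stats.skew(rt)       # positive skew from the exponential component
skew_after = stats.skew(rt_recip)  # substantially reduced after transformation
```

The reciprocal compresses the long right tail contributed by the exponential component, which is why it performs well on strongly skewed RT data.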
RSRA sixth scale wind tunnel test. Tabulated balance data, volume 2
NASA Technical Reports Server (NTRS)
Ruddell, A.; Flemming, R.
1974-01-01
Summaries are presented of all the force and moment data acquired during the RSRA Sixth Scale Wind Tunnel Test. These data include and supplement the data presented in curve form in previous reports. Each summary includes the model configuration, wing and empennage incidences and deflections, and recorded balance data. The first group of data in each summary presents the force and moment data in full scale parametric form, the dynamic pressure and velocity in the test section, and the powered nacelle fan speed. The second and third groups of data are the balance data in nondimensional coefficient form. The wind axis coefficient data corresponds to the parametric data divided by the wing area for forces and divided by the product of the wing area and wing span or mean aerodynamic chord for moments. The stability axis data resolves the wind axis data with respect to the angle of yaw.
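The nondimensionalization described above can be sketched as follows; the reference quantities are hypothetical placeholders for illustration, not values from the RSRA test:

```python
# Hypothetical reference quantities (assumed, not from the RSRA report):
# wing area (m^2), wing span (m), mean aerodynamic chord (m), dynamic pressure (Pa)
S, B, CBAR, Q = 18.0, 12.0, 1.6, 240.0

def wind_axis_coefficients(lift, drag, pitch_moment, roll_moment):
    """Nondimensionalize forces by q*S, and moments by q*S*cbar or q*S*b."""
    return {
        "CL": lift / (Q * S),                 # forces divided by q * wing area
        "CD": drag / (Q * S),
        "Cm": pitch_moment / (Q * S * CBAR),  # pitching moment uses the MAC
        "Cl": roll_moment / (Q * S * B),      # rolling moment uses the span
    }
```

This mirrors the report's convention: forces are divided by the wing area, and moments by the product of wing area and span or mean aerodynamic chord.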
The 32nd CDC: System identification using interval dynamic models
NASA Technical Reports Server (NTRS)
Keel, L. H.; Lew, J. S.; Bhattacharyya, S. P.
1992-01-01
Motivated by the recent explosive development of results in the area of parametric robust control, a new technique to identify a family of uncertain systems is introduced. The new technique takes the frequency domain input and output data obtained from experimental test signals and produces an 'interval transfer function' that contains the complete frequency domain behavior with respect to the test signals. This interval transfer function is one of the key concepts in the parametric robust control approach, and identification with such an interval model allows one to predict the worst-case performance and stability margins using recent results on interval systems. The algorithm is illustrated by applying it to an 18-bay Mini-Mast truss structure.
Electrical Characterization of Hughes HCMP 1852D and RCA CDP1852D 8-bit, CMOS, I/O Ports
NASA Technical Reports Server (NTRS)
Stokes, R. L.
1979-01-01
Twenty-five Hughes HCMP 1852D and 25 RCA CDP1852D 8-bit, CMOS, I/O port microcircuits underwent electrical characterization tests. All electrical measurements were performed on a Tektronix S-3260 Test System. Before electrical testing, the devices were subjected to a 168-hour burn-in at 125 C with the inputs biased at 10V. Four of the Hughes parts became inoperable during testing. They exhibited functional failures and out-of-range parametric measurements after a few runs of the test program.
Entropy-based goodness-of-fit test: Application to the Pareto distribution
NASA Astrophysics Data System (ADS)
Lequesne, Justine
2013-08-01
Goodness-of-fit tests based on entropy have been introduced in [13] for testing normality. The maximum entropy distribution in a class of probability distributions defined by linear constraints induces a Pythagorean equality between the Kullback-Leibler information and an entropy difference. This allows one to propose a goodness-of-fit test for maximum entropy parametric distributions which is based on the Kullback-Leibler information. We will focus on the application of the method to the Pareto distribution. The power of the proposed test is computed through Monte Carlo simulation.
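The entropy statistic behind such tests is typically estimated with Vasicek's spacings estimator. The sketch below illustrates the idea for the exponential case, where the maximum entropy under a mean constraint is 1 + log(mean); this is a generic illustration, not the Pareto-specific test developed in the paper:

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek (1976) spacings estimator of differential entropy."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = int(round(np.sqrt(n)))   # common window-size heuristic
    # Pad so that x[i-m] and x[i+m] exist for every index i
    xp = np.concatenate([np.full(m, x[0]), x, np.full(m, x[-1])])
    spacings = xp[2 * m:] - xp[:n]   # x_(i+m) - x_(i-m) for each i
    return np.mean(np.log(n / (2.0 * m) * spacings))

rng = np.random.default_rng(1)
expo = rng.exponential(2.0, size=2000)
# KL-based statistic for exponentiality: H_max - H_hat, where the maximum
# entropy of a positive variable with fixed mean is 1 + log(mean).
# Small values support the maximum-entropy (here exponential) hypothesis.
t_exp = 1.0 + np.log(expo.mean()) - vasicek_entropy(expo)
```

The same Pythagorean decomposition applies with the Pareto-specific maximum-entropy expression in place of 1 + log(mean); critical values would come from Monte Carlo simulation as in the paper.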
Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H.; Maurits, Natasha M.
2016-01-01
In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a better insight into tremor. Typically, routine clinical assessment of accelerometry and electromyography data involves visual inspection by clinicians and occasionally computational analysis to obtain objective characteristics of tremor. However, for some tremor disorders these characteristics may be different during daily activity. This variability in presentation between the clinic and daily life makes a differential diagnosis more difficult. A long-term recording of tremor by accelerometry and/or electromyography in the home environment could help to give a better insight into the tremor disorder. However, an evaluation of such recordings using routine clinical standards would take too much time. We evaluated a range of techniques that automatically detect tremor segments in accelerometer data, as accelerometer data is more easily obtained in the home environment than electromyography data. Time can be saved if clinicians only have to evaluate the tremor characteristics of segments that have been automatically detected in longer daily activity recordings. We tested four non-parametric methods and five parametric methods on clinical accelerometer data from 14 patients with different tremor disorders. The consensus between two clinicians regarding the presence or absence of tremor on 3943 segments of accelerometer data was employed as reference. The nine methods were tested against this reference to identify their optimal parameters. Non-parametric methods generally performed better than parametric methods on our dataset when optimal parameters were used. 
However, one parametric method, employing the high-frequency content of the tremor bandwidth under consideration (High Freq), performed similarly to the non-parametric methods but had the highest recall values, suggesting that this method could be employed for automatic tremor detection. PMID:27258018
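A band-power detector in this spirit can be sketched as follows; this is not the authors' exact High Freq method, and the sampling rate, tremor band, and threshold are all assumptions for illustration:

```python
import numpy as np
from scipy.signal import welch

FS = 100.0           # accelerometer sampling rate in Hz (assumed)
BAND = (4.0, 12.0)   # assumed tremor frequency band

def band_power_ratio(accel, fs=FS, band=BAND):
    """Fraction of total spectral power inside the tremor band (Welch PSD)."""
    f, pxx = welch(accel, fs=fs, nperseg=min(256, len(accel)))
    in_band = (f >= band[0]) & (f <= band[1])
    return pxx[in_band].sum() / pxx.sum()

def detect_tremor(segment, threshold=0.5):
    """Flag a segment as tremor when most power falls inside the tremor band."""
    return band_power_ratio(segment) > threshold

rng = np.random.default_rng(0)
t = np.arange(0, 3.0, 1.0 / FS)   # a 3-second accelerometer segment
tremor_seg = np.sin(2 * np.pi * 6.0 * t) + 0.2 * rng.standard_normal(t.size)
rest_seg = 0.2 * rng.standard_normal(t.size)
```

Running such a detector over long home recordings would leave clinicians with only the flagged segments to review, which is the time saving the study targets.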
Hwang, Eunjoo; Hu, Jingwen; Chen, Cong; Klein, Katelyn F; Miller, Carl S; Reed, Matthew P; Rupp, Jonathan D; Hallman, Jason J
2016-11-01
Occupant stature and body shape may have significant effects on injury risks in motor vehicle crashes, but the current finite element (FE) human body models (HBMs) only represent occupants with a few sizes and shapes. Our recent studies have demonstrated that, by using a mesh morphing method, parametric FE HBMs can be rapidly developed for representing a diverse population. However, the biofidelity of those models across a wide range of human attributes has not been established. Therefore, the objectives of this study are 1) to evaluate the accuracy of HBMs considering subject-specific geometry information, and 2) to apply the parametric HBMs in a sensitivity analysis for identifying the specific parameters affecting body responses in side impact conditions. Four side-impact tests with two male post-mortem human subjects (PMHSs) were selected to evaluate the accuracy of the geometry and impact responses of the morphed HBMs. For each PMHS test, three HBMs were simulated to compare with the test results: the original Total Human Model for Safety (THUMS) v4.01 (O-THUMS), a parametric THUMS (P-THUMS), and a subject-specific THUMS (S-THUMS). The P-THUMS geometry was predicted from only age, sex, stature, and BMI using our statistical geometry models of skeleton and body shape, while the S-THUMS geometry was based on each PMHS's CT data. The simulation results showed a preliminary trend that the correlations between the P-THUMS-predicted impact responses and the four PMHS tests (mean-CORA: 0.84, 0.78, 0.69, 0.70) were better than those between the O-THUMS and the normalized PMHS responses (mean-CORA: 0.74, 0.72, 0.55, 0.63), while they were similar to the correlations between the S-THUMS and the PMHS tests (mean-CORA: 0.85, 0.85, 0.67, 0.72).
The sensitivity analysis using the P-THUMS showed that, in side impact conditions, the HBM skeleton and body shape geometries as well as the body posture were more important in modeling the occupant impact responses than the bone and soft tissue material properties and the padding stiffness within the given parameter ranges. More investigations are needed to further support these findings.
Parametric Inlet Tested in Glenn's 10- by 10-Foot Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Slater, John W.; Davis, David O.; Solano, Paul A.
2005-01-01
The Parametric Inlet is an innovative concept for the inlet of a gas-turbine propulsion system for supersonic aircraft. The concept approaches the performance of past inlet concepts, but with less mechanical complexity, lower weight, and greater aerodynamic stability and safety. Potential applications include supersonic cruise aircraft and missiles. The Parametric Inlet uses tailored surfaces to turn the incoming supersonic flow inward toward an axis of symmetry. The terminal shock spans the opening of the subsonic diffuser leading to the engine. The external cowl area is smaller, which reduces cowl drag. The use of only external supersonic compression avoids inlet unstart--an unsafe shock instability present in previous inlet designs that use internal supersonic compression. This eliminates the need for complex mechanical systems to control unstart, which reduces weight. The conceptual design was conceived by TechLand Research, Inc. (North Olmsted, OH), which received funding through NASA's Small-Business Innovation Research program. The Boeing Company (Seattle, WA) also participated in the conceptual design. The NASA Glenn Research Center became involved starting with the preliminary design of a model for testing in Glenn's 10- by 10-Foot Supersonic Wind Tunnel (10x10 SWT). The inlet was sized for a speed of Mach 2.35 while matching requirements of an existing cold pipe used in previous inlet tests. The parametric aspects of the model included interchangeable components for different cowl lip, throat slot, and sidewall leading-edge shapes and different vortex generator configurations. Glenn researchers used computational fluid dynamics (CFD) tools for three-dimensional, turbulent flow analysis to further refine the aerodynamic design.
Bignardi, A B; El Faro, L; Cardoso, V L; Machado, P F; Albuquerque, L G
2009-09-01
The objective of the present study was to estimate milk yield genetic parameters applying random regression models and parametric correlation functions combined with a variance function to model animal permanent environmental effects. A total of 152,145 test-day milk yields from 7,317 first lactations of Holstein cows belonging to herds located in the southeastern region of Brazil were analyzed. Test-day milk yields were divided into 44 weekly classes of days in milk. Contemporary groups were defined by herd-test-day comprising a total of 2,539 classes. The model included direct additive genetic, permanent environmental, and residual random effects. The following fixed effects were considered: contemporary group, age of cow at calving (linear and quadratic regressions), and the population average lactation curve modeled by fourth-order orthogonal Legendre polynomial. Additive genetic effects were modeled by random regression on orthogonal Legendre polynomials of days in milk, whereas permanent environmental effects were estimated using a stationary or nonstationary parametric correlation function combined with a variance function of different orders. The structure of residual variances was modeled using a step function containing 6 variance classes. The genetic parameter estimates obtained with the model using a stationary correlation function associated with a variance function to model permanent environmental effects were similar to those obtained with models employing orthogonal Legendre polynomials for the same effect. A model using a sixth-order polynomial for additive effects and a stationary parametric correlation function associated with a seventh-order variance function to model permanent environmental effects would be sufficient for data fitting.
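Evaluating an orthogonal Legendre basis over days in milk, as used for the additive genetic random regressions above, might look like the following sketch; the lactation range of 5-310 days is an assumption for illustration:

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_basis(dim, order, dim_min=5.0, dim_max=310.0):
    """Orthogonal Legendre covariates for days in milk, rescaled to [-1, 1]."""
    x = 2.0 * (np.asarray(dim, dtype=float) - dim_min) / (dim_max - dim_min) - 1.0
    # Column j holds the Legendre polynomial P_j evaluated at the rescaled times
    eye = np.eye(order + 1)
    return np.column_stack([legendre.legval(x, eye[j]) for j in range(order + 1)])

# Basis for a fourth-order fit, as in the fixed lactation curve of the model
B = legendre_basis(np.linspace(5.0, 310.0, 200), order=4)
```

Each animal's additive genetic curve is then a random linear combination of these columns, with the covariance structure estimated from the data.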
NASA Astrophysics Data System (ADS)
Gagnard, Xavier; Bonnaud, Olivier
2000-08-01
We have recently published a paper on a new rapid method for the determination of the lifetime of the gate oxide involved in a Bipolar/CMOS/DMOS (BCD) technology. Because this previous method was based on a current measurement with gate voltage as a parameter, needing several stress voltages, it was applied only by lot sampling. We therefore sought an indicator to monitor the gate oxide lifetime during the wafer-level parametric test, involving only one measurement of the device on each wafer test cell. Using the Weibull law and the Crook model, combined with our recent model, we have developed a new test method needing only one electrical measurement of a MOS capacitor to monitor the quality of the gate oxide. Also based on a current measurement, the parameter is the lifetime indicator of the gate oxide. From the analysis of several wafers, we demonstrated the possibility of detecting a low-performance wafer, which corresponds to the infantile failure region on the Weibull plot. In order to insert this new method in the BCD parametric program, a parametric flowchart was established. This type of measurement is an important challenge, because the actual measurements (breakdown charge Qbd and breakdown electric field Ebd at the parametric level, and Ebd and interface state density Dit during the process) cannot guarantee the gate oxide lifetime throughout the fabrication process. This indicator measurement is the only one which predicts the lifetime decrease.
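The Weibull analysis underlying such a lifetime indicator can be illustrated with a median-rank regression fit on synthetic breakdown data; this is a generic sketch of Weibull plotting, not the authors' specific indicator:

```python
import numpy as np

def weibull_fit(times):
    """Median-rank regression on a Weibull plot: returns (shape, scale)."""
    t = np.sort(np.asarray(times, dtype=float))
    n = len(t)
    f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank plotting positions
    y = np.log(-np.log(1.0 - f))                  # linearizes the Weibull CDF
    slope, intercept = np.polyfit(np.log(t), y, 1)
    beta = slope                        # shape: beta < 1 signals infantile (early) failures
    eta = np.exp(-intercept / slope)    # scale (characteristic life)
    return beta, eta

rng = np.random.default_rng(3)
# Synthetic breakdown data with true eta = 50 and beta = 2
t_bd = 50.0 * rng.weibull(2.0, size=500)
beta_hat, eta_hat = weibull_fit(t_bd)
```

On such a plot, a wafer whose points fall on a shallow-slope branch at short lifetimes would be flagged as belonging to the infantile-failure population.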
Toward an Empirically-Based Parametric Explosion Spectral Model
2010-09-01
estimated (Richards and Kim, 2009). This archive could potentially provide 200 recordings of explosions at the Semipalatinsk Test Site of the former Soviet...estimates of explosion yield, and prior work at the Nevada Test Site (NTS) (e.g., Walter et al., 1995) has found that explosions in weak materials have...2007). Corner frequency scaling of regional seismic phases for underground nuclear explosions at the Nevada Test Site, Bull. Seismol. Soc. Am. 97
ERIC Educational Resources Information Center
Touron, Javier; Lizasoain, Luis; Joaristi, Luis
2012-01-01
The aim of this work is to analyze the dimensional structure of the Spanish version of the School and College Ability Test, employed in the process for the identification of students with high intellectual abilities. This test measures verbal and mathematical (or quantitative) abilities at three levels of difficulty: elementary (3rd, 4th, and 5th…
Testing of the Trim Tab Parametric Model in NASA Langley's Unitary Plan Wind Tunnel
NASA Technical Reports Server (NTRS)
Murphy, Kelly J.; Watkins, Anthony N.; Korzun, Ashley M.; Edquist, Karl T.
2013-01-01
In support of NASA's Entry, Descent, and Landing technology development efforts, testing of Langley's Trim Tab Parametric Models was conducted in Test Section 2 of NASA Langley's Unitary Plan Wind Tunnel. The objectives of these tests were to generate quantitative aerodynamic data and qualitative surface pressure data for experimental and computational validation and aerodynamic database development. Six-component force-and-moment data were measured on 38 unique, blunt-body trim tab configurations at Mach numbers of 2.5, 3.5, and 4.5, angles of attack from -4deg to +20deg, and angles of sideslip from 0deg to +8deg. Configuration parameters investigated in this study were forebody shape, tab area, tab cant angle, and tab aspect ratio. Pressure Sensitive Paint was used to provide qualitative surface pressure mapping for a subset of these flow and configuration variables. Over the range of parameters tested, the effects of varying tab area and tab cant angle were found to be much more significant than varying tab aspect ratio relative to key aerodynamic performance requirements. Qualitative surface pressure data supported the integrated aerodynamic data and provided information to aid in future analyses of localized phenomena for trim tab configurations.
Selected Parametric Effects on Materials Flammability Limits
NASA Technical Reports Server (NTRS)
Hirsch, David B.; Juarez, Alfredo; Peyton, Gary J.; Harper, Susana A.; Olson, Sandra L.
2011-01-01
NASA-STD-(I)-6001B Test 1 is currently used to evaluate the flammability of materials intended for use in habitable environments of U.S. spacecraft. The method is a pass/fail upward flame propagation test conducted in the worst-case configuration, which is defined as the combination of a material's thickness, test pressure, oxygen concentration, and temperature that makes the material most flammable. Although simple parametric effects may be intuitive (such as increasing oxygen concentrations resulting in increased flammability), combinations of multi-parameter effects could be more complex. In addition, there are a variety of material configurations used in spacecraft. Such configurations could include, for example, exposed free edges where fire propagation may differ from that in configurations commonly employed in standard testing. Studies involving combined oxygen concentration, pressure, and temperature effects on flammability limits have been conducted and are summarized in this paper. Additional effects on flammability limits of a material's thickness, mode of ignition, burn-length criteria, and exposed edges are presented. The information obtained will allow proper selection of ground flammability test conditions, support further studies comparing flammability in 1-g with microgravity and reduced gravity environments, and contribute to persuasive scientific cases for rigorous space system fire risk assessments.
Multiple Hypothesis Testing for Experimental Gingivitis Based on Wilcoxon Signed Rank Statistics
Preisser, John S.; Sen, Pranab K.; Offenbacher, Steven
2011-01-01
Dental research often involves repeated multivariate outcomes on a small number of subjects for which there is interest in identifying outcomes that exhibit change in their levels over time as well as to characterize the nature of that change. In particular, periodontal research often involves the analysis of molecular mediators of inflammation for which multivariate parametric methods are highly sensitive to outliers and deviations from Gaussian assumptions. In such settings, nonparametric methods may be favored over parametric ones. Additionally, there is a need for statistical methods that control an overall error rate for multiple hypothesis testing. We review univariate and multivariate nonparametric hypothesis tests and apply them to longitudinal data to assess changes over time in 31 biomarkers measured from the gingival crevicular fluid in 22 subjects whereby gingivitis was induced by temporarily withholding tooth brushing. To identify biomarkers that can be induced to change, multivariate Wilcoxon signed rank tests for a set of four summary measures based upon area under the curve are applied for each biomarker and compared to their univariate counterparts. Multiple hypothesis testing methods with choice of control of the false discovery rate or strong control of the family-wise error rate are examined. PMID:21984957
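The false discovery rate control discussed above can be illustrated with the Benjamini-Hochberg step-up procedure. The sketch below is a generic pure-Python illustration with made-up p-values; it is not code from the paper:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg step-up procedure.

    Returns a list of booleans marking which hypotheses are rejected
    while controlling the false discovery rate at level alpha.
    """
    m = len(pvalues)
    # Sort p-values, remembering their original positions.
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Find the largest rank k with p_(k) <= (k/m) * alpha ...
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / m * alpha:
            k_max = rank
    # ... and reject the hypotheses with the k_max smallest p-values.
    reject = [False] * m
    for rank, idx in enumerate(order, start=1):
        if rank <= k_max:
            reject[idx] = True
    return reject
```

Note the step-up character of the procedure: a p-value above its own threshold can still be rejected if a larger p-value later clears its threshold.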
ERIC Educational Resources Information Center
Choi, Sae Il
2009-01-01
This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…
NASA Astrophysics Data System (ADS)
Herrera-Grimaldi, Pascual; García-Marín, Amanda; Ayuso-Muñoz, José Luís; Flamini, Alessia; Morbidelli, Renato; Ayuso-Ruíz, José Luís
2018-02-01
The increase of air surface temperature at the global scale is a fact, with values around 0.85 °C since the late nineteenth century. Nevertheless, the increase is not equally distributed over the world, varying from one region to another. It is therefore of interest to study the evolution of temperature indices for a given area in order to analyse the existence of a climatic trend in it. In this work, monthly temperature time series from two Mediterranean areas are used: the Umbria region in Italy, and the Guadalquivir Valley in southern Spain. For the available stations, six temperature indices (three annual and three monthly) of mean, average maximum and average minimum temperature have been obtained, and the existence of trends has been studied by applying the non-parametric Mann-Kendall test. Both regions show a general increase in all temperature indices, with the pattern of the trends clearer in Spain than in Italy. The Italian area is the only one in which some negative trends are detected. The presence of break points in the temperature series has also been studied by using the non-parametric Pettitt test and the parametric standard normal homogeneity test (SNHT); most of the break points may be due to natural phenomena.
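The Mann-Kendall trend test applied above can be sketched in a few lines. This is a minimal illustration (no tie correction, normal approximation valid for roughly n > 10), not the authors' implementation:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction, normal approximation).

    Returns (S, z, p): the S statistic, the standardized z score, and a
    two-sided p-value. Positive S suggests an upward trend.
    """
    n = len(x)
    s = 0
    # S counts concordant minus discordant pairs over all i < j.
    for i in range(n - 1):
        for j in range(i + 1, n):
            s += (x[j] > x[i]) - (x[j] < x[i])
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity-corrected standardization of S.
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # Two-sided p-value from the standard normal distribution.
    p = math.erfc(abs(z) / math.sqrt(2))
    return s, z, p
```

For monthly series with repeated values, a tie correction to var_s would be needed in practice.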
BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs
Eklund, Anders; Dufort, Paul; Villani, Mattias; LaConte, Stephen
2014-01-01
Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm3 brain template in 4–6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/). PMID:24672471
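The second-level permutation test mentioned above can be illustrated with a one-sample sign-flipping scheme, a standard choice for group-level contrasts. The sketch below is a generic pure-Python illustration on hypothetical subject-level effects; it is not BROCCOLI code (BROCCOLI itself is written in OpenCL):

```python
import random

def sign_flip_permutation_test(effects, n_perm=10000, seed=0):
    """One-sample permutation test by sign flipping.

    Under the null that subject-level effects are symmetric around zero,
    randomly flipping signs leaves the distribution of the mean
    unchanged. Returns a two-sided p-value for the observed mean.
    """
    rng = random.Random(seed)
    observed = abs(sum(effects)) / len(effects)
    exceed = 0
    for _ in range(n_perm):
        flipped = sum(e if rng.random() < 0.5 else -e for e in effects)
        if abs(flipped) / len(effects) >= observed:
            exceed += 1
    # Add 1 to numerator and denominator so p is never exactly zero.
    return (exceed + 1) / (n_perm + 1)
```

Because the null distribution is built from the data themselves, no normality assumption is needed, which is the robustness property the abstract refers to.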
A Versatile Omnibus Test for Detecting Mean and Variance Heterogeneity
Bailey, Matthew; Kauwe, John S. K.; Maxwell, Taylor J.
2014-01-01
Recent research has revealed loci that display variance heterogeneity through various means such as biological disruption, linkage disequilibrium (LD), gene-by-gene (GxG), or gene-by-environment (GxE) interaction. We propose a versatile likelihood ratio test that allows joint testing for mean and variance heterogeneity (LRTMV) or either effect alone (LRTM or LRTV) in the presence of covariates. Using extensive simulations for our method and others we found that all parametric tests were sensitive to non-normality regardless of any trait transformations. Coupling our test with the parametric bootstrap solves this issue. Using simulations and empirical data from a known mean-only functional variant we demonstrate how linkage disequilibrium (LD) can produce variance-heterogeneity loci (vQTL) in a predictable fashion based on differential allele frequencies, high D’ and relatively low r2 values. We propose that a joint test for mean and variance heterogeneity is more powerful than a variance only test for detecting vQTL. This takes advantage of loci that also have mean effects without sacrificing much power to detect variance only effects. We discuss using vQTL as an approach to detect gene-by-gene interactions and also how vQTL are related to relationship loci (rQTL) and how both can create prior hypothesis for each other and reveal the relationships between traits and possibly between components of a composite trait. PMID:24482837
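A stripped-down version of the joint mean-and-variance likelihood-ratio idea can be sketched for two groups under normality. This simplified illustration omits covariates and the parametric bootstrap, and is not the authors' LRTMV implementation:

```python
import math

def lrt_mean_variance(g1, g2):
    """Likelihood-ratio test for joint mean and variance heterogeneity.

    Null: both samples come from one normal N(mu, sigma^2).
    Alternative: each group has its own mean and variance.
    Returns (statistic, p), where the statistic is -2 log Lambda,
    referred to a chi-squared distribution with 2 degrees of freedom.
    """
    def mle_var(xs, mu):
        return sum((x - mu) ** 2 for x in xs) / len(xs)

    n1, n2 = len(g1), len(g2)
    n = n1 + n2
    grand_mean = (sum(g1) + sum(g2)) / n
    v0 = mle_var(g1 + g2, grand_mean)   # pooled MLE variance under the null
    v1 = mle_var(g1, sum(g1) / n1)      # group-specific MLE variances
    v2 = mle_var(g2, sum(g2) / n2)
    stat = n * math.log(v0) - n1 * math.log(v1) - n2 * math.log(v2)
    p = math.exp(-stat / 2)             # chi-squared survival function, df = 2
    return stat, p
```

As the abstract cautions, this asymptotic chi-squared reference is sensitive to non-normality; replacing it with a parametric bootstrap of the statistic addresses that.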
A Robust Semi-Parametric Test for Detecting Trait-Dependent Diversification.
Rabosky, Daniel L; Huang, Huateng
2016-03-01
Rates of species diversification vary widely across the tree of life and there is considerable interest in identifying organismal traits that correlate with rates of speciation and extinction. However, it has been challenging to develop methodological frameworks for testing hypotheses about trait-dependent diversification that are robust to phylogenetic pseudoreplication and to directionally biased rates of character change. We describe a semi-parametric test for trait-dependent diversification that explicitly requires replicated associations between character states and diversification rates to detect effects. To use the method, diversification rates are reconstructed across a phylogenetic tree with no consideration of character states. A test statistic is then computed to measure the association between species-level traits and the corresponding diversification rate estimates at the tips of the tree. The empirical value of the test statistic is compared to a null distribution that is generated by structured permutations of evolutionary rates across the phylogeny. The test is applicable to binary discrete characters as well as continuous-valued traits and can accommodate extremely sparse sampling of character states at the tips of the tree. We apply the test to several empirical data sets and demonstrate that the method has acceptable Type I error rates. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
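The permutation logic of the test can be illustrated with a simplified sketch that uses a Pearson correlation between tip traits and tip-level rate estimates and plain random permutations; the published method instead uses structured permutations that respect the phylogeny:

```python
import random

def permutation_correlation_test(traits, rates, n_perm=5000, seed=1):
    """Association between tip traits and diversification-rate estimates.

    Uses Pearson correlation as the test statistic and plain random
    permutations of the rates for the null distribution (a deliberate
    simplification of the structured permutations in the paper).
    Returns (observed_r, two_sided_p).
    """
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    rng = random.Random(seed)
    observed = pearson(traits, rates)
    shuffled = list(rates)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if abs(pearson(traits, shuffled)) >= abs(observed):
            exceed += 1
    return observed, (exceed + 1) / (n_perm + 1)
```

Binary traits (0/1) work directly here, since Pearson correlation with a binary variable reduces to the point-biserial correlation.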
Ku band low noise parametric amplifier
NASA Technical Reports Server (NTRS)
1976-01-01
A low-noise, Ku-band parametric amplifier (paramp) was developed. The unit is a spacecraft-qualifiable, prototype parametric amplifier for eventual application in the shuttle orbiter. The amplifier was required to have a noise temperature of less than 150 K. A noise temperature of less than 120 K at a gain level of 17 dB was achieved. A 3-dB bandwidth in excess of 350 MHz was attained, while deviation from phase linearity of about ±1 degree over 50 MHz was achieved. The paramp operates within specification over an ambient temperature range of -5 C to +50 C. The performance requirements and the operation of the Ku-band parametric amplifier system are described. The final test results are also given.
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
Five-Point Likert Items: t Test versus Mann-Whitney-Wilcoxon
ERIC Educational Resources Information Center
de Winter, Joost C. F.; Dodou, Dimitra
2010-01-01
Likert questionnaires are widely used in survey research, but it is unclear whether the item data should be investigated by means of parametric or nonparametric procedures. This study compared the Type I and II error rates of the "t" test versus the Mann-Whitney-Wilcoxon (MWW) for five-point Likert items. Fourteen population…
Bootstrapping in Applied Linguistics: Assessing Its Potential Using Shared Data
ERIC Educational Resources Information Center
Plonsky, Luke; Egbert, Jesse; Laflair, Geoffrey T.
2015-01-01
Parametric analyses such as t tests and ANOVAs are the norm--if not the default--statistical tests found in quantitative applied linguistics research (Gass 2009). Applied statisticians and one applied linguist (Larson-Hall 2010, 2012; Larson-Hall and Herrington 2010), however, have argued that this approach may not be appropriate for small samples…
The report gives results of a comprehensive, pilot, dry, SO2 scrubbing test program to determine the effects of process variables on SO2 removal. In the spray dryer, stoichiometric ratio, flue gas temperature approach to adiabatic saturation, and temperature drop across the spray...
2016-04-30
A support contractor, Infoscitex, conducted a series of tests to identify the performance capabilities of the Vertical Impact Device (VID). The VID is a... [Table 3, "AFD Evaluation with Red IMPAC Programmer: Data Summary Showing Means and Standard Deviations," reported mean peak values by test cell and drop height (in).]
Assessment of Adolescent Perceptions on Parental Attitudes on Different Variables
ERIC Educational Resources Information Center
Ersoy, Evren
2015-01-01
The purpose of this study is to examine secondary school students' perceptions of parental attitudes with regard to specific variables. The independent-samples t test for parametric distributions and one-way analysis of variance (ANOVA) were used for analyzing the data; when the ANOVA analyses were significant, the Scheffe test was conducted on homogeneous…
Density Fluctuations in the Solar Wind Driven by Alfvén Wave Parametric Decay
NASA Astrophysics Data System (ADS)
Bowen, Trevor A.; Badman, Samuel; Hellinger, Petr; Bale, Stuart D.
2018-02-01
Measurements and simulations of inertial compressive turbulence in the solar wind are characterized by anti-correlated magnetic fluctuations parallel to the mean field and density structures. This signature has been interpreted as observational evidence for non-propagating pressure balanced structures, kinetic ion-acoustic waves, as well as the MHD slow-mode. Given the high damping rates of parallel propagating compressive fluctuations, their ubiquity in satellite observations is surprising and suggestive of a local driving process. One possible candidate for the generation of compressive fluctuations in the solar wind is the Alfvén wave parametric instability. Here, we test the parametric decay process as a source of compressive waves in the solar wind by comparing the collisionless damping rates of compressive fluctuations with growth rates of the parametric decay instability daughter waves. Our results suggest that generation of compressive waves through parametric decay is overdamped at 1 au, but that the presence of slow-mode-like density fluctuations is correlated with the parametric decay of Alfvén waves.
Diode step stress program for JANTX1N5615
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress applied to the switching diode JANTX1N5615, manufactured by Semtech and Micro Semiconductor, was examined. A total of 48 samples from each manufacturer were submitted to the process. In addition, two control sample units were maintained for verification of the electrical parametric testing. All test samples were subjected to the electrical tests after completing the prior power/temperature step stress point. Results are presented.
A parametric study of perforated muzzle brakes
NASA Astrophysics Data System (ADS)
Dillon, Robert E., Jr.; Nagamatsu, H. T.
1993-07-01
A firing test was conducted to study the parameters influencing the recoil efficiency and the blast characteristics of perforated muzzle brakes. Several scaled (20 mm) devices were tested as candidates for the 105 mm Armored Gun System (AGS). Recoil impulse, blast overpressures, muzzle velocity, sequential spark shadowgraphs, and photographs of the muzzle flash were obtained. A total of nine different perforated brakes were tested, as well as a scaled M198 double muzzle brake.
2017-01-04
…configurations with a restrained manikin, was evaluated in four different test series. Test Series 1 was conducted to determine the materials and…5 ms TTP. Test Series 2 was conducted to determine the materials and drop heights required for energy attenuation of the seat pan to generate a 4 m
NASA Astrophysics Data System (ADS)
Franzini, Guilherme Rosa; Santos, Rebeca Caramêz Saraiva; Pesce, Celso Pupo
2017-12-01
This paper aims to numerically investigate the effects of parametric instability on piezoelectric energy harvesting from the transverse galloping of a square prism. A two-degrees-of-freedom reduced-order model for this problem is proposed and numerically integrated. A usual quasi-steady galloping model is applied, where the transverse force coefficient is adopted as a cubic polynomial function of the angle of attack. Time-histories of nondimensional prism displacement, electric voltage and power dissipated at both the dashpot and the electrical resistance are obtained as functions of the reduced velocity. Both oscillation amplitude and electric voltage increased with the reduced velocity for all parametric excitation conditions tested. For low values of reduced velocity, 2:1 parametric excitation enhances the electric voltage. On the other hand, for higher reduced velocities, a 1:1 parametric excitation (i.e., at the natural frequency) enhances both oscillation amplitude and electric voltage. It has also been found that, depending on the parametric excitation frequency, the harvested electrical power can be amplified by 70% when compared to the case with no parametric excitation.
NASA Astrophysics Data System (ADS)
Lototzis, M.; Papadopoulos, G. K.; Droulia, F.; Tseliou, A.; Tsiros, I. X.
2018-04-01
There are several cases where a circular variable is associated with a linear one. A typical example is wind direction, which is often associated with linear quantities such as air temperature and air humidity. A statistical relationship of this kind can be tested by the use of parametric and non-parametric methods, each of which has its own advantages and drawbacks. This work deals with correlation analysis using both the parametric and the non-parametric procedure on a small set of meteorological data of air temperature and wind direction during a summer period in a Mediterranean climate. Correlations were examined between hourly, daily and maximum-prevailing values, under typical and non-typical meteorological conditions. Both tests indicated a strong correlation between mean hourly wind direction and mean hourly air temperature, whereas mean daily wind direction and mean daily air temperature do not seem to be correlated. In some cases, however, the two procedures were found to give quite dissimilar levels of significance for the rejection or not of the null hypothesis of no correlation. The simple statistical analysis presented in this study, appropriately extended to large sets of meteorological data, may be a useful tool for estimating the effects of wind in local climate studies.
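One standard parametric option for circular-linear association of the kind studied here is Mardia's circular-linear correlation, sketched below on hypothetical wind-direction and temperature values (an illustration, not the authors' code):

```python
import math

def circular_linear_r(angles_deg, x):
    """Circular-linear correlation (Mardia's R) between a circular
    variable (e.g. wind direction in degrees) and a linear one (e.g.
    air temperature). Returns a value in [0, 1]."""
    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        va = sum((u - ma) ** 2 for u in a)
        vb = sum((v - mb) ** 2 for v in b)
        return cov / math.sqrt(va * vb)

    # Embed the circular variable as (cos, sin) and combine the three
    # pairwise Pearson correlations into a multiple correlation.
    c = [math.cos(math.radians(t)) for t in angles_deg]
    s = [math.sin(math.radians(t)) for t in angles_deg]
    rxc, rxs, rcs = pearson(x, c), pearson(x, s), pearson(c, s)
    r2 = (rxc**2 + rxs**2 - 2 * rxc * rxs * rcs) / (1 - rcs**2)
    return math.sqrt(max(0.0, r2))
```

A non-parametric alternative would replace the raw values by ranks before computing the same quantity, mirroring the parametric/non-parametric comparison in the abstract.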
Testing the cosmic conservation of photon number with type Ia supernovae and ages of old objects
NASA Astrophysics Data System (ADS)
Jesus, J. F.; Holanda, R. F. L.; Dantas, M. A.
2017-12-01
In this paper, we obtain luminosity distances by using ages of 32 old passive galaxies distributed over the redshift interval 0.11 < z < 1.84 and test the cosmic conservation of photon number by comparing them with 580 distance moduli of type Ia supernovae (SNe Ia) from the so-called Union 2.1 compilation. Our analyses are based on the fact that the method of obtaining ages of galaxies relies on the detailed shape of galaxy spectra but not on galaxy luminosity. Possible departures from cosmic conservation of photon number are parametrized by τ(z) = 2εz and τ(z) = εz/(1+z) (for ε = 0 the conservation of photon number is recovered). We find ε = 0.016^{+0.078}_{-0.075} from the first parametrization and ε = -0.18^{+0.25}_{-0.24} from the second parametrization, both limits at 95% c.l. Thus, no significant departure from cosmic conservation of photon number is found. In addition, by considering the total age as inferred from the Planck (2015) analysis, we find incubation times t_{inc} = 1.66 ± 0.29 Gyr and t_{inc} = 1.23 ± 0.27 Gyr at 68% c.l. for the two parametrizations, respectively.
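The opacity parametrization connects the two distance estimates through the standard flux-dimming relation (a textbook relation in this literature, not quoted from the paper):

```latex
% Photon loss dims the observed flux by e^{-\tau(z)}, so the
% SN-inferred luminosity distance exceeds the true (opacity-free,
% age-inferred) one:
F_{\mathrm{obs}} = F_{\mathrm{true}}\, e^{-\tau(z)}
\quad\Longrightarrow\quad
D_{L}^{\mathrm{obs}} = D_{L}^{\mathrm{true}}\, e^{\tau(z)/2},
\qquad
\Delta\mu = \mu_{\mathrm{obs}} - \mu_{\mathrm{true}}
          = \frac{2.5}{\ln 10}\,\tau(z).
```

For ε = 0 both parametrizations give τ(z) = 0, and the SN and age-based distance scales coincide.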
Accelerated stress testing of terrestrial solar cells
NASA Technical Reports Server (NTRS)
Prince, J. L.; Lathrop, J. W.
1979-01-01
A program to investigate the reliability characteristics of unencapsulated low-cost terrestrial solar cells using accelerated stress testing is described. Reliability (or parametric degradation) factors appropriate to the cell technologies and use conditions were studied, and a series of accelerated stress tests was synthesized. An electrical measurement procedure and a data analysis and management system were developed, and stress-test fixturing and material-flow procedures were set up after consideration of the number of cells to be stress tested and measured and the nature of the information to be obtained from the process. Selected results and conclusions are presented.
Computational Test Cases for a Rectangular Supercritical Wing Undergoing Pitching Oscillations
NASA Technical Reports Server (NTRS)
Bennett, Robert M.; Walker, Charlotte E.
1999-01-01
Proposed computational test cases have been selected from the data set for a rectangular wing of panel aspect ratio two with a twelve-percent-thick supercritical airfoil section that was tested in the NASA Langley Transonic Dynamics Tunnel. The test cases include parametric variation of static angle of attack, pitching oscillation frequency, and Mach numbers from subsonic to transonic with strong shocks. Tables and plots of the measured pressures are presented for each case. This report provides an early release of test cases that have been proposed for a document that supplements the cases presented in AGARD Report 702.
Nonparametric estimation and testing of fixed effects panel data models
Henderson, Daniel J.; Carroll, Raymond J.; Li, Qi
2009-01-01
In this paper we consider the problem of estimating nonparametric panel data models with fixed effects. We introduce an iterative nonparametric kernel estimator. We also extend the estimation method to the case of a semiparametric partially linear fixed effects model. To determine whether a parametric, semiparametric or nonparametric model is appropriate, we propose test statistics to test between the three alternatives in practice. We further propose a test statistic for testing the null hypothesis of random effects against fixed effects in a nonparametric panel data regression model. Simulations are used to examine the finite sample performance of the proposed estimators and the test statistics. PMID:19444335
Jacher, Joseph E.; Martin, Lisa J.; Chung, Wendy K.; Loyd, James E.; Nichols, William C.
2017-01-01
Pulmonary arterial hypertension (PAH) is characterized by obstruction of pre-capillary pulmonary arteries, which leads to sustained elevation of pulmonary arterial pressure. Identifying those at risk through early interventions, such as genetic testing, may mitigate the disease course. Current practice guidelines recommend genetic counseling and offering genetic testing to individuals with heritable PAH, idiopathic PAH, and their family members. However, it is unclear if PAH specialists follow these recommendations. Thus, our research objective was to determine PAH specialists' knowledge, utilization, and perceptions of genetic counseling and genetic testing. A survey was designed and distributed to PAH specialists who primarily work in the USA to assess their knowledge, practices, and attitudes about the genetics of PAH. Participants' responses were analyzed using parametric and non-parametric statistics, and groups were compared using the Wilcoxon rank sum test. PAH specialists had low perceived and actual knowledge of the genetics of PAH, with 13.2% perceiving themselves as knowledgeable and 27% actually being knowledgeable. Although these specialists had positive or ambivalent attitudes about genetic testing and genetic counseling, they had poor utilization of these genetic services, with almost 80% of participants never or rarely ordering genetic testing or referring their patients with PAH for genetic counseling. Physicians were more knowledgeable but had lower perceptions of the value of genetic testing and genetic counseling compared to non-physicians (P < 0.05). The results suggest that increased education and awareness are needed about the genetics of PAH, as well as the benefits of genetic testing and genetic counseling, for individuals who treat patients with PAH. PMID:28597770
Code of Federal Regulations, 2011 CFR
2011-07-01
... performance test deadline for PM CEMS. Relative accuracy testing for other CEMS need not be repeated if that... system. (i) Installation of the continuous monitoring system sampling probe or other interface at a... equipment specifications for the sample interface, the pollutant concentration or parametric signal analyzer...
A Semiparametric Model for Jointly Analyzing Response Times and Accuracy in Computerized Testing
ERIC Educational Resources Information Center
Wang, Chun; Fan, Zhewen; Chang, Hua-Hua; Douglas, Jeffrey A.
2013-01-01
The item response times (RTs) collected from computerized testing represent an underutilized type of information about items and examinees. In addition to knowing the examinees' responses to each item, we can investigate the amount of time examinees spend on each item. Current models for RTs mainly focus on parametric models, which have the…
ERIC Educational Resources Information Center
Oliveri, Maria Elena; Olson, Brent F.; Ercikan, Kadriye; Zumbo, Bruno D.
2012-01-01
In this study, the Canadian English and French versions of the Problem-Solving Measure of the Programme for International Student Assessment 2003 were examined to investigate their degree of measurement comparability at the item- and test-levels. Three methods of differential item functioning (DIF) were compared: parametric and nonparametric item…
The urban heat island in Rio de Janeiro, Brazil, in the last 30 years using remote sensing data
NASA Astrophysics Data System (ADS)
Peres, Leonardo de Faria; Lucena, Andrews José de; Rotunno Filho, Otto Corrêa; França, José Ricardo de Almeida
2018-02-01
The aim of this work is to study the urban heat island (UHI) in the Metropolitan Area of Rio de Janeiro (MARJ) based on the analysis of land-surface temperature (LST) and land-use patterns retrieved from Landsat-5/Thematic Mapper (TM), Landsat-7/Enhanced Thematic Mapper Plus (ETM+) and Landsat-8/Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) data covering a 32-year period between 1984 and 2015. LST temporal evolution is assessed by comparing the average LST composites for 1984-1999 and 2000-2015, where the parametric Student t-test was conducted at the 5% significance level to map the pixels where LST for the more recent period is statistically significantly greater than for the previous one. The non-parametric Mann-Whitney-Wilcoxon rank sum test also confirmed, at the same 5% significance level, that the more recent period (2000-2015) has higher LST values. UHI intensity between "urban" and "rural/urban low density" ("vegetation") areas for 1984-1999 and 2000-2015 was established and confirmed by both parametric and non-parametric tests at the 1% significance level as 3.3 °C (5.1 °C) and 4.4 °C (7.1 °C), respectively. LST has statistically significantly (p-value < 0.01) increased over time in two of the three land cover classes ("urban" and "urban low density"), by 1.9 °C and 0.9 °C respectively, but not in the "vegetation" class. A spatial analysis was also performed to identify the urban pixels within MARJ where the UHI is more intense by subtracting the LST of these pixels from the LST mean value of the "vegetation" land-use class.
Projecting LED product life based on application
NASA Astrophysics Data System (ADS)
Narendran, Nadarajah; Liu, Yi-wei; Mou, Xi; Thotagamuwa, Dinusha R.; Eshwarage, Oshadhi V. Madihe
2016-09-01
LED products have started to displace traditional light sources in many lighting applications. One of the commonly claimed benefits of LED lighting products is their long useful lifetime in applications. Today there are many replacement lamp products using LEDs in the marketplace. Typically, lifetime claims for these replacement lamps are in the 25,000-hour range. According to current industry practice, the time for the LED light output to reach the 70% value is estimated according to IESNA LM-80 and TM-21 procedures, and the resulting value is reported as the whole-system life. LED products generally experience different thermal environments and switching (on-off cycling) patterns when used in applications. Current industry test methods often do not produce accurate lifetime estimates for LED systems because only one component of the system, namely the LED, is tested under a continuous-on burning condition without switching on and off, and because they estimate for only one failure type, lumen depreciation. The objective of the study presented in this manuscript was to develop a test method that could help predict LED system life in any application by testing the whole LED system, including on-off power cycling with sufficient dwell time, and considering both failure types, catastrophic and parametric. The study results showed that, for the LED A-lamps tested in this study, both failure types, catastrophic and parametric, exist. The on-off cycling encourages catastrophic failure, and the maximum operating temperature influences the lumen depreciation rate and parametric failure time. It was also clear that LED system life is negatively affected by on-off switching, contrary to commonly held belief. In addition, the study results showed that most of the LED systems failed catastrophically well before the LED light output reached the 70% value.
This emphasizes the fact that life testing of LED systems must consider catastrophic failure in addition to lumen depreciation, and the shorter of the two failure modes must be selected as the system life. The results of this study show a shorter time test procedure can be developed to accurately predict LED system life in any application by knowing the LED temperature and the switching cycle.
NASA Technical Reports Server (NTRS)
Bradley, D.; Buchholz, R. E.
1971-01-01
A 0.015-scale model of a modified version of the MDAC space shuttle booster was tested in the Naval Ship Research and Development Center 7 x 10 foot transonic wind tunnel to obtain force, static stability, and control effectiveness data. Data were obtained for a cruise Mach number of 0.38, an altitude of 10,000 ft, and a Reynolds number of approximately 2 million per foot. The model was tested through an angle-of-attack range of -4 deg to 15 deg at zero degrees angle of sideslip, and through an angle-of-sideslip range of -6 deg to 6 deg at fixed angles of attack of 0 deg, 6 deg, and 15 deg. Other test variables were elevon deflections, canard deflections, aileron deflections, rudder deflections, wing dihedral angle, canard incidence angle, wing incidence angle, canard position, wing position, wing and canard control flap size, and dorsal fin size.
On the efficacy of procedures to normalize Ex-Gaussian distributions
Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío
2015-01-01
Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed. Hence, it is acknowledged by many that the normality assumption is often not met. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier-elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than the elimination methods in normalizing positively skewed data, and the more skewed the distribution, the more effective the transformation methods are in normalizing the data. Specifically, the transformation with parameter lambda = -1 leads to the best results. PMID:25709588
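The recommended transformation can be illustrated directly: draw an Ex-Gaussian sample (normal component plus exponential tail; the parameters below are illustrative, not from the paper), apply the Box-Cox transform with lambda = -1, and compare sample skewness before and after:

```python
import math
import random

def skewness(xs):
    """Sample skewness (biased, moment-based estimator)."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def box_cox(xs, lam):
    """Box-Cox transform; requires strictly positive data."""
    if lam == 0:
        return [math.log(x) for x in xs]
    return [(x ** lam - 1) / lam for x in xs]

# Simulate an Ex-Gaussian RT sample (milliseconds): a normal component
# plus an exponential tail. Parameters are illustrative only.
rng = random.Random(42)
sample = [rng.gauss(400, 40) + rng.expovariate(1 / 200) for _ in range(5000)]
before = skewness(sample)               # strongly positive skew
after = skewness(box_cox(sample, -1))   # skew after the lambda = -1 transform
```

With lambda = -1 the transform reduces to 1 - 1/x, an increasing concave map that compresses the long right tail, which is why it reduces positive skew.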
Thermal effects in an ultrafast BiB3O6 optical parametric oscillator at high average powers
Petersen, T.; Zuegel, J. D.; Bromage, J.
2017-08-15
An ultrafast, high-average-power, extended-cavity, femtosecond BiB3O6 optical parametric oscillator was constructed as a test bed for investigating the scalability of infrared parametric devices. Despite the high pulse energies achieved by this system, the reduction in slope efficiency near the maximum available pump power prompted the investigation of thermal effects in the crystal during operation. Furthermore, the local heating effects in the crystal were used to determine the impact on both phase matching and thermal lensing, to understand limitations that must be overcome to achieve microjoule-level pulse energies at high repetition rates.
Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.
Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben
2017-06-06
Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns that arise in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray time-course data and for exploring complex relationships in omics data for studying their association with disease and health.
Parametric Testing of Launch Vehicle FDDR Models
NASA Technical Reports Server (NTRS)
Schumann, Johann; Bajwa, Anupa; Berg, Peter; Thirumalainambi, Rajkumar
2011-01-01
For the safe operation of a complex system like a (manned) launch vehicle, real-time information about the state of the system and potential faults is extremely important. The on-board FDDR (Failure Detection, Diagnostics, and Response) system is a software system to detect and identify failures, provide real-time diagnostics, and initiate fault recovery and mitigation. The ERIS (Evaluation of Rocket Integrated Subsystems) failure simulation is a unified Matlab/Simulink model of the Ares I Launch Vehicle with modular, hierarchical subsystems and components. With this model, the nominal flight performance characteristics can be studied. Additionally, failures can be injected to see their effects on vehicle state and behavior. A comprehensive test and analysis of such a complicated model is virtually impossible. In this paper, we describe how parametric testing (PT) can be used to support testing and analysis of the ERIS failure simulation. PT uses a combination of Monte Carlo techniques with n-factor combinatorial exploration to generate a small, yet comprehensive set of parameters for the test runs. For the analysis of the high-dimensional simulation data, we use multivariate clustering to automatically find structure in the data. Our tools can generate detailed HTML reports that facilitate the analysis.
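The n-factor combinatorial idea behind PT can be sketched with a tiny greedy pairwise (n = 2) generator; the failure-injection parameters below are hypothetical, not the actual ERIS model inputs:

```python
from itertools import combinations, product

def pairwise_suite(params):
    """Greedy subset of the full factorial that still covers every
    2-way combination of parameter values (n-factor with n = 2)."""
    names = list(params)
    # every (param pair, value pair) that must appear in some test case
    required = {(a, b, va, vb)
                for a, b in combinations(names, 2)
                for va in params[a] for vb in params[b]}
    suite = []
    for row in product(*params.values()):
        case = dict(zip(names, row))
        covered = {(a, b, case[a], case[b]) for a, b in combinations(names, 2)}
        if covered & required:       # keep only rows that cover new pairs
            suite.append(case)
            required -= covered
        if not required:
            break
    return suite

# Hypothetical failure-injection parameters, not the real ERIS inputs
params = {
    "engine": ["nominal", "failed"],
    "valve":  ["open", "closed", "stuck"],
    "sensor": ["ok", "drift"],
}
suite = pairwise_suite(params)
print(len(suite), "cases instead of", 2 * 3 * 2)
```

In the real tool this combinatorial skeleton is combined with Monte Carlo sampling of the continuous parameters; the sketch only shows why n-factor coverage needs far fewer runs than the full factorial as the number of parameters grows.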
Karulin, Alexey Y; Caspell, Richard; Dittrich, Marcus; Lehmann, Paul V
2015-03-02
Accurate assessment of positive ELISPOT responses for low frequencies of antigen-specific T-cells is controversial. In particular, it is still unknown whether ELISPOT counts within replicate wells follow a theoretical distribution function, and thus whether high-power parametric statistics can be used to discriminate between positive and negative wells. We studied experimental distributions of spot counts for up to 120 replicate wells of IFN-γ production by CD8+ T-cells responding to EBV LMP2A (426-434) peptide in human PBMC. The cells were tested in serial dilutions covering a wide range of average spot counts per condition, from just a few to hundreds of spots per well. Statistical analysis of the data using diagnostic Q-Q plots and the Shapiro-Wilk normality test showed that, within the entire dynamic range of ELISPOT, spot counts within replicate wells followed a normal distribution. This result implies that Student's t-test and ANOVA are suited to identify positive responses. We also show experimentally that borderline responses can be reliably detected by using more replicate wells, plating higher numbers of PBMC, adding IL-7, or a combination of these. Furthermore, we have experimentally verified that the number of replicates needed for detection of weak responses can be calculated using parametric statistics.
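The closing claim, that the number of replicates needed can be calculated with parametric statistics, can be illustrated with the standard normal-approximation sample-size formula for a two-sample comparison; the spot-count numbers below are made up for illustration:

```python
import math
from scipy.stats import norm

def replicates_needed(delta, sigma, alpha=0.05, power=0.8):
    """Wells per condition to detect a mean spot-count difference `delta`
    given a between-well SD `sigma` (two-sample normal approximation)."""
    z_a = norm.ppf(1 - alpha / 2)   # two-sided significance quantile
    z_b = norm.ppf(power)           # power quantile
    return math.ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# e.g. detect a 5-spot shift when wells vary with an SD of 5 spots
print(replicates_needed(delta=5, sigma=5))  # → 16
```

Halving the detectable difference (or doubling the well-to-well SD) quadruples the required number of replicate wells, which is why borderline responses need markedly more replicates.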
Biostatistics Series Module 3: Comparing Groups: Numerical Variables.
Hazra, Avijit; Gogtay, Nithya
2016-01-01
Numerical data that are normally distributed can be analyzed with parametric tests, that is, tests based on the parameters that define a normal distribution curve. If the distribution is uncertain, the data can be plotted as a normal probability plot and visually inspected, or tested for normality using one of a number of goodness-of-fit tests, such as the Kolmogorov-Smirnov test. The widely used Student's t-test has three variants. The one-sample t-test is used to assess whether a sample mean (as an estimate of the population mean) differs significantly from a given population mean. The means of two independent samples may be compared for a statistically significant difference by the unpaired or independent samples t-test. If the data sets are related in some way, their means may be compared by the paired or dependent samples t-test. The t-test should not be used to compare the means of more than two groups. Although it is possible to compare groups in pairs when there are more than two groups, this will increase the probability of a Type I error. The one-way analysis of variance (ANOVA) is employed to compare the means of three or more independent data sets that are normally distributed. Multiple measurements from the same set of subjects cannot be treated as separate, unrelated data sets. Comparison of means in such a situation requires repeated measures ANOVA. It is to be noted that while a multiple group comparison test such as ANOVA can point to a significant difference, it does not identify exactly between which two groups the difference lies. To do this, a multiple group comparison needs to be followed up by an appropriate post hoc test. An example is Tukey's honestly significant difference test following ANOVA. If the assumptions for parametric tests are not met, there are nonparametric alternatives for comparing data sets. These include the Mann-Whitney U-test as the nonparametric counterpart of the unpaired Student's t-test, the Wilcoxon signed-rank test as the counterpart of the paired Student's t-test, the Kruskal-Wallis test as the nonparametric equivalent of ANOVA, and Friedman's test as the counterpart of repeated measures ANOVA.
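The decision flow described above can be sketched in a few lines of SciPy; this is a deliberate simplification, since normality of each group is only one of the t-test's assumptions:

```python
import numpy as np
from scipy import stats

def compare_two_groups(a, b, alpha=0.05):
    """Pick the unpaired t-test or its nonparametric counterpart, the
    Mann-Whitney U-test, based on a Shapiro-Wilk normality check."""
    normal = (stats.shapiro(a).pvalue > alpha
              and stats.shapiro(b).pvalue > alpha)
    if normal:
        name, res = "Student's t-test", stats.ttest_ind(a, b)
    else:
        name, res = "Mann-Whitney U", stats.mannwhitneyu(a, b)
    return name, res.pvalue

rng = np.random.default_rng(1)
skewed_a = rng.exponential(1.0, 60)   # clearly non-normal groups
skewed_b = rng.exponential(1.5, 60)
print(compare_two_groups(skewed_a, skewed_b)[0])
```

For the skewed data above the normality check fails, so the nonparametric branch is taken, mirroring the recommendation in the text.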
Non-parametric combination and related permutation tests for neuroimaging.
Winkler, Anderson M; Webster, Matthew A; Brooks, Jonathan C; Tracey, Irene; Smith, Stephen M; Nichols, Thomas E
2016-04-01
In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume-based representations of the brain, including non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
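A minimal sketch of the single-phase idea, synchronized permutations combined with Tippett's min-p function, follows; it illustrates the principle only and is not the authors' implementation:

```python
import numpy as np

def npc_tippett(Y, group, n_perm=999, rng=None):
    """Joint permutation test across the columns of Y (one per modality)
    using Tippett's combining function (the minimum partial p-value)."""
    rng = rng or np.random.default_rng(0)
    group = np.asarray(group, dtype=bool)

    def stat(g):  # absolute mean difference, one value per modality
        return np.abs(Y[g].mean(axis=0) - Y[~g].mean(axis=0))

    t_obs = stat(group)
    # synchronized permutations: the SAME shuffle is applied to every column
    t_perm = np.array([stat(rng.permutation(group)) for _ in range(n_perm)])

    # partial p-values (survival-function ranks in the permutation
    # distribution), evaluated for observed and permuted statistics alike
    p_obs = (t_perm >= t_obs).mean(axis=0)
    ranks = t_perm.argsort(axis=0).argsort(axis=0)   # 0 = smallest stat
    p_perm = (n_perm - ranks) / n_perm               # P(T >= t)
    # Tippett: reject when the smallest partial p is unusually small
    return (p_perm.min(axis=1) <= p_obs.min()).mean()

rng = np.random.default_rng(7)
n = 25
group = np.r_[np.ones(n, bool), np.zeros(n, bool)]
Y = rng.normal(size=(2 * n, 2))   # two "modalities" for the same subjects
Y[group, 0] += 1.5                # effect in modality 0 only
print(npc_tippett(Y, group))      # small combined p expected
```

Because the permutations are synchronized across columns, the dependence between modalities is preserved under the null, which is exactly what makes the combined test valid without modeling that dependence.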
One-dimensional statistical parametric mapping in Python.
Pataky, Todd C
2012-01-01
Statistical parametric mapping (SPM) is a topological methodology for detecting field changes in smooth n-dimensional continua. Many classes of biomechanical data are smooth and contained within discrete bounds and as such are well suited to SPM analyses. The current paper accompanies release of 'SPM1D', a free and open-source Python package for conducting SPM analyses on a set of registered 1D curves. Three example applications are presented: (i) kinematics, (ii) ground reaction forces and (iii) contact pressure distribution in probabilistic finite element modelling. In addition to offering a high-level interface to a variety of common statistical tests like t tests, regression and ANOVA, SPM1D also emphasises fundamental concepts of SPM theory through stand-alone example scripts. Source code and documentation are available at: www.tpataky.net/spm1d/.
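The core idea, a test statistic computed at every node of a set of registered 1D curves, can be sketched without the package; this is generic code, not the SPM1D API, and it omits the random field theory inference that SPM adds on top of the statistic field:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
Q, nA, nB = 101, 12, 12                  # 101 nodes per registered curve

# Two groups of curves: a smooth base plus noise, with a group
# difference injected around the middle of the domain (nodes 40-59)
t_axis = np.linspace(0, 1, Q)
base = np.sin(2 * np.pi * t_axis)
A = base + rng.normal(0, 0.3, (nA, Q))
B = base + rng.normal(0, 0.3, (nB, Q))
B[:, 40:60] += 1.0                       # injected effect

# Pointwise two-sample t statistic: one value per node -> a 1D "t field"
t_field = stats.ttest_ind(B, A, axis=0).statistic
print(int(np.argmax(np.abs(t_field))))   # peaks inside nodes 40-59
```

SPM then thresholds this field using random field theory so that the probability of any supra-threshold cluster under the null is controlled across the whole continuum, rather than testing each node separately.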
Empirical Prediction of Aircraft Landing Gear Noise
NASA Technical Reports Server (NTRS)
Golub, Robert A. (Technical Monitor); Guo, Yue-Ping
2005-01-01
This report documents a semi-empirical/semi-analytical method for landing gear noise prediction. The method is based on scaling laws of the theory of aerodynamic noise generation and correlation of these scaling laws with current available test data. The former gives the method a sound theoretical foundation and the latter quantitatively determines the relations between the parameters of the landing gear assembly and the far field noise, enabling practical predictions of aircraft landing gear noise, both for parametric trends and for absolute noise levels. The prediction model is validated by wind tunnel test data for an isolated Boeing 737 landing gear and by flight data for the Boeing 777 airplane. In both cases, the predictions agree well with data, both in parametric trends and in absolute noise levels.
Nickel hydrogen cell tests. [recharging
NASA Technical Reports Server (NTRS)
Mueller, V. C.
1981-01-01
Some parametric tests followed by cycling tests are described for characterizing the service life of nickel hydrogen cells. Three cells were automatically cycled in a simulated low Earth orbit regime of 35-minute discharge and 55-minute charge, with the charging voltage limited and temperature compensated. The cells were mounted in a fixture that conducts heat to an aluminum baseplate. The baseplate, in turn, is placed in a temperature-controlled bath to remove the heat from the mounting fixture. One cell, tested with a zircar separator, failed after 2473 cycles. Two other cells were tested, one with a zircar separator and the other with asbestos. More than 400 cycles were achieved.
NASA Technical Reports Server (NTRS)
Bennett, Robert M.; Walker, Charlotte E.
1999-01-01
Computational test cases have been selected from the data set for a clipped delta wing with a six-percent-thick circular-arc airfoil section that was tested in the NASA Langley Transonic Dynamics Tunnel. The test cases include parametric variation of static angle of attack, pitching oscillation frequency, trailing-edge control surface oscillation frequency, and Mach numbers from subsonic to low supersonic values. Tables and plots of the measured pressures are presented for each case. This report provides an early release of test cases that have been proposed for a document that supplements the cases presented in AGARD Report 702.
Normality Tests for Statistical Analysis: A Guide for Non-Statisticians
Ghasemi, Asghar; Zahediasl, Saleh
2012-01-01
Statistical errors are common in the scientific literature, and about 50% of published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to provide an overview of checking for normality in statistical analysis using SPSS. PMID:23843808
The Effects of Non-Normality on Type III Error for Comparing Independent Means
ERIC Educational Resources Information Center
Mendes, Mehmet
2007-01-01
The major objective of this study was to investigate the effects of non-normality on Type III error rates for the ANOVA F test and its three commonly recommended parametric counterparts, namely the Welch, Brown-Forsythe, and Alexander-Govern tests. These tests were therefore compared in terms of Type III error rates across a variety of population distributions,…
Transistor step stress testing program for JANTX2N2484
NASA Technical Reports Server (NTRS)
1979-01-01
The effect of power/temperature step stress when applied to the transistor JANTX2N2484, manufactured by Raytheon and Teledyne, was evaluated. Forty-eight samples from each manufacturer were divided equally (16 per group) into three groups and submitted to the processes outlined. In addition, two control sample units were maintained for verification of the electrical parametric testing.
The Rasch Model and Missing Data, with an Emphasis on Tailoring Test Items.
ERIC Educational Resources Information Center
de Gruijter, Dato N. M.
Many applications of educational testing have a missing data aspect (MDA). This MDA is perhaps most pronounced in item banking, where each examinee responds to a different subtest of items from a large item pool and where both person and item parameter estimates are needed. The Rasch model is emphasized, and its non-parametric counterpart (the…
Konietschke, Frank; Libiger, Ondrej; Hothorn, Ludwig A
2012-01-01
Statistical association between a single nucleotide polymorphism (SNP) genotype and a quantitative trait in genome-wide association studies is usually assessed using a linear regression model or, in the case of non-normally distributed trait values, the Kruskal-Wallis test. While linear regression models assume an additive mode of inheritance via equidistant genotype scores, the Kruskal-Wallis test merely tests global differences in trait values associated with the three genotype groups. Both approaches thus exhibit suboptimal power when the underlying inheritance mode is dominant or recessive. Furthermore, these tests do not perform well in the common situations when only a few trait values are available in a rare genotype category (imbalance), or when the values associated with the three genotype categories exhibit unequal variance (variance heterogeneity). We propose a maximum test based on a Marcus-type multiple contrast test for relative effect sizes. This test allows model-specific testing of a dominant, additive, or recessive mode of inheritance, and it is robust against variance heterogeneity. We show how to obtain mode-specific simultaneous confidence intervals for the relative effect sizes to aid in interpreting the biological relevance of the results. Further, we discuss the use of a related all-pairwise comparisons contrast test with range-preserving confidence intervals as an alternative to the Kruskal-Wallis heterogeneity test. We applied the proposed maximum test to the Bogalusa Heart Study dataset and gained a remarkable increase in the power to detect association, particularly for rare genotypes. Our simulation study also demonstrated that the proposed non-parametric tests control the family-wise error rate in the presence of non-normality and variance heterogeneity, contrary to the standard parametric approaches.
We provide a publicly available R library nparcomp that can be used to estimate simultaneous confidence intervals or compatible multiplicity-adjusted p-values associated with the proposed maximum test.
Mutel, Christopher L; de Baan, Laura; Hellweg, Stefanie
2013-06-04
Comprehensive sensitivity analysis is a significant tool to interpret and improve life cycle assessment (LCA) models, but is rarely performed. Sensitivity analysis will increase in importance as inventory databases become regionalized, increasing the number of system parameters, and parametrized, adding complexity through variables and nonlinear formulas. We propose and implement a new two-step approach to sensitivity analysis. First, we identify parameters with high global sensitivities for further examination and analysis with a screening step, the method of elementary effects. Second, the more computationally intensive contribution to variance test is used to quantify the relative importance of these parameters. The two-step sensitivity test is illustrated on a regionalized, nonlinear case study of the biodiversity impacts from land use of cocoa production, including a worldwide cocoa products trade model. Our simplified trade model can be used for transformable commodities where one is assessing market shares that vary over time. In the case study, the highly uncertain characterization factors for the Ivory Coast and Ghana contributed more than 50% of variance for almost all countries and years examined. The two-step sensitivity test allows for the interpretation, understanding, and improvement of large, complex, and nonlinear LCA systems.
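The screening step, the method of elementary effects, can be sketched on a toy model; this is a simplified Morris-style design, not the authors' LCA code:

```python
import numpy as np

def elementary_effects(f, k, r=20, delta=0.1, rng=None):
    """Morris-style screening: for r random base points in [0,1]^k,
    perturb one input at a time and record the scaled output change.
    mu* (the mean |EE|) ranks inputs by global influence."""
    rng = rng or np.random.default_rng(0)
    ee = np.empty((r, k))
    for i in range(r):
        x = rng.uniform(0, 1 - delta, k)
        fx = f(x)
        for j in range(k):
            xp = x.copy()
            xp[j] += delta
            ee[i, j] = (f(xp) - fx) / delta
    return np.abs(ee).mean(axis=0)   # mu*

# Toy model: input 0 dominates, input 1 is mildly nonlinear, input 2 is inert
f = lambda x: 10 * x[0] + x[1] ** 2 + 0 * x[2]
mu_star = elementary_effects(f, k=3)
print(mu_star.argsort()[::-1])       # influence ranking, most to least
```

Parameters with negligible mu* can be dropped before the more expensive contribution-to-variance step, which is the point of the two-step approach.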
DOE Office of Scientific and Technical Information (OSTI.GOV)
Panaccione, Charles; Staab, Greg; Meuleman, Erik
ION has developed a mathematically driven model for a contacting device incorporating mass transfer, heat transfer, and computational fluid dynamics. This model is based upon a parametric structure for purposes of future commercialization. The most promising design from modeling was 3D printed, tested in a bench-scale CO2 capture unit, and compared to commercially available structured packing tested in the same unit.
Design, fabrication and testing of a thermal diode
NASA Technical Reports Server (NTRS)
Swerdling, B.; Kosson, R.
1972-01-01
Heat pipe diode types are discussed. The design, fabrication and test of a flight qualified diode for the Advanced Thermal Control Flight Experiment (ATFE) are described. The review covers the use of non-condensable gas, freezing, liquid trap, and liquid blockage techniques. Test data and parametric performance are presented for the liquid trap and liquid blockage techniques. The liquid blockage technique was selected for the ATFE diode on the basis of small reservoir size, low reverse mode heat transfer, and apparent rapid shut-off.
A simple randomisation procedure for validating discriminant analysis: a methodological note.
Wastell, D G
1987-04-01
Because the goal of discriminant analysis (DA) is to optimise classification, it designedly exaggerates between-group differences. This bias complicates validation of DA. Jack-knifing has been used for validation but is inappropriate when stepwise selection (SWDA) is employed. A simple randomisation test is presented which is shown to give correct decisions for SWDA. The general superiority of randomisation tests over orthodox significance tests is discussed. Current work on non-parametric methods of estimating the error rates of prediction rules is briefly reviewed.
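The randomisation logic can be sketched with a nearest-centroid classifier standing in for the discriminant function; the paper's procedure refits the stepwise DA inside each permutation, and this simplified sketch follows the same pattern:

```python
import numpy as np

def accuracy(X, y):
    """Resubstitution accuracy of a nearest-centroid classifier
    (a stand-in for the fitted discriminant function)."""
    c0, c1 = X[y == 0].mean(0), X[y == 1].mean(0)
    pred = (np.linalg.norm(X - c1, axis=1)
            < np.linalg.norm(X - c0, axis=1)).astype(int)
    return (pred == y).mean()

def randomisation_test(X, y, n_perm=499, rng=None):
    """P-value: how often does a random relabelling classify as well
    as the true labels?  Refitting the classifier inside each
    permutation is the step that absorbs the selection bias."""
    rng = rng or np.random.default_rng(0)
    obs = accuracy(X, y)
    null = np.array([accuracy(X, rng.permutation(y)) for _ in range(n_perm)])
    return (np.sum(null >= obs) + 1) / (n_perm + 1)

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 1, (30, 4)), rng.normal(1.2, 1, (30, 4))])
y = np.r_[np.zeros(30, int), np.ones(30, int)]
print(randomisation_test(X, y))   # well below 0.05 for separated groups
```

Because the whole fitting procedure is rerun on every permuted labelling, the null distribution already contains the optimism of the classifier, which is exactly why the randomisation test remains valid where the jack-knife fails for stepwise selection.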
2016-07-01
711 HPW/RHCPT and their in-house technical support contractor, Infoscitex, conducted a series of tests to identify the performance capabilities of… (Fragment of the report's drop-test matrix: for test cell SH1 with seat configuration WS1 at a 20 in drop height, mean peak acceleration was 80.08 ± 3.71 G and mean velocity change 13.54 ± 0.49 ft/s; further rows for SH2 and the WS2 felt-seat configuration are truncated.)
[Do we always correctly interpret the results of statistical nonparametric tests].
Moczko, Jerzy A
2014-01-01
The Mann-Whitney, Wilcoxon, Kruskal-Wallis and Friedman tests form a group of tests commonly used to analyze the results of clinical and laboratory data. These tests are considered to be extremely flexible, and their asymptotic relative efficiency exceeds 95 percent. Compared with the corresponding parametric tests, they do not require checking the fulfillment of conditions such as normality of the data distribution, homogeneity of variance, the lack of correlation between means and standard deviations, etc. They can be used with both interval and ordinal scales. The article presents an example with the Mann-Whitney test showing that treating the choice of these four nonparametric tests as a kind of gold standard does not in every case lead to correct inference.
A Kolmogorov-Smirnov test for the molecular clock based on Bayesian ensembles of phylogenies
Antoneli, Fernando; Passos, Fernando M.; Lopes, Luciano R.
2018-01-01
Divergence date estimates are central to understanding evolutionary processes and depend, in the case of molecular phylogenies, on tests of molecular clocks. Here we propose two non-parametric tests of strict and relaxed molecular clocks built upon a framework that uses the empirical cumulative distribution (ECD) of branch lengths obtained from an ensemble of Bayesian trees and the well-known non-parametric (one-sample and two-sample) Kolmogorov-Smirnov (KS) goodness-of-fit test. In the strict clock case, the method consists in using the one-sample KS test to directly test whether the phylogeny is clock-like, in other words, whether it follows a Poisson law. The ECD is computed from the discretized branch lengths, and the parameter λ of the expected Poisson distribution is calculated as the average branch length over the ensemble of trees. To compensate for auto-correlation in the ensemble of trees and for pseudo-replication, we take advantage of thinning and effective sample size, two features provided by Bayesian inference MCMC samplers. Finally, it is observed that tree topologies with very long or very short branches lead to Poisson mixtures, and in this case we propose the use of the two-sample KS test with samples from two continuous branch length distributions, one obtained from an ensemble of clock-constrained trees and the other from an ensemble of unconstrained trees. Moreover, in this second form the test can also be applied to relaxed clock models. The use of a statistically equivalent ensemble of phylogenies to obtain the branch length ECD, instead of one consensus tree, yields a considerable reduction of the effects of small sample size and provides a gain of power. PMID:29300759
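The two-sample form of the test can be sketched directly with SciPy on simulated branch-length samples; the rates below are illustrative, not the paper's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Branch lengths from an ensemble of clock-constrained trees:
# a single exponential rate, as a strict clock would imply
clock = rng.exponential(scale=1.0, size=400)

# Unconstrained ensemble: a mixture of very short and very long
# branches, the Poisson-mixture case described in the abstract
relaxed = np.concatenate([rng.exponential(0.1, 200),
                          rng.exponential(10.0, 200)])

ks = stats.ks_2samp(clock, relaxed)
print(ks.statistic, ks.pvalue)   # small p-value: the strict clock is rejected
```

The two-sample KS statistic is the largest gap between the two branch-length ECDs; the mixture pushes mass toward both tails, so the gap is large and the strict-clock hypothesis is rejected.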
Reference interval computation: which method (not) to choose?
Pavlov, Igor Y; Wilson, Andrew R; Delgado, Julio C
2012-07-11
When different methods are applied to reference interval (RI) calculation, the results can sometimes be substantially different, especially for small reference groups. If no reliable RI data are available, there is no way to confirm which method generates results closest to the true RI. We randomly drew samples from a public database for 33 markers. For each sample, RIs were calculated by bootstrapping, parametric, and Box-Cox transformed parametric methods. Results were compared to the values of the population RI. For approximately half of the 33 markers, the results of all 3 methods were within 3% of the true reference value. For the other markers, parametric results were either unavailable or deviated considerably from the true values. The transformed parametric method was more accurate than bootstrapping for a sample size of 60, very close to bootstrapping for a sample size of 120, but in some cases unavailable. We recommend against using untransformed parametric calculations to determine RIs. The transformed parametric method utilizing the Box-Cox transformation would be the preferable way of calculating RIs, provided the transformed data satisfy a normality test. If not, bootstrapping is always available and is almost as accurate and precise as the transformed parametric method. Copyright © 2012 Elsevier B.V. All rights reserved.
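The two recommended approaches can be sketched side by side on simulated data; the marker distribution and sample size below are illustrative only:

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(0)
sample = rng.lognormal(mean=1.0, sigma=0.4, size=120)   # a skewed marker

# Non-parametric bootstrap of the 2.5th/97.5th percentiles
boot = np.array([np.percentile(rng.choice(sample, sample.size), [2.5, 97.5])
                 for _ in range(2000)])
ri_boot = boot.mean(axis=0)

# Box-Cox transformed parametric RI: transform, take mean +/- 1.96 SD
# on the transformed scale, then back-transform the limits
z, lam = stats.boxcox(sample)
# the transformed method is only trusted if z passes a normality test
print("Shapiro p on transformed data:", round(stats.shapiro(z).pvalue, 3))
ri_param = inv_boxcox(z.mean() + np.array([-1.96, 1.96]) * z.std(ddof=1), lam)

print("bootstrap RI:", ri_boot, "  transformed parametric RI:", ri_param)
```

On well-behaved skewed data the two intervals land close together, matching the paper's finding that bootstrapping is almost as accurate as the transformed parametric method while never being "unavailable".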
Je, Yub; Lee, Haksue; Park, Jongkyu; Moon, Wonkyu
2010-06-01
An ultrasonic radiator is developed to generate a difference frequency sound from two frequencies of ultrasound in air with a parametric array. A design method is proposed for an ultrasonic radiator capable of generating highly directive, high-amplitude ultrasonic sound beams at two different frequencies in air based on a modification of the stepped-plate ultrasonic radiator. The stepped-plate ultrasonic radiator was introduced by Gallego-Juarez et al. [Ultrasonics 16, 267-271 (1978)] in their previous study and can effectively generate highly directive, large-amplitude ultrasonic sounds in air, but only at a single frequency. Because parametric array sources must be able to generate sounds at more than one frequency, a design modification is crucial to the application of a stepped-plate ultrasonic radiator as a parametric array source in air. The aforementioned method was employed to design a parametric radiator for use in air. A prototype of this design was constructed and tested to determine whether it could successfully generate a difference frequency sound with a parametric array. The results confirmed that the proposed single small-area transducer was suitable as a parametric radiator in air.
Direct adaptive robust tracking control for 6 DOF industrial robot with enhanced accuracy.
Yin, Xiuxing; Pan, Li
2018-01-01
A direct adaptive robust tracking control is proposed for trajectory tracking of a 6 DOF industrial robot in the presence of parametric uncertainties, external disturbances and uncertain nonlinearities. The controller is designed based on the dynamic characteristics in the working space of the end-effector of the 6 DOF robot. The controller includes a robust control term and a model compensation term that is developed directly from the input reference or desired motion trajectory. A projection-type parametric adaptation law is also designed to compensate for parametric estimation errors in the adaptive robust control. The feasibility and effectiveness of the proposed direct adaptive robust control law and the associated projection-type parametric adaptation law have been comparatively evaluated on two 6 DOF industrial robots. The test results demonstrate that the proposed control can better maintain the desired trajectory tracking, even in the presence of large parametric uncertainties and external disturbances, as compared with a PD controller and a nonlinear controller. The parametric estimates also eventually converge to the real values along with the convergence of the tracking errors, which further validates the effectiveness of the proposed parametric adaptation law. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Modeling personnel turnover in the parametric organization
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1991-01-01
A model is developed for simulating the dynamics of a newly formed organization, credible during all phases of organizational development. The model development process is broken down into the activities of determining the tasks required for parametric cost analysis (PCA), determining the skills required for each PCA task, determining the skills available in the applicant marketplace, determining the structure of the model, implementing the model, and testing it. The model, parameterized by the likelihood of job function transition, has demonstrated the capability to represent the transition of personnel across functional boundaries within a parametric organization using a linear dynamical system, and the ability to predict the staffing profiles required to meet functional needs at the desired time. The model can be extended by revising the state and transition structure to provide refinements in functional definition for the parametric and extended organization.
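The linear dynamical system view can be sketched as a simple transition of headcount between job functions; the function names and transition likelihoods below are invented for illustration:

```python
import numpy as np

# Hypothetical job functions and monthly transition likelihoods.
# Each column sums to 1: every person either stays in a role or moves.
functions = ["analyst", "engineer", "manager"]
A = np.array([[0.85, 0.05, 0.00],   # to analyst
              [0.10, 0.90, 0.05],   # to engineer
              [0.05, 0.05, 0.95]])  # to manager

x = np.array([20.0, 10.0, 2.0])     # initial staffing profile
for month in range(24):
    x = A @ x                       # linear dynamics: x_{t+1} = A x_t

print(np.round(x, 1), "total:", round(x.sum(), 1))  # headcount conserved
```

Iterating the transition matrix predicts the staffing profile at any future time, which is the model's stated capability; turnover would enter as an extra leakage column or as sub-stochastic columns.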
NASA Technical Reports Server (NTRS)
Stokes, R. L.
1979-01-01
Electrical characterization tests were performed on two different manufactured types of integrated circuits. The devices were subjected to functional and AC and DC parametric tests at ambient temperatures of -55 C, -20 C, 25 C, 85 C, and 125 C. The data were analyzed and tabulated to show the effect of operating conditions on performance and to indicate parameter deviations among devices in each group. Accuracy was given precedence over test time efficiency where practical, and tests were designed to measure worst case performance.
Yang, Hyeri; Na, Jihye; Jang, Won-Hee; Jung, Mi-Sook; Jeon, Jun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Lim, Kyung-Min; Bae, SeungJin
2015-05-05
The mouse local lymph node assay (LLNA, OECD TG429) is an alternative test replacing the conventional guinea pig tests (OECD TG406) for skin sensitization testing, but its use of a radioisotopic agent, (3)H-thymidine, deters its active dissemination. A new non-radioisotopic LLNA, LLNA:BrdU-FCM, employs a non-radioisotopic analog, 5-bromo-2'-deoxyuridine (BrdU), and flow cytometry. For an analogous method, the OECD TG429 performance standard (PS) advises that two reference compounds be tested repeatedly and that the ECt (threshold) values obtained fall within acceptable ranges to prove within- and between-laboratory reproducibility. However, these criteria are somewhat arbitrary and the sample size for ECt is less than 5, raising concerns about insufficient reliability. Here, we explored various statistical methods to evaluate the reproducibility of LLNA:BrdU-FCM with the stimulation index (SI), the raw data for ECt calculation, produced by 3 laboratories. Descriptive statistics along with graphical representation of SI are presented. For inferential statistics, parametric and non-parametric methods were applied to test the reproducibility of the SI of a concurrent positive control, and the robustness of the results was investigated. Descriptive statistics and graphical representation of SI alone could illustrate the within- and between-laboratory reproducibility. Inferential statistics employing parametric and nonparametric methods drew similar conclusions. While all labs passed the within- and between-laboratory reproducibility criteria given by the OECD TG429 PS based on ECt values, statistical evaluation based on SI values showed that only two labs succeeded in achieving within-laboratory reproducibility. For those two labs that satisfied within-lab reproducibility, between-laboratory reproducibility could also be attained based on inferential as well as descriptive statistics. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Lauritsen, Maj-Britt Glenn; Söderström, Margareta; Kreiner, Svend; Dørup, Jens; Lous, Jørgen
2016-01-01
We tested "the Galker test", a speech-reception-in-noise test developed for primary care for Danish preschool children, to explore whether the children's ability to hear and understand speech was associated with gender, age, middle ear status, and the level of background noise. The Galker test is a 35-item audio-visual, computerized word discrimination test in background noise. Included were 370 normally developed children attending day care centers. The children were examined with the Galker test, tympanometry, audiometry, and the Reynell test of verbal comprehension. Parents and day care teachers completed questionnaires on the children's ability to hear and understand speech. As most of the variables were not assessed on interval scales, non-parametric statistics (Goodman-Kruskal's gamma) were used for analyzing associations with the Galker test score. For comparisons, analysis of variance (ANOVA) was used. Interrelations were adjusted for using a non-parametric graphic model. In unadjusted analyses, the Galker test score was associated with gender, age group, language development (Reynell revised scale), audiometry, and tympanometry. The Galker score was also associated with the parents' and day care teachers' reports on the children's vocabulary, sentence construction, and pronunciation. Type B tympanograms were associated with a mean hearing level 5-6 dB below that of types A, C1, or C2. In the graphic analysis, Galker scores were closely and significantly related to Reynell test scores (gamma (G)=0.35), the children's age group (G=0.33), and the day care teachers' assessment of the children's vocabulary (G=0.26). The Galker test of speech reception in noise appears promising as an easy and quick tool for evaluating preschool children's understanding of spoken words in noise; it correlated well with the day care teachers' reports and less well with the parents' reports. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
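Goodman-Kruskal's gamma used in this analysis can be computed from concordant and discordant pairs; a minimal sketch:

```python
from itertools import combinations

def gk_gamma(x, y):
    """Goodman-Kruskal's gamma: (C - D) / (C + D) over all pairs,
    ignoring tied pairs -- suited to ordinal data such as rating scales."""
    c = d = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:       # concordant pair: both variables move the same way
            c += 1
        elif s < 0:     # discordant pair: they move in opposite directions
            d += 1
    return (c - d) / (c + d)

# Perfectly concordant ordinal scores (ties ignored) -> gamma = 1
print(gk_gamma([1, 2, 2, 3], [1, 1, 2, 3]))  # → 1.0
```

Because tied pairs are excluded from both numerator and denominator, gamma stays well defined for coarse ordinal scales such as questionnaire ratings, which is why it suits the day care teachers' assessments here.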
Development and fabrication of S-band chip varactor parametric amplifier
NASA Technical Reports Server (NTRS)
Kramer, E.
1974-01-01
A noncryogenic, S-band parametric amplifier operating in the 2.2 to 2.3 GHz band and having an average input noise temperature of less than 30 K was built and tested. The parametric amplifier module occupies a volume of less than 1-1/4 cubic feet and weighs less than 60 pounds. The module is designed for use in various NASA ground stations to replace larger, more complex cryogenic units which require considerably more maintenance because of the cryogenic refrigeration system employed. The amplifier can be located up to 15 feet from the power supply unit. Optimum performance was achieved through the use of high-quality unpackaged (chip) varactors in the amplifier design.
Parametric study of extended end-plate connection using finite element modeling
NASA Astrophysics Data System (ADS)
Mureşan, Ioana Cristina; Bâlc, Roxana
2017-07-01
End-plate connections with preloaded high-strength bolts represent a convenient, fast and accurate solution for beam-to-column joints. The behavior of framework joints built up with this type of connection is sensitively dependent on the geometrical and material characteristics of the connected elements. This paper presents results of parametric analyses of the behavior of a bolted extended end-plate connection using the finite element modeling program Abaqus. This connection was experimentally tested in the laboratory of the Faculty of Civil Engineering in Cluj-Napoca, and the results are briefly reviewed in this paper. The numerical model of the studied connection was described in detail in [1] and provides data for this parametric study.
Bayesian non-parametric inference for stochastic epidemic models using Gaussian Processes.
Xu, Xiaoguang; Kypraios, Theodore; O'Neill, Philip D
2016-10-01
This paper considers novel Bayesian non-parametric methods for stochastic epidemic models. Many standard modeling and data analysis methods use underlying assumptions (e.g. concerning the rate at which new cases of disease will occur) which are rarely challenged or tested in practice. To relax these assumptions, we develop a Bayesian non-parametric approach using Gaussian Processes, specifically to estimate the infection process. The methods are illustrated with both simulated and real data sets, the former illustrating that the methods can recover the true infection process quite well in practice, and the latter illustrating that the methods can be successfully applied in different settings. © The Author 2016. Published by Oxford University Press.
A Nonparametric Approach for Assessing Goodness-of-Fit of IRT Models in a Mixed Format Test
ERIC Educational Resources Information Center
Liang, Tie; Wells, Craig S.
2015-01-01
Investigating the fit of a parametric model plays a vital role in validating an item response theory (IRT) model. An area that has received little attention is the assessment of multiple IRT models used in a mixed-format test. The present study extends the nonparametric approach, proposed by Douglas and Cohen (2001), to assess model fit of three…
NASA Technical Reports Server (NTRS)
Van Dyke, Michael B.
2013-01-01
Present preliminary work using lumped parameter models to approximate dynamic response of electronic units to random vibration; Derive a general N-DOF model for application to electronic units; Illustrate parametric influence of model parameters; Implication of coupled dynamics for unit/board design; Demonstrate use of model to infer printed wiring board (PWB) dynamics from external chassis test measurement.
NASA Astrophysics Data System (ADS)
Kurpiel, Artur; Wysokowski, Adam
2015-03-01
The creep test under static loading is currently the most effective test for determining the rheological properties of asphalt mixtures from the creep curve. The applied loads are non-destructive and allow the course of the strain to be observed over time, including after unloading. The test can be carried out in compression, shear, tension and bending, as well as under triaxial conditions, depending on the apparatus used to apply the prescribed stress scheme [1, 2, 3, 4, 5, 6]. From the creep test, parameters based on different creep theories can be determined; particularly valuable are the rheological parameters derived from selected viscoelasticity models [1]. The parameters of these viscoelastic models are reliable indexes of a mixture's resistance to deformation and can be used to forecast rut depth in the adopted rheological model [1]. This article shows the impact of different rheological parameters of the viscoelastic model on rut depth, as well as their impact on the shape and course of the creep curve. The asphalt mixtures presented in this article are characterized by variable rheological parameters, so it is difficult to determine which parameter most affects the magnitude of the strain of a given mixture. With this in mind, the authors analyse how the strain of an asphalt mixture changes when one or two parameters are varied in the chosen rheological model, in this case the Burgers model.
Trend analysis of Arctic sea ice extent
NASA Astrophysics Data System (ADS)
Silva, M. E.; Barbosa, S. M.; Antunes, Luís; Rocha, Conceição
2009-04-01
The extent of Arctic sea ice is a fundamental parameter of Arctic climate variability. In the context of climate change, the area covered by ice in the Arctic is a particularly useful indicator of recent changes in the Arctic environment. Climate models are in near universal agreement that Arctic sea ice extent will decline through the 21st century as a consequence of global warming, and many studies predict an ice-free Arctic as soon as 2012. Time series of satellite passive microwave observations make it possible to assess the temporal changes in the extent of Arctic sea ice. Much of the analysis of the ice extent time series, as in most climate studies from observational data, has focused on the computation of deterministic linear trends by ordinary least squares. However, many different processes, including deterministic, unit root and long-range dependent processes, can engender trend-like features in a time series. Several parametric tests have been developed, mainly in econometrics, to discriminate between stationarity (no trend), deterministic trends and stochastic trends. Here, these tests are applied in the trend analysis of the sea ice extent time series available at the National Snow and Ice Data Center. The parametric stationarity tests, Augmented Dickey-Fuller (ADF), Phillips-Perron (PP) and KPSS, do not support an overall deterministic trend in the time series of Arctic sea ice extent. Therefore, alternative parametrizations such as long-range dependence should be considered for characterising long-term Arctic sea ice variability.
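The unit-root idea behind the ADF test mentioned above can be sketched with the simple (non-augmented) Dickey-Fuller regression; this is an illustration of the principle, not the study's analysis pipeline:

```python
import numpy as np

# Illustrative sketch of the simple Dickey-Fuller regression underlying
# the ADF unit-root test (not the study's code).  We regress the first
# difference dy_t on the lagged level y_{t-1} plus an intercept and
# report the t-statistic of the lagged-level coefficient; strongly
# negative values argue against a unit root, i.e. for stationarity.
def dickey_fuller_t(y):
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)                            # dy_t = y_t - y_{t-1}
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])  # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])          # t-stat on y_{t-1}
```

In practice one would compare the statistic against Dickey-Fuller critical values (or simply use `adfuller` and `kpss` from statsmodels, which also handle augmentation lags and trend terms).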
Chiral symmetry constraints on resonant amplitudes
NASA Astrophysics Data System (ADS)
Bruns, Peter C.; Mai, Maxim
2018-03-01
We discuss the impact of chiral symmetry constraints on the quark-mass dependence of meson resonance pole positions, which are encoded in non-perturbative parametrizations of meson scattering amplitudes. Model-independent conditions on such parametrizations are derived, which are shown to guarantee the correct functional form of the leading quark-mass corrections to the resonance pole positions. Some model amplitudes for ππ scattering, widely used for the determination of ρ and σ resonance properties from results of lattice simulations, are tested explicitly with respect to these conditions.
Estimating survival of radio-tagged birds
Bunck, C.M.; Pollock, K.H.; Lebreton, J.-D.; North, P.M.
1993-01-01
Parametric and nonparametric methods for estimating survival of radio-tagged birds are described. The general assumptions of these methods are reviewed. An estimate based on the assumption of constant survival throughout the period is emphasized in the overview of parametric methods. Two nonparametric methods, the Kaplan-Meier estimate of the survival function and the log rank test, are explained in detail. The link between these nonparametric methods and traditional capture-recapture models is discussed along with considerations in designing studies that use telemetry techniques to estimate survival.
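The Kaplan-Meier estimator described above can be sketched in a few lines of plain Python; this is an illustration of the product-limit idea, not the authors' software:

```python
# Minimal Kaplan-Meier (product-limit) estimator, sketched in plain
# Python to illustrate the nonparametric survival estimate described in
# the abstract (not the authors' code).  `times` are observation times
# and `events` flags death (1) versus censoring (0).
def kaplan_meier(times, events):
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []                          # (time, S(t)) at each event time
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, _ in data if tt == t)
        if deaths > 0:
            surv *= 1.0 - deaths / at_risk   # product-limit update
            curve.append((t, surv))
        at_risk -= removed              # deaths and censored both leave
        i += removed
    return curve
```

Censored observations (event flag 0) reduce the risk set without stepping the survival curve down, which is exactly how telemetry birds with lost signals are handled.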
NASA Astrophysics Data System (ADS)
Penkov, V. B.; Ivanychev, D. A.; Novikova, O. S.; Levina, L. V.
2018-03-01
The article substantiates the possibility of building full parametric analytical solutions of mathematical physics problems in arbitrary regions by means of computer systems. The suggested effective means for such solutions is the method of boundary states with perturbations, which aptly incorporates all parameters of an orthotropic medium in a general solution. We performed check calculations of elastic fields of an anisotropic rectangular region (test and calculation problems) for a generalized plane stress state.
Laser-Based Remote Sensing of Explosives by a Differential Absorption and Scattering Method
NASA Astrophysics Data System (ADS)
Ayrapetyan, V. S.
2018-01-01
A multifunctional IR parametric laser system is developed and tested for remote detection and identification of atmospheric gases, including explosive and chemically aggressive substances. Calculations and experimental studies of remote determination of the spectroscopic parameters of the best known explosive substances TNT, RDX, and PETN are carried out. The feasibility of high-sensitivity detection (~1 ppm) of these substances with the aid of a multifunctional IR parametric light source by differential absorption and scattering is demonstrated.
Parametric and nonparametric Granger causality testing: Linkages between international stock markets
NASA Astrophysics Data System (ADS)
De Gooijer, Jan G.; Sivarajasingham, Selliah
2008-04-01
This study investigates long-term linear and nonlinear causal linkages among eleven stock markets, six industrialized markets and five emerging markets of South-East Asia. We cover the period 1987-2006, taking into account the onset of the Asian financial crisis of 1997. We first apply a test for the presence of general nonlinearity in vector time series. Substantial differences exist between the pre- and post-crisis period in terms of the total number of significant nonlinear relationships. We then examine both periods, using a new nonparametric test for Granger noncausality and the conventional parametric Granger noncausality test. One major finding is that the Asian stock markets have become more internationally integrated after the Asian financial crisis. An exception is the Sri Lankan market with almost no significant long-term linear and nonlinear causal linkages with other markets. To ensure that any causality is strictly nonlinear in nature, we also examine the nonlinear causal relationships of VAR filtered residuals and VAR filtered squared residuals for the post-crisis sample. We find quite a few remaining significant bi- and uni-directional causal nonlinear relationships in these series. Finally, after filtering the VAR-residuals with GARCH-BEKK models, we show that the nonparametric test statistics are substantially smaller in both magnitude and statistical significance than those before filtering. This indicates that nonlinear causality can, to a large extent, be explained by simple volatility effects.
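The conventional parametric Granger noncausality test used above compares a restricted and an unrestricted autoregression by an F-statistic. A minimal single-lag sketch (an illustration of the idea, not the authors' implementation):

```python
import numpy as np

# Sketch of the linear, parametric Granger noncausality test for one
# lag, using plain least squares (illustration only).  We compare the
# restricted model y_t ~ y_{t-1} with the unrestricted model
# y_t ~ y_{t-1} + x_{t-1} via an F-statistic; a large F suggests that
# x Granger-causes y.
def granger_f(y, x):
    y, x = np.asarray(y, float), np.asarray(x, float)
    yt, ylag, xlag = y[1:], y[:-1], x[:-1]
    ones = np.ones_like(ylag)

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, yt, rcond=None)
        r = yt - X @ beta
        return r @ r

    rss_r = rss(np.column_stack([ones, ylag]))        # restricted
    rss_u = rss(np.column_stack([ones, ylag, xlag]))  # unrestricted
    n, q = len(yt), 1                                 # q = restrictions
    return (rss_r - rss_u) / q / (rss_u / (n - 3))
```

The statistic is referred to an F(1, n-3) distribution; multi-lag versions simply add more lagged columns and count the corresponding restrictions.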
Maharjan, Ashim; Wang, Eunice; Peng, Mei; Cakmak, Yusuf O.
2018-01-01
In past literature on animal models, invasive vagal nerve stimulation using high frequencies has shown to be effective at modulating the activity of the olfactory bulb (OB). Recent advances in invasive vagal nerve stimulation in humans, despite previous findings in animal models, used low frequency stimulation and found no effect on olfactory functioning. The present article aimed to test potential effects of non-invasive, high and low frequency vagal nerve stimulation in humans, with supplementary exploration of the orbitofrontal cortex using near-infrared spectroscopy (NIRS). Healthy, male adult participants (n = 18) performed two olfactory tests [odor threshold test (OTT) and supra-threshold test (STT)] before and after receiving high-, low frequency vagal nerve stimulation and placebo (no stimulation). Participants' olfactory functioning was monitored using NIRS, and assessed with two behavioral olfactory tests. NIRS data of separate stimulation parameters were statistically analyzed using repeated-measures ANOVA across different stages. Data from olfactory tests were analyzed using paired parametric and non-parametric statistical tests. Only high frequency, non-invasive vagal nerve stimulation was able to positively modulate the performance of the healthy participants in the STT (p = 0.021, Wilcoxon signed-rank test), with significant differences in NIRS (p = 0.014, post-hoc with Bonferroni correction) recordings of the right hemispheric, orbitofrontal cortex. The results from the current article call for further exploration of the neurocircuitry involved under vagal nerve stimulation and the effects of non-invasive, high frequency vagal nerve stimulation on olfactory dysfunction, which manifests in Parkinson's and Alzheimer's diseases. Despite the sufficient effect size of the current study (moderate effect, correlation coefficient (r): 0.39 for the STT), future research should replicate the current findings with a larger cohort. PMID:29740266
Non‐parametric combination and related permutation tests for neuroimaging
Webster, Matthew A.; Brooks, Jonathan C.; Tracey, Irene; Smith, Stephen M.; Nichols, Thomas E.
2016-01-01
Abstract In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well‐known definition of union‐intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume‐based representations of the brain, including non‐imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non‐parametric combination (NPC) methodology, such that instead of a two‐phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one‐way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. Hum Brain Mapp 37:1486‐1511, 2016. © 2016 Wiley Periodicals, Inc. PMID:26848101
Functional form diagnostics for Cox's proportional hazards model.
León, Larry F; Tsai, Chih-Ling
2004-03-01
We propose a new type of residual and an easily computed functional form test for the Cox proportional hazards model. The proposed test is a modification of the omnibus test for testing the overall fit of a parametric regression model, developed by Stute, González Manteiga, and Presedo Quindimil (1998, Journal of the American Statistical Association 93, 141-149), and is based on what we call censoring consistent residuals. In addition, we develop residual plots that can be used to identify the correct functional forms of covariates. We compare our test with the functional form test of Lin, Wei, and Ying (1993, Biometrika 80, 557-572) in a simulation study. The practical application of the proposed residuals and functional form test is illustrated using both a simulated data set and a real data set.
NASA Astrophysics Data System (ADS)
Riera-Palou, Felip; den Brinker, Albertus C.
2007-12-01
This paper introduces a new audio and speech broadband coding technique based on the combination of a pulse excitation coder and a standardized parametric coder, namely, MPEG-4 high-quality parametric coder. After presenting a series of enhancements to regular pulse excitation (RPE) to make it suitable for the modeling of broadband signals, it is shown how pulse and parametric codings complement each other and how they can be merged to yield a layered bit stream scalable coder able to operate at different points in the quality bit rate plane. The performance of the proposed coder is evaluated in a listening test. The major result is that the extra functionality of the bit stream scalability does not come at the price of a reduced performance since the coder is competitive with standardized coders (MP3, AAC, SSC).
Coupled oscillators in identification of nonlinear damping of a real parametric pendulum
NASA Astrophysics Data System (ADS)
Olejnik, Paweł; Awrejcewicz, Jan
2018-01-01
A damped parametric pendulum with friction is identified twice by means of its precise and imprecise mathematical model. A laboratory test stand designed for experimental investigations of nonlinear effects determined by a viscous resistance and the stick-slip phenomenon serves as the model mechanical system. An influence of accurateness of mathematical modeling on the time variability of the nonlinear damping coefficient of the oscillator is proved. A free decay response of a precisely and imprecisely modeled physical pendulum is dependent on two different time-varying coefficients of damping. The coefficients of the analyzed parametric oscillator are identified with the use of a new semi-empirical method based on a coupled oscillators approach, utilizing the fractional order derivative of the discrete measurement series treated as an input to the numerical model. Results of application of the proposed method of identification of the nonlinear coefficients of the damped parametric oscillator have been illustrated and extensively discussed.
The measurement of acoustic properties of limited size panels by use of a parametric source
NASA Astrophysics Data System (ADS)
Humphrey, V. F.
1985-01-01
A method of measuring the acoustic properties of limited size panels immersed in water, with a truncated parametric array used as the acoustic source, is described. The insertion loss and reflection loss of thin metallic panels, typically 0.45 m square, were measured at normal incidence by using this technique. Results were obtained for a wide range of frequencies (10 to 100 kHz) and were found to be in good agreement with the theoretical predictions for plane waves. Measurements were also made of the insertion loss of aluminium, Perspex and G.R.P. panels for angles of incidence up to 50°. The broad bandwidth available from the parametric source permitted detailed measurements to be made over a wide frequency range using a single transmitting transducer. The small spot sizes obtainable with the parametric source also helped to reduce the significance of diffraction from edges of the panel under test.
Transformation-invariant and nonparametric monotone smooth estimation of ROC curves.
Du, Pang; Tang, Liansheng
2009-01-30
When a new diagnostic test is developed, it is of interest to evaluate its accuracy in distinguishing diseased subjects from non-diseased subjects. The accuracy of the test is often evaluated by receiver operating characteristic (ROC) curves. Smooth ROC estimates are often preferable for continuous test results when the underlying ROC curves are in fact continuous. Nonparametric and parametric methods have been proposed by various authors to obtain smooth ROC curve estimates. However, there are certain drawbacks with the existing methods. Parametric methods need specific model assumptions. Nonparametric methods do not always satisfy the inherent properties of the ROC curves, such as monotonicity and transformation invariance. In this paper we propose a monotone spline approach to obtain smooth monotone ROC curves. Our method ensures important inherent properties of the underlying ROC curves, which include monotonicity, transformation invariance, and boundary constraints. We compare the finite sample performance of the newly proposed ROC method with other ROC smoothing methods in large-scale simulation studies. We illustrate our method through a real life example. Copyright (c) 2008 John Wiley & Sons, Ltd.
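The empirical (non-smooth) ROC curve that smooth estimators such as the monotone splines above refine can be sketched directly from the two samples of test results; this is an illustration of the construction, not the authors' method:

```python
import numpy as np

# Sketch of the empirical ROC curve (illustration only).  For each
# threshold c, FPR(c) = P(X > c | non-diseased) and
# TPR(c) = P(Y > c | diseased); the ROC plots TPR against FPR.
# Because only the ordering of the data is used, this construction is
# invariant under monotone transformations of the test results.
def empirical_roc(healthy, diseased):
    healthy = np.asarray(healthy, float)
    diseased = np.asarray(diseased, float)
    thresholds = np.unique(np.concatenate([healthy, diseased]))
    # Scan thresholds from high to low so FPR increases along the curve.
    fpr = [(healthy > c).mean() for c in thresholds[::-1]]
    tpr = [(diseased > c).mean() for c in thresholds[::-1]]
    # Anchor the curve at (0, 0) and (1, 1).
    return (np.concatenate([[0.0], fpr, [1.0]]),
            np.concatenate([[0.0], tpr, [1.0]]))
```

Both coordinate sequences are non-decreasing by construction, which is the monotonicity property the abstract notes some nonparametric smoothers fail to preserve.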
NASA Technical Reports Server (NTRS)
Greathouse, James S.; Schwing, Alan M.
2015-01-01
This paper explores the use of computational fluid dynamics to study the effect of geometric porosity on static stability and drag for NASA's Multi-Purpose Crew Vehicle main parachute. Both of these aerodynamic characteristics are of interest in parachute design, and computational methods promise designers the ability to perform detailed parametric studies and other design iterations with a level of control previously unobtainable using ground or flight testing. The approach presented here uses a canopy structural analysis code to define the inflated parachute shapes on which structured computational grids are generated. These grids are used by the computational fluid dynamics code OVERFLOW and are modeled as rigid, impermeable bodies for this analysis. Comparisons to Apollo drop test data are shown as preliminary validation of the technique. Results include several parametric sweeps through design variables in order to better understand the trade between static stability and drag. Finally, designs that maximize static stability with a minimal loss in drag are suggested for further study in subscale ground and flight testing.
NASA Technical Reports Server (NTRS)
Holland, Scott D.
1993-01-01
Three-dimensional sidewall-compression scramjet inlets with leading-edge sweeps of 30 deg and 70 deg were tested in the Langley Hypersonic CF4 Tunnel at a Mach number of 6 and a free-stream ratio of specific heats of 1.2. The parametric effects of leading-edge sweep, cowl position, contraction ratio, and Reynolds number were investigated. The models were instrumented with static pressure orifices distributed on the sidewalls, baseplate, and cowl. Schlieren movies were made of selected tunnel runs for flow visualization of the entrance plane and cowl region. Although these movies could not show the internal flow, the effect of the internal flow on the external flow was evident by way of spillage. The purpose is to provide a preliminary data release for the investigation. The models, facility, and testing methods are described, and the test matrix and a tabulation of tunnel runs are provided. Line plots highlighting the stated parametric effects and a representative set of schlieren photographs are presented without analysis.
Strong stabilization servo controller with optimization of performance criteria.
Sarjaš, Andrej; Svečko, Rajko; Chowdhury, Amor
2011-07-01
Synthesis of a simple robust controller with a pole placement technique and an H∞ metric is the method used for control of a servo mechanism with BLDC and BDC electric motors. The method includes solving a polynomial equation on the basis of the chosen characteristic polynomial using the Manabe standard polynomial form and parametric solutions. Parametric solutions are introduced directly into the structure of the servo controller. On the basis of the chosen parametric solutions, the robustness of the closed-loop system is assessed through uncertainty models and assessment of the norm ‖·‖∞. The design procedure and the optimization are performed with a genetic algorithm, differential evolution (DE). The DE optimization method determines a suboptimal solution throughout the optimization on the basis of a spectrally square polynomial and Šiljak's absolute stability test. The stability of the designed controller during the optimization is checked with Lipatov's stability condition. Both utilized approaches, Šiljak's test and Lipatov's condition, check the robustness and stability characteristics on the basis of the polynomial's coefficients, and are very convenient for automated design of closed-loop control and for application in optimization algorithms such as DE. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
Cai, Li
2006-02-01
A permutation test typically requires fewer assumptions than does a comparable parametric counterpart. The multi-response permutation procedure (MRPP) is a class of multivariate permutation tests of group difference useful for the analysis of experimental data. However, psychologists seldom make use of the MRPP in data analysis, in part because the MRPP is not implemented in popular statistical packages that psychologists use. A set of SPSS macros implementing the MRPP test is provided in this article. The use of the macros is illustrated by analyzing example data sets.
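The article above provides SPSS macros; the MRPP idea itself can be re-expressed in a short Python sketch (illustration only, not the article's macros). The statistic delta is the size-weighted mean of the within-group average pairwise distances, and a small observed delta relative to its permutation distribution indicates group separation:

```python
import numpy as np

# Illustrative sketch of the multi-response permutation procedure
# (MRPP), not the article's SPSS macros.  delta = sum over groups of
# (n_g / N) * mean within-group pairwise Euclidean distance; group
# labels are permuted to build the null distribution of delta.
def mrpp(groups, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    data = np.vstack(groups)                 # (N, p) multivariate responses
    sizes = [len(g) for g in groups]
    n_total = sum(sizes)

    def delta(x):
        d, start = 0.0, 0
        for n_g in sizes:
            g = x[start:start + n_g]
            diffs = g[:, None, :] - g[None, :, :]
            dist = np.sqrt((diffs ** 2).sum(-1))
            d += (n_g / n_total) * dist[np.triu_indices(n_g, 1)].mean()
            start += n_g
        return d

    obs = delta(data)
    # Permutation p-value: how often a random relabeling gives an
    # equally small or smaller within-group distance.
    count = sum(delta(data[rng.permutation(n_total)]) <= obs
                for _ in range(n_perm))
    return obs, (1 + count) / (n_perm + 1)
```

Because only distances are used, no multivariate normality or equal-covariance assumption is needed, which is the advantage over MANOVA-type parametric counterparts noted in the abstract.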
Some space shuttle tile/strain-isolator-pad sinusoidal vibration tests
NASA Technical Reports Server (NTRS)
Miserentino, R.; Pinson, L. D.; Leadbetter, S. A.
1980-01-01
Vibration tests were performed on the tile/strain-isolator-pad system used as thermal protection for the space shuttle orbiter. Experimental data on normal and in-plane vibration response and damping properties are presented. Three test specimens exhibited shear type motion during failures that occurred in the tile near the tile/strain-isolator-pad bond-line. A dynamic instability is described which has large in-plane motion at a frequency one-half that of the nominal driving frequency. Analysis shows that this phenomenon is a parametric response.
Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-06-01
Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is a viable approach for EIT images of neural activity.
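The non-parametric validation mentioned above is typically done with a max-statistic permutation correction, which controls the family-wise error across all voxels without smoothness assumptions. A sketch of that idea for a one-sample design (illustration only, not the study's pipeline):

```python
import numpy as np

# Sketch of non-parametric family-wise error correction via the
# max-statistic permutation method (illustration, not the study's
# code).  Sign-flipping subject images under the null and recording the
# maximum voxel statistic per permutation yields FWE-corrected p-values
# across all voxels simultaneously.
def max_stat_fwe(images, n_perm=999, seed=0):
    images = np.asarray(images, float)      # (n_subjects, n_voxels)
    rng = np.random.default_rng(seed)
    n = images.shape[0]

    def tmap(x):                            # one-sample t per voxel
        return x.mean(0) / (x.std(0, ddof=1) / np.sqrt(n))

    obs = tmap(images)
    max_null = np.empty(n_perm)
    for b in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n, 1))
        max_null[b] = tmap(images * signs).max()
    # Corrected p-value per voxel: rank of its t in the max-null.
    p_fwe = (1 + (max_null[:, None] >= obs[None, :]).sum(0)) / (n_perm + 1)
    return obs, p_fwe
```

Random field theory approximates the same max-statistic distribution analytically under smoothness assumptions, which is why the two approaches can be compared voxel for voxel as in the abstract.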
Preprototype nitrogen supply subsystem development
NASA Technical Reports Server (NTRS)
Heppner, D. B.; Fort, J. H.; Schubert, F. H.
1982-01-01
The design and development of a test stand for the Nitrogen Generation Module (NGM), and a series of tests which verified its operation and performance capability, are described. Over 900 hours of parametric testing were achieved. The results from this testing were then used to design an advanced NGM and a self-contained, preprototype Nitrogen Supply Subsystem. The subsystem consists of three major components (nitrogen generation module, pressure controller, and hydrazine storage tank) plus ancillary components. The most important improvement is the elimination of all sealing surfaces, achieved with an all-welded or brazed construction. Additionally, performance was improved by increasing hydrogen separating capability by 20% with no increase in overall packaging size.
Experimental tests of relativistic gravitation theories
NASA Technical Reports Server (NTRS)
Anderson, J. D.
1971-01-01
Experimental tests were studied for determining the potential uses of future deep space missions in studies of relativistic gravity. Extensions to the parametrized post-Newtonian framework that take explicit account of the motion of the solar system's center of mass relative to the mean rest frame of the Universe are reported. Discoveries reported include the Machian effects of motion relative to the universal rest frame. Summaries of the JPL research are included.
NASA Astrophysics Data System (ADS)
Stark, Dominic; Launet, Barthelemy; Schawinski, Kevin; Zhang, Ce; Koss, Michael; Turp, M. Dennis; Sartori, Lia F.; Zhang, Hantian; Chen, Yiru; Weigel, Anna K.
2018-06-01
The study of unobscured active galactic nuclei (AGN) and quasars depends on the reliable decomposition of the light from the AGN point source and the extended host galaxy light. The problem is typically approached with parametric fitting routines that use separate models for the host galaxy and the point spread function (PSF). We present a new approach using a Generative Adversarial Network (GAN) trained on galaxy images. We test the method using Sloan Digital Sky Survey r-band images with artificial AGN point sources added, which are then removed using the GAN and with parametric methods using GALFIT. When the AGN point source is more than twice as bright as the host galaxy, we find that our method, PSFGAN, can recover point source and host galaxy magnitudes with smaller systematic error and a lower average scatter (49 per cent). PSFGAN is more tolerant of poor knowledge of the PSF than parametric methods. Our tests show that PSFGAN is robust against a broadening of the PSF width of ± 50 per cent if it is trained on multiple PSFs. We demonstrate that while a matched training set does improve performance, we can still subtract point sources using a PSFGAN trained on non-astronomical images. While initial training is computationally expensive, evaluating PSFGAN on data is more than 40 times faster than GALFIT fitting two components. Finally, PSFGAN is more robust and easier to use than parametric methods as it requires no input parameters.
Chaotic map clustering algorithm for EEG analysis
NASA Astrophysics Data System (ADS)
Bellotti, R.; De Carlo, F.; Stramaglia, S.
2004-03-01
The non-parametric chaotic map clustering algorithm has been applied to the analysis of electroencephalographic signals in order to recognize Huntington's disease, one of the most dangerous pathologies of the central nervous system. The performance of the method has been compared with that obtained through parametric algorithms, such as K-means and deterministic annealing, and a supervised multi-layer perceptron. While supervised neural networks need a training phase, performed by means of data tagged by the genetic test, and the parametric methods require a prior choice of the number of classes to find, chaotic map clustering gives natural evidence of the pathological class, without any training or supervision, thus providing a new, efficient methodology for recognizing patterns affected by Huntington's disease.
Theory and experiment in gravitational physics
NASA Technical Reports Server (NTRS)
Will, C. M.
1981-01-01
New technological advances have made it feasible to conduct measurements with precision levels which are suitable for experimental tests of the theory of general relativity. This book has been designed to fill the need for a complete treatment of techniques for analyzing gravitation theory and experiment. The Einstein equivalence principle and the foundations of gravitation theory are considered, taking into account the Dicke framework, basic criteria for the viability of a gravitation theory, experimental tests of the Einstein equivalence principle, Schiff's conjecture, and a model theory devised by Lightman and Lee (1973). Gravitation as a geometric phenomenon is considered along with the parametrized post-Newtonian formalism, the classical tests, tests of the strong equivalence principle, gravitational radiation as a tool for testing relativistic gravity, the binary pulsar, and cosmological tests.
Theory and experiment in gravitational physics
NASA Astrophysics Data System (ADS)
Will, C. M.
New technological advances have made it feasible to conduct measurements with precision levels which are suitable for experimental tests of the theory of general relativity. This book has been designed to fill the need for a complete treatment of techniques for analyzing gravitation theory and experiment. The Einstein equivalence principle and the foundations of gravitation theory are considered, taking into account the Dicke framework, basic criteria for the viability of a gravitation theory, experimental tests of the Einstein equivalence principle, Schiff's conjecture, and a model theory devised by Lightman and Lee (1973). Gravitation as a geometric phenomenon is considered along with the parametrized post-Newtonian formalism, the classical tests, tests of the strong equivalence principle, gravitational radiation as a tool for testing relativistic gravity, the binary pulsar, and cosmological tests.
Privacy-preserving Kruskal-Wallis test.
Guo, Suxin; Zhong, Sheng; Zhang, Aidong
2013-10-01
Statistical tests are powerful tools for data analysis. The Kruskal-Wallis test is a non-parametric statistical test that evaluates whether two or more samples are drawn from the same distribution, and it is commonly used in various areas. Sometimes, however, its use is impeded by privacy issues raised in fields such as biomedical research and clinical data analysis because of the confidential information contained in the data. In this work, we give a privacy-preserving solution for the Kruskal-Wallis test which enables two or more parties to jointly perform the test on the union of their data without compromising data privacy. To the best of our knowledge, this is the first work that solves the privacy issues in the use of the Kruskal-Wallis test on distributed data. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
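As a reminder of the underlying (non-private) computation, the Kruskal-Wallis H statistic compares average ranks across groups. A minimal sketch, omitting the usual tie correction:

```python
from itertools import chain

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction).
    H = 12 / (n(n+1)) * sum_i n_i * (rbar_i - (n+1)/2)^2,
    where rbar_i is the mean rank of group i over the pooled sample."""
    data = list(chain.from_iterable(groups))
    n = len(data)
    # Assign 1-based ranks, giving tied values their midrank.
    order = sorted(range(n), key=lambda i: data[i])
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and data[order[j + 1]] == data[order[i]]:
            j += 1
        midrank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = midrank
        i = j + 1
    # Sum the squared deviations of group mean ranks from the grand mean rank.
    h = 0.0
    start = 0
    for g in groups:
        gi = len(g)
        rbar = sum(ranks[start:start + gi]) / gi
        h += gi * (rbar - (n + 1) / 2) ** 2
        start += gi
    return 12.0 / (n * (n + 1)) * h
```

Under the null hypothesis, H is approximately chi-squared distributed with (number of groups − 1) degrees of freedom; the privacy-preserving protocol in the paper computes this same quantity without any party revealing its raw data.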
Chen, Qi; Chen, Quan; Luo, Xiaobing
2014-09-01
In recent years, due to the fast development of high-power light-emitting diodes (LEDs), their lifetime prediction and assessment have become a crucial issue. Although in situ measurement has been widely used for reliability testing in the laser diode community, it has not been applied commonly in the LED community. In this paper, an online testing method for LED life projection under accelerated reliability testing was proposed and a prototype was built. Optical parametric data were collected. The systematic error and the measuring uncertainty were calculated to be within 0.2% and 2%, respectively. With this online testing method, experimental data can be acquired continuously and a sufficient amount of data can be gathered. Thus, the projection fitting accuracy can be improved (r(2) = 0.954) and the testing duration can be shortened.
Shi, Yang; Chinnaiyan, Arul M; Jiang, Hui
2015-07-01
High-throughput sequencing of transcriptomes (RNA-Seq) has become a powerful tool to study gene expression. Here we present an R package, rSeqNP, which implements a non-parametric approach to test for differential expression and splicing from RNA-Seq data. rSeqNP uses permutation tests to assess statistical significance and can be applied to a variety of experimental designs. By combining information across isoforms, rSeqNP is able to detect more differentially expressed or spliced genes from RNA-Seq data. The R package, with its source code and documentation, is freely available at http://www-personal.umich.edu/∼jianghui/rseqnp/. jianghui@umich.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
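The permutation idea that rSeqNP relies on can be illustrated with a generic two-sample difference-of-means test. This is a sketch of the general technique, not the package's actual statistic:

```python
import random

def permutation_pvalue(x, y, n_perm=10000, seed=0):
    """Two-sample permutation test on the absolute difference of means.
    The p-value is the fraction of random label shuffles whose statistic
    is at least as extreme as the observed one (+1 correction so the
    estimate is never exactly zero)."""
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    nx = len(x)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # randomly reassign group labels
        diff = abs(sum(pooled[:nx]) / nx
                   - sum(pooled[nx:]) / (len(pooled) - nx))
        if diff >= observed - 1e-12:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

Because the null distribution is built from the data itself, no parametric assumption about the expression values is needed, which is what makes this approach attractive for RNA-Seq designs.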
Parametric Testing of Chevrons on Single Flow Hot Jets
NASA Technical Reports Server (NTRS)
Bridges, James; Brown, Clifford A.
2004-01-01
A parametric family of chevron nozzles has been studied to find relationships between chevron geometric parameters, flow characteristics, and far-field noise. Both cold and hot conditions were run at an acoustic Mach number of 0.9. Ten models were tested, varying chevron count, penetration, length, and symmetry. Four comparative studies were defined from these datasets, which show: that chevron length does not have a major impact on either flow or sound; that chevron penetration increases noise at high frequency and lowers it at low frequency, especially for low chevron counts; that chevron count is a strong player, with good low-frequency reductions achieved at high chevron count without a strong high-frequency penalty; and that chevron asymmetry slightly reduces the impact of the chevrons. Finally, it is shown that although the hot jets differ systematically from the cold ones, the overall trends with chevron parameters are the same.
NASA Technical Reports Server (NTRS)
To, Wing H.
2005-01-01
Quantum optical experiments require all the components involved to be extremely stable relative to each other. The stability can be "measured" using an interferometric experiment. A pair of coherent photons produced by parametric down-conversion can be chosen to be orthogonally polarized initially. By rotating the polarization of one of the wave packets, the two can be recombined at a beam splitter such that interference will occur. Theoretically, the interference creates four terms in the wave function: two terms with both photons going to the same detector, and two terms with the photons going to different detectors. However, the latter cancel each other out, so no photons arrive at the two detectors simultaneously under ideal conditions. The stability of the test-bed can then be inferred from the dependence of the coincidence count on the rotation angle.
On-Line Robust Modal Stability Prediction using Wavelet Processing
NASA Technical Reports Server (NTRS)
Brenner, Martin J.; Lind, Rick
1998-01-01
Wavelet analysis for filtering and system identification has been used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins is reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data is used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability are also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. The F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrates improved robust stability prediction by extension of the stability boundary beyond the flight regime. Guidelines and computation times are presented to show the efficiency and practical aspects of these procedures for on-line implementation. Feasibility of the method is shown for processing flight data from time-varying nonstationary test points.
Harsha, Madhavareddy Sri; Praffulla, Mynampati; Babu, Mandava Ramesh; Leneena, Gudugunta; Krishna, Tejavath Sai; Divya, G
2017-05-01
Cavity preparations of posterior teeth have frequently been associated with decreased fracture strength of the teeth. Choosing the correct indirect restoration and cavity design when restoring posterior teeth, i.e., premolars, is difficult as it involves aesthetic, biomechanical and anatomical considerations. The aim was to evaluate the fracture resistance and failure pattern of three different cavity designs restored with monolithic zirconia. Human maxillary premolars atraumatically extracted for orthodontic reasons were chosen. A total of 40 teeth were selected and divided into four groups (n=10): Group I, sound teeth (control with no preparation); Group II, MOD inlay; Group III, partial onlay; Group IV, complete onlay. Restorations were fabricated with monolithic partially sintered zirconia CAD blocks (SAGEMAX NexxZr). All 30 restored samples were cemented using Multilink Automix (Ivoclar) and subjected to fracture resistance testing in a Universal Testing Machine (UTM, Instron) with a 3.5 mm diameter steel ball at a crosshead speed of 0.5 mm/minute. A stereomicroscope was used to evaluate the modes of failure of the fractured specimens. Fracture resistance was tested using a parametric one-way ANOVA, unpaired t-test and Tukey test. Fracture patterns were assessed using a non-parametric Chi-square test. Group IV (complete onlay) presented the highest fracture resistance and showed a statistically significant difference. Group II (MOD inlay) and Group III (partial onlay) showed significantly lower values than Group IV, while Groups I, II and III presented no significant difference from each other. Regarding the modes of failure, Group II (MOD inlay) and Group III (partial onlay) presented mixed types of failure; Group IV (complete onlay) demonstrated 70% Type I failures. Of the three cavity designs evaluated, the complete onlay showed a significant increase in fracture resistance over the sound teeth.
Levodopa-Induced Changes in Electromyographic Patterns in Patients with Advanced Parkinson’s Disease
Ruonala, Verneri; Pekkonen, Eero; Airaksinen, Olavi; Kankaanpää, Markku; Karjalainen, Pasi A; Rissanen, Saara M
2018-01-01
Levodopa medication is the most efficient treatment for motor symptoms of Parkinson’s disease (PD). Levodopa significantly alleviates rigidity, rest tremor, and bradykinesia in PD. The severity of motor symptoms can be graded with the UPDRS-III scale. The levodopa challenge test is routinely used to assess patients’ eligibility for deep-brain stimulation (DBS) in PD. Feasible and objective measurements to assess motor symptoms of PD during the levodopa challenge test would be helpful in unifying the treatment. Twelve patients with advanced PD who were candidates for DBS treatment were recruited to the study. Measurements were done in four phases before and after the levodopa challenge test. Rest tremor and rigidity were evaluated using the UPDRS-III score. Electromyographic (EMG) signals from biceps brachii and kinematic signals from the forearm were recorded with a wireless measurement setup. The patients performed two different tasks: arm isometric tension and arm passive flexion–extension. The electromyographic and kinematic signals were analyzed with parametric, principal component, and spectrum-based approaches. The principal component approach for isometric tension EMG signals showed a significant decline in characteristics related to PD during the levodopa challenge test. The spectral approach on passive flexion–extension EMG signals showed a significant decrease in involuntary muscle activity during the levodopa challenge test. Both effects were stronger during the levodopa challenge test than with the patients’ personal medication. There were no significant changes in the parametric approach for EMG and kinematic signals during the measurement. The results show that a wireless, wearable measurement and analysis setup can be used to study the effect of levodopa medication in advanced Parkinson’s disease. PMID:29459845
Mapping the Chevallier-Polarski-Linder parametrization onto physical dark energy Models
NASA Astrophysics Data System (ADS)
Scherrer, Robert J.
2015-08-01
We examine the Chevallier-Polarski-Linder (CPL) parametrization, in the context of quintessence and barotropic dark energy models, to determine the subset of such models to which it can provide a good fit. The CPL parametrization gives the equation of state parameter w for the dark energy as a linear function of the scale factor a, namely w = w0 + wa(1 - a). In the case of quintessence models, we find that over most of the w0, wa parameter space the CPL parametrization maps onto a fairly narrow form of behavior for the potential V(ϕ), while a one-dimensional subset of parameter space, for which wa = κ(1 + w0) with κ constant, corresponds to a wide range of functional forms for V(ϕ). For barotropic models, we show that the functional dependence of the pressure on the density, up to a multiplicative constant, depends only on wi = wa + w0 and not on w0 and wa separately. Our results suggest that the CPL parametrization may not be optimal for testing either type of model.
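For reference, the CPL equation of state and the resulting dark-energy density evolution follow from the standard analytic integral of the continuity equation (a textbook result, not code from the paper):

```python
import math

def w_cpl(a, w0, wa):
    """CPL equation-of-state parameter: w(a) = w0 + wa * (1 - a)."""
    return w0 + wa * (1.0 - a)

def rho_de_ratio(a, w0, wa):
    """Dark-energy density relative to today, from
    rho(a)/rho_0 = exp(3 * integral_a^1 (1 + w(a'))/a' da'),
    which for CPL integrates to a^(-3(1+w0+wa)) * exp(-3 wa (1-a))."""
    return a ** (-3.0 * (1.0 + w0 + wa)) * math.exp(-3.0 * wa * (1.0 - a))
```

Setting w0 = -1 and wa = 0 recovers a cosmological constant, for which the density ratio is identically 1 at every scale factor, a useful sanity check on the expression.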
Latent component-based gear tooth fault detection filter using advanced parametric modeling
NASA Astrophysics Data System (ADS)
Ettefagh, M. M.; Sadeghi, M. H.; Rezaee, M.; Chitsaz, S.
2009-10-01
In this paper, a new parametric model-based filter is proposed for gear tooth fault detection. Designing the filter consists of identifying the most proper latent component (LC) of the undamaged gearbox signal by analyzing the instant modules (IMs) and instant frequencies (IFs), and then using the component with the lowest IM as the filter output for detecting faults in the gearbox. The filter parameters are estimated using LC theory, in which an advanced parametric modeling method has been implemented. The proposed method is applied to signals extracted from a simulated gearbox for detection of simulated gear faults. In addition, the method is used for quality inspection of the produced Nissan-Junior vehicle gearbox by gear profile error detection on an industrial test bed. For evaluation purposes, the proposed method is compared with the previous parametric TAR/AR-based filters, in which the parametric model residual is considered the filter output and Yule-Walker and Kalman filters are implemented for estimating the parameters. The results confirm the high performance of the new fault detection method.
Software for Estimating Costs of Testing Rocket Engines
NASA Technical Reports Server (NTRS)
Hines, Merlon M.
2004-01-01
A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
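The polynomial labor-time relation described above is a simple polynomial evaluation; the sketch below uses Horner's rule with placeholder coefficients (the CEM's actual fitted coefficients are not given here, so `labor_coeffs` is purely illustrative):

```python
def horner(coeffs, x):
    """Evaluate a polynomial given coefficients ordered from the highest
    degree down to the constant term."""
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c
    return acc

# Hypothetical 3rd-order fit of labor hours versus thrust level (klbf);
# these numbers are placeholders, not values from the CEM.
labor_coeffs = [0.002, -0.15, 6.0, 40.0]

def labor_hours(thrust_klbf):
    return horner(labor_coeffs, thrust_klbf)
```

A spreadsheet model like the CEM would evaluate one such polynomial per labor category and roll the results up through the work-breakdown structure.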
Software for Estimating Costs of Testing Rocket Engines
NASA Technical Reports Server (NTRS)
Hines, Merion M.
2002-01-01
A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
Software for Estimating Costs of Testing Rocket Engines
NASA Technical Reports Server (NTRS)
Hines, Merlon M.
2003-01-01
A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
NASA Technical Reports Server (NTRS)
Huddleston, J. D.; Aylward, J. R.
1973-01-01
The investigations and testing associated with the CO2 removal efficiency and voltage degradation of a hydrogen depolarized carbon dioxide concentrator (HDC) are reported. Also discussed is the vibration testing of a water vapor electrolysis cell pair. Performance testing of various HDC cell pairs with Cs2CO3 electrolyte provided sufficient parametric and endurance data to size a six-man space station prototype CO2 removal system at 36 HDC cell pairs, and to verify a life capability exceeding six months. Testing also demonstrated that tetramethylammonium carbonate is an acceptable HDC electrolyte for operation over a relative humidity range of 30 to 90 percent and a temperature range of 50 to 80 F.
Survey Of High Speed Test Techniques
NASA Astrophysics Data System (ADS)
Gheewala, Tushar
1988-02-01
The emerging technologies for the characterization and production testing of high-speed devices and integrated circuits are reviewed. Continuing progress in semiconductor technologies will, in the near future, demand techniques to test 10 ps to 100 ps gate delays, 10 GHz to 100 GHz analog functions, and 10,000 to 100,000 gates on a single chip. Clearly, no single test technique can provide a cost-effective answer to all of the above demands; a divide-and-conquer approach based on a judicious selection of parametric, functional and high-speed tests will be required. In addition, design-for-test methods need to be pursued, including on-chip test electronics as well as circuit techniques that minimize the sensitivity of circuit performance to allowable process variations. The electron- and laser-beam-based test technologies look very promising and may provide the much needed solutions not only to the high-speed test problem but also to the need for high levels of fault coverage during functional testing.
DFTB Parameters for the Periodic Table: Part 1, Electronic Structure.
Wahiduzzaman, Mohammad; Oliveira, Augusto F; Philipsen, Pier; Zhechkov, Lyuben; van Lenthe, Erik; Witek, Henryk A; Heine, Thomas
2013-09-10
A parametrization scheme for the electronic part of the density-functional based tight-binding (DFTB) method that covers the periodic table is presented. A semiautomatic parametrization scheme has been developed that uses Kohn-Sham energies and band structure curvatures of real and fictitious homoatomic crystal structures as reference data. A confinement potential is used to tighten the Kohn-Sham orbitals, which includes two free parameters that are used to optimize the performance of the method. The method is tested on more than 100 systems and shows excellent overall performance.
Kim, Da-Eun; Yang, Hyeri; Jang, Won-Hee; Jung, Kyoung-Mi; Park, Miyoung; Choi, Jin Kyu; Jung, Mi-Sook; Jeon, Eun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Park, Jung Eun; Sohn, Soo Jung; Kim, Tae Sung; Ahn, Il Young; Jeong, Tae-Cheon; Lim, Kyung-Min; Bae, SeungJin
2016-01-01
In order for a novel test method to be applied for regulatory purposes, its reliability and relevance, i.e., reproducibility and predictive capacity, must be demonstrated. Here, we examine the predictive capacity of a novel non-radioisotopic local lymph node assay, LLNA:BrdU-FCM (5-bromo-2'-deoxyuridine-flow cytometry), with a cutoff approach and inferential statistics as prediction models. Twenty-two reference substances in OECD TG429 were tested with a concurrent positive control, 25% hexylcinnamaldehyde (PC), and the stimulation index (SI), representing the fold increase in lymph node cells over the vehicle control, was obtained. The optimal cutoff SI (2.7 ≤ cutoff < 3.5), with respect to predictive capacity, was obtained from a receiver operating characteristic curve and produced 90.9% accuracy for the 22 substances. To address inter-test variability in responsiveness, SI values standardized against the PC were employed to obtain the optimal percentage cutoff (42.6 ≤ cutoff < 57.3% of PC), which produced 86.4% accuracy. A test substance may be diagnosed as a sensitizer if a statistically significant increase in SI is elicited; the parametric one-sided t-test and the non-parametric Wilcoxon rank-sum test produced 77.3% accuracy. Similarly, a test substance could be defined as a sensitizer if the SI means of the vehicle control and of the low, middle, and high concentrations were statistically significantly different, tested using ANOVA or Kruskal-Wallis with post hoc analysis (Dunnett or DSCF (Dwass-Steel-Critchlow-Fligner), respectively, depending on the equal variance test), producing 81.8% accuracy. The absolute SI-based cutoff approach produced the best predictive capacity; however, the discordant decisions between prediction models need to be examined further. Copyright © 2015 Elsevier Inc. All rights reserved.
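The cutoff prediction model reduces to a threshold rule on the stimulation index; a minimal sketch of how such a rule's accuracy is scored (with illustrative data, not the study's substances):

```python
def cutoff_accuracy(si_values, is_sensitizer, cutoff):
    """Fraction of substances classified correctly by the rule
    'call it a sensitizer when SI >= cutoff'."""
    correct = sum((si >= cutoff) == truth
                  for si, truth in zip(si_values, is_sensitizer))
    return correct / len(si_values)
```

Sweeping the cutoff over a grid and plotting sensitivity against (1 − specificity) yields the receiver operating characteristic curve from which the optimal cutoff interval is read off.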
Tips and Tricks for Successful Application of Statistical Methods to Biological Data.
Schlenker, Evelyn
2016-01-01
This chapter discusses experimental design and the use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize them, allowing use of parametric tests. Alternatively, with skewed data, nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance type 1 errors (false positives) against type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (randomized clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premises and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
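The distinction between relative risk and odds ratio mentioned above comes straight from the 2 × 2 table; a minimal sketch of the two calculations:

```python
def risk_and_odds_ratio(a, b, c, d):
    """2x2 table counts: the exposed group has a events and b non-events;
    the unexposed group has c events and d non-events.
    Relative risk compares event probabilities between the groups;
    the odds ratio compares the odds a/b and c/d (cross-product ad/bc)."""
    relative_risk = (a / (a + b)) / (c / (c + d))
    odds_ratio = (a * d) / (b * c)
    return relative_risk, odds_ratio
```

For rare events (a small relative to b, c small relative to d), the two measures nearly coincide, which is why the odds ratio from a case-control study is often read as an approximation to the relative risk.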
Hong, Quan Nha; Coutu, Marie-France; Berbiche, Djamal
2017-01-01
The Work Role Functioning Questionnaire (WRFQ) was developed to assess workers' perceived ability to perform job demands and is used to monitor presenteeism. Still, few studies on its validity can be found in the literature. The purpose of this study was to assess the items and factorial composition of the Canadian French version of the WRFQ (WRFQ-CF). Two measurement approaches were used to test the WRFQ-CF: Classical Test Theory (CTT) and non-parametric Item Response Theory (IRT). A total of 352 completed questionnaires were analyzed. Four-factor and three-factor models were tested and showed good fit with 14 items (Root Mean Square Error of Approximation (RMSEA) = 0.06, Standardized Root Mean Square Residual (SRMR) = 0.04, Bentler Comparative Fit Index (CFI) = 0.98) and 17 items (RMSEA = 0.059, SRMR = 0.048, CFI = 0.98), respectively. Using IRT, 13 problematic items were identified, of which 9 were shared with CTT. This study tested different models, with fewer problematic items found in a three-factor model. Using non-parametric IRT together with CTT for item purification gave complementary results. IRT is still scarcely used and can be an interesting alternative method for enhancing the quality of a measurement instrument. More studies of the WRFQ-CF are needed to refine its items and factorial composition.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borio, R.W.; Lewis, R.D.; Koucky, R.W.
1996-04-01
Electric utility power plants account for about one-third of the NO{sub x} and two-thirds of the SO{sub 2} emissions in the US. Cyclone-fired boilers, while representing about 9% of the US coal-fired generating capacity, emit about 14% of the NO{sub x} produced by coal-fired utility boilers. Given this background, the Environmental Protection Agency, the Gas Research Institute, the Electric Power Research Institute, the Pittsburgh Energy Technology Center, and the Ohio Coal Development Office sponsored a program led by ABB Combustion Engineering, Inc. (ABB-CE) to demonstrate reburning on a cyclone-fired boiler. Ohio Edison provided Unit No. 1 at their Niles Station for the reburn demonstration along with financial assistance. The Niles Unit No. 1 reburn system was started up in September 1990. This reburn program was the first full-scale reburn system demonstration in the US. This report describes work performed during the program. The work included a review of reburn technology, aerodynamic flow model testing of reburn system design concepts, design and construction of the reburn system, parametric performance testing, long-term load dispatch testing, and boiler tube wall thickness monitoring. The report also contains a description of the Niles No. 1 host unit, a discussion of conclusions and recommendations derived from the program, tabulations of data from parametric and long-term tests, and appendices containing additional tabulated test results.
Design, fabrication, and operation of a test rig for high-speed tapered-roller bearings
NASA Technical Reports Server (NTRS)
Signer, H. R.
1974-01-01
A tapered-roller bearing test machine was designed, fabricated and successfully operated at speeds to 20,000 rpm. Infinitely variable radial loads to 26,690 N (6,000 lbs.) and thrust loads to 53,380 N (12,000 lbs.) can be applied to test bearings. The machine instrumentation proved to have the accuracy and reliability required for parametric bearing performance testing and has the capability of monitoring all programmed test parameters at continuous operation during life testing. This system automatically shuts down a test if any important test parameter deviates from the programmed conditions, or if a bearing failure occurs. A lubrication system was developed as an integral part of the machine, capable of lubricating test bearings by external jets and by means of passages feeding through the spindle and bearing rings into the critical internal bearing surfaces. In addition, provisions were made for controlled oil cooling of inner and outer rings to effect the type of bearing thermal management that is required when testing at high speeds.
NASA Technical Reports Server (NTRS)
Wright, J. P.; Wilson, D. E.
1976-01-01
Many payloads currently proposed to be flown by the space shuttle system require long-duration cooling in the 3 to 200 K temperature range. Common requirements also exist for certain DOD payloads. Parametric design and optimization studies are reported for multistage and diode heat pipe radiator systems designed to operate in this temperature range. Also optimized are ground test systems for two long-life passive thermal control concepts operating under specified space environmental conditions. The ground test systems evaluated are ultimately intended to evolve into flight test qualification prototypes for early shuttle flights.
Composite panel development at JPL
NASA Technical Reports Server (NTRS)
Mcelroy, Paul; Helms, Rich
1988-01-01
Parametric computer studies can be used in a cost-effective manner to determine optimized composite mirror panel designs. An InterDisciplinary computer Model (IDM) was created to aid in the development of high-precision reflector panels for LDR. The material properties, thermal responses, structural geometries, and radio/optical precision are synergistically analyzed for specific panel designs. Promising panel designs are fabricated and tested so that comparison with panel test results can be used to verify performance-prediction models and accommodate design refinement. The iterative approach of computer design and model refinement with performance testing and materials optimization has shown good results for LDR panels.
Cao, Zheng; Nampalliwar, Sourabh; Bambi, Cosimo; Dauser, Thomas; García, Javier A
2018-02-02
Recently, we have extended the x-ray reflection model relxill to test the spacetime metric in the strong gravitational field of astrophysical black holes. In the present Letter, we employ this extended model to analyze XMM-Newton, NuSTAR, and Swift data of the supermassive black hole in 1H0707-495 and test deviations from a Kerr metric parametrized by the Johannsen deformation parameter α_{13}. Our results are consistent with the hypothesis that the spacetime metric around the black hole in 1H0707-495 is described by the Kerr solution.
Non-classical Signature of Parametric Fluorescence and its Application in Metrology
NASA Astrophysics Data System (ADS)
Hamar, M.; Michálek, V.; Pathak, A.
2014-08-01
The article provides a short theoretical background of what non-classical light means. We applied the criterion for the existence of non-classical effects derived by C. T. Lee to parametric fluorescence. The criterion was originally derived for the study of two light beams with one mode per beam. We checked through numerical simulations whether the criterion still works for two multimode beams of parametric down-conversion. The theoretical results were tested by measurement of the photon number statistics of twin beams emitted by a nonlinear BBO crystal pumped by an intense femtosecond UV pulse. We used an ICCD camera as the photon detector in both beams. It appears that the criterion can be used for the measurement of the quantum efficiencies of ICCD cameras.
NASA Technical Reports Server (NTRS)
Walstad, D. G.; Lockman, W. K.
1974-01-01
Data obtained from heat transfer tests of a 0.006-scale space shuttle vehicle in a 3.5-foot hypersonic wind tunnel are presented. The purpose of these tests was to parametrically investigate the ascent heating of the integrated vehicle. Configurations tested were the complete integrated vehicle, orbiter alone, external tank alone, and SRB alone. All configurations were tested with and without transition grit. Testing was conducted at a Mach number of 5.3 and at Reynolds numbers of 2 and 5 million per foot. The angle of attack varied from 0 to minus 5 degrees, except for the SRB alone, which was tested from minus 5 to 90 degrees. Heat transfer data were obtained from 223 iron-constantan thermocouples attached to thin-skin stainless steel inserts.
Preliminary Options Assessment of Versatile Irradiation Test Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat
The objective of this report is to summarize the work undertaken at INL from April 2016 to January 2017 aimed at analyzing options for designing and building a versatile test reactor; the scope of work was agreed upon with DOE-NE. Section 2 presents results related to KNK II and PRISM Mod A. Section 3 presents alternatives to the VCTR presented in [ ], as well as a neutronic parametric study to assess the minimum power requirement needed for a 235U metal-fueled fast test reactor capable of generating a fast (>100 keV) flux of 4.0 x 10^15 n/cm2-s at the test location. Section 4 presents results regarding a fundamental characteristic of test reactors, namely displacement per atom (dpa) in test samples. Section 5 presents the INL assessment of the ANL fast test reactor design FASTER. Section 6 presents a summary.
NASA Technical Reports Server (NTRS)
Olson, B. A.; Lee, H. C.; Osgerby, I. T.; Heck, R. M.; Hess, H.
1980-01-01
The durability of CATCOM catalysts and catalyst supports was experimentally demonstrated in a combustion environment under simulated gas turbine engine combustor operating conditions. A test of 1000 hours duration was completed with one catalyst using no. 2 diesel fuel and operating at catalytically-supported thermal combustion conditions. The performance of the catalyst was determined by monitoring emissions throughout the test and by examining the physical condition of the catalyst core at the conclusion of the test. Tests were performed periodically to determine changes in the catalytic activity of the catalyst core. Detailed parametric studies were also run at the beginning and end of the durability test, using no. 2 fuel oil. Initial and final emissions for the 1000-hour test, respectively, were: unburned hydrocarbons (C3 vppm): 0 and 146; carbon monoxide (vppm): 30 and 2420; nitrogen oxides (vppm): 5.7 and 5.6.
NASA Technical Reports Server (NTRS)
Moog, R. D.; Bacchus, D. L.; Utreja, L. R.
1979-01-01
The aerodynamic performance characteristics have been determined for the Space Shuttle Solid Rocket Booster drogue, main, and pilot parachutes. The performance evaluation on the 20-degree conical ribbon parachutes is based primarily on air drop tests of full scale prototype parachutes. In addition, parametric wind tunnel tests were performed and used in parachute configuration development and preliminary performance assessments. The wind tunnel test data are compared to the drop test results and both sets of data are used to determine the predicted performance of the Solid Rocket Booster flight parachutes. Data from other drop tests of large ribbon parachutes are also compared with the Solid Rocket Booster parachute performance characteristics. Parameters assessed include full open terminal drag coefficients, reefed drag area, opening characteristics, clustering effects, and forebody interference.
Harlander, Niklas; Rosenkranz, Tobias; Hohmann, Volker
2012-08-01
Single-channel noise reduction has been well investigated and seems to have reached its limits in terms of speech intelligibility improvement; however, the quality of such schemes can still be advanced. This study tests to what extent novel model-based processing schemes might improve performance, in particular for non-stationary noise conditions. Two prototype model-based algorithms, a speech-model-based and an auditory-model-based algorithm, were compared to a state-of-the-art non-parametric minimum statistics algorithm. A speech intelligibility test, preference rating, and listening effort scaling were performed. Additionally, three objective quality measures for the signal, background, and overall distortions were applied. For a better comparison of all algorithms, particular attention was given to the use of a similar Wiener-based gain rule. The perceptual investigation was performed with fourteen hearing-impaired subjects. The results revealed that the non-parametric algorithm and the auditory-model-based algorithm did not affect speech intelligibility, whereas the speech-model-based algorithm slightly decreased intelligibility. In terms of subjective quality, both model-based algorithms performed better than the unprocessed condition and the reference, in particular for highly non-stationary noise environments. The data support the hypothesis that model-based algorithms are promising for improving performance in non-stationary noise conditions.
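The Wiener-based gain rule mentioned above can be sketched per frequency bin as G = SNR / (1 + SNR). This is only a minimal illustration of that gain rule, not the model-based SNR estimators compared in the study:

```python
import numpy as np

def wiener_gain(noisy_power, noise_power, floor=0.05):
    """Per-frequency-bin Wiener gain G = SNR / (1 + SNR), with the
    a-priori SNR crudely estimated by spectral subtraction."""
    snr = np.maximum(noisy_power - noise_power, 0.0) / noise_power
    gain = snr / (1.0 + snr)
    return np.maximum(gain, floor)  # a gain floor limits musical noise

# Toy spectra: two bins dominated by speech, two by noise
noisy = np.array([10.0, 8.0, 1.2, 1.0])
noise = np.array([1.0, 1.0, 1.0, 1.0])
g = wiener_gain(noisy, noise)
# Speech-dominated bins pass nearly unattenuated;
# noise-dominated bins are suppressed toward the floor.
```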
Evaluation of Second-Level Inference in fMRI Analysis
Roels, Sanne P.; Loeys, Tom; Moerkerke, Beatrijs
2016-01-01
We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and (2) the data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of three phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take first-level (within-subject) variability into account and models that do not. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate three commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with a minimal cluster size. Based on a simulation study and real data, we find that the two-step procedure with a minimal cluster size yields the most stable results, followed by familywise error rate correction. FDR correction yields the most variable results, for both permutation-based and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference. PMID:26819578
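The familywise error rate and FDR corrections compared above can be illustrated on a toy set of p-values. This is a generic sketch of Bonferroni and Benjamini-Hochberg adjustment, not the voxelwise fMRI pipeline the study evaluates:

```python
import numpy as np

def bonferroni(pvals, alpha=0.05):
    """Familywise error rate control: reject only if p < alpha / m."""
    p = np.asarray(pvals)
    return p < alpha / p.size

def benjamini_hochberg(pvals, alpha=0.05):
    """FDR control: find the largest k with p_(k) <= (k/m)*alpha and
    reject the k smallest p-values."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject

pvals = [0.001, 0.008, 0.012, 0.039, 0.5]
print(bonferroni(pvals).sum())          # prints 2 (FWER is stricter)
print(benjamini_hochberg(pvals).sum())  # prints 4 (FDR rejects more)
```

The example makes the trade-off in the abstract concrete: FDR control rejects more hypotheses at the price of admitting a controlled fraction of false discoveries.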
OBIST methodology incorporating modified sensitivity of pulses for active analogue filter components
NASA Astrophysics Data System (ADS)
Khade, R. H.; Chaudhari, D. S.
2018-03-01
In this paper, an oscillation-based built-in self-test method is used to diagnose catastrophic and parametric faults in integrated circuits. Sallen-Key low-pass and high-pass filter circuits with different gains are used to investigate defects. Variations in seven parameters of the operational amplifier (OP-AMP), namely gain, input impedance, output impedance, slew rate, input bias current, input offset current, and input offset voltage, as well as catastrophic and parametric defects in components outside the OP-AMP, are introduced in the circuit and the simulation results are analysed. The oscillator output signal is converted to pulses which are used to generate a signature of the circuit. The signature and pulse count change with the type of fault present in the circuit under test (CUT). The change in oscillation frequency is observed for fault detection. The designer has flexibility to predefine the tolerance band of the cut-off frequency and the range of pulses for which the circuit should be accepted. The fault coverage depends upon the required tolerance band of the CUT. We propose a modification of the sensitivity of the parameter (pulses) to avoid test escapes and enhance yield. Results show that the method provides 100% fault coverage for catastrophic faults.
Xavier, Shannon; Best, Michael W; Schorr, Emily; Bowie, Christopher R
2015-01-01
Schizotypy is phenomenologically and genetically related to schizophrenia-spectrum illness. Previous studies find cognitive function to be mildly impaired, but the specific impairments and their relationship to functioning are not well understood. In this study, we sought to examine how cognitive load affects performance in schizotypy and whether impairments might manifest in functional capacity and quality of life. Undergraduate students were screened for abnormally high levels of schizotypy (N = 72) and compared to those without psychopathology (N = 80) on a standard battery of neuropsychological tests, cognitive tests with varying cognitive load, functional capacity measures, and quality of life. The high schizotypy group did not differ from controls on traditional measures of neuropsychological functioning, but a group-by-cognitive-load interaction was observed, where those with schizotypy showed a greater decline in performance as information processing load was parametrically increased. Differences in functioning were observed, and cognitive impairment was associated with impaired functioning. Cognitive and functional impairment can be observed in those with high schizotypal traits who are non-treatment-seeking. The sensitivity of cognitive tests to impairment in this population might be a function of their ability to parametrically increase cognitive load.
Comparison of System Identification Techniques for the Hydraulic Manipulator Test Bed (HMTB)
NASA Technical Reports Server (NTRS)
Morris, A. Terry
1996-01-01
In this thesis, linear, dynamic, multivariable state-space models for three joints of the ground-based Hydraulic Manipulator Test Bed (HMTB) are identified. HMTB, housed at the NASA Langley Research Center, is a ground-based version of the Dexterous Orbital Servicing System (DOSS), a representative space station manipulator. The dynamic models of the HMTB manipulator are first estimated by applying nonparametric identification methods to determine each joint's response characteristics using various input excitations. These excitations include sums of sinusoids, pseudorandom binary sequences (PRBS), bipolar ramping pulses, and chirp input signals. Next, two different parametric system identification techniques are applied to identify the best dynamical description of the joints. The manipulator is linearized about a representative space station orbital replacement unit (ORU) task, allowing the use of linear system identification methods. Comparisons, observations, and results of both parametric system identification techniques are discussed. The thesis concludes by proposing a model reference control system to aid in astronaut ground tests. This approach would allow the identified models to mimic the on-orbit dynamic characteristics of the actual flight manipulator, thus providing astronauts with realistic on-orbit responses when performing space station tasks in a ground-based environment.
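PRBS excitations of the kind listed above are conventionally generated with a linear-feedback shift register. The sketch below produces a standard maximal-length PRBS7 sequence; the thesis's actual sequence length and amplitude are not specified here, so the parameters are illustrative:

```python
import numpy as np

def prbs7(length=127, seed=0x02):
    """Bipolar PRBS from the maximal-length x^7 + x^6 + 1 LFSR
    (period 2**7 - 1 = 127), a common identification excitation."""
    lfsr = seed
    out = []
    for _ in range(length):
        newbit = ((lfsr >> 6) ^ (lfsr >> 5)) & 1   # feedback taps 7 and 6
        lfsr = ((lfsr << 1) | newbit) & 0x7F
        out.append(1.0 if newbit else -1.0)        # map bits to +/-1 drive
    return np.array(out)

u = prbs7()
# One full period of a 7-bit m-sequence contains 64 ones and 63 zeros,
# so the bipolar excitation is nearly zero-mean (sum = 1).
```

Such sequences are popular for identification because they are easy to generate, persistently exciting, and have a nearly flat spectrum up to the bit rate.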
Linkage mapping of beta 2 EEG waves via non-parametric regression.
Ghosh, Saurabh; Begleiter, Henri; Porjesz, Bernice; Chorlian, David B; Edenberg, Howard J; Foroud, Tatiana; Goate, Alison; Reich, Theodore
2003-04-01
Parametric linkage methods for analyzing quantitative trait loci are sensitive to violations in trait distributional assumptions. Non-parametric methods are relatively more robust. In this article, we modify the non-parametric regression procedure proposed by Ghosh and Majumder [2000: Am J Hum Genet 66:1046-1061] to map Beta 2 EEG waves using genome-wide data generated in the COGA project. Significant linkage findings are obtained on chromosomes 1, 4, 5, and 15 with findings at multiple regions on chromosomes 4 and 15. We analyze the data both with and without incorporating alcoholism as a covariate. We also test for epistatic interactions between regions of the genome exhibiting significant linkage with the EEG phenotypes and find evidence of epistatic interactions between a region each on chromosome 1 and chromosome 4 with one region on chromosome 15. While regressing out the effect of alcoholism does not affect the linkage findings, the epistatic interactions become statistically insignificant. Copyright 2003 Wiley-Liss, Inc.
Observational Signatures of Parametric Instability at 1AU
NASA Astrophysics Data System (ADS)
Bowen, T. A.; Bale, S. D.; Badman, S.
2017-12-01
Observations and simulations of inertial compressive turbulence in the solar wind are characterized by density structures anti-correlated with magnetic fluctuations parallel to the mean field. This signature has been interpreted as observational evidence for non-propagating pressure-balanced structures (PBS), kinetic ion acoustic waves, as well as the MHD slow mode. Recent work, specifically Verscharen et al. (2017), has highlighted the unexpectedly fluid-like nature of the solar wind. Given the high damping rates of parallel-propagating compressive fluctuations, their ubiquity in satellite observations is surprising and suggests the presence of a driving process. One possible candidate for the generation of compressive fluctuations in the solar wind is the parametric instability, in which large-amplitude Alfvenic fluctuations decay into parallel-propagating compressive waves. This work employs 10 years of WIND observations to test the parametric decay process as a source of compressive waves in the solar wind by comparing collisionless damping rates of compressive fluctuations with growth rates of the parametric instability. Preliminary results suggest that generation of compressive waves through parametric decay is overdamped at 1 AU. However, the higher parametric decay rates expected in the inner heliosphere likely allow for growth of the slow mode, the remnants of which could explain density fluctuations observed at 1 AU.
Nickel metal hydride LEO cycle testing
NASA Technical Reports Server (NTRS)
Lowery, Eric
1995-01-01
The George C. Marshall Space Flight Center is working to characterize aerospace AB5 Nickel Metal Hydride (NiMH) cells. The cells are being evaluated in terms of storage, low earth orbit (LEO) cycling, and response to parametric testing (high rate charge and discharge, charge retention, pulse current ability, etc.). Cells manufactured by Eagle Picher are the subjects of the evaluation. There is speculation that NiMH cells may become direct replacements for current Nickel Cadmium cells in the near future.
Development of an inflatable radiator system. [for space shuttles
NASA Technical Reports Server (NTRS)
Leach, J. W.
1976-01-01
Conceptual designs of an inflatable radiator system developed for supplying short duration supplementary cooling of space vehicles are described along with parametric trade studies, materials evaluation/selection studies, thermal and structural analyses, and numerous element tests. Fabrication techniques developed in constructing the engineering models and performance data from the model thermal vacuum tests are included. Application of these data to refining the designs of the flight articles and to constructing a full scale prototype radiator is discussed.
The theoretical tools of experimental gravitation
NASA Technical Reports Server (NTRS)
Will, C. M.
1972-01-01
Theoretical frameworks for testing relativistic gravity are presented in terms of a system for analyzing theories of gravity invented as alternatives to Einstein. The parametrized post-Newtonian (PPN) formalism, based on the Dicke framework and the Eotvos-Dicke-Braginsky experiment, is discussed in detail. The metric theories of gravity, and their post-Newtonian limits are reviewed, and PPN equations of motion are derived. These equations are used to analyze specific effects and experimental tests in the solar system.
Theoretical frameworks for testing relativistic gravity: A review
NASA Technical Reports Server (NTRS)
Thorne, K. S.; Will, C. M.; Ni, W.
1971-01-01
Metric theories of gravity are presented, including the definition of a metric theory, evidence for its existence, and the response of matter to gravity, covering test-body trajectories, gravitational redshift, and the response of stressed matter. The parametrized post-Newtonian framework and its interpretations are reviewed, along with measurements of gamma, beta, and various other parameters. Deflection of electromagnetic waves, radar time delay, geodetic gyroscope precession, perihelion shifts, and periodic orbital effects are among the studies carried out to test metric theories.
Constraining torsion with Gravity Probe B
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao Yi; Guth, Alan H.; Cabi, Serkan
2007-11-15
It is well-entrenched folklore that all torsion gravity theories predict observationally negligible torsion in the solar system, since torsion (if it exists) couples only to the intrinsic spin of elementary particles, not to rotational angular momentum. We argue that this assumption has a logical loophole which can and should be tested experimentally, and consider nonstandard torsion theories in which torsion can be generated by macroscopic rotating objects. In the spirit of action = reaction, if a rotating mass like a planet can generate torsion, then a gyroscope would be expected to feel torsion. An experiment with a gyroscope (without nuclear spin) such as Gravity Probe B (GPB) can test theories where this is the case. Using symmetry arguments, we show that to lowest order, any torsion field around a uniformly rotating spherical mass is determined by seven dimensionless parameters. These parameters effectively generalize the parametrized post-Newtonian formalism and provide a concrete framework for further testing Einstein's general theory of relativity (GR). We construct a parametrized Lagrangian that includes both standard torsion-free GR and Hayashi-Shirafuji maximal torsion gravity as special cases. We demonstrate that classic solar system tests rule out the latter and constrain two observable parameters. We show that Gravity Probe B is an ideal experiment for further constraining nonstandard torsion theories, and work out the most general torsion-induced precession of its gyroscope in terms of our torsion parameters.
Key statistical and analytical issues for evaluating treatment effects in periodontal research.
Tu, Yu-Kang; Gilthorpe, Mark S
2012-06-01
Statistics is an indispensable tool for evaluating treatment effects in clinical research. Due to the complexities of periodontal disease progression and data collection, statistical analysis in periodontal research has been a great challenge for both clinicians and statisticians. The aim of this article is to provide an overview of several basic, but important, statistical issues related to the evaluation of treatment effects and to clarify some common statistical misconceptions. Some of these issues are general, concerning many disciplines, and some are unique to periodontal research. We first discuss several statistical concepts that have sometimes been overlooked or misunderstood by periodontal researchers. For instance, decisions about whether to use the t-test or analysis of covariance, or whether to use parametric tests such as the t-test or its non-parametric counterpart, the Mann-Whitney U-test, have perplexed many periodontal researchers. We also describe more advanced methodological issues that have sometimes been overlooked by researchers. For instance, the phenomenon of regression to the mean is a fundamental issue to consider when evaluating treatment effects, and collinearity amongst covariates is a conundrum that must be resolved when explaining and predicting treatment effects. Quick and easy solutions to these methodological and analytical issues are not always available in the literature, and careful statistical thinking is paramount when conducting useful and meaningful research. © 2012 John Wiley & Sons A/S.
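The t-test versus Mann-Whitney choice discussed above can be sketched on simulated skewed data. The data here are invented for illustration (they are not from any periodontal study), and SciPy is assumed to be available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated skewed (log-normal) outcomes in two treatment groups
control = rng.lognormal(mean=0.0, sigma=0.6, size=30)
treated = rng.lognormal(mean=0.4, sigma=0.6, size=30)

t_res = stats.ttest_ind(treated, control)     # parametric: assumes normality
u_res = stats.mannwhitneyu(treated, control)  # non-parametric counterpart
print(f"t-test p = {t_res.pvalue:.4f}, Mann-Whitney p = {u_res.pvalue:.4f}")
```

With skewed data like this, the two tests can disagree; the rank-based test does not rely on the normality assumption, which is the trade-off the article discusses.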
Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.
Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R
2012-08-01
Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed, and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival and accelerated failure time (AFT) models with log-normal, log-logistic, and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R code is provided as supplemental materials. ctekwe@stat.tamu.edu.
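The left-censoring structure and the rank-based tests named above can be sketched on simulated intensities. This is illustrative only: the detection limit and distributions are invented, and the article's AFT models would additionally model the censoring in the likelihood (e.g., via R's survival package) rather than substituting at the limit:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
limit = 1.0   # detection limit: peaks below it are left-censored
g1 = rng.lognormal(mean=0.5, sigma=1.0, size=200)
g2 = rng.lognormal(mean=1.0, sigma=1.0, size=200)

# A crude convention sets censored intensities to the detection limit;
# survival/AFT methods would instead treat them as "failure before limit".
g1_obs = np.maximum(g1, limit)
g2_obs = np.maximum(g2, limit)

ks = stats.ks_2samp(g1_obs, g2_obs)      # Kolmogorov-Smirnov
mw = stats.mannwhitneyu(g1_obs, g2_obs)  # Wilcoxon-Mann-Whitney rank sum
print(ks.pvalue, mw.pvalue)
```

The point of the substitution step is to show where information is lost: everything below the limit collapses to a single tied value, which is exactly what likelihood-based censored models avoid.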
STOVL Hot Gas Ingestion control technology
NASA Technical Reports Server (NTRS)
Amuedo, K. C.; Williams, B. R.; Flood, J. D.; Johns, A. L.
1991-01-01
A comprehensive wind tunnel test program was conducted to evaluate control of Hot Gas Ingestion (HGI) on a 9.2 percent scale model of the McDonnell Aircraft Company model 279-3C advanced Short Takeoff and Vertical Landing (STOVL) configuration. The test was conducted in the NASA-Lewis Research Center 9 ft by 15 ft Low Speed Wind Tunnel during the summer of 1987. Initial tests defined baseline HGI levels as determined by engine face temperature rise and temperature distortion. Subsequent testing was conducted to evaluate HGI control parametrically using Lift Improvement Devices (LIDs), forward nozzle splay angle, a combination of LIDs and forward nozzle splay angle, and main inlet blocking. The results from this test program demonstrate that HGI can be effectively controlled and that HGI is not a barrier to STOVL aircraft development.
A method for developing design diagrams for ceramic and glass materials using fatigue data
NASA Technical Reports Server (NTRS)
Heslin, T. M.; Magida, M. B.; Forrest, K. A.
1986-01-01
The service lifetime of glass and ceramic materials can be expressed as a plot of time-to-failure versus applied stress that is parametric in percent probability of failure. This type of plot is called a design diagram. Confidence interval estimates for such plots depend on the type of test used to generate the data, on assumptions made concerning the statistical distribution of the test results, and on the type of analysis used. This report outlines the development of design diagrams for glass and ceramic materials in engineering terms using static or dynamic fatigue tests, assuming either no particular statistical distribution of test results or a Weibull distribution, and using either median value or homologous ratio analysis of the test results.
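One point on such a design diagram can be sketched by fitting a Weibull distribution to strength data and mapping a strength percentile through an assumed slow-crack-growth (static fatigue) law t_f = A * sigma^-N. All numbers below are invented for illustration; the report's median-value and homologous-ratio analyses are not reproduced:

```python
import numpy as np
from scipy import stats

# Illustrative fracture strengths (MPa) from fatigue testing
strengths = np.array([62, 68, 71, 74, 76, 79, 81, 84, 88, 95], dtype=float)

# Two-parameter Weibull fit (location fixed at zero, as is conventional)
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0.0)

# Strength with 1% probability of failure: one percentile curve's anchor
sigma_1pct = stats.weibull_min.ppf(0.01, shape, loc=0.0, scale=scale)

# An assumed slow-crack-growth law t_f = A * sigma**(-N) maps percentiles
# to time-to-failure curves; N and A here are purely illustrative
N, A = 20.0, 1e12
applied = np.array([20.0, 30.0, 40.0])   # service stresses, MPa
time_to_failure = A * applied**(-N)      # one curve of the design diagram
```

Repeating the last step for several failure probabilities produces the family of curves that constitutes the design diagram.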
Petti, S; Messano, G A
2016-05-01
Traditional cleaning and disinfection methods are inefficient for complete decontamination of hospital surfaces from meticillin-resistant Staphylococcus aureus (MRSA). Additional methods, such as nano-TiO2-based photocatalytic disinfection (PCD), could be helpful. To evaluate anti-MRSA activity of PCD on polyvinyl chloride (PVC) surfaces in natural-like conditions. Two identical PVC surfaces were used, and nano-TiO2 was incorporated into one of them. The surfaces were contaminated with MRSA isolated from hospitalized patients using a mist sprayer to simulate the mode of environmental contamination caused by a carrier. MRSA cell density was assessed from before contamination until 180 min after contamination using Rodac plates. The differences between test and control surfaces in terms of MRSA density and log MRSA density reduction were assessed using parametric and non-parametric statistical tests. Five strains were tested, and each strain was tested five times. The highest median MRSA densities [46.3 and 43.1 colony-forming units (cfu)/cm(2) for control and test surfaces, respectively] were detected 45 min after contamination. Median MRSA densities 180 min after contamination were 10.1 and 0.7 cfu/cm(2) for control and test surfaces, respectively (P<0.01). The log MRSA density reduction attributable to PCD was 1.16 log cfu/cm(2), corresponding to a 93% reduction of the baseline MRSA contamination. The disinfectant activity remained stable throughout the 25 testing occasions, despite between-test cleaning and disinfection. The anti-MRSA activity of PCD was compatible with the benchmark for surface hygiene in hospitals (<1 cfu/cm(2)), but required 3 h of exposure to photocatalysis. Thus, PCD could be considered for non-clinical surfaces. However, for clinical surfaces, PCD should be regarded as supplemental to conventional decontamination procedures, rather than an alternative. Copyright © 2016 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
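The reported 1.16 log reduction and its 93% equivalent follow directly from the median densities quoted in the abstract:

```python
import math

control_cfu = 10.1  # cfu/cm^2 on control surface at 180 min
test_cfu = 0.7      # cfu/cm^2 on nano-TiO2 surface at 180 min

# log reduction = log10(control / test); percent = 1 - 10^(-log reduction)
log_reduction = math.log10(control_cfu / test_cfu)
percent_reduction = (1 - 10**(-log_reduction)) * 100
print(f"{log_reduction:.2f} log, {percent_reduction:.0f}% reduction")
# prints: 1.16 log, 93% reduction
```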
Li, Zhigang; Liu, Weiguo; Zhang, Jinhuan; Hu, Jingwen
2015-09-01
Skull fracture is one of the most common pediatric traumas. However, injury assessment tools for predicting pediatric skull fracture risk are not well established, mainly due to the lack of cadaver tests. Weber conducted 50 pediatric cadaver drop tests for forensic research on child abuse in the mid-1980s (Experimental studies of skull fractures in infants, Z Rechtsmed. 92: 87-94, 1984; Biomechanical fragility of the infant skull, Z Rechtsmed. 94: 93-101, 1985). To our knowledge, these studies contain the largest sample size among pediatric cadaver tests in the literature. However, the lack of injury measurements limited their direct application to investigating pediatric skull fracture risks. In this study, the 50 pediatric cadaver tests from Weber's studies were reconstructed using a parametric pediatric head finite element (FE) model morphed into subjects with the ages, head sizes/shapes, and skull thickness values reported in the tests. Skull fracture risk curves for infants from 0 to 9 months old were developed from the model-predicted head injury measures through logistic regression analysis. It was found that the model-predicted stress responses in the skull (maximal von Mises stress, maximal shear stress, and maximal first principal stress) were better predictors of pediatric skull fracture than global kinematics-based injury measures (peak head acceleration and the head injury criterion (HIC)). This study demonstrated the feasibility of using age- and size/shape-appropriate head FE models to predict pediatric head injuries. Such models can account for the morphological variations among subjects, which cannot be captured by a single FE human model.
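A risk curve of the kind described, logistic regression of a binary fracture outcome on a model-predicted stress metric, can be sketched on simulated data. The stress values, sample size, and coefficients below are invented; the study's actual FE-derived measures and fitted curves are not reproduced:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
stress = rng.uniform(5, 60, 50)                     # e.g., peak von Mises, MPa
p_true = 1 / (1 + np.exp(-(stress - 30) / 5))       # hidden "true" risk curve
fracture = rng.random(50) < p_true                  # simulated outcomes

def neg_log_lik(beta):
    """Logistic model: P(fracture) = 1 / (1 + exp(-(b0 + b1*stress)))."""
    z = beta[0] + beta[1] * stress
    p = 1 / (1 + np.exp(-z))
    eps = 1e-12
    return -np.sum(fracture * np.log(p + eps) + (~fracture) * np.log(1 - p + eps))

fit = minimize(neg_log_lik, x0=[0.0, 0.0])
b0, b1 = fit.x
stress_50 = -b0 / b1   # stress at 50% fracture risk, one point on the curve
```

Evaluating the fitted logistic function over a grid of stresses yields the injury risk curve; the study performs this fit on the reconstructed cadaver tests rather than simulated data.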
DOE Office of Scientific and Technical Information (OSTI.GOV)
Royer, Michael P.; McCullough, Jeffrey J.; Tucker, Joseph C.
The lumen depreciation and color shift of 17 different A lamps (15 LED, 1 CFL, 1 halogen) were monitored in the automated long-term test apparatus (ALTA) for more than 7,500 hours. Ten samples of each lamp model were tested, with measurements recorded on a weekly basis. The lamps were operated continuously at an ambient temperature of 45°C (±1°C). Importantly, the steady-state test conditions were not optimized for inducing catastrophic failure for any of the lamp technologies (to which thermal cycling is a strong contributor) and are not typical of normal use patterns, which usually include off periods where the lamp cools down. Further, the test conditions differ from those used in standardized long-term test methods (i.e., IES LM-80, IES LM-84), so the results should not be directly compared. On the other hand, the test conditions are similar to those used by ENERGY STAR (when elevated-temperature testing is called for). Likewise, the conditions and assumptions used by manufacturers to generate lifetime claims may vary; the CALiPER long-term data are informative, but cannot necessarily be used to discredit manufacturer claims. The test method used for this investigation should be interpreted as one focused on the long-term effects of elevated-temperature operation, at an ambient temperature that is not uncommon in luminaires. On average, the lumen maintenance of the LED lamps monitored in the ALTA was better than that of the benchmark lamps, but there was considerable variation from lamp model to lamp model. While three lamp models had average lumen maintenance above 99% at the end of the study period, two products had average lumen maintenance below 65%, constituting a parametric failure. These two products, along with a third, also exhibited substantial color shift, another form of parametric failure.
While none of the LED lamps exhibited catastrophic failure—and all of the benchmarks did—the early degradation of performance is concerning, especially for a new technology trying to build a reputation with consumers. Beyond the observed parametric failures, nearly half of the products failed to meet early-life thresholds for lumen maintenance, which were borrowed from ENERGY STAR specifications. That is, the lumen maintenance was sufficiently low at 6,000 hours that seven of the products are unlikely to have lumen maintenance above 70% at their rated lifetime (which was usually 25,000 hours). Given the methods used for this investigation—most notably continuous operation—the results should not be interpreted as indicative of a lamp’s performance in a typical environment. Likewise, these results are not directly relatable to manufacturer lifetime claims. This report is best used to understand the variation in LED product performance, compare the robustness of LED lamps and benchmark conventional lamps, and understand the characteristics of lumen and chromaticity change. A key takeaway is that the long-term performance of LED lamps can vary greatly from model to model (i.e., the technology is not homogeneous), although the lamp-to-lamp consistency within a given model is relatively good. Further, operation of LED lamps in an enclosed luminaire (or otherwise in high ambient temperatures) can induce parametric failure much earlier than the rated lifetime; manufacturer warnings about such conditions should be followed if performance degradation is unacceptable.
Cost-Aware Design of a Discrimination Strategy for Unexploded Ordnance Cleanup
2011-02-25
Acronyms: ANN (Artificial Neural Network), AUC (Area Under the Curve), BRAC (Base Realignment And Closure), DLRT (Distance Likelihood Ratio Test), EER... [fragment of a classifier table listing ANN among discriminative, aggregate nonparametric and parametric methods, followed by a Results and Discussion section for Task #1]
Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan
2016-01-01
The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
NASA Astrophysics Data System (ADS)
Anees, Asim; Aryal, Jagannath; O'Reilly, Małgorzata M.; Gale, Timothy J.; Wardlaw, Tim
2016-12-01
A robust non-parametric framework, based on multiple Radial Basis Function (RBF) kernels, is proposed in this study for detecting land/forest cover changes using Landsat 7 ETM+ images. One widely used approach is to compute change vectors (a difference image) and use a supervised classifier to differentiate between change and no-change. Bayesian classifiers, e.g., the Maximum Likelihood Classifier (MLC) and Naive Bayes (NB), are widely used probabilistic classifiers that assume parametric models, e.g., a Gaussian function, for the class conditional distributions. However, their performance can be limited if the data set deviates from the assumed model. The proposed framework exploits the useful properties of the Least Squares Probabilistic Classifier (LSPC) formulation, i.e., its non-parametric and probabilistic nature, to model class posterior probabilities of the difference image using a linear combination of a large number of Gaussian kernels. To this end, a simple technique based on 10-fold cross-validation is also proposed for tuning model parameters automatically instead of selecting a (possibly) suboptimal combination from pre-specified lists of values. The proposed framework has been tested and compared with Support Vector Machine (SVM) and NB for detection of defoliation, caused by leaf beetles (Paropsisterna spp.) in Eucalyptus nitens and Eucalyptus globulus plantations of two test areas in Tasmania, Australia, using raw bands and band combination indices of Landsat 7 ETM+. It was observed that, owing to its multi-kernel non-parametric formulation and probabilistic nature, the LSPC outperforms the parametric NB with Gaussian assumption in the change detection framework, with Overall Accuracy (OA) ranging from 93.6% (κ = 0.87) to 97.4% (κ = 0.94) against 85.3% (κ = 0.69) to 93.4% (κ = 0.85), and is more robust to changing data distributions.
Its performance was comparable to SVM, with added advantages of being probabilistic and capable of handling multi-class problems naturally with its original formulation.
NASA Astrophysics Data System (ADS)
Diakogiannis, Foivos I.; Lewis, Geraint F.; Ibata, Rodrigo A.; Guglielmo, Magda; Kafle, Prajwal R.; Wilkinson, Mark I.; Power, Chris
2017-09-01
Dwarf galaxies, among the most dark matter dominated structures of our Universe, are excellent test-beds for dark matter theories. Unfortunately, mass modelling of these systems suffers from the well-documented mass-velocity anisotropy degeneracy. For the case of spherically symmetric systems, we describe a method for non-parametric modelling of the radial and tangential velocity moments. The method is a numerical velocity anisotropy 'inversion', with parametric mass models, where the radial velocity dispersion profile, σ_rr^2, is modelled as a B-spline, and the optimization is a three-step process that consists of (I) an evolutionary modelling to determine the mass model form and the best B-spline basis to represent σ_rr^2; (II) an optimization of the smoothing parameters; and (III) a Markov chain Monte Carlo analysis to determine the physical parameters. The mass-anisotropy degeneracy is reduced to mass model inference, irrespective of kinematics. We test our method using synthetic data. Our algorithm constructs the best kinematic profile and discriminates between competing dark matter models. We apply our method to the Fornax dwarf spheroidal galaxy. Using a King brightness profile and testing various dark matter mass models, our model inference favours a simple mass-follows-light system. We find that the anisotropy profile of Fornax is tangential (β(r) < 0) and we estimate a total mass of M_{tot} = 1.613^{+0.050}_{-0.075} × 10^8 M_{⊙}, and a mass-to-light ratio of Υ_V = 8.93^{+0.32}_{-0.47} (M_{⊙}/L_{⊙}). The algorithm we present is a robust and computationally inexpensive method for non-parametric modelling of spherical clusters independent of the mass-anisotropy degeneracy.
Zhao, Ni; Chen, Jun; Carroll, Ian M.; Ringel-Kulka, Tamar; Epstein, Michael P.; Zhou, Hua; Zhou, Jin J.; Ringel, Yehuda; Li, Hongzhe; Wu, Michael C.
2015-01-01
High-throughput sequencing technology has enabled population-based studies of the role of the human microbiome in disease etiology and exposure response. Distance-based analysis is a popular strategy for evaluating the overall association between microbiome diversity and outcome, wherein the phylogenetic distance between individuals’ microbiome profiles is computed and tested for association via permutation. Despite their practical popularity, distance-based approaches suffer from important challenges, especially in selecting the best distance and extending the methods to alternative outcomes, such as survival outcomes. We propose the microbiome regression-based kernel association test (MiRKAT), which directly regresses the outcome on the microbiome profiles via the semi-parametric kernel machine regression framework. MiRKAT allows for easy covariate adjustment and extension to alternative outcomes while non-parametrically modeling the microbiome through a kernel that incorporates phylogenetic distance. It uses a variance-component score statistic to test for the association with analytical p value calculation. The model also allows simultaneous examination of multiple distances, alleviating the problem of choosing the best distance. Our simulations demonstrated that MiRKAT provides correctly controlled type I error and adequate power in detecting overall association. “Optimal” MiRKAT, which considers multiple candidate distances, is robust in that it suffers from little power loss in comparison to when the best distance is used and can achieve tremendous power gain in comparison to when a poor distance is chosen. Finally, we applied MiRKAT to real microbiome datasets to show that microbial communities are associated with smoking and with fecal protease levels after confounders are controlled for. PMID:25957468
Simulation-based hypothesis testing of high dimensional means under covariance heterogeneity.
Chang, Jinyuan; Zheng, Chao; Zhou, Wen-Xin; Zhou, Wen
2017-12-01
In this article, we study the problem of testing the mean vectors of high dimensional data in both one-sample and two-sample cases. The proposed testing procedures employ maximum-type statistics and parametric bootstrap techniques to compute the critical values. Different from existing tests that rely heavily on structural conditions on the unknown covariance matrices, the proposed tests allow general covariance structures of the data and therefore enjoy a wide scope of applicability in practice. To enhance the power of the tests against sparse alternatives, we further propose two-step procedures with a preliminary feature screening step. Theoretical properties of the proposed tests are investigated. Through extensive numerical experiments on synthetic data sets and a human acute lymphoblastic leukemia gene expression data set, we illustrate the performance of the new tests and how they may assist in detecting disease-associated gene-sets. The proposed methods have been implemented in an R package, HDtest, and are available on CRAN. © 2017, The International Biometric Society.
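A minimal one-sample sketch of the idea, assuming the maximum-type statistic is the largest absolute marginal t-statistic and the critical value comes from a Gaussian parametric bootstrap with the sample covariance (the paper's actual procedure and tuning may differ):

```python
import numpy as np

def max_type_test(X, B=1000, seed=1):
    """One-sample test of H0: mean vector = 0 via the max-type statistic
    max_j |sqrt(n) * xbar_j / s_j|. Critical values come from a Gaussian
    parametric bootstrap that reuses the sample covariance, so no
    structural assumptions on the covariance are imposed."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    T = np.max(np.abs(np.sqrt(n) * X.mean(0) / X.std(0, ddof=1)))
    S = np.cov(X, rowvar=False)
    # B bootstrap samples of size n from N(0, S), shape (B, n, p)
    boot = rng.multivariate_normal(np.zeros(p), S, size=(B, n))
    Tb = np.max(np.abs(np.sqrt(n) * boot.mean(1) / boot.std(1, ddof=1)), axis=1)
    return T, (Tb >= T).mean()

rng = np.random.default_rng(0)
X_null = rng.standard_normal((50, 10))
X_alt = X_null.copy()
X_alt[:, 0] += 1.0  # sparse alternative: shift one coordinate only
_, p_null = max_type_test(X_null)
_, p_alt = max_type_test(X_alt)
```

The max statistic is well suited to sparse alternatives like the shifted coordinate above, which motivates the paper's screening-then-testing two-step refinement.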
A SAS(®) macro implementation of a multiple comparison post hoc test for a Kruskal-Wallis analysis.
Elliott, Alan C; Hynan, Linda S
2011-04-01
The Kruskal-Wallis (KW) nonparametric analysis of variance is often used instead of a standard one-way ANOVA when data are from a suspected non-normal population. The KW omnibus procedure tests for some difference between groups, but provides no specific post hoc pairwise comparisons. This paper provides a SAS(®) macro implementation of a multiple comparison test based on significant Kruskal-Wallis results from the SAS NPAR1WAY procedure. The implementation is designed for up to 20 groups at a user-specified alpha significance level. A Monte Carlo simulation compared this nonparametric procedure to commonly used parametric multiple comparison tests. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
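One common choice for such pairwise follow-up comparisons is Dunn's test on mean ranks. The sketch below (Python rather than SAS, Bonferroni-adjusted, no tie correction) illustrates the idea; it is not necessarily the variant implemented in the macro.

```python
import numpy as np
from scipy import stats

def dunn_pairwise(groups):
    """Dunn's post hoc z-tests on mean ranks after a significant
    Kruskal-Wallis result (Bonferroni-adjusted, no tie correction)."""
    data = np.concatenate(groups)
    ranks = stats.rankdata(data)              # ranks over the pooled sample
    N = len(data)
    idx = np.cumsum([0] + [len(g) for g in groups])
    mean_ranks = [ranks[idx[i]:idx[i + 1]].mean() for i in range(len(groups))]
    k = len(groups)
    m = k * (k - 1) // 2                      # number of pairwise comparisons
    out = {}
    for i in range(k):
        for j in range(i + 1, k):
            se = np.sqrt(N * (N + 1) / 12 * (1 / len(groups[i]) + 1 / len(groups[j])))
            z = (mean_ranks[i] - mean_ranks[j]) / se
            out[(i, j)] = min(1.0, 2 * stats.norm.sf(abs(z)) * m)
    return out

g1 = [2.9, 3.0, 2.5, 2.6, 3.2]  # hypothetical group data
g2 = [3.8, 2.7, 4.0, 2.4]
g3 = [2.8, 3.4, 3.7, 2.2, 2.0]
H, p_kw = stats.kruskal(g1, g2, g3)           # omnibus test first
pairwise_p = dunn_pairwise([g1, g2, g3])      # then pairwise follow-up
```

As in the macro's intended workflow, the pairwise tests are only interpreted when the omnibus KW test is significant.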
Upper surface blowing noise of the NASA-Ames quiet short-haul research aircraft
NASA Technical Reports Server (NTRS)
Bohn, A. J.; Shovlin, M. D.
1980-01-01
An experimental study of the propulsive-lift noise of the NASA-Ames quiet short-haul research aircraft (QSRA) is described. Comparisons are made of measured QSRA flyover noise and model propulsive-lift noise data available in references. Developmental tests of trailing-edge treatments were conducted using sawtooth-shaped and porous USB flap trailing-edge extensions. Small scale parametric tests were conducted to determine noise reduction/design relationships. Full-scale static tests were conducted with the QSRA preparatory to the selection of edge treatment designs for flight testing. QSRA flight and published model propulsive-lift noise data have similar characteristics. Noise reductions of 2 to 3 dB were achieved over a wide range of frequency and directivity angles in static tests of the QSRA. These noise reductions are expected to be achieved or surpassed in flight tests planned by NASA in 1980.
Applications of the post-Tolman-Oppenheimer-Volkoff formalism
NASA Astrophysics Data System (ADS)
Silva, Hector O.; Glampedakis, Kostas; Pappas, George; Berti, Emanuele
2017-01-01
Besides their astrophysical interest, neutron stars are promising candidates for testing theories of gravity in the strong-field regime. It is known that, generically, modifications to general relativity affect the bulk properties of neutron stars, e.g. their masses and radii, in a way that depends on the specific choice of theory. In this presentation we review a theory-agnostic approach to model relativistic stars, called the post-Tolman-Oppenheimer-Volkoff formalism. Drawing inspiration from the parametrized post-Newtonian formalism, this framework allows us to describe perturbative deviations from general relativity in the structure of neutron stars in a parametrized manner. We show that a variety of astrophysical observables (namely the surface redshift, the apparent radius, the Eddington luminosity and the orbital frequency of particles in geodesic motion around neutron stars) can be parametrized using only two parameters.
Simple and flexible SAS and SPSS programs for analyzing lag-sequential categorical data.
O'Connor, B P
1999-11-01
This paper describes simple and flexible programs for analyzing lag-sequential categorical data, using SAS and SPSS. The programs read a stream of codes and produce a variety of lag-sequential statistics, including transitional frequencies, expected transitional frequencies, transitional probabilities, adjusted residuals, z values, Yule's Q values, likelihood ratio tests of stationarity across time and homogeneity across groups or segments, transformed kappas for unidirectional dependence, bidirectional dependence, parallel and nonparallel dominance, and significance levels based on both parametric and randomization tests.
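Two of the listed statistics, lag-1 transitional frequencies and transitional probabilities, can be sketched in a few lines (Python rather than SAS/SPSS; function and variable names are illustrative):

```python
from collections import Counter

def lag1_stats(codes):
    """Lag-1 transitional frequencies and row-wise transitional
    probabilities for a stream of categorical codes."""
    pairs = Counter(zip(codes, codes[1:]))        # count adjacent code pairs
    states = sorted(set(codes))
    freq = {(a, b): pairs.get((a, b), 0) for a in states for b in states}
    row_tot = {a: sum(freq[(a, b)] for b in states) for a in states}
    prob = {(a, b): freq[(a, b)] / row_tot[a] if row_tot[a] else 0.0
            for a in states for b in states}
    return freq, prob

codes = list("ABABBACABB")   # hypothetical behavior-code stream
freq, prob = lag1_stats(codes)
```

The remaining statistics in the paper (expected frequencies, adjusted residuals, Yule's Q, kappas) are all derived from this same transition table.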
1984-09-28
variables before simulation of model - Search for reality checks - Express uncertainty as a probability density distribution. ... probability that the software contains errors. This prior is updated as test failure data are accumulated. Only a p of 1 (software known to contain...discussed; both parametric and nonparametric versions are presented. It is shown by the author that the bootstrap underlies the jackknife method and
Predicting multi-wall structural response to hypervelocity impact using the hull code
NASA Technical Reports Server (NTRS)
Schonberg, William P.
1993-01-01
Previously, multi-wall structures have been analyzed extensively, primarily through experiment, as a means of increasing the meteoroid/space debris impact protection of spacecraft. As structural configurations become more varied, the number of tests required to characterize their response increases dramatically. As an alternative to experimental testing, numerical modeling of high-speed impact phenomena is often used to predict the response of a variety of structural systems under different impact loading conditions. The results of comparing experimental tests to Hull Hydrodynamic Computer Code predictions are reported. Also, the results of a numerical parametric study of multi-wall structural response to hypervelocity cylindrical projectile impact are presented.
Advanced air revitalization system testing
NASA Technical Reports Server (NTRS)
Heppner, D. B.; Hallick, T. M.; Schubert, F. H.
1983-01-01
A previously developed experimental air revitalization system was tested cyclically and parametrically. Discussed are: one-button startup without manual intervention; extension of system testing by 1350 hours; the capability to vary process air carbon dioxide partial pressure, humidity, and coolant source for simulation of realistic space vehicle interfaces; the dynamic performance response of the interacting electrochemical depolarized carbon dioxide concentrator, Sabatier carbon dioxide reduction subsystem, and static feed water electrolysis oxygen generation subsystem; the carbon dioxide concentrator module with unitized core technology for the liquid cooled cell; and a preliminary design for a regenerative air revitalization system for the space station.
Total recognition discriminability in Huntington's and Alzheimer's disease.
Graves, Lisa V; Holden, Heather M; Delano-Wood, Lisa; Bondi, Mark W; Woods, Steven Paul; Corey-Bloom, Jody; Salmon, David P; Delis, Dean C; Gilbert, Paul E
2017-03-01
Both the original and second editions of the California Verbal Learning Test (CVLT) provide an index of total recognition discriminability (TRD) but respectively utilize nonparametric and parametric formulas to compute the index. However, the degree to which population differences in TRD may vary across applications of these nonparametric and parametric formulas has not been explored. We evaluated individuals with Huntington's disease (HD), individuals with Alzheimer's disease (AD), healthy middle-aged adults, and healthy older adults who were administered the CVLT-II. Yes/no recognition memory indices were generated, including raw nonparametric TRD scores (as used in CVLT-I) and raw and standardized parametric TRD scores (as used in CVLT-II), as well as false positive (FP) rates. Overall, the patient groups had significantly lower TRD scores than their comparison groups. The application of nonparametric and parametric formulas resulted in comparable effect sizes for all group comparisons on raw TRD scores. Relative to the HD group, the AD group showed comparable standardized parametric TRD scores (despite lower raw nonparametric and parametric TRD scores), whereas the previous CVLT literature has shown that standardized TRD scores are lower in AD than in HD. Possible explanations for the similarity in standardized parametric TRD scores in the HD and AD groups in the present study are discussed, with an emphasis on the importance of evaluating TRD scores in the context of other indices such as FP rates in an effort to fully capture recognition memory function using the CVLT-II.
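For context, the parametric discriminability index is conventionally computed from signal-detection theory as d' = z(hit rate) − z(false-positive rate). The sketch below shows that standard formula with a common correction for extreme rates; it is an illustration of the parametric approach, not necessarily the exact CVLT-II scoring formula.

```python
from statistics import NormalDist

def d_prime(hits, misses, fas, crs):
    """Parametric discriminability: d' = z(hit rate) - z(FA rate),
    with a standard 1/(2N) correction for rates of 0 or 1."""
    n_old, n_new = hits + misses, fas + crs
    h = min(max(hits / n_old, 1 / (2 * n_old)), 1 - 1 / (2 * n_old))
    f = min(max(fas / n_new, 1 / (2 * n_new)), 1 - 1 / (2 * n_new))
    z = NormalDist().inv_cdf
    return z(h) - z(f)

# hypothetical yes/no recognition counts (16 targets, 16 lures)
dp = d_prime(hits=14, misses=2, fas=3, crs=13)
```

Because d' combines hits and false positives into one number, the abstract's recommendation to inspect FP rates alongside TRD scores applies here as well: two response patterns can yield similar d' for different reasons.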
Type I Error Rates and Power Estimates of Selected Parametric and Nonparametric Tests of Scale.
ERIC Educational Resources Information Center
Olejnik, Stephen F.; Algina, James
1987-01-01
Estimated Type I Error rates and power are reported for the Brown-Forsythe, O'Brien, Klotz, and Siegal-Tukey procedures. The effect of aligning the data using deviations from group means or group medians is investigated. (RB)
The Distribution of the Sum of Signed Ranks
ERIC Educational Resources Information Center
Albright, Brian
2012-01-01
We describe the calculation of the distribution of the sum of signed ranks and develop an exact recursive algorithm for the distribution as well as an approximation of the distribution using the normal. The results have applications to the non-parametric Wilcoxon signed-rank test.
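The exact recursion rests on counting subsets of the ranks {1, ..., n} that sum to a given value w: c(n, w) = c(n−1, w) + c(n−1, w−n), i.e., rank n is either excluded or included. A minimal sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def count(n, w):
    """Number of subsets of {1, ..., n} whose elements sum to w:
    the frequency function of the Wilcoxon signed-rank statistic W+."""
    if w < 0:
        return 0
    if n == 0:
        return 1 if w == 0 else 0
    return count(n - 1, w) + count(n - 1, w - n)

def pmf(n, w):
    """Exact null probability P(W+ = w); each of 2**n sign patterns
    is equally likely under the null hypothesis."""
    return count(n, w) / 2 ** n

# sanity checks for n = 5 (W+ ranges over 0..15)
total = sum(pmf(5, w) for w in range(0, 16))
p_le_2 = sum(pmf(5, w) for w in range(0, 3))   # P(W+ <= 2)
```

For n = 5, P(W+ ≤ 2) = 3/32, matching standard signed-rank tables; the normal approximation mentioned in the abstract becomes accurate as n grows.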
NASA Technical Reports Server (NTRS)
Splettstoesser, W. R.; Schultz, K. J.; Boxwell, D. A.; Schmitz, F. H.
1984-01-01
Acoustic data taken in the anechoic Deutsch-Niederlaendischer Windkanal (DNW) have documented the blade vortex interaction (BVI) impulsive noise radiated from a 1/7-scale model main rotor of the AH-1 series helicopter. Averaged model scale data were compared with averaged full scale, in-flight acoustic data under similar nondimensional test conditions. At low advance ratios (mu = 0.164 to 0.194), the data scale remarkably well in level and waveform shape, and also duplicate the directivity pattern of BVI impulsive noise. At moderate advance ratios (mu = 0.224 to 0.270), the scaling deteriorates, suggesting that the model scale rotor is not adequately simulating the full scale BVI noise; presently, no proven explanation of this discrepancy exists. Carefully performed parametric variations over a complete matrix of testing conditions have shown that BVI noise radiation is highly sensitive to all four governing nondimensional parameters: hover tip Mach number, advance ratio, local inflow ratio, and thrust coefficient.
NASA Astrophysics Data System (ADS)
Houssein, Hend A. A.; Jaafar, M. S.; Ramli, R. M.; Ismail, N. E.; Ahmad, A. L.; Bermakai, M. Y.
2010-07-01
In this study, the subpopulations of human blood parameters including lymphocytes, the mid-cell fractions (eosinophils, basophils, and monocytes), and granulocytes were determined by electronic sizing in the Health Centre of Universiti Sains Malaysia. These parameters were correlated with human blood characteristics such as age, gender, ethnicity, and blood type, before and after irradiation with a 0.95 mW He-Ne laser (λ = 632.8 nm). The correlations were obtained by finding patterns, paired non-parametric tests, and independent non-parametric tests using SPSS version 11.5, centroid and peak positions, and flux variations. The findings show that the centroid and peak positions, flux peak, and total flux were strongly correlated and can serve as significant indicators for blood analyses. Furthermore, the encircled flux analysis demonstrated good future prospects in blood research, pointing the way toward a vibrant diagnostic tool for clarifying diseases associated with blood.
Parametric study of the lubrication of thrust loaded 120-mm bore ball bearings to 3 million DN
NASA Technical Reports Server (NTRS)
Signer, H.; Bamberger, E. N.; Zaretsky, E. V.
1973-01-01
A parametric study was performed with 120-mm bore angular-contact ball bearings under varying thrust loads, bearing and lubricant temperatures, and cooling and lubricant flow rates. Contact angles were nominally 20 and 24 deg with bearing speeds to 3 million DN. Endurance tests were run at 3 million DN and a temperature of 492 K (425 F) with 10 bearings having a nominal 24 deg contact angle at a thrust load of 22241 N (5000 lb). Bearing operating temperature, differences in temperatures between the inner and outer races, and bearing power consumption can be tuned to any desired operating requirement by varying four parameters: outer-race cooling, inner-race cooling, lubricant flow to the inner race, and oil inlet temperature. Preliminary endurance tests at 3 million DN and 492 K (425 F) indicate that long term bearing operation can be achieved with a high degree of reliability.
Rank-based permutation approaches for non-parametric factorial designs.
Umlauft, Maria; Konietschke, Frank; Pauly, Markus
2017-11-01
Inference methods for null hypotheses formulated in terms of distribution functions in general non-parametric factorial designs are studied. The methods can be applied to continuous, ordinal or even ordered categorical data in a unified way, and are based only on ranks. In this set-up Wald-type statistics and ANOVA-type statistics are the current state of the art. The first method is asymptotically exact but a rather liberal statistical testing procedure for small to moderate sample size, while the latter is only an approximation which does not possess the correct asymptotic α level under the null. To bridge these gaps, a novel permutation approach is proposed which can be seen as a flexible generalization of the Kruskal-Wallis test to all kinds of factorial designs with independent observations. It is proven that the permutation principle is asymptotically correct while keeping its finite exactness property when data are exchangeable. The results of extensive simulation studies foster these theoretical findings. A real data set exemplifies its applicability. © 2017 The British Psychological Society.
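The permutation principle can be illustrated in its simplest two-sample special case: compute mid-ranks once over the pooled sample, then compare the observed difference in mean ranks against its distribution under random reshuffling (the data and names below are hypothetical, and the paper's procedure covers general factorial designs with studentized statistics):

```python
import random
from statistics import mean

def rank_permutation_test(x, y, B=5000, seed=7):
    """Two-sample permutation test on mean ranks: a small special case
    of the rank-based permutation principle (Kruskal-Wallis-type)."""
    rng = random.Random(seed)
    pooled = x + y
    # mid-ranks: average ranks are assigned to tied values
    order = sorted(range(len(pooled)), key=lambda i: pooled[i])
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and pooled[order[j + 1]] == pooled[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    obs = abs(mean(ranks[:len(x)]) - mean(ranks[len(x):]))
    count = 0
    for _ in range(B):
        rng.shuffle(ranks)   # exchangeability under H0 justifies reshuffling
        if abs(mean(ranks[:len(x)]) - mean(ranks[len(x):])) >= obs:
            count += 1
    return (count + 1) / (B + 1)

p = rank_permutation_test([1.1, 2.0, 2.2, 1.5], [3.9, 4.1, 3.5, 4.4])
```

Because only ranks enter the statistic, the same code applies unchanged to ordinal or ordered categorical observations, mirroring the unified treatment in the paper.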
Parametric versus Cox's model: an illustrative analysis of divorce in Canada.
Balakrishnan, T R; Rao, K V; Krotki, K J; Lapierre-adamcyk, E
1988-06-01
Recent demographic literature clearly recognizes the importance of survival models in the analysis of cross-sectional event histories. Of the various survival models, Cox's (1972) partially parametric model has been very popular due to its simplicity and readily available computer software for estimation, sometimes at the cost of precision and parsimony. This paper focuses on parametric failure time models for event history analysis, such as the Weibull, lognormal, loglogistic, and exponential models. The authors also test the goodness of fit of these parametric models versus Cox's proportional hazards model, taking the Kaplan-Meier estimate as the baseline. As an illustration, the authors reanalyze the Canadian Fertility Survey data on first marriage dissolution with parametric models. Though the parametric model estimates were not very different from each other, there seemed to be a slightly better fit with the loglogistic. When 8 covariates were used in the analysis, it was found that the coefficients were similar across the models, and the overall conclusions about the relative risks would not have been different. The findings reveal that in marriage dissolution, the differences according to demographic and socioeconomic characteristics may be far more important than is generally found in many studies. Therefore, one should not treat the population as homogeneous in analyzing survival probabilities of marriages, other than for cursory analysis of overall trends.
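As a sketch of how such parametric candidates can be compared, one can fit each family by maximum likelihood and compare log-likelihoods. The example below uses synthetic uncensored durations and SciPy's distribution fitters (the study itself analyzed censored survey data, which requires proper event-history likelihoods):

```python
import numpy as np
from scipy import stats

# synthetic uncensored durations from a Weibull(shape=1.5) process
rng = np.random.default_rng(3)
durations = stats.weibull_min.rvs(1.5, scale=10, size=200, random_state=rng)

# the four parametric families discussed in the abstract
candidates = {
    "weibull": stats.weibull_min,
    "lognormal": stats.lognorm,
    "loglogistic": stats.fisk,   # scipy's name for the log-logistic
    "exponential": stats.expon,
}
loglik = {}
for name, dist in candidates.items():
    params = dist.fit(durations, floc=0)              # fix location at zero
    loglik[name] = np.sum(dist.logpdf(durations, *params))
```

Since the exponential is the shape-1 special case of the Weibull, its fitted log-likelihood can never exceed the Weibull's here, which is the kind of nesting relationship that makes such comparisons interpretable.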
Moretti, Stefano; van Leeuwen, Danitsja; Gmuender, Hans; Bonassi, Stefano; van Delft, Joost; Kleinjans, Jos; Patrone, Fioravante; Merlo, Domenico Franco
2008-01-01
Background In gene expression analysis, statistical tests for differential gene expression provide lists of candidate genes having, individually, a sufficiently low p-value. However, the interpretation of each single p-value within complex systems involving several interacting genes is problematic. In parallel, over the last sixty years, game theory has been applied to political and social problems to assess the power of interacting agents in forcing a decision and, more recently, to represent the relevance of genes in response to certain conditions. Results In this paper we introduce a bootstrap procedure to test the null hypothesis that each gene has the same relevance between two conditions, where relevance is represented by the Shapley value of a particular coalitional game defined on a microarray data-set. This method, called Comparative Analysis of Shapley value (CASh for short), is applied to data concerning gene expression in children differentially exposed to air pollution. The results provided by CASh are compared with the results from a parametric statistical test for differential gene expression. Both lists of genes provided by CASh and the t-test are informative enough to discriminate exposed subjects on the basis of their gene expression profiles. While many genes are selected in common by CASh and the parametric test, the biological interpretation of the differences between these two selections is more interesting, suggesting a different interpretation of the main biological pathways in gene expression regulation for exposed individuals. A simulation study suggests that CASh offers more power than the t-test for the detection of differential gene expression variability. Conclusion CASh is successfully applied to gene expression analysis of a data-set where the joint expression behavior of genes may be critical to characterize the expression response to air pollution.
We demonstrate a synergistic effect between coalitional games and statistics that resulted in a selection of genes with a potential impact in the regulation of complex pathways. PMID:18764936
Comparison of Salmonella enteritidis phage types isolated from layers and humans in Belgium in 2005.
Welby, Sarah; Imberechts, Hein; Riocreux, Flavien; Bertrand, Sophie; Dierick, Katelijne; Wildemauwe, Christa; Hooyberghs, Jozef; Van der Stede, Yves
2011-08-01
The aim of this study was to investigate the available results for Belgium of the European Union coordinated monitoring program (2004/665 EC) on Salmonella in layers in 2005, as well as the results of the monthly outbreak reports of Salmonella Enteritidis in humans in 2005, to identify possible statistically significant trends in both populations. Separate descriptive statistics and univariate analyses were carried out, and parametric and/or non-parametric hypothesis tests were conducted. A time cluster analysis was performed for all Salmonella Enteritidis phage types (PTs) isolated. The proportions of each Salmonella Enteritidis PT in layers and in humans were compared and the monthly distribution of the most common PT, isolated in both populations, was evaluated. The time cluster analysis revealed significant clusters during the months of May and June for layers and May, July, August, and September for humans. PT21, the most frequently isolated PT in both populations in 2005, seemed to be responsible for these significant clusters. PT4 was the second most frequently isolated PT. No significant difference was found in the monthly trend of either PT between the two populations based on parametric and non-parametric methods. A similar monthly trend of PT distribution in humans and layers during the year 2005 was observed. The time cluster analysis and the statistical significance testing confirmed these results. Moreover, the time cluster analysis showed significant clusters during the summer, slightly delayed in time (humans after layers). These results suggest a common link between the prevalence of Salmonella Enteritidis in layers and the occurrence of the pathogen in humans. Phage typing was confirmed to be a useful tool for identifying temporal trends.
Covariate analysis of bivariate survival data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, L.E.
1992-01-01
The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators, which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values, where the expected values are determined from a specified parametric distribution. The model estimation is based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey were analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models are compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis," where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and computation of related confidence limits. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation or interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
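The product-limit estimator underlying this approach, and the standard flip trick that converts left-censored (nondetect) data into a right-censored problem, can be sketched in a few lines. This is a minimal illustration, not the S-language routines the abstract describes; `km_left_censored` and the flip constant are illustrative choices.

```python
from collections import Counter

def kaplan_meier(times, event):
    """Product-limit survival estimate for right-censored data.
    event[i] is True if an event was observed at times[i], False if censored."""
    deaths = Counter(t for t, e in zip(times, event) if e)
    censd = Counter(t for t, e in zip(times, event) if not e)
    at_risk = len(times)
    s, curve = 1.0, []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:
            s *= 1 - d / at_risk          # S(t) = prod over t_i <= t of (1 - d_i / n_i)
            curve.append((t, s))
        at_risk -= d + censd.get(t, 0)
    return curve

def km_left_censored(values, detected, flip):
    """Left-censored (nondetect) data: flip about a constant larger than the
    largest value, treat flipped nondetects as right-censored, run K-M,
    then map the curve back to the original scale."""
    flipped = [flip - v for v in values]
    return [(flip - t, s) for t, s in kaplan_meier(flipped, detected)]
```

For example, `km_left_censored([0.5, 1.0, 2.0], [True, True, True], flip=10.0)` returns the estimated curve at the original concentrations 2.0, 1.0, and 0.5.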
Bansal, Ravi; Peterson, Bradley S
2018-06-01
Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control of false positives across these multiple hypothesis tests. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%. The high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average fewer than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated their null distributions from data that include those large clusters and therefore, by construction, rejected the large clusters as false positives at the nominal FWERs. Those rejected clusters were outlying values in the distribution of cluster size but cannot be distinguished from true positive findings without further analyses, including assessing whether the fMRI signal in those regions correlates with other clinical, behavioral, or cognitive measures. Rejecting the large clusters, however, significantly reduced the statistical power of nonparametric methods to detect true findings compared with parametric methods, which would have detected most true findings that are essential for making valid biological inferences from MRI data. Parametric analyses, in contrast, detected most true findings while generating relatively few false positives: on average, fewer than one of those very large clusters would be deemed a true finding in each brain-wide analysis. We therefore recommend the continued use of parametric methods that model nonstationary smoothness for cluster-level, familywise control of false positives, particularly when using a Cluster Defining Threshold of 2.5 or higher, and subsequently assessing rigorously the biological plausibility of the findings, even for large clusters. Finally, because nonparametric methods yielded a large reduction in statistical power to detect true positive findings, we conclude that the modest reduction in false positive findings that nonparametric analyses afford does not warrant a re-analysis of previously published fMRI studies using nonparametric techniques. Copyright © 2018 Elsevier Inc. All rights reserved.
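The cluster-size analysis at the heart of this argument can be illustrated in one dimension: smooth a noise field, threshold it, and measure the sizes of contiguous supra-threshold runs. A minimal sketch (moving-average smoothing stands in for Gaussian smoothing; in 2D/3D, the tail of this cluster-size distribution contains the rare, very large clusters discussed above):

```python
import random

def smooth(xs, w):
    """Moving-average smoothing, a crude stand-in for Gaussian smoothing."""
    return [sum(xs[max(0, i - w):i + w + 1]) / len(xs[max(0, i - w):i + w + 1])
            for i in range(len(xs))]

def cluster_sizes(field, thresh):
    """Sizes of contiguous supra-threshold runs ("clusters") in a 1-D field."""
    sizes, run = [], 0
    for v in field:
        if v > thresh:
            run += 1
        elif run:
            sizes.append(run)
            run = 0
    if run:
        sizes.append(run)
    return sizes

random.seed(0)
field = smooth([random.gauss(0, 1) for _ in range(5000)], 5)
sizes = cluster_sizes(field, 0.2)   # the distribution's extreme tail is the issue
```

Cluster-level inference then asks whether an observed cluster size is improbably large under such a null distribution; the abstract's point is that smooth high-dimensional fields intrinsically produce a few extreme sizes.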
PM-1 NUCLEAR POWER PLANT PROGRAM. Quarterly Progress Report No. 2 for June 1 to August 31, 1959
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sieg, J.S.; Smith, E.H.
1959-10-01
The objective of the contract is the design, development, fabrication, installation, and initial testing and operation of a prepackaged, air-transportable, pressurized water reactor nuclear power plant, the PM-1. The specified output is 1 Mwe and 7 million Btu/hr of heat. The plant is to be operational by March 1962. The principal efforts were completion of the plant parametric study and preparation of the preliminary design. A summary of design parameters is given. Systems development work included study and selection of packages for full-scale testing, a survey of in-core instrumentation techniques, control and instrumentation development, and development of components for the steam generator, condenser, and turbine generator, which are not commercially available. Reactor development work included completion of the parametric zero-power experiments and preparations for a flexible zero-power test program, a revision of plans for irradiation testing of PM-1 fuel elements, initiation of a reactor flow test program, outlining of a heat transfer test program, completion of the seven-tube test section (SETCH-1) tests, and evaluation of control rod actuators leading to specification of a magnetic jack-type control rod drive similar to that reported in ANL-5768. Completion of the preliminary design led to initiation of the final design effort, which will be the principal activity during the next two project quarters. Preparations for core fabrication included procurement of core cladding material for the zero-power test core, arrangement with a subcontractor to convert UF6 to UO2 and to commence delivery of the oxide during the next quarter, development of fuel element fabrication and ultrasonic testing techniques, and study of control rod materials, UO2 recovery techniques, and boron analysis methods. Preliminary work on site preparation was pursued with receipt of USAEC approval for a location on the eastern slope of Warren Peak at Sundance, Wyoming. A survey of this site is underway. A preliminary Hazards Summary Report is in preparation. (For the preceding period see MND-M-1812.) (auth)
Language Learning Strategy Use and Reading Achievement
ERIC Educational Resources Information Center
Ghafournia, Narjes
2014-01-01
The current study investigated the differences across the varying levels of EFL learners in the frequency and choice of learning strategies. Using a reading test, questionnaire, and parametric statistical analysis, the findings revealed discrepancies among the participants in the implementation of language-learning strategies concerning their…
Design, fabrication, and testing of a SMA hybrid composite jet engine chevron
NASA Technical Reports Server (NTRS)
Turner, Travis L.; Cabell, Randolph H.; Cano, Roberto J.; Fleming, Gary A.
2006-01-01
Control of jet noise continues to be an important research topic. Exhaust nozzle chevrons have been shown to reduce jet noise, but parametric effects are not well understood. Additionally, thrust loss due to chevrons at cruise suggests significant benefit from deployable chevrons. The focus of this study is the development of an active chevron concept, primarily for laboratory parametric studies of jet noise reduction and for technology development that can be leveraged for full-scale systems. The active chevron concept employed in this work consists of a laminated composite structure with embedded shape memory alloy (SMA) actuators, termed an SMA hybrid composite (SMAHC). The actuators are embedded on one side of the middle surface such that thermal excitation generates a moment and deflects the structure. A brief description of the chevron design is given, followed by details of the fabrication approach. Results from benchtop tests are presented and correlated with numerical predictions from a model for such structures that was recently implemented in MSC.Nastran and ABAQUS. Excellent performance and agreement with predictions are demonstrated. Results from tests in a representative flow environment are also presented. Excellent performance is again achieved for both open- and closed-loop tests, the latter demonstrating control to a specified immersion into the flow. The actuation authority and immersion performance are shown to be relatively insensitive to nozzle pressure ratio (NPR). Very repeatable immersion control with modest power requirements is demonstrated.
What is the Most Sensitive Measure of Water Maze Probe Test Performance?
Maei, Hamid R.; Zaslavsky, Kirill; Teixeira, Cátia M.; Frankland, Paul W.
2009-01-01
The water maze is commonly used to assay spatial cognition, or, more generally, learning and memory in experimental rodent models. In the water maze, mice or rats are trained to navigate to a platform located below the water's surface. Spatial learning is then typically assessed in a probe test, where the platform is removed from the pool and the mouse or rat is allowed to search for it. Performance in the probe test may then be evaluated using either occupancy-based (percent time in a virtual quadrant [Q] or zone [Z] centered on former platform location), error-based (mean proximity to former platform location [P]) or counting-based (platform crossings [X]) measures. While these measures differ in their popularity, whether they differ in their ability to detect group differences is not known. To address this question we compiled five separate databases, containing more than 1600 mouse probe tests. Random selection of individual trials from respective databases then allowed us to simulate experiments with varying sample and effect sizes. Using this Monte Carlo-based method, we found that the P measure consistently outperformed the Q, Z and X measures in its ability to detect group differences. This was the case regardless of sample or effect size, and using both parametric and non-parametric statistical analyses. The relative superiority of P over other commonly used measures suggests that it is the most appropriate measure to employ in both low- and high-throughput water maze screens. PMID:19404412
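The four probe-test measures compared above can be computed directly from a sampled swim path. A minimal sketch, assuming the path is a list of (x, y) samples and that the target quadrant is the upper-right region relative to the pool centre (both assumptions mine, for illustration):

```python
import math

def proximity(path, platform):
    """P: mean distance of the animal from the former platform location."""
    px, py = platform
    return sum(math.hypot(x - px, y - py) for x, y in path) / len(path)

def quadrant_occupancy(path, centre):
    """Q: fraction of samples in the target quadrant (here: x > cx and y > cy)."""
    cx, cy = centre
    return sum(1 for x, y in path if x > cx and y > cy) / len(path)

def crossings(path, platform, radius):
    """X: number of entries into a virtual platform zone of the given radius."""
    px, py = platform
    inside = [math.hypot(x - px, y - py) <= radius for x, y in path]
    return int(inside[0]) + sum(1 for a, b in zip(inside, inside[1:])
                                if b and not a)
```

Note the design difference the abstract exploits: P uses every sample continuously, while Q and X discretize the path, which plausibly contributes to P's greater sensitivity.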
Miles, Lachlan F; Marchiori, Paolo; Falter, Florian
2017-09-01
This manuscript presents a pilot study assessing the feasibility of a single-compartment, individualised, pharmacokinetic algorithm for protamine dosing after cardiopulmonary bypass. A pilot cohort study was conducted in a specialist NHS cardiothoracic hospital, targeting patients undergoing elective cardiac surgery using cardiopulmonary bypass. Patients received protamine doses according to a pharmacokinetic algorithm (n = 30) or using an empirical, fixed-dose model (n = 30). Categorical differences between the groups were evaluated using the Chi-squared test or Fisher's exact test. Continuous data were analysed using a paired Student's t-test for parametric data and the paired samples Wilcoxon test for non-parametric data. Patients whose protamine dosing followed the algorithm demonstrated a lower protamine requirement post-bypass relative to empirical management, as measured by absolute dose (243 ± 49 mg vs. 305 ± 34.7 mg; p < 0.001) and the heparin-to-protamine ratio (0.79 ± 0.12 vs. 1.1 ± 0.15; p < 0.001). There was no difference in the pre- to post-bypass activated clotting time (ACT) ratio (1.05 ± 0.12 vs. 1.02 ± 0.15; p = 0.9). Patients who received protamine according to the algorithm had no significant difference in transfusion requirement (13.3% vs. 30.0%; p = 0.21). This study showed that an individualised pharmacokinetic algorithm for the reversal of heparin after cardiopulmonary bypass is feasible and, compared with a fixed dosing strategy, may reduce the protamine requirement following on-pump cardiac surgery.
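The paired parametric test used in studies like this has a simple closed form. A minimal sketch, with a sign test standing in as a stdlib-friendly nonparametric counterpart (the study itself used the paired samples Wilcoxon test, which requires a stats package):

```python
import math
import statistics

def paired_t(before, after):
    """Paired Student's t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the within-patient differences."""
    d = [a - b for a, b in zip(after, before)]
    return statistics.mean(d) / (statistics.stdev(d) / math.sqrt(len(d)))

def sign_test_counts(before, after):
    """A simple nonparametric counterpart: counts of positive and negative
    paired differences (ties dropped), referred to a Binomial(n, 0.5) null."""
    d = [a - b for a, b in zip(after, before) if a != b]
    pos = sum(1 for x in d if x > 0)
    return pos, len(d) - pos
```

The choice between the two mirrors the abstract's split: the t statistic assumes roughly normal differences, while the rank- or sign-based test does not.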
Testing the Kerr metric with the iron line and the KRZ parametrization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ni, Yueying; Jiang, Jiachen; Bambi, Cosimo, E-mail: yyni13@fudan.edu.cn, E-mail: jcjiang12@fudan.edu.cn, E-mail: bambi@fudan.edu.cn
The spacetime geometry around astrophysical black holes is supposed to be well approximated by the Kerr metric, but deviations from the Kerr solution are predicted in a number of scenarios involving new physics. Broad iron Kα lines are commonly observed in the X-ray spectrum of black holes and originate from X-ray fluorescence of the inner accretion disk. The profile of the iron line is sensitively affected by the spacetime geometry in the strong gravity region and can be used to test the Kerr black hole hypothesis. In this paper, we extend previous work in the literature. In particular: (i) as test metric, we employ the parametrization recently proposed by Konoplya, Rezzolla, and Zhidenko, which has a number of subtle advantages with respect to the existing approaches; (ii) we perform simulations with specific X-ray missions, and we consider NuSTAR as a prototype of current observational facilities and eXTP as an example of the next generation of X-ray observatories. We find a significant difference between the constraining power of NuSTAR and eXTP. With NuSTAR, it is difficult or impossible to constrain deviations from the Kerr metric. With eXTP, in most cases we can obtain quite stringent constraints (provided we have the correct astrophysical model).
Han, Maggie; Ye, Xiaoling; Preciado, Priscila; Williams, Schantel; Campos, Israel; Bonner, Marcee; Young, Candace; Marsh, Daniel; Larkin, John W; Usvyat, Len A; Maddux, Franklin W; Pecoits-Filho, Roberto; Kotanko, Peter
2018-01-01
Neighborhood walkability is associated with indicators of health in the general population. We explored the association between neighborhood walkability and daily steps in hemodialysis (HD) patients. We measured daily steps over 5 weeks using Fitbit Flex (Fitbit, San Francisco, CA, USA) and retrieved Walk Score® (WS) data by patient's home ZIP code (www.walkscore.com; 0 = poorest walkability; 100 = greatest walkability). HD patients took a mean of 6,393 ± 3,550 steps/day (n = 46). The median WS of the neighborhoods where they resided was 28. Patients in an above-median WS neighborhood (n = 27) took significantly more daily steps than those in a below-median WS neighborhood (n = 19) (7,514 ± 3,900 vs. 4,800 ± 2,228 steps/day; p < 0.001, t test). Daily steps and WS were directly correlated (R = 0.425, p = 0.0032, parametric test; R = 0.359, p = 0.0143, non-parametric test). This is the first study conducted among HD patients to indicate a direct relationship between neighborhood walkability and the actual steps taken. These results should be considered when designing initiatives to increase and improve exercise routines in HD populations. © 2018 S. Karger AG, Basel.
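The two correlations reported (parametric and non-parametric) correspond to Pearson's r and a rank-based coefficient such as Spearman's rho. A minimal sketch of both (the tie-handling in the ranking is a standard choice, not necessarily the paper's):

```python
def pearson(xs, ys):
    """Pearson's r: the parametric (linear) correlation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def ranks(xs):
    """Average ranks, with ties sharing the mean of their rank positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation applied to the ranks."""
    return pearson(ranks(xs), ranks(ys))
```

Reporting both, as the study does, guards against the Pearson value being driven by non-normality or a nonlinear but monotone relationship.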
Static Strength of Adhesively-bonded Woven Fabric Kenaf Composite Plates
NASA Astrophysics Data System (ADS)
Hilton, Ahmad; Lee, Sim Yee; Supar, Khairi
2017-06-01
Natural fibers are potentially used as reinforcing materials and combined with epoxy resin as a matrix system to form materials with superior specific strength (or stiffness), known as composite materials. The advantages of natural fibers such as kenaf are that they are renewable, less hazardous during fabrication and handling, and relatively cheap compared to synthetic fibers. The aim of the current work is to conduct a parametric study on the static strength of adhesively bonded woven fabric kenaf composite plates. Fabrication of the composite panels was conducted using hand lay-up techniques, with variation of stacking sequence, overlap length, joint type, and lay-up type as identified in the testing series. Quasi-static testing was carried out using mechanical testing following standard codes of practice. Load-displacement profiles were analyzed to study the structural response prior to ultimate failure. It was found that the cross-ply lay-up demonstrates better static strength than its quasi-isotropic counterpart, due to the larger volume of 0° plies in the cross-ply lay-up. A larger overlap length gives better joint strength, as expected; however, this incurs a weight penalty in the joined structure. Most samples failed within the adhesive region (cohesive failure), although a few samples demonstrated interface failure. Good correlations were found in the parametric study and are discussed in the respective sections.
Luna, Stelio Pacca Loureiro; Martino, Irene Di; Lorena, Silvia Elaine Rodolfo de Sá; Capua, Maria Luisa Buffo de; Lima, Alfredo Feio da Maia; Santos, Bianca Paiva Costa Rodrigues dos; Brondani, Juliana Tabarelli; Vesce, Giancarlo
2015-12-01
To investigate the analgesic effect of acupuncture (AP) or micro-dose pharmacopuncture (PA), using carprofen or morphine, in bitches undergoing ovariohysterectomy (OHE). Thirty-five dogs were randomly assigned to five groups after sedation with acepromazine IM: AP, 0.5 mg.kg(-1) of morphine subcutaneously (SC), 4 mg.kg(-1) of carprofen SC, and PA with 0.05 mg.kg(-1) of morphine or 0.4 mg.kg(-1) of carprofen. Anaesthesia was induced with propofol and maintained with isoflurane. Pain was assessed after OHE by a blinded observer for 24 h, using the dynamic visual analogue scale (DIVAS), Glasgow (CMPS-SF), Melbourne (UMPS) and Colorado University (CSU) pain scales. Animals reaching 33% of the UMPS score received rescue analgesia with morphine IM. Non-parametric data were analysed by the Kruskal-Wallis or Friedman test where applicable, followed by Dunn's test. Parametric data were analysed by two-way ANOVA, followed by Tukey's test. There were no differences among groups in the number of rescue analgesia doses. Except for the DIVAS score, where animals treated with morphine had the lowest score compared with AP and carprofen at 1 h after surgery, there were no other differences among groups. Acupuncture and pharmacopuncture were as effective as morphine or carprofen in controlling postoperative pain in bitches undergoing ovariohysterectomy.
Schuitemaker, Alie; van Berckel, Bart N M; Kropholler, Marc A; Veltman, Dick J; Scheltens, Philip; Jonker, Cees; Lammertsma, Adriaan A; Boellaard, Ronald
2007-05-01
(R)-[11C]PK11195 has been used for quantifying cerebral microglial activation in vivo. In previous studies, both plasma input and reference tissue methods have been used, usually in combination with a region of interest (ROI) approach. Definition of ROIs, however, can be laborious and prone to interobserver variation. In addition, results are only obtained for predefined areas, and (unexpected) signals in undefined areas may be missed. On the other hand, standard pharmacokinetic models are too sensitive to noise to calculate (R)-[11C]PK11195 binding on a voxel-by-voxel basis. Linearised versions of both plasma input and reference tissue models have been described, and these are more suitable for parametric imaging. The purpose of this study was to compare the performance of these plasma input and reference tissue parametric methods on the outcome of statistical parametric mapping (SPM) analysis of (R)-[11C]PK11195 binding. Dynamic (R)-[11C]PK11195 PET scans with arterial blood sampling were performed in 7 younger and 11 elderly healthy subjects. Parametric images of volume of distribution (Vd) and binding potential (BP) were generated using linearised versions of plasma input (Logan) and reference tissue (Reference Parametric Mapping) models. Images were compared at the group level using SPM with a two-sample t-test per voxel, both with and without proportional scaling. Parametric BP images without scaling provided the most sensitive framework for determining differences in (R)-[11C]PK11195 binding between younger and elderly subjects. Vd images could only demonstrate differences in (R)-[11C]PK11195 binding when analysed with proportional scaling, due to intersubject variation in K1/k2 (blood-brain barrier transport and non-specific binding).
Parametric down-conversion with nonideal and random quasi-phase-matching
NASA Astrophysics Data System (ADS)
Yang, Chun-Yao; Lin, Chun; Liljestrand, Charlotte; Su, Wei-Min; Canalias, Carlota; Chuu, Chih-Sung
2016-05-01
Quasi-phase-matching (QPM) has enriched the capacity of parametric down-conversion (PDC) to generate biphotons for many fundamental tests and advanced applications. However, it is not clear how nonidealities and randomness in the QPM grating of a parametric down-converter may affect the quantum properties of the biphotons. This paper provides insights into the interplay between PDC and nonideal or random QPM structures. Using a periodically poled nonlinear crystal with short periodicity, we conduct experimental and theoretical studies of PDC subject to a nonideal duty cycle and random errors in domain lengths. We report the observation of biphotons emerging through noncritical birefringent phase-matching, which cannot occur in PDC with an ideal QPM grating, and a biphoton spectrum determined by the details of the nonidealities and randomness. We also observe QPM biphotons with a diminished strength. These features are both confirmed by our theory. Our work provides new perspectives for biphoton engineering with QPM.
Stress Recovery and Error Estimation for Shell Structures
NASA Technical Reports Server (NTRS)
Yazdani, A. A.; Riggs, H. R.; Tessler, A.
2000-01-01
The Penalized Discrete Least-Squares (PDLS) stress recovery (smoothing) technique developed for two-dimensional linear elliptic problems is adapted here to three-dimensional shell structures. The surfaces are restricted to those which have a 2-D parametric representation, or which can be built up from such surfaces. The proposed strategy involves mapping the finite element results to the 2-D parametric space which describes the geometry; smoothing is then carried out in the parametric space using the PDLS-based Smoothing Element Analysis (SEA). Numerical results for two well-known shell problems are presented to illustrate the performance of SEA/PDLS for these problems. The recovered stresses are used in the Zienkiewicz-Zhu a posteriori error estimator. The estimated errors are used to demonstrate the performance of SEA-recovered stresses in automated adaptive mesh refinement of shell structures. The numerical results are encouraging. Further testing involving more complex, practical structures is necessary.
Experimental parametric study of servers cooling management in data centers buildings
NASA Astrophysics Data System (ADS)
Nada, S. A.; Elfeky, K. E.; Attia, Ali M. A.; Alshaer, W. G.
2017-06-01
A parametric study of air flow and cooling management of data center servers is experimentally conducted for different design conditions. A physical scale model of a data center accommodating one rack of four servers was designed and constructed for testing purposes. Front and rear rack and server temperature distributions and supply/return heat indices (SHI/RHI) are used to evaluate data center thermal performance. Experiments were conducted to parametrically study the effects of perforated-tile opening ratio, server power load variation, and rack power density. The results showed that (1) a perforated tile of 25% opening ratio provides the best results among the tested opening ratios, (2) the optimum benefit of the cold air in server cooling is obtained with uniform power loading of the servers, and (3) increasing the power density decreases air recirculation but increases air bypass and server temperatures. The present results are compared with previous experimental and CFD results, and fair agreement was found.
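The supply/return heat indices (SHI/RHI) used here to evaluate thermal performance are commonly defined from the supply, rack-inlet, and rack-outlet temperatures. A sketch using one common formulation (Sharma-style indices; the paper's exact definitions may differ):

```python
def supply_heat_index(t_rack_in, t_rack_out, t_supply):
    """SHI = (T_in - T_sup) / (T_out - T_sup): the fraction of the rack's
    temperature rise already present at the inlet, i.e. due to hot-air
    recirculation. 0 means no recirculation; values near 1 mean heavy
    recirculation."""
    return (t_rack_in - t_supply) / (t_rack_out - t_supply)

def return_heat_index(t_rack_in, t_rack_out, t_supply):
    """RHI = 1 - SHI: the complementary index for the return path."""
    return 1.0 - supply_heat_index(t_rack_in, t_rack_out, t_supply)
```

For example, a 15 °C supply, 20 °C rack inlet, and 35 °C rack outlet give SHI = 0.25: a quarter of the rack's temperature rise comes from recirculated hot air.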
NASA Technical Reports Server (NTRS)
Schuller, F. T.; Pinel, S. I.; Signer, H. R.
1980-01-01
Parametric tests were conducted with a 35 mm bore angular contact ball bearing with a double outer-land-guided cage. Provisions were made for jet lubrication and outer-ring cooling of the bearing. Test conditions included a combined thrust and radial load at nominal shaft speeds of 48,000 rpm and an oil-in temperature of 394 K (250 F). Successful operation of the test bearing was accomplished up to 2.5 million DN. Test results were compared with those obtained with a similar bearing having a single outer-land-guided cage. Higher temperatures were generated with the double outer-land-guided cage bearing, and bearing power loss and cage slip were greater. Cooling the outer ring resulted in a decrease in overall bearing operating temperature.
Experimental parametric studies of transonic T-tail flutter. [wind tunnel tests
NASA Technical Reports Server (NTRS)
Ruhlin, C. L.; Sandford, M. C.
1975-01-01
Wind-tunnel tests of the T-tail of a wide-body jet airplane were made at Mach numbers up to 1.02. The model consisted of a 1/13-size scaled version of the T-tail, fuselage, and inboard wing of the airplane. Two interchangeable T-tails were tested, one with design stiffness for flutter-clearance studies and one with reduced stiffness for flutter-trend studies. Transonic antisymmetric-flutter boundaries were determined for the models with variations in: (1) fin-spar stiffness, (2) stabilizer dihedral angle (-5 deg and 0 deg), (3) wing and forward-fuselage shape, and (4) nose shape of the fin-stabilizer juncture. A transonic symmetric-flutter boundary and flutter trends were established for variations in stabilizer pitch stiffness. Photographs of the test configurations are shown.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanna, T.; Sakkaravarthi, K.; Kumar, C. Senthil
In this paper, we have studied the integrability nature of a system of three coupled Gross-Pitaevskii-type nonlinear evolution equations, arising in the context of spinor Bose-Einstein condensates, by applying the Painlevé singularity structure analysis. We show that the system passes the Painlevé test only for two sets of parametric choices, corresponding to the known integrable cases.
PILOT SCALE PROCESS EVALUATION OF REBURNING FOR IN-FURNACE NOX REDUCTION
The report gives results of coal and natural gas reburning application tests to a pilot scale 3.0 MWt furnace to provide the scaling information required for commercial application of reburning to pulverized-coal-fired boilers. Initial parametric studies had been conducted in a 2...
A Design-Based Engineering Graphics Course for First-Year Students.
ERIC Educational Resources Information Center
Smith, Shana Shiang-Fong
2003-01-01
Describes the first-year Introduction to Design course at Iowa State University which incorporates design for manufacturing and concurrent engineering principles into the curriculum. Autodesk Inventor was used as the primary CAD tool for parametric solid modeling. Test results show that student spatial visualization skills were dramatically…
Relative Reinforcer Rates and Magnitudes Do Not Control Concurrent Choice Independently
ERIC Educational Resources Information Center
Elliffe, Douglas; Davison, Michael; Landon, Jason
2008-01-01
One assumption of the matching approach to choice is that different independent variables control choice independently of each other. We tested this assumption for reinforcer rate and magnitude in an extensive parametric experiment. Five pigeons responded for food reinforcement on switching-key concurrent variable-interval variable-interval…
Application of Transformations in Parametric Inference
ERIC Educational Resources Information Center
Brownstein, Naomi; Pensky, Marianna
2008-01-01
The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…
Smirr, Jean-Loup; Guilbaud, Sylvain; Ghalbouni, Joe; Frey, Robert; Diamanti, Eleni; Alléaume, Romain; Zaquine, Isabelle
2011-01-17
Fast characterization of pulsed spontaneous parametric down conversion (SPDC) sources is important for applications in quantum information processing and communications. We propose a simple method to perform this task, which only requires measuring the counts on the two output channels and the coincidences between them, as well as modeling the filter used to reduce the source bandwidth. The proposed method is experimentally tested and used for a complete evaluation of SPDC sources (pair emission probability, total losses, and fidelity) of various bandwidths. This method can find applications in the setting up of SPDC sources and in the continuous verification of the quality of quantum communication links.
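The count-based characterization described above can be sketched under a simple low-gain model: with pair probability μ per pulse, channel transmissions η1 and η2, and R pulses, singles scale as S1 = μη1R and S2 = μη2R while coincidences scale as C = μη1η2R, which inverts to the estimators below. The model and names are illustrative assumptions, not the authors' exact formulas:

```python
def spdc_metrics(singles1, singles2, coincidences, n_pulses):
    """Loss and pair-emission estimates from raw counts, assuming a low-gain
    pulsed SPDC source and negligible dark counts (illustrative model only):
    S1 = mu*eta1*R,  S2 = mu*eta2*R,  C = mu*eta1*eta2*R."""
    eta1 = coincidences / singles2          # channel-1 transmission (heralding)
    eta2 = coincidences / singles1          # channel-2 transmission
    mu = singles1 * singles2 / (coincidences * n_pulses)  # pairs per pulse
    return eta1, eta2, mu
```

The appeal of this scheme, as the abstract notes, is that only singles and coincidence counters are needed; no tomography is required for a first-pass source evaluation.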
NASA Technical Reports Server (NTRS)
Kiess, Thomas E.; Shih, Yan-Hua; Sergienko, A. V.; Alley, Carroll O.
1994-01-01
We report a new two-photon polarization correlation experiment for realizing the Einstein-Podolsky-Rosen-Bohm (EPRB) state and for testing Bell-type inequalities. We use the pair of orthogonally polarized light quanta generated in type-II parametric down-conversion. Using 1 nm interference filters in front of our detectors, we observe from the output of a 0.5 mm beta-BaB2O4 (BBO) crystal the EPRB correlations in coincidence counts, and measure an associated Bell inequality violation of 22 standard deviations. The quantum state of the photon pair is a polarization analog of the spin-1/2 singlet state.
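The reported Bell-inequality violation can be illustrated with the ideal quantum prediction for the EPRB polarization state: at the standard CHSH analyzer angles the parameter S reaches 2√2, beyond the local-realist bound of 2. A sketch using the ideal correlation function (no experimental imperfections; the experiment's actual analyzer settings may differ):

```python
import math

def E(theta1, theta2):
    """Ideal polarization correlation of the EPRB (singlet-like) state,
    E = -cos(2 * (theta1 - theta2)); analyzer angles in degrees."""
    return -math.cos(2.0 * math.radians(theta1 - theta2))

def chsh(a, a2, b, b2):
    """CHSH parameter S; local realism requires |S| <= 2."""
    return abs(E(a, b) - E(a, b2)) + abs(E(a2, b) + E(a2, b2))

S = chsh(0.0, 45.0, 22.5, 67.5)   # quantum prediction: 2 * sqrt(2)
```

In the experiment, each E value is estimated from coincidence counts at the four analyzer-setting combinations, and the 22-standard-deviation figure expresses how far the measured S lies above 2 relative to its statistical uncertainty.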
Development of Corrections for Biomass Burning Effects in Version 2 of GEWEX/SRB Algorithm
NASA Technical Reports Server (NTRS)
Pinker, Rachel T.; Laszlo, I.; Dicus, Dennis L. (Technical Monitor)
1999-01-01
The objectives of this project were: (1) To incorporate into an existing version of the University of Maryland Surface Radiation Budget (SRB) model the optical parameters of forest fire aerosols, using the best available information, as well as the optical properties of other aerosols identified as significant. (2) To run the model on regional scales with the new parametrization and with information on forest fire occurrence and plume advection, as available from NASA LARC, and to test improvements in inferring surface fluxes against daily values of measured fluxes. (3) To develop a strategy for incorporating the new parametrization on a global scale and for transferring the modified model to NASA LARC.
Digital Modeling and Testing Research on Digging Mechanism of Deep Rootstalk Crops
NASA Astrophysics Data System (ADS)
Yang, Chuanhua; Xu, Ma; Wang, Zhoufei; Yang, Wenwu; Liao, Xinglong
Digital models of the test-bench parts for digging deep-rootstalk crops were established using feature-based parametric modeling technology. Virtual assembly of the test bench was then performed, yielding the complete digital model of its parts. The vibrospade, the key component of the test bench, was simulated, and the parametric motion curves of the spear on the vibrospade were obtained. The results show that the spear meets the design requirements and is well suited to digging deep rootstalks.
Implicit Priors in Galaxy Cluster Mass and Scaling Relation Determinations
NASA Technical Reports Server (NTRS)
Mantz, A.; Allen, S. W.
2011-01-01
Deriving the total masses of galaxy clusters from observations of the intracluster medium (ICM) generally requires some prior information, in addition to the assumptions of hydrostatic equilibrium and spherical symmetry. Often, this information takes the form of particular parametrized functions used to describe the cluster gas density and temperature profiles. In this paper, we investigate the implicit priors on hydrostatic masses that result from this fully parametric approach, and the implications of such priors for scaling relations formed from those masses. We show that the application of such fully parametric models of the ICM naturally imposes a prior on the slopes of the derived scaling relations, favoring the self-similar model, and argue that this prior may be influential in practice. In contrast, this bias does not exist for techniques which adopt an explicit prior on the form of the mass profile but describe the ICM non-parametrically. Constraints on the slope of the cluster mass-temperature relation in the literature show a separation based on the approach employed, with the results from fully parametric ICM modeling clustering nearer the self-similar value. Given that a primary goal of scaling relation analyses is to test the self-similar model, the application of methods subject to strong, implicit priors should be avoided. Alternative methods and best practices are discussed.
Catalytic distillation water recovery subsystem
NASA Technical Reports Server (NTRS)
Budininkas, P.; Rasouli, F.
1985-01-01
An integrated engineering breadboard subsystem for the recovery of potable water from untreated urine based on the vapor phase catalytic ammonia removal was designed, fabricated and tested. Unlike other evaporative methods, this process catalytically oxidizes ammonia and volatile hydrocarbons vaporizing with water to innocuous products; therefore, no pretreatment of urine is required. Since the subsystem is fabricated from commercially available components, its volume, weight and power requirements are not optimized; however, it is suitable for zero-g operation. The testing program consists of parametric tests, one month of daily tests and a continuous test of 168 hours duration. The recovered water is clear, odorless, low in ammonia and organic carbon, and requires only an adjustment of its pH to meet potable water standards. The obtained data indicate that the vapor phase catalytic ammonia removal process, if further developed, would also be competitive with other water recovery systems in weight, volume and power requirements.
Moscoso del Prado Martín, Fermín
2013-12-01
I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.
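BAS itself is not reproduced here, but the kind of scaling question it addresses can be illustrated with a simple (non-Bayesian) aggregated-variance estimate of the Hurst exponent: the variance of block means of size m scales as m^(2H-2), so the slope of log-variance against log-block-size gives H. The estimator below is a hedged sketch, not the BAS procedure; for uncorrelated (white) noise the expected value is H = 1/2:

```python
import numpy as np

def hurst_aggregated_variance(x, block_sizes):
    """Estimate H from the slope of log Var(block means) vs log m: slope = 2H - 2."""
    log_m, log_v = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(means.var()))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1 + slope / 2

rng = np.random.default_rng(42)
white = rng.standard_normal(2**15)
H = hurst_aggregated_variance(white, [4, 8, 16, 32, 64, 128])
print(round(H, 2))   # near 0.5 for uncorrelated noise
```

Unlike BAS, this point estimator provides no posterior over scaling regimes and cannot test finely specified hypotheses on the scaling index.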
NASA Technical Reports Server (NTRS)
Johnson, J. D.; Braddock, W. F.
1974-01-01
A test of a 0.563 percent scale space shuttle Solid Rocket Booster (SRB) model, MSFC Model 449, was conducted in a trisonic wind tunnel. Test Mach numbers were 0.4, 0.6, 0.9, 1.2, 1.96, 3.48, 4.0, 4.45, and 4.96. Test angles of attack ranged from minus 10 degrees to 190 degrees. Test Reynolds numbers ranged from 3.0 million per foot to 8.6 million per foot. Test roll angles were 0, 11.25, 22.5, 45, and 90 degrees. In addition to the static stability evaluation of the primary SRB configuration, five parametric investigations were made: (1) effect of Reynolds number, (2) effect of engine shroud flare angle, (3) effect of engine shroud length, (4) effect of engine shroud strakes, and (5) effect of engine shroud strakes and thrust vector control bottles.
The linear transformation model with frailties for the analysis of item response times.
Wang, Chun; Chang, Hua-Hua; Douglas, Jeffrey A
2013-02-01
The item response times (RTs) collected from computerized testing represent an underutilized source of information about items and examinees. In addition to knowing the examinees' responses to each item, we can investigate the amount of time examinees spend on each item. In this paper, we propose a semi-parametric model for RTs, the linear transformation model with a latent speed covariate, which combines the flexibility of non-parametric modelling and the brevity as well as interpretability of parametric modelling. In this new model, the RTs, after some non-parametric monotone transformation, become a linear model with latent speed as covariate plus an error term. The distribution of the error term implicitly defines the relationship between the RT and examinees' latent speeds; whereas the non-parametric transformation is able to describe various shapes of RT distributions. The linear transformation model represents a rich family of models that includes the Cox proportional hazards model, the Box-Cox normal model, and many other models as special cases. This new model is embedded in a hierarchical framework so that both RTs and responses are modelled simultaneously. A two-stage estimation method is proposed. In the first stage, the Markov chain Monte Carlo method is employed to estimate the parametric part of the model. In the second stage, an estimating equation method with a recursive algorithm is adopted to estimate the non-parametric transformation. Applicability of the new model is demonstrated with a simulation study and a real data application. Finally, methods to evaluate the model fit are suggested. © 2012 The British Psychological Society.
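As a toy instance of the model family described, consider the Box-Cox normal special case with a log transformation, i.e. log-normal RTs: log T = intercept + beta·speed + error. The sketch below simulates such data and recovers the slope by ordinary least squares; it is a hedged illustration with invented values, not the authors' two-stage MCMC and estimating-equation procedure (which also estimates the transformation non-parametrically and treats speed as latent):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
speed = rng.standard_normal(n)          # latent speed covariate (treated as known here)
beta = -0.5                             # faster examinees -> shorter response times
log_rt = 1.0 + beta * speed + 0.3 * rng.standard_normal(n)
rt = np.exp(log_rt)                     # observed response times

# With transformation H(t) = log t, the model reduces to ordinary least squares:
slope, intercept = np.polyfit(speed, np.log(rt), 1)
print(round(slope, 2))   # close to the true beta = -0.5
```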
Formation of parametric images using mixed-effects models: a feasibility study.
Huang, Husan-Ming; Shih, Yi-Yu; Lin, Chieh
2016-03-01
Mixed-effects models have been widely used in the analysis of longitudinal data. By presenting the parameters as a combination of fixed effects and random effects, mixed-effects models incorporating both within- and between-subject variations are capable of improving parameter estimation. In this work, we demonstrate the feasibility of using a non-linear mixed-effects (NLME) approach for generating parametric images from medical imaging data of a single study. By assuming that all voxels in the image are independent, we used simulation and animal data to evaluate whether NLME can improve the voxel-wise parameter estimation. For testing purposes, intravoxel incoherent motion (IVIM) diffusion parameters including perfusion fraction, pseudo-diffusion coefficient and true diffusion coefficient were estimated using diffusion-weighted MR images and NLME through fitting the IVIM model. The conventional method of non-linear least squares (NLLS) was used as the standard approach for comparison of the resulting parametric images. In the simulated data, NLME provides more accurate and precise estimates of diffusion parameters compared with NLLS. Similarly, we found that NLME has the ability to improve the signal-to-noise ratio of parametric images obtained from rat brain data. These data have shown that it is feasible to apply NLME in parametric image generation, and the parametric image quality can be accordingly improved with the use of NLME. With the flexibility to be adapted to other models or modalities, NLME may become a useful tool to improve the parametric image quality in the future. Copyright © 2015 John Wiley & Sons, Ltd.
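For reference, the conventional NLLS baseline can be sketched with scipy: the bi-exponential IVIM signal S(b) = S0·[f·e^(-b·D*) + (1-f)·e^(-b·D)] is fit voxel-wise with curve_fit. The b-values and parameter values below are illustrative of typical DWI protocols, not taken from the study's rat data, and the fit uses noiseless synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, s0, f, d_star, d):
    """Intravoxel incoherent motion bi-exponential signal model."""
    return s0 * (f * np.exp(-b * d_star) + (1 - f) * np.exp(-b * d))

b_vals = np.array([0, 10, 20, 40, 80, 150, 300, 500, 800.0])   # s/mm^2
truth = (1.0, 0.10, 0.020, 0.0010)   # S0, perfusion fraction f, D*, D (illustrative)
signal = ivim(b_vals, *truth)

popt, _ = curve_fit(ivim, b_vals, signal,
                    p0=(1.0, 0.15, 0.015, 0.0008),
                    bounds=([0, 0, 0.003, 0], [2, 1, 0.1, 0.003]))
print(np.round(popt, 4))   # recovers the true parameters on noiseless data
```

With realistic noise, this voxel-independent NLLS fit degrades, which is the gap the NLME approach in the abstract is designed to close by pooling information across voxels.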
Manufacture and evaluation of Li/BCX DD cells
NASA Technical Reports Server (NTRS)
Meyer, S.; Takeuchi, E.
1990-01-01
This project is divided into four main tasks: cell manufacture, acceptance, and lot certification of cells, performance testing of cells, and abuse testing of cells. Lithium/bromine chloride in thionyl chloride (Li/BCX) 149 DD cells (PN 3B2085-XA) were built according to the provisions of Electrochem Industries Quality Plan 17096. Acceptance and lot certification testing was performed according to NASA JSC Document EP5-83-025, Revision B. Acceptance testing included open circuit and load voltage check, visual examination, size and weight measurements, and high temperature exposure. Lot certification tests were performed for capacity performance and for performance under conditions of thermal and electrical abuse. These tests included 149 C exposure, capacity discharge, fuse check, high temperature exposure, high rate discharge, short circuit, vibration, and overdischarge testing. A quantity of 200 cells was delivered to Johnson Space Center for life test evaluation. A parametric evaluation of the capacity discharge of Li/BCX DD cells was performed over a variety of temperatures and discharge rates. This testing served to map the performance capability of the cell. Tests were also performed over a variety of electrical and thermal abuse conditions. Abuse tests included short circuit, charging, overdischarge, high temperature exposure, shock, and vibration.
Distributional Tests for Gravitational Waves from Core-Collapse Supernovae
NASA Astrophysics Data System (ADS)
Szczepanczyk, Marek; LIGO Collaboration
2017-01-01
Core-Collapse Supernovae (CCSN) are spectacular and violent deaths of massive stars. CCSN are among the most interesting candidate sources of gravitational-wave (GW) transients. Current published results focus on methodologies to detect single unmodelled GW transients. Distributional tests, by contrast, have the advantage that they do not require a background for which we have an analytical model. Examples of non-parametric tests that will be compared are Kolmogorov-Smirnov, Mann-Whitney, chi-squared, and asymmetric chi-squared. I will present methodological results using publicly released LIGO-S6 data recolored to the design sensitivity of Advanced LIGO and time-lagged between interferometer sites so that the resulting coincident events are not GWs.
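Two of the named non-parametric two-sample tests are available directly in scipy. The sketch below applies them to synthetic data (not LIGO data): a stand-in "background" sample and a shifted "foreground" sample, which both tests flag as distributionally different:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.0, 500)    # stand-in for time-lagged background events
foreground = rng.normal(1.0, 1.0, 500)    # stand-in for a shifted signal population

ks = stats.ks_2samp(background, foreground)
mw = stats.mannwhitneyu(background, foreground, alternative="two-sided")
print(ks.pvalue, mw.pvalue)   # both extremely small: the distributions differ
```

The appeal for GW searches, as the abstract notes, is that neither test needs an analytical model of the background distribution; it is estimated empirically from time-lagged coincidences.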
Measurement of shower development and its Molière radius with a four-plane LumiCal test set-up
NASA Astrophysics Data System (ADS)
Abramowicz, H.; Abusleme, A.; Afanaciev, K.; Benhammou, Y.; Bortko, L.; Borysov, O.; Borysova, M.; Bozovic-Jelisavcic, I.; Chelkov, G.; Daniluk, W.; Dannheim, D.; Elsener, K.; Firlej, M.; Firu, E.; Fiutowski, T.; Ghenescu, V.; Gostkin, M.; Hempel, M.; Henschel, H.; Idzik, M.; Ignatenko, A.; Ishikawa, A.; Kananov, S.; Karacheban, O.; Klempt, W.; Kotov, S.; Kotula, J.; Kozhevnikov, D.; Kruchonok, V.; Krupa, B.; Kulis, Sz.; Lange, W.; Leonard, J.; Lesiak, T.; Levy, A.; Levy, I.; Lohmann, W.; Lukic, S.; Moron, J.; Moszczynski, A.; Neagu, A. T.; Nuiry, F.-X.; Pandurovic, M.; Pawlik, B.; Preda, T.; Rosenblat, O.; Sailer, A.; Schumm, B.; Schuwalow, S.; Smiljanic, I.; Smolyanskiy, P.; Swientek, K.; Terlecki, P.; Uggerhoj, U. I.; Wistisen, T. N.; Wojton, T.; Yamamoto, H.; Zawiejski, L.; Zgura, I. S.; Zhemchugov, A.
2018-02-01
A prototype of a luminometer, designed for a future e^+e^- collider detector, and consisting at present of a four-plane module, was tested in the CERN PS accelerator T9 beam. The objective of this beam test was to demonstrate a multi-plane tungsten/silicon operation, to study the development of the electromagnetic shower and to compare it with MC simulations. The Molière radius has been determined to be 24.0 ± 0.6 (stat.) ± 1.5 (syst.) mm using a parametrization of the shower shape. Very good agreement was found between data and a detailed Geant4 simulation.
NASA Astrophysics Data System (ADS)
Sahoo, Sushree S.; Singh, Vijay K.; Panda, Subrata K.
2015-02-01
Flexural behaviour of a cross-ply laminated woven Glass/Epoxy composite plate has been investigated in this article. Flexural responses are examined by a three-point bend test and a tensile test carried out on an INSTRON 5967 and a Universal Testing Machine INSTRON 1195, respectively. The finite element model is developed in ANSYS parametric design language code and discretised using an eight-noded structural shell element. Convergence behaviour of the simulation has been examined, and the model validated by comparing the results with experimental values. The effects of various parameters, such as the side-to-thickness ratio and the modular ratio, on the flexural behaviour of the woven Glass/Epoxy laminated composite plate are discussed in detail.
Development of advanced lightweight containment systems
NASA Technical Reports Server (NTRS)
Stotler, C.
1981-01-01
Parametric type data were obtained on advanced lightweight containment systems. These data were used to generate design methods and procedures necessary for the successful development of such systems. The methods were then demonstrated through the design of a lightweight containment system for a CF6 size engine. The containment concept evaluated consisted basically of a lightweight structural sandwich shell wrapped with dry Kevlar cloth. The initial testing was directed towards the determination of the amount of Kevlar required to result in threshold containment for a specific set of test conditions. A relationship was then developed between the thickness required and the energy of the released blade so that the data could be used to design for conditions other than those tested.
NASA Astrophysics Data System (ADS)
Gentry, D.; Amador, E. S.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Duca, Z. A.; Jacobsen, M. B.; Kirby, J.; McCaig, H. C.; Murukesan, G.; Rader, E.; Cullen, T.; Rennie, V.; Schwieterman, E. W.; Stevens, A. H.; Sutton, S. A.; Tan, G.; Yin, C.; Cullen, D.; Geppert, W.; Stockton, A. M.
2017-12-01
Studies in planetary analogue sites correlating remote imagery, mineralogy, and biomarker assay results help predict biomarker distribution and preservation. The FELDSPAR team has conducted five expeditions (2012-2017) to Icelandic Mars analogue sites with an increasingly refined battery of physicochemical measurements and biomarker assays. Two additional expeditions are planned; here we report intermediate results. The biomarker assays performed represent a diversity of potential biomarker types: ATP, cell counts, qPCR with domain-level primers, and DNA content. Mineralogical, chemical, and physical measurements and observations include temperature, pH, moisture content, and Raman, near-IR reflectance, and X-ray fluorescence spectra. Sites are geologically recent basaltic lava flows (Fimmvörðuháls, Eldfell, Holuhraun) and barren basaltic sand plains (Mælifellssandur, Dyngjusandur). All samples were 'homogeneous' at the 1 m to 1 km scale in apparent color, morphology, and grain size [1]. Sample locations were arranged in hierarchically nested grids at 10 cm, 1 m, 10 m, 100 m, and >1 km scales. Several measures of spatial distribution and variability were derived: unbiased sample variance, F- and pairwise t-tests with Bonferroni correction, and the non-parametric H- and u-tests. All assay results, including preliminary mineralogical information in the form of notable spectral bands, were then tested for correlation using the non-parametric Spearman's rank test [2]. For Fimmvörðuháls, four years of data were also examined for temporal trends. Biomarker quantification (other than cell count) was generally well correlated, although all assays showed notable variability even at the smallest examined spatial scale. Pairwise comparisons proved to be the most intuitive measure of variability; non-parametric characterization indicated trends at the >100 m scale, but required more replicates than were feasible at smaller scales.
Future work will integrate additional mineralogical measurements and more specialized modeling. [1] Amador, E. S. et al. (2015) Planet. Space Sci., 106 1-10. [2] Gentry, D. M. et al. (2017) Astrobio., in press.
NASA Technical Reports Server (NTRS)
Sutliff, Daniel L.
2014-01-01
The NASA Glenn Research Center's Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. A series of tests were performed primarily for the use of code validation and tool validation. Rotating Rake mode measurements were acquired for parametric sets of: (i) mode blockage, (ii) liner insertion loss, (iii) short ducts, and (iv) mode reflection.
Potable water recovery for spacecraft application by electrolytic pretreatment/air evaporation
NASA Technical Reports Server (NTRS)
Wells, G. W.
1975-01-01
A process for the recovery of potable water from urine using electrolytic pretreatment followed by distillation in a closed-cycle air evaporator has been developed and tested. Both the electrolytic pretreatment unit and the air evaporation unit are six-person, flight-concept prototype, automated units. Significantly extended wick lifetimes have been achieved in the air evaporation unit using electrolytically pretreated, as opposed to chemically pretreated, urine feed. Parametric test data are presented on product water quality, wick life, process power, maintenance requirements, and expendable requirements.
1978-12-01
tested at typical dual-flow cycle conditions. Models 9, 10, and 12 are similar in geometry to Models 6, 7, and 8; however, inner flow was regulated to...
Automation Hooks Architecture Trade Study for Flexible Test Orchestration
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Maclean, John R.; Graffagnino, Frank J.; McCartney, Patrick A.
2010-01-01
We describe the conclusions of a technology and communities survey supported by concurrent and follow-on proof-of-concept prototyping to evaluate feasibility of defining a durable, versatile, reliable, visible software interface to support strategic modularization of test software development. The objective is that test sets and support software with diverse origins, ages, and abilities can be reliably integrated into test configurations that assemble and tear down and reassemble with scalable complexity in order to conduct both parametric tests and monitored trial runs. The resulting approach is based on integration of three recognized technologies that are currently gaining acceptance within the test industry and that, when combined, provide a simple, open and scalable test orchestration architecture that addresses the objectives of the Automation Hooks task. The technologies are automated discovery using multicast DNS Zero Configuration Networking (zeroconf), commanding and data retrieval using resource-oriented RESTful web services, and XML data transfer formats based on Automatic Test Markup Language (ATML). This open-source standards-based approach provides direct integration with existing commercial off-the-shelf (COTS) analysis software tools.
Applying interprofessional Team-Based Learning in patient safety: a pilot evaluation study.
Lochner, Lukas; Girardi, Sandra; Pavcovich, Alessandra; Meier, Horand; Mantovan, Franco; Ausserhofer, Dietmar
2018-03-27
Interprofessional education (IPE) interventions are not always successful in achieving learning outcomes. Team-Based Learning (TBL) would appear to be a suitable pedagogical method for IPE, as it focuses on team performance; however, little is known about interprofessional TBL as an instructional framework for patient safety. In this pilot study, we aimed to (1) describe participants' reactions to TBL, (2) observe their achievement with respect to interprofessional education learning objectives, and (3) document their attitudinal shifts with regard to patient safety behaviours. We developed and implemented a three-day course for pre-qualifying, non-medical healthcare students to give instruction on non-technical skills related to 'learning from errors'. The course consisted of three sequential modules: 'Recognizing Errors', 'Analysing Errors', and 'Reporting Errors'. The evaluation took place within a quasi-experimental pre-test-post-test study design. Participants completed self-assessments through valid and reliable instruments such as Mennenga's TBL Student Assessment Instrument and the University of the West of England's Interprofessional Questionnaire. The mean scores of the individual readiness assurance tests were compared with the scores of the group readiness assurance test in order to explore whether students learned from each other during group discussions. Data were analysed using descriptive (i.e. mean, standard deviation), parametric (i.e. paired t-test), and non-parametric (i.e. Wilcoxon signed-rank test) methods. Thirty-nine students from five different bachelor's programs attended the course. The participants positively rated TBL as an instructional approach. All teams outperformed the mean score of their individual members during the readiness assurance process. We observed significant improvements in 'communication and teamwork' and 'interprofessional learning' but not in 'interprofessional interaction' and 'interprofessional relationships.' 
Findings on safety attitudes and behaviours were mixed. TBL was well received by the students. Our first findings indicate that interprofessional TBL seems to be a promising pedagogical method to achieve patient safety learning objectives. It is crucial to develop relevant clinical cases that involve all professions. Further research with larger sample sizes (e.g. including medical students) and more rigorous study designs (e.g. pre-test post-test with a control group) is needed to confirm our preliminary findings.
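The paired pre/post comparison described above (parametric paired t-test alongside its non-parametric Wilcoxon signed-rank counterpart) is a one-liner each in scipy. The sketch uses invented pre/post scores for ten learners, not the study's data:

```python
from scipy import stats

# Hypothetical pre- and post-course knowledge scores (one pair per student)
pre  = [60, 65, 70, 55, 62, 58, 66, 71, 59, 63]
post = [65, 68, 77, 59, 68, 60, 71, 79, 62, 67]

t_res = stats.ttest_rel(post, pre)   # parametric paired comparison
w_res = stats.wilcoxon(post, pre)    # non-parametric counterpart
print(t_res.pvalue, w_res.pvalue)    # both small: a consistent gain across pairs
```

Running both, as the authors did, is a common robustness check when normality of the paired differences is in doubt.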
Bowie, Christopher R.; Reichenberg, Abraham; McClure, Margaret M.; Leung, Winnie L.; Harvey, Philip D.
2008-01-01
Cognitive dysfunction is a common feature of schizophrenia; deficits are present before the onset of psychosis and are moderate to severe by the time of the first episode. Controversy exists over the course of cognitive dysfunction after the first episode. This study examined age-associated differences in performance on clinical neuropsychological (NP) and information processing tasks in a sample of geriatric community-living schizophrenia patients (n=172). Compared to healthy control subjects (n=70), people with schizophrenia did not differ on NP tests across age groups but showed evidence for age-associated cognitive worsening on the more complex components of an information-processing test. Age-related changes in cognitive function in schizophrenia may be a function of both the course of illness and the processing demands of the cognitive measure of interest. Tests with fixed difficulty, such as clinical NP tests, may differ in their sensitivity from tests for which parametric difficulty manipulations can be performed. PMID:18053687
Rocketdyne LOX bearing tester program
NASA Technical Reports Server (NTRS)
Keba, J. E.; Beatty, R. F.
1988-01-01
The cause or causes of the Space Shuttle Main Engine bearing ball wear were unknown; however, several mechanisms were suspected. Two testers were designed and built for operation in liquid oxygen to empirically gain insight into the problems and iterate solutions in a timely and cost efficient manner independent of engine testing. Schedules and test plans were developed that defined a test matrix consisting of parametric variations of loading, cooling or vapor margin, cage lubrication, material, and geometry studies. Initial test results indicated that the low pressure pump thrust bearing surface distress is a function of high axial load. Initial high pressure turbopump bearing tests produced the wear phenomenon observed in the turbopump and identified an inadequate vapor margin problem and a coolant flowrate sensitivity issue. These tests provided calibration data for analytical model predictions, giving high confidence in the positive impact of future turbopump design modifications for flight. Various modifications will be evaluated in these testers, since similar turbopump conditions can be produced and the benefit of each modification will be quantified in measured wear life comparisons.
Efficient logistic regression designs under an imperfect population identifier.
Albert, Paul S; Liu, Aiyi; Nansel, Tonja
2014-03-01
Motivated by actual study designs, this article considers efficient logistic regression designs where the population is identified with a binary test that is subject to diagnostic error. We consider the case where the imperfect test is obtained on all participants, while the gold standard test is measured on a small chosen subsample. Under maximum-likelihood estimation, we evaluate the optimal design in terms of sample selection as well as verification. We show that there may be substantial efficiency gains by choosing a small percentage of individuals who test negative on the imperfect test for inclusion in the sample (e.g., verifying 90% test-positive cases). We also show that a two-stage design may be a good practical alternative to a fixed design in some situations. Under optimal and nearly optimal designs, we compare maximum-likelihood and semi-parametric efficient estimators under correct and misspecified models with simulations. The methodology is illustrated with an analysis from a diabetes behavioral intervention trial. © 2013, The International Biometric Society.
Tests of gravity with future space-based experiments
NASA Astrophysics Data System (ADS)
Sakstein, Jeremy
2018-03-01
Future space-based tests of relativistic gravitation—laser ranging to Phobos, accelerometers in orbit, and optical networks surrounding Earth—will constrain the theory of gravity with unprecedented precision by testing the inverse-square law, the strong and weak equivalence principles, and the deflection and time delay of light by massive bodies. In this paper, we estimate the bounds that could be obtained on alternative gravity theories that use screening mechanisms to suppress deviations from general relativity in the Solar System: chameleon, symmetron, and Galileon models. We find that space-based tests of the parametrized post-Newtonian parameter γ will constrain chameleon and symmetron theories to new levels, and that tests of the inverse-square law using laser ranging to Phobos will provide the most stringent constraints on Galileon theories to date. We end by discussing the potential for constraining these theories using upcoming tests of the weak equivalence principle, and conclude that further theoretical modeling is required in order to fully utilize the data.
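In the parametrized post-Newtonian framework, the deflection of light at impact parameter b is δθ = (1+γ)/2 · 4GM/(c²b); general relativity fixes γ = 1, so a ray grazing the solar limb is bent by the classic ~1.75 arcsec. A quick numerical check with standard constants (this is textbook PPN, not the paper's forecast calculations):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius (grazing impact parameter), m
gamma = 1.0          # PPN parameter; general relativity predicts gamma = 1

deflection_rad = (1 + gamma) / 2 * 4 * G * M_sun / (c**2 * R_sun)
deflection_arcsec = math.degrees(deflection_rad) * 3600
print(round(deflection_arcsec, 2))   # ≈ 1.75 arcsec at the solar limb
```

Space-based measurements of this bending (and of the associated Shapiro time delay) constrain γ, which is exactly the parameter the screened theories discussed in the abstract must protect from large deviations.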
Hong, Chuan; Chen, Yong; Ning, Yang; Wang, Shuang; Wu, Hao; Carroll, Raymond J
2017-01-01
Motivated by analyses of DNA methylation data, we propose a semiparametric mixture model, namely the generalized exponential tilt mixture model, to account for heterogeneity between differentially methylated and non-differentially methylated subjects in the cancer group, and capture the differences in higher order moments (e.g. mean and variance) between subjects in cancer and normal groups. A pairwise pseudolikelihood is constructed to eliminate the unknown nuisance function. To circumvent boundary and non-identifiability problems as in parametric mixture models, we modify the pseudolikelihood by adding a penalty function. In addition, the test with simple asymptotic distribution has computational advantages compared with permutation-based test for high-dimensional genetic or epigenetic data. We propose a pseudolikelihood based expectation-maximization test, and show the proposed test follows a simple chi-squared limiting distribution. Simulation studies show that the proposed test controls Type I errors well and has better power compared to several current tests. In particular, the proposed test outperforms the commonly used tests under all simulation settings considered, especially when there are variance differences between two groups. The proposed test is applied to a real data set to identify differentially methylated sites between ovarian cancer subjects and normal subjects.
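The computational contrast drawn above (a test with a simple chi-squared limiting distribution versus a permutation-based test) can be illustrated generically: a permutation test must recompute the statistic for every resampled group labeling, whereas an asymptotic test evaluates one statistic against a fixed reference distribution. The sketch below is a hedged, generic permutation test for a two-group mean difference on invented data, not the authors' penalized pseudolikelihood EM test:

```python
import numpy as np

def permutation_pvalue(x, y, n_perm=2000, seed=0):
    """Two-sided permutation p-value for the difference in group means."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    observed = abs(x.mean() - y.mean())
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                     # re-label groups at random
        diff = abs(pooled[: len(x)].mean() - pooled[len(x):].mean())
        count += diff >= observed
    return (count + 1) / (n_perm + 1)           # add-one to avoid a zero p-value

rng = np.random.default_rng(1)
normal_grp = rng.normal(0.0, 1.0, 40)
cancer_grp = rng.normal(1.2, 1.5, 40)           # mean and variance both differ
p_val = permutation_pvalue(normal_grp, cancer_grp)
print(p_val)   # small: the groups clearly differ
```

For genome-scale methylation data this n_perm-fold recomputation is repeated at every site, which is why a closed-form chi-squared calibration is a substantial practical advantage.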
NASA Astrophysics Data System (ADS)
Khai Tiu, Ervin Shan; Huang, Yuk Feng; Ling, Lloyd
2018-03-01
An accurate streamflow forecasting model is important for the development of a flood mitigation plan to ensure sustainable development of a river basin. This study adopted the Variational Mode Decomposition (VMD) data-preprocessing technique to process and denoise the rainfall data before feeding it into the Support Vector Machine (SVM) streamflow forecasting model, in order to improve the performance of the selected model. Rainfall data and river water level data for the period 1996-2016 were used for this purpose. Homogeneity tests (the Standard Normal Homogeneity Test, the Buishand Range Test, the Pettitt Test and the Von Neumann Ratio Test) and normality tests (the Shapiro-Wilk Test, the Anderson-Darling Test, the Lilliefors Test and the Jarque-Bera Test) were carried out on the rainfall series. All stations yielded homogeneous but non-normally distributed data. The recorded rainfall data showed that the Dungun River Basin receives higher monthly rainfall from November to February, during the Northeast Monsoon. Thus, the monthly and seasonal rainfall series of this monsoon were the main focus of this research, as floods usually happen during the Northeast Monsoon period. The predicted water levels from the SVM model were assessed against the observed water levels using non-parametric statistical tests (the Biased Method, Kendall's Tau B Test and Spearman's Rho Test).
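Several of the normality tests named are available directly in scipy. A hedged sketch applies Shapiro-Wilk, Jarque-Bera, and Anderson-Darling to a synthetic right-skewed series (rainfall-like exponential draws, not the Dungun data); all three reject normality, which is the same conclusion the study reached for its stations:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rainfall_like = rng.exponential(scale=120.0, size=500)   # skewed, non-negative series

sw_stat, sw_p = stats.shapiro(rainfall_like)
jb_stat, jb_p = stats.jarque_bera(rainfall_like)
ad = stats.anderson(rainfall_like, dist="norm")

print(sw_p, jb_p)                             # both far below 0.01: normality rejected
print(ad.statistic, ad.critical_values[-1])   # statistic exceeds even the 1% critical value
```

(The Lilliefors variant of the KS test is in statsmodels rather than scipy.) Non-normal, skewed series like these are one motivation for the rank-based evaluation statistics used later in the study.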
Patient acceptance of non-invasive testing for fetal aneuploidy via cell-free fetal DNA.
Vahanian, Sevan A; Baraa Allaf, M; Yeh, Corinne; Chavez, Martin R; Kinzler, Wendy L; Vintzileos, Anthony M
2014-01-01
To evaluate factors associated with patient acceptance of noninvasive prenatal testing for trisomy 21, 18 and 13 via cell-free fetal DNA. This was a retrospective study of all patients who were offered noninvasive prenatal testing at a single institution from 1 March 2012 to 2 July 2012. Patients were identified through our perinatal ultrasound database; demographic information, testing indication and insurance coverage were compared between patients who accepted the test and those who declined. Parametric and nonparametric tests were used as appropriate. Significant variables were assessed using multivariate logistic regression. The value p < 0.05 was considered significant. Two hundred thirty-five patients were offered noninvasive prenatal testing. Ninety-three patients (40%) accepted testing and 142 (60%) declined. Women who accepted noninvasive prenatal testing were more commonly white, had private insurance and had more than one testing indication. There was no statistical difference in the number or the type of testing indications. Multivariable logistic regression analysis was then used to assess individual variables. After controlling for race, patients with public insurance were 83% less likely to accept noninvasive prenatal testing than those with private insurance (3% vs. 97%, adjusted RR 0.17, 95% CI 0.05-0.62). In our population, having public insurance was the factor most strongly associated with declining noninvasive prenatal testing.
Bignardi, A B; El Faro, L; Torres Júnior, R A A; Cardoso, V L; Machado, P F; Albuquerque, L G
2011-10-31
We analyzed 152,145 test-day records from 7317 first lactations of Holstein cows recorded from 1995 to 2003. Our objective was to model variations in test-day milk yield during the first lactation of Holstein cows by random regression model (RRM), using various functions in order to obtain adequate and parsimonious models for the estimation of genetic parameters. Test-day milk yields were grouped into weekly classes of days in milk, ranging from 1 to 44 weeks. The contemporary groups were defined as herd-test-day. The analyses were performed using a single-trait RRM, including the direct additive, permanent environmental and residual random effects. In addition, contemporary group and linear and quadratic effects of the age of cow at calving were included as fixed effects. The mean trend of milk yield was modeled with a fourth-order orthogonal Legendre polynomial. The additive genetic and permanent environmental covariance functions were estimated by random regression on two parametric functions, Ali and Schaeffer and Wilmink, and on B-spline functions of days in milk. The covariance components and the genetic parameters were estimated by the restricted maximum likelihood method. Results from RRM parametric and B-spline functions were compared to RRM on Legendre polynomials and with a multi-trait analysis, using the same data set. Heritability estimates presented similar trends during mid-lactation (13 to 31 weeks) and between week 37 and the end of lactation, for all RRM. Heritabilities obtained by multi-trait analysis were of a lower magnitude than those estimated by RRM. The RRMs with a higher number of parameters were more useful to describe the genetic variation of test-day milk yield throughout the lactation. RRM using B-spline and Legendre polynomials as base functions appears to be the most adequate to describe the covariance structure of the data.
NASA Astrophysics Data System (ADS)
Foyo-Moreno, I.; Vida, J.; Olmo, F. J.; Alados-Arboledas, L.
2000-11-01
Since the discovery of ozone depletion over Antarctica and of the globally declining trend in stratospheric ozone concentration, public and scientific concern has grown over recent decades. A very important consequence is the increase in broadband and spectral UV radiation in the environment and the biological effects and health risks that may follow in the near future. The absence of widespread measurements of this radiometric flux has led to the development and use of alternative estimation procedures such as parametric approaches. Parametric models compute the radiant energy using available atmospheric parameters. Some parametric models compute the global solar irradiance at surface level by addition of its direct beam and diffuse components. In the present work, we have developed a comparison between two cloudless-sky parametrization schemes. Both methods provide an estimation of the solar spectral irradiance that can be integrated spectrally within the limits of interest. For this test we have used data recorded at a radiometric station located at Granada (37.180°N, 3.580°W, 660 m a.m.s.l.), an inland location. The database includes hourly values of the relevant variables covering the years 1994-95. The performance of the models has been tested in relation to their predictive capability for global solar irradiance in the UV range (290-385 nm). After our study, it appears that information concerning aerosol radiative effects is fundamental in order to obtain a good estimation. The original version of SPCTRAL2 provides estimates of the experimental values with negligible mean bias deviation. This suggests not only the appropriateness of the model but also the suitability of its fixed aerosol features for Granada conditions. The SMARTS2 model offers increased flexibility concerning the selection of different aerosol models included in the code and provides the best results when the selected models are those considered as urban.
Although SMARTS2 provides slightly worse results, both models give estimates of solar ultraviolet irradiance with a mean bias deviation below 5% and a root mean square deviation close to the experimental errors.
NASA Technical Reports Server (NTRS)
Groll, M.; Pittman, R. B.; Eninger, J. E.
1975-01-01
A recently developed, potentially high-performance nonarterial wick has been extensively tested. This slab wick has an axially varying porosity which can be tailored to match the local stress imposed on the wick. The purpose of the tests was to establish the usefulness of the graded-porosity slab wick at cryogenic temperatures between 110 K and 260 K, with methane and ethane as working fluids. For comparison, a homogeneous (i.e., uniform porosity) slab wick was also tested. The tests included: (1) maximum heat pipe performance as a function of fluid inventory, (2) maximum performance as a function of operating temperature, (3) maximum performance as a function of evaporator elevation, and (4) influence of slab wick orientation on performance. The experimental data were compared with theoretical predictions obtained with the computer program GRADE.
NASA Technical Reports Server (NTRS)
Groll, M.; Pittman, R. B.; Eninger, J. E.
1976-01-01
A recently developed, potentially high-performance nonarterial wick was extensively tested. This slab wick has an axially varying porosity which can be tailored to match the local stress imposed on the wick. The purpose of the tests was to establish the usefulness of the graded-porosity slab wick at cryogenic temperatures between 110 and 260 K, with methane and ethane as working fluids. For comparison, a homogeneous (i.e., uniform porosity) slab wick was also tested. The tests included: maximum heat pipe performance as a function of fluid inventory, maximum performance as a function of operating temperature, maximum performance as a function of evaporator elevation, and influence of slab wick orientation on performance. The experimental data were compared with theoretical predictions obtained with the GRADE computer program.
Generalizations and Extensions of the Probability of Superiority Effect Size Estimator
ERIC Educational Resources Information Center
Ruscio, John; Gera, Benjamin Lee
2013-01-01
Researchers are strongly encouraged to accompany the results of statistical tests with appropriate estimates of effect size. For 2-group comparisons, a probability-based effect size estimator ("A") has many appealing properties (e.g., it is easy to understand, robust to violations of parametric assumptions, and insensitive to outliers). We review…
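A minimal sketch of the probability-of-superiority estimator A for a 2-group comparison: the proportion of cross-group pairs in which a group-1 score exceeds a group-2 score, with ties counted as 1/2. The example data are illustrative.

```python
def prob_superiority(group1, group2):
    """Estimate A = P(X1 > X2) + 0.5 * P(X1 == X2) over all cross-group pairs."""
    greater = sum(x > y for x in group1 for y in group2)
    ties = sum(x == y for x in group1 for y in group2)
    return (greater + 0.5 * ties) / (len(group1) * len(group2))

# Two small overlapping groups: group 1 tends to score higher.
a = prob_superiority([4, 5, 6, 7], [3, 4, 5, 6])
print(a)  # 0.71875
```

A value of 0.5 indicates no separation; 1.0 indicates complete superiority of group 1, which matches the estimator's easy probabilistic reading noted in the abstract.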
Machine Learning Based Evaluation of Reading and Writing Difficulties.
Iwabuchi, Mamoru; Hirabayashi, Rumi; Nakamura, Kenryu; Dim, Nem Khan
2017-01-01
The possibility of automatic evaluation of reading and writing difficulties was investigated using a non-parametric machine learning (ML) regression technique on URAWSS (Understanding Reading and Writing Skills of Schoolchildren) [1] test data from 168 children in grades 1-9. The results showed that the ML approach predicted better than the ordinary rule-based decision.
Cost estimating methods for advanced space systems
NASA Technical Reports Server (NTRS)
Cyr, Kelley
1988-01-01
The development of parametric cost estimating methods for advanced space systems in the conceptual design phase is discussed, including the process of identifying variables which drive cost and the relationship between weight and cost. A theoretical model of cost is developed and tested using a historical database of research and development projects.
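A hedged sketch of a classic weight-based parametric cost estimating relationship (CER) of the form cost = a * weight^b, fitted in log-log space by least squares. The data points and units are purely illustrative, not from the report's historical database.

```python
import numpy as np

# Hypothetical historical projects: dry weight (kg) vs. development cost ($M).
weight = np.array([100.0, 250.0, 500.0, 1200.0, 3000.0])
cost = np.array([12.0, 25.0, 44.0, 90.0, 190.0])

# Fit log(cost) = log(a) + b * log(weight); b < 1 implies economy of scale.
b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
a = np.exp(log_a)
print(f"CER: cost ~ {a:.2f} * weight^{b:.2f}")

# Predict the cost of a hypothetical 800 kg system.
print(f"Predicted cost at 800 kg: {a * 800.0**b:.1f} $M")
```

The exponent b summarizes how strongly weight drives cost, which is the kind of variable-screening question the abstract describes.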
This commentary is the second of a series outlining one specific concept in interpreting biomarkers data. In the first, an observational method was presented for assessing the distribution of measurements before making parametric calculations. Here, the discussion revolves around...
Preliminary Multi-Variable Parametric Cost Model for Space Telescopes
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Hendrichs, Todd
2010-01-01
This slide presentation reviews the creation of a preliminary multi-variable cost model for the contract costs of making a space telescope. It discusses the data collection methodology, the definition of the statistical analysis methodology, single-variable model results, the testing of historical models, and an introduction to the multi-variable models.
Force Project Technology Presentation to the NRCC
2014-02-04
Functional Bridge components; Smart Odometer; Adv Pretreatment; Smart Bridge; Multi-functional Gap Crossing; Fuel Automated Tracking System; Adv...comprehensive matrix of candidate composite material systems and textile reinforcement architectures via modeling/analyses and testing. Product(s)...Validated Dynamic Modeling tool based on a parametric study using material models to reliably predict the textile mechanics of the hose
Learning Patterns as Criterion for Forming Work Groups in 3D Simulation Learning Environments
ERIC Educational Resources Information Center
Maria Cela-Ranilla, Jose; Molías, Luis Marqués; Cervera, Mercè Gisbert
2016-01-01
This study analyzes the relationship between the use of learning patterns as a grouping criterion to develop learning activities in the 3D simulation environment at University. Participants included 72 Spanish students from the Education and Marketing disciplines. Descriptive statistics and non-parametric tests were conducted. The process was…
Noncontact measurement of vibration using airborne ultrasound.
Mater, O B; Remenieras, J P; Bruneel, C; Roncin, A; Patat, F
1998-01-01
A noncontact ultrasonic method for measuring the surface normal vibration of objects was studied. The instrument consists of a pair of 420 kHz ultrasonic air transducers: one emits ultrasound toward the moving surface, and the other receives the ultrasound reflected from the object under test. Two effects induce a phase modulation on the received signal. The first results from the variation of the round-trip time interval tau required for the wavefront to travel from the emitter to the moving surface and back to the receiver; this is the Doppler effect, directly proportional to the surface displacement. The second results from the nonlinear parametric interaction of the ultrasonic beams (forward and backward) with the low-frequency sound field emitted into the air by the vibrating surface. This latter phenomenon, which is a volume effect, is proportional to the velocity of the vibrating surface and increases with the distance between the transducers and the surface under test. The relative contributions of the Doppler and parametric effects are evaluated, and both have to be taken into account for ultrasonic interferometry in air.
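A hedged back-of-the-envelope calculation for the Doppler (path-length) term described above: a surface displacement d changes the round-trip path by 2d, giving a phase modulation of 2*k*d on the 420 kHz carrier. The displacement value and air temperature are illustrative assumptions, not from the paper.

```python
import math

f_carrier = 420e3   # Hz, ultrasonic carrier frequency (from the abstract)
c_air = 343.0       # m/s, speed of sound in air (approx., 20 degC, assumed)
d = 1e-6            # m, peak surface displacement (assumed)

k = 2 * math.pi * f_carrier / c_air   # acoustic wavenumber in air
phase_mod = 2 * k * d                 # rad, round-trip phase excursion
print(f"peak phase modulation = {phase_mod * 1e3:.2f} mrad")
```

Even a 1 micrometre displacement produces a phase excursion of roughly 15 mrad at this carrier frequency, which is why phase demodulation of the received signal can resolve small vibrations.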
Aviation-fuel property effects on combustion
NASA Technical Reports Server (NTRS)
Rosfjord, T. J.
1984-01-01
The fuel chemical property influence on a gas turbine combustor was studied using 25 test fuels. Fuel physical properties were de-emphasized by using fuel injectors which produce highly atomized, and hence rapidly vaporizing, sprays. A substantial fuel spray characterization effort was conducted to allow selection of nozzles which assured that such sprays were achieved for all fuels. The fuels were specified to cover the following wide ranges of chemical properties: hydrogen, 9.1 to 15 (wt) pct; total aromatics, 0 to 100 (vol) pct; and naphthalene, 0 to 30 (vol) pct. Standard fuels (e.g., Jet A, JP4), specialty products (e.g., decalin, xylene tower bottoms) and special fuel blends were included. The latter group included six 4-component blends prepared to achieve parametric variations in fuel hydrogen, total aromatics and naphthalene contents. The principal influences of fuel chemical properties on combustor behavior were reflected by the radiation, liner temperature, and exhaust smoke number (or equivalently, soot number density) data. Test results indicated that naphthalene content strongly influenced the radiative heat load, while parametric variations in total aromatics did not.
Parametric analysis of parameters for electrical-load forecasting using artificial neural networks
NASA Astrophysics Data System (ADS)
Gerber, William J.; Gonzalez, Avelino J.; Georgiopoulos, Michael
1997-04-01
Accurate total system electrical load forecasting is a necessary part of resource management for power generation companies. The better the hourly load forecast, the more closely the power generation assets of the company can be configured to minimize the cost. Automating this process is a profitable goal and neural networks should provide an excellent means of doing the automation. However, prior to developing such a system, the optimal set of input parameters must be determined. The approach of this research was to determine what those inputs should be through a parametric study of potentially good inputs. Input parameters tested were ambient temperature, total electrical load, the day of the week, humidity, dew point temperature, daylight savings time, length of daylight, season, forecast light index and forecast wind velocity. For testing, a limited number of temperatures and total electrical loads were used as a basic reference input parameter set. Most parameters showed some forecasting improvement when added individually to the basic parameter set. Significantly, major improvements were exhibited with the day of the week, dew point temperatures, additional temperatures and loads, forecast light index and forecast wind velocity.
A parametric study of fracture toughness of fibrous composite materials
NASA Technical Reports Server (NTRS)
Poe, C. C., Jr.
1987-01-01
Impacts to fibrous composite laminates by objects at low velocities can break fibers, giving crack-like damage. The damage may not extend completely through a thick laminate. The tension strength of these damaged laminates is reduced much like that of cracked metals. The fracture toughness depends on fiber and matrix properties, fiber orientations, and stacking sequence. Accordingly, a parametric study was made to determine how fiber and matrix properties and fiber orientations affect fracture toughness and notch sensitivity. The values of fracture toughness were predicted from the elastic constants of the laminate and the failing strain of the fibers using a general fracture toughness parameter developed previously. For a variety of laminates, values of fracture toughness from tests of center-cracked specimens and values of residual strength from tests of thick laminates with surface cracks were compared to the predictions to give credibility to the study. In contrast to the usual behavior of metals, it is shown that both the ultimate tensile strength and the fracture toughness of composites can be increased without increasing notch sensitivity.
Degradation of Leakage Currents and Reliability Prediction for Tantalum Capacitors
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2016-01-01
Two types of failures in solid tantalum capacitors, catastrophic and parametric, and their mechanisms are described. Analysis of voltage and temperature reliability acceleration factors reported in literature shows a wide spread of results and requires more investigation. In this work, leakage currents in two types of chip tantalum capacitors were monitored during highly accelerated life testing (HALT) at different temperatures and voltages. Distributions of degradation rates were approximated using a general log-linear Weibull model and yielded voltage acceleration constants B = 9.8 +/- 0.5 and 5.5. The activation energies were Ea = 1.65 eV and 1.42 eV. The model allows for conservative estimations of times to failure and was validated by long-term life test data. Parametric degradation and failures are reversible and can be annealed at high temperatures. The process is attributed to migration of charged oxygen vacancies that reduce the barrier height at the MnO2/Ta2O5 interface and increase injection of electrons from the MnO2 cathode. Analysis showed that the activation energy of the vacancies' migration is 1.1 eV.
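A hedged sketch of an Arrhenius temperature-acceleration factor using the oxygen-vacancy migration energy quoted above (Ea = 1.1 eV). The use and HALT temperatures are illustrative assumptions, not values from the paper, and the voltage-acceleration term is omitted.

```python
import math

K_B = 8.617e-5    # Boltzmann constant, eV/K
EA = 1.1          # eV, activation energy of vacancy migration (from the abstract)
T_USE = 358.15    # K (85 degC, assumed use condition)
T_HALT = 438.15   # K (165 degC, assumed HALT condition)

# Arrhenius acceleration factor: ratio of time-to-failure at use vs. HALT.
af = math.exp(EA / K_B * (1.0 / T_USE - 1.0 / T_HALT))
print(f"temperature acceleration factor = {af:.0f}")
```

With an activation energy this high, an 80 K increase in test temperature accelerates degradation by several hundred times, which is what makes HALT-based extrapolation of long-term life practical.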
Religiousness as a factor of hesitation against doping behavior in college-age athletes.
Zenic, Natasa; Stipic, Marija; Sekulic, Damir
2013-06-01
Religiousness is rarely studied as a protective factor against substance use and misuse in sport. Further, we have found no investigation in which college-age athletes were sampled and studied accordingly. The aim of the present study was to identify gender-specific protective effects of religiousness (measured by the Santa Clara Questionnaire) and other social, educational, and sport variables as potential factors of hesitation against doping behavior in sport-science students from Mostar, Bosnia and Herzegovina (51 women and 111 men; age range 18-26 years). Gender differences for the non-parametric variables were established by the Kruskal-Wallis test, while for the parametric variables the t-test for independent samples was used. Multiple regression calculations revealed religiousness as the most significant predictor of the social, health, sport and legal factors of hesitation against doping behavior in both genders. However, a differential influence of the social, educational, sport and religious factors in relation to negative consequences of doping behavior was found for men and women. This differential influence must be emphasized in tailoring anti-doping policy and interventions.
Endurance Test and Evaluation of Alkaline Water Electrolysis Cells
NASA Technical Reports Server (NTRS)
Kovach, Andrew J.; Schubert, Franz H.; Chang, B. J.; Larkins, Jim T.
1985-01-01
The overall objective of this program is to assess the state of alkaline water electrolysis cell technology and its potential as part of a Regenerative Fuel Cell System (RFCS) of a multikilowatt orbiting powerplant. The program evaluates the endurance capabilities of alkaline electrolyte water electrolysis cells under various operating conditions, including constant condition testing, cyclic testing and high pressure testing. The RFCS demanded the scale-up of existing cell hardware from 0.1 sq ft active electrode area to 1.0 sq ft active electrode area. A single water electrolysis cell and two six-cell modules of 1.0 sq ft active electrode area were designed and fabricated. The two six-cell 1.0 sq ft modules incorporate 1.0 sq ft utilized cores, which allow for minimization of module assembly complexity and increased tolerance to pressure differential. A water electrolysis subsystem was designed and fabricated to allow testing of the six-cell modules. After completing checkout, shakedown, design verification and parametric testing, a module was incorporated into the Regenerative Fuel Cell System Breadboard (RFCSB) for testing at Life Systems, Inc., and at NASA JSC.
An ROC-type measure of diagnostic accuracy when the gold standard is continuous-scale.
Obuchowski, Nancy A
2006-02-15
ROC curves and summary measures of accuracy derived from them, such as the area under the ROC curve, have become the standard for describing and comparing the accuracy of diagnostic tests. Methods for estimating ROC curves rely on the existence of a gold standard which dichotomizes patients into disease present or absent. There are, however, many examples of diagnostic tests whose gold standards are not binary-scale, but rather continuous-scale. Unnatural dichotomization of these gold standards leads to bias and inconsistency in estimates of diagnostic accuracy. In this paper, we propose a non-parametric estimator of diagnostic test accuracy which does not require dichotomization of the gold standard. This estimator has an interpretation analogous to the area under the ROC curve. We propose a confidence interval for test accuracy and a statistical test for comparing accuracies of tests from paired designs. We compare the performance (i.e. CI coverage, type I error rate, power) of the proposed methods with several alternatives. An example is presented where the accuracies of two quick blood tests for measuring serum iron concentrations are estimated and compared.
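A hedged sketch of a concordance-type statistic in the spirit of the ROC-area analogue described above: the proportion of patient pairs that the diagnostic test orders the same way as the continuous-scale gold standard, with ties counted as 1/2. This is an illustration of the idea, not the paper's exact estimator; the data are made up.

```python
from itertools import combinations

def concordance(test_vals, gold_vals):
    """Proportion of pairs ordered concordantly by test and gold standard."""
    conc = ties = total = 0
    for i, j in combinations(range(len(test_vals)), 2):
        dt = test_vals[i] - test_vals[j]
        dg = gold_vals[i] - gold_vals[j]
        total += 1
        if dt * dg > 0:       # same ordering in both
            conc += 1
        elif dt == 0 or dg == 0:
            ties += 1
    return (conc + 0.5 * ties) / total

# Hypothetical quick-test readings vs. reference serum iron concentrations.
print(concordance([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.5, 3.4]))  # ~0.833
```

Like the area under an ROC curve, the statistic equals 0.5 for a useless test and 1.0 for a test that ranks all patients exactly as the gold standard does, and it needs no dichotomization of the gold standard.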
An Adaptive Genetic Association Test Using Double Kernel Machines.
Zhan, Xiang; Epstein, Michael P; Ghosh, Debashis
2015-10-01
Recently, gene set-based approaches have become very popular in gene expression profiling studies for assessing how genetic variants are related to disease outcomes. Since most genes are not differentially expressed, existing pathway tests considering all genes within a pathway suffer from considerable noise and power loss. Moreover, for a differentially expressed pathway, it is of interest to select important genes that drive the effect of the pathway. In this article, we propose an adaptive association test using double kernel machines (DKM), which can both select important genes within the pathway as well as test for the overall genetic pathway effect. This DKM procedure first uses the garrote kernel machines (GKM) test for the purposes of subset selection and then the least squares kernel machine (LSKM) test for testing the effect of the subset of genes. An appealing feature of the kernel machine framework is that it can provide a flexible and unified method for multi-dimensional modeling of the genetic pathway effect allowing for both parametric and nonparametric components. This DKM approach is illustrated with application to simulated data as well as to data from a neuroimaging genetics study.
An appraisal of statistical procedures used in derivation of reference intervals.
Ichihara, Kiyoshi; Boyd, James C
2010-11-01
When conducting studies to derive reference intervals (RIs), various statistical procedures are commonly applied at each step, from the planning stages to final computation of RIs. Determination of the necessary sample size is an important consideration, and evaluation of at least 400 individuals in each subgroup has been recommended to establish reliable common RIs in multicenter studies. Multiple regression analysis allows identification of the most important factors contributing to variation in test results, while accounting for possible confounding relationships among these factors. Of the various approaches proposed for judging the necessity of partitioning reference values, nested analysis of variance (ANOVA) is the likely method of choice owing to its ability to handle multiple groups and to adjust for multiple factors. The Box-Cox power transformation often has been used to transform data to a Gaussian distribution for parametric computation of RIs. However, this transformation occasionally fails. Therefore, the non-parametric method, based on determination of the 2.5th and 97.5th percentiles following sorting of the data, has been recommended for general use. The performance of the Box-Cox transformation can be improved by introducing an additional parameter representing the origin of transformation. In simulations, the confidence intervals (CIs) of reference limits (RLs) calculated by the parametric method were narrower than those calculated by the non-parametric approach. However, the margin of difference was rather small owing to additional variability in parametrically determined RLs introduced by estimation of parameters for the Box-Cox transformation. The parametric calculation method may have an advantage over the non-parametric method in allowing identification and exclusion of extreme values during RI computation.
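A hedged sketch of the two RI computations discussed above: the non-parametric 2.5th/97.5th percentile method, and a parametric method that Box-Cox transforms the data toward Gaussian before taking mean +/- 1.96 SD (using SciPy's one-parameter transform, not the paper's modified version with a shifted origin). The "analyte" values are synthetic.

```python
import numpy as np
from scipy import stats

# Synthetic skewed lab results for 400 reference individuals.
rng = np.random.default_rng(1)
values = rng.lognormal(mean=1.0, sigma=0.4, size=400)

# Non-parametric RI: sort and take the 2.5th and 97.5th percentiles.
np_lo, np_hi = np.percentile(values, [2.5, 97.5])

# Parametric RI: Box-Cox transform, compute limits in transformed space,
# then back-transform to the original scale.
transformed, lam = stats.boxcox(values)
lo_t = transformed.mean() - 1.96 * transformed.std(ddof=1)
hi_t = transformed.mean() + 1.96 * transformed.std(ddof=1)

def back(y):
    """Inverse Box-Cox: x = (lam*y + 1)^(1/lam), or exp(y) when lam == 0."""
    return (lam * y + 1) ** (1 / lam) if lam != 0 else np.exp(y)

print(f"non-parametric RI: ({np_lo:.2f}, {np_hi:.2f})")
print(f"parametric RI:     ({back(lo_t):.2f}, {back(hi_t):.2f})")
```

On well-behaved data the two intervals agree closely; the parametric route mainly pays off through narrower confidence intervals on the limits, as the abstract notes.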
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ray, Jaideep; Lee, Jina; Lefantzi, Sophia
The estimation of fossil-fuel CO2 emissions (ffCO2) from limited ground-based and satellite measurements of CO2 concentrations will form a key component of the monitoring of treaties aimed at the abatement of greenhouse gas emissions. To that end, we construct a multiresolution spatial parametrization for fossil-fuel CO2 emissions (ffCO2), to be used in atmospheric inversions. Such a parametrization does not currently exist. The parametrization uses wavelets to accurately capture the multiscale, nonstationary nature of ffCO2 emissions and employs proxies of human habitation, e.g., images of lights at night and maps of built-up areas, to reduce the dimensionality of the multiresolution parametrization. The parametrization is used in a synthetic data inversion to test its suitability for use in atmospheric inverse problems. This linear inverse problem is predicated on observations of ffCO2 concentrations collected at measurement towers. We adapt a convex optimization technique, commonly used in the reconstruction of compressively sensed images, to perform sparse reconstruction of the time-variant ffCO2 emission field. We also borrow concepts from compressive sensing to impose boundary conditions, i.e., to limit ffCO2 emissions within an irregularly shaped region (the United States, in our case). We find that the optimization algorithm performs a data-driven sparsification of the spatial parametrization and retains only those wavelets whose weights could be estimated from the observations. Further, our method for the imposition of boundary conditions leads to a roughly 10-fold computational saving over conventional means of doing so. We conclude with a discussion of the accuracy of the estimated emissions and the suitability of the spatial parametrization for use in inverse problems with a significant degree of regularization.
Parametric Thermal Models of the Transient Reactor Test Facility (TREAT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley K. Heath
2014-03-01
This work supports the restart of transient testing in the United States using the Department of Energy's Transient Reactor Test Facility at the Idaho National Laboratory. It also supports the Global Threat Reduction Initiative by reducing the proliferation risk of high-enriched uranium fuel. The work involves the creation of a nuclear fuel assembly model using the fuel performance code known as BISON. The model simulates the thermal behavior of a nuclear fuel assembly during steady-state and transient operational modes. Additional models of the same geometry but differing material properties were created to perform parametric studies. The results show that fuel and cladding thermal conductivity have the greatest effect on fuel temperature in the steady-state operational mode, while fuel density and fuel specific heat have the greatest effect in the transient operational mode. When considering a new fuel type, it is recommended to use materials that decrease the specific heat of the fuel and the thermal conductivity of the fuel's cladding in order to deal with the higher-density fuels that accompany the LEU conversion process. Data on the latest operating conditions of TREAT need to be obtained in order to validate BISON's results. BISON's models for TREAT (material models, boundary convection models) are modest and need additional work to ensure accuracy and confidence in the results.
Zhang, Chen; Wang, Yuan; Song, Xiaowei; Kubota, Jumpei; He, Yanmin; Tojo, Junji; Zhu, Xiaodong
2017-12-31
This paper concentrates on a Chinese context and develops an integrated process to explicitly elucidate the relationship between economic growth and water pollution discharge, measured as chemical oxygen demand (COD) and ammonia nitrogen (NH3-N) discharge, using two unbalanced panel data sets covering the periods 1990-2014 and 2001-2014, respectively. In the present study, panel unit root tests, cointegration tests, and Granger causality tests allowing for cross-sectional dependence, nonstationarity, and heterogeneity are conducted to examine the causal effects of economic growth on COD/NH3-N discharge. Further, we simultaneously apply semi-parametric fixed effects estimation and parametric fixed effects estimation to investigate the environmental Kuznets curve relationship for COD/NH3-N discharge. Our empirical results show a long-term bidirectional causality between economic growth and COD/NH3-N discharge in China. Within the Stochastic Impacts by Regression on Population, Affluence and Technology framework, we find evidence in support of an inverted U-shaped link between economic growth and COD/NH3-N discharge. To the best of our knowledge, no previous effort has investigated the nexus of economic growth and water pollution in such an integrated manner; this study therefore takes a fresh look at the topic. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Meadows, Alexander R.; Cupal, Josef; Hříbek, Petr; Durák, Michal; Kramer, Daniel; Rus, Bedřich
2017-05-01
We present the design of a collinear femtosecond optical parametric amplification (OPA) system producing a tunable output at wavelengths between 1030 nm and 1080 nm from a Ti:sapphire pump laser at a wavelength of 795 nm. Generation of a supercontinuum seed pulse is followed by one stage of amplification in Beta Barium Borate (BBO) and two stages of amplification in Potassium Titanyl Arsenate (KTA), resulting in a 225 μJ output pulse with a duration of 90 fs. The output of the system has been measured by self-referenced spectral interferometry to yield the complete spectrum and spectral phase of the pulse. Compared to KTP, the greater transparency of KTA in the spectral range of 3-4 μm allows for reduced idler absorption and enhanced gain from the OPA process when it is pumped by the fundamental frequency of a Ti:sapphire laser. In turn, the use of the Ti:sapphire fundamental at 795 nm as a pump improves the efficiency with which light can be converted to wavelengths between 1030 nm and 1080 nm and subsequently used to test components for Nd-based laser systems. This OPA system is operated at 1 kHz for diagnostic development and laser-induced damage threshold testing of optical components for the ELI-Beamlines project.
NASA Astrophysics Data System (ADS)
Bugała, Artur; Bednarek, Karol; Kasprzyk, Leszek; Tomczewski, Andrzej
2017-10-01
The paper presents the most representative characteristics, drawn from a three-year measurement period, of daily and monthly electricity production from photovoltaic conversion using modules installed in fixed and 2-axis tracking constructions. Results are presented for selected summer, autumn, spring and winter days. The analyzed measuring stand is located on the roof of the Faculty of Electrical Engineering building at Poznan University of Technology. Basic statistical parameters such as the mean value, standard deviation, skewness, kurtosis, median, range and coefficient of variation were used. It was found that the asymmetry factor can be useful in the analysis of daily electricity production from photovoltaic conversion. To determine the repeatability of monthly electricity production between summer months, and between summer and winter months, the non-parametric Mann-Whitney U test was used. To analyze the repeatability of daily peak hours, i.e., the hours with the largest hourly electricity production, the non-parametric Kruskal-Wallis test was applied as an extension of the Mann-Whitney U test. Based on the analysis of the electric energy distribution from the prepared monitoring system, it was found that traditional methods of forecasting electricity production from photovoltaic conversion, such as multiple regression models, should not be the preferred methods of analysis.
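A hedged sketch of the two non-parametric tests named above, applied to synthetic monthly energy-production figures; the numbers are illustrative, not the measured Poznan data.

```python
from scipy import stats

# Hypothetical monthly electricity production (kWh) for three seasons.
summer = [310, 295, 330, 305, 320, 315]
winter = [120, 135, 110, 140, 125, 130]
spring = [220, 240, 210, 235, 225, 230]

# Mann-Whitney U: do two seasons share the same production distribution?
u_stat, p_mw = stats.mannwhitneyu(summer, winter, alternative="two-sided")
print(f"Mann-Whitney U p = {p_mw:.4f}")

# Kruskal-Wallis: the 3+ group extension of the Mann-Whitney U test.
h_stat, p_kw = stats.kruskal(summer, winter, spring)
print(f"Kruskal-Wallis p = {p_kw:.4f}")
```

For clearly separated seasons both tests reject the hypothesis of identical distributions, i.e., monthly production is not repeatable across these seasons, mirroring the summer-vs-winter comparison in the abstract.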
A parametric approach to irregular fatigue prediction
NASA Technical Reports Server (NTRS)
Erismann, T. H.
1972-01-01
A parametric approach to irregular fatigue prediction is presented. The method proposed consists of two parts: empirical determination of certain characteristics of a material by means of a relatively small number of well-defined standard tests, and arithmetical application of the results obtained to arbitrary loading histories. The following groups of parameters are thus taken into account: (1) the variations of the mean stress, (2) the interaction of these variations and the superposed oscillating stresses, (3) the spectrum of the oscillating-stress amplitudes, and (4) the sequence of the oscillating-stress amplitudes. It is pointed out that only experimental verification can throw sufficient light upon the possibilities and limitations of this (or any other) prediction method.
Rodríguez-Entrena, Macario; Schuberth, Florian; Gelhard, Carsten
2018-01-01
Structural equation modeling using partial least squares (PLS-SEM) has become a mainstream modeling approach in various disciplines. Nevertheless, the prior literature still lacks practical guidance on how to properly test for differences between parameter estimates. Whereas existing techniques, such as parametric and non-parametric approaches in PLS multi-group analysis, only allow assessing differences between parameters estimated for different subpopulations, the study at hand introduces a technique that also allows assessing whether two parameter estimates derived from the same sample are statistically different. To illustrate this advancement to PLS-SEM, we refer to a reduced version of the well-established technology acceptance model.
NASA Astrophysics Data System (ADS)
Oliveira, N. P.; Maciel, L.; Catarino, A. P.; Rocha, A. M.
2017-10-01
This work proposes the creation of models of surfaces using a parametric computer modelling software to obtain three-dimensional structures in weft knitted fabrics produced on single needle system machines. Digital prototyping, another feature of digital modelling software, was also explored in three-dimensional drawings generated using the Rhinoceros software. With this approach, different 3D structures were developed and produced. Physical characterization tests were then performed on the resulting 3D weft knitted structures to assess their ability to promote comfort. From the obtained results, it is apparent that the developed structures have potential for application in different market segments, such as clothing and interior textiles.
Wavelet Filtering to Reduce Conservatism in Aeroservoelastic Robust Stability Margins
NASA Technical Reports Server (NTRS)
Brenner, Marty; Lind, Rick
1998-01-01
Wavelet analysis for filtering and system identification was used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins was reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data was used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability were also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrated improved robust stability prediction by extension of the stability boundary beyond the flight regime.
Hot-film wall shear instrumentation for boundary layer transition research
NASA Technical Reports Server (NTRS)
Schneider, Steven P.
1994-01-01
Measurements of the performance of hot-film wall-shear sensors were performed to aid development of improved sensors. The effect of film size and substrate properties on the sensor performance was quantified through parametric studies carried out both electronically and in a shock tube. The results show that sensor frequency response increases with decreasing sensor size, while at the same time sensitivity decreases. Substrate effects were also studied, through parametric variation of thermal conductivity and heat capacity. Early studies used complex dual-layer substrates, while later studies were designed for both single-layer and dual-layer substrates. Sensor failures and funding limitations have precluded completion of the substrate thermal-property tests.
A tool for the estimation of the distribution of landslide area in R
NASA Astrophysics Data System (ADS)
Rossi, M.; Cardinali, M.; Fiorucci, F.; Marchesini, I.; Mondini, A. C.; Santangelo, M.; Ghosh, S.; Riguer, D. E. L.; Lahousse, T.; Chang, K. T.; Guzzetti, F.
2012-04-01
We have developed a tool in R (the free software environment for statistical computing, http://www.r-project.org/) to estimate the probability density and the frequency density of landslide area. The tool implements parametric and non-parametric approaches to the estimation of the probability density and the frequency density of landslide area, including: (i) Histogram Density Estimation (HDE), (ii) Kernel Density Estimation (KDE), and (iii) Maximum Likelihood Estimation (MLE). The tool is available as a standard Open Geospatial Consortium (OGC) Web Processing Service (WPS), and is accessible through the web using different GIS software clients. We tested the tool to compare Double Pareto and Inverse Gamma models for the probability density of landslide area in different geological, morphological and climatological settings, and to compare landslides shown in inventory maps prepared using different mapping techniques, including (i) field mapping, (ii) visual interpretation of monoscopic and stereoscopic aerial photographs, (iii) visual interpretation of monoscopic and stereoscopic VHR satellite images, and (iv) semi-automatic detection and mapping from VHR satellite images. Results show that both models are applicable in different geomorphological settings. Non-parametric estimation methods (i.e., HDE and KDE) provided reasonable results for all the tested landslide datasets, whereas for some of the datasets MLE failed to provide a result due to convergence problems. The two tested models (Double Pareto and Inverse Gamma) gave very similar results for large and very large datasets (> 150 samples); differences in the modeling results were observed for small datasets affected by systematic biases.
A distinct rollover was observed in all analyzed landslide datasets, except for a few datasets obtained from landslide inventories prepared through field mapping or by semi-automatic mapping from VHR satellite imagery. The tool can also be used to evaluate the probability density and the frequency density of landslide volume.
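The paper's tool is written in R; as a hedged Python analogue of two of the estimators it implements, the sketch below fits an Inverse Gamma model by MLE and builds a kernel density estimate, on synthetic landslide areas rather than a real inventory.

```python
# Hedged sketch (Python, not the paper's R tool): MLE fit of an Inverse Gamma
# model and a Gaussian KDE, on synthetic landslide areas.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
areas = stats.invgamma.rvs(a=1.4, scale=1e3, size=500, random_state=rng)  # m^2

# Parametric: MLE fit of the Inverse Gamma model (location fixed at 0).
a_hat, loc_hat, scale_hat = stats.invgamma.fit(areas, floc=0)

# Non-parametric: Gaussian KDE on log-area, sensible for heavy-tailed data.
log_areas = np.log(areas)
kde = stats.gaussian_kde(log_areas)
grid = np.linspace(log_areas.min(), log_areas.max(), 100)
density = kde(grid)

print(f"fitted shape a = {a_hat:.2f}")   # should land near the true 1.4
```

The MLE step is the one that can fail to converge on awkward small samples, which mirrors the behavior reported in the abstract; the KDE has no such failure mode.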
Relevance of anisotropy and spatial variability of gas diffusivity for soil-gas transport
NASA Astrophysics Data System (ADS)
Schack-Kirchner, Helmer; Kühne, Anke; Lang, Friederike
2017-04-01
Models of soil gas transport generally consider neither the direction dependence of gas diffusivity nor its small-scale variability. However, in a recent study we found evidence for anisotropy favouring vertical gas diffusion in natural soils. We hypothesize that gas transport models based on gas diffusion data measured with soil rings are strongly influenced by both anisotropy and spatial variability, and that the use of averaged diffusivities could be misleading. To test this, we used a 2-dimensional model of soil gas transport under compacted wheel tracks to model the soil-air oxygen distribution. The model was parametrized with soil-ring measurements, using both their central tendency and their variability, and includes vertical parameter variability as well as variation perpendicular to the elongated wheel track. Three parametrization types were tested: (i) averaged values for the wheel track and the undisturbed soil; (ii) randomly distributed soil cells with normally distributed variability within the strata; and (iii) randomly distributed soil cells with uniformly distributed variability within the strata. Each type of small-scale variability was tested both for (a) isotropic gas diffusivity and (b) horizontally reduced gas diffusivity (by a constant factor), yielding six models in total. As expected, the different parametrizations had an important influence on the aeration state under wheel tracks, with the strongest oxygen depletion occurring for uniformly distributed variability combined with anisotropy towards higher vertical diffusivity. This simple simulation approach clearly showed the relevance of anisotropy and spatial variability even for identical central-tendency measures of gas diffusivity. It does not yet consider spatial dependency of the variability, which could aggravate the effects further.
To account for anisotropy and spatial variability in gas transport models, we recommend (a) measuring soil-gas transport parameters in a spatially explicit way, including different directions, and (b) using random-field stochastic models to assess the possible effects on gas-exchange models.
NASA Technical Reports Server (NTRS)
Wilbur, Matthew L.; Yeager, William T., Jr.; Singleton, Jeffrey D.; Mirick, Paul H.; Wilkie, W. Keats
1998-01-01
This report provides data obtained during a wind-tunnel test conducted to investigate parametrically the effect of blade nonstructural mass on helicopter fixed-system vibratory loads. The data were obtained with aeroelastically scaled model rotor blades that allowed for the addition of concentrated nonstructural masses at multiple locations along the blade radius. Testing was conducted for advance ratios ranging from 0.10 to 0.35 for 10 blade-mass configurations. Three thrust levels were obtained at representative full-scale shaft angles for each blade-mass configuration. This report provides the fixed-system forces and moments measured during testing. The comprehensive database obtained is well-suited for use in correlation and development of advanced rotorcraft analyses.
Shielding requirements for the Space Station habitability modules
NASA Technical Reports Server (NTRS)
Avans, Sherman L.; Horn, Jennifer R.; Williamsen, Joel E.
1990-01-01
The design, analysis, development, and tests of the total meteoroid/debris protection system for the Space Station Freedom habitability modules, such as the habitation module, the laboratory module, and the node structures, are described. Design requirements are discussed along with development efforts, including a combination of hypervelocity testing and analyses. Computer hydrocode analysis of hypervelocity impact phenomena associated with Space Station habitability structures is covered and the use of optimization techniques, engineering models, and parametric analyses is assessed. Explosive rail gun development efforts and protective capability and damage tolerance of multilayer insulation due to meteoroid/debris impact are considered. It is concluded that anticipated changes in the debris environment definition and requirements will require rescoping the tests and analysis required to develop a protection system.
NASA Technical Reports Server (NTRS)
McGowan, David M.; Ambur, Damodar R.
1998-01-01
The results of an experimental study of the impact damage characteristics and residual strength of composite sandwich panels impacted with and without a compression loading are presented. Results of impact damage screening tests conducted to identify the impact-energy levels at which damage initiates and at which barely visible impact damage occurs in the impacted facesheet are discussed. Parametric effects studied in these tests include the impactor diameter, dropped-weight versus airgun-launched impactors, and the effect of the location of the impact site with respect to the panel boundaries. Residual strength results of panels tested in compression after impact are presented and compared with results of panels that are subjected to a compressive preload prior to being impacted.
Parametric Study of the Effect of Membrane Tension on Sunshield Dynamics
NASA Technical Reports Server (NTRS)
Ross, Brian; Johnston, John D.; Smith, James
2002-01-01
The NGST sunshield is a lightweight, flexible structure consisting of pretensioned membranes supported by deployable booms. The structural dynamic behavior of the sunshield must be well understood in order to predict its influence on observatory performance. A 1/10th scale model of the sunshield has been developed for ground testing to provide data to validate modeling techniques for thin film membrane structures. The validated models can then be used to predict the behaviour of the full scale sunshield. This paper summarizes the most recent tests performed on the 1/10th scale sunshield to study the effect of membrane preload on sunshield dynamics. Topics to be covered include the test setup, procedures, and a summary of results.
NASA Technical Reports Server (NTRS)
Thomas, R. E.; Gaines, G. B.
1978-01-01
Recommended design procedures to reduce the complete factorial design by retaining information on anticipated important interaction effects, and by generally giving up information on unconditional main effects are discussed. A hypothetical photovoltaic module used in the test design is presented. Judgments were made of the relative importance of various environmental stresses such as UV radiation, abrasion, chemical attack, temperature, mechanical stress, relative humidity and voltage. Consideration is given to a complete factorial design and its graphical representation, elimination of selected test conditions, examination and improvement of an engineering design, and parametric study. The resulting design consists of a mix of conditional main effects and conditional interactions and represents a compromise between engineering and statistical requirements.
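In the spirit of the reduced design described above, the sketch below enumerates a full factorial stress-test matrix and prunes selected conditions. The factors, levels, and pruning rule are illustrative assumptions, not the report's actual test conditions.

```python
# Hedged sketch: full factorial design, then elimination of selected test
# conditions. Factors and levels below are invented for illustration.
from itertools import product

factors = {
    "uv_dose":  ["low", "high"],
    "humidity": ["40%", "85%"],
    "temp_C":   [25, 60, 85],
}

# Complete factorial: every combination of every level.
full = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(full))   # 2 * 2 * 3 = 12 runs

# Prune the mildest cells, trading unconditional main effects for fewer runs.
reduced = [run for run in full
           if not (run["uv_dose"] == "low" and run["temp_C"] == 25)]
print(len(reduced))
```

Dropping cells this way leaves a design that still estimates the interactions judged important, at the cost of main effects that are now conditional on the retained levels.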
A parametric model order reduction technique for poroelastic finite element models.
Lappano, Ettore; Polanz, Markus; Desmet, Wim; Mundo, Domenico
2017-10-01
This research presents a parametric model order reduction approach for vibro-acoustic problems in the frequency domain of systems containing poroelastic materials (PEM). The method is applied to the Finite Element (FE) discretization of the weak u-p integral formulation based on the Biot-Allard theory and makes use of reduced basis (RB) methods typically employed for parametric problems. The parametric reduction is obtained rewriting the Biot-Allard FE equations for poroelastic materials using an affine representation of the frequency (therefore allowing for RB methods) and projecting the frequency-dependent PEM system on a global reduced order basis generated with the proper orthogonal decomposition instead of standard modal approaches. This has proven to be better suited to describe the nonlinear frequency dependence and the strong coupling introduced by damping. The methodology presented is tested on two three-dimensional systems: in the first experiment, the surface impedance of a PEM layer sample is calculated and compared with results of the literature; in the second, the reduced order model of a multilayer system coupled to an air cavity is assessed and the results are compared to those of the reference FE model.
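The abstract's key ingredient, a global reduced order basis from the proper orthogonal decomposition, can be sketched with a plain SVD of a snapshot matrix. The snapshots below are synthetic; in the paper they would come from frequency-domain poroelastic FE solves.

```python
# Hedged sketch: POD basis via SVD and a Galerkin projection. The snapshot
# matrix is synthetic, standing in for frequency-domain FE solution vectors.
import numpy as np

rng = np.random.default_rng(1)
n_dof, n_snapshots = 200, 40
# Snapshots lying (mostly) in a 5-dimensional subspace, plus tiny noise.
basis_true = rng.standard_normal((n_dof, 5))
snapshots = basis_true @ rng.standard_normal((5, n_snapshots))
snapshots += 1e-6 * rng.standard_normal((n_dof, n_snapshots))

# POD modes = left singular vectors, truncated by captured energy.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1   # modes for 99.99% energy
V = U[:, :r]                                    # reduced order basis

# Galerkin projection of a full-order operator onto the POD basis.
A = rng.standard_normal((n_dof, n_dof))
A_reduced = V.T @ A @ V
print(f"kept r = {r} modes; reduced operator shape {A_reduced.shape}")
```

For the parametric case the paper describes, the same projection is applied to each frequency-affine term of the Biot-Allard system, so the reduced model can be reassembled cheaply at any frequency.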
From Neutron Star Observables to the Equation of State. I. An Optimal Parametrization
NASA Astrophysics Data System (ADS)
Raithel, Carolyn A.; Özel, Feryal; Psaltis, Dimitrios
2016-11-01
The increasing number and precision of measurements of neutron star masses, radii, and, in the near future, moments of inertia offer the possibility of precisely determining the neutron star equation of state (EOS). One way to facilitate the mapping of observables to the EOS is through a parametrization of the latter. We present here a generic method for optimizing the parametrization of any physically allowed EOS. We use mock EOS that incorporate physically diverse and extreme behavior to test how well our parametrization reproduces the global properties of the stars, by minimizing the errors in the observables of mass, radius, and the moment of inertia. We find that using piecewise polytropes and sampling the EOS with five fiducial densities between ∼1 and 8 times the nuclear saturation density results in optimal errors for the smallest number of parameters. Specifically, it recreates the radii of the assumed EOS to within less than 0.5 km for the extreme mock EOS and to within less than 0.12 km for 95% of a sample of 42 proposed, physically motivated EOS. Such a parametrization is also able to reproduce the maximum mass to within 0.04 M⊙ and the moment of inertia of a 1.338 M⊙ neutron star to within less than 10% for 95% of the proposed sample of EOS.
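The piecewise-polytropic form used here is simple to state in code: between consecutive fiducial densities, pressure follows P = K ρ^Γ, with Γ fixed by continuity at the nodes. The sketch below uses arbitrary illustrative node pressures, not the fitted values from the paper, and dimensionless units.

```python
# Hedged sketch of a piecewise-polytropic EOS. Densities are in units of the
# nuclear saturation density; the node pressures are invented, not fitted.
import math

# Fiducial (density, pressure) nodes; continuity fixes each segment's Gamma.
nodes = [(1.0, 1.0), (1.8, 4.0), (3.2, 20.0), (5.6, 90.0), (8.0, 300.0)]

def pressure(rho):
    """Piecewise polytrope P = K * rho**Gamma through consecutive nodes."""
    for (r1, p1), (r2, p2) in zip(nodes, nodes[1:]):
        if rho <= r2 or (r2, p2) == nodes[-1]:   # last segment extrapolates
            gamma = math.log(p2 / p1) / math.log(r2 / r1)
            return p1 * (rho / r1) ** gamma
    raise ValueError("unreachable")

print(f"P(2.5) = {pressure(2.5):.2f}")
```

By construction the pressure is continuous at every node, which is what makes the representation well suited to optimization over the node pressures alone.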
Selecting a Separable Parametric Spatiotemporal Covariance Structure for Longitudinal Imaging Data
George, Brandon; Aban, Inmaculada
2014-01-01
Longitudinal imaging studies allow great insight into how the structure and function of a subject's internal anatomy change over time. Unfortunately, the analysis of longitudinal imaging data is complicated by inherent spatial and temporal correlation: the temporal from the repeated measures, and the spatial from the outcomes of interest being observed at multiple points in a patient's body. We propose the use of a linear model with a separable parametric spatiotemporal error structure for the analysis of repeated imaging data. The model makes use of spatial (exponential, spherical, and Matérn) and temporal (compound symmetric, autoregressive-1, Toeplitz, and unstructured) parametric correlation functions. A simulation study, inspired by a longitudinal cardiac imaging study on mitral regurgitation patients, compared different information criteria for selecting a particular separable parametric spatiotemporal correlation structure, as well as the effects on Type I and II error rates for inference on fixed effects when the specified model is incorrect. Information criteria were found to be highly accurate at choosing between separable parametric spatiotemporal correlation structures. Misspecification of the covariance structure was found to be able to inflate the Type I error rate or produce an overly conservative test size, which corresponded to decreased power. An example with clinical data illustrates how the covariance structure selection can be done in practice, and how the choice of covariance structure can change inferences about fixed effects. PMID:25293361
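A separable spatiotemporal covariance is the Kronecker product of a spatial and a temporal correlation matrix; the sketch below assembles one from two of the parametric families named in the abstract (exponential spatial, AR(1) temporal), with illustrative parameter values.

```python
# Hedged sketch: separable covariance = sigma^2 * (T kron S), built from an
# exponential spatial and an AR(1) temporal correlation. Values illustrative.
import numpy as np

def exponential_spatial(dists, rho):
    """Exponential spatial correlation: exp(-d / rho)."""
    return np.exp(-dists / rho)

def ar1_temporal(n_times, phi):
    """AR(1) temporal correlation: phi ** |i - j|."""
    lags = np.abs(np.subtract.outer(np.arange(n_times), np.arange(n_times)))
    return phi ** lags

# Three spatial locations on a line, four repeated measures.
coords = np.array([0.0, 1.0, 2.5])
dists = np.abs(np.subtract.outer(coords, coords))
S = exponential_spatial(dists, rho=2.0)
T = ar1_temporal(4, phi=0.6)

sigma2 = 1.5
cov = sigma2 * np.kron(T, S)   # 12 x 12 covariance for 3 sites x 4 times
print(cov.shape)
```

Separability is what keeps the parameter count small: the full 12 x 12 matrix here is determined by just three parameters (rho, phi, sigma2).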
Static internal performance of a two-dimensional convergent-divergent nozzle with thrust vectoring
NASA Technical Reports Server (NTRS)
Bare, E. Ann; Reubush, David E.
1987-01-01
A parametric investigation of the static internal performance of multifunction two-dimensional convergent-divergent nozzles has been made in the static test facility of the Langley 16-Foot Transonic Tunnel. All nozzles had a constant throat area and aspect ratio. The effects of upper and lower flap angles, divergent flap length, throat approach angle, sidewall containment, and throat geometry were determined. The nozzles were tested at thrust vector angles that varied from 5.60 to 23.00 deg. The nozzle pressure ratio was varied up to 10 for all configurations.
NASA Technical Reports Server (NTRS)
Frost, A. L.; Dill, C. C.
1986-01-01
An investigation to determine the sensitivity of the space shuttle base and forebody aerodynamics to the size and shape of various solid plume simulators was conducted. Families of cones of varying angle and base diameter, at various axial positions behind a Space Shuttle launch vehicle model, were wind tunnel tested. This parametric evaluation yielded base pressure and force coefficient data which indicated that solid plume simulators are an inexpensive, quick method of approximating the effect of engine exhaust plumes on the base and forebody aerodynamics of future, complex multibody launch vehicles.
Empirically Estimable Classification Bounds Based on a Nonparametric Divergence Measure
Berisha, Visar; Wisler, Alan; Hero, Alfred O.; Spanias, Andreas
2015-01-01
Information divergence functions play a critical role in statistics and information theory. In this paper we show that a non-parametric f-divergence measure can be used to provide improved bounds on the minimum binary classification probability of error for the case when the training and test data are drawn from the same distribution and for the case where there exists some mismatch between training and test distributions. We confirm the theoretical results by designing feature selection algorithms using the criteria from these bounds and by evaluating the algorithms on a series of pathological speech classification tasks. PMID:26807014
Some tests of flat plate photovoltaic module cell temperatures in simulated field conditions
NASA Technical Reports Server (NTRS)
Griffith, J. S.; Rathod, M. S.; Paslaski, J.
1981-01-01
The nominal operating cell temperature (NOCT) of solar photovoltaic (PV) modules is an important characteristic. Typically, the power output of a PV module decreases 0.5% per deg C rise in cell temperature. Several tests were run with artificial sun and wind to study the parametric dependencies of cell temperature on wind speed and direction and ambient temperature. It was found that the cell temperature is extremely sensitive to wind speed, moderately so to wind direction and rather insensitive to ambient temperature. Several suggestions are made to obtain data more typical of field conditions.
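The 0.5% per °C figure quoted in this abstract translates directly into a linear power-derating formula. The sketch below applies it to typical assumed rating-point values, not to measurements from the paper.

```python
# Hedged sketch of linear temperature derating: power drops ~0.5% per deg C
# of cell temperature above the rating point. STC values are typical
# assumptions, not the paper's data.
def module_power(p_rated_w, cell_temp_c, temp_coeff=-0.005, t_ref_c=25.0):
    """Module power corrected for cell temperature, linear coefficient."""
    return p_rated_w * (1.0 + temp_coeff * (cell_temp_c - t_ref_c))

# A 100 W module running 20 C above the 25 C reference loses about 10 W.
print(f"{module_power(100.0, 45.0):.1f} W")   # prints "90.0 W"
```

This is why NOCT matters: the cell temperature reached under field conditions, not the nameplate rating, sets the power actually delivered.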
Computer modeling of heat pipe performance
NASA Technical Reports Server (NTRS)
Peterson, G. P.
1983-01-01
A parametric study of the defining equations which govern the steady state operational characteristics of the Grumman monogroove dual passage heat pipe is presented. These defining equations are combined to develop a mathematical model which describes and predicts the operational and performance capabilities of a specific heat pipe given the necessary physical characteristics and working fluid. Included is a brief review of the current literature, a discussion of the governing equations, and a description of both the mathematical and computer model. Final results of preliminary test runs of the model are presented and compared with experimental tests on actual prototypes.
NASA Technical Reports Server (NTRS)
Brandon, C. A.; Gaddis, J. L.; El-Nashar, A. M.
1975-01-01
Performance data consisting of solute rejections and product flux were measured as functions of the operating parameters. These parameters and their ranges were pressure (500,000 N/m2 to 700,000 N/m2), temperature (74 C to 95 C), velocity (1.6 m/sec to 10 m/sec), and concentration (up to 14x). Tests were carried out on analog washwater. Data presented include rejections of organic materials, ammonia, urea, and an assortment of ions. The membrane used was deposited in situ on a porcelain ceramic substrate.
New post-Newtonian parameter to test Chern-Simons gravity.
Alexander, Stephon; Yunes, Nicolas
2007-12-14
We study Chern-Simons (CS) gravity in the parametrized post-Newtonian (PPN) framework through a weak-field solution of the modified field equations. We find that CS gravity possesses the same PPN parameters as general relativity, except for the inclusion of a new term, proportional to the CS coupling and the curl of the PPN vector potential. This new term leads to a modification of frame dragging and gyroscopic precession and we provide an estimate of its size. This correction might be used in experiments, such as Gravity Probe B, to bound CS gravity and test string theory.
Regenerable biocide delivery unit, volume 2
NASA Technical Reports Server (NTRS)
Atwater, James E.; Wheeler, Richard R., Jr.
1992-01-01
Source code for programs dealing with the following topics are presented: (1) life cycle test stand-parametric test stand control (in BASIC); (2) simultaneous aqueous iodine equilibria-true equilibrium (in C); (3) simultaneous aqueous iodine equilibria-pseudo-equilibrium (in C); (4) pseudo-(fast)-equilibrium with iodide initially present (in C); (5) solution of simultaneous iodine rate expressions (Mathematica); (6) 2nd order kinetics of I2-formic acid in humidity condensate (Mathematica); (7) prototype RMCV onboard microcontroller (CAMBASIC); (8) prototype RAM data dump to PC (in BASIC); and (9) prototype real time data transfer to PC (in BASIC).
Analysis of quantitative data obtained from toxicity studies showing non-normal distribution.
Kobayashi, Katsumi
2005-05-01
The data obtained from toxicity studies are examined for homogeneity of variance but usually not for normal distribution. In this study, I examined the measured items of a carcinogenicity/chronic toxicity study in rats for both homogeneity of variance and normal distribution. Many hematology and biochemistry items showed a non-normal distribution. To test the normality of data obtained from toxicity studies, the data of the concurrent control group may be examined; for data that show a non-normal distribution, robust non-parametric tests may be applied.
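The workflow this abstract recommends, checking the control group for normality and falling back to a nonparametric test when normality is rejected, can be sketched as below. The data are synthetic skewed samples, not the study's rat data.

```python
# Hedged sketch: Shapiro-Wilk normality check on the control group, then a
# nonparametric comparison if normality is rejected. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
control = rng.lognormal(mean=1.0, sigma=0.6, size=50)   # skewed values
treated = rng.lognormal(mean=1.3, sigma=0.6, size=50)

_, p_normal = stats.shapiro(control)   # normality test on the control group

if p_normal < 0.05:
    # Non-normal: use the robust, nonparametric Mann-Whitney U test.
    stat, p_value = stats.mannwhitneyu(control, treated,
                                       alternative="two-sided")
    test_used = "Mann-Whitney U"
else:
    stat, p_value = stats.ttest_ind(control, treated)
    test_used = "t-test"

print(f"{test_used}: p = {p_value:.4g}")
```

Testing normality on the concurrent control group, as the author suggests, avoids letting a treatment effect itself trigger the non-normality verdict.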
Illiquidity premium and expected stock returns in the UK: A new approach
NASA Astrophysics Data System (ADS)
Chen, Jiaqi; Sherif, Mohamed
2016-09-01
This study examines the relative importance of liquidity risk for the time series and cross-section of stock returns in the UK. We propose a simple way to capture the multidimensionality of illiquidity. Our analysis indicates that existing illiquidity measures have considerable asset-specific components, which justifies our new approach. Further, we use an alternative test of the Amihud (2002) measure, together with parametric and non-parametric methods, to investigate whether liquidity risk is priced in the UK. We find that the inclusion of the illiquidity factor in the capital asset pricing model plays a significant role in explaining the cross-sectional variation in stock returns, in particular with the Fama-French three-factor model. Further, using Hansen-Jagannathan non-parametric bounds, we find that the illiquidity-augmented capital asset pricing models yield a small distance error, while other, non-liquidity-based models fail to yield economically plausible distance values. Our findings have important implications for managing the liquidity risk of equity portfolios.
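The Amihud (2002) measure referenced above is simply the average of absolute daily return per unit of dollar volume. A minimal sketch on synthetic placeholder data:

```python
# Hedged sketch of the Amihud (2002) illiquidity measure:
# mean of |r_t| / dollar_volume_t over trading days. Data are synthetic.
import numpy as np

def amihud_illiquidity(returns, dollar_volume):
    """Mean of |r_t| / volume_t over days with nonzero volume."""
    returns = np.asarray(returns, dtype=float)
    dollar_volume = np.asarray(dollar_volume, dtype=float)
    mask = dollar_volume > 0
    return np.mean(np.abs(returns[mask]) / dollar_volume[mask])

rng = np.random.default_rng(3)
rets = rng.normal(0.0, 0.02, 250)          # one year of daily returns
vol = rng.lognormal(15.0, 0.5, 250)        # daily dollar volume

illiq = amihud_illiquidity(rets, vol)
print(f"Amihud illiquidity ~ {illiq:.3e}")
```

A higher value means a given trade moves the price more, i.e. the stock is less liquid; the paper's point is that this single number misses other dimensions of illiquidity.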
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cinelli, C.; Di Nepi, G.; De Martini, F.
2004-08-01
A parametric source of polarization-entangled photon pairs with striking spatial characteristics is reported. The distribution of the output electromagnetic k modes excited by spontaneous parametric down-conversion and coupled to the output detectors can be very broad. Using these states, realized over a full entanglement-ring output distribution, the nonlocal properties of the generated entanglement have been tested by standard Bell measurements and by Ou-Mandel interferometry. A 'mode-patchwork' technique based on the quantum superposition principle is adopted to synthesize in a straightforward and reliable way any kind of mixed state of large conceptual and technological interest in modern quantum information. Tunable Werner states and maximally entangled mixed states have indeed been created by this technique and investigated by quantum tomography. A study of the entropic and nonlocal properties of these states has been undertaken experimentally and theoretically, by a unifying variational approach.
Ionescu, Crina-Maria; Geidl, Stanislav; Svobodová Vařeková, Radka; Koča, Jaroslav
2013-10-28
We focused on the parametrization and evaluation of empirical models for fast and accurate calculation of conformationally dependent atomic charges in proteins. The models were based on the electronegativity equalization method (EEM), and the parametrization procedure was tailored to proteins. We used large protein fragments as reference structures and fitted the EEM model parameters using atomic charges computed by three population analyses (Mulliken, Natural, iterative Hirshfeld), at the Hartree-Fock level with two basis sets (6-31G*, 6-31G**) and in two environments (gas phase, implicit solvation). We parametrized and successfully validated 24 EEM models. When tested on insulin and ubiquitin, all models reproduced quantum mechanics level charges well and were consistent with respect to population analysis and basis set. Specifically, the models showed on average a correlation of 0.961, RMSD 0.097 e, and average absolute error per atom 0.072 e. The EEM models can be used with the freely available EEM implementation EEM_SOLVER.
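At its core, EEM computes charges by solving one linear system: each atom's effective electronegativity is equalized, subject to a fixed total charge. The sketch below sets up that system for a toy three-atom geometry; the parameters A, B, and kappa are invented stand-ins for the fitted values the paper derives.

```python
# Hedged sketch of the EEM linear system: for each atom i,
#   A_i + B_i*q_i + kappa * sum_{j != i} q_j / r_ij = chi_bar,
# plus charge conservation sum_i q_i = Q. Parameters below are invented.
import numpy as np

def eem_charges(coords, A, B, kappa=0.529, total_charge=0.0):
    n = len(A)
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    M = np.zeros((n + 1, n + 1))
    rhs = np.zeros(n + 1)
    for i in range(n):
        for j in range(n):
            M[i, j] = B[i] if i == j else kappa / dist[i, j]
        M[i, n] = -1.0            # unknown common electronegativity chi_bar
        rhs[i] = -A[i]
    M[n, :n] = 1.0                # charge conservation row: sum q_i = Q
    rhs[n] = total_charge
    sol = np.linalg.solve(M, rhs)
    return sol[:n], sol[n]        # per-atom charges, chi_bar

# Toy 3-atom "molecule" with made-up EEM parameters (not fitted values).
coords = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.2, 0.0]])
A = np.array([2.5, 3.1, 3.1])
B = np.array([8.0, 9.5, 9.5])
q, chi = eem_charges(coords, A, B)
print(q, chi)
```

Because the distances enter the matrix directly, the resulting charges are conformationally dependent, which is exactly the property the paper's parametrizations are built to exploit.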
Test of the cosmic evolution using Gaussian processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Ming-Jian; Xia, Jun-Qing, E-mail: zhangmj@ihep.ac.cn, E-mail: xiajq@bnu.edu.cn
2016-12-01
Much focus has been placed on the possible slowing down of cosmic acceleration under dark energy parametrizations. In the present paper, we investigate this subject using Gaussian processes (GP), without resorting to a particular template of dark energy. The reconstruction is carried out using abundant data, including luminosity distances from the Union2 and Union2.1 compilations and gamma-ray bursts, and dynamical Hubble parameter measurements. It suggests that slowing down of cosmic acceleration cannot be present within 95% C.L., considering the influence of spatial curvature and the Hubble constant. To reveal the reason for the tension between our reconstruction and previous parametrization constraints for Union2 data, we compare them and find that the slowing down of acceleration in some parametrizations is only a 'mirage'. Although these parametrizations fit the observational data well, their tension can be revealed by a high-order derivative of the distance D. Instead, the GP method is able to faithfully model the cosmic expansion history.
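The template-free reconstruction idea can be sketched with a closed-form GP posterior mean under a squared-exponential kernel. The H(z)-like points below are synthetic (a toy flat-ΛCDM curve plus noise), not the compilations used in the paper, and the kernel hyperparameters are fixed by hand rather than optimized.

```python
# Hedged sketch: GP posterior mean with a squared-exponential kernel,
# reconstructing a synthetic H(z)-like function. Not the paper's datasets;
# hyperparameters sigma_f and ell are fixed by hand, not marginalized.
import numpy as np

def sq_exp_kernel(x1, x2, sigma_f=100.0, ell=1.0):
    d = np.subtract.outer(x1, x2)
    return sigma_f**2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(5)
z_obs = np.linspace(0.1, 2.0, 15)
h_true = 70.0 * np.sqrt(0.3 * (1 + z_obs) ** 3 + 0.7)   # toy flat-LCDM H(z)
noise = 3.0
h_obs = h_true + rng.normal(0.0, noise, z_obs.size)

z_star = np.linspace(0.0, 2.5, 50)
K = sq_exp_kernel(z_obs, z_obs) + noise**2 * np.eye(z_obs.size)
K_s = sq_exp_kernel(z_star, z_obs)
h_mean = K_s @ np.linalg.solve(K, h_obs)   # GP posterior mean (zero prior)
print(f"reconstructed H(0) ~ {h_mean[0]:.1f}")
```

Derivatives of the reconstructed curve (and hence the deceleration parameter) follow from differentiating the kernel, which is how the GP approach probes acceleration without assuming a dark energy template.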
Automated, Parametric Geometry Modeling and Grid Generation for Turbomachinery Applications
NASA Technical Reports Server (NTRS)
Harrand, Vincent J.; Uchitel, Vadim G.; Whitmire, John B.
2000-01-01
The objective of this Phase I project is to develop a highly automated software system for rapid geometry modeling and grid generation for turbomachinery applications. The proposed system features a graphical user interface for interactive control, a direct interface to commercial CAD/PDM systems, support for IGES geometry output, and a scripting capability for obtaining a high level of automation and end-user customization of the tool. The developed system is fully parametric and highly automated, and, therefore, significantly reduces the turnaround time for 3D geometry modeling, grid generation and model setup. This facilitates design environments in which a large number of cases need to be generated, such as for parametric analysis and design optimization of turbomachinery equipment. In Phase I we have successfully demonstrated the feasibility of the approach. The system has been tested on a wide variety of turbomachinery geometries, including several impellers and a multistage rotor-stator combination. In Phase II, we plan to integrate the developed system with turbomachinery design software and with commercial CAD/PDM software.
Effect of Monovalent Ion Parameters on Molecular Dynamics Simulations of G-Quadruplexes.
Havrila, Marek; Stadlbauer, Petr; Islam, Barira; Otyepka, Michal; Šponer, Jiří
2017-08-08
G-quadruplexes (GQs) are key noncanonical DNA and RNA architectures stabilized by desolvated monovalent cations present in their central channels. We analyze extended atomistic molecular dynamics simulations (∼580 μs in total) of GQs with 11 monovalent cation parametrizations, assessing GQ overall structural stability, dynamics of internal cations, and distortions of the G-tetrad geometries. The majority of simulations were executed with the SPC/E water model; however, test simulations with TIP3P and OPC water models are also reported. The identity and parametrization of ions strongly affect the behavior of a tetramolecular d[GGG]4 GQ, which is unstable with several ion parametrizations. The remaining studied RNA and DNA GQs are structurally stable, though the G-tetrad geometries are always deformed by bifurcated H-bonding in a parametrization-specific manner. Thus, basic 10-μs-scale simulations of fully folded GQs can be safely done with a number of cation parametrizations. However, there are parametrization-specific differences and basic force-field errors affecting the quantitative description of ion-tetrad interactions, which may significantly affect studies of ion-binding processes and the description of the GQ folding landscape. Our d[GGG]4 simulations indirectly suggest that such studies will also be sensitive to the water models. During exchanges with bulk water, the Na+ ions move inside the GQs in a concerted manner, while larger relocations of the K+ ions are typically separated. We suggest that the Joung-Cheatham SPC/E K+ parameters represent a safe choice in simulation studies of GQs, though variation of ion parameters can be used for specific simulation goals.
NASA Astrophysics Data System (ADS)
Noh, S. J.; Rakovec, O.; Kumar, R.; Samaniego, L. E.
2015-12-01
Accurate and reliable streamflow prediction is essential to mitigate the social and economic damage caused by water-related disasters such as floods and droughts. Sequential data assimilation (DA) may facilitate improved streamflow prediction by using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is often ignored, mainly due to practical limitations of methodology to specify modeling uncertainty with limited ensemble members. However, if parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of the model ensemble may be insufficient to capture the dynamics of the observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we evaluate the impacts of streamflow data assimilation over European river basins. In particular, a multi-parametric ensemble approach is tested to account for the effects of parametric uncertainty in DA. Because augmentation of parameters is not required within an assimilation window, the approach may be more stable with limited ensemble members and has potential for operational use. To consider the response times and non-Gaussian characteristics of internal hydrologic processes, lagged particle filtering is utilized. The presentation will focus on the gains and limitations of streamflow data assimilation and the multi-parametric ensemble method over large-scale basins.
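The idea of carrying parametric uncertainty through a filter can be sketched with a toy bootstrap particle filter in which each particle holds both a model state and its own parameter value, so states and parameters are resampled together. Everything below (the linear-reservoir model, the noise levels, the parameter range) is hypothetical and only illustrates the mechanics, not mHM or the lagged filter used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(storage, k, rain):
    """Linear-reservoir water balance: discharge = k * storage."""
    return storage + rain - k * storage

# Multi-parametric ensemble: each particle carries its own recession
# parameter k, so parametric uncertainty propagates with the states.
n = 500
storage = rng.uniform(5.0, 15.0, n)     # initial states
k = rng.uniform(0.05, 0.5, n)           # per-particle parameters
true_k, true_s = 0.2, 10.0

for rain in [3.0, 0.0, 5.0, 1.0, 0.0]:
    storage = step(storage, k, rain)
    true_s = step(true_s, true_k, rain)
    obs = true_k * true_s + rng.normal(0.0, 0.2)   # noisy discharge obs
    # Weight particles by the likelihood of the observed discharge ...
    w = np.exp(-0.5 * ((k * storage - obs) / 0.2) ** 2)
    w /= w.sum()
    # ... and resample states *and* parameters jointly.
    idx = rng.choice(n, size=n, p=w)
    storage, k = storage[idx], k[idx]

print(f"posterior k mean = {k.mean():.3f} (truth 0.2)")
```

After a few assimilation cycles the parameter ensemble narrows around values consistent with the observed discharge, which is the effect the multi-parametric approach exploits.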
Improving Your Data Transformations: Applying the Box-Cox Transformation
ERIC Educational Resources Information Center
Osborne, Jason W.
2010-01-01
Many of us in the social sciences deal with data that do not conform to assumptions of normality and/or homoscedasticity/homogeneity of variance. Some research has shown that parametric tests (e.g., multiple regression, ANOVA) can be robust to modest violations of these assumptions. Yet the reality is that almost all analyses (even nonparametric…
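As a concrete illustration of the transformation the title refers to, the sketch below applies SciPy's maximum-likelihood Box-Cox transformation to simulated right-skewed data; the data and parameter values are invented for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = rng.lognormal(mean=0.0, sigma=0.8, size=500)   # right-skewed data

# Box-Cox requires strictly positive data; when lmbda is not supplied,
# scipy.stats.boxcox chooses it by maximum likelihood.
xt, lmbda = stats.boxcox(x)

print(f"skew before: {stats.skew(x):.2f}, after: {stats.skew(xt):.2f}")
print(f"estimated lambda: {lmbda:.2f}")
```

For lognormal data the estimated lambda lands near 0 (the log transform), and the skewness of the transformed sample is close to zero, which is exactly the normalization the abstract advocates checking before running parametric tests.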
Charm: Cosmic history agnostic reconstruction method
NASA Astrophysics Data System (ADS)
Porqueres, Natalia; Ensslin, Torsten A.
2017-03-01
Charm (cosmic history agnostic reconstruction method) reconstructs the cosmic expansion history in the framework of Information Field Theory. The reconstruction is performed via the iterative Wiener filter, starting from either an agnostic or an informative prior. The charm code allows one to test the compatibility of several different data sets with the LambdaCDM model in a non-parametric way.
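The Wiener filter at the core of such reconstructions can be illustrated in its simplest non-iterative form: for a Gaussian signal prior with covariance S, Gaussian noise covariance N, and an identity response, the posterior mean is m = S(S+N)⁻¹d. The sketch below uses a hypothetical squared-exponential prior on a 1-D grid and is not the charm code itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
x = np.arange(n)

# Hypothetical smooth-signal prior: squared-exponential covariance
# (plus a tiny jitter so sampling stays numerically stable).
S = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 5.0) ** 2) + 1e-8 * np.eye(n)
N = 0.1 * np.eye(n)                     # white-noise covariance

signal = rng.multivariate_normal(np.zeros(n), S)
data = signal + rng.normal(0.0, np.sqrt(0.1), n)

# Wiener-filter posterior mean for an identity response: m = S (S + N)^-1 d
m = S @ np.linalg.solve(S + N, data)

err_data = np.mean((data - signal) ** 2)
err_wf = np.mean((m - signal) ** 2)
print(f"MSE of raw data: {err_data:.3f}, of Wiener reconstruction: {err_wf:.3f}")
```

The filter suppresses noise in modes where the prior expects little signal power, so the reconstruction error is below the raw-data error.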
ERIC Educational Resources Information Center
Roberts, James S.; Laughlin, James E.
1996-01-01
A parametric item response theory model for unfolding binary or graded responses is developed. The graded unfolding model (GUM) is a generalization of the hyperbolic cosine model for binary data of D. Andrich and G. Luo (1993). Applicability of the GUM to attitude testing is illustrated with real data. (SLD)
The effect of monetary incentives on absenteeism: a case study
Charles H. Wolf
1974-01-01
An attendance bonus paid by a wood processing firm was studied to determine its effectiveness in reducing absenteeism. Employees were divided into permanent and short-term groups, and their response to the bonus was studied, using non-parametric tests. The evidence suggested that the incentive favorably influenced the work attendance of only the permanent group....
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
2007-01-01
The validation of cognitive attributes required for correct answers on binary test items or tasks has been addressed in previous research through the integration of cognitive psychology and psychometric models using parametric or nonparametric item response theory, latent class modeling, and Bayesian modeling. All previous models, each with their…
ERIC Educational Resources Information Center
Samejima, Fumiko
This paper is the final report of a multi-year project sponsored by the Office of Naval Research (ONR) from 1987 through 1990. The main objectives of the research summarized were to: investigate the non-parametric approach to the estimation of the operating characteristics of discrete item responses; revise and strengthen the package computer…
A Comparison of Uniform DIF Effect Size Estimators under the MIMIC and Rasch Models
ERIC Educational Resources Information Center
Jin, Ying; Myers, Nicholas D.; Ahn, Soyeon; Penfield, Randall D.
2013-01-01
The Rasch model, a member of a larger group of models within item response theory, is widely used in empirical studies. Detection of uniform differential item functioning (DIF) within the Rasch model typically employs null hypothesis testing with a concomitant consideration of effect size (e.g., signed area [SA]). Parametric equivalence between…
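For readers unfamiliar with the signed-area effect size, the sketch below computes it for a hypothetical Rasch item with uniform DIF: under the Rasch model, the signed area between the reference and focal groups' response curves reduces to the difference in item difficulty.

```python
import numpy as np

def rasch_p(theta, b):
    """Rasch probability of a correct response at ability theta."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Hypothetical uniform DIF: the item is harder for the focal group by a
# constant shift in difficulty, so the signed area equals that shift.
b_ref, b_focal = 0.0, 0.5
theta = np.linspace(-6.0, 6.0, 2001)
dtheta = theta[1] - theta[0]

gap = rasch_p(theta, b_ref) - rasch_p(theta, b_focal)
signed_area = gap.sum() * dtheta        # numerical integral of the gap

print(f"signed area = {signed_area:.3f} (difficulty shift = 0.5)")
```

The numerical integral recovers the 0.5-logit shift, which is why SA and the Rasch DIF parameter can be treated as equivalent effect sizes for uniform DIF.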
KRASH Parametric Sensitivity Study - Transport Category Airplanes
1987-12-01
[Figure residue, not reproducible in text: pilot and copilot pelvis vertical-acceleration traces versus incremental velocity change (AV, ft/sec).] Figure 3-65. DC-7 Test, Measured Acceleration, Eight
Fire Accident Testing Evaluation (FATE)
NASA Technical Reports Server (NTRS)
Ross, H. D.; Mell, W.; Pettegrew, R.; Hicks, M.; Urban, D.
2001-01-01
By performing parametric experiments both in normal gravity and in reduced gravity aboard the KC-135 aircraft, and by developing and analyzing related models, the project will pursue generalizable interpretations of the experimental findings, along with direct recommendations for fire safety practices and policies on spacecraft and in Martian habitats. This is the principal value of the research.
Code of Federal Regulations, 2012 CFR
2012-07-01
... selected for initial performance testing and defined within a group of similar emission units in accordance... similar air pollution control device applied to each similar emission unit within a defined group using... emission units within group “k”; Pi = Daily average parametric monitoring parameter value corresponding to...
Code of Federal Regulations, 2014 CFR
2014-07-01
... selected for initial performance testing and defined within a group of similar emission units in accordance... similar air pollution control device applied to each similar emission unit within a defined group using... emission units within group “k”; Pi = Daily average parametric monitoring parameter value corresponding to...
Influence of the Level Density Parametrization on the Effective GDR Width at High Spins
NASA Astrophysics Data System (ADS)
Mazurek, K.; Matejska, M.; Kmiecik, M.; Maj, A.; Dudek, J.
Parametrizations of the nucleonic level densities are tested by computing the effective GDR strength functions and GDR widths at high spins. The calculations are based on the thermal shape fluctuation method with the Lublin-Strasbourg Drop (LSD) model. Results for 106Sn, 147Eu, 176W, and 194Hg are compared to the experimental data.
D.J. Nicolsky; V.E. Romanovsky; G.G. Panteleev
2008-01-01
A variational data assimilation algorithm is developed to reconstruct thermal properties, porosity, and the parametrization of the unfrozen water content for fully saturated soils. The algorithm is tested with simulated synthetic temperatures. The simulations are performed to determine the robustness and sensitivity of the algorithm in estimating soil properties from in-situ...
Parametric vs. non-parametric daily weather generator: validation and comparison
NASA Astrophysics Data System (ADS)
Dubrovsky, Martin
2016-04-01
Because climate models (GCMs and RCMs) fail to reproduce the real-world surface weather regime satisfactorily, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. Stochastic weather generators are among the most favoured downscaling methods, capable of producing realistic (observation-like) meteorological inputs for agrological, hydrological and other impact models used in assessing the sensitivity of various ecosystems to climate change/variability. Among their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run at various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a 1st-order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbours resampling technique, making no assumption on the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maxima of the duration of hot/cold/dry/wet spells); and (b) selected validation statistics developed within the framework of the VALUE project.
The tests will be based on observational weather series from several European stations available from the ECA&D database.
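The parametric precipitation component of a Richardson-type generator like M&Rfi can be sketched as a first-order Markov chain for wet/dry occurrence combined with a Gamma distribution for wet-day amounts. All parameter values below are hypothetical; the sketch is not the M&Rfi generator itself.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical transition probabilities and Gamma parameters.
p_wet_after_dry = 0.25
p_wet_after_wet = 0.60
gamma_shape, gamma_scale = 0.8, 6.0     # wet-day amounts, mm

def generate(n_days):
    """Daily precipitation: Markov occurrence, Gamma amounts."""
    wet = False
    precip = np.zeros(n_days)
    for day in range(n_days):
        p = p_wet_after_wet if wet else p_wet_after_dry
        wet = rng.random() < p
        if wet:
            precip[day] = rng.gamma(gamma_shape, gamma_scale)
    return precip

series = generate(36500)                # ~100 years of daily values
wet_frac = (series > 0).mean()
print(f"wet-day fraction: {wet_frac:.3f}")
print(f"mean wet-day amount: {series[series > 0].mean():.2f} mm")
```

The simulated wet-day fraction converges to the stationary probability of the two-state chain, 0.25/(1-0.60+0.25) ≈ 0.385, and the mean wet-day amount to shape × scale = 4.8 mm, which is the kind of climatological statistic such validation exercises compare against observations.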
An Oil-Free Thrust Foil Bearing Facility Design, Calibration, and Operation
NASA Technical Reports Server (NTRS)
Bauman, Steve
2005-01-01
New testing capabilities are needed in order to foster thrust foil air bearing technology development and aid its transition into future Oil-Free gas turbines. This paper describes a new test apparatus capable of testing thrust foil air bearings up to 100 mm in diameter at speeds to 80,000 rpm and temperatures to 650 C (1200 F). Measured parameters include bearing torque, load capacity, and bearing temperatures. This data will be used for design performance evaluations and for validation of foil bearing models. Preliminary test results demonstrate that the rig is capable of testing thrust foil air bearings under a wide range of conditions which are anticipated in future Oil-Free gas turbines. Torque as a function of speed and temperature corroborates results expected from rudimentary performance models. A number of bearings were intentionally failed with no resultant damage whatsoever to the test rig. Several test conditions (specific speeds and loads) revealed undesirable axial shaft vibrations which have been attributed to the magnetic bearing control system and are under study. Based upon these preliminary results, this test rig will be a valuable tool for thrust foil bearing research, parametric studies and technology development.
Experimental and analytical comparison of flowfields in a 110 N (25 lbf) H2/O2 rocket
NASA Technical Reports Server (NTRS)
Reed, Brian D.; Penko, Paul F.; Schneider, Steven J.; Kim, Suk C.
1991-01-01
A gaseous hydrogen/gaseous oxygen 110 N (25 lbf) rocket was examined through the RPLUS code using the full Navier-Stokes equations with finite-rate chemistry. Performance tests were conducted on the rocket in an altitude test facility. Preliminary parametric analyses were performed for a range of mixture ratios and fuel film cooling percentages. It is shown that the computed values of specific impulse and characteristic exhaust velocity follow the trend of the experimental data. Specific impulse computed by the code is lower than the comparable test values by about two to three percent. The computed characteristic exhaust velocity values are lower than the comparable test values by three to four percent. Thrust coefficients computed by the code are found to be within two percent of the measured values. It is concluded that the discrepancy between computed and experimental performance values could not be attributed to experimental uncertainty.
Lockwood, Alan H; Weissenborn, Karin; Bokemeyer, Martin; Tietge, U; Burchert, Wolfgang
2002-03-01
Many cirrhotics have abnormal neuropsychological test scores. To define the anatomical-physiological basis for encephalopathy in nonalcoholic cirrhotics, we performed resting-state fluorodeoxyglucose positron emission tomographic scans and administered a neuropsychological test battery to 18 patients and 10 controls. Statistical parametric mapping correlated changes in regional glucose metabolism with performance on the individual tests and a composite battery score. In patients without overt encephalopathy, poor performance correlated with reductions in metabolism in the anterior cingulate. In all patients, poor performance on the battery was positively correlated (p < 0.001) with glucose metabolism in bifrontal and biparietal regions of the cerebral cortex and negatively correlated with metabolism in hippocampal, lingual, and fusiform gyri and the posterior putamen. Similar patterns of abnormal metabolism were found when comparing the patients to 10 controls. Metabolic abnormalities in the anterior attention system and association cortices mediating executive and integrative function form the pathophysiological basis for mild hepatic encephalopathy.
Catalytic ignition of hydrogen/oxygen
NASA Technical Reports Server (NTRS)
Green, James M.; Zurawski, Robert L.
1988-01-01
An experimental program was conducted to evaluate the catalytic ignition of gaseous hydrogen and oxygen. Shell 405 granular catalyst and a unique monolithic sponge catalyst were tested. Mixture ratio, mass flow rate, propellant inlet temperature, and back pressure were varied parametrically in testing to determine the operational limits of a catalytic igniter. The test results showed that the gaseous hydrogen/oxygen propellant combination can be ignited catalytically using Shell 405 catalyst over a wide range of mixture ratios, mass flow rates, and propellant injection temperatures. These operating conditions must be optimized to ensure reliable ignition for an extended period of time. The results of the experimental program and the established operational limits for a catalytic igniter using both the granular and monolithic catalysts are presented. The capabilities of a facility constructed to conduct the igniter testing and the advantages of a catalytic igniter over other ignition systems for gaseous hydrogen and oxygen are also discussed.
NASA Technical Reports Server (NTRS)
1972-01-01
Materials and design technology of the all-silica LI-900 rigid surface insulation (RSI) thermal protection system (TPS) concept for the shuttle spacecraft is presented. All results of contract development efforts are documented. Engineering design and analysis of RSI strain arrestor plate material selections, sizing, and weight studies are reported. A shuttle prototype test panel was designed, analyzed, fabricated, and delivered. Thermophysical and mechanical properties of LI-900 were experimentally established and reported. Environmental tests, including simulations of shuttle loads represented by thermal response, turbulent duct, convective cycling, and chemical tolerance tests are described and results reported. Descriptions of material test samples and panels fabricated for testing are included. Descriptions of analytical sizing and design procedures are presented in a manner formulated to allow competent engineering organizations to perform rational design studies. Results of parametric studies involving material and system variables are reported. Material performance and design data are also delineated.
Bayesian multivariate hierarchical transformation models for ROC analysis.
O'Malley, A James; Zou, Kelly H
2006-02-15
A Bayesian multivariate hierarchical transformation model (BMHTM) is developed for receiver operating characteristic (ROC) curve analysis based on clustered continuous diagnostic outcome data with covariates. Two special features of this model are that it incorporates non-linear monotone transformations of the outcomes and that multiple correlated outcomes may be analysed. The mean, variance, and transformation components are all modelled parametrically, enabling a wide range of inferences. The general framework is illustrated by focusing on two problems: (1) analysis of the diagnostic accuracy of a covariate-dependent univariate test outcome requiring a Box-Cox transformation within each cluster to map the test outcomes to a common family of distributions; (2) development of an optimal composite diagnostic test using multivariate clustered outcome data. In the second problem, the composite test is estimated using discriminant function analysis and compared to the test derived from logistic regression analysis where the gold standard is a binary outcome. The proposed methodology is illustrated on prostate cancer biopsy data from a multi-centre clinical trial.
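A minimal numerical illustration of the ROC analysis discussed above: for a univariate test outcome, the empirical AUC equals the Mann-Whitney probability that a randomly chosen diseased outcome exceeds a randomly chosen healthy one, and under a binormal model it has a closed form. The data below are simulated stand-ins, not the prostate biopsy data.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)

# Hypothetical diagnostic outcome: higher values indicate disease.
healthy = rng.normal(0.0, 1.0, 400)
diseased = rng.normal(1.2, 1.0, 400)

# Empirical AUC via the Mann-Whitney statistic: the probability that a
# randomly chosen diseased outcome exceeds a randomly chosen healthy one.
auc = (diseased[:, None] > healthy[None, :]).mean()

# Binormal AUC for comparison: Phi(delta / sqrt(2)) for separation delta
# in units of the common standard deviation.
binormal_auc = 0.5 * (1.0 + erf((1.2 / sqrt(2.0)) / sqrt(2.0)))

print(f"empirical AUC = {auc:.3f}, binormal AUC = {binormal_auc:.3f}")
```

The agreement between the empirical and binormal AUC is what transformation-based models exploit: once outcomes are mapped to a common (e.g., normal) family within each cluster, the binormal form applies.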
Quantitative Image Analysis Techniques with High-Speed Schlieren Photography
NASA Technical Reports Server (NTRS)
Pollard, Victoria J.; Herron, Andrew J.
2017-01-01
Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. The output of the standard implementations of these visualization techniques in test facilities is often limited to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, they often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow. Such systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade in many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze high-speed schlieren and shadowgraph imagery captured during wind tunnel testing in order to quantify the frequency content of the observed unsteady flow. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
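One simple way to quantify unsteady-flow frequency content from high-speed imagery, in the spirit of the techniques described (though not necessarily the authors' exact method), is to Fourier-transform the gray-level time series of a pixel in the region of interest. The sampling rate, oscillation frequency, and noise level below are invented.

```python
import numpy as np

# Hypothetical surrogate for a high-speed schlieren record: the gray
# level of one pixel sampled at 10 kHz while a shock oscillates at 180 Hz.
fs = 10_000.0
t = np.arange(8192) / fs
rng = np.random.default_rng(5)
pixel = 0.5 + 0.2 * np.sin(2 * np.pi * 180.0 * t) + 0.05 * rng.normal(size=t.size)

# Power spectrum of the mean-removed trace; the spectral peak gives the
# dominant unsteady-flow frequency.
spec = np.abs(np.fft.rfft(pixel - pixel.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
f_peak = freqs[np.argmax(spec)]
print(f"dominant frequency: {f_peak:.1f} Hz")
```

Averaging such spectra over many pixels, or over windowed segments of the record, trades frequency resolution for noise rejection in the same way as conventional spectral estimation.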
On the design of innovative heterogeneous tests using a shape optimization approach
NASA Astrophysics Data System (ADS)
Aquino, J.; Campos, A. Andrade; Souto, N.; Thuillier, S.
2018-05-01
The development of full-field measurement methods has enabled a new trend of mechanical tests. By providing the inhomogeneous strain field of a test, these techniques are being widely used in sheet metal identification strategies through heterogeneous mechanical tests. In this work, a heterogeneous mechanical test with an innovative tool/specimen shape, capable of producing rich heterogeneous strain paths that provide extensive information on material behavior, is sought. The specimen is found using a shape optimization process in which a dedicated indicator that evaluates the richness of the strain information is used. The methodology and results presented here are made independent of the specimen geometry and of its parametrization through the use of the Ritz method for boundary value problems. Different curve models, such as Splines, B-Splines and NURBS, are used, and C1 continuity throughout the specimen is guaranteed. Moreover, various optimization methods, deterministic and stochastic, are used in order to find the method, or a combination of methods, able to effectively minimize the cost function.
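As an illustration of the curve models mentioned, the sketch below evaluates a clamped cubic B-spline boundary from hypothetical control points with SciPy; a cubic B-spline is C2 continuous in its interior, which more than satisfies the C1 requirement stated above.

```python
import numpy as np
from scipy.interpolate import BSpline

# Hypothetical specimen boundary defined by five 2-D control points.
degree = 3
ctrl = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, -0.5], [3.0, 0.8], [4.0, 0.0]])
n_ctrl = len(ctrl)

# Clamped knot vector (degree+1 repeated end knots) so the curve
# interpolates the first and last control points.
knots = np.concatenate([[0.0] * degree,
                        np.linspace(0.0, 1.0, n_ctrl - degree + 1),
                        [1.0] * degree])
curve = BSpline(knots, ctrl, degree)

u = np.linspace(0.0, 1.0, 201)
pts = curve(u)                       # (201, 2) array of boundary points
print("start:", pts[0], "end:", pts[-1])
```

In a shape-optimization loop, the control points would be the design variables; moving them reshapes the boundary smoothly while the knot vector and continuity class stay fixed.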
Evaluation of dispersion strengthened nickel-base alloy heat shields for space shuttle application
NASA Technical Reports Server (NTRS)
Johnson, R., Jr.; Killpatrick, D. H.
1973-01-01
The work reported constitutes the first phase of a two-phase program. Vehicle environments having critical effects on the thermal protection system are defined; TD Ni-20Cr material characteristics are reviewed and compared with TD Ni-20Cr produced in previous development efforts; cyclic load, temperature, and pressure effects on TD Ni-20Cr sheet material are investigated; the effects of braze reinforcement in improving the efficiency of spot-welded, diffusion-bonded, or seam-welded joints are evaluated through tests of simple lap-shear joint samples; parametric studies of metallic radiative thermal protection systems are reported; and the design, instrumentation, and testing of full-scale subsize heat shield panels are described. Tests of full-scale subsize panels included simulated meteoroid impact tests; simulated entry-flight aerodynamic heating in an arc-heated plasma stream; programmed differential pressure loads and temperatures simulating mission conditions; and acoustic tests simulating sound levels experienced by heat shields during boost flight. Test results are described, and the performances of two heat shield designs are compared and evaluated.
Machado-Alba, Jorge Enrique; Medina-Morales, Diego Alejandro; Echeverri-Cataño, Luis Felipe
2016-06-01
The results of two scales that measure the quality of life of patients with diabetes mellitus treated with conventional or analogue insulin were evaluated and compared. This was a descriptive, observational, cross-sectional study conducted in a care facility in the cities of Pereira and Manizales, Colombia, between 1 August 2013 and 30 March 2014. A total of 238 patients diagnosed with type 1 or type 2 diabetes mellitus who had been undergoing treatment with conventional or analogue insulin for at least 6 months were included. The results of the Diabetes 39 (D-39, specific) and European Quality of Life-5 Dimensions (EQ-5D, generic) tools were compared. Tests for parametric and non-parametric distributions (Pearson's correlation coefficient, Mann-Whitney U test, Student's t-test and Wilcoxon test) were used. The mean age was 57.7±16.6 years. Conventional insulin was prescribed to 69.6% of patients, and analogue insulin to 30.4%. The D-39 showed 24.7% of subjects with a high quality of life. No statistically significant differences were found when comparing patients prescribed conventional or analogue insulin (p=0.35; 95% confidence interval [CI]: 0.375-1.419). In the EQ-5D survey, 45.7% claimed to have a high quality of life, without statistically significant differences between groups (p=0.56; 95% CI: 0.676-2.047). No differences between patients receiving conventional versus analogue insulin were detected in terms of quality of life. The group aged over 60 years requires special attention to improve their quality of life, and programs should focus on those individuals. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
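The parametric/non-parametric pairing used in the study can be sketched with SciPy on simulated quality-of-life scores; the group sizes and score distributions below are invented, not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Hypothetical quality-of-life scores (0-100) for two treatment groups.
conventional = rng.normal(62.0, 12.0, 160)
analogue = rng.normal(63.0, 12.0, 70)

# Parametric comparison (assumes near-normal scores; Welch correction
# relaxes the equal-variance assumption) ...
t_stat, p_t = stats.ttest_ind(conventional, analogue, equal_var=False)
# ... and its non-parametric counterpart based on ranks.
u_stat, p_u = stats.mannwhitneyu(conventional, analogue)

print(f"Welch t-test p = {p_t:.3f}, Mann-Whitney U p = {p_u:.3f}")
```

Reporting both tests, as the study does, guards the conclusion against a violated normality assumption: when the two p-values agree, the distributional assumption is unlikely to be driving the result.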
NASA Technical Reports Server (NTRS)
Clayton, Louie
2004-01-01
This paper provides a discussion of the history of Carbon Cloth Phenolic (CCP) ply lifting in the Redesigned Solid Rocket Motor (RSRM) Program, a brief presentation of the theoretical methods used for analytical evaluation, and results of parametric analyses of CCP material subject to test conditions of the Laser Hardened Material Evaluation Laboratory (LHMEL). CCP ply lift can occur in regions of the RSRM nozzle where the ply angle to the flame surface is generally less than about 20 degrees. The likelihood and severity of the condition depend on heating rate, with higher heating rates generally producing more ply lift. The event occurs in-depth, near the heated surface, where the load necessary to mechanically separate the CCP plies is produced by the initial stages of pyrolysis gas generation due to the thermal decomposition of the phenolic resin matrix. Due to the shallow lay-up angle of the composite, normal components of the in-depth mechanical load, due to "pore pressure", are imparted primarily as a cross-ply tensile force on the interlaminar ply boundaries. Tensile capability in the cross-ply (out-of-plane) direction is solely determined by the matrix material capability. The elevated-temperature matrix material capabilities are overcome by pressure-induced mechanical normal stress, and ply lift occurs. A theoretical model used for CCP in-depth temperature, pressure, and normal stress prediction, based on first principles, is briefly discussed, followed by a parametric evaluation of response variables subject to boundary conditions typical of ongoing test programs at the LHMEL facility. Model response demonstrates general trends observed in testing and provides insight into the interactivity of material properties and constitutive relationships.
Vigan, Marie; Stirnemann, Jérôme; Mentré, France
2014-05-01
Analysis of repeated time-to-event data is increasingly performed in pharmacometrics using parametric frailty models. The aims of this simulation study were (1) to assess the estimation performance of the Stochastic Approximation Expectation Maximization (SAEM) algorithm in MONOLIX and of the Adaptive Gaussian Quadrature (AGQ) and Laplace algorithms in PROC NLMIXED of SAS, and (2) to evaluate the properties of tests of a dichotomous covariate on the occurrence of events. The simulation setting is inspired by an analysis of the occurrence of bone events after the initiation of treatment with imiglucerase in patients with Gaucher disease (GD). We simulated repeated events with an exponential model and various dropout rates: none, low, or high. Several values of the baseline hazard model, variability, number of subjects, and effect of the covariate were studied. For each scenario, 100 datasets were simulated for estimation performance and 500 for test performance. We evaluated estimation performance through relative bias and relative root mean square error (RRMSE). We studied the properties of the Wald and likelihood ratio tests (LRT). We used these methods to analyze the occurrence of bone events in patients with GD after starting an enzyme replacement therapy. SAEM with three chains and the AGQ algorithm provided good estimates of the parameters, much better than SAEM with one chain and Laplace, which often provided poor estimates. Despite a small number of repeated events, SAEM with three chains and AGQ gave small biases and RRMSE. Type I errors were close to 5%, and power varied as expected for SAEM with three chains and AGQ. The probability of having at least one event under treatment was 19.1%.
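The kind of data simulated in such a study can be sketched as exponential gap times whose rate combines a baseline hazard, a subject-level frailty, and a dichotomous covariate effect. All numerical values below are hypothetical, and the sketch omits dropout.

```python
import numpy as np

rng = np.random.default_rng(23)

# Hypothetical repeated time-to-event model: constant baseline hazard,
# log-normal frailty per subject, covariate halving the hazard.
n_subjects, horizon = 200, 5.0          # subjects, years of follow-up
base_hazard, beta, omega = 0.5, np.log(0.5), 0.3

counts = []
for i in range(n_subjects):
    frailty = np.exp(rng.normal(0.0, omega))
    treated = i % 2 == 0
    hazard = base_hazard * frailty * (np.exp(beta) if treated else 1.0)
    t, n_events = 0.0, 0
    while True:
        t += rng.exponential(1.0 / hazard)   # exponential gap times
        if t > horizon:
            break
        n_events += 1
    counts.append((treated, n_events))

mean_treated = np.mean([c for tr, c in counts if tr])
mean_control = np.mean([c for tr, c in counts if not tr])
print(f"mean events: treated {mean_treated:.2f}, control {mean_control:.2f}")
```

Fitting a frailty model to many such simulated datasets and comparing the estimated beta and its test statistics to the true value is exactly the bias/RRMSE and type I error/power exercise the abstract describes.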
NASA Technical Reports Server (NTRS)
Sapp, Clyde A.; See, Thomas H.; Zolensky, Michael E.
1992-01-01
During the 3-month deintegration of the LDEF, the M&D SIG generated approximately 5000 digital color stereo image pairs of impact-related features from all space-exposed surfaces. Currently, these images are being processed at JSC to yield more accurate feature information. Work is currently underway to determine the minimum number of data points necessary to parametrically define impact crater morphologies, in order to minimize the man-hour-intensive task of tie-point selection. Initial attempts at deriving accurate crater depth and diameter measurements from binocular imagery were based on the assumption that the crater geometries were best defined by paraboloids. We made no assumptions regarding the crater depth/diameter ratios but instead allowed each crater to define its own coefficients by performing a least-squares fit based on user-selected tie points. Initial test cases resulted in larger errors than desired, so it was decided to test our basic assumption that the crater geometries could be parametrically defined as paraboloids. The method for testing this assumption was to carefully slice test craters (experimentally produced in an appropriate aluminum alloy) vertically through the center, resulting in a readily visible cross-section of the crater geometry. Initially, five separate craters were cross-sectioned in this fashion. A digital image of each cross-section was then created, and the 2-D crater geometry was hand-digitized to create a table of XY positions for each crater. A 2nd-order (parabolic) polynomial was fitted to the data using a least-squares approach. The differences between the fitted equation and the actual data were fairly significant, and easily large enough to account for the errors found in the 3-D fits. The differences between the curve fit and the actual data were consistent between the craters. This consistency suggested that the differences were due to the fact that a parabola did not sufficiently define the generic crater geometry.
Fourth- and sixth-order polynomials were then fitted to each crater cross-section, and significantly better estimates of the crater geometry were obtained with each fit. Work is presently underway to determine the best way to make use of this new parametric crater definition.
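The order comparison described above can be reproduced in miniature with NumPy: fit polynomials of increasing order to a synthetic cross-section and compare residuals. The profile below is invented; a parabola cannot match its flat floor, while a quartic can.

```python
import numpy as np

# Hypothetical digitized crater cross-section: a profile with a flatter
# bottom and steeper rim walls than any parabola can represent.
x = np.linspace(-1.0, 1.0, 101)
depth = -(1.0 - x ** 4)          # flat-bottomed "crater", depth 1 at center

def fit_rmse(order):
    """Least-squares polynomial fit and its root-mean-square residual."""
    coeffs = np.polyfit(x, depth, order)
    return np.sqrt(np.mean((np.polyval(coeffs, x) - depth) ** 2))

for order in (2, 4, 6):
    print(f"order {order}: RMSE = {fit_rmse(order):.4f}")
```

The quadratic leaves a systematic residual while the higher orders drive it to numerical zero, mirroring the improvement the LDEF team saw when moving from paraboloid to 4th- and 6th-order crater models.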
Overview of the 1985 NASA Lewis Research Center SP-100 free-piston Stirling engine activities
NASA Technical Reports Server (NTRS)
Slaby, J.
1985-01-01
This effort is keyed on the design, fabrication, assembly, and testing of a 25 kWe Stirling space-power technology-feasibility demonstrator engine. Another facet of the SP-100 project covers the status of a 9000-hr endurance test conducted on a 2 kWe free-piston Stirling/linear alternator system employing hydrostatic gas bearings. Dynamic balancing of the RE-1000 engine (a 1 kWe free-piston Stirling engine) using a passive dynamic absorber will be discussed along with the results of a parametric study showing the relationships of Stirling power converter specific weight and efficiency as functions of Stirling engine heater to cooler temperature ratio. Planned tests will be described covering a hydrodynamic gas bearing concept for potential SP-100 application.
Burroughs, N J; Pillay, D; Mutimer, D
1999-01-01
Bayesian analysis using a virus dynamics model is demonstrated to facilitate hypothesis testing of patterns in clinical time-series. Our Markov chain Monte Carlo implementation demonstrates that the viraemia time-series observed in two sets of hepatitis B patients on antiviral (lamivudine) therapy, chronic carriers and liver transplant patients, are significantly different, overcoming clinical trial design differences that question the validity of non-parametric tests. We show that lamivudine-resistant mutants grow faster in transplant patients than in chronic carriers, which probably explains the differences in emergence times and failure rates between these two sets of patients. Incorporation of dynamic models into Bayesian parameter analysis is of general applicability in medical statistics. PMID:10643081
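The Bayesian machinery can be illustrated with a toy random-walk Metropolis sampler for a one-parameter decay model of log-viraemia under therapy; the model, data, and prior below are hypothetical stand-ins for the paper's full virus-dynamics model.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical toy model: log10 viraemia V(t) = V0 - c*t under therapy;
# infer the clearance rate c from noisy measurements.
t = np.array([0.0, 2.0, 4.0, 7.0, 14.0, 21.0])
true_c, v0 = 0.35, 8.0                 # log10 copies/ml at baseline
y = v0 - true_c * t + rng.normal(0.0, 0.15, t.size)

def log_post(c):
    """Gaussian likelihood with a flat prior on c > 0."""
    if c <= 0:
        return -np.inf
    resid = y - (v0 - c * t)
    return -0.5 * np.sum((resid / 0.15) ** 2)

chain, c = [], 0.1
for _ in range(5000):                  # random-walk Metropolis
    prop = c + rng.normal(0.0, 0.05)
    if np.log(rng.random()) < log_post(prop) - log_post(c):
        c = prop
    chain.append(c)

post = np.array(chain[1000:])          # discard burn-in
print(f"posterior mean c = {post.mean():.3f} (truth {true_c})")
```

Comparing such posterior distributions between two patient groups (e.g., via the posterior of a between-group difference) is the hypothesis-testing role that MCMC plays in the paper, sidestepping the design issues that undermine the non-parametric tests.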
Mid-infrared pulsed laser ultrasonic testing for carbon fiber reinforced plastics.
Kusano, Masahiro; Hatano, Hideki; Watanabe, Makoto; Takekawa, Shunji; Yamawaki, Hisashi; Oguchi, Kanae; Enoki, Manabu
2018-03-01
Laser ultrasonic testing (LUT) can realize contactless and instantaneous non-destructive testing, but its signal-to-noise ratio must be improved in order to measure carbon fiber reinforced plastics (CFRPs). We have developed a mid-infrared (mid-IR) laser source optimal for generating ultrasonic waves in CFRPs by using a wavelength conversion device based on an optical parametric oscillator. This paper reports a comparison of the ultrasonic generation behavior between the mid-IR laser and the Nd:YAG laser. The mid-IR laser generated a significantly larger ultrasonic amplitude in CFRP laminates than a conventional Nd:YAG laser. In addition, our study revealed that the surface epoxy matrix of CFRPs plays an important role in laser ultrasonic generation. Copyright © 2017 Elsevier B.V. All rights reserved.
Parametric study of closed wet cooling tower thermal performance
NASA Astrophysics Data System (ADS)
Qasim, S. M.; Hayder, M. J.
2017-08-01
The present study involves experimental and theoretical analysis to evaluate the thermal performance of a modified Closed Wet Cooling Tower (CWCT). The experimental study included the design, manufacture, and testing of a prototype of a modified counter-flow forced-draft CWCT, the modification being the addition of packing to the conventional CWCT. A series of experiments was carried out at different operational parameters. From an energy-analysis viewpoint, the thermal performance parameters of the tower are: cooling range, tower approach, cooling capacity, thermal efficiency, and heat and mass transfer coefficients. The theoretical study develops Artificial Neural Network (ANN) models to predict the various thermal performance parameters of the tower. Using the experimental data for training and testing, the models were trained with a multi-layer back-propagation algorithm while varying all operational parameters covered in the experimental tests.
NASA Technical Reports Server (NTRS)
Mcdill, Paul L.
1986-01-01
A test program, utilizing a large scale model, was run in the NASA Lewis Research Center 10- by 10-ft wind tunnel to examine the influence on performance of design parameters of turboprop S-duct inlet/diffuser systems. The parametric test program investigated inlet lip thickness, inlet/diffuser cross-sectional geometry, throat design Mach number, and shaft fairing shape. The test program was run at angles of attack to 15 deg and tunnel Mach numbers to 0.35. Results of the program indicate that current design techniques can be used to design inlet/diffuser systems with acceptable total pressure recovery, but several of the design parameters, notably lip thickness (contraction ratio) and shaft fairing cross section, must be optimized to prevent excessive distortion at the compressor face.
Pollution emissions from single swirl-can combustor modules at parametric test conditions
NASA Technical Reports Server (NTRS)
Mularz, E. J.; Wear, J. D.; Verbulecz, P. W.
1975-01-01
Exhaust pollutant emissions were measured from single swirl-can combustor modules operating over a pressure range of 69 to 276 N/sq cm (100 to 400 psia), over a fuel-air ratio range of 0.01 to 0.04, at an inlet air temperature of 733 K (860 F), and at a constant reference velocity of 23.2 m/sec. Many swirl-can module designs were evaluated; the 11 most promising designs exhibited oxides of nitrogen emission levels lower than those from conventional gas-turbine combustors. Although these single-module test results are not necessarily indicative of the performance characteristics of a large array of modules, the results are very promising and offer a number of module designs that should be tested in a full combustor.
Temporal clustering of floods in Germany: Do flood-rich and flood-poor periods exist?
NASA Astrophysics Data System (ADS)
Merz, Bruno; Nguyen, Viet Dung; Vorogushyn, Sergiy
2016-10-01
The repeated occurrence of exceptional floods within a few years, such as the Rhine floods in 1993 and 1995 and the Elbe and Danube floods in 2002 and 2013, suggests that floods in Central Europe may be organized in flood-rich and flood-poor periods. This hypothesis is studied by testing the significance of temporal clustering in flood occurrence (peak-over-threshold) time series for 68 catchments across Germany for the period 1932-2005. To assess the robustness of the results, different methods are used: Firstly, the index of dispersion, which quantifies the departure from a homogeneous Poisson process, is investigated. Further, the time-variation of the flood occurrence rate is derived by non-parametric kernel implementation and the significance of clustering is evaluated via parametric and non-parametric tests. Although the methods give consistent overall results, the specific results differ considerably. Hence, we recommend applying different methods when investigating flood clustering. For flood estimation and risk management, it is of relevance to understand whether clustering changes with flood severity and time scale. To this end, clustering is assessed for different thresholds and time scales. It is found that the majority of catchments show temporal clustering at the 5% significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
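The index of dispersion used above, the variance-to-mean ratio of event counts, can be sketched in a few lines; the annual peak-over-threshold flood counts below are invented for illustration, not taken from the study:

```python
import numpy as np

def dispersion_index(counts):
    """Index of dispersion (variance-to-mean ratio) of event counts.

    For a homogeneous Poisson process the expected value is 1;
    values above 1 indicate overdispersion, i.e. temporal clustering.
    """
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

# Hypothetical annual flood counts for one catchment
clustered = [0, 0, 5, 4, 0, 0, 0, 6, 5, 0]   # flood-rich / flood-poor years
regular   = [2, 2, 2, 2, 2, 2, 2, 2, 2, 2]   # perfectly even occurrence

print(dispersion_index(clustered))  # well above 1: clustering
print(dispersion_index(regular))    # 0: no variability at all
```

Assessing significance, as in the study, would then compare the observed index against its distribution under a homogeneous Poisson process, parametrically or by resampling.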
NASA Astrophysics Data System (ADS)
Jhajharia, Deepak; Yadav, Brijesh K.; Maske, Sunil; Chattopadhyay, Surajit; Kar, Anil K.
2012-01-01
Trends in rainfall, rainy days and 24 h maximum rainfall are investigated using the Mann-Kendall non-parametric test at twenty-four sites of subtropical Assam located in the northeastern region of India. The trends are statistically confirmed by both the parametric and non-parametric methods and the magnitudes of significant trends are obtained through the linear regression test. In Assam, the average monsoon rainfall (rainy days) during the monsoon months of June to September is about 1606 mm (70), which accounts for about 70% (64%) of the annual rainfall (rainy days). On monthly time scales, sixteen and seventeen sites (twenty-one sites each) witnessed decreasing trends in the total rainfall (rainy days), out of which one and three trends (seven trends each) were found to be statistically significant in June and July, respectively. On the other hand, seventeen sites witnessed increasing trends in rainfall in the month of September, but none were statistically significant. In December (February), eighteen (twenty-two) sites witnessed decreasing (increasing) trends in total rainfall, out of which five (three) trends were statistically significant. For the rainy days during the months of November to January, twenty-two or more sites witnessed decreasing trends in Assam, but for nine (November), twelve (January) and eighteen (December) sites, these trends were statistically significant. Although most of the time series show no statistically significant change, these observed changes in rainfall, along with the well-reported climatic warming in the monsoon and post-monsoon seasons, may have implications for human health and water resources management over biodiversity-rich Northeast India.
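As a rough illustration of the Mann-Kendall trend test applied to such rainfall series, here is a minimal sketch using the normal approximation and no tie correction (the rainfall values are invented, not from the study):

```python
import math
from itertools import combinations

def mann_kendall(x):
    """Minimal Mann-Kendall trend test (normal approximation, no ties).

    Returns the S statistic and a two-sided p-value; S > 0 indicates
    an increasing trend, S < 0 a decreasing trend.
    """
    n = len(x)
    # S sums the signs of all pairwise later-minus-earlier differences
    s = sum((xj > xi) - (xj < xi) for xi, xj in combinations(x, 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)   # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return s, p

# Hypothetical monthly rainfall totals (mm) with a clear decline
s, p = mann_kendall([310, 295, 300, 270, 260, 240, 235, 220, 210, 190])
print(s, p)  # S < 0: decreasing trend, small p-value
```

Production analyses would additionally correct the variance for tied values and, often, for serial correlation.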
NASA Astrophysics Data System (ADS)
Meresescu, Alina G.; Kowalski, Matthieu; Schmidt, Frédéric; Landais, François
2018-06-01
The Water Residence Time distribution is the equivalent of the impulse response of a linear system allowing the propagation of water through a medium, e.g. the propagation of rain water from the top of the mountain towards the aquifers. We consider the output aquifer levels as the convolution between the input rain levels and the Water Residence Time, starting with an initial aquifer base level. The estimation of Water Residence Time is important for a better understanding of hydro-bio-geochemical processes and mixing properties of wetlands used as filters in ecological applications, as well as protecting fresh water sources for wells from pollutants. Common methods of estimating the Water Residence Time focus on cross-correlation, parameter fitting and non-parametric deconvolution methods. Here we propose a 1D full-deconvolution, regularized, non-parametric inverse problem algorithm that enforces smoothness and uses constraints of causality and positivity to estimate the Water Residence Time curve. Compared to Bayesian non-parametric deconvolution approaches, it has a fast runtime per test case; compared to the popular and fast cross-correlation method, it produces a more precise Water Residence Time curve even in the case of noisy measurements. The algorithm needs only one regularization parameter to balance between smoothness of the Water Residence Time and accuracy of the reconstruction. We propose an approach on how to automatically find a suitable value of the regularization parameter from the input data only. Tests on real data illustrate the potential of this method to analyze hydrological datasets.
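A minimal sketch of the kind of regularized, non-negative 1D deconvolution described above, assuming a quadratic first-difference smoothness penalty and projected gradient descent (the study's actual algorithm and regularizer may differ):

```python
import numpy as np

def deconvolve_wrt(rain, aquifer, n, lam=1.0, iters=5000):
    """Estimate a causal, non-negative impulse response h of length n
    from input `rain` and output `aquifer` by minimising
        ||A h - aquifer||^2 + lam * ||D h||^2,   h >= 0,
    where A convolves with the rain series and D takes first
    differences (smoothness). Solved by projected gradient descent.
    """
    rain = np.asarray(rain, float)
    aquifer = np.asarray(aquifer, float)
    m = len(aquifer)
    # Causal convolution matrix: (A h)[t] = sum_k rain[t-k] * h[k]
    A = np.zeros((m, n))
    for t in range(m):
        for k in range(min(n, t + 1)):
            A[t, k] = rain[t - k]
    D = np.diff(np.eye(n), axis=0)        # first-difference operator
    H = A.T @ A + lam * D.T @ D           # Hessian of the objective
    step = 1.0 / np.linalg.eigvalsh(H).max()
    h = np.zeros(n)
    for _ in range(iters):
        grad = H @ h - A.T @ aquifer
        h = np.maximum(h - step * grad, 0.0)   # positivity projection
    return h

# Synthetic, noise-free check with a known exponential response
rng = np.random.default_rng(0)
rain = rng.random(200)
h_true = np.exp(-np.arange(10) / 3.0)
aquifer = np.convolve(rain, h_true)[:200]
h_est = deconvolve_wrt(rain, aquifer, n=10, lam=1e-3)
print(np.max(np.abs(h_est - h_true)))  # small reconstruction error
```

The single parameter `lam` plays the role of the paper's regularization parameter, trading smoothness of the response against fidelity to the data.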
Transverse momentum dependent parton distribution and fragmentation functions with QCD evolution
NASA Astrophysics Data System (ADS)
Aybat, S. Mert; Rogers, Ted C.
2011-06-01
We assess the current phenomenological status of transverse momentum dependent (TMD) parton distribution functions (PDFs) and fragmentation functions (FFs) and study the effect of consistently including perturbative QCD (pQCD) evolution. Our goal is to initiate the process of establishing reliable, QCD-evolved parametrizations for the TMD PDFs and TMD FFs that can be used both to test TMD factorization and to search for evidence of the breakdown of TMD factorization that is expected for certain processes. In this article, we focus on spin-independent processes because they provide the simplest illustration of the basic steps and can already be used in direct tests of TMD factorization. Our calculations are based on the Collins-Soper-Sterman (CSS) formalism, supplemented by recent theoretical developments which have clarified the precise definitions of the TMD PDFs and TMD FFs needed for a valid TMD-factorization theorem. Starting with these definitions, we numerically generate evolved TMD PDFs and TMD FFs using as input existing parametrizations for the collinear PDFs, collinear FFs, nonperturbative factors in the CSS factorization formalism, and recent fixed-scale fits. We confirm that evolution has important consequences, both qualitatively and quantitatively, and argue that it should be included in future phenomenological studies of TMD functions. Our analysis is also suggestive of extensions to processes that involve spin-dependent functions such as the Boer-Mulders, Sivers, or Collins functions, which we intend to pursue in future publications. At our website [http://projects.hepforge.org/tmd/], we have made available the tables and calculations needed to obtain the TMD parametrizations presented herein.
NASA Astrophysics Data System (ADS)
Kaucikas, M.; Warren, M.; Michailovas, A.; Antanavicius, R.; van Thor, J. J.
2013-02-01
This paper describes the investigation of an optical parametric oscillator (OPO) set-up based on two beta barium borate (BBO) crystals, where the interplay between the crystal orientations, cut angles and air dispersion substantially influenced the OPO performance, and especially the angular spectrum of the output beam. Theory suggests that if two BBO crystals are used in this type of design, they should be of different cuts. This paper aims to provide an experimental manifestation of this fact. Furthermore, it has been shown that air dispersion produces similar effects and should be taken into account. An x-ray crystallographic indexing of the crystals was performed as an independent test of the above conclusions.
Determination of in vivo mechanical properties of long bones from their impedance response curves
NASA Technical Reports Server (NTRS)
Borders, S. G.
1981-01-01
A mathematical model consisting of a uniform, linear, visco-elastic, Euler-Bernoulli beam to represent the ulna or tibia of the vibrating forearm or leg system is developed. The skin and tissue compressed between the probe and bone is represented by a spring in series with the beam. The remaining skin and tissue surrounding the bone is represented by a visco-elastic foundation with mass. An extensive parametric study is carried out to determine the effect of each parameter of the mathematical model on its impedance response. A system identification algorithm is developed and programmed on a digital computer to determine the parametric values of the model which best simulate the data obtained from an impedance test.
Parametric study of rock pile thermal storage for solar heating and cooling phase 1
NASA Technical Reports Server (NTRS)
Saha, H.
1977-01-01
Test data and an analysis of the heat transfer characteristics of a solar thermal energy storage bed utilizing water-filled cans as the energy storage medium are presented. An attempt was made to optimize can size, can arrangement, and bed flow rates by experimental and analytical means. Liquid-filled cans, as storage media, combine the benefits of solids, such as rocks, and liquids, such as water. It was found that this combination of solid and liquid media shows unique heat transfer and heat content characteristics and is well suited for use with solar air systems for space and hot water heating. An extensive parametric study was made of the heat transfer characteristics of rocks, of other solids, and of solid containers filled with liquids.
Dissipative particle dynamics: Systematic parametrization using water-octanol partition coefficients
NASA Astrophysics Data System (ADS)
Anderson, Richard L.; Bray, David J.; Ferrante, Andrea S.; Noro, Massimo G.; Stott, Ian P.; Warren, Patrick B.
2017-09-01
We present a systematic, top-down, thermodynamic parametrization scheme for dissipative particle dynamics (DPD) using water-octanol partition coefficients, supplemented by water-octanol phase equilibria and pure liquid phase density data. We demonstrate the feasibility of computing the required partition coefficients in DPD using brute-force simulation, within an adaptive semi-automatic staged optimization scheme. We test the methodology by fitting to experimental partition coefficient data for twenty-one small molecules in five classes comprising alcohols and poly-alcohols, amines, ethers and simple aromatics, and alkanes (i.e., hexane). Finally, we illustrate the transferability of a subset of the determined parameters by calculating the critical micelle concentrations and mean aggregation numbers of selected alkyl ethoxylate surfactants, in good agreement with reported experimental values.
Experimental parametric study of jet vortex generators for flow separation control
NASA Technical Reports Server (NTRS)
Selby, Gregory
1991-01-01
A parametric wind-tunnel study was performed with jet vortex generators to determine their effectiveness in controlling flow separation associated with low-speed turbulent flow over a two-dimensional rearward-facing ramp. Results indicate that flow-separation control can be accomplished, with the level of control achieved being a function of jet speed, jet orientation (with respect to the free-stream direction), and orifice pattern (double row of jets vs. single row). Compared to slot blowing, jet vortex generators can provide an equivalent level of flow control over a larger spanwise region (for constant jet flow area and speed). Dye flow visualization tests in a water tunnel indicated that the most effective jet vortex generator configurations produced streamwise co-rotating vortices.
NASA Technical Reports Server (NTRS)
Holdeman, James D. (Technical Monitor); Chiappetta, Louis, Jr.; Hautman, Donald J.; Ols, John T.; Padget, Frederick C., IV; Peschke, William O. T.; Shirley, John A.; Siskind, Kenneth S.
2004-01-01
The low emissions potential of a Rich-Quench-Lean (RQL) combustor for use in the High Speed Civil Transport (HSCT) application was evaluated as part of Work Breakdown Structure (WBS) 1.0.2.7 of the NASA Critical Propulsion Components (CPC) Program under Contract NAS3-27235. Combustion testing was conducted in cell 1E of the Jet Burner Test Stand at United Technologies Research Center. Specifically, a Rich-Quench-Lean combustor, utilizing reduced scale quench technology implemented in a quench vane concept in a product-like configuration (Product Module Rig), demonstrated the capability of achieving an emissions index of nitrogen oxides (NOx EI) of 8.5 gm/Kg fuel at the supersonic flight condition (relative to the program goal of 5 gm/Kg fuel). Developmental parametric testing of various quench vane configurations in the more fundamental flametube, Single Module Rig Configuration, demonstrated NOx EI as low as 5.2. All configurations in both the Product Module Rig configuration and the Single Module Rig configuration demonstrated exceptional efficiencies, greater than 99.95 percent, relative to the program goal of 99.9 percent efficiency at supersonic cruise conditions. Sensitivity of emissions to quench orifice design parameters was determined during the parametric quench vane test series in support of the design of the Product Module Rig configuration. For the rectangular quench orifices investigated, an aspect ratio (length/width) of approximately 2 was found to be near optimum. An optimum for orifice spacing was found to exist at approximately 0.167 inches, resulting in 24 orifices per side of a quench vane, for the 0.435 inch quench zone channel height investigated in the Single Module Rig. Smaller quench zone channel heights appeared to be beneficial in reducing emissions.
Measurements were also obtained in the Single Module Rig configuration on the sensitivity of emissions to the critical combustor parameters of fuel/air ratio, pressure drop, and residence time. Minimal sensitivity was observed for all of these parameters.
Space station structures and dynamics test program
NASA Technical Reports Server (NTRS)
Moore, Carleton J.; Townsend, John S.; Ivey, Edward W.
1987-01-01
The design, construction, and operation of a low-Earth orbit space station poses unique challenges for development and implementation of new technology. The technology arises from the special requirement that the station be built and constructed to function in a weightless environment, where static loads are minimal and secondary to system dynamics and control problems. One specific challenge confronting NASA is the development of a dynamics test program for: (1) defining space station design requirements, and (2) identifying the characterizing phenomena affecting the station's design and development. A general definition of the space station dynamic test program, as proposed by MSFC, forms the subject of this report. The test proposal is a comprehensive structural dynamics program to be launched in support of the space station. The test program will help to define the key issues and/or problems inherent to large space structure analysis, design, and testing. Development of a parametric data base and verification of the math models and analytical analysis tools necessary for engineering support of the station's design, construction, and operation provide the impetus for the dynamics test program. The philosophy is to integrate dynamics into the design phase through extensive ground testing and analytical ground simulations of generic systems, prototype elements, and subassemblies. On-orbit testing of the station will also be used to define its capability.
Beilenhoff, Ulrike; Biering, Holger; Blum, Reinhard; Brljak, Jadranka; Cimbro, Monica; Dumonceau, Jean-Marc; Hassan, Cesare; Jung, Michael; Neumann, Christiane; Pietsch, Michael; Pineau, Lionel; Ponchon, Thierry; Rejchrt, Stanislav; Rey, Jean-François; Schmidt, Verona; Tillett, Jayne; van Hooft, Jeanin
2017-12-01
1. Prerequisites. The clinical service provider should obtain confirmation from the endoscope washer-disinfector (EWD) manufacturer that all endoscopes intended to be used can be reprocessed in the EWD.
2. Installation qualification. This can be performed by different parties, but national guidelines should define who has the responsibilities, taking into account legal requirements.
3. Operational qualification. This should include parametric tests to verify that the EWD is working according to its specifications.
4. Performance qualification. Testing of cleaning performance, microbiological testing of routinely used endoscopes, and the quality of the final rinse water should be considered in all local guidelines. The extent of these tests depends on local requirements. According to the results of type testing performed during EWD development, other parameters can be tested if local regulatory authorities accept this. Chemical residues on endoscope surfaces should be searched for, if acceptable test methods are available.
5. Routine inspections. National guidelines should consider both technical and performance criteria. Individual risk analyses performed in the validation and requalification processes are helpful for defining appropriate test frequencies for routine inspections. © Georg Thieme Verlag KG Stuttgart · New York.
Miller, Ezer; Huppert, Amit; Novikov, Ilya; Warburg, Alon; Hailu, Asrat; Abbasi, Ibrahim; Freedman, Laurence S
2015-11-10
In this work, we describe a two-stage sampling design to estimate the infection prevalence in a population. In the first stage, an imperfect diagnostic test was performed on a random sample of the population. In the second stage, a different imperfect test was performed in a stratified random sample of the first sample. To estimate infection prevalence, we assumed conditional independence between the diagnostic tests and developed method-of-moments estimators based on expectations of the proportions of people with positive and negative results on both tests, which are functions of the tests' sensitivity, specificity, and the infection prevalence. A closed-form solution of the estimating equations was obtained assuming a specificity of 100% for both tests. We applied our method to estimate the infection prevalence of visceral leishmaniasis according to two quantitative polymerase chain reaction tests performed on blood samples taken from 4756 patients in northern Ethiopia. The sensitivities of the tests were also estimated, as well as the standard errors of all estimates, using a parametric bootstrap. We also examined the impact of departures from our assumptions of 100% specificity and conditional independence on the estimated prevalence. Copyright © 2015 John Wiley & Sons, Ltd.
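Under the stated assumptions (conditional independence and 100% specificity for both tests), the closed-form method-of-moments solution can be sketched as follows; this simplified version ignores the two-stage stratified sampling, and the numerical inputs are invented for illustration:

```python
def mom_prevalence(p1, p2, p12):
    """Method-of-moments estimates for two conditionally independent
    diagnostic tests, each assumed to have 100% specificity.

    Under these assumptions the observed positive proportions satisfy
        p1 = pi * se1,   p2 = pi * se2,   p12 = pi * se1 * se2,
    which solve in closed form for the prevalence pi and the two
    sensitivities se1, se2.
    """
    pi = p1 * p2 / p12
    se1 = p12 / p2
    se2 = p12 / p1
    return pi, se1, se2

# Hypothetical example generated from pi = 0.20, se1 = 0.90, se2 = 0.80
pi, se1, se2 = mom_prevalence(p1=0.18, p2=0.16, p12=0.144)
print(pi, se1, se2)  # recovers approximately 0.2, 0.9, 0.8
```

In practice, standard errors for these estimates would come from a parametric bootstrap, as in the study.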
An Adaptive Genetic Association Test Using Double Kernel Machines
Zhan, Xiang; Epstein, Michael P.; Ghosh, Debashis
2014-01-01
Recently, gene set-based approaches have become very popular in gene expression profiling studies for assessing how genetic variants are related to disease outcomes. Since most genes are not differentially expressed, existing pathway tests considering all genes within a pathway suffer from considerable noise and power loss. Moreover, for a differentially expressed pathway, it is of interest to select important genes that drive the effect of the pathway. In this article, we propose an adaptive association test using double kernel machines (DKM), which can both select important genes within the pathway as well as test for the overall genetic pathway effect. This DKM procedure first uses the garrote kernel machines (GKM) test for the purposes of subset selection and then the least squares kernel machine (LSKM) test for testing the effect of the subset of genes. An appealing feature of the kernel machine framework is that it can provide a flexible and unified method for multi-dimensional modeling of the genetic pathway effect allowing for both parametric and nonparametric components. This DKM approach is illustrated with application to simulated data as well as to data from a neuroimaging genetics study. PMID:26640602
ERIC Educational Resources Information Center
Osler, James Edward
2015-01-01
This monograph provides a neuroscience-based systemological, epistemological, and methodological rationale for the design of an advanced and novel parametric statistical analytic for the biological sciences referred to as "Biotrichotomy". The aim of this new arena of statistics is to provide dual metrics designed to analyze the…
Temperature can play a significant role in the efficacy of solidifiers in removing oil slicks on water. We studied and quantified the effect of temperature on the performance of several solidifiers using 5 different types of oils under a newly developed testing protocol by condu...
ERIC Educational Resources Information Center
Schwarz, Wolf
2006-01-01
Paradigms used to study the time course of the redundant signals effect (RSE; J. O. Miller, 1986) and temporal order judgments (TOJs) share many important similarities and address related questions concerning the time course of sensory processing. The author of this article proposes and tests a new aggregate diffusion-based model to quantitatively…
A Quasi-Parametric Method for Fitting Flexible Item Response Functions
ERIC Educational Resources Information Center
Liang, Longjuan; Browne, Michael W.
2015-01-01
If standard two-parameter item response functions are employed in the analysis of a test with some newly constructed items, it can be expected that, for some items, the item response function (IRF) will not fit the data well. This lack of fit can also occur when standard IRFs are fitted to personality or psychopathology items. When investigating…
Regression Is a Univariate General Linear Model Subsuming Other Parametric Methods as Special Cases.
ERIC Educational Resources Information Center
Vidal, Sherry
Although the concept of the general linear model (GLM) has existed since the 1960s, other univariate analyses such as the t-test and the analysis of variance models have remained popular. The GLM produces an equation that minimizes the mean differences of independent variables as they are related to a dependent variable. From a computer printout…
ERIC Educational Resources Information Center
Perelman, Sergio; Santin, Daniel
2011-01-01
The aim of the present paper is to examine the observed differences in Students' test performance across public and private-voucher schools in Spain. For this purpose, we explicitly consider that education is a multi-input multi-output production process subject to inefficient behaviors, which can be identified at student level using a parametric…
Jet-induced ground effects on a parametric flat-plate model in hover
NASA Technical Reports Server (NTRS)
Wardwell, Douglas A.; Hange, Craig E.; Kuhn, Richard E.; Stewart, Vearl R.
1993-01-01
The jet-induced forces generated on short takeoff and vertical landing (STOVL) aircraft when in close proximity to the ground can have a significant effect on aircraft performance. Therefore, accurate predictions of these aerodynamic characteristics are highly desirable. Empirical procedures for estimating jet-induced forces during the vertical/short takeoff and landing (V/STOL) portions of the flight envelope are currently limited in accuracy. The jet-induced force data presented significantly add to the current STOVL configurations data base. Further development of empirical prediction methods for jet-induced forces, to provide more configuration diversity and improved overall accuracy, depends on the viability of this STOVL data base. The data base may also be used to validate computational fluid dynamics (CFD) analysis codes. The hover data obtained at the NASA Ames Jet Calibration and Hover Test (JCAHT) facility for a parametric flat-plate model is presented. The model tested was designed to allow variations in the planform aspect ratio, number of jets, nozzle shape, and jet location. There were 31 different planform/nozzle configurations tested. Each configuration had numerous pressure taps installed to measure the pressures on the undersurface of the model. All pressure data along with the balance jet-induced lift and pitching-moment increments are tabulated. For selected runs, pressure data are presented in the form of contour plots that show lines of constant pressure coefficient on the model undersurface. Nozzle-thrust calibrations and jet flow-pressure survey information are also provided.
Sharma, Prashant P; Santiago, Marc A; Kriebel, Ricardo; Lipps, Savana M; Buenavente, Perry A C; Diesmos, Arvin C; Janda, Milan; Boyer, Sarah L; Clouse, Ronald M; Wheeler, Ward C
2017-01-01
The taxonomy and systematics of the armored harvestmen (suborder Laniatores) are based on various sets of morphological characters pertaining to shape, armature, pedipalpal setation, and the number of articles of the walking leg tarsi. Few studies have tested the validity of these historical character systems in a comprehensive way, with reference to an independent data class, i.e., molecular sequence data. We examined as a test case the systematics of Podoctidae, a family distributed throughout the Indo-Pacific. We tested the validity of the three subfamilies of Podoctidae using a five-locus phylogeny, and examined the evolution of dorsal shape as a proxy for taxonomic utility, using parametric shape analysis. Here we show that two of the three subfamilies, Ibaloniinae and Podoctinae, are non-monophyletic, with the third subfamily, Erecananinae, recovered as non-monophyletic in a subset of analyses. Various genera were also recovered as non-monophyletic. As first steps toward revision of Podoctidae, the subfamilies Erecananinae Roewer, 1912 and Ibaloniinae Roewer, 1912 are synonymized with Podoctinae Roewer, 1912 new synonymies, thereby abolishing unsubstantiated subfamilial divisions within Podoctidae. We once again synonymize the genus Paralomanius Goodnight & Goodnight, 1948 with Lomanius Roewer, 1923 revalidated. We additionally show that eggs carried on the legs of male Podoctidae are not conspecific to the males, falsifying the hypothesis of paternal care in this group. Copyright © 2016 Elsevier Inc. All rights reserved.
Cummings, Brian J; Engesser-Cesar, Christie; Cadena, Gilbert; Anderson, Aileen J
2007-02-27
Locomotor impairments after spinal cord injury (SCI) are often assessed using open-field rating scales. These tasks have the advantage of spanning the range from complete paralysis to normal walking; however, they lack sensitivity at specific levels of recovery. Additionally, most supplemental assessments were developed in rats, not mice. For example, the horizontal ladder beam has been used to measure recovery in the rat after SCI. This parametric task results in a videotaped archival record of the event, is easily administered, and is unambiguously scored. Although a ladder beam apparatus for mice is available, its use in the assessment of recovery in SCI mice is rare, possibly because normative data for uninjured mice and the type of step misplacements injured mice exhibit is lacking. We report the development of a modified ladder beam instrument and scoring system to measure hindlimb recovery in vertebral T9 contusion spinal cord injured mice. The mouse ladder beam allows for the use of standard parametric statistical tests to assess locomotor recovery. Ladder beam performance is consistent across four strains of mice, there are no sex differences, and inter-rater reliability between observers is high. The ladder beam score is proportional to injury severity and can be used to easily separate mice capable of weight-supported stance up to mice with consistent forelimb to hindlimb coordination. Critically, horizontal ladder beam testing discriminates between mice that score identically in terms of stepping frequency in open-field testing.
Marable, Brian R; Maurissen, Jacques P J
2004-01-01
Neurotoxicity regulatory guidelines mandate that automated test systems be validated using chemicals. However, in some cases, chemicals may not necessarily be needed to prove test system validity. To examine this issue, two independent experiments were conducted to validate an automated auditory startle response (ASR) system. In Experiment 1, we used adult (PND 63) and weanling (PND 22) Sprague-Dawley rats (10/sex/dose) to determine the effect of either d-amphetamine (4.0 or 8.0 mg/kg) or clonidine (0.4 or 0.8 mg/kg) on the ASR peak amplitude (ASR PA). The startle response of each rat to a short burst of white noise (120 dB SPL) was recorded over 50 consecutive trials. The ASR PA was significantly decreased (by clonidine) and increased (by d-amphetamine) compared to controls in PND 63 rats. In PND 22 rats, the response to clonidine was similar to adults, but d-amphetamine effects were not significant. Neither drug affected the rate of the decrease in ASR PA over time (habituation). In Experiment 2, PND 31 Sprague-Dawley rats (8/sex) were presented with 150 trials consisting of either white noise bursts of variable intensity (70-120 dB SPL in 10 dB increments, presented in random order) or null (0 dB SPL) trials. Statistically significant sex- and intensity-dependent differences were detected in the ASR PA. These results suggest that in some cases, parametric modulation may be an alternative to using chemicals for test system validation.
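The dose-group comparisons described above rest on analysis of variance. A minimal one-way ANOVA F statistic, computed on hypothetical startle amplitudes rather than the study's data, can be sketched as:

```python
from statistics import mean

def one_way_anova_F(groups):
    """Between-group to within-group mean-square ratio for k groups."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)   # df = k - 1
    ms_within = ss_within / (N - k)     # df = N - k
    return ms_between / ms_within

# Hypothetical peak startle amplitudes (arbitrary units) per dose group
control = [100, 110, 95, 105]
low_dose = [120, 130, 125, 135]
high_dose = [150, 145, 155, 160]
F = one_way_anova_F([control, low_dose, high_dose])
```

F would be referred to an F(k-1, N-k) distribution; real analyses would also model sex and age as factors, which this sketch omits.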
Booth, Brian G; Keijsers, Noël L W; Sijbers, Jan; Huysmans, Toon
2018-05-03
Pedobarography produces large sets of plantar pressure samples that are routinely subsampled (e.g. using regions of interest) or aggregated (e.g. center of pressure trajectories, peak pressure images) in order to simplify statistical analysis and provide intuitive clinical measures. We hypothesize that these data reductions discard gait information that can be used to differentiate between groups or conditions. To test the hypothesis of null information loss, we created an implementation of statistical parametric mapping (SPM) for dynamic plantar pressure datasets (i.e. plantar pressure videos). Our SPM software framework brings all plantar pressure videos into anatomical and temporal correspondence, then performs statistical tests at each sampling location in space and time. Novelly, we introduce non-linear temporal registration into the framework in order to normalize for timing differences within the stance phase. We refer to our software framework as STAPP: spatiotemporal analysis of plantar pressure measurements. Using STAPP, we tested our hypothesis on plantar pressure videos from 33 healthy subjects walking at different speeds. As walking speed increased, STAPP was able to identify significant decreases in plantar pressure at mid-stance from the heel through the lateral forefoot. The extent of these plantar pressure decreases has not previously been observed using existing plantar pressure analysis techniques. We therefore conclude that the subsampling of plantar pressure videos - a task which led to the discarding of gait information in our study - can be avoided using STAPP. Copyright © 2018 Elsevier B.V. All rights reserved.
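The STAPP framework itself is not reproduced here, but its core idea, a statistical test at every sampling location with a family-wise correction across locations, can be sketched with a max-statistic permutation test, a common nonparametric stand-in for the random-field corrections used in SPM. The tiny "frames" and group labels below are invented for illustration:

```python
import random
from statistics import mean

def pixelwise_stat(group_a, group_b):
    """Mean-difference statistic at every pixel of flattened frames."""
    n_pix = len(group_a[0])
    return [mean(s[p] for s in group_a) - mean(s[p] for s in group_b)
            for p in range(n_pix)]

def max_stat_permutation(group_a, group_b, n_perm=500, seed=1):
    """Null distribution of the maximum |statistic| across pixels.

    Controls the family-wise error rate over all pixels jointly.
    """
    rng = random.Random(seed)
    pooled = group_a + group_b
    na = len(group_a)
    null_max = []
    for _ in range(n_perm):
        rng.shuffle(pooled)
        stat = pixelwise_stat(pooled[:na], pooled[na:])
        null_max.append(max(abs(v) for v in stat))
    return null_max

# Hypothetical 4-pixel "frames": pixel 2 carries a real group difference
slow = [[1.0, 1.1, 3.0, 0.9], [1.2, 0.9, 3.2, 1.0], [0.9, 1.0, 2.9, 1.1]]
fast = [[1.1, 1.0, 1.0, 1.0], [0.8, 1.1, 1.2, 0.9], [1.0, 0.9, 0.8, 1.1]]
observed = pixelwise_stat(slow, fast)
null_max = max_stat_permutation(slow, fast)
# corrected p-value per pixel: fraction of permutations whose max beats it
p_corr = [sum(m >= abs(o) for m in null_max) / len(null_max) for o in observed]
```

With only three subjects per group the corrected p-values cannot go below 0.1, which illustrates why real spatiotemporal analyses need far more subjects and samples.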
To Invest or Not to Invest, That Is the Question: Analysis of Firm Behavior under Anticipated Shocks
Kovac, Dejan; Vukovic, Vuk; Kleut, Nikola; Podobnik, Boris
2016-01-01
When companies are faced with an upcoming and expected economic shock some of them tend to react better than others. They adapt by initiating investments thus successfully weathering the storm, while others, even though they possess the same information set, fail to adopt the same business strategy and eventually succumb to the crisis. We use a unique setting of the recent financial crisis in Croatia as an exogenous shock that hit the country with a time lag, allowing the domestic firms to adapt. We perform a survival analysis on the entire population of 144,000 firms in Croatia during the period from 2003 to 2015, and test whether investment prior to the anticipated shock makes firms more likely to survive the recession. We find that small and micro firms, which decided to invest, had between 60 and 70% higher survival rates than similar firms that chose not to invest. This claim is supported by both non-parametric and parametric tests in the survival analysis. From a normative perspective this finding could be important in mitigating the negative effects on aggregate demand during strong recessionary periods. PMID:27508896
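The nonparametric side of such a survival analysis usually starts from Kaplan-Meier curves. A from-scratch sketch on invented firm-level follow-up data, not the Croatian registry data, is:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times  -- follow-up time for each firm (years until exit or censoring)
    events -- 1 if the firm failed at that time, 0 if censored
    Returns a list of (time, survival probability) steps.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for (tt, e) in data if tt == t)
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        # everyone observed at this time (failed or censored) leaves the risk set
        at_risk -= sum(1 for (tt, _) in data if tt == t)
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve

# Hypothetical follow-up (years, failed?) for investing vs non-investing firms
invest_curve = kaplan_meier([2, 3, 5, 7, 9, 9], [0, 1, 0, 0, 0, 1])
no_invest_curve = kaplan_meier([1, 2, 2, 4, 6, 8], [1, 1, 1, 0, 1, 0])
```

In this toy data the investing group ends with a higher survival probability; a log-rank test or a parametric (e.g. Weibull) model would then formalize the comparison.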
NASA Astrophysics Data System (ADS)
Cernesson, Flavie; Tournoud, Marie-George; Lalande, Nathalie
2018-06-01
Among the various parameters monitored in river monitoring networks, bioindicators provide very informative data. Analysing time variations in bioindicator data is tricky for water managers because the data sets are often short, irregular, and non-normally distributed. It is also a challenging methodological issue for scientists, as in the Saône basin (30,000 km², France), where, between 1998 and 2010, of 812 IBGN (French macroinvertebrate bioindicator) monitoring stations, only 71 time series had more than 10 data values and were studied here. Combining various analytical tools (three parametric and non-parametric statistical tests plus a graphical analysis), 45 IBGN time series were classified as stationary and 26 as non-stationary (only one of which showed degradation). Series from sampling stations located within the same hydroecoregion showed similar trends, while river size classes seemed non-significant in explaining temporal trends. From a methodological point of view, combining statistical tests and graphical analysis is therefore a relevant option when striving to improve trend detection. Moreover, it was possible to propose a way of summarising series in order to analyse links between ecological river quality indicators and land-use stressors.
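A standard non-parametric rank-based tool for detecting monotonic trends in series like these is the Mann-Kendall test. A minimal implementation using the normal approximation and ignoring tie corrections follows; the IBGN-like series are invented:

```python
import math

def mann_kendall(x):
    """Mann-Kendall S statistic and two-sided p-value
    (normal approximation, no correction for ties)."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)   # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # two-sided p from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return s, p

# Hypothetical bioindicator scores: an improving series vs a stationary one
s_up, p_up = mann_kendall([9, 10, 11, 11.5, 12, 13, 14, 15, 15.5, 16])
s_flat, p_flat = mann_kendall(
    [12, 13, 11.8, 12.5, 11.5, 13.2, 12.1, 12.4, 11.9, 12.8])
```

The strictly increasing series yields the maximum S for n = 10 and a small p-value, while the stationary series does not; real analyses should add the tie correction to var_s.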
NASA Technical Reports Server (NTRS)
Prakash, OM, II
1991-01-01
Three linear controllers are designed to regulate the end effector of the Space Shuttle Remote Manipulator System (SRMS) operating in Position Hold Mode. In this mode of operation, jet firings of the Orbiter can be treated as disturbances while the controller tries to keep the end effector stationary in an Orbiter-fixed reference frame. The three design techniques used are the Linear Quadratic Regulator (LQR), H2 optimization, and H-infinity optimization. The nonlinear SRMS is linearized by modelling the effects of the significant nonlinearities as uncertain parameters. Each regulator design is evaluated for robust stability in light of the parametric uncertainties using both the small gain theorem with an H-infinity norm and the less conservative μ-analysis test. All three regulator designs offer significant improvement over the current system on the nominal plant. Unfortunately, even after dropping performance requirements and designing exclusively for robust stability, robust stability cannot be achieved. The SRMS suffers from lightly damped poles with real parametric uncertainties. Such a system renders the μ-analysis test, which allows for complex perturbations, too conservative.
Development of a subway operation incident delay model using accelerated failure time approaches.
Weng, Jinxian; Zheng, Yang; Yan, Xuedong; Meng, Qiang
2014-12-01
This study develops a subway operational incident delay model using the parametric accelerated failure time (AFT) approach. Six parametric AFT models, including log-logistic, lognormal and Weibull models with fixed and random parameters, are built based on Hong Kong subway operation incident data from 2005 to 2012. In addition, a Weibull model with gamma heterogeneity is considered for comparison of model performance. The goodness-of-fit test results show that the log-logistic AFT model with random parameters is most suitable for estimating subway incident delay. The results show that a longer subway operation incident delay is highly correlated with the following factors: power cable failure, signal cable failure, turnout communication disruption and crashes involving a casualty. Vehicle failure has the least impact on the increase of subway operation incident delay. Based on these results, several possible measures, such as the use of short-distance wireless communication technology (e.g., Wi-Fi and ZigBee), are suggested to shorten the delay caused by subway operation incidents. Finally, the temporal transferability test results show that the developed log-logistic AFT model with random parameters is stable over time. Copyright © 2014 Elsevier Ltd. All rights reserved.
Selecting a separable parametric spatiotemporal covariance structure for longitudinal imaging data.
George, Brandon; Aban, Inmaculada
2015-01-15
Longitudinal imaging studies allow great insight into how the structure and function of a subject's internal anatomy change over time. Unfortunately, the analysis of longitudinal imaging data is complicated by inherent spatial and temporal correlation: the temporal from the repeated measures, and the spatial from the outcomes of interest being observed at multiple points in a patient's body. We propose the use of a linear model with a separable parametric spatiotemporal error structure for the analysis of repeated imaging data. The model makes use of spatial (exponential, spherical, and Matérn) and temporal (compound symmetric, autoregressive-1, Toeplitz, and unstructured) parametric correlation functions. A simulation study, inspired by a longitudinal cardiac imaging study of mitral regurgitation patients, compared different information criteria for selecting a particular separable parametric spatiotemporal correlation structure, as well as the effects on type I and type II error rates for inference on fixed effects when the specified model is incorrect. Information criteria were found to be highly accurate at choosing between separable parametric spatiotemporal correlation structures. Misspecification of the covariance structure was found to inflate the type I error rate or produce an overly conservative test size, which corresponded to decreased power. An example with clinical data illustrates how the covariance structure selection procedure can be performed in practice, as well as how the choice of covariance structure can change inferences about fixed effects. Copyright © 2014 John Wiley & Sons, Ltd.
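Selecting between candidate covariance structures by information criteria reduces to comparing penalized log-likelihoods. A sketch with invented log-likelihood values and parameter counts, not the paper's fits:

```python
import math

def aic(log_lik, k):
    """Akaike information criterion: lower is better."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """Bayesian information criterion: penalizes parameters more as n grows."""
    return k * math.log(n) - 2 * log_lik

# Hypothetical fits of two separable spatiotemporal covariance structures
# to the same data set of n = 200 observations:
#   model 1: exponential (space) x AR(1) (time), 5 covariance parameters
#   model 2: Matern (space) x unstructured (time), 12 covariance parameters
n = 200
ll_exp_ar1, k1 = -512.4, 5
ll_matern_un, k2 = -508.9, 12

fits = {
    "exponential x AR(1)": (aic(ll_exp_ar1, k1), bic(ll_exp_ar1, k1, n)),
    "Matern x unstructured": (aic(ll_matern_un, k2), bic(ll_matern_un, k2, n)),
}
best_by_aic = min(fits, key=lambda m: fits[m][0])
```

Here the richer structure improves the log-likelihood by 3.5 but costs 7 extra parameters, so both AIC and BIC prefer the simpler exponential x AR(1) structure; with REML estimation the comparison is only valid for models sharing the same fixed effects.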
NASA Astrophysics Data System (ADS)
Fernández-Llamazares, Álvaro; Belmonte, Jordina; Delgado, Rosario; De Linares, Concepción
2014-04-01
Airborne pollen records are a suitable indicator for the study of climate change. The present work focuses on the role of annual pollen indices in the detection of bioclimatic trends through the analysis of the aerobiological spectra of 11 taxa of great biogeographical relevance in Catalonia over an 18-year period (1994-2011), by means of different parametric and non-parametric statistical methods. Among others, two non-parametric rank-based statistical tests were performed to detect monotonic trends in the time series of the selected airborne pollen types, and we observed that they have similar power in detecting trends. Except in those cases in which the pollen data can be well modelled by a normal distribution, it is better to apply non-parametric statistical methods in aerobiological studies. Our results provide a reliable representation of the pollen trends in the region and suggest that greater pollen quantities have been released into the atmosphere in recent years, especially by Mediterranean taxa such as Pinus, Total Quercus and Evergreen Quercus, although the trends may differ geographically. Longer aerobiological monitoring periods are required to corroborate these results and to survey the increasing levels of certain pollen types that could have a public health impact.
Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S
2015-04-01
To investigate whether using a parametric statistic to compare groups leads to different conclusions when using summative scores from rating scales compared with their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined using varying sample sizes, scale difficulties and person ability conditions. The simulation revealed scaling artefacts that can arise from using summative scores rather than Rasch-based measures to determine changes between groups. The group differences in change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of inferences about group differences based on summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which allow valid parametric analyses of rating scale data.
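The ceiling mechanism behind such scaling artefacts can be reproduced in a few lines: two groups with identical true improvement, one starting near the top of a bounded summative scale. Everything below (scale bounds, sample sizes, effect sizes) is invented for illustration and is not the paper's simulation design:

```python
import random
from statistics import mean

def f_statistic(a, b):
    """One-way ANOVA F for two groups (equivalent to a squared t)."""
    grand = mean(a + b)
    ssb = len(a) * (mean(a) - grand) ** 2 + len(b) * (mean(b) - grand) ** 2
    ssw = sum((x - mean(a)) ** 2 for x in a) + sum((x - mean(b)) ** 2 for x in b)
    return ssb / (ssw / (len(a) + len(b) - 2))

def summative(v, lo=0, hi=40):
    """Rating-scale total: latent value rounded and clamped to the scale."""
    return min(hi, max(lo, round(v)))

rng = random.Random(7)
sig_sum = sig_latent = 0
n_rep = 200
for _ in range(n_rep):
    pre_a = [rng.gauss(20, 3) for _ in range(30)]
    pre_b = [rng.gauss(37, 3) for _ in range(30)]  # group B starts near the ceiling
    gain_a = [rng.gauss(5, 1) for _ in range(30)]  # identical true improvement
    gain_b = [rng.gauss(5, 1) for _ in range(30)]
    # change measured on the bounded summative scale vs the latent scale
    chg_sum_a = [summative(p + g) - summative(p) for p, g in zip(pre_a, gain_a)]
    chg_sum_b = [summative(p + g) - summative(p) for p, g in zip(pre_b, gain_b)]
    sig_sum += f_statistic(chg_sum_a, chg_sum_b) > 4.0  # ~F(1,58) cutoff at .05
    sig_latent += f_statistic(gain_a, gain_b) > 4.0
rate_sum, rate_latent = sig_sum / n_rep, sig_latent / n_rep
```

The summative change scores produce spurious "significant" group differences in most replicates because the ceiling compresses group B's gains, while the latent (interval-level, Rasch-like) gains are significant only at roughly the nominal 5% rate.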
NASA Astrophysics Data System (ADS)
Vidya Sagar, R.; Raghu Prasad, B. K.
2012-03-01
This article presents a review of recent developments in parameter-based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers, including various methods and models developed in AE testing of concrete structures. The aim is to provide an overview of the specific features of parameter-based AE techniques for concrete structures carried out over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published, and considerable attention has been given to those publications. Some recent studies, such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams, are also discussed. The formation of the fracture process zone and the AE energy released during the fracture process in concrete beam specimens are summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parameter-based AE techniques applied to concrete structures may help researchers and engineers to better understand the failure mechanism of concrete and to evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.
NASA Astrophysics Data System (ADS)
Kralik, Martin
2017-04-01
The application of nitrogen and oxygen isotopes in nitrate makes it possible, under favourable circumstances, to identify potential sources such as precipitation, chemical fertilisers and manure or sewage water. Without an additional tracer, distinguishing nitrate from manure from that in sewage water remains difficult. Even the application of boron isotopes cannot always avoid ambiguous interpretation. Therefore, the Environment Agency Austria developed a new multi-parametrical indicator test to allow the identification and quantification of pollution by domestic sewage water. The test analyses 8 substances well known to occur in sewage water: acesulfame and sucralose (two artificial, calorie-free sweeteners), benzotriazole and tolyltriazole (two industrial chemicals/corrosion inhibitors), and metoprolol, sotalol, carbamazepine and the metabolite 10,11-dihydro-10,11-dihydroxycarbamazepine (pharmaceuticals) [1]. These substances are polar, and their degradation in the aquatic system by microbiological processes is not documented. These 8 substances do not occur naturally, which makes them ideal tracers. The test can detect wastewater in the analysed water sample down to 0.1%. Coupling these analytical tests helps to identify the nitrogen sources in the groundwater body Marchfeld, east of Vienna, to a high confidence level. In addition, the results allow a reasonable quantification of nitrogen sources from different types of fertilizers as well as sewage water contributions close to villages and in wells recharged by bank filtration. Recent investigations of groundwater in selected wells in Marchfeld [2] indicated a clear nitrogen contribution from wastewater leakages (sewers or septic tanks) to the total nitrogen budget. However, this contribution is shrinking, and the main source still comes from agricultural activities. [1] Humer, F.; Weiss, S.; Reinnicke, S.; Clara, M.; Grath, J.; Windhofer, G. (2013): Multi parametrical indicator test for urban wastewater influence. 
EGU General Assembly 2013, held 7-12 April, 2013 in Vienna, Austria, id. EGU2013-5332, EGU2013-5332. [2] Kralik, M.; Humer, F. & Grath, J. (2008): Pilotprojekt Grundwasseralter: Herkunftsanalyse von Nitrat mittels Stickstoff-, Sauerstoff-, Schwefel und Kohlenstoffisotopen. 57 S.2, Environment Agency Austria/Ministry of Agriculture, Forestry, Environment and Water Management, Vienna.
Development of a thermal storage module using modified anhydrous sodium hydroxide
NASA Technical Reports Server (NTRS)
Rice, R. E.; Rowny, P. E.
1980-01-01
The laboratory scale testing of a modified anhydrous NaOH latent heat storage concept for small solar thermal power systems, such as total energy systems utilizing organic Rankine systems, is discussed. A diagnostic test on the thermal energy storage module and an investigation of alternative heat transfer fluids and heat exchange concepts are specifically addressed. A previously developed computer simulation model is modified to predict the performance of the module in a solar total energy system environment. In addition, the computer model is expanded to investigate parametrically the incorporation of a second heat exchanger inside the module which will vaporize and superheat the Rankine cycle power fluid.
Design of a CO2 Twin Rotary Compressor for a Heat Pump Water Heater
NASA Astrophysics Data System (ADS)
Ahn, Jong Min; Kim, Woo Young; Kim, Hyun Jin; Cho, Sung Oug; Seo, Jong Cheun
2010-06-01
For a CO2 heat pump water heater, a one-stage twin rotary compressor has been designed. As a design tool, a computer simulation program for the compressor performance has been developed. Validation of the simulation program was carried out for a bench model compressor in a compressor calorimeter. Cooling capacity and compressor input power agreed reasonably well between the simulation and the calorimeter test. Good agreement on the P-V diagram between the simulation and the test was also obtained. With this validated compressor simulation program, a parametric study was performed to arrive at optimum dimensions for the compression chamber.
NASA Astrophysics Data System (ADS)
Freire, Paulo; Wex, Norbert
In this talk, we present a re-parameterization of the Shapiro delay as observed in the timing of radio pulses of binary pulsars. We express the Shapiro delay as a sum of harmonics of the orbital period of the system, and use the harmonic coefficients as the main parameters of a much improved description of the effect. This includes a superior description of the constraints on the masses and orbital inclination introduced by a measurement of the Shapiro delay. In some cases (which we discuss) this leads to dramatically improved parametric tests of general relativity with binary pulsars.
Han, Meng; Wang, Na; Guo, Shifang; Chang, Nan; Lu, Shukuan; Wan, Mingxi
2018-07-01
Both thermal and mechanical HIFU ablation techniques associated with cavitation have been developed for noninvasive treatment. A specific challenge for the successful clinical implementation of HIFU is to achieve real-time imaging for the evaluation and determination of therapy outcomes such as necrosis or homogenization. Ultrasound Nakagami-m parametric imaging mitigates the degrading shadowing effects of bubbles and can be used for tissue characterization. The aim of this study is to investigate the performance of Nakagami-m parametric imaging for evaluating and differentiating thermal coagulation and cavitation erosion induced by HIFU. Lesions were induced in bovine serum albumin (BSA) phantoms and ex vivo porcine livers using a 1.6 MHz single-element transducer. Thermal and mechanical lesions, induced by two types of HIFU sequences respectively, were evaluated using Nakagami-m parametric imaging and ultrasound B-mode imaging. The lesion sizes estimated with the Nakagami-m parametric imaging technique were all closer to the actual sizes than those from B-mode imaging. The p-value from the t-test between the mean m values of thermal coagulation and cavitation erosion was smaller than 0.05, demonstrating that the m values of thermal lesions differed significantly from those of mechanical lesions. This was confirmed by the ex vivo experiments, and histologic examination showed that the two HIFU exposures produce different changes: tissue dehydration resulting from the thermal effect, and tissue homogenization resulting from the mechanical effect. This study demonstrates that Nakagami-m parametric imaging is a potential real-time imaging technique for evaluating and differentiating thermal coagulation and cavitation erosion. Copyright © 2018 Elsevier B.V. All rights reserved.
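The Nakagami m parameter is commonly estimated from the backscattered envelope by the moment-based inverse-normalized-variance estimator, m = (E[R²])² / Var(R²). A sketch on synthetic envelopes, Rayleigh speckle versus a heavier-tailed pre-Rayleigh mixture, not the study's ultrasound data:

```python
import math
import random

def nakagami_m(envelope):
    """Moment-based (inverse normalized variance) estimate of Nakagami m."""
    r2 = [r * r for r in envelope]
    mu = sum(r2) / len(r2)
    var = sum((v - mu) ** 2 for v in r2) / len(r2)
    return mu * mu / var

rng = random.Random(3)
# Rayleigh speckle: envelope^2 is exponential, so m should be near 1
rayleigh = [math.sqrt(rng.expovariate(1.0)) for _ in range(20000)]
# mixture with occasional strong scatterers -> heavier tail, pre-Rayleigh (m < 1)
pre_rayleigh = [math.sqrt(rng.expovariate(1.0 if rng.random() < 0.8 else 0.1))
                for _ in range(20000)]
m_ray = nakagami_m(rayleigh)
m_pre = nakagami_m(pre_rayleigh)
```

A parametric image is obtained by applying this estimator within a sliding window over the envelope data; regions disrupted by cavitation tend toward smaller m than coagulated tissue.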
Parametric instability, inverse cascade and the range of solar-wind turbulence
NASA Astrophysics Data System (ADS)
Chandran, Benjamin D. G.
2018-02-01
In this paper, weak-turbulence theory is used to investigate the nonlinear evolution of the parametric instability in three-dimensional low-β plasmas at wavelengths much greater than the ion inertial length, under the assumption that slow magnetosonic waves are strongly damped. It is shown analytically that the parametric instability leads to an inverse cascade of Alfvén wave quanta, and several exact solutions to the wave kinetic equations are presented. The main results of the paper concern the parametric decay of Alfvén waves that initially satisfy $e^+ \gg e^-$, where $e^+$ and $e^-$ are the frequency ($f$) spectra of Alfvén waves propagating in opposite directions along the magnetic field lines. If $e^+$ initially has a peak frequency $f_0$ (at which $e^+$ is maximized) and an 'infrared' scaling $f^p$ at smaller $f$, then $e^+$ acquires an $f^{-1}$ scaling throughout a range of frequencies that spreads out in both directions from $f_0$. At the same time, $e^-$ acquires an $f^{-2}$ scaling within this same frequency range. If the plasma parameters and infrared $e^+$ spectrum are chosen to match conditions in the fast solar wind at a heliocentric distance of 0.3 astronomical units (AU), then the nonlinear evolution of the parametric instability leads to an $e^+$ spectrum that matches fast-wind measurements from the Helios spacecraft at 0.3 AU, including the observed $f^{-1}$ scaling at $f \lesssim 10^{-4}$ Hz. The results of this paper suggest that the $f^{-1}$ spectrum seen by Helios in the fast solar wind at $f \lesssim 10^{-4}$ Hz is produced in situ by parametric decay, and that the $f^{-1}$ range of $e^+$ extends over an increasingly narrow range of frequencies as the heliocentric distance decreases below 0.3 AU. This prediction will be tested by measurements from the Parker Solar Probe.
Duarte, João Valente; Faustino, Ricardo; Lobo, Mercês; Cunha, Gil; Nunes, César; Ferreira, Carlos; Januário, Cristina; Castelo-Branco, Miguel
2016-10-01
Machado-Joseph disease, the inherited type 3 spinocerebellar ataxia (SCA3), is the most common form worldwide. Neuroimaging and neuropathology have consistently demonstrated cerebellar alterations. Here we aimed to discover whole-brain functional biomarkers based on parametric performance-level-dependent signals. We assessed 13 patients with early SCA3 and 14 healthy participants. We used a combined parametric behavioral/functional neuroimaging design to investigate disease fingerprints as a function of performance levels, coupled with structural MRI and voxel-based morphometry. Functional magnetic resonance imaging (fMRI) was designed to parametrically analyze behavior and neural responses to audio-paced bilateral thumb movements at temporal frequencies of 1, 3, and 5 Hz. Our performance-level-based design, probing neuronal correlates of motor coordination, revealed that neural activation and behavior show a critical loss of parametric modulation specifically in SCA3, associated with frequency-dependent cortical/subcortical activation/deactivation patterns. Cerebellar/cortical rate-dependent dissociation patterns could clearly differentiate between groups irrespective of grey matter loss. Our findings suggest functional reorganization of the motor network and indicate a possible role for fMRI as a tool to monitor disease progression in SCA3. Accordingly, fMRI patterns proved to be potential biomarkers in early SCA3, as tested by receiver operating characteristic analysis of both behavior and neural activation at different frequencies. Discrimination analysis based on the BOLD signal in response to the applied parametric finger-tapping task frequently reached >80% sensitivity and specificity in single regions of interest. Functional fingerprints based on cerebellar and cortical BOLD performance-dependent signal modulation can thus be combined as diagnostic and/or therapeutic targets in hereditary ataxia. Hum Brain Mapp 37:3656-3668, 2016. 
© 2016 Wiley Periodicals, Inc.
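Receiver operating characteristic analysis of a scalar discriminant, as used above, reduces to rank comparisons between the two groups. A minimal sketch on invented BOLD modulation indices; the values, threshold, and group sizes are not from the study:

```python
def roc_auc(scores_pos, scores_neg):
    """Probability that a positive case scores above a negative one
    (ties count half). This is the Mann-Whitney formulation of ROC AUC."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else 0.5 if p == n else 0.0
    return wins / (len(scores_pos) * len(scores_neg))

def sens_spec(scores_pos, scores_neg, threshold):
    """Sensitivity and specificity at a fixed decision threshold."""
    sens = sum(s >= threshold for s in scores_pos) / len(scores_pos)
    spec = sum(s < threshold for s in scores_neg) / len(scores_neg)
    return sens, spec

# Hypothetical parametric-modulation indices: patients show reduced modulation,
# so controls are treated as the high-scoring ("positive") class here
patients = [0.21, 0.25, 0.30, 0.28, 0.35, 0.22, 0.41, 0.27,
            0.33, 0.26, 0.38, 0.31, 0.24]
controls = [0.55, 0.48, 0.62, 0.51, 0.44, 0.58, 0.47, 0.53,
            0.60, 0.29, 0.50, 0.46, 0.57, 0.49]
auc = roc_auc(controls, patients)
sens, spec = sens_spec(controls, patients, threshold=0.44)
```

Sweeping the threshold over all observed values traces out the full ROC curve; the AUC summarizes discrimination independently of any single cutoff.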
Stellar parametrization from Gaia RVS spectra
NASA Astrophysics Data System (ADS)
Recio-Blanco, A.; de Laverny, P.; Allende Prieto, C.; Fustes, D.; Manteiga, M.; Arcay, B.; Bijaoui, A.; Dafonte, C.; Ordenovic, C.; Ordoñez Blanco, D.
2016-01-01
Context. Among the myriad of data collected by the ESA Gaia satellite, about 150 million spectra will be delivered by the Radial Velocity Spectrometer (RVS) for stars as faint as G_RVS ~ 16. A specific stellar parametrization will be performed on most of these RVS spectra, i.e. those with a high enough signal-to-noise ratio (S/N), which should correspond to single stars brighter than ~14.5 in the RVS band. Some individual chemical abundances will also be estimated for the brightest targets. Aims: We describe the different parametrization codes that have been specifically developed or adapted for RVS spectra within the GSP-Spec working group of the analysis consortium. The tested codes are based on optimisation (FERRE and GAUGUIN), projection (MATISSE), or pattern-recognition methods (artificial neural networks). We present and discuss their expected performances in recovering the stellar atmospheric parameters (effective temperature, surface gravity, overall metallicity) for B- to K-type stars. The performances for determining [α/Fe] ratios are also presented for cool stars. Methods: Each code was homogeneously tested with a large grid of simulated synthetic RVS spectra of BAFGK spectral types (dwarfs and giants), with metallicities varying from 10^-2.5 to 10^+0.5 times the solar metallicity, and taking variations of ±0.4 dex in the composition of the α-elements into consideration. The tests were performed for S/N ranging from 10 to 350. Results: For all the stellar types we considered, stars brighter than G_RVS ~ 12.5 are very efficiently parametrized by the GSP-Spec pipeline, including reliable estimations of [α/Fe]. Typical internal errors for FGK metal-rich and metal-intermediate stars are around 40 K in Teff, 0.10 dex in log(g), 0.04 dex in [M/H], and 0.03 dex in [α/Fe] at G_RVS = 10.3. They degrade to 155 K in Teff, 0.15 dex in log(g), 0.10 dex in [M/H], and 0.1 dex in [α/Fe] at G_RVS ~ 12. 
Similar accuracies in Teff and [M/H] are found for A-type stars, while the log(g) derivation is more accurate (errors of 0.07 and 0.12 dex at G_RVS = 12.6 and 13.4, respectively). For the faintest stars, with G_RVS ≳ 13-14, a Teff input from the spectrophotometrically derived parameters will allow the final GSP-Spec parametrization to be improved. Conclusions: The reported results, while neglecting possible mismatches between synthetic and real spectra, show that the contribution of the RVS-based stellar parameters will be unique in the brighter part of the Gaia survey, allowing for crucial age estimations and accurate chemical abundances. This will constitute a unique and precious sample, providing many pieces of the Milky Way history puzzle with unprecedented precision and statistical relevance.
Uncertainty in determining extreme precipitation thresholds
NASA Astrophysics Data System (ADS)
Liu, Bingjun; Chen, Junfan; Chen, Xiaohong; Lian, Yanqing; Wu, Lili
2013-10-01
Extreme precipitation events are rare and occur mostly on a relatively small, local scale, which makes it difficult to set extreme precipitation thresholds for a large basin. Based on long-term daily precipitation data from 62 observation stations in the Pearl River Basin, this study assessed the applicability of non-parametric, parametric, and detrended fluctuation analysis (DFA) methods in determining the extreme precipitation threshold (EPT), and the certainty of the EPTs from each method. Analyses from this study show that the non-parametric absolute critical value method is easy to use but unable to reflect differences in the spatial distribution of rainfall. The non-parametric percentile method can account for the spatial distribution of precipitation, but its threshold value is sensitive to the size of the rainfall data series and depends on the choice of percentile, making it difficult to determine reasonable threshold values for a large basin. The parametric method can provide the most apt description of extreme precipitation by fitting extreme precipitation distributions with probability distribution functions; however, the selection of probability distribution functions, the goodness-of-fit tests, and the size of the rainfall data series can greatly affect the fitting accuracy. In contrast to the non-parametric and parametric methods, which are unable to provide EPTs with certainty, the DFA method, although computationally involved, has proven to be the most appropriate method, able to provide a unique set of EPTs for a large basin with uneven spatio-temporal precipitation distribution. 
The consistency between the spatial distribution of DFA-based thresholds with the annual average precipitation, the coefficient of variation (CV), and the coefficient of skewness (CS) for the daily precipitation further proves that EPTs determined by the DFA method are more reasonable and applicable for the Pearl River Basin.
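The sensitivity of the non-parametric percentile method described above can be illustrated with a minimal sketch. The percentile value, the wet-day cutoff, and the synthetic gamma-distributed record are all illustrative assumptions, not parameters from the study:

```python
import numpy as np

def percentile_threshold(daily_precip, pct=95.0, wet_day_min=1.0):
    """Non-parametric percentile EPT for a single station.

    daily_precip : 1-D array of daily precipitation totals (mm)
    pct          : percentile of wet-day amounts (illustrative choice)
    wet_day_min  : minimum daily total (mm) counted as a wet day (assumption)
    """
    wet = daily_precip[daily_precip >= wet_day_min]
    return float(np.percentile(wet, pct))

# Toy ~10-year daily record; the resulting threshold shifts with both
# the percentile chosen and the length of the series, which is the
# sensitivity the abstract points out.
rng = np.random.default_rng(0)
series = rng.gamma(shape=0.5, scale=12.0, size=3650)
t95 = percentile_threshold(series, pct=95.0)
t99 = percentile_threshold(series, pct=99.0)
```

Because the threshold is just an order statistic of the station's own record, it adapts to each station's rainfall climate, but two analysts choosing different percentiles (or different record lengths) will obtain different EPTs for the same basin.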
Control structural interaction testbed: A model for multiple flexible body verification
NASA Technical Reports Server (NTRS)
Chory, M. A.; Cohen, A. L.; Manning, R. A.; Narigon, M. L.; Spector, V. A.
1993-01-01
Conventional end-to-end ground tests for verification of control system performance become increasingly complicated with the development of large, multiple flexible body spacecraft structures. The expense of accurately reproducing the on-orbit dynamic environment and the attendant difficulties in reducing and accounting for ground test effects limit the value of these tests. TRW has developed a building block approach whereby a combination of analysis, simulation, and test has replaced end-to-end performance verification by ground test. Tests are performed at the component, subsystem, and system level on engineering testbeds. These tests are aimed at authenticating models to be used in end-to-end performance verification simulations: component and subassembly engineering tests and analyses establish models and critical parameters, unit level engineering and acceptance tests refine models, and subsystem level tests confirm the models' overall behavior. The Precision Control of Agile Spacecraft (PCAS) project has developed a control structural interaction testbed with a multibody flexible structure to investigate new methods of precision control. This testbed is a model for TRW's approach to verifying control system performance. This approach has several advantages: (1) no allocation for test measurement errors is required, increasing flight hardware design allocations; (2) the approach permits greater latitude in investigating off-nominal conditions and parametric sensitivities; and (3) the simulation approach is cost effective, because the investment is in understanding the root behavior of the flight hardware and not in the ground test equipment and environment.
Modeling a failure criterion for U-Mo/Al dispersion fuel
NASA Astrophysics Data System (ADS)
Oh, Jae-Yong; Kim, Yeon Soo; Tahk, Young-Wook; Kim, Hyun-Jung; Kong, Eui-Hyun; Yim, Jeong-Sik
2016-05-01
The breakaway swelling in U-Mo/Al dispersion fuel is known to be caused by large pore formation enhanced by interaction layer (IL) growth between fuel particles and the Al matrix. In this study, a critical IL thickness was defined as a criterion for the formation of a large pore in U-Mo/Al dispersion fuel. Specifically, the critical IL thickness is reached when two neighboring fuel particles come into contact with each other through the developed IL. The model was verified using irradiation data from the RERTR tests and the KOMO-4 test. Application of the model to full-sized sample irradiations such as the IRIS, FUTURE, E-FUTURE, and AFIP-1 tests resulted in conservative predictions. The parametric study revealed that the fuel particle size and the homogeneity of the fuel particle distribution strongly influence fuel performance.
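The contact criterion above can be sketched with elementary geometry. This is not the paper's model: the simple cubic packing assumption and the outward-growth fraction are illustrative parameters chosen here to show why particle size and spacing matter:

```python
import math

def critical_il_thickness(d_fuel_um, vf_fuel, outward_fraction=0.5):
    """Illustrative estimate of the critical IL thickness (um).

    Assumes fuel particles of diameter d_fuel_um (um) arranged on a
    simple cubic lattice at fuel volume fraction vf_fuel. Neighboring
    ILs are taken to meet when their outward growth into the Al matrix
    spans the surface-to-surface gap; outward_fraction is the assumed
    (hypothetical) fraction of the IL thickness that grows outward.
    """
    # Center-to-center spacing for a simple cubic arrangement:
    # vf = (pi/6) * d^3 / spacing^3  =>  spacing = d * (pi / (6 vf))^(1/3)
    spacing = d_fuel_um * (math.pi / (6.0 * vf_fuel)) ** (1.0 / 3.0)
    gap = spacing - d_fuel_um  # surface-to-surface distance between particles
    # ILs advance into the gap from both neighbors.
    return gap / (2.0 * outward_fraction)

# Larger particles at the same volume fraction leave wider gaps,
# so the critical thickness (and thus the margin to pore formation)
# grows with particle size, consistent with the parametric finding.
t_small = critical_il_thickness(30.0, 0.3)
t_large = critical_il_thickness(60.0, 0.3)
```

Under this toy geometry the critical thickness scales linearly with particle diameter at fixed volume fraction, which is one way to see why fuel particle size and distribution homogeneity emerge as the influential parameters.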