Sample records for parametric test series

  1. Parametric, nonparametric and parametric modelling of a chaotic circuit time series

    NASA Astrophysics Data System (ADS)

    Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.

    2000-09-01

    The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In validating a proposed model, one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and the model output are caused by an inappropriate model, by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and, if they are rejected, to suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit, where we obtain an extremely accurate reconstruction of the observed attractor.

  2. Characterizing rainfall of hot arid region by using time-series modeling and sustainability approaches: a case study from Gujarat, India

    NASA Astrophysics Data System (ADS)

    Machiwal, Deepesh; Kumar, Sanjay; Dayal, Devi

    2016-05-01

    This study aimed at characterizing rainfall dynamics in a hot arid region of Gujarat, India by employing time-series modeling techniques and a sustainability approach. Five characteristics, i.e., normality, stationarity, homogeneity, presence/absence of trend, and persistence of the 34-year (1980-2013) annual rainfall time series of ten stations were identified/detected by applying multiple parametric and non-parametric statistical tests. Furthermore, the study introduces the novelty of proposing a sustainability concept for evaluating rainfall time series and demonstrates the concept, for the first time, by identifying the most sustainable rainfall series following the reliability (Ry), resilience (Re), and vulnerability (Vy) approach. Box-whisker plots, normal probability plots, and histograms indicated that the annual rainfall of the Mandvi and Dayapar stations is relatively more positively skewed and non-normal compared with that of the other stations, owing to the presence of severe outliers and extremes. Results of the Shapiro-Wilk and Lilliefors tests revealed that the annual rainfall series of all stations deviated significantly from the normal distribution. Two parametric t tests and the non-parametric Mann-Whitney test indicated significant non-stationarity in the annual rainfall of Rapar station, where the rainfall was also found to be non-homogeneous based on the results of four parametric homogeneity tests. Four trend tests indicated significantly increasing rainfall trends at the Rapar and Gandhidham stations. Autocorrelation analysis suggested statistically significant persistence in the rainfall series of Bhachau (3-year lag), Mundra (1- and 9-year lags), Nakhatrana (9-year lag), and Rapar (3- and 4-year lags). Results of the sustainability approach indicated that the annual rainfall of the Mundra and Naliya stations (Ry = 0.50 and 0.44; Re = 0.47 and 0.47; Vy = 0.49 and 0.46, respectively) is the most sustainable and dependable compared with that of the other stations. The highest values of the sustainability index at the Mundra (0.120) and Naliya (0.112) stations confirmed the earlier findings of the Ry-Re-Vy approach. In general, annual rainfall of the study area is less reliable, less resilient, and moderately vulnerable, which emphasizes the need to develop suitable strategies for managing the area's water resources on a sustainable basis. Finally, it is recommended that multiple statistical tests (at least two) be used in time-series modeling for making reliable decisions. Moreover, the methodology and findings of the sustainability concept in rainfall time series can easily be adopted in other arid regions of the world.
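
    As an illustrative aside (not from the record), a minimal Python sketch of the Ry-Re-Vy computation in the spirit of Hashimoto et al. (1982) follows; the satisfaction threshold (the long-term mean), the sustainability-index form Ry*Re*(1-Vy), and the data are assumptions for demonstration, not the paper's calibration.

        import numpy as np

        def rrv(rain, threshold):
            ok = rain >= threshold                          # satisfactory years
            ry = ok.mean()                                  # reliability
            re = ((~ok[:-1]) & ok[1:]).sum() / max((~ok).sum(), 1)  # resilience
            deficit = (threshold - rain[~ok]) / threshold   # relative shortfalls
            vy = deficit.mean() if deficit.size else 0.0    # vulnerability
            return ry, re, vy, ry * re * (1.0 - vy)         # last: sustainability index

        rain = np.array([310., 420., 150., 500., 90., 610., 330., 280., 700., 120.])
        ry, re, vy, si = rrv(rain, rain.mean())
        print(f"Ry = {ry:.2f}, Re = {re:.2f}, Vy = {vy:.2f}, SI = {si:.3f}")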

  3. Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 5, Appendix D

    NASA Technical Reports Server (NTRS)

    Klute, A.

    1979-01-01

    The electrical characterization and qualification test results are presented for the RCA MWS5001D random access memory. The tests included functional tests, AC and DC parametric tests, an AC parametric worst-case pattern selection test, determination of the worst-case transition for setup and hold times, and a series of schmoo plots. Average input high current, worst-case input high current, output low current, and data setup time are some of the results presented.

  4. Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 4, Appendix C

    NASA Technical Reports Server (NTRS)

    Klute, A.

    1979-01-01

    The electrical characterization and qualification test results are presented for the RCA MWS5001D random access memory. The tests included functional tests, AC and DC parametric tests, an AC parametric worst-case pattern selection test, determination of the worst-case transition for setup and hold times, and a series of schmoo plots. Statistical analysis data are supplied along with write pulse width, read cycle time, write cycle time, and chip enable time data.

  5. Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 2, Appendix A

    NASA Technical Reports Server (NTRS)

    Klute, A.

    1979-01-01

    The electrical characterization and qualification test results are presented for the RCA MWS5001D random access memory. The tests included functional tests, AC and DC parametric tests, an AC parametric worst-case pattern selection test, determination of the worst-case transition for setup and hold times, and a series of schmoo plots. The address access time, address readout time, data hold time, and data setup time are some of the results surveyed.

  6. Observed changes in relative humidity and dew point temperature in coastal regions of Iran

    NASA Astrophysics Data System (ADS)

    Hosseinzadeh Talaee, P.; Sabziparvar, A. A.; Tabari, Hossein

    2012-12-01

    The analysis of trends in hydroclimatic parameters and the assessment of their statistical significance have recently received considerable attention in efforts to clarify whether an obvious climate change is underway. In the current study, the parametric linear regression and nonparametric Mann-Kendall tests were applied to detect annual and seasonal trends in the relative humidity (RH) and dew point temperature (Tdew) time series at ten coastal weather stations in Iran during 1966-2005. The serial structure of the data was considered, and significant serial correlations were eliminated using the trend-free pre-whitening method. The results showed that annual RH increased by 1.03 and 0.28 %/decade at the northern and southern coastal regions of the country, respectively, while annual Tdew increased by 0.29 and 0.15 °C per decade at the northern and southern regions, respectively. Significant trends were frequent in the Tdew series, but were observed in only 2 out of the 50 RH series. The results showed that the difference between the results of the parametric and nonparametric tests was small, although the parametric test detected larger significant trends in the RH and Tdew time series. Furthermore, the differences between the results of the trend tests were not related to the normality of the statistical distribution.
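
    For readers unfamiliar with pairing the Mann-Kendall test with trend-free pre-whitening, a minimal Python sketch follows; the Sen-slope detrending and lag-1 pre-whitening steps follow the commonly used Yue et al. recipe, which is assumed here rather than taken from this paper, and the input series is synthetic.

        import numpy as np
        from scipy.stats import norm

        def sen_slope(x):
            n = len(x)
            return np.median([(x[j] - x[i]) / (j - i)
                              for i in range(n) for j in range(i + 1, n)])

        def mann_kendall_z(x):
            n = len(x)
            s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
            var = n * (n - 1) * (2 * n + 5) / 18.0        # variance of S (no ties)
            z = (s - np.sign(s)) / np.sqrt(var) if s != 0 else 0.0
            return z, 2.0 * (1.0 - norm.cdf(abs(z)))      # two-sided p-value

        def tfpw_mk(x):
            b = sen_slope(x)                              # Sen's slope
            d = x - b * np.arange(len(x))                 # detrend
            r1 = np.corrcoef(d[:-1], d[1:])[0, 1]         # lag-1 autocorrelation
            pw = d[1:] - r1 * d[:-1]                      # pre-whiten residuals
            return mann_kendall_z(pw + b * np.arange(1, len(x)))  # add trend back

        rng = np.random.default_rng(1)
        series = 60.0 + 0.4 * np.arange(40) + rng.normal(0.0, 3.0, 40)
        z, p = tfpw_mk(series)
        print(f"MK Z = {z:.2f}, p = {p:.4f}")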

  7. Characterization of Vertical Impact Device Acceleration Pulses Using Parametric Assessment: Phase IV Dual Impact Pulses

    DTIC Science & Technology

    2017-01-04

    ...configurations with a restrained manikin, was evaluated in four different test series. Test Series 1 was conducted to determine the materials and... 5 ms TTP. Test Series 2 was conducted to determine the materials and drop heights required for energy attenuation of the seat pan to generate a 4 m

  8. Trend analysis of Arctic sea ice extent

    NASA Astrophysics Data System (ADS)

    Silva, M. E.; Barbosa, S. M.; Antunes, Luís; Rocha, Conceição

    2009-04-01

    The extent of Arctic sea ice is a fundamental parameter of Arctic climate variability. In the context of climate change, the area covered by ice in the Arctic is a particularly useful indicator of recent changes in the Arctic environment. Climate models are in near-universal agreement that Arctic sea ice extent will decline through the 21st century as a consequence of global warming, and many studies predict an ice-free Arctic as soon as 2012. Time series of satellite passive microwave observations make it possible to assess the temporal changes in the extent of Arctic sea ice. Much of the analysis of the ice extent time series, as in most climate studies based on observational data, has focused on the computation of deterministic linear trends by ordinary least squares. However, many different processes, including deterministic, unit root, and long-range dependent processes, can engender trend-like features in a time series. Several parametric tests have been developed, mainly in econometrics, to discriminate between stationarity (no trend), deterministic trends, and stochastic trends. Here, these tests are applied in the trend analysis of the sea ice extent time series available from the National Snow and Ice Data Center. The parametric stationarity tests, Augmented Dickey-Fuller (ADF), Phillips-Perron (PP), and KPSS, do not support an overall deterministic trend in the time series of Arctic sea ice extent. Therefore, alternative parametrizations such as long-range dependence should be considered for characterising long-term Arctic sea ice variability.
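
    A minimal Python sketch of this kind of stationarity screening, using the ADF and KPSS implementations in statsmodels (the Phillips-Perron test is not in statsmodels; the third-party arch package provides one). The random-walk input is a synthetic stand-in for the NSIDC extent series.

        import numpy as np
        from statsmodels.tsa.stattools import adfuller, kpss

        rng = np.random.default_rng(0)
        y = np.cumsum(rng.normal(0, 1, 300))     # random walk: a stochastic trend

        adf_p = adfuller(y)[1]                   # H0: unit root present
        kpss_p = kpss(y, regression="ct", nlags="auto")[1]  # H0: trend-stationary

        print(f"ADF  p = {adf_p:.3f} (high -> cannot reject a unit root)")
        print(f"KPSS p = {kpss_p:.3f} (low  -> reject trend-stationarity)")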

  9. Development of suspended core soft glass fibers for far-detuned parametric conversion

    NASA Astrophysics Data System (ADS)

    Rampur, Anupamaa; Ciąćka, Piotr; Cimek, Jarosław; Kasztelanic, Rafał; Buczyński, Ryszard; Klimczak, Mariusz

    2018-04-01

    Light sources utilizing χ(2) parametric conversion combine high brightness with attractive operation wavelengths in the near- and mid-infrared. In optical fibers, it is possible to use χ(3) degenerate four-wave mixing in order to obtain signal-to-idler frequency detuning of over 100 THz. We report on a test series of nonlinear soft glass suspended core fibers intended for parametric conversion of 1000-1100 nm signal wavelengths, available from an array of mature lasers, into the near-to-mid-infrared range of 2700-3500 nm under pumping with an erbium sub-picosecond laser system. The presented discussion includes modelling of the fiber properties, details of their physical development and characterization, and experimental tests of parametric conversion.

  10. Thoracic Injury Risk Curves for Rib Deflections of the SID-IIs Build Level D.

    PubMed

    Irwin, Annette L; Crawford, Greg; Gorman, David; Wang, Sikui; Mertz, Harold J

    2016-11-01

    Injury risk curves for SID-IIs thorax and abdomen rib deflections proposed for future NCAP side impact evaluations were developed from tests conducted with the SID-IIs FRG. Since the floating rib guide is known to reduce the magnitude of the peak rib deflections, injury risk curves developed from SID-IIs FRG data are not appropriate for use with SID-IIs build level D. PMHS injury data from three series of sled tests and one series of whole-body drop tests are paired with thoracic rib deflections from equivalent tests with SID-IIs build level D. Where possible, the rib deflections of SID-IIs build level D were scaled to adjust for differences in impact velocity between the PMHS and SID-IIs tests. Injury risk curves developed by the Mertz-Weber modified median rank method are presented and compared to risk curves developed by other parametric and non-parametric methods.

  11. Electrical Characterization of the RCA CDP1822SD Random Access Memory, Volume 1, Appendix A

    NASA Technical Reports Server (NTRS)

    Klute, A.

    1979-01-01

    Electrical characterization tests were performed on 35 RCA CDP1822SD, 256-by-4-bit, CMOS, random access memories. The tests included three functional tests, AC and DC parametric tests, a series of schmoo plots, rise/fall time screening, and a data retention test. All tests were performed on an automated IC test system with temperatures controlled by a thermal airstream unit. All the functional tests, the data retention test, and the AC and DC parametric tests were performed at ambient temperatures of 25 C, -20 C, -55 C, 85 C, and 125 C. The schmoo plots were generated at ambient temperatures of 25 C, -55 C, and 125 C. The data retention test was performed at 25 C. Five devices failed one or more functional tests, and four of these devices failed to meet the expected limits of a number of AC parametric tests. Some of the schmoo plots indicated a small degree of interaction between parameters.

  12. Parametric tests of a 40-Ah bipolar nickel-hydrogen battery

    NASA Technical Reports Server (NTRS)

    Cataldo, R. L.

    1986-01-01

    A series of tests was performed to characterize battery performance with respect to certain operating parameters, including charge current, discharge current, temperature, and pressure. The parameters were varied to confirm battery design concepts and to determine optimal operating conditions.

  13. Design, construction, operation, and evaluation of a prototype culm combustion boiler/heater unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Aciermo, J.; Richards, H.; Spindler, F.

    1983-10-01

    A process for utilizing anthracite culm in a fluidized bed combustion system was demonstrated by the design and construction of a prototype steam plant at Shamokin, PA, and operation of the plant for parametric tests and a nine-month extended durability test. The parametric tests evaluated the turndown capability of the plant and established turndown techniques to be used to achieve best performance. Throughout the test program the fluidized bed boiler durability was excellent, showing very high resistance to corrosion and erosion. A series of 39 parametric tests was performed in order to demonstrate turndown capabilities of the atmospheric fluidized bed boiler burning anthracite culm. Four tests were performed with bituminous coal waste (called gob), which contains 4.8 to 5.5% sulfur. The heating value of both fuels is approximately 3000 Btu/lb and the ash content is approximately 70%. Combustion efficiency, boiler efficiency, and emissions of NOx and SO2 were also determined for the tests.

  14. Detecting trend on ecological river status - how to deal with short incomplete bioindicator time series? Methodological and operational issues

    NASA Astrophysics Data System (ADS)

    Cernesson, Flavie; Tournoud, Marie-George; Lalande, Nathalie

    2018-06-01

    Among the various parameters monitored in river monitoring networks, bioindicators provide very informative data. Analysing time variations in bioindicator data is tricky for water managers because the data sets are often short, irregular, and non-normally distributed. It is thus a challenging methodological issue for scientists, as in the Saône basin (30,000 km2, France), where, of 812 IBGN (French macroinvertebrate bioindicator) monitoring stations operating between 1998 and 2010, only 71 time series had more than 10 data values and were studied here. Combining various analytical tools (three parametric and non-parametric statistical tests plus a graphical analysis), 45 IBGN time series were classified as stationary and 26 as non-stationary (only one of which showed degradation). Series from sampling stations located within the same hydroecoregion showed similar trends, while river size classes appeared non-significant in explaining temporal trends. So, from a methodological point of view, combining statistical tests and graphical analysis is a relevant option when striving to improve trend detection. Moreover, it was possible to propose a way of summarising series in order to analyse links between ecological river quality indicators and land use stressors.

  15. Detection of trends and break points in temperature: the case of Umbria (Italy) and Guadalquivir Valley (Spain)

    NASA Astrophysics Data System (ADS)

    Herrera-Grimaldi, Pascual; García-Marín, Amanda; Ayuso-Muñoz, José Luís; Flamini, Alessia; Morbidelli, Renato; Ayuso-Ruíz, José Luís

    2018-02-01

    The increase of air surface temperature at the global scale is a fact, with values around 0.85 °C since the late nineteenth century. Nevertheless, the increase is not equally distributed over the world, varying from one region to another. Thus, it becomes interesting to study the evolution of temperature indices for a certain area in order to analyse the existence of a climatic trend in it. In this work, monthly temperature time series from two Mediterranean areas are used: the Umbria region in Italy and the Guadalquivir Valley in southern Spain. For the available stations, six temperature indices (three annual and three monthly) of mean, average maximum, and average minimum temperature have been obtained, and the existence of trends has been studied by applying the non-parametric Mann-Kendall test. Both regions show a general increase in all temperature indices, with a clearer trend pattern in Spain than in Italy. The Italian area is the only one in which some negative trends are detected. The presence of break points in the temperature series, most of which may be due to natural phenomena, has also been studied by using the non-parametric Pettitt test and the parametric standard normal homogeneity test (SNHT).

  16. TOWARD HIGH-PRECISION SEISMIC STUDIES OF WHITE DWARF STARS: PARAMETRIZATION OF THE CORE AND TESTS OF ACCURACY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giammichele, N.; Fontaine, G.; Brassard, P.

    We present a prescription for parametrizing the chemical profile in the core of white dwarfs in light of the recent discovery that pulsation modes may sometimes be deeply confined in some cool pulsating white dwarfs. Such modes may be used as unique probes of the complicated chemical stratification that results from several processes that occurred in previous evolutionary phases of intermediate-mass stars. This effort is part of our ongoing quest for more credible and realistic seismic models of white dwarfs using static, parametrized equilibrium structures. Inspired by successful techniques developed in design optimization fields (such as aerodynamics), we exploit Akima splines for the tracing of the chemical profile of oxygen (carbon) in the core of a white dwarf model. A series of tests are then presented to better seize the precision and significance of the results that can be obtained in an asteroseismological context. We also show that the new parametrization passes an essential basic test, as it successfully reproduces the chemical stratification of a full evolutionary model.

  17. Toward High-precision Seismic Studies of White Dwarf Stars: Parametrization of the Core and Tests of Accuracy

    NASA Astrophysics Data System (ADS)

    Giammichele, N.; Charpinet, S.; Fontaine, G.; Brassard, P.

    2017-01-01

    We present a prescription for parametrizing the chemical profile in the core of white dwarfs in light of the recent discovery that pulsation modes may sometimes be deeply confined in some cool pulsating white dwarfs. Such modes may be used as unique probes of the complicated chemical stratification that results from several processes that occurred in previous evolutionary phases of intermediate-mass stars. This effort is part of our ongoing quest for more credible and realistic seismic models of white dwarfs using static, parametrized equilibrium structures. Inspired by successful techniques developed in design optimization fields (such as aerodynamics), we exploit Akima splines for the tracing of the chemical profile of oxygen (carbon) in the core of a white dwarf model. A series of tests are then presented to better seize the precision and significance of the results that can be obtained in an asteroseismological context. We also show that the new parametrization passes an essential basic test, as it successfully reproduces the chemical stratification of a full evolutionary model.

  18. Parametric vs. non-parametric daily weather generator: validation and comparison

    NASA Astrophysics Data System (ADS)

    Dubrovsky, Martin

    2016-04-01

    As the climate models (GCMs and RCMs) fail to satisfactorily reproduce the real-world surface weather regime, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. The stochastic weather generators are among the most favoured downscaling methods, capable of producing realistic (observed-like) meteorological inputs for agrological, hydrological and other impact models used in assessing sensitivity of various ecosystems to climate change/variability. To name their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run in various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of the daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a first-order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbour resampling technique, making no assumption on the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maxima of duration of hot/cold/dry/wet spells); and (b) selected validation statistics developed within the frame of the VALUE project. The tests will be based on observational weather series from several European stations available from the ECA&D database.
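
    A minimal Python sketch of the parametric generator structure described here (Markov-chain occurrence, Gamma amounts on wet days, AR(1) for a non-precipitation variable); all parameter values are illustrative assumptions, not calibrated M&Rfi settings.

        import numpy as np

        rng = np.random.default_rng(42)
        p_wd, p_ww = 0.25, 0.60          # P(wet | dry), P(wet | wet) occurrence chain
        shape, scale = 0.8, 6.0          # Gamma parameters for wet-day amounts (mm)
        phi, mu, sigma = 0.7, 12.0, 3.0  # AR(1) for a non-precipitation variable

        wet, temp = False, mu
        precip, temps = [], []
        for _ in range(365):
            wet = rng.random() < (p_ww if wet else p_wd)            # occurrence
            precip.append(rng.gamma(shape, scale) if wet else 0.0)  # amount
            temp = mu + phi * (temp - mu) + rng.normal(0.0, sigma)  # AR(1) update
            temps.append(temp)

        print(f"wet-day fraction {np.mean(np.array(precip) > 0):.2f}, "
              f"annual total {sum(precip):.0f} mm")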

  19. Characterization of Vertical Impact Device Acceleration Pulses Using Parametric Assessment: Phase II Accelerated Free-Fall

    DTIC Science & Technology

    2016-04-30

    ...support contractor, Infoscitex, conducted a series of tests to identify the performance capabilities of the Vertical Impact Device (VID). The VID is a... [Table 3. AFD Evaluation with Red IMPAC Programmer: Data Summary Showing Means and Standard Deviations; columns: test cell, drop height (in), mean peak...]

  20. The Importance of Practice in the Development of Statistics.

    DTIC Science & Technology

    1983-01-01

    NRC Technical Summary Report #2471: The Importance of Practice in the Development of Statistics... component analysis, bioassay, limits for a ratio, quality control, sampling inspection, non-parametric tests, transformation theory, ARIMA time series... models, sequential tests, cumulative sum charts, data analysis plotting techniques, and a resolution of the Bayes-frequentist controversy. It appears

  1. A Bit Stream Scalable Speech/Audio Coder Combining Enhanced Regular Pulse Excitation and Parametric Coding

    NASA Astrophysics Data System (ADS)

    Riera-Palou, Felip; den Brinker, Albertus C.

    2007-12-01

    This paper introduces a new audio and speech broadband coding technique based on the combination of a pulse excitation coder and a standardized parametric coder, namely the MPEG-4 high-quality parametric coder. After presenting a series of enhancements to regular pulse excitation (RPE) to make it suitable for the modeling of broadband signals, it is shown how pulse and parametric coding complement each other and how they can be merged to yield a layered, bit-stream-scalable coder able to operate at different points in the quality/bit-rate plane. The performance of the proposed coder is evaluated in a listening test. The major result is that the extra functionality of bit stream scalability does not come at the price of reduced performance, since the coder is competitive with standardized coders (MP3, AAC, SSC).

  2. Coupled oscillators in identification of nonlinear damping of a real parametric pendulum

    NASA Astrophysics Data System (ADS)

    Olejnik, Paweł; Awrejcewicz, Jan

    2018-01-01

    A damped parametric pendulum with friction is identified twice, by means of a precise and an imprecise mathematical model. A laboratory test stand designed for experimental investigations of nonlinear effects determined by viscous resistance and the stick-slip phenomenon serves as the model mechanical system. The influence of the accuracy of the mathematical modeling on the time variability of the nonlinear damping coefficient of the oscillator is demonstrated. The free decay response of the precisely and imprecisely modeled physical pendulum depends on two different time-varying coefficients of damping. The coefficients of the analyzed parametric oscillator are identified with the use of a new semi-empirical method based on a coupled-oscillators approach, utilizing the fractional-order derivative of the discrete measurement series as an input to the numerical model. Results of applying the proposed method of identification of the nonlinear coefficients of the damped parametric oscillator are illustrated and extensively discussed.

  3. Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 1

    NASA Technical Reports Server (NTRS)

    Klute, A.

    1979-01-01

    Electrical characterization and qualification tests were performed on the RCA MWS5001D, 1024-by-1-bit, CMOS, random access memory. Characterization tests were performed on five devices. The tests included functional tests, an AC parametric worst-case pattern selection test, determination of the worst-case transition for setup and hold times, and a series of schmoo plots. The qualification tests were performed on 32 devices and included a 2000-hour burn-in with electrical tests performed at 0 hours and after 168, 1000, and 2000 hours of burn-in. The tests performed included functional tests and AC and DC parametric tests. All of the tests in the characterization phase, with the exception of the worst-case transition test, were performed at ambient temperatures of 25, -55, and 125 C. The worst-case transition test was performed at 25 C. The pre-burn-in electrical tests were performed at 25, -55, and 125 C. All burn-in endpoint tests were performed at 25, -40, -55, 85, and 125 C.

  4. Validation of two (parametric vs non-parametric) daily weather generators

    NASA Astrophysics Data System (ADS)

    Dubrovsky, M.; Skalak, P.

    2015-12-01

    As the climate models (GCMs and RCMs) fail to satisfactorily reproduce the real-world surface weather regime, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. The stochastic weather generators are among the most favoured downscaling methods, capable of producing realistic (observed-like) meteorological inputs for agrological, hydrological and other impact models used in assessing sensitivity of various ecosystems to climate change/variability. To name their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run in various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of the daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a first-order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbour resampling technique, making no assumption on the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maxima of duration of hot/cold/dry/wet spells); and (b) selected validation statistics developed within the frame of the VALUE project. The tests will be based on observational weather series from several European stations available from the ECA&D database. Acknowledgements: The weather generator is developed and validated within the frame of projects WG4VALUE (sponsored by the Ministry of Education, Youth and Sports of CR) and VALUE (COST ES 1102 action).

  5. Non-parametric directionality analysis - Extension for removal of a single common predictor and application to time series.

    PubMed

    Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob

    2016-08-01

    The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low-order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimate directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions, and to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel non-parametric alternative for estimating directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data.

  6. Experimental Characterization of Gas Turbine Emissions at Simulated Flight Altitude Conditions

    NASA Technical Reports Server (NTRS)

    Howard, R. P.; Wormhoudt, J. C.; Whitefield, P. D.

    1996-01-01

    NASA's Atmospheric Effects of Aviation Project (AEAP) is developing a scientific basis for assessment of the atmospheric impact of subsonic and supersonic aviation. A primary goal is to assist assessments by United Nations scientific organizations and, hence, consideration of emissions standards by the International Civil Aviation Organization (ICAO). Engine tests have been conducted at AEDC to fulfill the needs of the AEAP. The purpose of these tests is to obtain a comprehensive database to be used for supplying critical information to the atmospheric research community. It includes: (1) simulated sea-level-static test data as well as simulated altitude data; and (2) intrusive (extractive probe) data as well as non-intrusive (optical techniques) data. A commercial-type bypass engine burning aviation fuel was used in this test series. The test matrix was set by parametrically selecting the temperature, pressure, and flow rate at sea-level-static and different altitude conditions to obtain a parametric set of data.

  7. Uncertainty in determining extreme precipitation thresholds

    NASA Astrophysics Data System (ADS)

    Liu, Bingjun; Chen, Junfan; Chen, Xiaohong; Lian, Yanqing; Wu, Lili

    2013-10-01

    Extreme precipitation events are rare and occur mostly on a relatively small and local scale, which makes it difficult to set thresholds for extreme precipitation in a large basin. Based on long-term daily precipitation data from 62 observation stations in the Pearl River Basin, this study has assessed the applicability of the non-parametric, parametric, and detrended fluctuation analysis (DFA) methods in determining extreme precipitation thresholds (EPTs) and the certainty of the EPTs from each method. Analyses from this study show that the non-parametric absolute critical value method is easy to use but unable to reflect differences in the spatial distribution of rainfall. The non-parametric percentile method can account for the spatial distribution of precipitation, but its threshold value is sensitive to the size of the rainfall data series and subject to the selection of a percentile, which makes it difficult to determine reasonable threshold values for a large basin. The parametric method can provide the most apt description of extreme precipitation by fitting extreme precipitation distributions with probability distribution functions; however, the selection of probability distribution functions, the goodness-of-fit tests, and the size of the rainfall data series can greatly affect the fitting accuracy. In contrast to the non-parametric and parametric methods, which are unable to provide EPTs with certainty, the DFA method, although computationally involved, has proven to be the most appropriate method, providing a unique set of EPTs for a large basin with uneven spatio-temporal precipitation distribution. The consistency of the spatial distribution of the DFA-based thresholds with the annual average precipitation, the coefficient of variation (CV), and the coefficient of skewness (CS) of the daily precipitation further proves that EPTs determined by the DFA method are more reasonable and applicable for the Pearl River Basin.
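
    A minimal Python sketch contrasting the non-parametric percentile method with a parametric fit; the Gamma distribution, the 1 mm wet-day cutoff, and the 95th-percentile level are assumptions for illustration, not the paper's choices.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        daily = rng.gamma(0.7, 8.0, 365 * 30)   # synthetic daily rainfall (mm)
        wet = daily[daily > 1.0]                # wet days only

        ept_np = np.percentile(wet, 95)                 # percentile method
        a, loc, scale = stats.gamma.fit(wet, floc=0)    # parametric: Gamma fit
        ept_par = stats.gamma.ppf(0.95, a, loc=loc, scale=scale)

        print(f"percentile EPT {ept_np:.1f} mm, Gamma-fit EPT {ept_par:.1f} mm")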

  8. Comparing biomarker measurements to a normal range: when to use standard error of the mean (SEM) or standard deviation (SD) confidence intervals tests

    EPA Science Inventory

    This commentary is the second of a series outlining one specific concept in interpreting biomarkers data. In the first, an observational method was presented for assessing the distribution of measurements before making parametric calculations. Here, the discussion revolves around...

  9. Parametric and nonparametric Granger causality testing: Linkages between international stock markets

    NASA Astrophysics Data System (ADS)

    De Gooijer, Jan G.; Sivarajasingham, Selliah

    2008-04-01

    This study investigates long-term linear and nonlinear causal linkages among eleven stock markets, six industrialized markets and five emerging markets of South-East Asia. We cover the period 1987-2006, taking into account the onset of the Asian financial crisis of 1997. We first apply a test for the presence of general nonlinearity in vector time series. Substantial differences exist between the pre- and post-crisis period in terms of the total number of significant nonlinear relationships. We then examine both periods, using a new nonparametric test for Granger noncausality and the conventional parametric Granger noncausality test. One major finding is that the Asian stock markets have become more internationally integrated after the Asian financial crisis. An exception is the Sri Lankan market, with almost no significant long-term linear and nonlinear causal linkages with other markets. To ensure that any causality is strictly nonlinear in nature, we also examine the nonlinear causal relationships of VAR-filtered residuals and VAR-filtered squared residuals for the post-crisis sample. We find quite a few remaining significant bi- and uni-directional causal nonlinear relationships in these series. Finally, after filtering the VAR residuals with GARCH-BEKK models, we show that the nonparametric test statistics are substantially smaller in both magnitude and statistical significance than those before filtering. This indicates that nonlinear causality can, to a large extent, be explained by simple volatility effects.
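
    A minimal Python sketch of the conventional parametric (linear, VAR-based) Granger noncausality test via statsmodels; the nonparametric test used in the paper has no statsmodels equivalent, and the bivariate data here are synthetic.

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(3)
        x = rng.normal(size=500)
        y = np.zeros(500)
        for t in range(1, 500):                 # y is driven by lagged x
            y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal(scale=0.5)

        # Columns are ordered (effect, cause): test "x Granger-causes y".
        res = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
        print(f"lag-1 F-test p-value: {res[1][0]['ssr_ftest'][1]:.4g}")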

  10. Characterization of Vertical Impact Device Acceleration Pulses Using Parametric Assessment: Phase 3 Wiaman Seat

    DTIC Science & Technology

    2016-07-01

    ...711 HPW/RHCPT) and their in-house technical support contractor, Infoscitex, conducted a series of tests to identify the performance capabilities of... [table residue: per-test-cell summaries of seat configuration, drop height (in), mean peak acceleration (G), and mean velocity change (ft/s), e.g., SH1/WS1, 20 in: 80.08 ± 3.71 G, 13.54 ± 0.49 ft/s; Table 6, Test Matrix for VID Response with WS2]

  11. NONLINEAR OPTICAL EFFECTS AND FIBER OPTICS: Use of an open resonator in a parametric free-electron laser

    NASA Astrophysics Data System (ADS)

    Alekseev, V. I.; Bessonov, Evgenii G.; Serov, Alexander V.

    1988-12-01

    Parametric free-electron lasers utilizing open resonators and beams consisting of a series of identical particle bunches are analyzed theoretically. It is shown that the use of a resonator in a parametric laser system can increase the radiation intensity and its monochromaticity.

  12. Accelerated stress testing of terrestrial solar cells

    NASA Technical Reports Server (NTRS)

    Prince, J. L.; Lathrop, J. W.

    1979-01-01

    A program to investigate the reliability characteristics of unencapsulated low-cost terrestrial solar cells using accelerated stress testing is described. Reliability (or parametric degradation) factors appropriate to the cell technologies and use conditions were studied, and a series of accelerated stress tests was synthesized. An electrical measurement procedure and a data analysis and management system were derived, and stress-test fixturing and material-flow procedures were set up after consideration of the number of cells to be stress tested and measured and the nature of the information to be obtained from the process. Selected results and conclusions are presented.

  13. Confidence intervals and hypothesis testing for the Permutation Entropy with an application to epilepsy

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; O. Redelico, Francisco

    2018-04-01

    In nonlinear dynamics, and to a lesser extent in other fields, a widely used measure of complexity is the Permutation Entropy. But there is still no known method to determine the accuracy of this measure. There has been little research on the statistical properties of this quantity when it is used to characterize time series. The literature describes some resampling methods for quantities used in nonlinear dynamics, such as the largest Lyapunov exponent, but these seem to fail. In this contribution, we propose a parametric bootstrap methodology using a symbolic representation of the time series to obtain the distribution of the Permutation Entropy estimator. We perform several time series simulations given by well-known stochastic processes (the 1/f^α noise family) and show in each case that the proposed accuracy measure is as efficient as the one obtained by the frequentist approach of repeating the experiment. The complexity of brain electrical activity, measured by the Permutation Entropy, has been extensively used in epilepsy research for detecting dynamical changes in the electroencephalogram (EEG) signal with no consideration of the variability of this complexity measure. An application of the parametric bootstrap methodology is used to compare normal and pre-ictal EEG signals.
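
    A minimal Python sketch of a Permutation Entropy estimator with a parametric bootstrap of its distribution; redrawing symbol counts from a multinomial fitted to the observed ordinal patterns is a simplification of the paper's symbolic-representation bootstrap, and the embedding order of 3 with unit delay is an assumption.

        import math
        import numpy as np

        def symbol_probs(x, order=3):
            counts = {}
            for i in range(len(x) - order + 1):
                key = tuple(np.argsort(x[i:i + order]))   # ordinal pattern
                counts[key] = counts.get(key, 0) + 1
            p = np.array(list(counts.values()), dtype=float)
            return p / p.sum()

        def pe(p, order=3):                               # normalized PE in [0, 1]
            p = p[p > 0]
            return -(p * np.log(p)).sum() / math.log(math.factorial(order))

        rng = np.random.default_rng(0)
        x = rng.normal(size=2000)                 # stand-in for an EEG segment
        p_hat = symbol_probs(x)
        n_sym = len(x) - 2                        # number of order-3 patterns

        # Parametric bootstrap: redraw symbol counts from Multinomial(n, p_hat).
        boot = [pe(rng.multinomial(n_sym, p_hat) / n_sym) for _ in range(2000)]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"PE = {pe(p_hat):.4f}, 95% CI ({lo:.4f}, {hi:.4f})")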

  14. Comparison of different synthetic 5-min rainfall time series regarding their suitability for urban drainage modelling

    NASA Astrophysics Data System (ADS)

    van der Heijden, Sven; Callau Poduje, Ana; Müller, Hannes; Shehu, Bora; Haberlandt, Uwe; Lorenz, Manuel; Wagner, Sven; Kunstmann, Harald; Müller, Thomas; Mosthaf, Tobias; Bárdossy, András

    2015-04-01

    For the design and operation of urban drainage systems with numerical simulation models, long, continuous precipitation time series with high temporal resolution are necessary. Suitable observed time series are rare. As a result, intelligent design concepts often use uncertain or unsuitable precipitation data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic precipitation data for urban drainage modelling are advanced, tested, and compared. The presented study compares four different precipitation modelling approaches regarding their ability to reproduce rainfall and runoff characteristics: one parametric stochastic model (alternating renewal approach), one non-parametric stochastic model (resampling approach), one downscaling approach based on a regional climate model, and one disaggregation approach based on daily precipitation measurements. All four models produce long precipitation time series with a temporal resolution of five minutes. The synthetic time series are first compared to observed rainfall reference time series. Comparison criteria include event-based statistics such as mean dry spell and wet spell duration, wet spell amount and intensity, long-term means of precipitation sum and number of events, and extreme value distributions for different durations. They are then compared regarding simulated discharge characteristics using an urban hydrological model on a fictitious sewage network. First results show a principal suitability of all rainfall models, but with different strengths and weaknesses regarding the different rainfall and runoff characteristics considered.

  15. Non-parametric characterization of long-term rainfall time series

    NASA Astrophysics Data System (ADS)

    Tiwari, Harinarayan; Pandey, Brij Kishor

    2018-03-01

    The statistical study of rainfall time series is one of the approaches for efficient hydrological system design. Identifying and characterizing long-term rainfall time series could aid in improving hydrological forecasting. In the present study, eventual statistics were applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test for the observed rainfall series. The trend observed with this approach was then corroborated using the innovative trend analysis method, which has proven to be a strong tool for detecting the general trend of rainfall time series. The sequential Mann-Kendall test was also carried out to examine nonlinear trends in the series, and the partial sum of cumulative deviation test was likewise found suitable for detecting nonlinear trends. Innovative trend analysis, the sequential Mann-Kendall test, and the partial cumulative deviation test can detect both the general and the nonlinear trend of a rainfall time series. Annual rainfall analysis suggests that the maximum change in mean rainfall is an 11.53% rise for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous India region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we performed the von Neumann ratio test and the cumulative deviation test to estimate the departure from homogeneity. Singular spectrum analysis was applied in this study to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) series of the North Mountainous India and West Peninsular India zones show the greater departure from homogeneity, and the singular spectrum analysis results are coherent with this finding.
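
    A minimal Python sketch of innovative trend analysis as described here (the method usually attributed to Şen, named from general knowledge rather than this record): sort the two halves of the series and compare them against the 1:1 line. The synthetic series length mirrors the 1851-2006 record; the summary statistics are illustrative only.

        import numpy as np

        def ita(series):
            n = len(series) // 2
            first, second = np.sort(series[:n]), np.sort(series[n:2 * n])
            above = np.mean(second > first)        # fraction above the 1:1 line
            pct = 100.0 * (second.mean() - first.mean()) / first.mean()
            return above, pct

        rng = np.random.default_rng(5)
        rain = 800.0 + 1.5 * np.arange(156) + rng.normal(0.0, 80.0, 156)
        above, pct = ita(rain)
        print(f"{above:.0%} of sorted pairs above the 1:1 line; mean change {pct:+.1f}%")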

  16. Significance testing of clinical data using virus dynamics models with a Markov chain Monte Carlo method: application to emergence of lamivudine-resistant hepatitis B virus.

    PubMed Central

    Burroughs, N J; Pillay, D; Mutimer, D

    1999-01-01

    Bayesian analysis using a virus dynamics model is demonstrated to facilitate hypothesis testing of patterns in clinical time-series. Our Markov chain Monte Carlo implementation demonstrates that the viraemia time-series observed in two sets of hepatitis B patients on antiviral (lamivudine) therapy, chronic carriers and liver transplant patients, are significantly different, overcoming clinical trial design differences that question the validity of non-parametric tests. We show that lamivudine-resistant mutants grow faster in transplant patients than in chronic carriers, which probably explains the differences in emergence times and failure rates between these two sets of patients. Incorporation of dynamic models into Bayesian parameter analysis is of general applicability in medical statistics. PMID:10643081

  17. Nonparametric autocovariance estimation from censored time series by Gaussian imputation.

    PubMed

    Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K

    2009-02-01

    One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.

  18. Robust, automatic GPS station velocities and velocity time series

    NASA Astrophysics Data System (ADS)

    Blewitt, G.; Kreemer, C.; Hammond, W. C.

    2014-12-01

    Automation in GPS coordinate time series analysis makes results more objective and reproducible, but not necessarily as robust as the human eye at detecting problems. Moreover, it is not a realistic option to manually scan our current load of >20,000 time series per day. This motivates us to find an automatic way to estimate station velocities that is robust to outliers, discontinuities, seasonality, and noise characteristics (e.g., heteroscedasticity). Here we present a non-parametric method based on the Theil-Sen estimator, defined as the median of velocities v_ij = (x_j - x_i)/(t_j - t_i) computed between all pairs (i, j). Theil-Sen estimators produce statistically identical solutions to ordinary least squares for normally distributed data, but they can tolerate up to 29% of the data being problematic. To mitigate seasonality, our proposed estimator only uses pairs approximately separated by an integer number of years, (N - δt) < (t_j - t_i) < (N + δt), where δt is chosen to be small enough to capture seasonality, yet large enough to reduce random error. We fix N = 1 to maximally protect against discontinuities. In addition to estimating an overall velocity, we also use these pairs to estimate velocity time series. To test our methods, we process real data sets that have already been used, with velocities published in the NA12 reference frame. Accuracy can be tested by the scatter of horizontal velocities in the North American plate interior, which is known to be stable to ~0.3 mm/yr. This presents new opportunities for time series interpretation. For example, the pattern of velocity variations at the interannual scale can help separate tectonic from hydrological processes. Without any step detection, velocity estimates prove to be robust for stations affected by the Mw 7.2 2010 El Mayor-Cucapah earthquake, and velocity time series show a clear change after the earthquake, without any of the usual parametric constraints, such as relaxation of postseismic velocities to their preseismic values.
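
    A minimal Python sketch of the estimator described in this record: the median of pairwise velocities restricted to pairs separated by roughly one year (N = 1), computed on synthetic weekly positions with an annual cycle. The window half-width is an assumed value, not taken from the abstract.

        import numpy as np

        def annual_pair_velocity(t, x, half_window=0.1):
            vels = []
            for i in range(len(t)):
                for j in range(i + 1, len(t)):
                    sep = t[j] - t[i]
                    if abs(sep - 1.0) < half_window:   # ~1-year pairs only (N = 1)
                        vels.append((x[j] - x[i]) / sep)
            return np.median(vels)                     # robust median of velocities

        rng = np.random.default_rng(2)
        t = np.arange(0.0, 8.0, 1.0 / 52.0)            # weekly epochs over 8 years
        x = 3.0 * t + 2.0 * np.sin(2 * np.pi * t) + rng.normal(0.0, 1.5, t.size)
        print(f"velocity ~ {annual_pair_velocity(t, x):.2f} mm/yr (true value 3.0)")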

  19. Determination of in vivo mechanical properties of long bones from their impedance response curves

    NASA Technical Reports Server (NTRS)

    Borders, S. G.

    1981-01-01

    A mathematical model consisting of a uniform, linear, visco-elastic, Euler-Bernoulli beam to represent the ulna or tibia of the vibrating forearm or leg system is developed. The skin and tissue compressed between the probe and bone is represented by a spring in series with the beam. The remaining skin and tissue surrounding the bone is represented by a visco-elastic foundation with mass. An extensive parametric study is carried out to determine the effect of each parameter of the mathematical model on its impedance response. A system identification algorithm is developed and programmed on a digital computer to determine the parametric values of the model which best simulate the data obtained from an impedance test.

  20. Illiquidity premium and expected stock returns in the UK: A new approach

    NASA Astrophysics Data System (ADS)

    Chen, Jiaqi; Sherif, Mohamed

    2016-09-01

    This study examines the relative importance of liquidity risk for the time series and cross-section of stock returns in the UK. We propose a simple way to capture the multidimensionality of illiquidity. Our analysis indicates that existing illiquidity measures have considerable asset-specific components, which justifies our new approach. Further, we use an alternative test of the Amihud (2002) measure, and parametric and non-parametric methods, to investigate whether liquidity risk is priced in the UK. We find that the inclusion of the illiquidity factor in the capital asset pricing model plays a significant role in explaining the cross-sectional variation in stock returns, in particular with the Fama-French three-factor model. Further, using Hansen-Jagannathan non-parametric bounds, we find that the illiquidity-augmented capital asset pricing models yield a small distance error, whereas other non-liquidity-based models fail to yield economically plausible distance values. Our findings have important implications for managing the liquidity risk of equity portfolios.
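
    For context, a minimal Python sketch of the Amihud (2002) illiquidity measure referenced above, the average ratio of absolute daily return to traded value (price impact per unit of volume); the input series are synthetic placeholders.

        import numpy as np

        rng = np.random.default_rng(9)
        returns = rng.normal(0.0, 0.02, 250)            # daily returns
        dollar_volume = rng.lognormal(15.0, 0.5, 250)   # daily traded value

        illiq = np.mean(np.abs(returns) / dollar_volume)  # Amihud ILLIQ
        print(f"ILLIQ = {illiq:.3e}")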

  1. Exploiting the spatial locality of electron correlation within the parametric two-electron reduced-density-matrix method

    NASA Astrophysics Data System (ADS)

    DePrince, A. Eugene; Mazziotti, David A.

    2010-01-01

    The parametric variational two-electron reduced-density-matrix (2-RDM) method is applied to computing electronic correlation energies of medium-to-large molecular systems by exploiting the spatial locality of electron correlation within the framework of the cluster-in-molecule (CIM) approximation [S. Li et al., J. Comput. Chem. 23, 238 (2002); J. Chem. Phys. 125, 074109 (2006)]. The 2-RDMs of individual molecular fragments within a molecule are determined, and selected portions of these 2-RDMs are recombined to yield an accurate approximation to the correlation energy of the entire molecule. In addition to extending CIM to the parametric 2-RDM method, we (i) suggest a more systematic selection of atomic-orbital domains than that presented in previous CIM studies and (ii) generalize the CIM method for open-shell quantum systems. The resulting method is tested with a series of polyacetylene molecules, water clusters, and diazobenzene derivatives in minimal and nonminimal basis sets. Calculations show that the computational cost of the method scales linearly with system size. We also compute hydrogen-abstraction energies for a series of hydroxyurea derivatives. Abstraction of hydrogen from hydroxyurea is thought to be a key step in its treatment of sickle cell anemia; the design of hydroxyurea derivatives that oxidize more rapidly is one approach to devising more effective treatments.

  2. Registration of parametric dynamic F-18-FDG PET/CT breast images with parametric dynamic Gd-DTPA breast images

    NASA Astrophysics Data System (ADS)

    Magri, Alphonso; Krol, Andrzej; Lipson, Edward; Mandel, James; McGraw, Wendy; Lee, Wei; Tillapaugh-Fay, Gwen; Feiglin, David

    2009-02-01

    This study was undertaken to register 3D parametric breast images derived from Gd-DTPA MR and F-18-FDG PET/CT dynamic image series. Nonlinear curve fitting (Levenberg-Marquardt algorithm) based on realistic two-compartment models was performed voxel-by-voxel, separately for MR (Brix) and PET (Patlak). The PET dynamic series consists of 50 frames of 1-minute duration. Each consecutive PET image was nonrigidly registered to the first frame using a finite element method and fiducial skin markers. The 12 post-contrast MR images were nonrigidly registered to the precontrast frame using a free-form deformation (FFD) method. Parametric MR images were registered to parametric PET images via CT using FFD, because the first PET time frame was acquired immediately after the CT image on a PET/CT scanner and is considered registered to the CT image. We conclude that nonrigid registration of PET and MR parametric images using CT data acquired during the PET/CT scan and the FFD method resulted in their improved spatial coregistration. The success of this procedure was limited by a relatively large target registration error, TRE = 15.1 ± 7.7 mm, compared to the spatial resolution of PET (6-7 mm), and by swirling image artifacts created in the MR parametric images by the FFD. Further refinement of nonrigid registration of PET and MR parametric images is necessary to enhance visualization and integration of the complex diagnostic information provided by both modalities, which will lead to improved diagnostic performance.

  3. A Power Series Expansion and Its Applications

    ERIC Educational Resources Information Center

    Chen, Hongwei

    2006-01-01

    Using the power series solution of a differential equation and the computation of a parametric integral, two elementary proofs are given for the power series expansion of (arcsin x)^2, as well as some applications of this expansion.
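
    For reference, the expansion in question is presumably the classical identity (stated here from general knowledge, not from the article):

        (\arcsin x)^2 = \frac{1}{2} \sum_{n=1}^{\infty} \frac{(2x)^{2n}}{n^2 \binom{2n}{n}}, \qquad |x| \le 1,

    whose first two terms, x^2 + x^4/3, match the Taylor expansion of (arcsin x)^2.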

  4. Change point detection of the Persian Gulf sea surface temperature

    NASA Astrophysics Data System (ADS)

    Shirvani, A.

    2017-01-01

    In this study, the Student's t parametric and Mann-Whitney nonparametric change point models (CPMs) were applied to detect change points in the annual Persian Gulf sea surface temperature anomalies (PGSSTA) time series for the period 1951-2013. The PGSSTA time series, which were serially correlated, were transformed to produce an uncorrelated, pre-whitened time series. The pre-whitened PGSSTA time series were used as the input to the change point models. Both the parametric and nonparametric CPMs estimated a change point in the PGSSTA in 1992. The PGSSTA follow the normal distribution both up to 1992 and thereafter, but with a different mean value after 1992. The estimated slope of the linear trend in the PGSSTA time series for the period 1951-1992 was negative; after the detected change point it was positive. Unlike for the PGSSTA, the applied CPMs suggested no change point in the Niño3.4SSTA time series.
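
    A minimal Python sketch of a Student's-t change point scan of the kind described here: a two-sample t statistic at each candidate split, maximized in absolute value. The CPM framework's sequential false-alarm control is omitted, and the synthetic anomalies only mimic the 1951-2013 setting.

        import numpy as np
        from scipy import stats

        def t_change_point(x, min_seg=5):
            best_k, best_t = None, 0.0
            for k in range(min_seg, len(x) - min_seg):
                t_stat, _ = stats.ttest_ind(x[:k], x[k:], equal_var=True)
                if abs(t_stat) > abs(best_t):
                    best_k, best_t = k, t_stat
            return best_k, best_t

        rng = np.random.default_rng(4)
        ssta = np.concatenate([rng.normal(-0.10, 0.2, 42),   # "1951-1992"
                               rng.normal(0.25, 0.2, 21)])   # "1993-2013"
        k, t_stat = t_change_point(ssta)
        print(f"estimated change point after year {1950 + k} (t = {t_stat:.2f})")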

  5. A Mode Propagation Database Suitable for Code Validation Utilizing the NASA Glenn Advanced Noise Control Fan and Artificial Sources

    NASA Technical Reports Server (NTRS)

    Sutliff, Daniel L.

    2014-01-01

    The NASA Glenn Research Center's Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. A series of tests were performed primarily for the use of code validation and tool validation. Rotating Rake mode measurements were acquired for parametric sets of: (i) mode blockage, (ii) liner insertion loss, (iii) short ducts, and (iv) mode reflection.

  6. Preprototype nitrogen supply subsystem development

    NASA Technical Reports Server (NTRS)

    Heppner, D. B.; Fort, J. H.; Schubert, F. H.

    1982-01-01

    The design and development of a test stand for the Nitrogen Generation Module (NGM), and a series of tests that verified its operation and performance capability, are described. Over 900 hours of parametric testing were achieved. The results from this testing were then used to design an advanced NGM and a self-contained, preprototype Nitrogen Supply Subsystem. The subsystem consists of three major components (the nitrogen generation module, the pressure controller, and the hydrazine storage tank) plus ancillary components. The most important improvement is the elimination of all sealing surfaces, achieved with an all-welded or brazed construction. Additionally, performance was improved by increasing the hydrogen-separation capability by 20% with no increase in overall package size.

  7. A Methodology for the Parametric Reconstruction of Non-Steady and Noisy Meteorological Time Series

    NASA Astrophysics Data System (ADS)

    Rovira, F.; Palau, J. L.; Millán, M.

    2009-09-01

    Climatic and meteorological time series often show some persistence (in time) in the variability of certain features. One could regard annual, seasonal and diurnal time variability as trivial persistence in the variability of some meteorological magnitudes (e.g., global radiation, air temperature above the surface, etc.). In these cases, the traditional Fourier transform into frequency space will show the principal harmonics as the components with the largest amplitude. Nevertheless, meteorological measurements often show other, non-steady (in time) variability. Some fluctuations in measurements (at different time scales) are driven by processes that prevail on some days (or months) of the year but disappear on others. By decomposing a time series into time-frequency space through the continuous wavelet transformation, one is able to determine both the dominant modes of variability and how those modes vary in time. This study is based on a numerical methodology to analyse non-steady principal harmonics in noisy meteorological time series. This methodology combines both the continuous wavelet transform and the development of a parametric model that includes the time evolution of the principal and most statistically significant harmonics of the original time series. The parameterisation scheme proposed in this study consists of reproducing the original time series by means of a statistically significant finite sum of sinusoidal signals (waves), each defined by the three usual parameters: amplitude, frequency and phase. To ensure the statistical significance of the parametric reconstruction of the original signal, we propose a standard Student's t analysis of the confidence level of the amplitude in the parametric spectrum for the different wave components. Once the level of significance of the different waves composing the parametric model is assured, the statistically significant principal harmonics (in time) of the original time series can be obtained by using the Fourier transform of the modelled signal. Acknowledgements: The CEAM Foundation is supported by the Generalitat Valenciana and BANCAIXA (València, Spain). This study has been partially funded by the European Commission (FP VI, Integrated Project CIRCE, No. 036961) and by the Ministerio de Ciencia e Innovación, research projects "TRANSREG" (CGL2007-65359/CLI) and "GRACCIE" (CSD2007-00067, Program CONSOLIDER-INGENIO 2010).
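    As a rough illustration of the reconstruction idea (not the authors' implementation), a series can be approximated by a finite sum of waves whose amplitudes, frequencies and phases are taken from its dominant Fourier components:

    ```python
    # Keep the 3 largest Fourier components and rebuild the series as a
    # sum of amplitude/frequency/phase waves. Data are synthetic.
    import numpy as np

    t = np.arange(1024.0)
    y = (2.0 * np.sin(2 * np.pi * t / 365) + 0.8 * np.sin(2 * np.pi * t / 28 + 1.0)
         + 0.3 * np.random.randn(t.size))

    Y = np.fft.rfft(y)
    freqs = np.fft.rfftfreq(t.size)
    top = np.argsort(np.abs(Y))[::-1][:3]

    model = np.zeros_like(t)
    for k in top:
        amp, phase = 2 * np.abs(Y[k]) / t.size, np.angle(Y[k])
        if k == 0:
            amp /= 2                       # the DC term is not doubled
        model += amp * np.cos(2 * np.pi * freqs[k] * t + phase)

    print("residual std:", np.std(y - model))
    ```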

  8. GREENHOUSE GAS (GHG) VERIFICATION GUIDELINE SERIES: ANR Pipeline Company PARAMETRIC EMISSIONS MONITORING SYSTEM (PEMS) VERSION 1.0

    EPA Science Inventory

    The Environmental Technology Verification report discusses the technology and performance of the Parametric Emissions Monitoring System (PEMS) manufactured by ANR Pipeline Company, a subsidiary of Coastal Corporation, now El Paso Corporation. The PEMS predicts carbon dioxide (CO2...

  9. Helicopter model rotor-blade vortex interaction impulsive noise: Scalability and parametric variations

    NASA Technical Reports Server (NTRS)

    Splettstoesser, W. R.; Schultz, K. J.; Boxwell, D. A.; Schmitz, F. H.

    1984-01-01

    Acoustic data taken in the anechoic Deutsch-Niederlaendischer Windkanal (DNW) have documented the blade vortex interaction (BVI) impulsive noise radiated from a 1/7-scale model main rotor of the AH-1 series helicopter. Averaged model-scale data were compared with averaged full-scale, in-flight acoustic data under similar nondimensional test conditions. At low advance ratios (mu = 0.164 to 0.194), the data scale remarkably well in level and waveform shape, and also duplicate the directivity pattern of BVI impulsive noise. At moderate advance ratios (mu = 0.224 to 0.270), the scaling deteriorates, suggesting that the model-scale rotor is not adequately simulating the full-scale BVI noise; presently, no proven explanation of this discrepancy exists. Carefully performed parametric variations over a complete matrix of testing conditions have shown that BVI noise radiation is highly sensitive to all four governing nondimensional parameters: hover tip Mach number, advance ratio, local inflow ratio, and thrust coefficient.

  10. Reconstructing missing information on precipitation datasets: impact of tails on adopted statistical distributions.

    NASA Astrophysics Data System (ADS)

    Pedretti, Daniele; Beckie, Roger Daniel

    2014-05-01

    Missing data are ubiquitous in hydrological time-series databases, yet it is of fundamental importance to make educated decisions in problems requiring exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications directly involving the ratio between precipitation and some other quantity, lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological-pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best; rather, this selection should be made considering the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is on the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which have an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the Log-Normal, Generalized Pareto, Generalized Extreme Value and Gamma distributions. The methods were first tested on synthetic examples, to have complete control of the impact of several variables, such as the minimum amount of data required to obtain reliable statistical distributions from the selected parametric functions. Then, we applied the methodology to precipitation datasets collected in the Vancouver area and at a mining site in Peru.
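    As a hedged illustration of comparing the four candidate distributions named above, maximum-likelihood fits from scipy can be ranked by log-likelihood; the wet-day sample below is synthetic:

    ```python
    import numpy as np
    from scipy import stats

    rain = np.random.default_rng(1).gamma(shape=0.6, scale=8.0, size=2000)  # [mm]

    candidates = {
        "log-normal": stats.lognorm,
        "generalized Pareto": stats.genpareto,
        "generalized extreme value": stats.genextreme,
        "gamma": stats.gamma,
    }
    for name, dist in candidates.items():
        params = dist.fit(rain)                  # maximum-likelihood fit
        ll = np.sum(dist.logpdf(rain, *params))
        print(f"{name:28s} log-likelihood = {ll:10.1f}")
    ```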

  11. A statistical approach to bioclimatic trend detection in the airborne pollen records of Catalonia (NE Spain)

    NASA Astrophysics Data System (ADS)

    Fernández-Llamazares, Álvaro; Belmonte, Jordina; Delgado, Rosario; De Linares, Concepción

    2014-04-01

    Airborne pollen records are a suitable indicator for the study of climate change. The present work focuses on the role of annual pollen indices in the detection of bioclimatic trends through the analysis of the aerobiological spectra of 11 taxa of great biogeographical relevance in Catalonia over an 18-year period (1994-2011), by means of different parametric and non-parametric statistical methods. Among others, two non-parametric rank-based statistical tests were performed to detect monotonic trends in the time series of the selected airborne pollen types, and we observed that they have similar power in detecting trends. Except for those cases in which the pollen data can be well modeled by a normal distribution, it is better to apply non-parametric statistical methods in aerobiological studies. Our results provide a reliable representation of the pollen trends in the region and suggest that greater pollen quantities are being released into the atmosphere in recent years, especially by Mediterranean taxa such as Pinus, Total Quercus and Evergreen Quercus, although the trends may differ geographically. Longer aerobiological monitoring periods are required to corroborate these results and to survey the increasing levels of certain pollen types that could have an impact on public health.

  12. The parametric modified limited penetrable visibility graph for constructing complex networks from time series

    NASA Astrophysics Data System (ADS)

    Li, Xiuming; Sun, Mei; Gao, Cuixia; Han, Dun; Wang, Minggang

    2018-02-01

    This paper presents the parametric modified limited penetrable visibility graph (PMLPVG) algorithm for constructing complex networks from time series. We modify the penetrable visibility criterion of the limited penetrable visibility graph (LPVG) in order to improve the rationality of the original penetrable visibility and preserve the dynamic characteristics of the time series. The addition of a view angle provides a new way to characterize dynamic structure of the time series that is invisible to the previous algorithm. The reliability of the PMLPVG algorithm is verified by applying it to three types of artificial data as well as actual natural gas price data from different regions. The empirical results indicate that the PMLPVG algorithm can distinguish different time series from each other. Meanwhile, the analysis of the natural gas price data using PMLPVG is consistent with detrended fluctuation analysis (DFA). These results imply that the PMLPVG algorithm may be a reasonable and significant tool for identifying various time series in different fields.
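    For intuition, a minimal sketch of the underlying limited penetrable visibility criterion; the paper's parametric view-angle modification is an extension not reproduced here:

    ```python
    # Limited penetrable visibility graph (LPVG): link two samples when at
    # most L intermediate points block the straight line of sight between them.
    import numpy as np

    def lpvg_edges(y, L=1):
        n, edges = len(y), []
        for a in range(n):
            for b in range(a + 1, n):
                c = np.arange(a + 1, b)                       # intermediate indices
                line = y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                if np.sum(y[c] >= line) <= L:                 # count penetrations
                    edges.append((a, b))
        return edges

    series = np.array([1.0, 3.0, 2.0, 4.0, 1.5, 3.5])
    print(lpvg_edges(series, L=1))
    ```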

  13. Comparing biomarker measurements to a normal range: when to use standard error of the mean (SEM) or standard deviation (SD) confidence intervals tests.

    PubMed

    Pleil, Joachim D

    2016-01-01

    This commentary is the second of a series outlining one specific concept in interpreting biomarker data. In the first, an observational method was presented for assessing the distribution of measurements before making parametric calculations. Here, the discussion revolves around the next step: the choice of using the standard error of the mean or the calculated standard deviation to compare or predict measurement results.
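    For orientation, a hedged summary of the usual rule, assuming approximately normal measurements: the SEM supports inference about the mean, while the SD governs where individual measurements fall,

    ```latex
    \[
    \mathrm{SEM}=\frac{\mathrm{SD}}{\sqrt{n}},\qquad
    \underbrace{\bar{x}\pm t_{n-1,\,1-\alpha/2}\,\mathrm{SEM}}_{\text{CI for the mean}},\qquad
    \underbrace{\bar{x}\pm t_{n-1,\,1-\alpha/2}\,\mathrm{SD}\sqrt{1+1/n}}_{\text{prediction interval for one new measurement}}
    \]
    ```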

  14. A finite element method for the thermochemical decomposition of polymeric materials. II - Carbon phenolic composites

    NASA Technical Reports Server (NTRS)

    Sullivan, R. M.; Salamon, N. J.

    1992-01-01

    A previously developed formulation for modeling the thermomechanical behavior of chemically decomposing polymeric materials is verified by simulating the response of carbon phenolic specimens during two high-temperature tests: restrained thermal growth and free thermal expansion. Plane strain and plane stress models, respectively, are used to simulate the specimen response in these two tests. In addition, the influence of the poroelasticity constants on the specimen response is examined through a series of parametric studies.

  15. A New Hybrid-Multiscale SSA Prediction of Non-Stationary Time Series

    NASA Astrophysics Data System (ADS)

    Ghanbarzadeh, Mitra; Aminghafari, Mina

    2016-02-01

    Singular spectrum analysis (SSA) is a non-parametric method used in the prediction of non-stationary time series. It has two parameters, which are difficult to determine and to whose values the results are very sensitive. Since SSA is a deterministic method, it does not give good results when the time series is contaminated with a high noise level or correlated noise. Therefore, we introduce a novel method to handle these problems. It is based on the prediction of non-decimated wavelet (NDW) signals by SSA and then prediction of the residuals by wavelet regression. The advantages of our method are the automatic determination of parameters and the accounting for the stochastic structure of the time series. As shown on simulated and real data, we obtain better results than SSA, a non-parametric wavelet regression method, and the Holt-Winters method.
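    For context, a minimal SSA smoothing sketch on synthetic data; the paper's NDW/wavelet-regression hybrid is not reproduced:

    ```python
    # Basic SSA: embed, truncate the SVD, and reconstruct by anti-diagonal
    # (Hankel) averaging. `window` and `rank` are the two sensitive parameters.
    import numpy as np

    def ssa_reconstruct(x, window, rank):
        n = len(x)
        k = n - window + 1
        X = np.column_stack([x[i:i + window] for i in range(k)])  # trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]                 # rank-r approximation
        rec, cnt = np.zeros(n), np.zeros(n)
        for j in range(k):
            rec[j:j + window] += Xr[:, j]
            cnt[j:j + window] += 1
        return rec / cnt

    t = np.arange(400)
    signal = np.sin(2 * np.pi * t / 50)
    x = signal + 0.4 * np.random.randn(t.size)
    smooth = ssa_reconstruct(x, window=100, rank=2)
    print("error std:", np.std(x - signal), "->", np.std(smooth - signal))
    ```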

  16. Parametric study of flame radiation characteristics of a tubular-can combustor

    NASA Technical Reports Server (NTRS)

    Humenik, F. M.; Claus, R. W.; Neely, G. M.

    1983-01-01

    A series of combustor tests was conducted with a tubular-can combustor to study flame radiation characteristics and their variation with parametric changes in combustor operating conditions. Two alternative combustor assemblies, each using a different fuel nozzle, were compared. Spectral and total radiation detectors were positioned at three stations along the length of the combustor can. Data were obtained for a range of pressures from 0.34 to 2.07 MPa (50 to 300 psia), inlet temperatures from 533 to 700 K (500 to 800 F), for Jet A (13.9% hydrogen) and ERBS (12.9% hydrogen) fuels, and for fuel-air ratios nominally from 0.008 to 0.021. Spectral radiation data, total radiant heat flux data, and liner temperature data are presented to illustrate the flame radiation characteristics and effects in the primary, secondary, and tertiary combustion zones.

  17. Empirically Estimable Classification Bounds Based on a Nonparametric Divergence Measure

    PubMed Central

    Berisha, Visar; Wisler, Alan; Hero, Alfred O.; Spanias, Andreas

    2015-01-01

    Information divergence functions play a critical role in statistics and information theory. In this paper we show that a non-parametric f-divergence measure can be used to provide improved bounds on the minimum binary classification probability of error for the case when the training and test data are drawn from the same distribution and for the case where there exists some mismatch between training and test distributions. We confirm the theoretical results by designing feature selection algorithms using the criteria from these bounds and by evaluating the algorithms on a series of pathological speech classification tasks. PMID:26807014

  18. Self-organising mixture autoregressive model for non-stationary time series modelling.

    PubMed

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks a time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In this way, a global non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure is introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented, and the results show that the proposed SOMAR network is effective and superior to other similar approaches.

  19. Analysis of Parametric Adaptive Signal Detection with Applications to Radars and Hyperspectral Imaging

    DTIC Science & Technology

    2010-02-01

    Several important issues associated with the proposed parametric (NS-AR) model are discussed, including model order selection, training screening, and time-series-based whitening.

  1. Electrical characterization of standard and radiation-hardened RCA CDP1856D 4-BIT, CMOS, bus buffer/separator

    NASA Technical Reports Server (NTRS)

    Stokes, R. L.

    1979-01-01

    Tests performed to determine the accuracy and efficiency of bus separators used in microprocessors are presented. Functional, AC parametric, and DC parametric tests were performed on a Tektronix S-3260 automated test system. All the devices passed the functional tests and yielded nominal values in the parametric tests.

  2. Spectral and cross-spectral analysis of uneven time series with the smoothed Lomb-Scargle periodogram and Monte Carlo evaluation of statistical significance

    NASA Astrophysics Data System (ADS)

    Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.

    2012-12-01

    Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) it explicitly adjusts the statistical significance for any bias introduced by variance-reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric, computer-intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is in the public domain, provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/). Different examples (with simulated and real data) are described in this paper to corroborate the methodology and the implementation of these two new programs.
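    For illustration, a hedged sketch of the core idea on synthetic uneven data; the published programs' smoothing and bias adjustment are not reproduced:

    ```python
    # Lomb-Scargle periodogram of unevenly sampled data, with a permutation
    # test for the significance of the highest peak.
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(2)
    t = np.sort(rng.uniform(0, 100, 120))            # uneven sampling times
    y = np.sin(2 * np.pi * t / 8.0) + rng.normal(0, 0.8, t.size)
    y -= y.mean()

    w = 2 * np.pi * np.linspace(0.01, 0.5, 500)      # angular frequencies
    p_obs = lombscargle(t, y, w)

    n_perm, exceed = 500, 0
    for _ in range(n_perm):
        exceed += lombscargle(t, rng.permutation(y), w).max() >= p_obs.max()
    print(f"peak period ~ {2 * np.pi / w[p_obs.argmax()]:.2f}, p = {exceed / n_perm:.3f}")
    ```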

  3. Parametric Analyses of Potential Effects on Upper Tropospheric/Lower Stratospheric Ozone Chemistry by a Future Fleet of High Speed Civil Transport (HSCT) Type Aircraft

    NASA Technical Reports Server (NTRS)

    Dutta, Mayurakshi; Patten, Kenneth O.; Wuebbles, Donald J.

    2005-01-01

    This report analyzes the potential impact of projected fleets of HSCT aircraft (currently not under development) through a series of parametric analyses that examine the envelope of potential effects on ozone over a range of total fuel burns, emission indices of nitrogen oxides, and cruise altitudes.

  4. [Multivariate Adaptive Regression Splines (MARS), an alternative for the analysis of time series].

    PubMed

    Vanegas, Jairo; Vásquez, Fabián

    Multivariate Adaptive Regression Splines (MARS) is a non-parametric modelling method that extends the linear model, incorporating nonlinearities and interactions between variables. It is a flexible tool that automates the construction of predictive models: selecting relevant variables, transforming the predictor variables, processing missing values and preventing overfitting using a self-test. It is also able to predict, taking into account structural factors that might influence the outcome variable, thereby generating hypothetical models. The end result can identify relevant cut-off points in data series. It is rarely used in health research, so it is proposed here as a tool for the evaluation of relevant public health indicators. For demonstration purposes, data series on the mortality of children under 5 years of age in Costa Rica, covering the period 1978-2008, were used. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  5. Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes

    NASA Astrophysics Data System (ADS)

    Zhou, Wei-Xing; Sornette, Didier

    We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures available at , which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge for himself or herself the validity and robustness of the results.

  6. Measuring predictability in ultrasonic signals: an application to scattering material characterization.

    PubMed

    Carrión, Alicia; Miralles, Ramón; Lara, Guillermo

    2014-09-01

    In this paper, we present a novel and completely different approach to the problem of scattering material characterization: measuring the degree of predictability of the time series. Measuring predictability can provide information on the strength of the deterministic component of the time series relative to the whole acquired time series. This relationship can provide information about coherent reflections in material grains with respect to the rest of the incoherent noise that typically appears in non-destructive testing using ultrasonics. This is a non-parametric technique, commonly used in chaos theory, that does not require making any kind of assumptions about attenuation profiles. In highly scattering media (low SNR), it has been shown theoretically that the degree of predictability allows material characterization. The experimental results obtained in this work with 32 cement probes of 4 different porosities demonstrate the ability of this technique to perform classification. It has also been shown that, in this particular application, the measurement of predictability can be used as an indicator of the porosity percentage of the test samples with great accuracy. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Improving the performance of streamflow forecasting model using data-preprocessing technique in Dungun River Basin

    NASA Astrophysics Data System (ADS)

    Khai Tiu, Ervin Shan; Huang, Yuk Feng; Ling, Lloyd

    2018-03-01

    An accurate streamflow forecasting model is important for the development of a flood mitigation plan to ensure sustainable development of a river basin. This study adopted the Variational Mode Decomposition (VMD) data-preprocessing technique to process and denoise the rainfall data before feeding them into the Support Vector Machine (SVM) streamflow forecasting model, in order to improve the performance of the selected model. Rainfall data and river water level data for the period 1996-2016 were used for this purpose. Homogeneity tests (the Standard Normal Homogeneity Test, the Buishand Range Test, the Pettitt Test and the Von Neumann Ratio Test) and normality tests (the Shapiro-Wilk Test, the Anderson-Darling Test, the Lilliefors Test and the Jarque-Bera Test) were carried out on the rainfall series. Homogeneous and non-normally distributed data were found at all stations. The recorded rainfall data show that the Dungun River Basin receives higher monthly rainfall from November to February, during the Northeast Monsoon. Thus, the monthly and seasonal rainfall series of this monsoon are the main focus of this research, as floods usually happen during the Northeast Monsoon period. The water levels predicted by the SVM model were assessed against the observed water levels using non-parametric statistical tests (the Biased Method, Kendall's Tau B Test and Spearman's Rho Test).

  8. Temporal changes and variability in temperature series over Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Suhaila, Jamaludin

    2015-02-01

    With the current concern over climate change, descriptions of how temperature series have changed over time are very useful. Annual mean temperature has been analyzed for several stations over Peninsular Malaysia. Non-parametric statistical techniques such as the Mann-Kendall test and Theil-Sen slope estimation are used primarily for assessing the significance and detection of trends, while the nonparametric Pettitt's test and the sequential Mann-Kendall test are adopted to detect any abrupt climate change. Statistically significant increasing trends in annual mean temperature are detected for almost all studied stations, with the magnitude of the significant trends varying from 0.02°C to 0.05°C per year. The results show that the climate over Peninsular Malaysia is getting warmer. In addition, the results on abrupt changes in temperature using Pettitt's and the sequential Mann-Kendall test reveal onsets of trends which can be related to El Niño episodes that occur in Malaysia. In general, the analysis results can help local stakeholders and water managers to understand the risks and vulnerabilities related to climate change in terms of mean events in the region.
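    As an illustration of the two trend tools named above (not the paper's own code), scipy provides Kendall's tau, which against a time index yields the Mann-Kendall test, and the Theil-Sen slope; the temperature series below is synthetic:

    ```python
    import numpy as np
    from scipy import stats

    years = np.arange(1970, 2015)
    temp = (26.5 + 0.03 * (years - years[0])
            + np.random.default_rng(3).normal(0, 0.2, years.size))

    tau, p_mk = stats.kendalltau(years, temp)               # Mann-Kendall trend test
    slope, intercept, lo, hi = stats.theilslopes(temp, years, 0.95)
    print(f"tau = {tau:.2f}, p = {p_mk:.4f}, "
          f"Sen slope = {slope:.3f} degC/yr [{lo:.3f}, {hi:.3f}]")
    ```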

  9. A unified framework for weighted parametric multiple test procedures.

    PubMed

    Xi, Dong; Glimm, Ekkehard; Maurer, Willi; Bretz, Frank

    2017-09-01

    We describe a general framework for weighted parametric multiple test procedures based on the closure principle. We utilize general weighting strategies that can reflect complex study objectives and include many procedures in the literature as special cases. The proposed weighted parametric tests bridge the gap between rejection rules using either adjusted significance levels or adjusted p-values. This connection is made by allowing intersection hypotheses of the underlying closed test procedure to be tested at a level smaller than α, which may also be necessary to take certain study situations into account. For such cases we introduce a subclass of exact α-level parametric tests that satisfy the consonance property. When the correlation is known only for certain subsets of the test statistics, a new procedure is proposed to fully utilize this knowledge within each subset. We illustrate the proposed weighted parametric tests using a clinical trial example and conduct a simulation study to investigate their operating characteristics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
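    As a hedged illustration of the adjusted-level idea (my construction, not the paper's algorithm), for two one-sided z statistics with known correlation one can find the largest multiplier c such that the weighted parametric test of the intersection hypothesis is exact at level alpha:

    ```python
    # Find c with P(p1 <= c*w1*alpha or p2 <= c*w2*alpha) = alpha under H12.
    # c = 1 recovers weighted Bonferroni; c > 1 exploits the known correlation.
    import numpy as np
    from scipy.stats import norm, multivariate_normal
    from scipy.optimize import brentq

    alpha, w, rho = 0.025, np.array([0.6, 0.4]), 0.5
    mvn = multivariate_normal(cov=np.array([[1, rho], [rho, 1]]))

    def size_gap(c):
        z = norm.isf(c * w * alpha)          # per-hypothesis critical values
        return (1 - mvn.cdf(z)) - alpha      # rejection probability minus alpha

    c = brentq(size_gap, 1.0, 1.0 / w.max())
    print(f"c = {c:.4f}, adjusted levels = {c * w * alpha}")
    ```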

  10. A Study on the Role of Grain-Boundary Engineering in Promoting High-Cycle Fatigue Resistance and Improving Reliability in Metallic Alloys for Propulsion Systems

    DTIC Science & Technology

    2005-04-30

    In addition, air cooling instead of water or oil quenching was adopted to avoid quench cracking. Based on a series of preliminary multi-parametric... microstructures were then grain-boundary engineered using four cycles of strain and high-temperature annealing of the single-phase alloy, specifically... automated load-shedding at a normalized K-gradient of -0.08 mm(-1), as specified in the standard. Multi-sample tests were conducted to verify the effect of

  11. Variational formulation of high performance finite elements: Parametrized variational principles

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.; Militello, Carmello

    1991-01-01

    High performance elements are simple finite elements constructed to deliver engineering accuracy with coarse arbitrary grids. This is part of a series on the variational basis of high-performance elements, with emphasis on those constructed with the free formulation (FF) and assumed natural strain (ANS) methods. Parametrized variational principles that provide a foundation for the FF and ANS methods, as well as for a combination of both are presented.

  12. Static Strength of Adhesively-bonded Woven Fabric Kenaf Composite Plates

    NASA Astrophysics Data System (ADS)

    Hilton, Ahmad; Lee, Sim Yee; Supar, Khairi

    2017-06-01

    Natural fibers can be used as reinforcing materials and combined with an epoxy resin matrix system to form composite materials with superior specific strength (or stiffness). The advantages of natural fibers such as kenaf are that they are renewable, less hazardous during fabrication and handling, and relatively cheap compared to synthetic fibers. The aim of the current work is to conduct a parametric study of the static strength of adhesively bonded woven fabric kenaf composite plates. Composite panels were fabricated using hand lay-up techniques, with variation of stacking sequence, overlap length, joint type and lay-up type as identified in the testing series. Quasi-static testing was carried out following codes of practice. Load-displacement profiles were analyzed to study the structural response prior to ultimate failure. It was found that the cross-ply lay-up demonstrates better static strength than its quasi-isotropic counterpart, due to the larger volume of 0° plies in the cross-ply lay-up. A larger overlap length gives better joint strength, as expected; however, this comes at the cost of a weight penalty in the joined structure. Most samples failed within the adhesive region (cohesive failure mode); however, a few samples demonstrated interface failure. Good correlations were found in the parametric study and are discussed in the respective sections.

  13. Why preferring parametric forecasting to nonparametric methods?

    PubMed

    Jabot, Franck

    2015-05-07

    A recent series of papers by Charles T. Perretti and collaborators have shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise for two main reasons: the instability of parametric inference procedures in chaotic systems, which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It is argued here that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed with simple Bayesian model-checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the parametric model fitted to the data. In contrast, nonparametric techniques provide forecasts with unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches, until methods have been developed to assess the reliability of nonparametric forecasting. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Design and nonlinear modeling of a sensitive sensor for the measurement of flow in mice.

    PubMed

    Bou Jawde, Samer; Smith, Bradford J; Sonnenberg, Adam; Bates, Jason H T; Suki, Bela

    2018-06-07

    While many studies rely on flow and pressure measurements in small animal models of respiratory disease, such measurements can be inaccurate and difficult to obtain. Thus, the goal of this study was to design and implement an easy-to-manufacture and accurate sensor capable of monitoring flow. We designed and 3-D printed a flowmeter and utilized parametric (resistance and inertance) and nonparametric (polynomial and Volterra series) system identification to characterize the device. The sensor was tested in a closed system for apparent flow using the common mode rejection ratio (CMRR). The sensor properly measured tidal volumes and respiratory rates in spontaneously breathing mice. The device was used to evaluate a ventilator's ability to deliver a prescribed volume before and after lung injury. The parametric and polynomial models provided a reasonable prediction of the independently measured flow (coefficient of determination Cv = 0.9591 and 0.9147, respectively), but Volterra series of 1st, 2nd, and 3rd order with a memory of six time points provided better fits (Cv = 0.9775, 0.9787, and 0.9954, respectively). At and below the mouse breathing frequency (1-5 Hz), the CMRR was higher than 40 dB. Following lung injury, the sensor revealed a significant drop in delivered tidal volume. We demonstrate that the application of nonparametric nonlinear Volterra series modeling in combination with 3-D printing technology allows the inexpensive and rapid fabrication of an accurate flow sensor for continuously measuring small flows in various physiological conditions. © 2018 Institute of Physics and Engineering in Medicine.
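    A minimal sketch of the parametric branch described above, assuming a linear resistance-inertance model dP = R*Q + I*dQ/dt fitted by linear least squares; the signals are synthetic, not the device's calibration data:

    ```python
    import numpy as np

    fs = 1000.0
    t = np.arange(0, 2, 1 / fs)
    Q = 2e-4 * np.sin(2 * np.pi * 3 * t)                 # flow, roughly mouse scale
    dQ = np.gradient(Q, 1 / fs)
    dP = 0.5 * Q + 1e-3 * dQ + 1e-6 * np.random.randn(t.size)   # "measured" pressure

    A = np.column_stack([Q, dQ])                          # regressors
    (R, I), *_ = np.linalg.lstsq(A, dP, rcond=None)
    print(f"R = {R:.3f}, I = {I:.2e}")
    ```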

  15. Cavity Heating Experiments Supporting Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    Everhart, Joel L.; Berger, Karen T.; Bey, Kim S.; Merski, N. Ronald; Wood, William A.

    2011-01-01

    The two-color thermographic phosphor method has been used to map the local heating augmentation of scaled idealized cavities at conditions simulating the windward surface of the Shuttle Orbiter Columbia during flight STS-107. Two experiments initiated in support of the Columbia Accident Investigation were conducted in the Langley 20-Inch Mach 6 Tunnel. Generally, the first test series evaluated open (length-to-depth less than 10) rectangular cavity geometries proposed as possible damage scenarios resulting from foam and ice impact during launch at several discrete locations on the vehicle windward surface, though some closed (length-to-depth greater than 13) geometries were briefly examined. The second test series was designed to parametrically evaluate heating augmentation in closed rectangular cavities. The tests were conducted under laminar cavity entry conditions over a range of local boundary layer edge-flow parameters typical of re-entry. Cavity design parameters were developed using laminar computational predictions, while the experimental boundary layer state conditions were inferred from the heating measurements. An analysis of the aeroheating caused by cavities allowed exclusion of non-breeching damage from the possible loss scenarios being considered during the investigation.

  16. New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences

    NASA Astrophysics Data System (ADS)

    Lungaroni, M.; Murari, A.; Peluso, E.; Gelfusa, M.; Malizia, A.; Vega, J.; Talebzadeh, S.; Gaudio, P.

    2016-04-01

    In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Initial data-processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression (SVR) was devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a method of filtering is investigated first, comparing it with the most popular alternative techniques. Then Support Vector Regression is applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives on the interpretation of the data.

  17. Determination of ankle external fixation stiffness by expedited interactive finite element analysis.

    PubMed

    Nielsen, Jonathan K; Saltzman, Charles L; Brown, Thomas D

    2005-11-01

    Interactive finite element analysis holds the potential to quickly and accurately determine the mechanical stiffness of alternative external fixator frame configurations. Using as an example Ilizarov distraction of the ankle, a finite element model and graphical user interface were developed that provided rapid, construct-specific information on fixation rigidity. After input of specific construct variables, the finite element software determined the resulting tibial displacement for a given configuration, typically in 15 s. The formulation was employed to investigate constructs used to treat end-stage arthritis, both in a parametric series and for five specific clinical distraction cases. Parametric testing of 15 individual variables revealed that tibial half-pins were much more effective than transfixion wires in limiting axial tibial displacement. Factors most strongly contributing to stiffening the construct included placing the tibia closer to the fixator rings and mounting the pins to the rings at the circumferential location nearest the bone. Benchtop mechanical validation results differed inappreciably from the finite element computations.

  18. Microwave Semiconductor Equipment Produced in Poland,

    DTIC Science & Technology

    1984-01-20

    was started on varactors for parametric amplifiers, which took place in the Institute for Basic Problems of Technology of the PAN [1]. The research unit... technology of varactors intended for parametric amplifiers and harmonic generators. As a result of this, a series of types of germanium, silicon and gallium-arsenide varactors was produced [2-14]. These varactors were used, for example, in Avia A and Avia B radar. The working out of the production of

  19. Four photon parametric amplification. [in unbiased Josephson junction

    NASA Technical Reports Server (NTRS)

    Parrish, P. T.; Feldman, M. J.; Ohta, H.; Chiao, R. Y.

    1974-01-01

    An analysis is presented describing four-photon parametric amplification in an unbiased Josephson junction. Central to the theory is the model of the Josephson effect as a nonlinear inductance. Linear, small signal analysis is applied to the two-fluid model of the Josephson junction. The gain, gain-bandwidth product, high frequency limit, and effective noise temperature are calculated for a cavity reflection amplifier. The analysis is extended to multiple (series-connected) junctions and subharmonic pumping.

  20. Parametric study of closed wet cooling tower thermal performance

    NASA Astrophysics Data System (ADS)

    Qasim, S. M.; Hayder, M. J.

    2017-08-01

    The present study involves experimental and theoretical analyses to evaluate the thermal performance of a modified Closed Wet Cooling Tower (CWCT). The experimental study includes the design, manufacture and testing of a prototype of a modified counter-flow forced-draft CWCT. The modification is based on adding packing to the conventional CWCT. A series of experiments was carried out at different operational parameters. In terms of energy analysis, the thermal performance parameters of the tower are: cooling range, tower approach, cooling capacity, thermal efficiency, and heat and mass transfer coefficients. The theoretical study develops Artificial Neural Network (ANN) models to predict the various thermal performance parameters of the tower. Utilizing experimental data for training and testing, the models were trained by a multi-layer back-propagation algorithm, varying all operational parameters stated in the experimental tests.

  1. The Homogeneity of the Potsdam Solar Radiation Data

    NASA Astrophysics Data System (ADS)

    Behrens, K.

    2009-04-01

    At the Meteorological Station in Potsdam (Germany), the measurement of sunshine duration started as early as 1893. Later, in 1937, the registration of global, diffuse and direct solar radiation was begun with pyranometers and a pyrheliometer. Since 1893, sunshine duration has been measured with the same method, the Campbell-Stokes sunshine recorder, at the same site, while the measurements of solar radiation changed in equipment, measurement methods and location. Furthermore, it was firstly necessary to supplement some missing data within the time series and secondly, it was desirable to extend the series of global radiation by regression with the sunshine duration backward to 1893. Because solar radiation, especially global radiation, is one of the most important quantities for climate research, it is necessary to investigate the homogeneity of these time series. First, the history was studied and as much information as possible about all parameters that could influence the data was gathered. In a second step, these metadata were reviewed critically, followed by a discussion of the potential effects of local factors on the homogeneity of the data. In a first step of data rehabilitation, the so-called engineering corrections (data levelling to WRR and SI units) were made, followed by the filling of gaps. Finally, for every month and the year, the generated time series of measured data (1937/2008) and the complete series, prolonged by regression and measurements (1893/2008), were tested for homogeneity with the following distribution-free tests: the WILCOXON (U) test, the MANN-KENDALL test and progressive analysis were used to examine the stability of the mean and the dispersion, while the Wald-Wolfowitz test checked the first-order autocorrelation. These non-parametric tests were used because radiation data frequently do not fulfil the assumption of a GAUSSian (normal) distribution. The investigations showed that the discontinuities found are in most cases not related to metadata marking changes of site, equipment, etc. Also, the points of intersection, where the calculated time series were connected to the measurements, were not marked by discontinuities. This means that the time series are stable, and the measurements and the calculated part are in good agreement.

  2. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    PubMed

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  3. Experimental evaluation of exhaust mixers for an Energy Efficient Engine

    NASA Technical Reports Server (NTRS)

    Kozlowski, H.; Kraft, G.

    1980-01-01

    Static scale model tests were conducted to evaluate exhaust system mixers for a high bypass ratio engine as part of the NASA-sponsored Energy Efficient Engine program. Gross thrust coefficients were measured for a series of mixer configurations which included variations in the number of mixer lobes, tailpipe length, mixer penetration, and mixer length. All of these parameters have a significant impact on exhaust system performance. In addition, flow visualization pictures and pressure/temperature traverses were obtained for selected configurations. Parametric performance trends are discussed and the results are considered relative to the Energy Efficient Engine program goals.

  4. Temporal clustering of floods in Germany: Do flood-rich and flood-poor periods exist?

    NASA Astrophysics Data System (ADS)

    Merz, Bruno; Nguyen, Viet Dung; Vorogushyn, Sergiy

    2016-10-01

    The repeated occurrence of exceptional floods within a few years, such as the Rhine floods in 1993 and 1995 and the Elbe and Danube floods in 2002 and 2013, suggests that floods in Central Europe may be organized in flood-rich and flood-poor periods. This hypothesis is studied by testing the significance of temporal clustering in flood occurrence (peak-over-threshold) time series for 68 catchments across Germany for the period 1932-2005. To assess the robustness of the results, different methods are used: Firstly, the index of dispersion, which quantifies the departure from a homogeneous Poisson process, is investigated. Further, the time-variation of the flood occurrence rate is derived by non-parametric kernel implementation and the significance of clustering is evaluated via parametric and non-parametric tests. Although the methods give consistent overall results, the specific results differ considerably. Hence, we recommend applying different methods when investigating flood clustering. For flood estimation and risk management, it is of relevance to understand whether clustering changes with flood severity and time scale. To this end, clustering is assessed for different thresholds and time scales. It is found that the majority of catchments show temporal clustering at the 5% significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
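    As a sketch of the first diagnostic named above, the index of dispersion of annual event counts can be computed directly; the event years below are synthetic:

    ```python
    # Index of dispersion = variance/mean of annual counts; it equals 1 for a
    # homogeneous Poisson process and exceeds 1 under temporal clustering.
    import numpy as np

    rng = np.random.default_rng(4)
    events = rng.choice(np.arange(1932, 2006), size=90)   # peak-over-threshold years
    counts = np.bincount(events - 1932, minlength=74)

    iod = counts.var(ddof=1) / counts.mean()
    print(f"index of dispersion = {iod:.2f} (Poisson expectation: 1.0)")
    ```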

  5. Identification of trends in rainfall, rainy days and 24 h maximum rainfall over subtropical Assam in Northeast India

    NASA Astrophysics Data System (ADS)

    Jhajharia, Deepak; Yadav, Brijesh K.; Maske, Sunil; Chattopadhyay, Surajit; Kar, Anil K.

    2012-01-01

    Trends in rainfall, rainy days and 24 h maximum rainfall are investigated using the Mann-Kendall non-parametric test at twenty-four sites of subtropical Assam located in the northeastern region of India. The trends are statistically confirmed by both parametric and non-parametric methods, and the magnitudes of significant trends are obtained through the linear regression test. In Assam, the average monsoon rainfall (rainy days) during the monsoon months of June to September is about 1606 mm (70), which accounts for about 70% (64%) of the annual rainfall (rainy days). On monthly time scales, sixteen and seventeen sites (twenty-one sites each for rainy days) witnessed decreasing trends in the total rainfall (rainy days), out of which one and three trends (seven trends each for rainy days) were found to be statistically significant in June and July, respectively. On the other hand, seventeen sites witnessed increasing trends in rainfall in the month of September, but none were statistically significant. In December (February), eighteen (twenty-two) sites witnessed decreasing (increasing) trends in total rainfall, out of which five (three) trends were statistically significant. For the rainy days during the months of November to January, twenty-two or more sites witnessed decreasing trends in Assam, and for nine (November), twelve (January) and eighteen (December) sites these trends were statistically significant. These observed changes in rainfall, although not convincing for most time series, which predominantly show no significance, along with the well-reported climatic warming in the monsoon and post-monsoon seasons, may have implications for human health and water resources management over biodiversity-rich Northeast India.

  6. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    PubMed

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  7. Parametric modeling studies of turbulent non-premixed jet flames with thin reaction zones

    NASA Astrophysics Data System (ADS)

    Wang, Haifeng

    2013-11-01

    The Sydney piloted jet flame series (Flames L, B, and M) features thinner reaction zones and hence poses greater modeling challenges than the Sandia piloted jet flames (Flames D, E, and F). Recently, the Sydney flames have received renewed interest due to these challenges, and several new modeling efforts have emerged. However, no systematic parametric modeling studies have been reported for the Sydney flames. A large set of modeling computations of the Sydney flames is presented here using the coupled large eddy simulation (LES)/probability density function (PDF) method. Parametric studies are performed to gain insight into the model performance, its sensitivity, and the effect of numerics.

  8. Fuzzy interval Finite Element/Statistical Energy Analysis for mid-frequency analysis of built-up systems with mixed fuzzy and interval parameters

    NASA Astrophysics Data System (ADS)

    Yin, Hui; Yu, Dejie; Yin, Shengwen; Xia, Baizhan

    2016-10-01

    This paper introduces mixed fuzzy and interval parametric uncertainties into the FE components of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model for mid-frequency analysis of built-up systems; an uncertain ensemble combining non-parametric uncertainty with mixed fuzzy and interval parametric uncertainties thus arises. A fuzzy interval Finite Element/Statistical Energy Analysis (FIFE/SEA) framework is proposed to obtain the uncertain responses of built-up systems, which are described as intervals with fuzzy bounds, termed fuzzy-bounded intervals (FBIs) in this paper. Based on the level-cut technique, a first-order fuzzy interval perturbation FE/SEA (FFIPFE/SEA) and a second-order fuzzy interval perturbation FE/SEA method (SFIPFE/SEA) are developed to handle the mixed parametric uncertainties efficiently. FFIPFE/SEA approximates the response functions by a first-order Taylor series, while SFIPFE/SEA improves the accuracy by retaining the second-order Taylor terms, neglecting all the mixed second-order terms. To further improve the accuracy, a Chebyshev fuzzy interval method (CFIM) is proposed, in which Chebyshev polynomials are used to approximate the response functions. The FBIs are eventually reconstructed by assembling the extrema solutions at all cut levels. Numerical results on two built-up systems verify the effectiveness of the proposed methods.

  9. Tri-Center Analysis: Determining Measures of Trichotomous Central Tendency for the Parametric Analysis of Tri-Squared Test Results

    ERIC Educational Resources Information Center

    Osler, James Edward

    2014-01-01

    This monograph provides an epistemological rationale for the design of a novel post hoc statistical measure called "Tri-Center Analysis". This new statistic is designed to analyze the post hoc outcomes of the Tri-Squared Test. In Tri-Center Analysis, trichotomous inferential parametric statistical measures are calculated from…

  10. Low Velocity Earth-Penetration Test and Analysis

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Jones, Yvonne; Knight, Norman F., Jr.; Kellas, Sotiris

    2001-01-01

    Modeling and simulation of structural impacts into soil continue to challenge analysts to develop accurate material models and detailed analytical simulations to predict the soil penetration event. This paper discusses finite element modeling of a series of penetrometer drop tests into soft clay. Parametric studies are performed with penetrometers of varying diameters, masses, and impact speeds to a maximum of 45 m/s. Parameters influencing the simulation such as the contact penalty factor and the material model representing the soil are also studied. An empirical relationship between key parameters is developed and is shown to correlate experimental and analytical results quite well. The results provide preliminary design guidelines for Earth impact that may be useful for future space exploration sample return missions.

  11. Empirical intrinsic geometry for nonlinear modeling and time series filtering.

    PubMed

    Talmon, Ronen; Coifman, Ronald R

    2013-07-30

    In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.

  12. Spectral decompositions of multiple time series: a Bayesian non-parametric approach.

    PubMed

    Macaro, Christian; Prado, Raquel

    2014-01-01

    We consider spectral decompositions of multiple time series that arise in studies where the interest lies in assessing the influence of two or more factors. We write the spectral density of each time series as a sum of the spectral densities associated to the different levels of the factors. We then use Whittle's approximation to the likelihood function and follow a Bayesian non-parametric approach to obtain posterior inference on the spectral densities based on Bernstein-Dirichlet prior distributions. The prior is strategically important as it carries identifiability conditions for the models and allows us to quantify our degree of confidence in such conditions. A Markov chain Monte Carlo (MCMC) algorithm for posterior inference within this class of frequency-domain models is presented. We illustrate the approach by analyzing simulated and real data via spectral one-way and two-way models. In particular, we present an analysis of functional magnetic resonance imaging (fMRI) brain responses measured in individuals who participated in a designed experiment to study pain perception in humans.

  13. A new parametric method to smooth time-series data of metabolites in metabolic networks.

    PubMed

    Miyawaki, Atsuko; Sriyudthsak, Kansuporn; Hirai, Masami Yokota; Shiraishi, Fumihide

    2016-12-01

    Mathematical modeling of large-scale metabolic networks usually requires smoothing of metabolite time-series data to account for measurement or biological errors. Accordingly, the accuracy of smoothing curves strongly affects the subsequent estimation of model parameters. Here, an efficient parametric method is proposed for smoothing metabolite time-series data, and its performance is evaluated. To simplify parameter estimation, the method uses S-system-type equations with simple power law-type efflux terms. Iterative calculation using this method was found to readily converge, because parameters are estimated stepwise. Importantly, smoothing curves are determined so that metabolite concentrations satisfy mass balances. Furthermore, the slopes of smoothing curves are useful in estimating parameters, because they are probably close to their true behaviors regardless of errors that may be present in the actual data. Finally, calculations for each differential equation were found to converge in much less than one second if initial parameters are set at appropriate (guessed) values. Copyright © 2016 Elsevier Inc. All rights reserved.
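
    As a toy illustration of the approach, one can smooth a single metabolite series with an S-system-type equation having a constant influx and a simple power-law efflux, fitting its parameters so the smoothing curve obeys the mass balance. All names and data below are synthetic placeholders, not the paper's equations or stepwise estimation scheme.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)
    t_obs = np.linspace(0.0, 10.0, 21)
    x_obs = 2.0 + 0.5 * np.exp(-0.4 * t_obs) + 0.05 * rng.standard_normal(t_obs.size)

    def simulate(params):
        # S-system-type mass balance: dX/dt = a - b * X**h
        a, b, h, x0 = params
        sol = solve_ivp(lambda t, x: a - b * x ** h,
                        (t_obs[0], t_obs[-1]), [x0], t_eval=t_obs)
        return sol.y[0]

    fit = least_squares(lambda p: simulate(p) - x_obs, x0=[1.0, 0.5, 1.0, 2.5],
                        bounds=([0, 0, 0.1, 0.1], [10, 10, 3, 10]))
    smooth = simulate(fit.x)  # smoothing curve whose slopes satisfy the mass balance
    ```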

  14. Modeling the full-bridge series-resonant power converter

    NASA Technical Reports Server (NTRS)

    King, R. J.; Stuart, T. A.

    1982-01-01

    A steady state model is derived for the full-bridge series-resonant power converter. Normalized parametric curves for various currents and voltages are then plotted versus the triggering angle of the switching devices. The calculations are compared with experimental measurements made on a 50 kHz converter and a discussion of certain operating problems is presented.

  15. Characterizing time series via complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.

    2017-06-01

    The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q entropy) and after considering the proper generalization of the statistical complexity (q-complexity), we build up a parametric curve (the q-complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
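
    A minimal sketch of the entropy half of such a curve, using Bandt-Pompe ordinal-pattern probabilities and the normalized Tsallis q-entropy; the full q-complexity-entropy curve also pairs each entropy value with a generalized statistical complexity, which is omitted here.

    ```python
    import numpy as np
    from itertools import permutations

    def ordinal_probs(x, d=3):
        # Bandt-Pompe ordinal-pattern probabilities, embedding dimension d
        counts = {p: 0 for p in permutations(range(d))}
        for i in range(len(x) - d + 1):
            counts[tuple(np.argsort(x[i:i + d]))] += 1
        p = np.array(list(counts.values()), float)
        return p / p.sum()

    def tsallis_entropy(p, q):
        # Tsallis q-entropy normalized by its maximum (uniform) value
        n = len(p)
        p = p[p > 0]
        if abs(q - 1.0) < 1e-9:
            return -np.sum(p * np.log(p)) / np.log(n)   # Shannon limit
        return (1.0 - np.sum(p ** q)) / (1.0 - n ** (1.0 - q))

    x = np.random.default_rng(0).standard_normal(5000)
    probs = ordinal_probs(x)
    curve = [(q, tsallis_entropy(probs, q)) for q in np.logspace(-2, 2, 50)]
    ```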

  16. Cardiac-gated parametric images from 82Rb PET from dynamic frames and direct 4D reconstruction.

    PubMed

    Germino, Mary; Carson, Richard E

    2018-02-01

    Cardiac perfusion PET data can be reconstructed as a dynamic sequence and kinetic modeling performed to quantify myocardial blood flow, or reconstructed as static gated images to quantify function. Parametric images from dynamic PET are conventionally not gated, to allow use of all events with lower noise. An alternative method for dynamic PET is to incorporate the kinetic model into the reconstruction algorithm itself, bypassing the generation of a time series of emission images and directly producing parametric images. So-called "direct reconstruction" can produce parametric images with lower noise than the conventional method because the noise distribution is more easily modeled in projection space than in image space. In this work, we develop direct reconstruction of cardiac-gated parametric images for 82Rb PET with an extension of the Parametric Motion compensation OSEM List mode Algorithm for Resolution-recovery reconstruction for the one tissue model (PMOLAR-1T). PMOLAR-1T was extended to accommodate model terms to account for spillover from the left and right ventricles into the myocardium. The algorithm was evaluated on a 4D simulated 82Rb dataset, including a perfusion defect, as well as a human 82Rb list mode acquisition. The simulated list mode was subsampled into replicates, each with counts comparable to one gate of a gated acquisition. Parametric images were produced by the indirect (separate reconstructions and modeling) and direct methods for each of eight low-count and eight normal-count replicates of the simulated data, and each of eight cardiac gates for the human data. For the direct method, two initialization schemes were tested: uniform initialization, and initialization with the filtered iteration 1 result of the indirect method. For the human dataset, event-by-event respiratory motion compensation was included. The indirect and direct methods were compared for the simulated dataset in terms of bias and coefficient of variation as a function of iteration. Convergence of direct reconstruction was slow with uniform initialization; lower bias was achieved in fewer iterations by initializing with the filtered indirect iteration 1 images. For most parameters and regions evaluated, the direct method achieved the same or lower absolute bias at matched iteration as the indirect method, with 23%-65% lower noise. Additionally, the direct method gave better contrast between the perfusion defect and surrounding normal tissue than the indirect method. Gated parametric images from the human dataset had comparable relative performance of indirect and direct, in terms of mean parameter values per iteration. Changes in myocardial wall thickness and blood pool size across gates were readily visible in the gated parametric images, with higher contrast between myocardium and left ventricle blood pool in parametric images than gated SUV images. Direct reconstruction can produce parametric images with less noise than the indirect method, opening the potential utility of gated parametric imaging for perfusion PET. © 2017 American Association of Physicists in Medicine.

  17. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    PubMed

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
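
    To make the 0D-versus-1D distinction concrete, the sketch below contrasts pointwise (0D-style) bootstrap percentile intervals with a simultaneous (field-wide) band built from the bootstrap distribution of the maximum deviation over the trajectory. This is a toy stand-in under simple assumptions, not the RFT machinery used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def pointwise_ci(trajs, n_boot=2000, alpha=0.05):
        # 0D-style: a separate percentile interval at every node
        idx = rng.integers(0, len(trajs), size=(n_boot, len(trajs)))
        means = trajs[idx].mean(axis=1)            # (n_boot, n_nodes)
        return np.percentile(means, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)

    def simultaneous_band(trajs, n_boot=2000, alpha=0.05):
        # 1D-style: one critical deviation that controls error over the whole field
        m = trajs.mean(axis=0)
        idx = rng.integers(0, len(trajs), size=(n_boot, len(trajs)))
        dev = np.abs(trajs[idx].mean(axis=1) - m).max(axis=1)
        c = np.quantile(dev, 1 - alpha)
        return m - c, m + c

    trajs = rng.standard_normal((10, 101)).cumsum(axis=1)  # toy 1D trajectories
    lo0, hi0 = pointwise_ci(trajs)
    lo1, hi1 = simultaneous_band(trajs)  # wider, as it must cover the whole field
    ```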

  18. Variability of rainfall over Lake Kariba catchment area in the Zambezi river basin, Zimbabwe

    NASA Astrophysics Data System (ADS)

    Muchuru, Shepherd; Botai, Joel O.; Botai, Christina M.; Landman, Willem A.; Adeola, Abiodun M.

    2016-04-01

    In this study, average monthly and annual rainfall totals recorded for the period 1970 to 2010 from a network of 13 stations across the Lake Kariba catchment area of the Zambezi river basin were analyzed in order to characterize the spatial-temporal variability of rainfall across the catchment area. In the analysis, the data were subjected to intervention and homogeneity analysis using the Cumulative Summation (CUSUM) technique and step change analysis using rank-sum test. Furthermore, rainfall variability was characterized by trend analysis using the non-parametric Mann-Kendall statistic. Additionally, the rainfall series were decomposed and the spectral characteristics derived using Cross Wavelet Transform (CWT) and Wavelet Coherence (WC) analysis. The advantage of using the wavelet-based parameters is that they vary in time and can therefore be used to quantitatively detect time-scale-dependent correlations and phase shifts between rainfall time series at various localized time-frequency scales. The annual and seasonal rainfall series were homogeneous and demonstrated no apparent significant shifts. According to the inhomogeneity classification, the rainfall series recorded across the Lake Kariba catchment area belonged to category A (useful) and B (doubtful), i.e., there were zero to one and two absolute tests rejecting the null hypothesis (at 5 % significance level), respectively. Lastly, the long-term variability of the rainfall series across the Lake Kariba catchment area exhibited non-significant positive and negative trends with coherent oscillatory modes that are constantly locked in phase in the Morlet wavelet space.
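
    The Mann-Kendall statistic at the core of the trend analysis is compact enough to state directly; here is a minimal implementation without the tie and serial-correlation corrections that a production analysis would add.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        # two-sided Mann-Kendall trend test (no tie/autocorrelation correction)
        x = np.asarray(x, float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        return z, 2 * norm.sf(abs(z))   # statistic and two-sided p-value

    z, p = mann_kendall(np.random.default_rng(1).normal(size=41))  # e.g. annual totals
    ```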

  19. Problems of the design of low-noise input devices. [parametric amplifiers

    NASA Technical Reports Server (NTRS)

    Manokhin, V. M.; Nemlikher, Y. A.; Strukov, I. A.; Sharfov, Y. A.

    1974-01-01

    An analysis is given of the requirements placed on the elements of parametric centimeter-waveband amplifiers for achievement of minimal noise temperatures. A low-noise semiconductor parametric amplifier using germanium parametric diodes for a receiver operating in the 4 GHz band was developed and tested, confirming the possibility of satisfying all requirements.

  20. Characteristics of stereo reproduction with parametric loudspeakers

    NASA Astrophysics Data System (ADS)

    Aoki, Shigeaki; Toba, Masayoshi; Tsujita, Norihisa

    2012-05-01

    A parametric loudspeaker utilizes the nonlinearity of a medium and is known as a super-directivity loudspeaker. The parametric loudspeaker is one of the prominent applications of nonlinear ultrasonics. So far, its applications have been limited to monaural sound-reproduction systems for public address in museums, stations, streets, etc. In this paper, we discuss the characteristics of stereo reproduction with two parametric loudspeakers by comparing them with those of two ordinary dynamic loudspeakers. In subjective tests, three typical listening positions were selected to investigate the possibility of correct sound localization in a wide listening area. The binaural information was ILD (Interaural Level Difference) or ITD (Interaural Time Delay). The parametric loudspeaker was an equilateral hexagon; the inner and outer diameters were 99 and 112 mm, respectively. Signals were 500 Hz, 1 kHz, 2 kHz, and 4 kHz pure tones and pink noise. Three young males listened to the test signals 10 times in each listening condition. The results showed that listeners at the three typical listening positions perceived correct sound localization for all signals using the parametric loudspeakers, almost identical to that with the ordinary dynamic loudspeakers, except for sinusoidal waves with ITD. It was determined that the parametric loudspeaker could avoid the conflict between the binaural cues ILD and ITD that occurs in stereo reproduction with ordinary dynamic loudspeakers, because its super directivity suppresses the crosstalk components.

  1. Direct reconstruction of parametric images for brain PET with event-by-event motion correction: evaluation in two tracers across count levels

    NASA Astrophysics Data System (ADS)

    Germino, Mary; Gallezot, Jean-Dominique; Yan, Jianhua; Carson, Richard E.

    2017-07-01

    Parametric images for dynamic positron emission tomography (PET) are typically generated by an indirect method, i.e. reconstructing a time series of emission images, then fitting a kinetic model to each voxel time activity curve. Alternatively, ‘direct reconstruction’ incorporates the kinetic model into the reconstruction algorithm itself, directly producing parametric images from projection data. Direct reconstruction has been shown to achieve parametric images with lower standard error than the indirect method. Here, we present direct reconstruction for brain PET using event-by-event motion correction of list-mode data, applied to two tracers. Event-by-event motion correction was implemented for direct reconstruction in the Parametric Motion-compensation OSEM List-mode Algorithm for Resolution-recovery reconstruction. The direct implementation was tested on simulated and human datasets with tracers [11C]AFM (serotonin transporter) and [11C]UCB-J (synaptic density), which follow the 1-tissue compartment model. Rigid head motion was tracked with the Vicra system. Parametric images of K1 and distribution volume (VT = K1/k2) were compared to those generated by the indirect method by regional coefficient of variation (CoV). Performance across count levels was assessed using sub-sampled datasets. For simulated and real datasets at high counts, the two methods estimated K1 and VT with comparable accuracy. At lower count levels, the direct method was substantially more robust to outliers than the indirect method. Compared to the indirect method, direct reconstruction reduced regional K1 CoV by 35-48% (simulated dataset), 39-43% ([11C]AFM dataset) and 30-36% ([11C]UCB-J dataset) across count levels (averaged over regions at matched iteration); VT CoV was reduced by 51-58%, 54-60% and 30-46%, respectively. Motion correction played an important role in the dataset with larger motion: correction increased regional VT by 51% on average in the [11C]UCB-J dataset. Direct reconstruction of dynamic brain PET with event-by-event motion correction is achievable and dramatically more robust to noise in VT images than the indirect method.

  2. Improved hydrological model parametrization for climate change impact assessment under data scarcity - The potential of field monitoring techniques and geostatistics.

    PubMed

    Meyer, Swen; Blaschek, Michael; Duttmann, Rainer; Ludwig, Ralf

    2016-02-01

    According to current climate projections, Mediterranean countries are at high risk for an even more pronounced susceptibility to changes in the hydrological budget and extremes. These changes are expected to have severe direct impacts on the management of water resources, agricultural productivity and drinking water supply. Current projections of future hydrological change, based on regional climate model results and subsequent hydrological modeling schemes, are very uncertain and poorly validated. The Rio Mannu di San Sperate Basin, located in Sardinia, Italy, is one test site of the CLIMB project. The Water Simulation Model (WaSiM) was set up to model current and future hydrological conditions. The availability of measured meteorological and hydrological data is poor, as is common for many Mediterranean catchments. In this study we conducted a soil sampling campaign in the Rio Mannu catchment. We tested different deterministic and hybrid geostatistical interpolation methods on soil textures and tested the performance of the applied models. We calculated a new soil texture map based on the best prediction method. The soil model in WaSiM was set up with the improved new soil information. The simulation results were compared to a standard soil parametrization. WaSiM was validated with spatial evapotranspiration rates using the triangle method (Jiang and Islam, 1999). WaSiM was driven with the meteorological forcing taken from 4 different ENSEMBLES climate projections for a reference (1971-2000) and a future (2041-2070) time series. The climate change impact was assessed based on differences between the reference and future time series. The simulated results show a reduction of all hydrological quantities in the future spring season. Furthermore, simulation results reveal an earlier onset of dry conditions in the catchment. We show that a solid soil model setup based on short-term field measurements can improve long-term modeling results, which is especially important in ungauged catchments. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Parametric tests of a traction drive retrofitted to an automotive gas turbine

    NASA Technical Reports Server (NTRS)

    Rohn, D. A.; Lowenthal, S. H.; Anderson, N. E.

    1980-01-01

    The results of a test program in which a high-performance fixed-ratio Nasvytis Multiroller Traction Drive was retrofitted in place of a helical gear set on a gas turbine engine are presented. Parametric tests up to a maximum engine power turbine speed of 45,500 rpm and a power level of 11 kW were conducted. Comparisons were made to similar drives that were parametrically tested on a back-to-back test stand. The drive showed good compatibility with the gas turbine engine. Specific fuel consumption of the engine with the traction drive speed reducer installed was comparable to that of the original helical-gear-set-equipped engine.

  4. Parametric analysis of ATM solar array.

    NASA Technical Reports Server (NTRS)

    Singh, B. K.; Adkisson, W. B.

    1973-01-01

    The paper discusses the methods used for the calculation of ATM solar array performance characteristics and provides the parametric analysis of solar panels used in SKYLAB. To predict the solar array performance under conditions other than test conditions, a mathematical model has been developed. Four computer programs have been used to convert the solar simulator test data to the parametric curves. The first performs module summations, the second determines average solar cell characteristics which will cause a mathematical model to generate a curve matching the test data, the third is a polynomial fit program which determines the polynomial equations for the solar cell characteristics versus temperature, and the fourth program uses the polynomial coefficients generated by the polynomial curve fit program to generate the parametric data.
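
    The polynomial-fit step is conceptually simple; a sketch of the idea with NumPy, using illustrative numbers rather than the SKYLAB test data, would be:

    ```python
    import numpy as np

    # hypothetical cell characteristic (open-circuit voltage) vs. temperature
    temp_c = np.array([-50.0, -25.0, 0.0, 25.0, 50.0, 75.0])
    voc = np.array([0.680, 0.645, 0.610, 0.575, 0.540, 0.505])

    coeffs = np.polyfit(temp_c, voc, deg=2)   # polynomial coefficients vs. temperature
    voc_60c = np.polyval(coeffs, 60.0)        # evaluate at an off-test condition
    ```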

  5. Cryogenic storage tank thermal analysis

    NASA Technical Reports Server (NTRS)

    Wright, J. P.

    1976-01-01

    A parametric study discusses the relationship between cryogenic boil-off and factors such as tank size, insulation thickness and performance, structural-support heat leaks, and the use of vapor-cooled shields. Data are presented as a series of nomographs and curves.

  6. Nonparametric model validations for hidden Markov models with applications in financial econometrics

    PubMed Central

    Zhao, Zhibiao

    2011-01-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601

  7. A Note on the Assumption of Identical Distributions for Nonparametric Tests of Location

    ERIC Educational Resources Information Center

    Nordstokke, David W.; Colp, S. Mitchell

    2018-01-01

    Often, when testing for shift in location, researchers will utilize nonparametric statistical tests in place of their parametric counterparts when there is evidence or belief that the assumptions of the parametric test are not met (i.e., normally distributed dependent variables). An underlying and often unattended to assumption of nonparametric…

  8. Effect of Intercalated Water on Potassium Ion Transport through Kv1.2 Channels Studied via On-the-Fly Free-Energy Parametrization.

    PubMed

    Paz, S Alexis; Maragliano, Luca; Abrams, Cameron F

    2018-05-08

    We introduce a two-dimensional version of the method called on-the-fly free energy parametrization (OTFP) to reconstruct free-energy surfaces using molecular dynamics simulations, which we name OTFP-2D. We first test the new method by reconstructing the well-known dihedral-angle free-energy surface of solvated alanine dipeptide. Then, we use it to investigate the process of K+ ion translocation inside the Kv1.2 channel. By comparing a series of two-dimensional free energy surfaces for ion movement calculated with different conditions on the intercalated water molecules, we first recapitulate the widely accepted knock-on mechanism for ion translocation and then confirm that permeation occurs with water molecules alternated among the ions, in accordance with the latest experimental findings. From a methodological standpoint, our new OTFP-2D algorithm demonstrates the excellent sampling acceleration of temperature-accelerated molecular dynamics and the ability to efficiently compute 2D free-energy surfaces. It will therefore be useful in a large variety of complex biomacromolecular simulations.

  9. Seasonal trend analysis and ARIMA modeling of relative humidity and wind speed time series around Yamula Dam

    NASA Astrophysics Data System (ADS)

    Eymen, Abdurrahman; Köylü, Ümran

    2018-02-01

    Local climate change is determined by analysis of long-term recorded meteorological data. In the statistical analysis of the meteorological data, the Mann-Kendall rank test, one of the non-parametric tests, has been used; for determining the strength of the trend, the Theil-Sen method has been applied to the data obtained from 16 meteorological stations. The stations cover the provinces of Kayseri, Sivas, Yozgat, and Nevşehir in the Central Anatolia region of Turkey. Changes in land use affect local climate. Dams are structures that cause major changes on the land. Yamula Dam is located 25 km northwest of Kayseri. The dam has a huge water body of approximately 85 km2. These tests have been used for detecting the presence of any positive or negative trend in the meteorological data. The meteorological data relating to the seasonal average, maximum, and minimum values of relative humidity and the seasonal average wind speed have been organized as time series and the tests have been conducted accordingly. As a result of these tests, the following have been identified: increases were observed in minimum relative humidity values in the spring, summer, and autumn seasons. As for the seasonal average wind speed, decreases were detected at nine stations in all seasons, whereas increases were observed at four stations. After the trend analysis, pre-dam mean relative humidity time series were modeled with the Autoregressive Integrated Moving Average (ARIMA) model, a statistical modeling tool. Post-dam relative humidity values were then predicted by the ARIMA models.
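
    The ARIMA step can be reproduced in outline with statsmodels; the file name, column, and model order below are hypothetical placeholders rather than the ones identified in the study.

    ```python
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # hypothetical monthly pre-dam mean relative humidity series
    rh = pd.read_csv("relative_humidity.csv", index_col=0, parse_dates=True)["rh"]

    fit = ARIMA(rh, order=(1, 0, 1)).fit()   # order would come from model selection
    forecast = fit.forecast(steps=24)        # e.g. two years of monthly predictions
    print(fit.summary())
    ```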

  10. Parametric vs. non-parametric statistics of low resolution electromagnetic tomography (LORETA).

    PubMed

    Thatcher, R W; North, D; Biver, C

    2005-01-01

    This study compared the relative statistical sensitivity of non-parametric and parametric statistics of 3-dimensional current sources as estimated by the EEG inverse solution Low Resolution Electromagnetic Tomography (LORETA). One would expect approximately 5% false positives (classification of a normal as abnormal) at the P < .025 level of probability (two-tailed test) and approximately 1% false positives at the P < .005 level. EEG digital samples (2-second intervals sampled at 128 Hz, 1 to 2 minutes eyes closed) from 43 normal adult subjects were imported into the Key Institute's LORETA program. We then used the Key Institute's cross-spectrum and the Key Institute's LORETA output files (*.lor) as the 2,394 gray matter pixel representation of 3-dimensional currents at different frequencies. The mean and standard deviation *.lor files were computed for each of the 2,394 gray matter pixels for each of the 43 subjects. Tests of Gaussianity and different transforms were computed in order to best approximate a normal distribution for each frequency and gray matter pixel. The relative sensitivity of parametric vs. non-parametric statistics was compared using a "leave-one-out" cross-validation method in which individual normal subjects were withdrawn and then statistically classified as being either normal or abnormal based on the remaining subjects. Log10 transforms approximated a Gaussian distribution in the range of 95% to 99% accuracy. Parametric Z score tests at P < .05 cross-validation demonstrated an average misclassification rate of approximately 4.25%, and the range over the 2,394 gray matter pixels was 27.66% to 0.11%. At P < .01, parametric Z score cross-validation false positives averaged 0.26% and ranged from 6.65% to 0% false positives. The non-parametric Key Institute's t-max statistic at P < .05 had an average misclassification error rate of 7.64% and ranged from 43.37% to 0.04% false positives. The non-parametric t-max at P < .01 had an average misclassification rate of 6.67% and ranged from 41.34% to 0% false positives of the 2,394 gray matter pixels for any cross-validated normal subject. In conclusion, adequate approximation to a Gaussian distribution and high cross-validation accuracy can be achieved with the Key Institute's LORETA programs by using a log10 transform and parametric statistics, and parametric normative comparisons had lower false positive rates than the non-parametric tests.
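
    The log10-transform-plus-leave-one-out scheme is easy to sketch for a single gray matter pixel and frequency; the values below are hypothetical, and the thresholds correspond to two-tailed P < .05 and P < .01.

    ```python
    import numpy as np

    def loo_zscores(values):
        # leave-one-out Z-scores: each subject against the remaining subjects
        x = np.log10(np.asarray(values, float))  # log10 transform toward Gaussianity
        z = np.empty_like(x)
        for i in range(len(x)):
            rest = np.delete(x, i)
            z[i] = (x[i] - rest.mean()) / rest.std(ddof=1)
        return z

    z = loo_zscores([2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3])  # hypothetical pixel values
    abnormal_05 = np.abs(z) > 1.96   # would count as false positives for normals
    abnormal_01 = np.abs(z) > 2.58
    ```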

  11. A capacitive ultrasonic transducer based on parametric resonance.

    PubMed

    Surappa, Sushruta; Satir, Sarp; Levent Degertekin, F

    2017-07-24

    A capacitive ultrasonic transducer based on a parametric resonator structure is described and experimentally demonstrated. The transducer structure, which we call a capacitive parametric ultrasonic transducer (CPUT), uses a parallel plate capacitor with a movable membrane as part of a degenerate parametric series RLC resonator circuit with a resonance frequency of fo. When the capacitor plate is driven with an incident harmonic ultrasonic wave at the pump frequency of 2fo with sufficient amplitude, the RLC circuit becomes unstable and ultrasonic energy can be efficiently converted to an electrical signal at the fo frequency in the RLC circuit. An important characteristic of the CPUT is that unlike other electrostatic transducers, it does not require DC bias or permanent charging to be used as a receiver. We describe the operation of the CPUT using an analytical model and numerical simulations, which show drive-amplitude-dependent operation regimes including parametric resonance when a certain threshold is exceeded. We verify these predictions by experiments with a micromachined membrane-based capacitor structure in immersion, where ultrasonic waves incident at 4.28 MHz parametrically drive a signal with significant amplitude in the 2.14 MHz RLC circuit. With its unique features, the CPUT can be particularly advantageous for applications such as wireless power transfer for biomedical implants and acoustic sensing.

  12. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    PubMed

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for the full processing of fMRI rodent studies. Existing tools require the use of several different plug-ins, significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks®) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series, on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the primary somatosensory cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat.

  13. Results of Two-Stage Light-Gas Gun Development Efforts and Hypervelocity Impact Tests of Advanced Thermal Protection Materials

    NASA Technical Reports Server (NTRS)

    Cornelison, C. J.; Watts, Eric T.

    1998-01-01

    Gun development efforts to increase the launching capabilities of the NASA Ames 0.5-inch two-stage light-gas gun have been investigated. A gun performance simulation code was used to guide initial parametric variations and hardware modifications, in order to increase the projectile impact velocity capability to 8 km/s, while maintaining acceptable levels of gun barrel erosion and gun component stresses. Concurrent with this facility development effort, a hypervelocity impact testing series in support of the X-33/RLV program was performed in collaboration with Rockwell International. Specifically, advanced thermal protection system materials were impacted with aluminum spheres to simulate impacts with on-orbit space debris. Materials tested included AETB-8, AETB-12, AETB-20, and SIRCA-25 tiles, tailorable advanced blanket insulation (TABI), and high temperature AFRSI (HTA). The ballistic limit for several Thermal Protection System (TPS) configurations was investigated to determine particle sizes which cause threshold TPS/structure penetration. Crater depth in tiles was measured as a function of impact particle size. The relationship between coating type and crater morphology was also explored. Data obtained during this test series was used to perform a preliminary analysis of the risks to a typical orbital vehicle from the meteoroid and space debris environment.

  14. Study of magnetic resonance with parametric modulation in a potassium vapor cell

    NASA Astrophysics Data System (ADS)

    Zhang, Rui; Wang, Zhiguo; Peng, Xiang; Li, Wenhao; Li, Songjian; Guo, Hong; Cream Team

    2017-04-01

    A typical magnetic-resonance scheme employs a static bias magnetic field and an orthogonal driving magnetic field oscillating at the Larmor frequency, at which the atomic polarization precesses around the static magnetic field. We demonstrate in a potassium vapor cell the variations of the resonance condition and the spin precession dynamics resulting from parametric modulation of the bias field, which are in good agreement with theoretical predictions from the Bloch equation. We show that the driving magnetic field, with its frequency detuned by different harmonics of the parametric modulation frequency, can lead to resonance as well. Also, a series of frequency sidebands centered at the driving frequency and spaced by the parametric modulation frequency can be observed in the precession of the atomic polarization. These effects could be used in different atomic magnetometry applications. This work is supported by the National Science Fund for Distinguished Young Scholars of China (Grant No. 61225003) and the National Natural Science Foundation of China (Grant Nos. 61531003 and 61571018).

  15. Communication: Analytic continuation of the virial series through the critical point using parametric approximants.

    PubMed

    Barlow, Nathaniel S; Schultz, Andrew J; Weinstein, Steven J; Kofke, David A

    2015-08-21

    The mathematical structure imposed by the thermodynamic critical point motivates an approximant that synthesizes two theoretically sound equations of state: the parametric and the virial. The former is constructed to describe the critical region, incorporating all scaling laws; the latter is an expansion about zero density, developed from molecular considerations. The approximant is shown to yield an equation of state capable of accurately describing properties over a large portion of the thermodynamic parameter space, far greater than that covered by each treatment alone.

  16. Amplification of microwaves by superconducting microbridges in a four-wave parametric mode

    NASA Technical Reports Server (NTRS)

    Parrish, P. T.; Chiao, R. Y.

    1974-01-01

    Parametric amplification of microwaves was observed using thin-film junctions of the Anderson-Dayem type. A series of 80 such junctions were incorporated into the upper conductor of a broadband 50-ohm microstrip transmission line with no DC bias. The amplifier was operated in the 'doubly degenerate' mode with signal, pump, and idler frequencies closely and equally spaced. An electronic gain of 12 dB at 10 GHz was observed. The bandwidth was measured to be 1 GHz and the noise temperature to be less than 20 K.

  17. Quasi-phase-matched χ(3)-parametric interactions in sinusoidally tapered waveguides

    NASA Astrophysics Data System (ADS)

    Saleh, Mohammed F.

    2018-01-01

    In this article, I show how periodically tapered waveguides can be employed as efficient quasi-phase-matching schemes for four-wave mixing parametric processes in third-order nonlinear materials. As an example, a thorough study of enhancing third-harmonic generation in sinusoidally tapered fibers has been conducted. The quasi-phase-matching condition has been obtained for nonlinear parametric interactions in these structures using Fourier-series analysis. The dependencies of the conversion efficiency of the third harmonic on the modulation amplitude, tapering period, longitudinal-propagation direction, and pump wavelength have been studied. In comparison to uniform waveguides, the conversion efficiency has been enhanced by orders of magnitudes. I envisage that this work will have a great impact in the field of guided nonlinear optics using centrosymmetric materials.

  18. Parametric analyses of planned flowing uranium hexafluoride critical experiments

    NASA Technical Reports Server (NTRS)

    Rodgers, R. J.; Latham, T. S.

    1976-01-01

    Analytical investigations were conducted to determine preliminary design and operating characteristics of flowing uranium hexafluoride (UF6) gaseous nuclear reactor experiments in which a hybrid core configuration comprised of UF6 gas and a region of solid fuel will be employed. The investigations are part of a planned program to perform a series of experiments of increasing performance, culminating in an approximately 5 MW fissioning uranium plasma experiment. A preliminary design is described for an argon-buffer-gas-confined UF6 flow loop system for future use in flowing critical experiments. Initial calculations to estimate the operating characteristics of the gaseous fissioning UF6 in a confined flow test at a pressure of 4 atm indicate that temperature increases of approximately 100 and 1000 K in the UF6 may be obtained for total test power levels of 100 kW and 1 MW and test times of 320 and 32 sec, respectively.

  19. Electric-car simulation

    NASA Technical Reports Server (NTRS)

    Chapman, C. P.; Slusser, R. A.

    1980-01-01

    PARAMET, an interactive simulation program for parametric studies of electric vehicles, guides the user through a simulation via a menu and a series of prompts for input parameters. The program considers aerodynamic drag, rolling resistance, linear and rotational acceleration, and road gradient as the forces acting on the vehicle.
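
    The force balance such a program evaluates is standard; a sketch with illustrative vehicle parameters (all placeholders, not PARAMET's inputs) is shown below.

    ```python
    import numpy as np

    rho, g = 1.225, 9.81                 # air density [kg/m^3], gravity [m/s^2]
    m, cd, a_f, crr = 1500.0, 0.30, 2.2, 0.010  # mass, drag coeff., frontal area, rolling coeff.
    rot_factor = 1.05                    # rotational-inertia mass factor
    grade = 0.02                         # road gradient (rise/run)
    v, accel = 20.0, 0.5                 # speed [m/s], acceleration [m/s^2]

    theta = np.arctan(grade)
    f_total = (0.5 * rho * cd * a_f * v**2        # aerodynamic drag
               + crr * m * g * np.cos(theta)      # rolling resistance
               + m * g * np.sin(theta)            # road gradient
               + rot_factor * m * accel)          # linear + rotational acceleration
    power_kw = f_total * v / 1000.0               # tractive power demand [kW]
    ```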

  20. Results of the JIMO Follow-on Destinations Parametric Studies

    NASA Technical Reports Server (NTRS)

    Noca, Muriel A.; Hack, Kurt J.

    2005-01-01

    NASA's proposed Jupiter Icy Moon Orbiter (JIMO) mission, currently in conceptual development, is to be the first of a series of highly capable Nuclear Electric Propulsion (NEP) science-driven missions. To understand the implications of a multi-mission capability requirement on the JIMO vehicle and mission, the NASA Prometheus Program initiated a set of parametric high-level studies to be followed by a series of more in-depth studies. The potential JIMO follow-on destinations identified include a Saturn system tour, a Neptune system tour, a Kuiper Belt Objects rendezvous, an Interstellar Precursor mission, a Multiple Asteroid Sample Return and a Comet Sample Return. This paper shows that the baseline JIMO reactor and design envelope can satisfy five of the six follow-on destinations. Flight time to these destinations can be significantly reduced by increasing the launch energy and/or by inserting gravity assists into the heliocentric phase.

  1. Pretest uncertainty analysis for chemical rocket engine tests

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1987-01-01

    A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.
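
    The abstract does not reproduce the equations, but the kind of error budgeting involved can be illustrated with a generic first-order (root-sum-square) propagation for specific impulse, Isp = F/(mdot*g0); all numbers below are illustrative, not the study's values.

    ```python
    import numpy as np

    g0 = 9.80665                 # standard gravity [m/s^2]
    F, dF = 4.5e3, 22.5          # thrust [N] and its 1-sigma uncertainty
    mdot, dmdot = 1.5, 0.013     # mass flow [kg/s] and its 1-sigma uncertainty

    isp = F / (mdot * g0)
    # first-order propagation: (dIsp/Isp)^2 = (dF/F)^2 + (dmdot/mdot)^2
    rel = np.sqrt((dF / F) ** 2 + (dmdot / mdot) ** 2)
    print(f"Isp = {isp:.1f} s +/- {100 * rel:.2f}%")  # target: stay within 1 percent
    ```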

  2. Comparison of different synthetic 5-min rainfall time series on the results of rainfall runoff simulations in urban drainage modelling

    NASA Astrophysics Data System (ADS)

    Krämer, Stefan; Rohde, Sophia; Schröder, Kai; Belli, Aslan; Maßmann, Stefanie; Schönfeld, Martin; Henkel, Erik; Fuchs, Lothar

    2015-04-01

    The design of urban drainage systems with numerical simulation models requires long, continuous rainfall time series with high temporal resolution. However, suitable observed time series are rare. As a result, usual design concepts often use uncertain or unsuitable rainfall data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic rainfall data as input for urban drainage modelling are advanced, tested, and compared. Synthetic rainfall time series from three different precipitation model approaches - one parametric stochastic model (alternating renewal approach), one non-parametric stochastic model (resampling approach), and one downscaling approach from a regional climate model - are provided for three catchments with different sewer system characteristics in different climate regions of Germany:
    - Hamburg (northern Germany): maritime climate, mean annual rainfall 770 mm; combined sewer system length 1,729 km (city center of Hamburg); storm water sewer system length (Hamburg Harburg) 168 km
    - Brunswick (Lower Saxony, northern Germany): transitional climate from maritime to continental, mean annual rainfall 618 mm; sewer system length 278 km; connected impervious area 379 ha; height difference 27 m
    - Freiburg im Breisgau (southern Germany): Central European transitional climate, mean annual rainfall 908 mm; sewer system length 794 km; connected impervious area 1,546 ha; height difference 284 m
    Hydrodynamic models are set up for each catchment to simulate rainfall runoff processes in the sewer systems. Long-term event time series are extracted from the three different synthetic rainfall time series (comprising up to 600 years of continuous rainfall) provided for each catchment and from observed gauge rainfall (reference rainfall) according to national hydraulic design standards. The synthetic and reference long-term event time series are used as rainfall input for the hydrodynamic sewer models. For comparison of the synthetic rainfall time series against the reference rainfall and against each other, the number of surcharged manholes, the number of surcharges per manhole, and the average surcharge volume per manhole are applied as hydraulic performance criteria. The results are discussed and assessed to answer the following questions:
    - Are the synthetic rainfall approaches suitable to generate high-resolution rainfall series, and do they produce - in combination with numerical rainfall runoff models - valid results for the design of urban drainage systems?
    - What are the bounds of uncertainty in the runoff results depending on the synthetic rainfall model and on the climate region?
    The work is carried out within the SYNOPSE project, funded by the German Federal Ministry of Education and Research (BMBF).

  3. Non-linear auto-regressive models for cross-frequency coupling in neural time series

    PubMed Central

    Tallot, Lucille; Grabot, Laetitia; Doyère, Valérie; Grenier, Yves; Gramfort, Alexandre

    2017-01-01

    We address the issue of reliably detecting and quantifying cross-frequency coupling (CFC) in neural time series. Based on non-linear auto-regressive models, the proposed method provides a generative and parametric model of the time-varying spectral content of the signals. As this method models the entire spectrum simultaneously, it avoids the pitfalls related to incorrect filtering or the use of the Hilbert transform on wide-band signals. As the model is probabilistic, it also provides a score of the model “goodness of fit” via the likelihood, enabling easy and legitimate model selection and parameter comparison; this data-driven feature is unique to our model-based approach. Using three datasets obtained with invasive neurophysiological recordings in humans and rodents, we demonstrate that these models are able to replicate previous results obtained with other metrics, but also reveal new insights such as the influence of the amplitude of the slow oscillation. Using simulations, we demonstrate that our parametric method can reveal neural couplings with shorter signals than non-parametric methods. We also show how the likelihood can be used to find optimal filtering parameters, suggesting new properties on the spectrum of the driving signal, but also to estimate the optimal delay between the coupled signals, enabling a directionality estimation in the coupling. PMID:29227989

  4. Multiscale Reconstruction for Magnetic Resonance Fingerprinting

    PubMed Central

    Pierre, Eric Y.; Ma, Dan; Chen, Yong; Badve, Chaitra; Griswold, Mark A.

    2015-01-01

    Purpose: To reduce the acquisition time needed to obtain reliable parametric maps with Magnetic Resonance Fingerprinting. Methods: An iterative-denoising algorithm is initialized by reconstructing the MRF image series at low image resolution. For subsequent iterations, the method enforces pixel-wise fidelity to the best-matching dictionary template, then enforces fidelity to the acquired data at slightly higher spatial resolution. After convergence, parametric maps with the desired spatial resolution are obtained through template matching of the final image series. The proposed method was evaluated on phantom and in-vivo data using the highly-undersampled, variable-density spiral trajectory and compared with the original MRF method. The benefits of additional sparsity constraints were also evaluated. When available, gold-standard parameter maps were used to quantify the performance of each method. Results: The proposed approach allowed convergence to accurate parametric maps with as few as 300 time points of acquisition, as compared to 1000 in the original MRF work. Simultaneous quantification of T1, T2, proton density (PD) and B0 field variations in the brain was achieved in vivo for a 256×256 matrix in a total acquisition time of 10.2 s, representing a 3-fold reduction in acquisition time. Conclusions: The proposed iterative multiscale reconstruction reliably increases MRF acquisition speed and accuracy. PMID:26132462

  5. Towards an Empirically Based Parametric Explosion Spectral Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, S R; Walter, W R; Ruppert, S

    2009-08-31

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never before been tested. The focus of our work is on the local and regional distances (< 2000 km) and phases (Pn, Pg, Sn, Lg) necessary to see small explosions. We are developing a parametric model of the nuclear explosion seismic source spectrum that is compatible with the earthquake-based geometrical spreading and attenuation models developed using the Magnitude Distance Amplitude Correction (MDAC) techniques (Walter and Taylor, 2002). The explosion parametric model will be particularly important in regions without any prior explosion data for calibration. The model is being developed using the available body of seismic data at local and regional distances for past nuclear explosions at foreign and domestic test sites. Parametric modeling is a simple and practical approach for widespread monitoring applications, prior to the capability to carry out fully deterministic modeling. The achievable goal of our parametric model development is to be able to predict observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing. The relationship between the parametric equations and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source.

  6. Subsonic longitudinal and lateral aerodynamic characteristics for a systematic series of strake-wing configurations

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.

    1979-01-01

    A systematic wind tunnel study was conducted in the Langley 7 by 10 foot high speed tunnel to help establish a parametric data base of the longitudinal and lateral aerodynamic characteristics for configurations incorporating strake-wing geometries indicative of current and proposed maneuvering aircraft. The configurations employed combinations of strakes with reflexed planforms having exposed spans of 10%, 20%, and 30% of the reference wing span and wings with trapezoidal planforms having leading edge sweep angles of approximately 30, 40, 44, 50, and 60 deg. Tests were conducted at Mach numbers ranging from 0.3 to 0.8 and at angles of attack from approximately -4 to 48 deg at zero sideslip.

  7. A new approach for measuring power spectra and reconstructing time series in active galactic nuclei

    NASA Astrophysics Data System (ADS)

    Li, Yan-Rong; Wang, Jian-Min

    2018-05-01

    We provide a new approach to measuring power spectra and reconstructing time series in active galactic nuclei (AGNs), based on the fact that the Fourier transform of AGN stochastic variations is a series of complex Gaussian random variables. The approach parametrizes a stochastic series in the frequency domain and transforms it back to the time domain to fit the observed data. The parameters and their uncertainties are derived in a Bayesian framework, which also allows us to compare the relative merits of different power spectral density models. The well-developed fast Fourier transform algorithm together with parallel computation enables an acceptable time complexity for the approach.
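
    The same fact underlies standard spectral simulation (in the spirit of Timmer & Koenig 1995): draw the Fourier coefficients as complex Gaussians with variances set by an assumed power spectral density, then invert back to the time domain. A minimal sketch, with a placeholder power-law PSD:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_from_psd(psd, n, dt):
        # Fourier coefficients as complex Gaussians with variance set by the PSD
        freqs = np.fft.rfftfreq(n, dt)
        amp = np.sqrt(psd(freqs) / 2.0)
        coef = amp * (rng.standard_normal(freqs.size) + 1j * rng.standard_normal(freqs.size))
        coef[0] = 0.0                      # zero-mean series
        return np.fft.irfft(coef, n)

    # placeholder PSD: a single power law S(f) ~ f^-2, flattened at f = 0
    series = simulate_from_psd(lambda f: 1.0 / np.maximum(f, f[1]) ** 2, 1024, 1.0)
    ```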

  8. Radon anomalies: When are they possible to be detected?

    NASA Astrophysics Data System (ADS)

    Passarelli, Luigi; Woith, Heiko; Seyis, Cemil; Nikkhoo, Mehdi; Donner, Reik

    2017-04-01

    Records of the radon noble gas in different environments such as soil, air, groundwater, rock, caves, and tunnels typically display cyclic variations including diurnal (S1), semidiurnal (S2) and seasonal components, although there are also cases where these cycles are absent. Interestingly, radon emission can also be affected by transient processes, which inhibit or enhance the radon carrying process at the surface. This results in transient changes in the radon emission rate that are superimposed on the low- and high-frequency cycles. The complexity of the spectral content of radon time series makes any statistical analysis aimed at understanding the physical driving processes a challenging task. In the past decades there have been several attempts to relate changes in radon emission rate to physical triggering processes such as earthquake occurrence. One of the problems in this type of investigation is to objectively detect anomalies in the radon time series. In the present work, we propose a simple and objective statistical method for detecting changes in the radon emission rate time series. The method uses non-parametric statistical tests (e.g., Kolmogorov-Smirnov) to compare empirical distributions of radon emission rate obtained by sequentially applying windows of various lengths to the time series. The statistical test indicates whether two empirical distributions of data originate from the same distribution at a desired significance level. We test the algorithm on synthetic data in order to explore the sensitivity of the statistical test to the sample size. We then apply the test to six radon emission rate recordings from stations located around the Marmara Sea obtained within the MARsite project (MARsite has received funding from the European Union's Seventh Programme for research, technological development and demonstration under grant agreement No 308417). We conclude that the test performs relatively well in identifying transient changes in the radon emission rate, but the results depend strongly on the length of the time window and/or the type of frequency filtering. More importantly, when the raw time series contain cyclic components (e.g., seasonal or diurnal variations), the search for anomalies related to transients becomes meaningless. We conclude that an objective identification of transient changes can be performed only after filtering the raw time series for the physically meaningful frequency content.
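
    A minimal version of such a window-based detector can be written with SciPy's two-sample Kolmogorov-Smirnov test; as the abstract stresses, the window length is the sensitive choice, and cyclic components should be filtered out beforehand. Data and window length below are synthetic placeholders.

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    def ks_change_scan(x, w=240):
        # compare the empirical distributions of two adjacent windows of length w
        pvals = np.full(len(x), np.nan)
        for t in range(w, len(x) - w):
            pvals[t] = ks_2samp(x[t - w:t], x[t:t + w]).pvalue
        return pvals   # small p-values flag candidate transient changes

    x = np.random.default_rng(2).normal(size=2000)
    x[1200:] += 0.8                      # synthetic step change in emission rate
    candidates = np.where(ks_change_scan(x) < 0.01)[0]
    ```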

  9. Comparison of parametric and bootstrap method in bioequivalence test.

    PubMed

    Ahn, Byung-Jin; Yim, Dong-Seok

    2009-10-01

    The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods, we performed repeated estimation on bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver. 3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from the nonparametric 90% CIs obtained from BE tests on the resampled datasets. Histograms and density curves of formulation effects obtained from the resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80-125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption.
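
    The resampling step can be sketched as follows, assuming paired per-subject values from a crossover design with no period or sequence effects modeled; the data are placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def bootstrap_be_ci(test, ref, n_boot=1000):
        # nonparametric 90% CI for the test/reference geometric mean ratio
        d = np.log(np.asarray(test, float)) - np.log(np.asarray(ref, float))
        boot = [np.exp(d[rng.integers(0, d.size, d.size)].mean()) for _ in range(n_boot)]
        return np.percentile(boot, [5, 95])

    lo, hi = bootstrap_be_ci(test=[812, 745, 990, 678, 860, 905],   # hypothetical AUCs
                             ref=[798, 760, 1012, 655, 842, 930])
    bioequivalent = (0.80 <= lo) and (hi <= 1.25)   # the 80-125% acceptance rule
    ```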

  10. A robust semi-parametric warping estimator of the survivor function with an application to two-group comparisons

    PubMed Central

    Hutson, Alan D

    2018-01-01

    In this note, we develop a novel semi-parametric estimator of the survival curve that is comparable to the product-limit estimator under very relaxed assumptions. The estimator is based on a beta parametrization that warps the empirical distribution of the observed censored and uncensored data. The parameters are obtained using a pseudo-maximum likelihood approach adjusting the survival curve accounting for the censored observations. In the univariate setting, the new estimator tends to better extend the range of the survival estimation given a high degree of censoring. However, the key feature of this paper is that we develop a new two-group semi-parametric exact permutation test for comparing survival curves that is generally superior to the classic log-rank and Wilcoxon tests and provides the best global power across a variety of alternatives. The new test is readily extended to the k-group setting. PMID:26988931
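
    For orientation, a minimal pure-numpy sketch of the product-limit (Kaplan-Meier) estimator that the warped-beta estimator above is benchmarked against; the censored survival data are synthetic.

      import numpy as np

      def kaplan_meier(times, events):
          """Distinct event times and S(t) after each; events: 1 = event, 0 = censored."""
          order = np.argsort(times)
          times, events = times[order], events[order]
          surv, s = [], 1.0
          for t in np.unique(times[events == 1]):
              at_risk = np.sum(times >= t)                  # still under observation
              d = np.sum((times == t) & (events == 1))      # events at time t
              s *= 1.0 - d / at_risk                        # product-limit update
              surv.append((t, s))
          return np.array(surv)

      rng = np.random.default_rng(1)
      t_event = rng.exponential(10.0, 50)
      t_cens = rng.exponential(15.0, 50)                    # censoring times
      obs = np.minimum(t_event, t_cens)
      evt = (t_event <= t_cens).astype(int)
      print(kaplan_meier(obs, evt)[:5])                     # first few (t, S(t)) rows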

  12. Study of parametric instability in gravitational wave detectors with silicon test masses

    NASA Astrophysics Data System (ADS)

    Zhang, Jue; Zhao, Chunnong; Ju, Li; Blair, David

    2017-03-01

    Parametric instability is an intrinsic risk in high power laser interferometer gravitational wave detectors, in which the optical cavity modes interact with the acoustic modes of the mirrors, leading to exponential growth of the acoustic vibration. In this paper, we investigate the potential parametric instability for a proposed next generation gravitational wave detector, the LIGO Voyager blue design, with cooled silicon test masses 45 cm in diameter and 55 cm in thickness. It is shown that there would be about two unstable modes per test mass at an arm cavity power of 3 MW, with a highest parametric gain of ∼76. While this is fewer than the predicted number of unstable modes for Advanced LIGO (∼40 modes with a maximum gain of ∼32 at the designed operating power of 830 kW), the importance of developing suitable instability suppression schemes is emphasized.

  13. Development of in-series piezoelectric bimorph bending beam actuators for active flow control applications

    NASA Astrophysics Data System (ADS)

    Chan, Wilfred K.; Clingman, Dan J.; Amitay, Michael

    2016-04-01

    Piezoelectric materials have long been used for active flow control purposes in aerospace applications to increase the effectiveness of aerodynamic surfaces on aircraft, wind turbines, and more. Piezoelectric actuators are an appropriate choice due to their low mass, small dimensions, simple design, and frequency response. This investigation involves the development of piezoceramic-based actuators with two bimorphs placed in series. Here, the main desired characteristic was the achievable displacement amplitude at specific driving voltages and frequencies. A parametric study was performed, in which actuators with varying dimensions were fabricated and tested. These devices were actuated with a sinusoidal waveform, resulting in an oscillating platform on which to mount active flow control devices, such as dynamic vortex generators. The main quantification method consisted of driving these devices with different voltages and frequencies to determine their free displacement, blocking force, and frequency response. It was found that resonance frequency increased with shorter and thicker actuators, while free displacement increased with longer and thinner actuators. Integration of the devices into active flow control test modules is noted. In addition to physical testing, a quasi-static analytical model was developed and compared with experimental data, which showed close correlation for both free displacement and blocking force.

  14. Modelling of extreme rainfall events in Peninsular Malaysia based on annual maximum and partial duration series

    NASA Astrophysics Data System (ADS)

    Zin, Wan Zawiah Wan; Shinyie, Wendy Ling; Jemain, Abdul Aziz

    2015-02-01

    In this study, two series of data for extreme rainfall events are generated based on the Annual Maximum and Partial Duration methods, derived from 102 rain-gauge stations in Peninsular Malaysia over 1982-2012. To determine the optimal threshold for each station, several requirements must be satisfied, and the adapted Hill estimator is employed for this purpose. A semi-parametric bootstrap is then used to estimate the mean square error (MSE) of the estimator at each threshold, and the optimal threshold is selected based on the smallest MSE. The mean annual frequency is also checked to ensure that it lies in the range of one to five, and the resulting data are de-clustered to ensure independence. The two data series are then fitted to the Generalized Extreme Value and Generalized Pareto distributions for the annual maximum and partial duration series, respectively. The parameter estimation methods used are the Maximum Likelihood and L-moment methods. Two goodness-of-fit tests are then used to evaluate the best-fitted distribution. The results showed that the Partial Duration series with the Generalized Pareto distribution and Maximum Likelihood parameter estimation provides the best representation of extreme rainfall events in Peninsular Malaysia for the majority of the stations studied. Based on these findings, several return values are also derived and spatial maps are constructed to identify the distribution characteristics of extreme rainfall in Peninsular Malaysia.
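
    The distribution-fitting step can be sketched in Python with scipy's maximum-likelihood fitters; the daily rainfall record and the threshold choice below are synthetic placeholders, and note that scipy's GEV shape parameter uses the opposite sign convention to the climatological xi.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      daily = rng.gamma(0.8, 12.0, size=(30, 365))   # 30 years of daily rainfall

      # Annual Maximum series -> GEV by maximum likelihood
      am = daily.max(axis=1)
      c, loc, scale = stats.genextreme.fit(am)
      rv100 = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)  # 100-yr return level

      # Partial Duration series -> GPD on exceedances over a chosen threshold
      u = np.quantile(daily, 0.99)
      exc = daily[daily > u] - u
      k, _, scale_g = stats.genpareto.fit(exc, floc=0.0)

      print(f"GEV 100-yr return level: {rv100:.1f}")
      print(f"GPD shape = {k:.3f}, scale = {scale_g:.2f}, threshold = {u:.2f}")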

  15. New angles on energy correlation functions

    DOE PAGES

    Moult, Ian; Necib, Lina; Thaler, Jesse

    2016-12-29

    Jet substructure observables, designed to identify specific features within jets, play an essential role at the Large Hadron Collider (LHC), both for searching for signals beyond the Standard Model and for testing QCD in extreme phase space regions. In this paper, we systematically study the structure of infrared and collinear safe substructure observables, defining a generalization of the energy correlation functions to probe n-particle correlations within a jet. These generalized correlators provide a flexible basis for constructing new substructure observables optimized for specific purposes. Focusing on three major targets of the jet substructure community — boosted top tagging, boosted W/Z/H tagging, and quark/gluon discrimination — we use power-counting techniques to identify three new series of powerful discriminants: Mi, Ni, and Ui. The Mi series is designed for use on groomed jets, providing a novel example of observables with improved discrimination power after the removal of soft radiation. The Ni series behave parametrically like the N-subjettiness ratio observables, but are defined without respect to subjet axes, exhibiting improved behavior in the unresolved limit. Finally, the Ui series improves quark/gluon discrimination by using higher-point correlators to simultaneously probe multiple emissions within a jet. Taken together, these observables broaden the scope for jet substructure studies at the LHC.

  16. New angles on energy correlation functions

    NASA Astrophysics Data System (ADS)

    Moult, Ian; Necib, Lina; Thaler, Jesse

    2016-12-01

    Jet substructure observables, designed to identify specific features within jets, play an essential role at the Large Hadron Collider (LHC), both for searching for signals beyond the Standard Model and for testing QCD in extreme phase space regions. In this paper, we systematically study the structure of infrared and collinear safe substructure observables, defining a generalization of the energy correlation functions to probe n-particle correlations within a jet. These generalized correlators provide a flexible basis for constructing new substructure observables optimized for specific purposes. Focusing on three major targets of the jet substructure community — boosted top tagging, boosted W/Z/H tagging, and quark/gluon discrimination — we use power-counting techniques to identify three new series of powerful discriminants: Mi, Ni, and Ui. The Mi series is designed for use on groomed jets, providing a novel example of observables with improved discrimination power after the removal of soft radiation. The Ni series behave parametrically like the N-subjettiness ratio observables, but are defined without respect to subjet axes, exhibiting improved behavior in the unresolved limit. Finally, the Ui series improves quark/gluon discrimination by using higher-point correlators to simultaneously probe multiple emissions within a jet. Taken together, these observables broaden the scope for jet substructure studies at the LHC.
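
    As a hedged illustration of the underlying objects (not the authors' code), the sketch below evaluates the standard 2- and 3-point energy correlation functions on a toy jet; the generalized correlators of the paper additionally select the v smallest pairwise angles within each n-tuple.

      import numpy as np
      from itertools import combinations

      def ecf2(z, dR, beta=1.0):
          # sum over pairs of z_i z_j (theta_ij)^beta
          return sum(z[i] * z[j] * dR[i, j] ** beta
                     for i, j in combinations(range(len(z)), 2))

      def ecf3(z, dR, beta=1.0):
          # sum over triplets of z_i z_j z_k (theta_ij theta_ik theta_jk)^beta
          return sum(z[i] * z[j] * z[k] * (dR[i, j] * dR[i, k] * dR[j, k]) ** beta
                     for i, j, k in combinations(range(len(z)), 3))

      rng = np.random.default_rng(3)
      n = 20
      pt = rng.exponential(5.0, n)                 # toy constituent momenta
      eta = rng.normal(0.0, 0.2, n)
      phi = rng.normal(0.0, 0.2, n)
      z = pt / pt.sum()                            # energy fractions
      dR = np.sqrt((eta[:, None] - eta) ** 2 + (phi[:, None] - phi) ** 2)

      e2, e3 = ecf2(z, dR), ecf3(z, dR)
      print(f"e2 = {e2:.4f}, e3 = {e3:.4f}, ratio e3/e2^2 = {e3 / e2**2:.4f}")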

  17. Parametric Methods for Dynamic 11C-Phenytoin PET Studies.

    PubMed

    Mansor, Syahir; Yaqub, Maqsood; Boellaard, Ronald; Froklage, Femke E; de Vries, Anke; Bakker, Esther D M; Voskuyl, Rob A; Eriksson, Jonas; Schwarte, Lothar A; Verbeek, Joost; Windhorst, Albert D; Lammertsma, Adriaan A

    2017-03-01

    In this study, the performance of various methods for generating quantitative parametric images of dynamic 11C-phenytoin PET studies was evaluated. Methods: Double-baseline 60-min dynamic 11C-phenytoin PET studies, including online arterial sampling, were acquired for 6 healthy subjects. Parametric images were generated using Logan plot analysis, a basis function method, and spectral analysis. Parametric distribution volume (VT) and influx rate (K1) were compared with those obtained from nonlinear regression analysis of time-activity curves. In addition, global and regional test-retest (TRT) variability was determined for parametric K1 and VT values. Results: Biases in VT observed with all parametric methods were less than 5%. For K1, spectral analysis showed a negative bias of 16%. The mean TRT variabilities of VT and K1 were less than 10% for all methods. Shortening the scan duration to 45 min provided similar VT and K1 with comparable TRT performance compared with 60-min data. Conclusion: Among the various parametric methods tested, the basis function method provided parametric VT and K1 values with the least bias compared with nonlinear regression data and showed TRT variabilities lower than 5%, also for smaller volume-of-interest sizes (i.e., higher noise levels) and shorter scan durations. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
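
    Of the methods compared, Logan plot analysis is easy to sketch for a single region: plot the integrated tissue curve against the integrated plasma input (each normalized by the tissue activity) and read VT off the late-time slope. The curves below are synthetic one-tissue-compartment data, not the study's measurements.

      import numpy as np

      t = np.linspace(0.1, 60.0, 120)                          # minutes
      cp = 10.0 * np.exp(-0.3 * t) + 0.5 * np.exp(-0.01 * t)   # toy plasma input
      K1, k2 = 0.4, 0.1                                        # true VT = K1/k2 = 4
      dt = t[1] - t[0]
      # tissue curve: convolution of the input with K1 * exp(-k2 * t)
      ct = K1 * np.convolve(cp, np.exp(-k2 * t))[:t.size] * dt

      x = np.cumsum(cp) * dt / ct                              # int Cp dt / Ct
      y = np.cumsum(ct) * dt / ct                              # int Ct dt / Ct
      late = t > 20.0                                          # linear regime only
      vt, intercept = np.polyfit(x[late], y[late], 1)
      print(f"Logan VT estimate: {vt:.2f} (ground truth 4.00)")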

  18. Free response approach in a parametric system

    NASA Astrophysics Data System (ADS)

    Huang, Dishan; Zhang, Yueyue; Shao, Hexi

    2017-07-01

    In this study, a new approach for predicting the free response of a parametric system is investigated. The solution is proposed in the special form of a trigonometric series with an exponentially decaying function of time, based on the concept of frequency splitting. By applying harmonic balance, the parametric vibration equation is transformed into an infinite set of homogeneous linear equations, from which the principal oscillation frequency can be computed and all coefficients of the harmonic components can be obtained. With initial conditions, the arbitrary constants in the general solution can be determined. To analyze computational accuracy and consistency, an approach error function is defined, which is used to assess the computational error of the proposed approach and of a standard numerical approach based on the Runge-Kutta algorithm. Furthermore, an example of a dynamic model of airplane wing flutter on a turbine engine is given to illustrate the applicability of the proposed approach. Numerical solutions show that the proposed approach achieves high accuracy in its mathematical expression, and it is valuable for theoretical research and engineering applications of parametric systems.

  19. Introduction to Permutation and Resampling-Based Hypothesis Tests

    ERIC Educational Resources Information Center

    LaFleur, Bonnie J.; Greevy, Robert A.

    2009-01-01

    A resampling-based method of inference--permutation tests--is often used when distributional assumptions are questionable or unmet. Not only are these methods useful for obvious departures from parametric assumptions (e.g., normality) and small sample sizes, but they are also more robust than their parametric counterparts in the presence of…
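
    A minimal example of the permutation idea described above, assuming a two-sample comparison of means with synthetic skewed data; scipy.stats.permutation_test offers a ready-made equivalent.

      import numpy as np

      rng = np.random.default_rng(4)
      a = rng.lognormal(0.0, 0.5, 15)      # skewed data, t-test assumptions shaky
      b = rng.lognormal(0.3, 0.5, 12)

      obs = a.mean() - b.mean()
      pooled = np.concatenate([a, b])
      n_perm, hits = 10_000, 0
      for _ in range(n_perm):
          perm = rng.permutation(pooled)   # re-randomize the group labels
          diff = perm[:a.size].mean() - perm[a.size:].mean()
          hits += abs(diff) >= abs(obs)    # two-sided comparison
      print(f"permutation p-value: {hits / n_perm:.4f}")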

  20. Nearly noiseless amplification of microwave signals with a Josephson parametric amplifier

    NASA Astrophysics Data System (ADS)

    Castellanos-Beltran, Manuel

    2009-03-01

    A degenerate parametric amplifier transforms an incident coherent state by amplifying one of its quadrature components while deamplifying the other. This transformation, when performed by an ideal parametric amplifier, is completely deterministic and reversible; therefore the amplifier in principle can be noiseless. We attempt to realize a noiseless amplifier of this type at microwave frequencies with a Josephson parametric amplifier (JPA). To this end, we have built a superconducting microwave cavity containing many dc-SQUIDs. This arrangement creates a non-linear medium in a cavity and it is closely analogous to an optical parametric amplifier. In my talk, I will describe the current performance of this circuit, where I show I can amplify signals with less added noise than a quantum-limited amplifier that amplifies both quadratures. In addition, the JPA also squeezes the electromagnetic vacuum fluctuations by 10 dB. Finally, I will discuss our effort to put two such amplifiers in series in order to undo the first stage of squeezing with a second stage of amplification, demonstrating that the amplification process is truly reversible. M. A. Castellanos-Beltran, K. D. Irwin, G. C. Hilton, L. R. Vale and K. W. Lehnert, Nature Physics, published online, http://dx.doi.org/10.1038/nphys1090 (2008).

  1. Force-field parametrization and molecular dynamics simulations of Congo red

    NASA Astrophysics Data System (ADS)

    Król, Marcin; Borowski, Tomasz; Roterman, Irena; Piekarska, Barbara; Stopa, Barbara; Rybarska, Joanna; Konieczny, Leszek

    2004-01-01

    Congo red, a diazo dye widely used in medical diagnosis, is known to form supramolecular systems in solution. Such a supramolecular system may interact with various proteins. In order to examine the nature of such complexes, empirical force field parameters for the Congo red molecule were developed. The parametrization of bonding terms closely followed the methodology used in the development of the charmm22 force field, except for the calculation of charges. Point charges were calculated from a fit to a quantum mechanically derived electrostatic potential using the CHELP-BOW method. The obtained parameters were tested in a series of molecular dynamics simulations of both a single molecule and a micelle composed of Congo red molecules. It is shown that the newly developed parameters define a stable minimum on the hypersurface of the potential energy, and that crystal and ab initio geometries and rotational barriers are well reproduced. Furthermore, rotations around C-N bonds are similar to torsional vibrations observed in crystals of diphenyl-diazene, which confirms that the flexibility of the molecule is correct. Comparison of results obtained from micelle molecular dynamics simulations with experimental data shows that the thermal dependence of micelle creation is well reproduced.

  2. Biostatistics Series Module 3: Comparing Groups: Numerical Variables.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Numerical data that are normally distributed can be analyzed with parametric tests, that is, tests which are based on the parameters that define a normal distribution curve. If the distribution is uncertain, the data can be plotted as a normal probability plot and visually inspected, or tested for normality using one of a number of goodness of fit tests, such as the Kolmogorov-Smirnov test. The widely used Student's t-test has three variants. The one-sample t-test is used to assess if a sample mean (as an estimate of the population mean) differs significantly from a given population mean. The means of two independent samples may be compared for a statistically significant difference by the unpaired or independent samples t-test. If the data sets are related in some way, their means may be compared by the paired or dependent samples t-test. The t-test should not be used to compare the means of more than two groups. Although it is possible to compare groups in pairs, when there are more than two groups, this will increase the probability of a Type I error. The one-way analysis of variance (ANOVA) is employed to compare the means of three or more independent data sets that are normally distributed. Multiple measurements from the same set of subjects cannot be treated as separate, unrelated data sets. Comparison of means in such a situation requires repeated measures ANOVA. It is to be noted that while a multiple group comparison test such as ANOVA can point to a significant difference, it does not identify exactly between which two groups the difference lies. To do this, multiple group comparison needs to be followed up by an appropriate post hoc test. An example is Tukey's honestly significant difference test following ANOVA. If the assumptions for parametric tests are not met, there are nonparametric alternatives for comparing data sets. These include the Mann-Whitney U-test as the nonparametric counterpart of the unpaired Student's t-test, the Wilcoxon signed-rank test as the counterpart of the paired Student's t-test, the Kruskal-Wallis test as the nonparametric equivalent of ANOVA and the Friedman test as the counterpart of repeated measures ANOVA.
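
    The parametric tests listed in this module and their nonparametric counterparts are all available in scipy.stats; a compact sketch with synthetic data (group sizes and effect sizes below are arbitrary):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      g1, g2, g3 = (rng.normal(m, 1.0, 30) for m in (0.0, 0.4, 0.8))

      print(stats.shapiro(g1))              # normality check before parametric tests
      print(stats.ttest_ind(g1, g2))        # unpaired Student's t-test
      print(stats.mannwhitneyu(g1, g2))     # its nonparametric counterpart
      print(stats.ttest_rel(g1, g2))        # paired t-test (related samples)
      print(stats.wilcoxon(g1, g2))         # Wilcoxon signed-rank counterpart
      print(stats.f_oneway(g1, g2, g3))     # one-way ANOVA, three groups
      print(stats.kruskal(g1, g2, g3))      # Kruskal-Wallis counterpart
      print(stats.friedmanchisquare(g1, g2, g3))  # repeated-measures counterpart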

  3. A Cartesian parametrization for the numerical analysis of material instability

    DOE PAGES

    Mota, Alejandro; Chen, Qiushi; Foulk, III, James W.; ...

    2016-02-25

    We examine four parametrizations of the unit sphere in the context of material stability analysis by means of the singularity of the acoustic tensor. We then propose a Cartesian parametrization for vectors that lie in a cube of side length two and use these vectors in lieu of unit normals to test for the loss of the ellipticity condition. This parametrization is then used to construct a tensor akin to the acoustic tensor. It is shown that both of these tensors become singular at the same time and in the same planes in the presence of a material instability. Furthermore, the performance of the Cartesian parametrization is compared against the other parametrizations, with the results of these comparisons showing that, in general, the Cartesian parametrization is more robust and more numerically efficient than the others.
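
    A hedged sketch of the ellipticity check the paper builds on: form the acoustic tensor Q_ik = C_ijkl n_j n_l and scan directions drawn, in the spirit of the proposed parametrization, from a Cartesian grid on a cube of side two rather than from a spherical parametrization. An isotropic stiffness tensor is used here for brevity; the paper's tensor construction differs in detail.

      import numpy as np

      lam, mu = 1.0, 1.0
      I = np.eye(3)
      # isotropic stiffness C_ijkl = lam d_ij d_kl + mu (d_ik d_jl + d_il d_jk)
      C = (lam * np.einsum('ij,kl->ijkl', I, I)
           + mu * (np.einsum('ik,jl->ijkl', I, I) + np.einsum('il,jk->ijkl', I, I)))

      def min_det_acoustic(C, m=15):
          g = np.linspace(-1.0, 1.0, m)
          best = np.inf
          for nx in g:
              for ny in g:
                  for nz in g:
                      n = np.array([nx, ny, nz])
                      if not n.any():
                          continue                  # skip the zero vector
                      Q = np.einsum('ijkl,j,l->ik', C, n, n)
                      # det Q scales as |n|^6; normalize so cube points
                      # behave like unit normals
                      best = min(best, np.linalg.det(Q) / np.dot(n, n) ** 3)
          return best

      print(f"min det Q over directions: {min_det_acoustic(C):.4f} (>0: elliptic)")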

  4. A Cartesian parametrization for the numerical analysis of material instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mota, Alejandro; Chen, Qiushi; Foulk, III, James W.

    We examine four parametrizations of the unit sphere in the context of material stability analysis by means of the singularity of the acoustic tensor. We then propose a Cartesian parametrization for vectors that lie in a cube of side length two and use these vectors in lieu of unit normals to test for the loss of the ellipticity condition. This parametrization is then used to construct a tensor akin to the acoustic tensor. It is shown that both of these tensors become singular at the same time and in the same planes in the presence of a material instability. Furthermore, the performance of the Cartesian parametrization is compared against the other parametrizations, with the results of these comparisons showing that, in general, the Cartesian parametrization is more robust and more numerically efficient than the others.

  5. Determining the multi-scale hedge ratios of stock index futures using the lower partial moments method

    NASA Astrophysics Data System (ADS)

    Dai, Jun; Zhou, Haigang; Zhao, Shaoquan

    2017-01-01

    This paper considers a multi-scale futures hedge strategy that minimizes lower partial moments (LPM). To do this, wavelet analysis is adopted to decompose time series data into different components. Next, different parametric estimation methods with known distributions are applied to calculate the LPM of hedged portfolios, which is the key to determining multi-scale hedge ratios over different time scales. These parametric methods are then compared with the prevailing nonparametric kernel metric method. Empirical results indicate that in the China Securities Index 300 (CSI 300) index futures and spot markets, hedge ratios and hedge efficiency estimated by the nonparametric kernel metric method are inferior to those estimated by parametric hedging models based on the features of the sequence distributions. In addition, if minimum LPM is selected as the hedge target, the hedging periods, degree of risk aversion, and target returns each affect the multi-scale hedge ratios and hedge efficiency.
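
    The minimum-LPM step can be sketched as a one-dimensional search over the hedge ratio; the returns below are synthetic, and the wavelet decomposition into time scales used in the paper is omitted.

      import numpy as np

      def lpm(returns, tau=0.0, n=2):
          # lower partial moment: mean n-th power of shortfall below target tau
          return np.mean(np.maximum(tau - returns, 0.0) ** n)

      rng = np.random.default_rng(6)
      f = rng.normal(0.0, 0.012, 1000)              # futures returns (synthetic)
      s = 0.9 * f + rng.normal(0.0, 0.005, 1000)    # correlated spot returns

      grid = np.linspace(0.0, 1.5, 301)
      risks = [lpm(s - h * f) for h in grid]        # hedged return: spot - h * futures
      h_star = grid[int(np.argmin(risks))]
      print(f"minimum-LPM hedge ratio: {h_star:.3f}")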

  6. SEC sensor parametric test and evaluation system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    This system provides the automated hardware required to carry out, in conjunction with the existing 70 mm SEC television camera, the sensor evaluation tests, which are described in detail. The Parametric Test Set (PTS) was completed and is used in a semiautomatic data acquisition and control mode to test the development of the 70 mm SEC sensor, WX 32193. Analysis of the raw data is performed on the Princeton IBM 360-91 computer.

  7. Non-invasive breast biopsy method using GD-DTPA contrast enhanced MRI series and F-18-FDG PET/CT dynamic image series

    NASA Astrophysics Data System (ADS)

    Magri, Alphonso William

    This study was undertaken to develop a nonsurgical breast biopsy from Gd-DTPA Contrast Enhanced Magnetic Resonance (CE-MR) images and F-18-FDG PET/CT dynamic image series. A five-step process was developed to accomplish this. (1) Dynamic PET series were nonrigidly registered to the initial frame using a finite element method (FEM) based registration that requires fiducial skin markers to sample the displacement field between image frames. A commercial FEM package (ANSYS) was used for meshing and FEM calculations. Dynamic PET image series registrations were evaluated using the similarity measurements SAVD and NCC. (2) Dynamic CE-MR series were nonrigidly registered to the initial frame using two registration methods: a multi-resolution free-form deformation (FFD) registration driven by normalized mutual information, and a FEM-based registration method. Dynamic CE-MR image series registrations were evaluated using similarity measurements, localization measurements, and qualitative comparison of motion artifacts. FFD registration was found to be superior to FEM-based registration. (3) Nonlinear curve fitting was performed for each voxel of the PET/CT volume of activity versus time, based on a realistic two-compartment Patlak model. Three parameters for this model were fitted; two of them describe the activity levels in the blood and in the cellular compartment, while the third characterizes the washout rate of F-18-FDG from the cellular compartment. (4) Nonlinear curve fitting was performed for each voxel of the MR volume of signal intensity versus time, based on a realistic two-compartment Brix model. Three parameters for this model were fitted: the rate of Gd exiting the compartment representing the extracellular space of a lesion; the rate of Gd exiting a blood compartment; and a parameter that characterizes the strength of signal intensities. Curve fitting for the PET/CT and MR series was accomplished by application of the Levenberg-Marquardt nonlinear regression algorithm. The best-fit parameters were used to create 3D parametric images. Compartmental modeling evaluation was based on the ability of parameter values to differentiate between tissue types. This evaluation was used on registered and unregistered image series and found that registration improved results. (5) PET and MR parametric images were registered through FEM- and FFD-based registration. Parametric image registration was evaluated using similarity measurements, target registration error, and qualitative comparison. Comparing FFD and FEM-based registration results showed that the FEM method is superior. This five-step process constitutes a novel multifaceted approach to a nonsurgical breast biopsy that successfully executes each step. Comparison of this method to biopsy still needs to be done with a larger set of subject data.

  8. Phase transition in the parametric natural visibility graph.

    PubMed

    Snarskii, A A; Bezsudnov, I V

    2016-10-01

    We investigate time series by mapping them to complex networks using a parametric natural visibility graph (PNVG) algorithm that generates graphs depending on an arbitrary continuous parameter, the angle of view. We study the behavior of the relative number of clusters in the PNVG near the critical value of the angle of view. Artificial and experimental time series of different nature are used for numerical PNVG investigations to find critical exponents above and below the critical point, as well as the exponent in the finite-size scaling regime. Altogether, they allow us to find the critical exponent of the correlation length for the PNVG. The set of calculated critical exponents satisfies the basic Widom relation. The PNVG is found to demonstrate scaling behavior. Our results reveal the similarity between the behavior of the relative number of clusters in the PNVG and the order parameter in second-order phase transition theory. We show that the PNVG is another example of a system (in addition to magnetic, percolation, superconductivity, etc.) with an observed second-order phase transition.
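
    For intuition, a sketch of the natural visibility graph with a crude angle-of-view filter standing in for the parametric rule (the exact PNVG criterion should be taken from the original paper); the filter simply discards edges whose elevation angle exceeds the chosen angle of view.

      import numpy as np
      from itertools import combinations

      def visibility_edges(y, alpha=np.pi / 2):
          t = np.arange(len(y), dtype=float)
          edges = []
          for a, b in combinations(range(len(y)), 2):
              # natural visibility: no intermediate point blocks the line of sight
              blocked = any(
                  y[c] >= y[b] + (y[a] - y[b]) * (t[b] - t[c]) / (t[b] - t[a])
                  for c in range(a + 1, b))
              if blocked:
                  continue
              # stand-in parametric filter: keep the edge only while its
              # elevation angle stays below the angle of view alpha
              if np.arctan2(t[b] - t[a], y[a] - y[b]) <= alpha:
                  edges.append((a, b))
          return edges

      rng = np.random.default_rng(7)
      series = rng.random(60)
      for a in (np.pi / 3, np.pi / 2, 2 * np.pi / 3):
          print(f"alpha = {a:.2f}: {len(visibility_edges(series, a))} edges")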

  9. Multiscale reconstruction for MR fingerprinting.

    PubMed

    Pierre, Eric Y; Ma, Dan; Chen, Yong; Badve, Chaitra; Griswold, Mark A

    2016-06-01

    To reduce the acquisition time needed to obtain reliable parametric maps with Magnetic Resonance Fingerprinting. An iterative-denoising algorithm is initialized by reconstructing the MRF image series at low image resolution. For subsequent iterations, the method enforces pixel-wise fidelity to the best-matching dictionary template then enforces fidelity to the acquired data at slightly higher spatial resolution. After convergence, parametric maps with desirable spatial resolution are obtained through template matching of the final image series. The proposed method was evaluated on phantom and in vivo data using the highly undersampled, variable-density spiral trajectory and compared with the original MRF method. The benefits of additional sparsity constraints were also evaluated. When available, gold standard parameter maps were used to quantify the performance of each method. The proposed approach allowed convergence to accurate parametric maps with as few as 300 time points of acquisition, as compared to 1000 in the original MRF work. Simultaneous quantification of T1, T2, proton density (PD), and B0 field variations in the brain was achieved in vivo for a 256 × 256 matrix for a total acquisition time of 10.2 s, representing a three-fold reduction in acquisition time. The proposed iterative multiscale reconstruction reliably increases MRF acquisition speed and accuracy. Magn Reson Med 75:2481-2492, 2016. © 2015 Wiley Periodicals, Inc.

  10. Control of thermal balance by a liquid circulating garment based on a mathematical representation of the human thermoregulatory system. Ph.D. Thesis - California Univ., Berkeley

    NASA Technical Reports Server (NTRS)

    Kuznetz, L. H.

    1976-01-01

    Test data and a mathematical model of the human thermoregulatory system were used to investigate control of thermal balance by means of a liquid circulating garment (LCG). The test data were derived from five series of experiments in which environmental and metabolic conditions were varied parametrically as a function of several independent variables, including LCG flowrate, LCG inlet temperature, net environmental heat exchange, surrounding gas ventilation rate, ambient pressure, metabolic rate, and subjective/obligatory cooling control. The resultant data were used to relate skin temperature to LCG water temperature and flowrate, to assess a thermal comfort band, to demonstrate the relationship between metabolic rate and LCG heat dissipation, and so forth. The usefulness of the mathematical model as a tool for data interpretation and for generation of trends and relationships among the various physiological parameters was also investigated and verified.

  11. kruX: matrix-based non-parametric eQTL discovery.

    PubMed

    Qi, Jianlong; Asl, Hassan Foroughi; Björkegren, Johan; Michoel, Tom

    2014-01-14

    The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several millions of marker-trait combinations at once. kruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com.
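
    The core trick can be sketched in a few lines of numpy: rank each trait once, then obtain group rank sums for every marker-trait pair with a single matrix product per genotype group. The sketch assumes three genotype groups, ignores ties (the published tool applies a tie correction), and uses synthetic data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      N, n_traits, n_markers = 100, 50, 200
      G = rng.integers(0, 3, size=(N, n_markers))    # genotypes coded 0/1/2
      Y = rng.normal(size=(N, n_traits))             # expression traits

      R = stats.rankdata(Y, axis=0)                  # rank each trait once
      H = np.zeros((n_markers, n_traits))
      for g in range(3):
          Ig = (G == g).astype(float)                # N x markers indicator matrix
          n_g = Ig.sum(axis=0)[:, None]              # group sizes per marker
          S = Ig.T @ R                               # rank sums for all pairs at once
          with np.errstate(divide='ignore', invalid='ignore'):
              H += np.where(n_g > 0, S ** 2 / n_g, 0.0)
      H = 12.0 / (N * (N + 1)) * H - 3.0 * (N + 1)   # Kruskal-Wallis statistic
      pvals = stats.chi2.sf(H, df=2)                 # df = number of groups - 1
      print(pvals.shape, pvals.min())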

  12. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    PubMed

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has a small sample size limitation. We used a pooled method in the nonparametric bootstrap test that may overcome the problem related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method to the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability for all conditions except the Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating the one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
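
    One plausible reading of the pooled-resampling scheme (details may differ from the authors' algorithm): resample both groups with replacement from the pooled data, which respects the null hypothesis, and compare the observed Welch t statistic with its bootstrap distribution.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(9)
      a = rng.lognormal(0.0, 0.8, 8)        # small, skewed samples
      b = rng.lognormal(0.6, 0.8, 7)

      t_obs = stats.ttest_ind(a, b, equal_var=False).statistic
      pooled = np.concatenate([a, b])       # pooling enforces the null
      B, hits = 10_000, 0
      for _ in range(B):
          xa = rng.choice(pooled, size=a.size, replace=True)
          xb = rng.choice(pooled, size=b.size, replace=True)
          t_b = stats.ttest_ind(xa, xb, equal_var=False).statistic
          hits += abs(t_b) >= abs(t_obs)
      print(f"pooled bootstrap p-value: {hits / B:.4f}")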

  13. Inference on periodicity of circadian time series.

    PubMed

    Costa, Maria J; Finkenstädt, Bärbel; Roche, Véronique; Lévi, Francis; Gould, Peter D; Foreman, Julia; Halliday, Karen; Hall, Anthony; Rand, David A

    2013-09-01

    Estimation of the period length of time-course data from cyclical biological processes, such as those driven by the circadian pacemaker, is crucial for inferring the properties of the biological clock found in many living organisms. We propose a methodology for period estimation based on spectrum resampling (SR) techniques. Simulation studies show that SR is superior to a currently used routine based on Fourier approximations and is more robust to non-sinusoidal and noisy cycles. In addition, a simple fit to the oscillations using linear least squares is available, together with a non-parametric test for detecting changes in period length which allows for period estimates with different variances, as frequently encountered in practice. The proposed methods are motivated by and applied to various data examples from chronobiology.

  14. Main rotor six degree-of-freedom isolation system analysis

    NASA Technical Reports Server (NTRS)

    Eastman, L. B.

    1981-01-01

    The design requirements of the system have been defined and an isolator concept that satisfies these requirements has been identified. The primary design objectives for the isolation system are 90% attenuation of all NP main rotor shaft loads at a weight penalty less than or equal to 1% of design gross weight. The configuration is sized for a UH-60A BLACK HAWK helicopter, and its performance, risk, and system integration were evaluated through a series of parametric studies. Preliminary design was carried forward to ensure that the design is practical and that the details of integrating the isolator into the helicopter system are considered. Alternate ground and flight test demonstration programs necessary to verify the proposed isolator design are defined.

  15. Structural damping results from vibration tests of straight piping sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ware, A.G.; Thinnes, G.L.

    EG&G Idaho is assisting the USNRC and the Pressure Vessel Research Committee in supporting a final position on revised damping values for structural analyses of nuclear piping systems. As part of this program, a series of vibrational tests on 76-mm and 203-mm (3-in. and 8-in.) Schedule 40 carbon steel piping was conducted to determine the changes in structural damping due to various parametric effects. The 10-m (33-ft) straight sections of piping were rigidly supported at the ends. Spring, rod, and constant-force hangers, as well as a sway brace and snubbers, were included as intermediate supports. Excitation was provided by low-force-level hammer impacts, a hydraulic shaker, and a 445-kN (50-ton) overhead crane. Data were recorded using acceleration, strain, and displacement time histories. This paper presents results from the testing showing the effect of stress level and type of supports on structural damping in piping.

  16. Subseasonal climate variability for North Carolina, United States

    NASA Astrophysics Data System (ADS)

    Sayemuzzaman, Mohammad; Jha, Manoj K.; Mekonnen, Ademe; Schimmel, Keith A.

    2014-08-01

    Subseasonal trends in climate variability for maximum temperature (Tmax), minimum temperature (Tmin) and precipitation were evaluated for 249 ground-based stations in North Carolina for 1950-2009. The magnitude and significance of the trends at all stations were determined using the non-parametric Theil-Sen approach (TSA) and the Mann-Kendall (MK) test, respectively. The Sequential Mann-Kendall (SQMK) test was also applied to find the initiation of abrupt trend changes. The lag-1 serial correlation and double mass curve were employed to address data independence and homogeneity. Using the MK trend test, statistically significant (confidence level ≥ 95% in a two-tailed test) decreasing (increasing) trends were found at 44% (45%) of stations in May (June). In general, trends decreased in the Tmax series and increased in the Tmin series at the subseasonal scale. Using the TSA method, the steepest decreasing (increasing) trend in Tmax is -0.050 °C/year (+0.052 °C/year) in the monthly series for May (March), and for Tmin it is -0.055 °C/year (+0.075 °C/year) in February (December). For the precipitation time series, the TSA method found that the largest increasing (decreasing) trend magnitude of 1.00 mm/year (-1.20 mm/year) is in September (February). The overall trends in the precipitation data series were not significant at the 95% confidence level, except that 17% of stations were found to have significant (confidence level ≥ 95% in a two-tailed test) decreasing trends in February. The statistically significant trend test results were used to develop a spatial distribution of trends: May for Tmax, June for Tmin, and February for precipitation. A correlative analysis of the significant temperature and precipitation trend results was examined with respect to large-scale circulation modes (the North Atlantic Oscillation (NAO) and the Southern Oscillation Index (SOI)). A negative NAO index (positive El Niño Southern Oscillation (ENSO) index) was found to be associated with decreasing precipitation in February during 1960-1980 (2000-2009). The increasing trend in Tmin at the inter-seasonal (April-October) time scale can be associated with the positive NAO index during 1970-2000.
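
    The MK/TSA pair used above maps directly onto scipy: the MK test is equivalent to testing Kendall's tau between the series and time, and stats.theilslopes gives the Theil-Sen slope with a confidence band. The series below is synthetic, and the pre-whitening step for serial correlation is omitted.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)
      years = np.arange(1950, 2010)
      tmax = 0.02 * (years - 1950) + rng.normal(0.0, 0.5, years.size)  # synthetic

      tau, p = stats.kendalltau(years, tmax)   # MK test via Kendall's tau vs time
      slope, intercept, lo, hi = stats.theilslopes(tmax, years, alpha=0.95)
      print(f"Kendall tau = {tau:.3f}, p = {p:.4f}")
      print(f"Theil-Sen slope = {slope:.4f} degC/yr (95% CI {lo:.4f} to {hi:.4f})")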

  17. New reconstruction of the sunspot group numbers since 1739 using direct calibration and "backbone" methods

    NASA Astrophysics Data System (ADS)

    Chatzistergos, Theodosios; Usoskin, Ilya G.; Kovaltsov, Gennady A.; Krivova, Natalie A.; Solanki, Sami K.

    2017-06-01

    Context. The group sunspot number (GSN) series constitute the longest instrumental astronomical database providing information on solar activity. This database is a compilation of observations by many individual observers, and their inter-calibration has usually been performed using linear rescaling. There are multiple published series that show different long-term trends for solar activity. Aims: We aim at producing a GSN series, with a non-linear non-parametric calibration. The only underlying assumptions are that the differences between the various series are due to different acuity thresholds of the observers, and that the threshold of each observer remains constant throughout the observing period. Methods: We used a daisy chain process with backbone (BB) observers and calibrated all overlapping observers to them. We performed the calibration of each individual observer with a probability distribution function (PDF) matrix constructed considering all daily values for the overlapping period with the BB. The calibration of the BBs was carried out in a similar manner. The final series was constructed by merging different BB series. We modelled the propagation of errors straightforwardly with Monte Carlo simulations. A potential bias due to the selection of BBs was investigated and the effect was shown to lie within the 1σ interval of the produced series. The exact selection of the reference period was shown to have a rather small effect on our calibration as well. Results: The final series extends back to 1739 and includes data from 314 observers. This series suggests moderate activity during the 18th and 19th century, which is significantly lower than the high level of solar activity predicted by other recent reconstructions applying linear regressions. Conclusions: The new series provides a robust reconstruction, based on modern and non-parametric methods, of sunspot group numbers since 1739, and it confirms the existence of the modern grand maximum of solar activity in the second half of the 20th century. Values of the group sunspot number series are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/602/A69

  18. How to Compare Parametric and Nonparametric Person-Fit Statistics Using Real Data

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2017-01-01

    Person-fit assessment (PFA) is concerned with uncovering atypical test performance as reflected in the pattern of scores on individual items on a test. Existing person-fit statistics (PFSs) include both parametric and nonparametric statistics. Comparison of PFSs has been a popular research topic in PFA, but almost all comparisons have employed…

  19. Coaxial Dump Ramjet Combustor Combustion Instabilities. Part I. Parametric Test Data.

    DTIC Science & Technology

    1981-07-01

    AD-A111 355. AFWAL-TR-81-2047, Part 1: Coaxial Dump Ramjet Combustor Combustion Instabilities, Part I - Parametric Test Data. Air Force Wright Aeronautical Laboratories, Wright-Patterson AFB. Interim report for the period February 1979 - March 1980.

  20. Comparison of four approaches to a rock facies classification problem

    USGS Publications Warehouse

    Dubois, M.K.; Bohling, Geoffrey C.; Chakrabarti, S.

    2007-01-01

    In this study, seven classifiers based on four different approaches were tested in a rock facies classification problem: classical parametric methods using Bayes' rule, and non-parametric methods using fuzzy logic, k-nearest neighbor, and feed forward-back propagating artificial neural network. Determining the most effective classifier for geologic facies prediction in wells without cores in the Panoma gas field, in Southwest Kansas, was the objective. Study data include 3600 samples with known rock facies class (from core) with each sample having either four or five measured properties (wire-line log curves), and two derived geologic properties (geologic constraining variables). The sample set was divided into two subsets, one for training and one for testing the ability of the trained classifier to correctly assign classes. Artificial neural networks clearly outperformed all other classifiers and are effective tools for this particular classification problem. Classical parametric models were inadequate due to the nature of the predictor variables (high dimensional and not linearly correlated), and feature space of the classes (overlapping). The other non-parametric methods tested, k-nearest neighbor and fuzzy logic, would need considerable improvement to match the neural network effectiveness, but further work, possibly combining certain aspects of the three non-parametric methods, may be justified. © 2006 Elsevier Ltd. All rights reserved.

  1. DFTB Parameters for the Periodic Table, Part 2: Energies and Energy Gradients from Hydrogen to Calcium.

    PubMed

    Oliveira, Augusto F; Philipsen, Pier; Heine, Thomas

    2015-11-10

    In the first part of this series, we presented a parametrization strategy to obtain high-quality electronic band structures on the basis of density-functional-based tight-binding (DFTB) calculations and published a parameter set called QUASINANO2013.1. Here, we extend our parametrization effort to include the remaining terms that are needed to compute the total energy and its gradient, commonly referred to as repulsive potential. Instead of parametrizing these terms as a two-body potential, we calculate them explicitly from the DFTB analogues of the Kohn-Sham total energy expression. This strategy requires only two further numerical parameters per element. Thus, the atomic configuration and four real numbers per element are sufficient to define the DFTB model at this level of parametrization. The QUASINANO2015 parameter set allows the calculation of energy, structure, and electronic structure of all systems composed of elements ranging from H to Ca. Extensive benchmarks show that the overall accuracy of QUASINANO2015 is comparable to that of well-established methods, including PM7 and hand-tuned DFTB parameter sets, while coverage of a much larger range of chemical systems is available.

  2. Parametric study of a passive solar-heated house with special attention on evaluating occupant thermal comfort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, A.F.; Heerwage, D.R.; Kippehan, C.J.

    A parametric study has been conducted of passive heating devices that are to be used to provide environmental conditioning for a single-family house. This study has been performed using the thermal simulation computer program UWENSOL. Climatic data used in this analysis were for Yokohama, Japan, which has a subtropical humid climate similar to Washington, D.C. (in terms of winter air temperatures and useful radiation). Initial studies considered the use of different wall thicknesses, glazing types, and orientations for a Trombe wall and alternate storage quantities for a walk-in greenhouse. Employing a number of comparative parametric studies, an economical and efficient combination of devices was selected. Then, using a computer routine COMFORT, which is based on the Fanger Comfort Equation, another series of parametric analyses was performed to evaluate the degree of thermal comfort for the occupants of the house. The results of these analyses demonstrated that an averaged Predicted Mean Vote within 0.3 of a thermally neutral condition could be maintained and that less than 10% of all occupants of such a passively heated house would be thermally uncomfortable.

  3. A Bayesian goodness of fit test and semiparametric generalization of logistic regression with measurement data.

    PubMed

    Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E

    2013-06-01

    Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. © 2013, The International Biometric Society.

  4. Location tests for biomarker studies: a comparison using simulations for the two-sample case.

    PubMed

    Scheinhardt, M O; Ziegler, A

    2013-01-01

    Gene, protein, or metabolite expression levels are often non-normally distributed, heavy tailed and contain outliers. Standard statistical approaches may fail as location tests in this situation. In three Monte-Carlo simulation studies, we aimed at comparing the type I error levels and empirical power of standard location tests and three adaptive tests [O'Gorman, Can J Stat 1997; 25: 269-279; Keselman et al., Brit J Math Stat Psychol 2007; 60: 267-293; Szymczak et al., Stat Med 2013; 32: 524-537] for a wide range of distributions. We simulated two-sample scenarios using the g-and-k distribution family to systematically vary tail length and skewness with identical and varying variability between groups. All tests kept the type I error level when groups did not vary in their variability. The standard non-parametric U-test performed well in all simulated scenarios. It was outperformed by the two non-parametric adaptive methods in case of heavy tails or large skewness. Most tests did not keep the type I error level for skewed data in the case of heterogeneous variances. The standard U-test was a powerful and robust location test for most of the simulated scenarios except for very heavy tailed or heavy skewed data, and it is thus to be recommended except for these cases. The non-parametric adaptive tests were powerful for both normal and non-normal distributions under sample variance homogeneity. But when sample variances differed, they did not keep the type I error level. The parametric adaptive test lacks power for skewed and heavy tailed distributions.

  5. Dietary standards for school catering in France: serving moderate quantities to improve dietary quality without increasing the food-related cost of meals.

    PubMed

    Vieux, Florent; Dubois, Christophe; Allegre, Laëtitia; Mandon, Lionel; Ciantar, Laurent; Darmon, Nicole

    2013-01-01

    To assess the impact on food-related cost of meals to fulfill the new compulsory dietary standards for primary schools in France. A descriptive study assessed the relationship between the level of compliance with the standards of observed school meals and their food-related cost. An analytical study assessed the cost of series of meals published in professional journals, and complying or not with new dietary standards. The costs were based on prices actually paid for food used to prepare school meals. Food-related cost of meals. Parametric and nonparametric tests from a total of 42 and 120 series of 20 meals in the analytical and descriptive studies, respectively. The descriptive study indicated that meeting the standards was not related to cost. The analytical study showed that fulfilling the frequency guidelines increased the cost, whereas fulfilling the portion sizes criteria decreased it. Series of meals fully respecting the standards (ie, frequency and portion sizes) cost significantly less (-0.10 €/meal) than series not fulfilling them, because the standards recommend smaller portion sizes. Introducing portion sizes rules in dietary standards for school catering may help increase dietary quality without increasing the food cost of meals. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  6. Comparison of Parametric and Nonparametric Bootstrap Methods for Estimating Random Error in Equipercentile Equating

    ERIC Educational Resources Information Center

    Cui, Zhongmin; Kolen, Michael J.

    2008-01-01

    This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…

  7. Choice Inconsistencies among the Elderly: Evidence from Plan Choice in the Medicare Part D Program: Reply

    PubMed Central

    ABALUCK, JASON

    2017-01-01

    We explore the in- and out-of-sample robustness of tests for choice inconsistencies based on parameter restrictions in parametric models, focusing on tests proposed by Ketcham, Kuminoff and Powers (KKP). We argue that their non-parametric alternatives are inherently conservative with respect to detecting mistakes. We then show that our parametric model is robust to KKP's suggested specification checks, and that comprehensive goodness-of-fit measures perform better with our model than with the expected utility model. Finally, we explore the robustness of our 2011 results to alternative normative assumptions, highlighting the role of brand fixed effects and unobservable characteristics. PMID:29170561

  8. Single-arm phase II trial design under parametric cure models.

    PubMed

    Wu, Jianrong

    2015-01-01

    The current practice of designing single-arm phase II survival trials is limited under the exponential model. Trial design under the exponential model may not be appropriate when a portion of patients are cured. There is no literature available for designing single-arm phase II trials under the parametric cure model. In this paper, a test statistic is proposed, and a sample size formula is derived for designing single-arm phase II trials under a class of parametric cure models. Extensive simulations showed that the proposed test and sample size formula perform very well under different scenarios. Copyright © 2015 John Wiley & Sons, Ltd.

  9. Detonation-flame arrester devices for gasoline cargo vapor recovery systems

    NASA Technical Reports Server (NTRS)

    Bjorklund, R. A.; Ryason, P. R.

    1980-01-01

    Empirical data on the deflagration-to-detonation run-up distance for flowing mixtures of gasoline and air in 15.2-cm- (6.0-in.-) diameter piping simulating a vapor recovery system are presented. The quenching capability of eight selected flame control devices subjected to repeated stable detonations was evaluated. The successful detonation-flame arresters were: (1) spiral-wound, crimped aluminum ribbon, (2) foamed nickel-chrome metal, (3) vertically packed bed of aluminum Ballast rings, and (4) water-trap or hydraulic back-pressure valve. Installation configurations for two of the more applicable arresters, the spiral-wound, crimped stainless-steel ribbon and the vertically packed bed of aluminum Ballast rings, were further optimized by a series of parametric tests. The final configuration of these two arresters was demonstrated with repeated detonation tests at conditions that simulated vapor recovery system operation. On these tests, the combustible mixture of gasoline and air continued to flow through the piping for periods up to 120 seconds after the initial detonation had been arrested. There was no indication of continuous burning or reignition occurring on either side of the test arresters.

  10. Effect of non-normality on test statistics for one-way independent groups designs.

    PubMed

    Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R

    2012-02-01

    The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.

  11. kruX: matrix-based non-parametric eQTL discovery

    PubMed Central

    2014-01-01

    Background The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. Results We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several million marker-trait combinations at once. kruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. Conclusion kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com. PMID:24423115
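
    The core matrix trick can be sketched as follows; this simplified version, unlike the released kruX package, assumes no missing data and applies no tie correction:

    ```python
    import numpy as np
    from scipy.stats import rankdata, chi2

    def kw_matrix(genotypes, traits):
        """Kruskal-Wallis statistics for all marker-trait pairs at once.

        genotypes: (m, n) integer genotype classes (e.g. 0/1/2), no missing data
        traits:    (t, n) expression values for the same n samples
        Returns (t, m) statistics and p-values (df = observed classes - 1).
        """
        m, n = genotypes.shape
        R = np.apply_along_axis(rankdata, 1, traits)   # ranks within each trait
        H = np.zeros((traits.shape[0], m))
        present = np.zeros(m)                          # classes seen per marker
        for g in np.unique(genotypes):
            I = (genotypes == g).astype(float)         # (m, n) class indicator
            n_g = I.sum(axis=1)                        # class sizes per marker
            S = R @ I.T                                # (t, m) rank sums
            with np.errstate(divide="ignore", invalid="ignore"):
                H += np.where(n_g > 0, S ** 2 / n_g, 0.0)
            present += (n_g > 0)
        H = 12.0 / (n * (n + 1)) * H - 3.0 * (n + 1)
        return H, chi2.sf(H, df=present - 1)           # df broadcasts over rows
    ```

    The single matrix product R @ I.T replaces one rank-sum loop per marker-trait pair, which is where the speed-up comes from.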

  12. Shape sensing using multi-core fiber optic cable and parametric curve solutions.

    PubMed

    Moore, Jason P; Rogge, Matthew D

    2012-01-30

    The shape of a multi-core optical fiber is calculated by numerically solving a set of Frenet-Serret equations describing the path of the fiber in three dimensions. Included in the Frenet-Serret equations are curvature and bending direction functions derived from distributed fiber Bragg grating strain measurements in each core. The method offers advantages over prior art in that it determines complex three-dimensional fiber shape as a continuous parametric solution rather than an integrated series of discrete planar bends. Results and error analysis of the method using a tri-core optical fiber are presented. Maximum error expressed as a percentage of fiber length was found to be 7.2%.
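
    A minimal sketch of the numerical core: integrating the Frenet-Serret system with SciPy, given curvature and torsion functions (which, in the paper's setting, would be derived from the distributed FBG strain measurements):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def fiber_shape(kappa, tau, length, n_pts=500):
        """Integrate the Frenet-Serret equations to recover a fiber path.

        kappa, tau: callables giving curvature and torsion vs. arc length s.
        State: position r, tangent T, normal N, binormal B (12 components).
        """
        def rhs(s, y):
            r, T, N, B = y[0:3], y[3:6], y[6:9], y[9:12]
            k, t = kappa(s), tau(s)
            # r' = T, T' = kN, N' = -kT + tB, B' = -tN
            return np.concatenate([T, k * N, -k * T + t * B, -t * N])

        y0 = np.concatenate([np.zeros(3),      # start at the origin
                             [1, 0, 0],        # initial tangent
                             [0, 1, 0],        # initial normal
                             [0, 0, 1]])       # initial binormal
        s = np.linspace(0.0, length, n_pts)
        sol = solve_ivp(rhs, (0.0, length), y0, t_eval=s, rtol=1e-8)
        return sol.t, sol.y[0:3].T             # arc length, xyz positions

    # Example: constant curvature, zero torsion -> a circular arc
    s, xyz = fiber_shape(lambda s: 0.5, lambda s: 0.0, length=10.0)
    ```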

  13. Monotonic trends in spatio-temporal distribution and concentration of monsoon precipitation (1901-2002), West Bengal, India

    NASA Astrophysics Data System (ADS)

    Chatterjee, Soumendu; Khan, Ansar; Akbari, Hashem; Wang, Yupeng

    2016-12-01

    This paper investigates spatio-temporal monotonic trends and shifts in the concentration of monsoon precipitation across West Bengal, India, by analysing time series of monthly precipitation from 18 weather stations for the period 1901 to 2002. To deal with inhomogeneity in the precipitation series, the RHtestsV4 software package is used to detect, and adjust for, multiple change points (shifts) that could exist in the data series, and the cumulative deviation test was applied at the 5% significance level to check homogeneity. Afterward, the non-parametric Mann-Kendall (MK) test and Theil-Sen estimator (TSE) were applied to determine the nature and slope of trends, and the Sequential Mann-Kendall (SQMK) test was applied to detect turning points and the magnitude of change in trends. Prior to the application of these statistical tests, the pre-whitening technique was used to eliminate the effect of autocorrelation in the precipitation data series. Four indices - the precipitation concentration index (PCI), precipitation concentration degree (PCD), precipitation concentration period (PCP) and fulcrum (centre of gravity) - were used to detect precipitation concentration and its spatial pattern. The application of these procedures revealed very notable statewide monotonic trends in the monsoon precipitation time series. Regional cluster analysis by SQMK found increasing precipitation in mountain and coastal regions in general, except during the non-monsoon seasons. The results show that higher PCI values were mainly observed in South Bengal, whereas lower PCI values were mostly detected in North Bengal. The PCI values are noticeably larger in places where both monsoon total precipitation and the span of the rainy season are lower. The results of PCP reveal that precipitation in Gangetic Bengal mostly occurs in summer (monsoon season) and that the rainy season arrives earlier in North Bengal than in South Bengal, whereas the results of PCD indicate that precipitation in North Bengal was more dispersed within a year than that in South Bengal. The concentration characteristic of precipitation could be detected by fulcrum analysis, and significant concentration over most of West Bengal was evident within the July band. The precipitation trend observed in West Bengal is compared with that in the Central India (CI) region, and precipitation departures are compared with those of the Indian monsoon and Gangetic Bengal using a forecasting ensemble.
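
    A minimal sketch of the trend-testing chain (pre-whitening, Mann-Kendall test, Theil-Sen slope) on a synthetic series; the MK variance below omits the tie correction:

    ```python
    import numpy as np
    from scipy.stats import norm, theilslopes

    def prewhiten(x):
        """Remove lag-1 autocorrelation before trend testing."""
        r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
        return x[1:] - r1 * x[:-1]

    def mann_kendall(x):
        """Mann-Kendall trend test (normal approximation, no tie correction)."""
        n = len(x)
        s = np.sum(np.sign(x[None, :] - x[:, None])[np.triu_indices(n, 1)])
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var_s)
        return z, 2 * norm.sf(abs(z))                # z-score, two-sided p

    rain = np.random.default_rng(1).gamma(2.0, 400.0, size=102)  # synthetic
    z, p = mann_kendall(prewhiten(rain))
    slope = theilslopes(rain)[0]                     # Theil-Sen slope per year
    ```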

  14. Effects of cosmic rays on single event upsets

    NASA Technical Reports Server (NTRS)

    Venable, D. D.; Zajic, V.; Lowe, C. W.; Olidapupo, A.; Fogarty, T. N.

    1989-01-01

    Assistance was provided to the Brookhaven Single Event Upset (SEU) Test Facility. Computer codes were developed for fragmentation and secondary radiation affecting Very Large Scale Integration (VLSI) in space. A computer-controlled CV (HP4192) test was developed for Terman analysis. Also developed were high-speed parametric tests, which are independent of operator judgment, and a charge-pumping technique for measurement of D(sub it)(E). X-ray secondary effects and parametric degradation as a function of dose rate were simulated. The SPICE simulation of static RAMs with various resistor filters was tested.

  15. On the Quality of ENSDF {gamma}-Ray Intensity Data for {gamma}-Ray Spectrometric Determination of Th and U and Their Decay Series Disequilibria, in the Assessment of the Radiation Dose Rate in Luminescence Dating of Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corte, Frans de; Vandenberghe, Dimitri; Wispelaere, Antoine de

    In luminescence dating of sediments, one of the most interesting tools for the determination of the annual radiation dose is Ge {gamma}-ray spectrometry. Indeed, it yields information on both the content of the radioelements K, Th, and U, and on the occurrence - in geological times - of disequilibria in the Th and U decay series. In the present work, two methodological variants of the {gamma}-spectrometric analysis were tested, which largely depend on the quality of the nuclear decay data involved: (1) a parametric calibration of the sediment measurements, and (2) the correction for the heavy spectral interference of the 226Ra 186.2 keV peak by 235U at 185.7 keV. The performance of these methods was examined via the analysis of three Certified Reference Materials, with the introduction of {gamma}-ray intensity data originating from ENSDF. Relevant conclusions were drawn as to the accuracy of the data and their uncertainties quoted.

  16. Blast effect on the lower extremities and its mitigation: a computational study.

    PubMed

    Dong, Liqiang; Zhu, Feng; Jin, Xin; Suresh, Mahi; Jiang, Binhui; Sevagan, Gopinath; Cai, Yun; Li, Guangyao; Yang, King H

    2013-12-01

    A series of computational studies were performed to investigate the response of the lower extremities of mounted soldiers under landmine detonation. A numerical human body model newly developed at Wayne State University was used to simulate two types of experimental studies, and the model predictions were validated against test data in terms of the tibia axial force as well as the bone fracture pattern. Based on the validated model, the minimum axial force causing tibia fracture was found. Then a series of parametric studies was conducted to determine the critical velocity (peak velocity of the floor plate) causing tibia fracture at different upper/lower leg angles. In addition, to limit the load transmission through the vehicular floor, two types of energy absorbing materials, namely IMPAXX® foam and aluminum alloy honeycomb, were selected for floor matting. Their performance in terms of blast effect mitigation was compared using the validated numerical model, and it was found that honeycomb is a more efficient material for blast injury prevention under the loading conditions studied. © 2013 Elsevier Ltd. All rights reserved.

  17. Definition study for photovoltaic residential prototype system

    NASA Technical Reports Server (NTRS)

    Shepard, N. F.; Landes, R.; Kornrumpf, W. P.

    1976-01-01

    A site evaluation was performed to assess the relative merits of different regions of the country in terms of the suitability for experimental photovoltaic powered residences. Eight sites were selected based on evaluation criteria which included population, photovoltaic systems performance and the cost of electrical energy. A parametric sensitivity analysis was performed for four selected site locations. Analytical models were developed for four different power system implementation approaches. Using the model which represents a direct (or float) charge system implementation, the performance sensitivity to the following parameter variations is reported: (1) solar roof slope angle; (2) ratio of the number of series cells in the solar array to the number of series cells in the lead-acid battery; and (3) battery size. For a Cleveland site location, a system with no on-site energy storage and with a maximum power tracking inverter which feeds back excess power to the utility was shown to have 19 percent greater net system output than the second-place system. The experiment test plan is described. The load control and data acquisition system and the data display panel for the residence are discussed.

  18. Modelling Pollutant Dispersion in a Street Network

    NASA Astrophysics Data System (ADS)

    Salem, N. Ben; Garbero, V.; Salizzoni, P.; Lamaison, G.; Soulhac, L.

    2015-04-01

    This study constitutes a further step in the analysis of the performance of a street network model in simulating atmospheric pollutant dispersion in urban areas. The model, named SIRANE, is based on the decomposition of the urban atmosphere into two sub-domains: the urban boundary layer, whose dynamics is assumed to be well established, and the urban canopy, represented as a series of interconnected boxes. Parametric laws govern the mass exchanges between the boxes under the assumption that pollutant dispersion within the canopy can be fully simulated by modelling three main bulk transfer phenomena: channelling along street axes, transfers at street intersections, and vertical exchange between street canyons and the overlying atmosphere. Here, we aim to evaluate the reliability of the parametrizations adopted to simulate these phenomena, by focusing on their possible dependence on the external wind direction. To this end, we test the model against concentration measurements within an idealized urban district whose geometrical layout closely matches the street network represented in SIRANE. The analysis is performed for an urban array with a fixed geometry and a varying wind incidence angle. The results show that the model performs generally well with the reference parametrizations adopted in SIRANE and that its performance is quite robust for a wide range of the model parameters. This proves the reliability of the street network approach in simulating pollutant dispersion in densely built city districts. The results also show that the model's performance may be improved by considering a dependence of the wind fluctuations at street intersections, and of the vertical exchange velocity, on the direction of the incident wind. This opens the way for further investigations to clarify the dependence of these parameters on wind direction and street aspect ratios.

  19. Genetic Networks and Anticipation of Gene Expression Patterns

    NASA Astrophysics Data System (ADS)

    Gebert, J.; Lätsch, M.; Pickl, S. W.; Radde, N.; Weber, G.-W.; Wünschiers, R.

    2004-08-01

    An interesting problem for computational biology is the analysis of time-series expression data. Here, the application of modern methods from dynamical systems, optimization theory and numerical algorithms, together with the utilization of implicit discrete information, leads to a deeper understanding. In [1], we suggested representing the behavior of time-series gene expression patterns by a system of ordinary differential equations, which we investigated analytically and algorithmically with respect to the stability or instability of its parameters. Our algorithm strongly exploited combinatorial information. In this paper, we deepen, extend and exemplify this study from the viewpoint of the underlying mathematical modelling. This modelling consists of evaluating DNA-microarray measurements as the basis of anticipatory prediction, the choice of a smooth model given by differential equations, an approach to the right-hand side with parametric matrices, and a discrete approximation which is a least squares optimization problem. We give a mathematical and biological discussion, and pay attention to the special case of a linear system, where the matrices do not depend on the state of expressions. Here, we present first numerical examples.
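
    For the linear special case mentioned above, the least squares step can be sketched directly: a toy estimate of the interaction matrix M in dx/dt = Mx from sampled expression data, under the assumption of equally spaced measurements:

    ```python
    import numpy as np

    def fit_linear_network(X, dt):
        """Estimate M in dx/dt = M x from a sampled expression time series.

        X: (T, g) array, expression of g genes at T equally spaced times.
        Finite differences give dX; M is the least squares solution of
        dX ~ X_mid @ M^T, solved with np.linalg.lstsq.
        """
        dX = (X[1:] - X[:-1]) / dt       # forward-difference derivatives
        X_mid = 0.5 * (X[1:] + X[:-1])   # midpoint states
        A, *_ = np.linalg.lstsq(X_mid, dX, rcond=None)
        return A.T                       # (g, g) interaction matrix M
    ```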

  20. Product Module Rig Test

    NASA Technical Reports Server (NTRS)

    Holdeman, James D. (Technical Monitor); Chiappetta, Louis, Jr.; Hautman, Donald J.; Ols, John T.; Padget, Frederick C., IV; Peschke, William O. T.; Shirley, John A.; Siskind, Kenneth S.

    2004-01-01

    The low emissions potential of a Rich-Quench-Lean (RQL) combustor for use in the High Speed Civil Transport (HSCT) application was evaluated as part of Work Breakdown Structure (WBS) 1.0.2.7 of the NASA Critical Propulsion Components (CPC) Program under Contract NAS3-27235. Combustion testing was conducted in cell 1E of the Jet Burner Test Stand at United Technologies Research Center. Specifically, a Rich-Quench-Lean combustor, utilizing reduced scale quench technology implemented in a quench vane concept in a product-like configuration (Product Module Rig), demonstrated the capability of achieving an emissions index of nitrogen oxides (NOx EI) of 8.5 g/kg fuel at the supersonic flight condition (relative to the program goal of 5 g/kg fuel). Developmental parametric testing of various quench vane configurations in the more fundamental flametube, Single Module Rig Configuration, demonstrated NOx EI as low as 5.2. All configurations in both the Product Module Rig configuration and the Single Module Rig configuration demonstrated exceptional efficiencies, greater than 99.95 percent, relative to the program goal of 99.9 percent efficiency at supersonic cruise conditions. Sensitivity of emissions to quench orifice design parameters was determined during the parametric quench vane test series in support of the design of the Product Module Rig configuration. For the rectangular quench orifices investigated, an aspect ratio (length/width) of approximately 2 was found to be near optimum. An optimum for orifice spacing was found to exist at approximately 0.167 inches, resulting in 24 orifices per side of a quench vane, for the 0.435 inch quench zone channel height investigated in the Single Module Rig. Smaller quench zone channel heights appeared to be beneficial in reducing emissions. Measurements were also obtained in the Single Module Rig configuration on the sensitivity of emissions to the critical combustor parameters of fuel/air ratio, pressure drop, and residence time. Minimal sensitivity was observed for all of these parameters.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT ANR PIPELINE COMPANY PARAMETRIC EMISSIONS MONITORING SYSTEM (PEMS)

    EPA Science Inventory

    The Environmental Technology Verification report discusses the technology and performance of a gaseous-emissions monitoring system for large, natural-gas-fired internal combustion engines. The device tested is the Parametric Emissions Monitoring System (PEMS) manufactured by ANR ...

  2. A double expansion method for the frequency response of finite-length beams with periodic parameters

    NASA Astrophysics Data System (ADS)

    Ying, Z. G.; Ni, Y. Q.

    2017-03-01

    A double expansion method for the frequency response of finite-length beams with periodic distribution parameters is proposed. The vibration response of the beam with spatially periodic parameters under harmonic excitations is studied. The frequency response of the periodic beam is a function of the parametric period and can therefore be expressed by a series with the product of periodic and non-periodic functions. The procedure of the double expansion method includes the following two main steps: first, the frequency response function and periodic parameters are expanded by using identical periodic functions based on the extension of the Floquet-Bloch theorem, and the period-parametric differential equation for the frequency response is converted into a series of linear differential equations with constant coefficients; second, the solutions to the linear differential equations are expanded by using modal functions which satisfy the boundary conditions, and the linear differential equations are converted into algebraic equations according to the Galerkin method. The expansion coefficients are obtained by solving the algebraic equations and then the frequency response function is finally determined. The proposed double expansion method can uncouple the effects of the periodic expansion and modal expansion so that the expansion terms are determined respectively. The modal number considered in the second expansion can be reduced remarkably in comparison with the direct expansion method. The proposed double expansion method can be extended and applied to other structures with periodic distribution parameters for dynamics analysis. Numerical results on the frequency response of the finite-length periodic beam with various parametric wave numbers and wave amplitude ratios are given to illustrate the effective application of the proposed method and the new frequency response characteristics, including the parameter-excited modal resonance, doubling-peak frequency response and remarkable reduction of the maximum frequency response for certain parametric wave numbers and wave amplitudes. The results have potential application to structural vibration control.
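
    In symbols (notation assumed here for illustration, not taken from the paper), the two expansions can be sketched as:

    ```latex
    % Step 1: Floquet-Bloch-type expansion in the parametric wavenumber
    %         \kappa_p = 2\pi/T_p, where T_p is the parameter period.
    H(x,\omega) = \sum_{m} H_m(x,\omega)\, e^{\, i m \kappa_p x}
    % Step 2: Galerkin expansion of each envelope in modal functions
    %         \phi_k satisfying the boundary conditions.
    H_m(x,\omega) = \sum_{k} c_{mk}(\omega)\, \phi_k(x)
    % Substituting both series reduces the period-parametric differential
    % equation to algebraic equations for the coefficients c_{mk}.
    ```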

  3. The chi-square test of independence.

    PubMed

    McHugh, Mary L

    2013-01-01

    The Chi-square statistic is a non-parametric (distribution-free) tool designed to analyze group differences when the dependent variable is measured at a nominal level. Like all non-parametric statistics, the Chi-square is robust with respect to the distribution of the data. Specifically, it does not require equality of variances among the study groups or homoscedasticity in the data. It permits evaluation of both dichotomous independent variables and of multiple group studies. Unlike many other non-parametric and some parametric statistics, the calculations needed to compute the Chi-square provide considerable information about how each of the groups performed in the study. This richness of detail allows the researcher to understand the results and thus to derive more detailed information from this statistic than from many others. The Chi-square is a significance statistic and should be followed with a strength statistic. Cramer's V is the most common strength statistic used when a significant Chi-square result has been obtained. Advantages of the Chi-square include its robustness with respect to the distribution of the data, its ease of computation, the detailed information that can be derived from the test, its use in studies for which parametric assumptions cannot be met, and its flexibility in handling data from both two-group and multiple-group studies. Limitations include its sample size requirements, the difficulty of interpretation when there are large numbers of categories (20 or more) in the independent or dependent variables, and the tendency of Cramer's V to produce relatively low correlation measures, even for highly significant results.
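
    A minimal worked example with hypothetical counts, following the recommendation to pair the Chi-square test with Cramer's V as the strength statistic:

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    table = np.array([[30, 10],     # e.g. treated: improved / not improved
                      [15, 25]])    #      control: improved / not improved
    chi2, p, dof, expected = chi2_contingency(table)

    # Cramer's V as the follow-up strength statistic
    n = table.sum()
    k = min(table.shape) - 1
    cramers_v = np.sqrt(chi2 / (n * k))
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}, Cramer's V = {cramers_v:.2f}")
    ```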

  4. Numerical prediction of 3-D ejector flows

    NASA Technical Reports Server (NTRS)

    Roberts, D. W.; Paynter, G. C.

    1979-01-01

    The use of parametric flow analysis, rather than parametric scale testing, to support the design of an ejector system offers a number of potential advantages. The application of available 3-D flow analyses to the design of ejectors can be subdivided into several key elements: numerics, turbulence modeling, data handling and display, and testing in support of analysis development. Experimental and predicted jet exhaust flows for the Boeing 727 aircraft are examined.

  5. Evaluation of Stochastic Rainfall Models in Capturing Climate Variability for Future Drought and Flood Risk Assessment

    NASA Astrophysics Data System (ADS)

    Chowdhury, A. F. M. K.; Lockart, N.; Willgoose, G. R.; Kuczera, G. A.; Kiem, A.; Nadeeka, P. M.

    2016-12-01

    One of the key objectives of stochastic rainfall modelling is to capture the full variability of the climate system for future drought and flood risk assessment. However, it is not clear how well these models can capture future climate variability when they are calibrated to Global/Regional Climate Model (GCM/RCM) data, as these datasets are usually available only for short future periods (e.g. 20 years). This study has assessed the ability of two stochastic daily rainfall models to capture climate variability by calibrating them to a dynamically downscaled RCM dataset in an east Australian catchment for the 1990-2010, 2020-2040, and 2060-2080 epochs. The two stochastic models are: (1) a hierarchical Markov Chain (MC) model, which we developed in a previous study, and (2) a semi-parametric MC model developed by Mehrotra and Sharma (2007). Our hierarchical model uses stochastic parameters of the MC and a Gamma distribution, while the semi-parametric model uses a modified MC process with memory of past periods and kernel density estimation. This study has generated multiple realizations of rainfall series by using parameters of each model calibrated to the RCM dataset for each epoch. The generated rainfall series are used to generate synthetic streamflow by using a SimHyd hydrology model. Assessing the synthetic rainfall and streamflow series, this study has found that both stochastic models can incorporate a range of variability in rainfall as well as streamflow generation for both current and future periods. However, the hierarchical model tends to overestimate the multiyear variability of wet spell lengths (and therefore is less likely to simulate long periods of drought and flood), while the semi-parametric model tends to overestimate the mean annual rainfall depths and streamflow volumes (hence, simulated droughts are likely to be less severe). The sensitivity of these limitations of both stochastic models in terms of future drought and flood risk assessment will be discussed.
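
    A minimal sketch of the occurrence-plus-amounts structure shared by such daily rainfall generators: a first-order two-state Markov chain with gamma-distributed wet-day depths. Parameter values are illustrative, not calibrated to any RCM:

    ```python
    import numpy as np

    def simulate_rainfall(n_days, p_wd=0.3, p_ww=0.7, shape=0.8, scale=8.0,
                          seed=0):
        """First-order two-state Markov chain occurrence + gamma amounts.

        p_wd: P(wet | previous day dry); p_ww: P(wet | previous day wet).
        """
        rng = np.random.default_rng(seed)
        rain = np.zeros(n_days)
        wet = False
        for t in range(n_days):
            wet = rng.random() < (p_ww if wet else p_wd)
            if wet:
                rain[t] = rng.gamma(shape, scale)   # wet-day depth in mm
        return rain

    series = simulate_rainfall(365 * 20)   # one 20-year synthetic realization
    ```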

  6. Sensitivity and specificity of normality tests and consequences on reference interval accuracy at small sample size: a computer-simulation study.

    PubMed

    Le Boedec, Kevin

    2016-12-01

    According to international guidelines, parametric methods must be chosen for RI construction when the sample size is small and the distribution is Gaussian. However, normality tests may not be accurate at small sample size. The purpose of the study was to evaluate normality test performance to properly identify samples extracted from a Gaussian population at small sample sizes, and to assess the consequences on RI accuracy of applying parametric methods to samples that falsely identified the parent population as Gaussian. Samples of n = 60 and n = 30 values were randomly selected 100 times from simulated Gaussian, lognormal, and asymmetric populations of 10,000 values. The sensitivity and specificity of 4 normality tests were compared. Reference intervals were calculated using 6 different statistical methods from samples that falsely identified the parent population as Gaussian, and their accuracy was compared. Shapiro-Wilk and D'Agostino-Pearson tests were the best performing normality tests. However, their specificity was poor at sample size n = 30 (specificity for P < .05: .51 and .50, respectively). The best significance levels identified when n = 30 were 0.19 for the Shapiro-Wilk test and 0.18 for the D'Agostino-Pearson test. Using parametric methods on samples extracted from a lognormal population but falsely identified as Gaussian led to clinically relevant inaccuracies. At small sample size, normality tests may lead to erroneous use of parametric methods to build RI. Using nonparametric methods (or alternatively a Box-Cox transformation) on all samples regardless of their distribution, or adjusting the significance level of normality tests depending on sample size, would limit the risk of constructing inaccurate RI. © 2016 American Society for Veterinary Clinical Pathology.
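
    The simulation design can be sketched in a few lines: draw small samples from Gaussian and lognormal parents and record how often each normality test rejects:

    ```python
    import numpy as np
    from scipy.stats import shapiro, normaltest

    rng = np.random.default_rng(42)
    n, reps, alpha = 30, 100, 0.05
    results = {"Shapiro-Wilk": [0, 0], "D'Agostino-Pearson": [0, 0]}
    for _ in range(reps):
        gauss = rng.normal(size=n)                  # parent truly Gaussian
        lognorm = rng.lognormal(sigma=0.5, size=n)  # parent truly lognormal
        for name, test in [("Shapiro-Wilk", shapiro),
                           ("D'Agostino-Pearson", normaltest)]:
            results[name][0] += test(gauss).pvalue < alpha    # false rejection
            results[name][1] += test(lognorm).pvalue < alpha  # detection

    for name, (false_rej, detected) in results.items():
        print(f"{name}: false rejections {false_rej/reps:.2f}, "
              f"non-normality detected {detected/reps:.2f}")
    ```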

  7. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is 'What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?' Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  8. Testing in semiparametric models with interaction, with applications to gene-environment interactions.

    PubMed

    Maity, Arnab; Carroll, Raymond J; Mammen, Enno; Chatterjee, Nilanjan

    2009-01-01

    Motivated by the problem of testing for genetic effects on complex traits in the presence of gene-environment interaction, we develop score tests for general semiparametric regression problems that involve a Tukey-style 1-degree-of-freedom form of interaction between parametrically and non-parametrically modelled covariates. We find that the score test in this type of model, as recently developed by Chatterjee and co-workers in the fully parametric setting, is biased and requires undersmoothing to be valid in the presence of non-parametric components. Moreover, in the presence of repeated outcomes, the asymptotic distribution of the score test depends on the estimation of functions which are defined as solutions of integral equations, making implementation difficult and computationally taxing. We develop profiled score statistics which are unbiased and asymptotically efficient and can be computed by using standard bandwidth selection methods. In addition, to overcome the difficulty of solving functional equations, we give easy interpretations of the target functions, which in turn allow us to develop estimation procedures that can be easily implemented by using standard computational methods. We present simulation studies to evaluate the type I error and power of the proposed method compared with a naive test that does not consider interaction. Finally, we illustrate our methodology by analysing data from a case-control study of colorectal adenoma that was designed to investigate the association between colorectal adenoma and the candidate gene NAT2 in relation to smoking history.

  9. Verification of endocrinological functions at a short distance between parametric speakers and the human body.

    PubMed

    Lee, Soomin; Katsuura, Tetsuo; Shimomura, Yoshihiro

    2011-01-01

    In recent years, a new type of speaker called the parametric speaker has been used to generate highly directional sound, and these speakers are now commercially available. In our previous study, we verified that the parametric speaker placed a lower burden on endocrine function than a general speaker. However, nothing had yet been demonstrated about the effects of distances shorter than 2.6 m between parametric speakers and the human body. Therefore, we investigated the effect of distance on endocrinological function and subjective evaluation. Nine male subjects participated in this study. They completed three consecutive sessions: a 20-min quiet period as a baseline, a 30-min mental task period with general speakers or parametric speakers, and a 20-min recovery period. We measured salivary cortisol and chromogranin A (CgA) concentrations. Furthermore, subjects took the Kwansei-gakuin Sleepiness Scale (KSS) test before and after the task, and a sound quality evaluation test after it. Four experiments, crossing the speaker condition (general speaker and parametric speaker) with the distance condition (0.3 m and 1.0 m), were conducted at the same time of day on separate days. We used three-way repeated measures ANOVA (speaker factor × distance factor × time factor) to examine the effects of the parametric speaker. We found that the endocrinological functions did not differ significantly between the speaker conditions or the distance conditions. The results also showed that the physiological burden increased with time independent of the speaker condition and distance condition.

  10. A numerical study on piezoelectric energy harvesting by combining transverse galloping and parametric instability phenomena

    NASA Astrophysics Data System (ADS)

    Franzini, Guilherme Rosa; Santos, Rebeca Caramêz Saraiva; Pesce, Celso Pupo

    2017-12-01

    This paper aims to numerically investigate the effects of parametric instability on piezoelectric energy harvesting from the transverse galloping of a square prism. A two degrees-of-freedom reduced-order model for this problem is proposed and numerically integrated. A usual quasi-steady galloping model is applied, where the transverse force coefficient is adopted as a cubic polynomial function of the angle of attack. Time histories of nondimensional prism displacement, electric voltage and power dissipated at both the dashpot and the electrical resistance are obtained as functions of the reduced velocity. Both oscillation amplitude and electric voltage increased with the reduced velocity for all parametric excitation conditions tested. For low values of reduced velocity, 2:1 parametric excitation enhances the electric voltage. On the other hand, for higher reduced velocities, 1:1 parametric excitation (i.e., at the natural frequency) enhances both oscillation amplitude and electric voltage. It has also been found that, depending on the parametric excitation frequency, the harvested electrical power can be amplified by 70% compared to the case with no parametric excitation.
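
    A minimal sketch of such a reduced-order model: quasi-steady cubic galloping force, parametric stiffness modulation, and a linear harvesting circuit. All parameter values and the nondimensional form are illustrative assumptions, not the paper's:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative nondimensional parameters (not taken from the paper)
    zeta, eps, Omega = 0.01, 0.2, 2.0      # damping, parametric amp./freq.
    a1, a3 = 2.3, -18.0                    # cubic galloping coefficients
    U, mu = 1.5, 0.5                       # reduced velocity, mass parameter
    theta, alpha, kappa = 0.1, 0.05, 0.5   # piezo coupling, 1/(RC), feedback

    def rhs(t, s):
        y, ydot, v = s
        cy = a1 * (ydot / U) + a3 * (ydot / U) ** 3       # quasi-steady force
        ydd = (-2 * zeta * ydot - (1 + eps * np.cos(Omega * t)) * y
               + mu * U ** 2 * cy + theta * v)            # parametric stiffness
        vdot = -alpha * v - kappa * ydot                  # harvesting circuit
        return [ydot, ydd, vdot]

    sol = solve_ivp(rhs, (0, 500), [0.01, 0.0, 0.0], max_step=0.05)
    # sol.y[0]: displacement, sol.y[2]: voltage; sweep Omega to compare
    # 2:1 and 1:1 parametric excitation against the unmodulated case.
    ```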

  11. Parametric Modeling for Fluid Systems

    NASA Technical Reports Server (NTRS)

    Pizarro, Yaritzmar Rosario; Martinez, Jonathan

    2013-01-01

    Fluid Systems involves different projects that require parametric modeling, i.e., a model that maintains consistent relationships between elements as it is manipulated. One of these projects is the Neo Liquid Propellant Testbed, which is part of Rocket U. As part of Rocket U (Rocket University), engineers at NASA's Kennedy Space Center in Florida have the opportunity to develop critical flight skills as they design, build and launch high-powered rockets. To build the Neo testbed, hardware from the Space Shuttle Program was repurposed. Modeling for Neo included fittings, valves, frames and tubing, among others. These models help in the review process, to make sure regulations are being followed. Another fluid systems project that required modeling is Plant Habitat's TCUI test project. Plant Habitat is a plan to develop a large growth chamber to study the effects of long-duration microgravity exposure on plants in space. Work for this project included the design and modeling of a duct vent for a flow test. Parametric modeling for these projects was done using Creo Parametric 2.0.

  12. Processes controlling surface, bottom and lateral melt of Arctic sea ice in a state of the art sea ice model.

    PubMed

    Tsamados, Michel; Feltham, Daniel; Petty, Alek; Schroeder, David; Flocco, Daniela

    2015-10-13

    We present a modelling study of processes controlling the summer melt of the Arctic sea ice cover. We perform a sensitivity study and focus our interest on the thermodynamics at the ice-atmosphere and ice-ocean interfaces. We use the Los Alamos community sea ice model CICE, and additionally implement and test three new parametrization schemes: (i) a prognostic mixed layer; (ii) a three equation boundary condition for the salt and heat flux at the ice-ocean interface; and (iii) a new lateral melt parametrization. Recent additions to the CICE model are also tested, including explicit melt ponds, a form drag parametrization and a halodynamic brine drainage scheme. The various sea ice parametrizations tested in this sensitivity study introduce a wide spread in the simulated sea ice characteristics. For each simulation, the total melt is decomposed into its surface, bottom and lateral melt components to assess the processes driving melt and how this varies regionally and temporally. Because this study quantifies the relative importance of several processes in driving the summer melt of sea ice, this work can serve as a guide for future research priorities. © 2015 The Author(s).

  13. Two-sample statistics for testing the equality of survival functions against improper semi-parametric accelerated failure time alternatives: an application to the analysis of a breast cancer clinical trial.

    PubMed

    Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry

    2004-06-01

    This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests.

  14. Ultrasonically Absorptive Coatings for Hypersonic Laminar Flow Control

    DTIC Science & Technology

    2007-12-01

    ... integrate UAC and TPS functions. To aid in the design of UAC with regular microstructure to be tested in the CUBRC LENS I tunnel, extensive parametric studies of the UAC laminar flow control performance were conducted, laying a solid foundation for large-scale demonstration of the UAC-LFC performance in the CUBRC LENS I tunnel as well as fabrication of ceramic UAC samples.

  15. Ultrasonically Absorptive Coatings for Hypersonic

    DTIC Science & Technology

    2008-05-13

    ... integrate UAC and TPS functions. To aid in the design of UAC with regular microstructure to be tested in the CUBRC LENS I tunnel, parametric studies of the UAC-LFC performance were conducted. The effort is approaching the large-scale demonstration stage in the CUBRC LENS tunnel, as well as fabrication of ceramic UAC samples integrated into TPS.

  16. The use of analysis of variance procedures in biological studies

    USGS Publications Warehouse

    Williams, B.K.

    1987-01-01

    The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
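
    The computational side can be illustrated with a small hypothetical unbalanced two-factor design, where sequential and partial sums of squares test different hypotheses about the main effects:

    ```python
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical unbalanced two-factor design (unequal cell counts)
    df = pd.DataFrame({
        "a": ["lo"] * 7 + ["hi"] * 5,
        "b": ["x", "y"] * 6,
        "resp": [4.1, 5.2, 3.9, 6.0, 4.4, 5.1, 4.0,
                 7.2, 6.8, 8.1, 7.0, 7.7],
    })
    model = smf.ols("resp ~ C(a) * C(b)", data=df).fit()

    # With unbalanced data, sequential (Type I) and partial (Type II) sums
    # of squares correspond to different main-effect hypotheses.
    print(sm.stats.anova_lm(model, typ=1))
    print(sm.stats.anova_lm(model, typ=2))
    ```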

  17. Accuracy evaluation of Fourier series analysis and singular spectrum analysis for predicting the volume of motorcycle sales in Indonesia

    NASA Astrophysics Data System (ADS)

    Sasmita, Yoga; Darmawan, Gumgum

    2017-08-01

    This research aims to evaluate the forecasting performance of Fourier Series Analysis (FSA) and Singular Spectrum Analysis (SSA), which are more exploratory and do not require parametric assumptions. These methods are applied to predicting the volume of motorcycle sales in Indonesia from January 2005 to December 2016 (monthly). Both models are suitable for data with seasonal and trend components. Technically, FSA represents the time domain as the combination of trend and seasonal components at different frequencies, which are difficult to identify in time-domain analysis. With a hidden period of 2.918 ≈ 3 and a significant model order of 3, the FSA model is used to predict the testing data. Meanwhile, SSA has two main processes, decomposition and reconstruction. SSA decomposes the time series data into different components. The reconstruction process starts with grouping the decomposition results based on the similar periods of the components in the trajectory matrix. With the optimum window length (L = 53) and grouping effect (r = 4), SSA predicts the testing data. Forecasting accuracy is evaluated using the Mean Absolute Percentage Error (MAPE), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE). The results show that over the next 12 months, SSA has MAPE = 13.54 percent, MAE = 61,168.43 and RMSE = 75,244.92, while FSA has MAPE = 28.19 percent, MAE = 119,718.43 and RMSE = 142,511.17. Therefore, the SSA method, which has the better accuracy, should be used to predict the volume of motorcycle sales in the next period.
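
    The three accuracy criteria used to compare the methods are straightforward to compute; a minimal sketch:

    ```python
    import numpy as np

    def accuracy(actual, forecast):
        """MAPE (%), MAE and RMSE, as used to compare FSA and SSA forecasts."""
        actual = np.asarray(actual, float)
        forecast = np.asarray(forecast, float)
        err = actual - forecast
        mape = 100.0 * np.mean(np.abs(err / actual))
        mae = np.mean(np.abs(err))
        rmse = np.sqrt(np.mean(err ** 2))
        return mape, mae, rmse
    ```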

  18. Plasma-enhanced mixing and flameholding in supersonic flow

    PubMed Central

    Firsov, Alexander; Savelkin, Konstantin V.; Yarantsev, Dmitry A.; Leonov, Sergey B.

    2015-01-01

    The results of an experimental study of plasma-based mixing, ignition and flameholding in a supersonic model combustor are presented in the paper. The model combustor has a length of 600 mm and a cross section of 72 mm width and 60 mm height. The fuel is directly injected into the supersonic airflow (Mach number M=2, static pressure Pst=160–250 Torr) through wall orifices. Two series of tests are focused on flameholding and mixing, respectively. In the first series, the near-surface quasi-DC electrical discharge is generated by flush-mounted electrodes at an electrical power deposition of Wpl=3–24 kW. The scope includes a parametric study of ignition and flame front dynamics, and a comparison of three schemes of plasma generation: the first and second layouts examine the location of plasma generators upstream and downstream of the fuel injectors; the third pattern follows a novel approach of a combined mixing/ignition technique, where the electrical discharge is distributed along the fuel jet. The last pattern demonstrates a significant advantage in terms of the flameholding limit. In the second series of tests, a long discharge of submicrosecond duration is generated across the flow and along the fuel jet. A gasdynamic instability of the thermal cavity that develops after the deposition of a high power density in a thin plasma filament promotes air–fuel mixing. The technique studied in this work has significant potential for high-speed combustion applications, including cold start/restart of scramjet engines and support of the transition regime in dual-mode scramjets and at off-design operation. PMID:26170434

  19. Density Fluctuations in the Solar Wind Driven by Alfvén Wave Parametric Decay

    NASA Astrophysics Data System (ADS)

    Bowen, Trevor A.; Badman, Samuel; Hellinger, Petr; Bale, Stuart D.

    2018-02-01

    Measurements and simulations of inertial compressive turbulence in the solar wind are characterized by anti-correlated magnetic fluctuations parallel to the mean field and density structures. This signature has been interpreted as observational evidence for non-propagating pressure balanced structures, kinetic ion-acoustic waves, as well as the MHD slow-mode. Given the high damping rates of parallel propagating compressive fluctuations, their ubiquity in satellite observations is surprising and suggestive of a local driving process. One possible candidate for the generation of compressive fluctuations in the solar wind is the Alfvén wave parametric instability. Here, we test the parametric decay process as a source of compressive waves in the solar wind by comparing the collisionless damping rates of compressive fluctuations with growth rates of the parametric decay instability daughter waves. Our results suggest that generation of compressive waves through parametric decay is overdamped at 1 au, but that the presence of slow-mode-like density fluctuations is correlated with the parametric decay of Alfvén waves.

  20. Nanotribological behavior analysis of graphene/metal nanocomposites via MD simulations: New concepts and underlying mechanisms

    NASA Astrophysics Data System (ADS)

    Montazeri, A.; Mobarghei, A.

    2018-04-01

    In this article, we report a series of MD-based nanoindentation tests aimed at examining the nanotribological characteristics of metal-based nanocomposites in the presence of graphene sheets. To evaluate the effects of graphene/matrix interactions on the results, nickel and copper are selected as metals having strong and weak interactions with graphene, respectively. The influence of graphene layer sliding, and of the layers' distance from the sample surface, on the nanoindentation outputs is thoroughly examined. Additionally, the temperature dependence of the results is investigated in depth with emphasis on the underlying mechanisms. To verify the accuracy of the nanoindentation outputs, the results of this method are compared with data obtained via tensile tests. It is concluded that the nanoindentation results are closer to the values obtained by means of experimental setups. Employing these numerical experiments enables us to perform parametric studies to find the dominant factors affecting the nanotribological behavior of these nanocomposites at the atomic scale.

  1. Recent Papers in Parametric Modelling of Time Series.

    DTIC Science & Technology

    1983-04-01

    We have presented a general framework for deriving and...

  2. Creating Teachers' Perceptual, Behavioral, and Attitudinal Change Using Professional Development Workshops

    ERIC Educational Resources Information Center

    Shriner, Michael; Schlee, Bethanne; Hamil, Melissa; Libler, Rebecca

    2009-01-01

    As part of an on-going project designed to impact teacher quality at the pre-service, induction, and professional development levels, this paper summarizes the parametric results of a series of four different workshops conducted in the summer of 2007. In an effort to glean a better understanding of their knowledge, attitudinal, perceptual, and…

  3. Study of solid rocket motor for space shuttle booster. Volume 4: Cost

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The cost data for solid propellant rocket engines for use with the space shuttle are presented. The data are based on the selected 156 inch parallel and series burn configurations. Summary cost data are provided for the production of the 120 inch and 260 inch configurations. Graphs depicting parametric cost estimating relationships are included.

  4. Jet-Surface Interaction Noise from High-Aspect Ratio Nozzles: Test Summary

    NASA Technical Reports Server (NTRS)

    Brown, Clifford; Podboy, Gary

    2017-01-01

    Noise and flow data have been acquired for a 16:1 aspect ratio rectangular nozzle exhausting near a simple surface at the NASA Glenn Research Center as part of an ongoing effort to understand, model, and predict the noise produced by current and future concept aircraft employing tightly integrated engine-airframe designs. The particular concept under consideration in this experiment is a blended-wing-body airframe powered by a series of electric fans exhausting through a slot nozzle over an aft deck. The exhaust Mach number and surface length were parametrically varied during the test. Far-field noise data were acquired for all nozzle surface geometries and exhaust flow conditions. Phased-array noise source localization data and in-flow pressure data were also acquired for a subset of the isolated (no surface) and surface configurations; these measurements provide data that have proven useful for modeling the jet-surface interaction noise source and the surface effect on the jet-mixing noise in round jets. A summary of the nozzle and surface geometries, flow conditions tested, and data collected is presented.

  5. Case study of supply induced demand: the case of provision of imaging scans (computed tomography and magnetic resonance) at Unimed-Manaus.

    PubMed

    Andrade, Edson de Oliveira; Andrade, Elizabeth Nogueira de; Gallo, José Hiran

    2011-01-01

    To present the experience of a health plan operator (Unimed-Manaus) in Manaus, Amazonas, Brazil, with the accreditation of imaging services and the demand induced by the supply of new services (Roemer's Law). This is a retrospective study of a time series covering the period from January 1998 to June 2004, in which computed tomography and magnetic resonance imaging services were implemented as part of the services offered by that health plan operator. Statistical analysis consisted of a descriptive and an inferential part, the latter using parametric tests of means (Student's t-test and ANOVA) and the Pearson correlation test. A 5% alpha and a 95% confidence interval were adopted. At Unimed-Manaus, the supply of new imaging services, by itself, was identified as capable of generating an increased service demand, thus characterizing the phenomenon described by Roemer. The results underscore the need to be aware of the fact that the supply of new health services could bring about their increased use without a real demand.

  6. Establishment of Biological Reference Intervals and Reference Curve for Urea by Exploratory Parametric and Non-Parametric Quantile Regression Models.

    PubMed

    Sarkar, Rajarshi

    2013-07-01

    The validity of the entire panel of renal function tests as a diagnostic tool depends substantially on the Biological Reference Interval (BRI) of urea. Establishment of the BRI of urea is difficult, partly because exclusion criteria for selection of reference data are quite rigid and partly due to the compartmentalization considerations regarding age and sex of the reference individuals. Moreover, construction of the Biological Reference Curve (BRC) of urea is imperative to highlight the partitioning requirements. This a priori study examines data collected by measuring serum urea of 3202 age- and sex-matched individuals, aged between 1 and 80 years, by a kinetic UV Urease/GLDH method on a Roche Cobas 6000 auto-analyzer. The Mann-Whitney U test of the reference data confirmed the partitioning requirement by both age and sex. Further statistical analysis revealed the incompatibility of the data with a proposed parametric model. Hence the data were analysed non-parametrically. The BRI was found to be identical for both sexes until the 2nd decade, and the BRI for males increased progressively from the 6th decade onwards. Four non-parametric models were postulated for construction of the BRC: Gaussian kernel, double kernel, local mean and local constant, of which the last one generated the best-fitting curves. Clinical decision making should become easier and the diagnostic implications of renal function tests should become more meaningful if this BRI is followed and the BRC is used as a desktop tool in conjunction with similar data for serum creatinine.
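
    A generic sketch of a local-constant quantile estimator of the kind described: kernel-weighted sample quantiles evaluated on an age grid. This illustrates the approach, not the author's exact estimator; the bandwidth and quantile levels are assumptions:

    ```python
    import numpy as np

    def kernel_quantile_curves(age, value, grid, q=(0.025, 0.5, 0.975), bw=5.0):
        """Local-constant quantile curves: at each grid age, take weighted
        sample quantiles with Gaussian kernel weights centered on that age."""
        curves = np.empty((len(grid), len(q)))
        order = np.argsort(value)
        sorted_vals = np.asarray(value)[order]
        for i, g in enumerate(grid):
            w = np.exp(-0.5 * ((np.asarray(age) - g) / bw) ** 2)
            cw = np.cumsum(w[order]) / np.sum(w)   # weighted empirical CDF
            curves[i] = np.interp(q, cw, sorted_vals)
        return curves   # columns: lower BRI limit, median, upper BRI limit
    ```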

  7. How to Evaluate Phase Differences between Trial Groups in Ongoing Electrophysiological Signals

    PubMed Central

    VanRullen, Rufin

    2016-01-01

    A growing number of studies endeavor to reveal periodicities in sensory and cognitive functions, by comparing the distribution of ongoing (pre-stimulus) oscillatory phases between two (or more) trial groups reflecting distinct experimental outcomes. A systematic relation between the phase of spontaneous electrophysiological signals, before a stimulus is even presented, and the eventual result of sensory or cognitive processing for that stimulus, would be indicative of an intrinsic periodicity in the underlying neural process. Prior studies of phase-dependent perception have used a variety of analytical methods to measure and evaluate phase differences, and there is currently no established standard practice in this field. The present report intends to remediate this need, by systematically comparing the statistical power of various measures of “phase opposition” between two trial groups, in a number of real and simulated experimental situations. Seven measures were evaluated: one parametric test (circular Watson-Williams test), and three distinct measures of phase opposition (phase bifurcation index, phase opposition sum, and phase opposition product) combined with two procedures for non-parametric statistical testing (permutation, or a combination of z-score and permutation). While these are obviously not the only existing or conceivable measures, they have all been used in recent studies. All tested methods performed adequately on a previously published dataset (Busch et al., 2009). On a variety of artificially constructed datasets, no single measure was found to surpass all others, but instead the suitability of each measure was contingent on several experimental factors: the time, frequency, and depth of oscillatory phase modulation; the absolute and relative amplitudes of post-stimulus event-related potentials for the two trial groups; the absolute and relative trial numbers for the two groups; and the number of permutations used for non-parametric testing. The concurrent use of two phase opposition measures, the parametric Watson-Williams test and a non-parametric test based on summing inter-trial coherence values for the two trial groups, appears to provide the most satisfactory outcome in all situations tested. Matlab code is provided to automatically compute these phase opposition measures. PMID:27683543
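
    As an illustration, the phase opposition sum (POS) with a permutation test can be sketched as follows, using the inter-trial coherence (ITC) of each trial group; phase angles are assumed to be given in radians:

    ```python
    import numpy as np

    def itc(phases):
        """Inter-trial coherence: resultant length of unit phase vectors."""
        return np.abs(np.mean(np.exp(1j * phases)))

    def phase_opposition_sum(phases_a, phases_b, n_perm=10000, seed=0):
        """Permutation test for the phase opposition sum between two groups."""
        rng = np.random.default_rng(seed)
        all_phases = np.concatenate([phases_a, phases_b])
        n_a = len(phases_a)
        pos_obs = itc(phases_a) + itc(phases_b) - 2 * itc(all_phases)
        count = 0
        for _ in range(n_perm):
            perm = rng.permutation(all_phases)     # shuffle group labels
            pos = itc(perm[:n_a]) + itc(perm[n_a:]) - 2 * itc(all_phases)
            count += pos >= pos_obs
        return pos_obs, (count + 1) / (n_perm + 1)  # statistic, p-value
    ```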

  8. The extension of the parametrization of the radio source coordinates in geodetic VLBI and its impact on the time series analysis

    NASA Astrophysics Data System (ADS)

    Karbon, Maria; Heinkelmann, Robert; Mora-Diaz, Julian; Xu, Minghui; Nilsson, Tobias; Schuh, Harald

    2017-07-01

    The radio sources within the most recent celestial reference frame (CRF) catalog ICRF2 are represented by a single, time-invariant coordinate pair. The datum sources were chosen mainly according to certain statistical properties of their position time series. Yet, such statistics are not applicable unconditionally and are also ambiguous. However, ignoring systematics in the source positions of the datum sources inevitably leads to a degradation of the quality of the frame and, therefore, also of the derived quantities such as the Earth orientation parameters. One possible approach to overcome these deficiencies is to extend the parametrization of the source positions, similarly to what is done for the station positions. We decided to use the multivariate adaptive regression splines algorithm to parametrize the source coordinates. It allows a great deal of automation, by combining recursive partitioning and spline fitting in an optimal way. The algorithm finds the ideal knot positions for the splines and, thus, the best number of polynomial pieces to fit the data autonomously. With that we can correct the ICRF2 a priori coordinates for our analysis and eliminate the systematics in the position estimates. This also allows us to introduce special handling sources into the datum definition, leading to on average 30% more sources in the datum. We find that not only can the CPO be improved by more than 10% due to the improved geometry, but also the station positions, especially in the early years of VLBI, can benefit greatly.

  9. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    PubMed

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.

  10. A global goodness-of-fit test for receiver operating characteristic curve analysis via the bootstrap method.

    PubMed

    Zou, Kelly H; Resnic, Frederic S; Talos, Ion-Florin; Goldberg-Zimring, Daniel; Bhagwat, Jui G; Haker, Steven J; Kikinis, Ron; Jolesz, Ferenc A; Ohno-Machado, Lucila

    2005-10-01

    Medical classification accuracy studies often yield continuous data based on predictive models for treatment outcomes. A popular method for evaluating the performance of diagnostic tests is receiver operating characteristic (ROC) curve analysis. The main objective was to develop a global statistical hypothesis test for assessing the goodness-of-fit (GOF) of parametric ROC curves via the bootstrap. A simple log (or logit) transformation and a more flexible Box-Cox normality transformation were applied to data from two clinical studies: one predicting complications following percutaneous coronary interventions (PCIs) and one predicting image-guided neurosurgical resection results from tumor volume. We compared a non-parametric with a parametric binormal estimate of the underlying ROC curve. To construct such a GOF test, we used the non-parametric and parametric areas under the curve (AUCs) as the metrics, with a resulting p value reported. In the interventional cardiology example, logit and Box-Cox transformations of the predictive probabilities led to satisfactory AUCs (AUC=0.888; p=0.78, and AUC=0.888; p=0.73, respectively), while in the brain tumor resection example, log and Box-Cox transformations of the tumor size also led to satisfactory AUCs (AUC=0.898; p=0.61, and AUC=0.899; p=0.42, respectively). In contrast, significant departures from GOF were observed without applying any transformation prior to assuming a binormal model (AUC=0.766; p=0.004, and AUC=0.831; p=0.03, respectively). In both studies the p values suggested that transformations should be considered before applying any binormal model to estimate the AUC. Our analyses also demonstrated and confirmed the predictive values of different classifiers for determining interventional complications following PCIs and resection outcomes in image-guided neurosurgery.
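
    A hedged sketch of the underlying GOF idea follows, assuming synthetic scores and a simplified null-centered bootstrap; the authors' exact resampling scheme may differ.

```python
# Compare a non-parametric AUC (Mann-Whitney) with a parametric binormal AUC
# and bootstrap the difference to obtain a rough GOF p value.
import numpy as np
from scipy.stats import norm

def auc_nonparametric(neg, pos):
    # Mann-Whitney estimate of P(pos > neg), ties counted as 1/2
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

def auc_binormal(neg, pos):
    # Binormal model: AUC = Phi(dmu / sqrt(s0^2 + s1^2))
    return norm.cdf((pos.mean() - neg.mean()) /
                    np.hypot(neg.std(ddof=1), pos.std(ddof=1)))

rng = np.random.default_rng(1)
neg = rng.lognormal(0.0, 1.0, 100)          # scores, non-diseased
pos = rng.lognormal(1.0, 1.0, 100)          # scores, diseased
neg_t, pos_t = np.log(neg), np.log(pos)     # log transform before binormal fit

d_obs = auc_nonparametric(neg_t, pos_t) - auc_binormal(neg_t, pos_t)
d_boot = []
for _ in range(2000):
    nb = rng.choice(neg_t, neg_t.size, replace=True)
    pb = rng.choice(pos_t, pos_t.size, replace=True)
    d_boot.append(auc_nonparametric(nb, pb) - auc_binormal(nb, pb))
d_boot = np.asarray(d_boot) - np.mean(d_boot)   # crude centering under H0
p = np.mean(np.abs(d_boot) >= abs(d_obs))
print(f"AUC difference = {d_obs:.4f}, bootstrap p = {p:.3f}")
```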

  11. A Nonparametric Geostatistical Method For Estimating Species Importance

    Treesearch

    Andrew J. Lister; Rachel Riemann; Michael Hoppus

    2001-01-01

    Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non-normal distributions violate the assumptions of analyses in which test statistics are...

  12. Predicting critical transitions in dynamical systems from time series using nonstationary probability density modeling.

    PubMed

    Kwasniok, Frank

    2013-11-01

    A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.

  13. Study of aerodynamic technology for single-cruise engine V/STOL fighter/attack aircraft

    NASA Technical Reports Server (NTRS)

    Driggers, H. H.; Powers, S. A.; Roush, R. T.

    1982-01-01

    A conceptual design analysis is performed on a single-engine V/STOL supersonic fighter/attack concept powered by a series-flow tandem fan propulsion system. Forward- and aft-mounted fans have independent flow paths for V/STOL operation and series flow in high-speed flight. Mission, combat and V/STOL performance are calculated. Detailed aerodynamic estimates are made, and the aerodynamic uncertainties associated with the configuration and estimation methods are identified. A wind tunnel research program is developed to resolve the principal uncertainties and establish a data base for the baseline configuration and parametric variations.

  14. Tremor Detection Using Parametric and Non-Parametric Spectral Estimation Methods: A Comparison with Clinical Assessment

    PubMed Central

    Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H.; Maurits, Natasha M.

    2016-01-01

    In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a better insight into tremor. Typically, routine clinical assessment of accelerometry and electromyography data involves visual inspection by clinicians and occasionally computational analysis to obtain objective characteristics of tremor. However, for some tremor disorders these characteristics may be different during daily activity. This variability in presentation between the clinic and daily life makes a differential diagnosis more difficult. A long-term recording of tremor by accelerometry and/or electromyography in the home environment could help to give a better insight into the tremor disorder. However, an evaluation of such recordings using routine clinical standards would take too much time. We evaluated a range of techniques that automatically detect tremor segments in accelerometer data, as accelerometer data is more easily obtained in the home environment than electromyography data. Time can be saved if clinicians only have to evaluate the tremor characteristics of segments that have been automatically detected in longer daily activity recordings. We tested four non-parametric methods and five parametric methods on clinical accelerometer data from 14 patients with different tremor disorders. The consensus between two clinicians regarding the presence or absence of tremor on 3943 segments of accelerometer data was employed as reference. The nine methods were tested against this reference to identify their optimal parameters. Non-parametric methods generally performed better than parametric methods on our dataset when optimal parameters were used. However, one parametric method, employing the high frequency content of the tremor bandwidth under consideration (High Freq) performed similarly to non-parametric methods, but had the highest recall values, suggesting that this method could be employed for automatic tremor detection. PMID:27258018
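
    As an illustration of the parametric/non-parametric distinction in spectral estimation, the sketch below compares a Welch periodogram with a Yule-Walker autoregressive spectrum on a synthetic tremor-like signal; the nine methods evaluated in the study are not reproduced here.

```python
# Non-parametric (Welch) vs. parametric (AR, Yule-Walker) spectral estimates.
import numpy as np
from scipy import signal
from statsmodels.regression.linear_model import yule_walker

fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(2)
x = np.sin(2 * np.pi * 5.0 * t) + 0.5 * rng.standard_normal(t.size)  # 5 Hz "tremor"

# Non-parametric: Welch periodogram
f_w, pxx_w = signal.welch(x, fs=fs, nperseg=512)

# Parametric: AR(8) spectrum from Yule-Walker coefficients
rho, sigma = yule_walker(x, order=8)
f_ar, h = signal.freqz(b=[sigma], a=np.r_[1, -rho], fs=fs)
pxx_ar = np.abs(h) ** 2

print("Welch peak at %.2f Hz" % f_w[np.argmax(pxx_w)])
print("AR(8) peak at %.2f Hz" % f_ar[np.argmax(pxx_ar)])
```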

  15. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.

  16. Application of artificial neural network to fMRI regression analysis.

    PubMed

    Misaki, Masaya; Miyauchi, Satoru

    2006-01-15

    We used an artificial neural network (ANN) to detect correlations between event sequences and fMRI (functional magnetic resonance imaging) signals. The layered feed-forward neural network, given a series of events as inputs and the fMRI signal as a supervised signal, performed a non-linear regression analysis. This type of ANN is capable of approximating any continuous function, so this analysis method can detect any fMRI signal that correlates with the corresponding events. Because of the flexible nature of ANNs, overfitting to autocorrelated noise is a problem in fMRI analyses. We avoided this problem by using cross-validation and an early stopping procedure. The results showed that the ANN could detect various responses with different time courses. The simulation analysis also indicated an additional advantage of the ANN over non-parametric methods in detecting parametrically modulated responses, i.e., it can detect various types of parametric modulation without a priori assumptions. The ANN regression analysis is therefore beneficial for exploratory fMRI analyses in detecting continuous changes in responses modulated by changes in input values.
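
    A minimal sketch of the same idea using scikit-learn's MLPRegressor rather than the authors' network; the event sequence, the "hemodynamic" convolution kernel, and the network size are invented for illustration.

```python
# Feed-forward net regressing a noisy signal on an event sequence, with
# early stopping on a validation split to limit overfitting to noise.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n = 400
events = (rng.random((n, 1)) < 0.2).astype(float)                # event inputs
bold = np.convolve(events.ravel(), np.hanning(10), mode="same")  # "HRF"-like response
bold += 0.3 * rng.standard_normal(n)                             # measurement noise

net = MLPRegressor(hidden_layer_sizes=(16,), early_stopping=True,
                   validation_fraction=0.2, max_iter=2000, random_state=0)
net.fit(events, bold)
print("R^2 on training data:", net.score(events, bold))
```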

  17. Parametric instability analysis of truncated conical shells using the Haar wavelet method

    NASA Astrophysics Data System (ADS)

    Dai, Qiyi; Cao, Qingjie

    2018-05-01

    In this paper, the Haar wavelet method is employed to analyze the parametric instability of truncated conical shells under static and time-dependent periodic axial loads. The present work is based on the Love first-approximation theory for classical thin shells. The displacement field is expressed as a Haar wavelet series in the axial direction and trigonometric functions in the circumferential direction. The partial differential equations are thereby reduced to a system of coupled Mathieu-type ordinary differential equations describing the dynamic instability behavior of the shell. Using Bolotin's method, the first-order and second-order approximations of the principal instability regions are determined. The correctness of the present method is examined by comparing the results with those in the literature, and very good agreement is observed. The difference between the first-order and second-order approximations of the principal instability regions for tensile and compressive loads is also investigated. Finally, numerical results are presented to bring out the influence of various parameters, such as static load factors, boundary conditions and shell geometrical characteristics, on the domains of parametric instability of conical shells.

  18. Reference interval computation: which method (not) to choose?

    PubMed

    Pavlov, Igor Y; Wilson, Andrew R; Delgado, Julio C

    2012-07-11

    When different methods are applied to reference interval (RI) calculation, the results can sometimes be substantially different, especially for small reference groups. If no reliable RI data are available, there is no way to confirm which method generates results closest to the true RI. We randomly drew samples from a public database for 33 markers. For each sample, RIs were calculated by bootstrapping, parametric, and Box-Cox transformed parametric methods, and the results were compared to the values of the population RI. For approximately half of the 33 markers, the results of all 3 methods were within 3% of the true reference value. For the other markers, parametric results were either unavailable or deviated considerably from the true values. The transformed parametric method was more accurate than bootstrapping for a sample size of 60, very close to bootstrapping for a sample size of 120, but in some cases unavailable. We recommend against using untransformed parametric calculations to determine RIs. The transformed parametric method utilizing the Box-Cox transformation is the preferable way of calculating RIs, provided the transformed data satisfy a normality test. If not, bootstrapping is always available and is almost as accurate and precise as the transformed parametric method. Copyright © 2012 Elsevier B.V. All rights reserved.
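
    The three strategies compared in the study can be sketched in a few lines of Python; the synthetic lognormal "marker" and the sample size of 60 are assumptions for illustration.

```python
# Reference intervals by percentile bootstrap, naive parametric (normal),
# and Box-Cox transformed parametric methods.
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(4)
x = rng.lognormal(mean=1.0, sigma=0.4, size=60)   # skewed marker values

# 1) Percentile bootstrap of the 2.5th/97.5th percentiles
boot = np.array([np.percentile(rng.choice(x, x.size, replace=True), [2.5, 97.5])
                 for _ in range(2000)])
ri_boot = boot.mean(axis=0)

# 2) Naive parametric RI (assumes normality -- often inappropriate)
ri_param = x.mean() + np.array([-1.96, 1.96]) * x.std(ddof=1)

# 3) Box-Cox transformed parametric RI, back-transformed to original scale
xt, lam = stats.boxcox(x)
ri_bc = inv_boxcox(xt.mean() + np.array([-1.96, 1.96]) * xt.std(ddof=1), lam)

print("bootstrap RI:", ri_boot, " normal RI:", ri_param, " Box-Cox RI:", ri_bc)
```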

  19. A stepped-plate bi-frequency source for generating a difference frequency sound with a parametric array.

    PubMed

    Je, Yub; Lee, Haksue; Park, Jongkyu; Moon, Wonkyu

    2010-06-01

    An ultrasonic radiator is developed to generate a difference frequency sound from two frequencies of ultrasound in air with a parametric array. A design method is proposed for an ultrasonic radiator capable of generating highly directive, high-amplitude ultrasonic sound beams at two different frequencies in air based on a modification of the stepped-plate ultrasonic radiator. The stepped-plate ultrasonic radiator was introduced by Gallego-Juarez et al. [Ultrasonics 16, 267-271 (1978)] in their previous study and can effectively generate highly directive, large-amplitude ultrasonic sounds in air, but only at a single frequency. Because parametric array sources must be able to generate sounds at more than one frequency, a design modification is crucial to the application of a stepped-plate ultrasonic radiator as a parametric array source in air. The aforementioned method was employed to design a parametric radiator for use in air. A prototype of this design was constructed and tested to determine whether it could successfully generate a difference frequency sound with a parametric array. The results confirmed that the proposed single small-area transducer was suitable as a parametric radiator in air.

  20. Direct adaptive robust tracking control for 6 DOF industrial robot with enhanced accuracy.

    PubMed

    Yin, Xiuxing; Pan, Li

    2018-01-01

    A direct adaptive robust tracking control is proposed for trajectory tracking of a 6 DOF industrial robot in the presence of parametric uncertainties, external disturbances and uncertain nonlinearities. The controller is designed based on the dynamic characteristics in the working space of the end-effector of the 6 DOF robot. The controller includes a robust control term and a model compensation term that is developed directly from the input reference or desired motion trajectory. A projection-type parametric adaptation law is also designed to compensate for parametric estimation errors in the adaptive robust control. The feasibility and effectiveness of the proposed direct adaptive robust control law and the associated projection-type parametric adaptation law have been comparatively evaluated on two 6 DOF industrial robots. The test results demonstrate that the proposed control maintains the desired trajectory tracking better than a PD controller and a nonlinear controller, even in the presence of large parametric uncertainties and external disturbances. The parametric estimates also eventually converge to the real values along with the convergence of the tracking errors, which further validates the effectiveness of the proposed parametric adaptation law. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Robust non-parametric one-sample tests for the analysis of recurrent events.

    PubMed

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.
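
    A drastically simplified, unweighted version of the observed-versus-expected idea is sketched below (Poisson standardization only); the published tests use weighted mean-function distances and robust variance estimates, and the data here are invented.

```python
# One-sample test: observed vs. expected recurrent-event counts under a
# specified reference rate, standardized on a Poisson scale.
import numpy as np
from scipy.stats import norm

events = np.array([3, 0, 2, 5, 1, 4, 2])                   # events per subject
followup = np.array([2.0, 1.5, 2.0, 3.0, 1.0, 2.5, 2.0])   # years at risk
ref_rate = 0.8                                             # reference events/year

observed = events.sum()
expected = ref_rate * followup.sum()
z = (observed - expected) / np.sqrt(expected)              # Poisson standardization
p = 2 * norm.sf(abs(z))
print(f"O = {observed}, E = {expected:.1f}, z = {z:.2f}, p = {p:.3f}")
```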

  2. Two-Sample Statistics for Testing the Equality of Survival Functions Against Improper Semi-parametric Accelerated Failure Time Alternatives: An Application to the Analysis of a Breast Cancer Clinical Trial

    PubMed Central

    Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry

    2010-01-01

    This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests. PMID:15293627

  3. Resonant dampers for parametric instabilities in gravitational wave detectors

    NASA Astrophysics Data System (ADS)

    Gras, S.; Fritschel, P.; Barsotti, L.; Evans, M.

    2015-10-01

    Advanced gravitational wave interferometric detectors will operate at their design sensitivity with nearly 1 MW of laser power stored in the arm cavities. Such large power may lead to the uncontrolled growth of acoustic modes in the test masses due to the transfer of optical energy to the mechanical modes of the arm cavity mirrors. These parametric instabilities have the potential to significantly compromise detector performance and control. Here we present the design of "acoustic mode dampers" that use the piezoelectric effect to reduce the coupling of optical to mechanical energy. Experimental measurements carried out on an Advanced LIGO-like test mass have shown a tenfold reduction in the amplitude of several mechanical modes, suggesting that this technique can greatly mitigate the impact of parametric instabilities in advanced detectors.

  4. Test of the Chevallier-Polarski-Linder parametrization for rapid dark energy equation of state transitions

    NASA Astrophysics Data System (ADS)

    Linden, Sebastian; Virey, Jean-Marc

    2008-07-01

    We test the robustness and flexibility of the Chevallier-Polarski-Linder (CPL) parametrization of the dark energy equation of state, w(z) = w0 + wa z/(1+z), in recovering a four-parameter steplike fiducial model. We constrain the parameter space region of the underlying fiducial model where the CPL parametrization offers a reliable reconstruction. It turns out that non-negligible biases leak into the results for recent (z < 2.5) rapid transitions, but that CPL yields a good reconstruction in all other cases. The presented analysis is performed with supernova Ia data as forecasted for a space mission like SNAP/JDEM, combined with future expectations for the cosmic microwave background shift parameter R and the baryonic acoustic oscillation parameter A.
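
    The flexibility question can be probed with a toy curve fit that ignores the supernova likelihood entirely: fit the CPL form to a smooth step in w(z) and inspect the residual bias. The step parameters below are invented for illustration.

```python
# Least-squares fit of the CPL form to a step-like fiducial equation of state.
import numpy as np
from scipy.optimize import curve_fit

def w_cpl(z, w0, wa):
    return w0 + wa * z / (1.0 + z)

def w_step(z, wi=-0.5, wf=-1.0, zt=1.0, dz=0.2):
    # smooth step from wf (today) to wi (early times) around transition zt
    return wf + 0.5 * (wi - wf) * (1.0 + np.tanh((z - zt) / dz))

z = np.linspace(0.0, 3.0, 300)
(w0, wa), _ = curve_fit(w_cpl, z, w_step(z), p0=(-1.0, 0.5))
bias = np.max(np.abs(w_cpl(z, w0, wa) - w_step(z)))
print(f"best CPL: w0 = {w0:.3f}, wa = {wa:.3f}, max |bias| = {bias:.3f}")
```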

  5. Low noise parametric amplifiers for radio astronomy observations at 18-21 cm wavelength

    NASA Technical Reports Server (NTRS)

    Kanevskiy, B. Z.; Veselov, V. M.; Strukov, I. A.; Etkin, V. S.

    1974-01-01

    The principal characteristics and use of SHF parametric amplifiers as radiometer input devices are explored. Balanced parametric amplifiers (BPA) are considered as the SHF signal amplifiers, allowing production of the amplifier circuit without a special filter to achieve decoupling. Formulas to calculate the basic parameters of a BPA are given. A modulator based on coaxial lines is discussed as the input element of the SHF. Results of laboratory tests of the receiver section and long-term stability studies of the SHF sector are presented.

  6. Ku band low noise parametric amplifier

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A low-noise Ku-band parametric amplifier (paramp) was developed. The unit is a spacecraft-qualifiable, prototype parametric amplifier for eventual application in the shuttle orbiter. The amplifier was required to have a noise temperature of less than 150 K. A noise temperature of less than 120 K at a gain level of 17 dB was achieved. A 3-dB bandwidth in excess of 350 MHz was attained, while deviation from phase linearity of about ±1 degree over 50 MHz was achieved. The paramp operates within specification over an ambient temperature range of -5 C to +50 C. The performance requirements and the operation of the Ku-band parametric amplifier system are described. The final test results are also given.

  7. Pechukas-Yukawa approach to the evolution of the quantum state of a parametrically perturbed system

    NASA Astrophysics Data System (ADS)

    Qureshi, Mumnuna A.; Zhong, Johnny; Qureshi, Zihad; Mason, Peter; Betouras, Joseph J.; Zagoskin, Alexandre M.

    2018-03-01

    We consider the evolution of the quantum states of a Hamiltonian that is parametrically perturbed via a term proportional to the adiabatic parameter λ(t). Starting with the Pechukas-Yukawa mapping of the energy eigenvalue evolution in a generalized Calogero-Sutherland model of a one-dimensional classical gas, we consider the adiabatic approximation with two different expansions of the quantum state in powers of dλ/dt and compare them with a direct numerical simulation. We show that one of these expansions (the Magnus series) is especially convenient for the description of nonadiabatic evolution of the system. Applying the expansion to the exact cover 3-satisfiability problem, we obtain the occupation dynamics, which provides insight into the population of states and sources of decoherence in a quantum system.

  8. A Nonparametric Approach to Estimate Classification Accuracy and Consistency

    ERIC Educational Resources Information Center

    Lathrop, Quinn N.; Cheng, Ying

    2014-01-01

    When cut scores for classifications occur on the total score scale, popular methods for estimating classification accuracy (CA) and classification consistency (CC) require assumptions about a parametric form of the test scores or about a parametric response model, such as item response theory (IRT). This article develops an approach to estimate CA…

  9. Nonlinear parametric model for Granger causality of time series

    NASA Astrophysics Data System (ADS)

    Marinazzo, Daniele; Pellicoro, Mario; Stramaglia, Sebastiano

    2006-06-01

    The notion of Granger causality between two time series examines if the prediction of one series could be improved by incorporating information of the other. In particular, if the prediction error of the first time series is reduced by including measurements from the second time series, then the second time series is said to have a causal influence on the first one. We propose a radial basis function approach to nonlinear Granger causality. The proposed model is not constrained to be additive in variables from the two time series and can approximate any function of these variables, still being suitable to evaluate causality. Usefulness of this measure of causality is shown in two applications. In the first application, a physiological one, we consider time series of heart rate and blood pressure in congestive heart failure patients and patients affected by sepsis: we find that sepsis patients, unlike congestive heart failure patients, show symmetric causal relationships between the two time series. In the second application, we consider the feedback loop in a model of excitatory and inhibitory neurons: we find that in this system causality measures the combined influence of couplings and membrane time constants.
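
    For contrast with the paper's radial basis function model, here is the standard linear Granger baseline it generalizes: lagged regressions with and without the candidate driver series, on synthetic coupled data.

```python
# Linear Granger causality: does adding lagged y reduce the error in x?
import numpy as np

def lagged_design(series_list, p):
    """Stack p lags of each series into a design matrix (plus intercept)."""
    n = len(series_list[0])
    cols = [np.ones(n - p)]
    for s in series_list:
        for k in range(1, p + 1):
            cols.append(s[p - k:n - k])
    return np.column_stack(cols)

def resid_var(X, y):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ beta
    return np.mean(r ** 2)

rng = np.random.default_rng(5)
n, p = 1000, 2
y = rng.standard_normal(n)              # driver series
x = np.zeros(n)
for t in range(1, n):                   # x depends on its own past and on y
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + 0.1 * rng.standard_normal()

target = x[p:]
v_restricted = resid_var(lagged_design([x], p), target)
v_full = resid_var(lagged_design([x, y], p), target)
print("variance reduction from y:", 1 - v_full / v_restricted)  # > 0 => causal
```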

  10. Parametric methods outperformed non-parametric methods in comparisons of discrete numerical variables.

    PubMed

    Fagerland, Morten W; Sandvik, Leiv; Mowinckel, Petter

    2011-04-13

    The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for analysis and presentation of results for discrete numerical variables. Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that have both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
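
    The recommended analysis takes only a few lines: the Welch test is scipy's ttest_ind with equal_var=False, and the matching confidence interval uses the Welch-Satterthwaite degrees of freedom. Data are synthetic.

```python
# Welch's t-test and its confidence interval for two discrete numerical samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
g1 = rng.integers(0, 4, size=40)    # outcomes in {0,1,2,3}
g2 = rng.integers(0, 5, size=40)    # outcomes in {0,...,4}

t, p = stats.ttest_ind(g1, g2, equal_var=False)   # Welch U test

# Welch confidence interval for the difference between means
d = g1.mean() - g2.mean()
v1, v2 = g1.var(ddof=1) / g1.size, g2.var(ddof=1) / g2.size
se = np.sqrt(v1 + v2)
df = se**4 / (v1**2 / (g1.size - 1) + v2**2 / (g2.size - 1))  # Welch-Satterthwaite
ci = d + np.array([-1, 1]) * stats.t.ppf(0.975, df) * se
print(f"diff = {d:.2f}, 95% CI = {ci}, p = {p:.3f}")
```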

  11. Product assurance technology for procuring reliable, radiation-hard, custom LSI/VLSI electronics

    NASA Technical Reports Server (NTRS)

    Buehler, M. G.; Allen, R. A.; Blaes, B. R.; Hicks, K. A.; Jennings, G. A.; Lin, Y.-S.; Pina, C. A.; Sayah, H. R.; Zamani, N.

    1989-01-01

    Advanced measurement methods using microelectronic test chips are described. These chips are intended to be used in acquiring the data needed to qualify Application Specific Integrated Circuits (ASIC's) for space use. Efforts were focused on developing the technology for obtaining custom IC's from CMOS/bulk silicon foundries. A series of test chips were developed: a parametric test strip, a fault chip, a set of reliability chips, and the CRRES (Combined Release and Radiation Effects Satellite) chip, a test circuit for monitoring space radiation effects. The technical accomplishments of the effort include: (1) development of a fault chip that contains a set of test structures used to evaluate the density of various process-induced defects; (2) development of new test structures and testing techniques for measuring gate-oxide capacitance, gate-overlap capacitance, and propagation delay; (3) development of a set of reliability chips that are used to evaluate failure mechanisms in CMOS/bulk: interconnect and contact electromigration and time-dependent dielectric breakdown; (4) development of MOSFET parameter extraction procedures for evaluating subthreshold characteristics; (5) evaluation of test chips and test strips on the second CRRES wafer run; (6) two dedicated fabrication runs for the CRRES chip flight parts; and (7) publication of two papers: one on the split-cross bridge resistor and another on asymmetrical SRAM (static random access memory) cells for single-event upset analysis.

  12. Efficient nonparametric n-body force fields from machine learning

    NASA Astrophysics Data System (ADS)

    Glielmo, Aldo; Zeni, Claudio; De Vita, Alessandro

    2018-05-01

    We provide a definition and explicit expressions for n-body Gaussian process (GP) kernels, which can learn any interatomic interaction occurring in a physical system, up to n-body contributions, for any value of n. The series is complete, as it can be shown that the "universal approximator" squared exponential kernel can be written as a sum of n-body kernels. These recipes enable the choice of optimally efficient force models for each target system, as confirmed by extensive testing on various materials. We furthermore describe how the n-body kernels can be "mapped" on equivalent representations that provide database-size-independent predictions and are thus crucially more efficient. We explicitly carry out this mapping procedure for the first nontrivial (three-body) kernel of the series, and we show that this reproduces the GP-predicted forces with meV/Å accuracy while being orders of magnitude faster. These results pave the way to using novel force models (here named "M-FFs") that are computationally as fast as their corresponding standard parametrized n-body force fields, while retaining the nonparametric character, the ease of training and validation, and the accuracy of the best recently proposed machine-learning potentials.

  13. Semiparametric modeling: Correcting low-dimensional model error in parametric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Tyrus, E-mail: thb11@psu.edu; Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, 503 Walker Building, University Park, PA 16802-5013

    2016-03-01

    In this paper, a semiparametric modeling approach is introduced as a paradigm for addressing model error arising from unresolved physical phenomena. Our approach compensates for model error by learning an auxiliary dynamical model for the unknown parameters. Practically, the proposed approach consists of the following steps. Given a physics-based model and a noisy data set of historical observations, a Bayesian filtering algorithm is used to extract a time series of the parameter values. Subsequently, the diffusion forecast algorithm is applied to the retrieved time series in order to construct the auxiliary model for the time-evolving parameters. The semiparametric forecasting algorithm consists of integrating the existing physics-based model with an ensemble of parameters sampled from the probability density function of the diffusion forecast. To specify initial conditions for the diffusion forecast, a Bayesian semiparametric filtering method that extends the Kalman-based filtering framework is introduced. In difficult test examples, which introduce chaotically and stochastically evolving hidden parameters into the Lorenz-96 model, we show that our approach can effectively compensate for model error, with forecasting skill comparable to that of the perfect model.

  14. Spatio-temporal analysis of recent groundwater-level trends in the Red River Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Bui, Duong Du; Kawamura, Akira; Tong, Thanh Ngoc; Amaguchi, Hideo; Nakagawa, Naoko

    2012-12-01

    A groundwater-monitoring network has been in operation in the Red River Delta, Vietnam, since 1995. Trends in groundwater level (1995-2009) in 57 wells in the Holocene unconfined aquifer and 63 wells in the Pleistocene confined aquifer were determined by applying the non-parametric Mann-Kendall trend test and Sen's slope estimator. At each well, 17 time series (e.g. annual, seasonal, monthly), computed from the original data, were analyzed. Analysis of the annual groundwater-level means revealed that 35 % of the wells in the unconfined aquifer showed downward trends, while about 21 % showed upward trends. On the other hand, confined-aquifer groundwater levels experienced downward trends in almost all locations. Spatial distributions of trends indicated that the strongly declining trends (>0.3 m/year) were mainly found in urban areas around Hanoi where there is intensive abstraction of groundwater. Although the trend results for most of the 17 time series at a given well were quite similar, different trend patterns were detected in several. The findings reflect unsustainable groundwater development and the importance of maintaining groundwater monitoring and a database in the Delta, particularly in urban areas.
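
    A self-contained sketch of the two tools named above, the Mann-Kendall test (no-ties variance, no autocorrelation correction) and Sen's slope, applied to a synthetic declining groundwater-level series:

```python
# Mann-Kendall trend test and Sen's slope estimator.
import numpy as np
from scipy.stats import norm

def mann_kendall(y):
    n = len(y)
    s = sum(np.sign(y[j] - y[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0           # variance assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity-corrected
    return z, 2 * norm.sf(abs(z))

def sens_slope(t, y):
    slopes = [(y[j] - y[i]) / (t[j] - t[i])
              for i in range(len(y) - 1) for j in range(i + 1, len(y))]
    return np.median(slopes)

t = np.arange(1995, 2010)                              # annual means, 1995-2009
rng = np.random.default_rng(7)
y = -0.35 * (t - t[0]) + rng.normal(0, 0.5, t.size)    # declining level, m

z, p = mann_kendall(y)
print(f"MK z = {z:.2f}, p = {p:.4f}, Sen's slope = {sens_slope(t, y):.2f} m/yr")
```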

  15. Two-sample tests and one-way MANOVA for multivariate biomarker data with nondetects.

    PubMed

    Thulin, M

    2016-09-10

    Testing whether the mean vector of a multivariate set of biomarkers differs between several populations is an increasingly common problem in medical research. Biomarker data is often left censored because some measurements fall below the laboratory's detection limit. We investigate how such censoring affects multivariate two-sample and one-way multivariate analysis of variance tests. Type I error rates, power and robustness to increasing censoring are studied, under both normality and non-normality. Parametric tests are found to perform better than non-parametric alternatives, indicating that the current recommendations for analysis of censored multivariate data may have to be revised. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Stochastic Hourly Weather Generator HOWGH: Validation and its Use in Pest Modelling under Present and Future Climates

    NASA Astrophysics Data System (ADS)

    Dubrovsky, M.; Hirschi, M.; Spirig, C.

    2014-12-01

    To quantify the impact of climate change on a specific pest (or any weather-dependent process) at a specific site, we may use a site-calibrated pest (or other) model and compare its outputs obtained with site-specific weather data representing the present vs. perturbed climates. The input weather data may be produced by a stochastic weather generator. Apart from the quality of the pest model, the reliability of the results obtained in such an experiment depends on the ability of the generator to represent the statistical structure of real-world weather series, and on the sensitivity of the pest model to possible imperfections of the generator. This contribution deals with the multivariate HOWGH weather generator, which is based on a combination of parametric and non-parametric statistical methods. Here, HOWGH is used to generate synthetic hourly series of three weather variables (solar radiation, temperature and precipitation) required by the dynamic pest model SOPRA to simulate the development of codling moth. The contribution presents results of the direct and indirect validation of HOWGH. In the direct validation, the synthetic series generated by HOWGH (under various settings of its underlying model) are validated in terms of multiple climatic characteristics, focusing on the subdaily wet/dry and hot/cold spells. In the indirect validation, we assess the generator in terms of characteristics derived from the outputs of the SOPRA model fed by the observed vs. synthetic series. The weather generator may be used to produce weather series representing present and future climates. In the latter case, the parameters of the generator may be modified by climate change scenarios based on Global or Regional Climate Models. To demonstrate this feature, results of codling moth simulations for a future climate will be shown. Acknowledgements: The weather generator is developed and validated within the frame of the projects WG4VALUE (project LD12029 sponsored by the Ministry of Education, Youth and Sports of CR) and VALUE (COST ES 1102 action).

  17. Empirical validation of statistical parametric mapping for group imaging of fast neural activity using electrical impedance tomography.

    PubMed

    Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D

    2016-06-01

    Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is a viable approach for EIT images of neural activity.
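
    As a sketch of the non-parametric side of such a validation, the following implements a sign-flipping permutation test with a maximum-statistic familywise correction on a synthetic subjects-by-voxels matrix; it is not the authors' pipeline.

```python
# Sign-flipping permutation test with max-statistic FWE correction.
import numpy as np

rng = np.random.default_rng(8)
n_subj, n_vox = 8, 5000
images = rng.standard_normal((n_subj, n_vox))
images[:, :50] += 1.5                                 # planted "activation"

def tmap(d):
    return d.mean(0) / (d.std(0, ddof=1) / np.sqrt(d.shape[0]))

t_obs = tmap(images)
# null distribution of the maximum |t| over voxels under random sign flips
max_null = np.array([np.max(np.abs(tmap(images * rng.choice([-1, 1],
                     size=(n_subj, 1))))) for _ in range(1000)])
p_corr = (np.abs(t_obs)[:, None] <= max_null[None, :]).mean(1)  # corrected p
print("voxels significant at corrected p < 0.05:", np.sum(p_corr < 0.05))
```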

  18. Empirical validation of statistical parametric mapping for group imaging of fast neural activity using electrical impedance tomography

    PubMed Central

    Packham, B; Barnes, G; dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D

    2016-01-01

    Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is a viable approach for EIT images of neural activity. PMID:27203477

  19. Plasma-enhanced mixing and flameholding in supersonic flow.

    PubMed

    Firsov, Alexander; Savelkin, Konstantin V; Yarantsev, Dmitry A; Leonov, Sergey B

    2015-08-13

    The results of an experimental study of plasma-based mixing, ignition and flameholding in a supersonic model combustor are presented in the paper. The model combustor has a length of 600 mm and a cross section of 72 mm width and 60 mm height. The fuel is directly injected into the supersonic airflow (Mach number M=2, static pressure P(st)=160-250 Torr) through wall orifices. Two series of tests focus on flameholding and mixing, respectively. In the first series, a near-surface quasi-DC electrical discharge is generated by flush-mounted electrodes at an electrical power deposition of W(pl)=3-24 kW. The scope includes a parametric study of ignition and flame front dynamics, and a comparison of three schemes of plasma generation: the first and second layouts place the plasma generators upstream and downstream of the fuel injectors, respectively, while the third follows a novel combined mixing/ignition approach in which the electrical discharge is distributed along the fuel jet. The last pattern demonstrates a significant advantage in terms of the flameholding limit. In the second series of tests, a long discharge of submicrosecond duration is generated across the flow and along the fuel jet. A gasdynamic instability of the thermal cavity that develops after the deposition of a high power density in a thin plasma filament promotes air-fuel mixing. The technique studied in this work has considerable potential for high-speed combustion applications, including cold start/restart of scramjet engines and support of the transition regime in dual-mode scramjets and at off-design operation. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  20. CLUSTERnGO: a user-defined modelling platform for two-stage clustering of time-series data.

    PubMed

    Fidaner, Işık Barış; Cankorur-Cetinkaya, Ayca; Dikicioglu, Duygu; Kirdar, Betul; Cemgil, Ali Taylan; Oliver, Stephen G

    2016-02-01

    Simple bioinformatic tools are frequently used to analyse time-series datasets regardless of their ability to deal with transient phenomena, limiting the meaningful information that may be extracted from them. This situation requires the development and exploitation of tailor-made, easy-to-use and flexible tools designed specifically for the analysis of time-series datasets. We present a novel statistical application called CLUSTERnGO, which uses a model-based clustering algorithm that fulfils this need. This algorithm involves two components of operation. Component 1 constructs a Bayesian non-parametric model (Infinite Mixture of Piecewise Linear Sequences) and Component 2, which applies a novel clustering methodology (Two-Stage Clustering). The software can also assign biological meaning to the identified clusters using an appropriate ontology. It applies multiple hypothesis testing to report the significance of these enrichments. The algorithm has a four-phase pipeline. The application can be executed using either command-line tools or a user-friendly Graphical User Interface. The latter has been developed to address the needs of both specialist and non-specialist users. We use three diverse test cases to demonstrate the flexibility of the proposed strategy. In all cases, CLUSTERnGO not only outperformed existing algorithms in assigning unique GO term enrichments to the identified clusters, but also revealed novel insights regarding the biological systems examined, which were not uncovered in the original publications. The C++ and QT source codes, the GUI applications for Windows, OS X and Linux operating systems and user manual are freely available for download under the GNU GPL v3 license at http://www.cmpe.boun.edu.tr/content/CnG. sgo24@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  1. A study of cathode erosion in high power arcjets

    NASA Astrophysics Data System (ADS)

    Harris, William Jackson, III

    Cathode erosion continues to be one of the predominant technology concerns for high power arcjets. This study shows that cathode erosion in these devices is significantly affected by several factors, including propellant composition, propellant flowrate, current level, cathode material, and power supply current ripple. In a series of 50-hour and 100-hour long-duration experiments, using a water-cooled 30 kilowatt laboratory arcjet, variations in the steady-state cathode erosion rate were characterized for each of these factors using nitrogen propellant at a fixed arc current of 250 Amperes. A complementary series of measurements was made using hydrogen propellant at an arc current of 100 Amperes. The cold cathode erosion rate was also differentiated from the steady-state cathode erosion rate in a series of multi-start cathode erosion experiments. Results of these measurements are presented, along with an analysis of the significant effects of current ripple on arcjet cathode erosion. As part of this study, over a dozen refractory cathode materials were evaluated to measure their resistance to arcjet cathode erosion. Among the materials tested were W-ThO2 (1%, 2%, 4%), poly- and mono-crystalline W, W-LaB6, W-La2O3, W-BaO2, W-BaCaAl2O4, W-Y2O3, and ZrB2. Based on these measurements, several critical material properties were identified, such as work function, density, porosity, melting point, and evaporation rate. While the majority of the materials failed to outperform traditional W-ThO2, these experimental results are used to develop a parametric model of the arcjet cathode physics. The results of this model, and the results of a finite-element thermal analysis of the arcjet cathode, are presented to better explain the relative performance of the materials tested.

  2. Air Leakage and Air Transfer Between Garage and Living Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudd, Armin

    2014-09-01

    This research project focused on evaluation of air transfer between the garage and living space in a single-family detached home constructed by a production homebuilder in compliance with the 2009 International Residential Code and the 2009 International Energy Conservation Code. The project gathered important information about the performance of whole-building ventilation systems and garage ventilation systems as they relate to minimizing flow of contaminated air from garage to living space. A series of 25 multi-point fan pressurization tests and additional zone pressure diagnostic testing characterized the garage and house air leakage, the garage-to-house air leakage, and garage and house pressure relationships to each other and to outdoors using automated fan pressurization and pressure monitoring techniques. While the relative characteristics of this house may not represent the entire population of new construction configurations and air tightness levels (house and garage) throughout the country, the technical approach was conservative and should reasonably extend the usefulness of the results to a large spectrum of house configurations from this set of parametric tests in this one house. Based on the results of this testing, the two-step garage-to-house air leakage test protocol described above is recommended where whole-house exhaust ventilation is employed.

  3. Advanced theoretical and experimental studies in automatic control and information systems. [including mathematical programming and game theory

    NASA Technical Reports Server (NTRS)

    Desoer, C. A.; Polak, E.; Zadeh, L. A.

    1974-01-01

    A series of research projects is briefly summarized which includes investigations in the following areas: (1) mathematical programming problems for large system and infinite-dimensional spaces, (2) bounded-input bounded-output stability, (3) non-parametric approximations, and (4) differential games. A list of reports and papers which were published over the ten year period of research is included.

  4. ENERGY COSTS OF IAQ CONTROL THROUGH INCREASED VENTILATION IN A SMALL OFFICE IN A WARM, HUMID CLIMATE: PARAMETRIC ANALYSIS USING THE DOE-2 COMPUTER MODEL

    EPA Science Inventory

    The report gives results of a series of computer runs using the DOE-2.1E building energy model, simulating a small office in a hot, humid climate (Miami). These simulations assessed the energy and relative humidity (RH) penalties when the outdoor air (OA) ventilation rate is inc...

  5. Modelling fourier regression for time series data- a case study: modelling inflation in foods sector in Indonesia

    NASA Astrophysics Data System (ADS)

    Prahutama, Alan; Suparti; Wahyu Utami, Tiani

    2018-03-01

    Regression analysis models the relationship between response variables and predictor variables. The parametric approach to regression imposes strict assumptions on the model, whereas the nonparametric regression model does not require such assumptions. Time series data are observations of a variable recorded over time, so to model a time series by regression we must first determine the response and predictor variables. The response variable in a time series is the value at time t (yt), while the predictor variables are the significant lags. In nonparametric regression modeling, one developing approach is the Fourier series approach. One of its advantages is the ability to handle data exhibiting a trigonometric (periodic) pattern. Modeling with a Fourier series requires the parameter K, and the number of K can be determined with the Generalized Cross Validation method. In inflation modeling for the transportation, communication and financial services sector, the Fourier series yields an optimal K of 120 parameters with an R-square of 99%, whereas multiple linear regression yields an R-square of 90%.
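
    A minimal sketch of Fourier-series regression with K selected by generalized cross validation, run on a synthetic monthly series rather than the inflation data:

```python
# Fourier-series regression; K chosen by GCV = n*RSS / (n - p)^2.
import numpy as np

def fourier_design(t, K, period):
    cols = [np.ones_like(t), t]                  # intercept + linear trend
    for k in range(1, K + 1):
        cols += [np.cos(2 * np.pi * k * t / period),
                 np.sin(2 * np.pi * k * t / period)]
    return np.column_stack(cols)

def gcv(X, y):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    rss = np.sum((y - X @ beta) ** 2)
    n, p = X.shape
    return n * rss / (n - p) ** 2

rng = np.random.default_rng(9)
t = np.arange(120.0)                             # 10 years of monthly data
y = 0.02 * t + np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(t.size)

scores = {K: gcv(fourier_design(t, K, 12.0), y) for K in range(1, 11)}
print("GCV-optimal K:", min(scores, key=scores.get))
```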

  6. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which makes it possible to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
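
    One concrete series-to-network mapping in this family is the horizontal visibility graph; a single-layer sketch follows (one such layer per component of a multivariate series yields a multiplex network, though the paper's exact construction may differ in detail).

```python
# Horizontal visibility graph: a brute-force O(n^2) construction.
import numpy as np

def horizontal_visibility_edges(x):
    """Nodes i,j are linked if x[k] < min(x[i], x[j]) for all k between them."""
    edges = []
    n = len(x)
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                edges.append((i, j))
    return edges

rng = np.random.default_rng(10)
x = rng.random(50)                              # one component of the series
edges = horizontal_visibility_edges(x)
degree = np.bincount(np.ravel(edges), minlength=len(x))
print(f"{len(edges)} edges; mean degree {degree.mean():.2f}")
```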

  7. The geometry of distributional preferences and a non-parametric identification approach: The Equality Equivalence Test.

    PubMed

    Kerschbamer, Rudolf

    2015-05-01

    This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.

  8. Investigation on the coloured noise in GPS-derived position with time-varying seasonal signals

    NASA Astrophysics Data System (ADS)

    Gruszczynska, Marta; Klos, Anna; Bos, Machiel Simon; Bogusz, Janusz

    2016-04-01

    The seasonal signals in GPS-derived time series arise from real geophysical signals related to tidal (residual) or non-tidal effects (loadings from atmosphere, ocean and continental hydrosphere, thermoelastic strain, etc.) and from numerical artefacts, including aliasing from mismodelling at short periods or the repeatability of the GPS satellite constellation with respect to the Sun (draconitics). Singular Spectrum Analysis (SSA) is a method for investigating nonlinear dynamics, suitable for either stationary or non-stationary data series without prior knowledge of their character. The aim of SSA is to mathematically decompose the original time series into a sum of a slowly varying trend, seasonal oscillations and noise. In this presentation we explore the ability of SSA to subtract the time-varying seasonal signals in GPS-derived North-East-Up topocentric components and show the properties of the coloured noise in the residuals. For this purpose we used data from globally distributed IGS (International GNSS Service) permanent stations processed by the JPL (Jet Propulsion Laboratory) in PPP (Precise Point Positioning) mode. After introducing a minimum-length threshold of 13 years, 264 stations remained, with a maximum length reaching 23 years. The data were initially pre-processed for outliers, offsets and gaps. SSA was applied to the pre-processed series to estimate the time-varying seasonal signals. We adopted a 3-year window as the optimal size, determined from the Akaike Information Criterion (AIC) values. A Fisher-Snedecor test corrected for the presence of temporal correlation was used to determine the statistical significance of the reconstructed components. This procedure showed that the first four components, describing the annual and semi-annual signals, are significant at the 99.7% confidence level, which corresponds to the 3-sigma criterion. We compared the non-parametric SSA approach with the commonly chosen parametric Least-Squares Estimation (LSE), which assumes constant amplitudes and phases over time. We noticed a maximum difference in the seasonal oscillation of 3.5 mm and a maximum change in velocity of 0.15 mm/year for the Up component (YELL, Yellowknife, Canada) when SSA and LSE are compared. The annual signal has the greatest influence on data variability in the time series, while the semi-annual signal in the Up component contributes much less to the total variance; for some stations more than 35% of the total variance is explained by the annual signal. From the Power Spectral Densities (PSD) we showed that SSA properly subtracts seasonal signals that change in time, with almost no influence on the power-law character of the stochastic part. Then, the modified Maximum Likelihood Estimation (MLE) in the Hector software was applied to the SSA-filtered time series. We noticed a significant improvement in the spectral indices and power-law amplitudes in comparison to those determined classically with LSE, which will be the main subject of this presentation.
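
    A bare-bones SSA sketch (embedding, SVD, anti-diagonal averaging) on a synthetic series with a time-varying annual term; the window length and component choice below are illustrative, not the 3-year/AIC selection of the study.

```python
# Minimal Singular Spectrum Analysis reconstruction.
import numpy as np

def ssa_reconstruct(y, L, components):
    n = len(y)
    K = n - L + 1
    X = np.column_stack([y[i:i + L] for i in range(K)])   # trajectory matrix (L x K)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = sum(s[c] * np.outer(U[:, c], Vt[c]) for c in components)
    # Hankelization: average over anti-diagonals to get a series back
    rec, counts = np.zeros(n), np.zeros(n)
    for k in range(K):
        rec[k:k + L] += Xr[:, k]
        counts[k:k + L] += 1
    return rec / counts

t = np.arange(0, 23 * 365.25, 7.0)                        # ~23 yr, weekly sampling
y = 3.5 * np.sin(2 * np.pi * t / 365.25) * (1 + 0.3 * t / t[-1])  # varying annual, mm
y += np.random.default_rng(11).normal(0, 1.0, t.size)     # noise, mm
seasonal = ssa_reconstruct(y, L=156, components=[0, 1])   # ~3-yr window (156 weeks)
print("reconstructed seasonal RMS: %.2f mm" % seasonal.std())
```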

  9. SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit

    PubMed Central

    Chu, Annie; Cui, Jenny; Dinov, Ivo D.

    2011-01-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models. PMID:21546994

  10. Inferring time derivatives including cell growth rates using Gaussian processes

    NASA Astrophysics Data System (ADS)

    Swain, Peter S.; Stevenson, Keiran; Leary, Allen; Montano-Gutierrez, Luis F.; Clark, Ivan B. N.; Vogel, Jackie; Pilizota, Teuta

    2016-12-01

    Often the time derivative of a measured variable is of as much interest as the variable itself. For a growing population of biological cells, for example, the population's growth rate is typically more important than its size. Here we introduce a non-parametric method to infer first and second time derivatives as a function of time from time-series data. Our approach is based on Gaussian processes and applies to a wide range of data. In tests, the method is at least as accurate as others, but has several advantages: it estimates errors both in the inference and in any summary statistics, such as lag times, and allows interpolation with the corresponding error estimation. As illustrations, we infer growth rates of microbial cells, the rate of assembly of an amyloid fibril and both the speed and acceleration of two separating spindle pole bodies. Our algorithm should thus be broadly applicable.
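
    The paper's packaged method is not reproduced here, but the underlying idea, that differentiating a Gaussian process posterior yields a posterior over derivatives, can be sketched in a few lines of NumPy. The kernel, hyperparameters and logistic test data below are illustrative assumptions.

      import numpy as np

      def rbf(x1, x2, ell, sf):
          """Squared-exponential kernel k(x, x') = sf^2 exp(-(x - x')^2 / (2 ell^2))."""
          d = x1[:, None] - x2[None, :]
          return sf**2 * np.exp(-0.5 * (d / ell)**2)

      def gp_derivative_mean(xs, x, y, ell=1.5, sf=1.0, noise=0.05):
          """Posterior mean of f'(xs) for a zero-mean GP fitted to (x, y);
          differentiating the cross-covariance w.r.t. the test input gives
          the derivative process."""
          K = rbf(x, x, ell, sf) + noise**2 * np.eye(len(x))
          alpha = np.linalg.solve(K, y - y.mean())  # centre the data first
          d = xs[:, None] - x[None, :]
          dK = -(d / ell**2) * rbf(xs, x, ell, sf)  # d k(xs, x) / d xs
          return dK @ alpha

      # Illustrative growth-curve use: noisy logistic data, infer the growth rate
      t = np.linspace(0.0, 10.0, 60)
      y = 1.0 / (1.0 + np.exp(-(t - 5.0))) + 0.01 * np.random.randn(t.size)
      rate = gp_derivative_mean(t, t, y)  # first time derivative along t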

  11. The effects of daily weather variables on psychosis admissions to psychiatric hospitals

    NASA Astrophysics Data System (ADS)

    McWilliams, Stephen; Kinsella, Anthony; O'Callaghan, Eadbhard

    2013-07-01

    Several studies have noted seasonal variations in admission rates of patients with psychotic illnesses. However, the changeable daily meteorological patterns within seasons have never been examined in any great depth in the context of admission rates. A handful of small studies have posed interesting questions regarding a potential link between psychiatric admission rates and meteorological variables such as environmental temperature (especially heat waves) and sunshine. In this study, we used simple non-parametric testing and more complex ARIMA and time-series regression analysis to examine whether daily meteorological patterns (wind speed and direction, barometric pressure, rainfall, sunshine, sunlight and temperature) exert an influence on admission rates for psychotic disorders across 12 regions in Ireland. Although there were some weak but interesting trends for temperature, barometric pressure and sunshine, the meteorological patterns ultimately did not exert a clinically significant influence over admissions for psychosis. Further analysis is needed.
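
    The study's exact model specification is not given in the abstract; as a sketch of a time-series regression with ARIMA errors and exogenous weather covariates, here is an illustration with statsmodels (the data, covariates and ARIMA order are invented, and a Gaussian approximation to the count data is assumed).

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(0)
      n = 365
      # Hypothetical daily series: weather covariates and an admissions count
      weather = pd.DataFrame({
          "temperature": 10 + 8 * np.sin(2 * np.pi * np.arange(n) / 365)
                         + rng.normal(0, 2, n),
          "sunshine": np.clip(rng.normal(5, 2, n), 0, None),
      })
      admissions = 20 + 0.1 * weather["temperature"] + rng.normal(0, 3, n)

      # Regression with ARIMA(1,0,1) errors; the order here is illustrative
      model = ARIMA(admissions, exog=weather, order=(1, 0, 1))
      res = model.fit()
      print(res.summary())  # exogenous coefficients estimate each weather effect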

  12. Inhibition of chaotic escape from a potential well by incommensurate escape-suppressing excitations.

    PubMed

    Chacón, R; Martínez, J A

    2002-03-01

    Theoretical results are presented concerning the reduction of chaotic escape from a potential well by means of a harmonic parametric excitation that satisfies an ultrasubharmonic resonance condition with the escape-inducing excitation. The possibility of incommensurate escape-suppressing excitations is demonstrated by studying rational approximations to the irrational escape-suppressing frequency. The analytical predictions for the suitable amplitudes and initial phases of the escape-suppressing excitation are tested against numerical simulations based on a high-resolution grid of initial conditions. These numerical results indicate that the reduction of escape is reliably achieved for small amplitudes and at, and only at, the predicted initial phases. For the case of irrational escape-suppressing frequencies, the effective escape-reducing initial phases are found to lie close to the accumulation points of the set of suitable initial phases that are associated with the complete series of convergents up to the convergent giving the chosen rational approximation.

  13. The efficiency of average linkage hierarchical clustering algorithm associated multi-scale bootstrap resampling in identifying homogeneous precipitation catchments

    NASA Astrophysics Data System (ADS)

    Chuan, Zun Liang; Ismail, Noriszura; Shinyie, Wendy Ling; Lit Ken, Tan; Fam, Soo-Fen; Senawi, Azlyna; Yusoff, Wan Nur Syahidah Wan

    2018-04-01

    Due to the limited availability of historical precipitation records, agglomerative hierarchical clustering algorithms are widely used to extrapolate information from gauged to ungauged precipitation catchments, yielding more reliable projections of extreme hydro-meteorological events such as extreme precipitation events. However, accurately identifying the optimum number of homogeneous precipitation catchments from the dendrogram produced by agglomerative hierarchical algorithms is very subjective. The main objective of this study is to propose an efficient regionalized algorithm to identify homogeneous precipitation catchments for non-stationary precipitation time series. The homogeneous precipitation catchments are identified using the average linkage hierarchical clustering algorithm associated with multi-scale bootstrap resampling, with the uncentered correlation coefficient as the similarity measure. The regionalized homogeneous precipitation is consolidated using the non-parametric k-sample Anderson-Darling test. The analysis results show that the proposed regionalized algorithm performs better than the agglomerative hierarchical clustering algorithms proposed in previous studies.
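
    The two computational steps named above, average-linkage clustering with an uncentered-correlation similarity and consolidation with the k-sample Anderson-Darling test, can be sketched with SciPy. The multi-scale bootstrap assessment of cluster stability is omitted, and the data and cluster count are invented.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.stats import anderson_ksamp

      # Hypothetical input: rows = catchments, columns = precipitation series
      rng = np.random.default_rng(0)
      X = rng.gamma(shape=2.0, scale=10.0, size=(12, 240))

      # Average linkage; the cosine distance equals 1 minus the uncentered
      # correlation coefficient used as the similarity measure
      Z = linkage(X, method="average", metric="cosine")
      labels = fcluster(Z, t=3, criterion="maxclust")  # 3 regions, illustrative

      # Consolidate one candidate region with the k-sample Anderson-Darling test
      region = [X[i] for i in np.where(labels == 1)[0]]
      if len(region) > 1:
          res = anderson_ksamp(region)
          print(res.statistic, res.significance_level)  # homogeneity evidence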

  14. A comparison of confidence/credible interval methods for the area under the ROC curve for continuous diagnostic tests with small sample size.

    PubMed

    Feng, Dai; Cortese, Giuliana; Baumgartner, Richard

    2017-12-01

    The receiver operating characteristic (ROC) curve is frequently used as a measure of accuracy of continuous markers in diagnostic tests. The area under the ROC curve (AUC) is arguably the most widely used summary index for the ROC curve. Although the small sample size scenario is common in medical tests, a comprehensive study of the small sample size properties of various methods for the construction of the confidence/credible interval (CI) for the AUC has been by and large missing in the literature. In this paper, we describe and compare 29 non-parametric and parametric methods for the construction of the CI for the AUC when the number of available observations is small. The methods considered include not only those that have been widely adopted, but also those that have been less frequently mentioned or, to our knowledge, never applied to the AUC context. To compare the different methods, we carried out a simulation study with data generated from binormal models with equal and unequal variances and from exponential models with various parameters, and with equal and unequal small sample sizes. We found that the larger the true AUC value and the smaller the sample size, the larger the discrepancy among the results of the different approaches. When the model is correctly specified, the parametric approaches tend to outperform the non-parametric ones. Moreover, in the non-parametric domain, we found that a method based on the Mann-Whitney statistic is in general superior to the others. We further elucidate potential issues and provide possible solutions, along with general guidance on CI construction for the AUC when the sample size is small. Finally, we illustrate the utility of the different methods through real-life examples.
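
    The Mann-Whitney connection mentioned above is the standard identity AUC = U / (n1 * n2), the probability that a diseased subject's marker exceeds a healthy subject's. A small sketch with invented data, using a plain percentile bootstrap as a stand-in for the paper's 29 interval methods:

      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(1)
      diseased = rng.normal(1.0, 1.0, size=15)  # marker values, small sample
      healthy = rng.normal(0.0, 1.0, size=15)

      # AUC from the Mann-Whitney U statistic
      u, _ = mannwhitneyu(diseased, healthy, alternative="greater")
      auc = u / (diseased.size * healthy.size)

      # A simple (assumed) percentile bootstrap CI for comparison
      boots = []
      for _ in range(2000):
          d = rng.choice(diseased, size=diseased.size, replace=True)
          h = rng.choice(healthy, size=healthy.size, replace=True)
          ub, _ = mannwhitneyu(d, h, alternative="greater")
          boots.append(ub / (d.size * h.size))
      lo, hi = np.percentile(boots, [2.5, 97.5])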

  15. Using multifractal analysis of ultra-weak photon emission from germinating wheat seedlings to differentiate between two grades of intoxication with potassium dichromate

    NASA Astrophysics Data System (ADS)

    Scholkmann, Felix; Cifra, Michal; Alexandre Moraes, Thiago; de Mello Gallep, Cristiano

    2011-12-01

    The aim of the present study was to test whether the multifractal properties of ultra-weak photon emission (UPE) from germinating wheat seedlings (Triticum aestivum) change when the seedlings are treated with different concentrations of the toxin potassium dichromate (PD). To this end, UPE was measured (50 seedlings in one Petri dish, duration: approx. 16.6-28 h) from samples of three groups: (i) control (group C, N = 9), (ii) treated with 25 ppm of PD (group G25, N = 32), and (iii) treated with 150 ppm of PD (group G150, N = 23). For the multifractal analysis, the following steps were performed: (i) each UPE time series was trimmed to a final length of 1000 min; (ii) each UPE time series was filtered, linearly detrended and normalized; (iii) the multifractal spectrum (f(α)) was calculated for every UPE time series using the backward multifractal detrended moving average (MFDMA) method; (iv) each multifractal spectrum was characterized by calculating the mode (αmode) of the spectrum and the degree of multifractality (Δα); (v) for every UPE time series its mean, skewness and kurtosis were also calculated; finally, (vi) all obtained parameters were analyzed to determine their ability to differentiate between the three groups. This was based on Fisher's discriminant ratio (FDR), which was calculated for each parameter combination. Additionally, a non-parametric test was used to test whether the parameter values were significantly different or not. The analysis showed that when comparing all three groups, FDR had the highest values for the multifractal parameters (αmode, Δα). Furthermore, the differences in these parameters between the groups were statistically significant (p < 0.05). The classical parameters (mean, skewness and kurtosis) had lower FDR values than the multifractal parameters in all cases and showed no significant difference between the groups (except for the skewness between groups C and G150). In conclusion, multifractal analysis enables changes in UPE time series to be detected even when they are hidden from standard linear signal analysis methods. The analysis of changes in the multifractal properties might be a basis for designing a classification system enabling the intoxication of cell cultures to be quantified based on UPE measurements.
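
    Fisher's discriminant ratio used for the parameter ranking has the standard two-class form FDR = (mu1 - mu2)^2 / (var1 + var2), which is assumed here to match the paper's usage. A minimal sketch with invented group values:

      import numpy as np

      def fisher_discriminant_ratio(x, y):
          """Two-class Fisher's discriminant ratio:
          (mean1 - mean2)^2 / (var1 + var2).
          Larger values mean the parameter separates the groups better."""
          return (np.mean(x) - np.mean(y))**2 / (np.var(x, ddof=1) + np.var(y, ddof=1))

      # Hypothetical alpha-mode values for two of the treatment groups
      alpha_mode_C = np.random.normal(0.60, 0.05, 9)      # control, N = 9
      alpha_mode_G150 = np.random.normal(0.75, 0.05, 23)  # 150 ppm group, N = 23
      print(fisher_discriminant_ratio(alpha_mode_C, alpha_mode_G150))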

  16. Experimental results for the rapid determination of the freezing point of fuels

    NASA Technical Reports Server (NTRS)

    Mathiprakasam, B.

    1984-01-01

    Two methods for the rapid determination of the freezing point of fuels were investigated: an optical method, which detected the change in light transmission from the disappearance of solid particles in the melted fuel; and a differential thermal analysis (DTA) method, which sensed the latent heat of fusion. A laboratory apparatus was fabricated to test the two methods. Cooling was done by thermoelectric modules using an ice-water bath as a heat sink. The DTA method was later modified to eliminate the reference fuel. The data from the sample were digitized and a point of inflection, which corresponds to the ASTM D-2386 freezing point (final melting point), was identified from the derivative. The apparatus was modified to cool the fuel to -60 C and controls were added for maintaining constant cooling rate, rewarming rate, and hold time at minimum temperature. A parametric series of tests was run for twelve fuels with freezing points from -10 C to -50 C, varying cooling rate, rewarming rate, and hold time. Based on the results, an optimum test procedure was established. The results showed good agreement with ASTM D-2386 freezing point and differential scanning calorimetry results.
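
    Identifying the inflection point from the digitized derivative, as described above, amounts to finding where the first derivative of the warming curve peaks. A sketch with a synthetic (invented) thermal-arrest-like curve:

      import numpy as np

      # Hypothetical digitized rewarming curve: temperature vs. time samples
      t = np.linspace(0, 600, 1200)                       # s
      T = -40 + 0.05 * t + 2.0 * np.tanh((t - 300) / 30)  # synthetic arrest shape

      # The inflection of T(t) is where dT/dt peaks (d2T/dt2 crosses zero);
      # it marks the final melting point in this DTA-style analysis
      dT = np.gradient(T, t)
      i = np.argmax(dT)
      print(f"inflection at t={t[i]:.0f} s, T={T[i]:.1f} C")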

  17. Software for Managing Parametric Studies

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; McCann, Karen M.; DeVivo, Adrian

    2003-01-01

    The Information Power Grid Virtual Laboratory (ILab) is a Practical Extraction and Reporting Language (PERL) graphical-user-interface computer program that generates shell scripts to facilitate parametric studies performed on the Grid. (The Grid denotes a worldwide network of supercomputers used for scientific and engineering computations involving data sets too large to fit on desktop computers.) Heretofore, parametric studies on the Grid have been impeded by the need to create control-language scripts and edit input data files, painstaking tasks that are necessary for managing multiple jobs on multiple computers. ILab reflects an object-oriented approach to the automation of these tasks: all data and operations are organized into packages in order to accelerate development and debugging. A container or document object in ILab, called an experiment, contains all the information (data and file paths) necessary to define a complex series of repeated, sequenced, and/or branching processes. For convenience and to enable reuse, this object is serialized to and from disk storage. At run time, the current ILab experiment is used to generate required input files and shell scripts, create directories, copy data files, and then both initiate and monitor the execution of all computational processes.

  18. Correlation between a Student's Performance on the Mental Cutting Test and Their 3D Parametric Modeling Ability

    ERIC Educational Resources Information Center

    Steinhauer, H. M.

    2012-01-01

    Engineering graphics has historically been viewed as a challenging course to teach as students struggle to grasp and understand the fundamental concepts and then to master their proper application. The emergence of stable, fast, affordable 3D parametric modeling platforms such as CATIA, Pro-E, and AutoCAD while providing several pedagogical…

  19. Vapor Compression Distillation Subsystem (VCDS) component enhancement, testing and expert fault diagnostics development, volume 1

    NASA Technical Reports Server (NTRS)

    Kovach, L. S.; Zdankiewicz, E. M.

    1987-01-01

    Vapor compression distillation technology for phase-change recovery of potable water from wastewater has evolved as a technically mature approach for use aboard the Space Station. A program to parametrically test an advanced preprototype Vapor Compression Distillation Subsystem (VCDS) was completed during 1985 and 1986. In parallel with parametric testing, a hardware improvement program was initiated to test the feasibility of incorporating several key improvements into the advanced preprototype VCDS following the initial parametric tests. Specific areas of improvement included long-life, self-lubricated bearings, a lightweight, highly efficient compressor, and a long-life magnetic drive. With the exception of the self-lubricated bearings, these improvements were incorporated. The advanced preprototype VCDS was designed to reclaim 95 percent of the available wastewater at a nominal water recovery rate of 1.36 kg/h, achieved at a solids concentration of 2.3 percent and a 308 K condenser temperature. While this performance was maintained for the initial testing, a 300 percent improvement in water production rate with a correspondingly lower specific energy was achieved following incorporation of the improvements. Testing involved the characterization of key VCDS performance factors as a function of recycle-loop solids concentration, distillation unit temperature and fluids pump speed. The objective of this effort was to expand the VCDS data base to enable defining optimum performance characteristics for flight hardware development.

  20. [The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].

    PubMed

    Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2017-01-01

    The statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. An inference is the elaboration of conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
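
    As a toy illustration of that decision process for a two-group comparison, here is a sketch in Python/SciPy: check normality first, then pick the parametric test or its nonparametric counterpart. The decision rule and data are invented, and a real protocol should also weigh the design and measurement scale, as the article stresses.

      import numpy as np
      from scipy import stats

      def compare_two_groups(a, b, alpha=0.05):
          """Illustrative rule: Shapiro-Wilk normality check on both groups,
          then a t-test if normal, otherwise a Mann-Whitney U test."""
          normal = (stats.shapiro(a).pvalue > alpha) and (stats.shapiro(b).pvalue > alpha)
          if normal:
              return "t-test", stats.ttest_ind(a, b).pvalue
          return "Mann-Whitney U", stats.mannwhitneyu(a, b).pvalue

      rng = np.random.default_rng(2)
      print(compare_two_groups(rng.normal(0, 1, 25), rng.normal(0.8, 1, 25)))
      print(compare_two_groups(rng.exponential(1, 25), rng.exponential(2, 25)))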

  1. Modeling personnel turnover in the parametric organization

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1991-01-01

    A model is developed for simulating the dynamics of a newly formed organization, credible during all phases of organizational development. The model development process is broken down into the activities of determining the tasks required for parametric cost analysis (PCA), determining the skills required for each PCA task, determining the skills available in the applicant marketplace, determining the structure of the model, implementing the model, and testing it. The model, parameterized by the likelihood of job-function transition, has demonstrated the capability to represent the transition of personnel across functional boundaries within a parametric organization using a linear dynamical system, and the ability to predict required staffing profiles to meet functional needs at the desired time. The model can be extended by revisions of the state and transition structure to provide refinements in functional definition for the parametric and extended organization.
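
    The report's actual state and transition structure is not given in the abstract, but a linear dynamical system of the kind it describes can be sketched as repeated multiplication of a staffing vector by a transition matrix; all numbers below are invented.

      import numpy as np

      # Hypothetical job-function transition matrix A: entry A[i, j] is the
      # likelihood of moving from function j to function i per time step
      # (columns sum to less than 1, the remainder being attrition)
      A = np.array([
          [0.90, 0.05, 0.00],   # analysts
          [0.05, 0.90, 0.10],   # engineers
          [0.02, 0.03, 0.85],   # managers
      ])
      staff = np.array([20.0, 15.0, 5.0])  # initial headcount by function

      # Linear dynamical system x_{t+1} = A x_t: propagate month by month
      for month in range(12):
          staff = A @ staff
      print(staff)  # predicted staffing profile after one year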

  2. A non-parametric method for automatic determination of P-wave and S-wave arrival times: application to local micro earthquakes

    NASA Astrophysics Data System (ADS)

    Rawles, Christopher; Thurber, Clifford

    2015-08-01

    We present a simple, fast, and robust method for automatic detection of P- and S-wave arrivals using a nearest neighbours-based approach. The nearest neighbour algorithm is one of the most popular time-series classification methods in the data mining community and has been applied to time-series problems in many different domains. Specifically, our method is based on the non-parametric time-series classification method developed by Nikolov. Instead of building a model by estimating parameters from the data, the method uses the data itself to define the model. Potential phase arrivals are identified based on their similarity to a set of reference data consisting of positive and negative sets, where the positive set contains examples of analyst identified P- or S-wave onsets and the negative set contains examples that do not contain P waves or S waves. Similarity is defined as the square of the Euclidean distance between vectors representing the scaled absolute values of the amplitudes of the observed signal and a given reference example in time windows of the same length. For both P waves and S waves, a single pass is done through the bandpassed data, producing a score function defined as the ratio of the sum of similarity to positive examples over the sum of similarity to negative examples for each window. A phase arrival is chosen as the centre position of the window that maximizes the score function. The method is tested on two local earthquake data sets, consisting of 98 known events from the Parkfield region in central California and 32 known events from the Alpine Fault region on the South Island of New Zealand. For P-wave picks, using a reference set containing two picks from the Parkfield data set, 98 per cent of Parkfield and 94 per cent of Alpine Fault picks are determined within 0.1 s of the analyst pick. For S-wave picks, 94 per cent and 91 per cent of picks are determined within 0.2 s of the analyst picks for the Parkfield and Alpine Fault data set, respectively. For the Parkfield data set, our method picks 3520 P-wave picks and 3577 S-wave picks out of 4232 station-event pairs. For the Alpine Fault data set, the method picks 282 P-wave picks and 311 S-wave picks out of a total of 344 station-event pairs. For our testing, we note that the vast majority of station-event pairs have analyst picks, although some analyst picks are excluded based on an accuracy assessment. Finally, our tests suggest that the method is portable, allowing the use of a reference set from one region on data from a different region using relatively few reference picks.
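
    A compact sketch of the scoring step described above: slide a window over the trace and take the centre of the window maximizing the ratio of summed similarity to positive reference picks over summed similarity to negative examples. The abstract defines similarity through the squared Euclidean distance between scaled absolute amplitudes; the inverse-distance form below, which makes closer windows score higher, is an assumed detail, and the reference windows are assumed to be pre-scaled to length L.

      import numpy as np

      def similarity(w, r, eps=1e-12):
          """Similarity of a scaled window to a reference example:
          inverse squared Euclidean distance (assumed form)."""
          return 1.0 / (np.sum((w - r)**2) + eps)

      def window_score(window, positives, negatives, eps=1e-12):
          w = np.abs(window) / (np.max(np.abs(window)) + eps)  # scaled |amplitudes|
          pos = sum(similarity(w, r) for r in positives)
          neg = sum(similarity(w, r) for r in negatives)
          return pos / neg

      def pick_arrival(trace, positives, negatives, L):
          """Single pass over the (bandpassed) trace: the pick is the centre
          of the window maximizing the score function."""
          scores = [window_score(trace[i:i + L], positives, negatives)
                    for i in range(len(trace) - L + 1)]
          return int(np.argmax(scores)) + L // 2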

  3. Experimental and analytical study of loss-of-flow transients in EBR-II occurring at decay power levels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, L.K.; Mohr, D.; Feldman, E.E.

    A series of eight loss-of-flow (LOF) tests has been conducted in EBR-II to study the transition between forced and natural convective flows following a variety of loss-of-primary-pumping-power conditions from decay heat levels. Comparisons of measurements and pretest/posttest predictions were made for a selected test. Good agreement between measurements and predictions was found prior to and just after the flow reached its minimum, but the agreement is not as good after that point. The temperatures are consistent with the flow response and the assumed decay power. The measured results indicate that the flows of the driver and the instrumented subassemblies are too large in the analytical model in the natural convective region. Although a parametric study of secondary flow, turbulent-laminar flow transition, heat-transfer ability of the intermediate heat exchanger at low flow, and flow mixing in the primary tank has been performed to determine their effects on the flow, the cause of the discrepancy at very low flow levels is still unknown.

  4. Generalized linear mixed models with varying coefficients for longitudinal data.

    PubMed

    Zhang, Daowen

    2004-03-01

    The routinely assumed parametric functional form in the linear predictor of a generalized linear mixed model for longitudinal data may be too restrictive to represent true underlying covariate effects. We relax this assumption by representing these covariate effects by smooth but otherwise arbitrary functions of time, with random effects used to model the correlation induced by among-subject and within-subject variation. Due to the usually intractable integration involved in evaluating the quasi-likelihood function, the double penalized quasi-likelihood (DPQL) approach of Lin and Zhang (1999, Journal of the Royal Statistical Society, Series B 61, 381-400) is used to estimate the varying coefficients and the variance components simultaneously by representing a nonparametric function by a linear combination of fixed effects and random effects. A scaled chi-squared test based on the mixed model representation of the proposed model is developed to test whether an underlying varying coefficient is a polynomial of certain degree. We evaluate the performance of the procedures through simulation studies and illustrate their application with Indonesian children infectious disease data.

  5. Study of the influence of substrate and spectrophotometer characteristics on the in vitro measurement of sunscreens efficiency.

    PubMed

    Couteau, C; Philippe, A; Vibet, M-A; Paparis, E; Coiffard, L

    2018-05-16

    All the methods used for the in vitro measurement of the SPF, the universal indicator of sunscreen efficiency, rely on a spectrophotometric analysis. What varies in the experimental protocols used is mainly the substrate and the type of spectrophotometer chosen. We decided to work with polymethylmethacrylate plates that we analyzed using two spectrophotometers equipped with integrating spheres, the UV1000S and the UV2000 apparatus. Two marketed products were tested in this way, after spreading 2 mg/cm² on the plates, using one apparatus after the other. We applied a non-parametric Wilcoxon test for paired data to the measurements performed on 10 plates (as we systematically used the two apparatus) in order to compare the series of measurements obtained with the two machines. In this way, we were able to show a significant difference between the SPF values respectively obtained with the UV1000S and the UV2000 spectrophotometers. This difference could be explained by the reduction of stray light in the UV2000 apparatus. Copyright © 2017. Published by Elsevier B.V.
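
    The paired design lends itself to a one-line Wilcoxon signed-rank test; the SPF readings below are invented to mirror the 10-plate setup.

      import numpy as np
      from scipy.stats import wilcoxon

      # Hypothetical SPF readings of the same 10 plates on the two instruments
      spf_uv1000s = np.array([30.1, 28.5, 31.2, 29.8, 30.5,
                              27.9, 31.0, 29.2, 30.8, 28.7])
      spf_uv2000 = np.array([33.4, 31.0, 34.1, 32.5, 33.0,
                             30.2, 33.8, 31.9, 33.5, 31.1])

      # Wilcoxon signed-rank test for paired data, as in the study design
      stat, p = wilcoxon(spf_uv1000s, spf_uv2000)
      print(f"W={stat}, p={p:.4f}")  # a small p suggests a systematic difference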

  6. A multicenter nationwide reference intervals study for common biochemical analytes in Turkey using Abbott analyzers.

    PubMed

    Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif

    2014-12-01

    A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were analyzed collectively at Uludag University in Bursa using Abbott reagents and an Abbott analyzer. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with RIs derived by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences in reference values among the seven regions were not significant for any of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria, excluding individuals with BMI > 28 kg/m². The ranges of RIs derived by the non-parametric method were wider than those derived by the parametric method, especially for the analytes affected by BMI. Given the lack of regional differences and the well-standardized status of the test results, the RIs derived from this nationwide study can be used for the entire Turkish population.
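
    A parametric RI of the kind described can be sketched as: transform the reference values toward normality with a Box-Cox transform, take the central 95% interval in the transformed space, and back-transform the limits. The study's modified Box-Cox formula and latent abnormal values exclusion step are not reproduced here, and the data are invented.

      import numpy as np
      from scipy.stats import boxcox
      from scipy.special import inv_boxcox

      # Hypothetical analyte results from reference individuals
      x = np.random.lognormal(mean=3.0, sigma=0.3, size=400)

      # Parametric RI: Box-Cox transform, central 95% interval, back-transform
      xt, lam = boxcox(x)
      mu, sd = xt.mean(), xt.std(ddof=1)
      lower, upper = inv_boxcox(np.array([mu - 1.96 * sd, mu + 1.96 * sd]), lam)

      # Non-parametric RI for comparison: 2.5th and 97.5th percentiles
      np_lower, np_upper = np.percentile(x, [2.5, 97.5])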

  7. Characterization of time series via Rényi complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.

    2018-05-01

    One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.

  8. Parametric and cycle tests of a 40-A-hr bipolar nickel-hydrogen battery

    NASA Technical Reports Server (NTRS)

    Cataldo, R. L.

    1986-01-01

    A series of tests was performed to characterize battery performance relating to certain operating parameters which included charge current, discharge current, temperature and pressure. The parameters were varied to confirm battery design concepts and to determine optimal operating conditions. Spacecraft power requirements are constantly increasing. Special spacecraft such as the Space Station and platforms will require energy storage systems of 130 and 25 kWh, respectively. The complexity of these high-power systems will demand high reliability and reduced mass and volume. A system that uses batteries for storage will require a cell count in excess of 400 units. These cell units must then be assembled into several batteries with over 100 cells in a series-connected string. In an attempt to simplify the construction of conventional cells and batteries, the NASA Lewis Research Center battery systems group initiated work on a nickel-hydrogen battery in a bipolar configuration in early 1981. Features of the battery with this bipolar construction show promise in improving both volumetric and gravimetric energy densities as well as thermal management. Bipolar construction allows cooling in closer proximity to the cell components, thus heat removal can be accomplished at a higher rejection temperature than in conventional cell designs. Also, higher current densities are achievable because of low cell impedance. Lower cell impedance is achieved via current flow perpendicular to the electrode face, thus reducing voltage drops in the electrode grid and electrode terminal tabs.

  9. Thermal effects in an ultrafast BiB3O6 optical parametric oscillator at high average powers

    DOE PAGES

    Petersen, T.; Zuegel, J. D.; Bromage, J.

    2017-08-15

    An ultrafast, high-average-power, extended-cavity, femtosecond BiB3O6 optical parametric oscillator was constructed as a test bed for investigating the scalability of infrared parametric devices. Despite the high pulse energies achieved by this system, the reduction in slope efficiency near the maximum-available pump power prompted the investigation of thermal effects in the crystal during operation. Furthermore, the local heating effects in the crystal were used to determine the impact on both phase matching and thermal lensing to understand limitations that must be overcome to achieve microjoule-level pulse energies at high repetition rates.

  10. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    PubMed

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns arising in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in omics data when studying their association with disease and health.

  11. Thermal effects in an ultrafast BiB3O6 optical parametric oscillator at high average powers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petersen, T.; Zuegel, J. D.; Bromage, J.

    An ultrafast, high-average-power, extended-cavity, femtosecond BiB3O6 optical parametric oscillator was constructed as a test bed for investigating the scalability of infrared parametric devices. Despite the high pulse energies achieved by this system, the reduction in slope efficiency near the maximum-available pump power prompted the investigation of thermal effects in the crystal during operation. Furthermore, the local heating effects in the crystal were used to determine the impact on both phase matching and thermal lensing to understand limitations that must be overcome to achieve microjoule-level pulse energies at high repetition rates.

  12. Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors

    NASA Astrophysics Data System (ADS)

    Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.

    2012-12-01

    Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating the trend and the correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem for the estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem owing to its theoretical near-intractability. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of such an adaptation package: bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models and nonparametric smoothing for trend estimation. We introduce pairwise moving-block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on the computing-intensive calibration of bootstrap confidence intervals and consider options for parallelizing the related computer code. The following examples serve not only to illustrate the methods but also tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional, western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
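
    The moving-block bootstrap at the core of this approach resamples overlapping blocks of the series so that short-range autocorrelation survives resampling. A minimal sketch for a trend-slope confidence interval follows; the block length and series are invented, and the parametric timescale simulation is omitted.

      import numpy as np

      def moving_block_bootstrap(x, block_len, rng):
          """One moving-block replicate: draw overlapping blocks with
          replacement and concatenate, preserving short-range autocorrelation."""
          n = len(x)
          n_blocks = int(np.ceil(n / block_len))
          starts = rng.integers(0, n - block_len + 1, size=n_blocks)
          return np.concatenate([x[s:s + block_len] for s in starts])[:n]

      rng = np.random.default_rng(3)
      t = np.arange(500)
      x = 0.02 * t + np.cumsum(rng.normal(size=500)) * 0.1  # trend + red noise

      # Bootstrap the OLS trend slope by resampling residuals in blocks
      slope, intercept = np.polyfit(t, x, 1)
      resid = x - (slope * t + intercept)
      slopes = []
      for _ in range(1000):
          xb = slope * t + intercept + moving_block_bootstrap(resid, 25, rng)
          slopes.append(np.polyfit(t, xb, 1)[0])
      lo, hi = np.percentile(slopes, [2.5, 97.5])  # percentile CI for the slope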

  13. Toward Higher QA: From Parametric Release of Sterile Parenteral Products to PAT for Other Pharmaceutical Dosage Forms.

    PubMed

    Hock, Sia Chong; Constance, Neo Xue Rui; Wah, Chan Lai

    2012-01-01

    Pharmaceutical products are generally subjected to end-product batch testing as a means of quality control. Due to the inherent limitations of conventional batch testing, this is not the ideal approach for determining the pharmaceutical quality of the finished dosage form. In the case of terminally sterilized parenteral products, the limitations of conventional batch testing have been successfully addressed with the application of parametric release (the release of a product based on control of process parameters instead of batch sterility testing at the end of the manufacturing process). Consequently, there has been increasing interest in applying parametric release to other pharmaceutical dosage forms, beyond terminally sterilized parenteral products. For parametric release to be possible, manufacturers must be capable of designing quality into the product, monitoring the manufacturing processes, and controlling the quality of intermediates and finished products in real time. Process analytical technology (PAT) is thought to be capable of contributing to these prerequisites. It is believed that the appropriate use of PAT tools can eventually lead to the possibility of real-time release of other pharmaceutical dosage forms, by-passing the need for end-product batch testing. Hence, this literature review attempts to present the basic principles of PAT, introduce the various PAT tools that are currently available, present their recent applications to pharmaceutical processing, and explain the potential benefits that PAT can bring to conventional ways of processing and quality assurance of pharmaceutical products. Last but not least, current regulations governing the use of PAT and the manufacturing challenges associated with PAT implementation are also discussed.

  14. Determination of Acreage Thermal Protection Foam Loss From Ice and Foam Impacts

    NASA Technical Reports Server (NTRS)

    Carney, Kelly S.; Lawrence, Charles

    2015-01-01

    A parametric study was conducted to establish Thermal Protection System (TPS) loss from foam and ice impact conditions similar to what might occur on the Space Launch System. This study was based upon the large amount of testing and analysis that was conducted with both ice and foam debris impacts on TPS acreage foam for the Space Shuttle Project External Tank. Test verified material models and modeling techniques that resulted from Space Shuttle related testing were utilized for this parametric study. Parameters varied include projectile mass, impact velocity and impact angle (5 degree and 10 degree impacts). The amount of TPS acreage foam loss as a result of the various impact conditions is presented.

  15. Performance Assessment of a Large Scale Pulsejet- Driven Ejector System

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Litke, Paul J.; Schauer, Frederick R.; Bradley, Royce P.; Hoke, John L.

    2006-01-01

    Unsteady thrust augmentation was measured on a large-scale driver/ejector system. A 72 in. long, 6.5 in. diameter, 100-lbf pulsejet was tested with a series of straight, cylindrical ejectors of varying length and diameter. A tapered ejector configuration of varying length was also tested. The objectives of the testing were to determine the dimensions of the ejectors which maximize thrust augmentation, and to compare the dimensions and augmentation levels so obtained with those of other, similarly maximized, but smaller-scale systems on which much of the recent unsteady ejector thrust augmentation work has been performed. An augmentation level of 1.71 was achieved with the cylindrical ejector configuration and 1.81 with the tapered ejector configuration. These levels are consistent with, but slightly lower than, the highest levels achieved with the smaller systems. The ejector diameter yielding maximum augmentation was 2.46 times the diameter of the pulsejet. This ratio closely matches those of the small-scale experiments. For the straight ejector, the length yielding maximum augmentation was 10 times the diameter of the pulsejet. This was also nearly the same as in the small-scale experiments. Testing procedures are described, as are the parametric variations in ejector geometry. Results are discussed in terms of their implications for general scaling of pulsed-thrust ejector systems.

  16. [Clinical values of hemodynamics assessment by parametric color coding of digital subtraction angiography before and after endovascular therapy for critical limb ischaemia].

    PubMed

    Su, Haobo; Lou, Wensheng; Gu, Jianping

    2015-10-06

    To investigate the feasibility of parametric color coding of digital subtraction angiography (Syngo iFlow) for hemodynamic assessment in patients with critical limb ischemia before and after endovascular therapy, and to explore the correlation between Syngo iFlow and conventional techniques, clinical data of 21 patients with TASC II type B and type C femoropopliteal arteriosclerotic occlusive disease treated by percutaneous transluminal angioplasty and/or primary stent implantation in Nanjing First Hospital from January 2013 to December 2014 were analyzed retrospectively. There were 10 males and 11 females with an average age of (72±6) years (range 58-85 years). Treatment efficacy was assessed by the variation of a series of clinical symptom indexes (pain score, cold sensation score and intermittent claudication score), the ankle-brachial index (ABI) and the transcutaneous oxygen pressure (TcPO2). Angiography was performed with the same protocol before and after treatment, and parametric color coding of digital subtraction angiography was created with the Syngo iFlow software on a dedicated workstation. The time to peak (TTP) of artery and tissue perfusion, selected at the same regions of the foot and ankle, was measured and analyzed to evaluate the improvement of microcirculation and hemodynamics of the ischemic limb. Correlations between Syngo iFlow and the traditional clinical evaluation methods were explored using the Spearman rank correlation test. All patients (21 limbs) underwent successful endovascular therapy. The mean pain score, cold sensation score, intermittent claudication score, ABI and TcPO2 before treatment were 0.48±0.68, 2.71±0.72, 2.86±0.85, 0.33±0.07 and (26.83±3.41) mmHg, respectively; one week after treatment the corresponding values were 2.57±0.93, 0.33±0.48, 0.90±0.54, 0.69±0.11 and (53.75±3.60) mmHg. The differences between pre- and post-treatment were statistically significant (P<0.05). The pre- and post-operative TTP of artery and tissue perfusion were (14.07±1.77) vs (10.43±2.05) s and (18.75±2.72) vs (15.38±2.78) s, respectively. For the assessment of hemodynamic changes during and after treatment, parametric color coding of digital subtraction angiography (Syngo iFlow) showed that limb blood flow and perfusion were improved, and the differences were statistically significant. The Spearman rank correlation test showed that the TTP of the artery was positively correlated with ABI and TcPO2 (r=0.65 and 0.73, P<0.05), and the TTP of tissue perfusion was also positively correlated with ABI and TcPO2 (r=0.60 and 0.60, P<0.05). Parametric color coding of digital subtraction angiography (Syngo iFlow) is a real-time, sensitive and quantitative tool that might provide additional support in the hemodynamic evaluation of endovascular treatment for patients with lower-extremity peripheral arterial occlusive disease.

  17. A Comparison of Kernel Equating and Traditional Equipercentile Equating Methods and the Parametric Bootstrap Methods for Estimating Standard Errors in Equipercentile Equating

    ERIC Educational Resources Information Center

    Choi, Sae Il

    2009-01-01

    This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…

  18. Characteristic mega-basin water storage behavior using GRACE.

    PubMed

    Reager, J T; Famiglietti, James S

    2013-06-01

    A long-standing challenge for hydrologists has been a lack of observational data on global-scale basin hydrological behavior. With observations from NASA's Gravity Recovery and Climate Experiment (GRACE) mission, hydrologists are now able to study terrestrial water storage for large river basins (>200,000 km²), with monthly time resolution. Here we provide results of a time series model of basin-averaged GRACE terrestrial water storage anomaly and Global Precipitation Climatology Project precipitation for the world's largest basins. We address the short (10 year) length of the GRACE record by adopting a parametric spectral method to calculate frequency-domain transfer functions of storage response to precipitation forcing and then generalize these transfer functions based on large-scale basin characteristics, such as percent forest cover and basin temperature. Among the parameters tested, results show that temperature, soil water-holding capacity, and percent forest cover are important controls on relative storage variability, while basin area and mean terrain slope are less important. The derived empirical relationships were accurate (0.54 ≤ Ef ≤ 0.84) in modeling global-scale water storage anomaly time series for the study basins using only precipitation, average basin temperature, and two land-surface variables, offering the potential for synthesis of basin storage time series beyond the GRACE observational period. Such an approach could be applied toward gap filling between current and future GRACE missions and for predicting basin storage given predictions of future precipitation.

  19. Characteristic mega-basin water storage behavior using GRACE

    PubMed Central

    Reager, J T; Famiglietti, James S

    2013-01-01

    A long-standing challenge for hydrologists has been a lack of observational data on global-scale basin hydrological behavior. With observations from NASA's Gravity Recovery and Climate Experiment (GRACE) mission, hydrologists are now able to study terrestrial water storage for large river basins (>200,000 km²), with monthly time resolution. Here we provide results of a time series model of basin-averaged GRACE terrestrial water storage anomaly and Global Precipitation Climatology Project precipitation for the world's largest basins. We address the short (10 year) length of the GRACE record by adopting a parametric spectral method to calculate frequency-domain transfer functions of storage response to precipitation forcing and then generalize these transfer functions based on large-scale basin characteristics, such as percent forest cover and basin temperature. Among the parameters tested, results show that temperature, soil water-holding capacity, and percent forest cover are important controls on relative storage variability, while basin area and mean terrain slope are less important. The derived empirical relationships were accurate (0.54 ≤ Ef ≤ 0.84) in modeling global-scale water storage anomaly time series for the study basins using only precipitation, average basin temperature, and two land-surface variables, offering the potential for synthesis of basin storage time series beyond the GRACE observational period. Such an approach could be applied toward gap filling between current and future GRACE missions and for predicting basin storage given predictions of future precipitation. PMID:24563556

  20. Long-term vegetation activity trends in the Iberian Peninsula and The Balearic Islands using high spatial resolution NOAA-AVHRR data (1981 - 2015).

    NASA Astrophysics Data System (ADS)

    Martin-Hernandez, Natalia; Vicente-Serrano, Sergio; Azorin-Molina, Cesar; Begueria-Portugues, Santiago; Reig-Gracia, Fergus; Zabalza-Martínez, Javier

    2017-04-01

    We have analysed trends in the Normalized Difference Vegetation Index (NDVI) in the Iberian Peninsula and the Balearic Islands over the period 1981-2015 using a new high-resolution data set built from all available NOAA-AVHRR images (the IBERIAN NDVI dataset). After complete processing, including geocoding, calibration, cloud removal, topographic correction and temporal filtering, we obtained bi-weekly time series. To assess the accuracy of the new IBERIAN NDVI time series, we compared their temporal variability and trends with those reported by the GIMMS 3g and MODIS (MOD13A3) NDVI datasets. In general, the IBERIAN NDVI agreed well with these two products while offering higher spatial resolution than the GIMMS dataset and covering two more decades than the MODIS dataset. Using the IBERIAN NDVI dataset, we analysed NDVI trends by means of the non-parametric Mann-Kendall test and the Theil-Sen slope estimator. On average, vegetation trends in the study area show an increase over the last decades. However, there are local spatial differences: the main increase has been recorded in the humid regions of the north of the Iberian Peninsula. The statistical techniques allow abrupt and gradual changes in different land cover types to be identified during the analysed period. These changes are related to human activity through land transformations (from dry to irrigated land), land abandonment and forest recovery.
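
    Both trend tools named above are available in SciPy: Kendall's tau of the series against time is an equivalent formulation of the Mann-Kendall test (usually computed from the S statistic), and theilslopes gives the robust median-of-pairwise-slopes estimate. The pixel series below is invented.

      import numpy as np
      from scipy.stats import kendalltau, theilslopes

      # Hypothetical annual-mean NDVI series for one pixel, 1981-2015
      years = np.arange(1981, 2016)
      ndvi = 0.45 + 0.002 * (years - 1981) + np.random.normal(0, 0.02, years.size)

      # Mann-Kendall-style trend test via Kendall's tau against time
      tau, p = kendalltau(years, ndvi)

      # Theil-Sen slope: median of pairwise slopes, robust to outliers
      slope, intercept, lo, hi = theilslopes(ndvi, years)
      print(f"tau={tau:.2f}, p={p:.3f}, "
            f"slope={slope:.4f} NDVI/yr [{lo:.4f}, {hi:.4f}]")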

  1. SPAGETTA: a Multi-Purpose Gridded Stochastic Weather Generator

    NASA Astrophysics Data System (ADS)

    Dubrovsky, M.; Huth, R.; Rotach, M. W.; Dabhi, H.

    2017-12-01

    SPAGETTA is a new multisite/gridded multivariate parametric stochastic weather generator (WG). Site-specific precipitation occurrence and amount are modelled by a Markov chain and a Gamma distribution, the non-precipitation variables are modelled by an autoregressive (AR) model conditioned on precipitation occurrence, and the spatial coherence of all variables is modelled following Wilks' (2009) approach. SPAGETTA may be run in two modes. Mode 1: it is run as a classical WG, calibrated using weather series from multiple sites, after which it can produce arbitrarily long synthetic series mimicking the spatial and temporal structure of the calibration data. To generate weather series representing the future climate, the WG parameters are modified according to a climate change scenario, typically derived from GCM or RCM simulations. Mode 2: the user provides only basic information (not necessarily realistic) on the temporal and spatial auto-correlation structure of the weather variables and their mean annual cycle; the generator itself derives the parameters of the underlying AR model, which produces the multi-site weather series. Optionally, the user may add a spatially varying trend, which is superimposed on the synthetic series. The contribution consists of the following parts: (a) the model of the WG; (b) validation of the WG in terms of spatial temperature and precipitation characteristics, including characteristics of spatial hot/cold/dry/wet spells; (c) results of the climate change impact experiment, in which the WG parameters representing the spatial and temporal variability are modified using the climate change scenarios and the effect on the above spatial validation indices is analysed; in this experiment, the WG is calibrated using the E-OBS gridded daily weather data for several European regions, and the climate change scenarios are derived from selected RCM simulations (CORDEX database); (d) a demonstration of the second mode of operation, with results obtained while developing a methodology for assessing the collective significance of trends in multi-site weather series, where the performance of the proposed test statistics is assessed on a large number of realisations of synthetic series produced by the WG under a given statistical structure and trend.
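
    The single-site precipitation core named above (Markov-chain occurrence, Gamma-distributed amounts) is easy to sketch; the parameter values are invented, and the Wilks-style spatial coherence across sites (correlated random drivers) is omitted.

      import numpy as np

      rng = np.random.default_rng(4)

      # Illustrative site parameters: first-order Markov chain for occurrence,
      # Gamma distribution for wet-day amounts
      p01 = 0.25              # P(wet today | dry yesterday)
      p11 = 0.60              # P(wet today | wet yesterday)
      shape, scale = 0.8, 6.0  # Gamma parameters for daily amounts (mm)

      def generate_precip(n_days):
          wet = False
          out = np.zeros(n_days)
          for d in range(n_days):
              wet = rng.random() < (p11 if wet else p01)
              if wet:
                  out[d] = rng.gamma(shape, scale)
          return out

      series = generate_precip(365)  # one synthetic year for one site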

  2. Nonparametric predictive inference for combining diagnostic tests with parametric copula

    NASA Astrophysics Data System (ADS)

    Muhammad, Noryanti; Coolen, F. P. A.; Coolen-Maturi, T.

    2017-09-01

    Measuring the accuracy of diagnostic tests is crucial in many application areas, including medicine and health care. The Receiver Operating Characteristic (ROC) curve is a popular statistical tool for describing the performance of diagnostic tests, and the area under the ROC curve (AUC) is often used as a measure of the overall performance of the diagnostic test. In this paper, we are interested in developing strategies for combining test results in order to increase diagnostic accuracy. We introduce nonparametric predictive inference (NPI) for combining two diagnostic test results while accounting for the dependence structure using a parametric copula. NPI is a frequentist statistical framework for inference on a future observation based on past data observations; it uses lower and upper probabilities to quantify uncertainty and is based on only a few modelling assumptions. A copula, in turn, is a joint distribution function whose marginals are all uniformly distributed, and it can be used to model the dependence separately from the marginal distributions. In this research, we estimate the copula density using a parametric method, namely maximum likelihood estimation (MLE). We investigate the performance of the proposed method on data sets from the literature and discuss the results to show how our method performs for different families of copulas. Finally, we briefly outline related challenges and opportunities for future research.

  3. Control law parameterization for an aeroelastic wind-tunnel model equipped with an active roll control system and comparison with experiment

    NASA Technical Reports Server (NTRS)

    Perry, Boyd, III; Dunn, H. J.; Sandford, Maynard C.

    1988-01-01

    Nominal roll control laws were designed, implemented, and tested on an aeroelastically-scaled free-to-roll wind-tunnel model of an advanced fighter configuration. The tests were performed in the NASA Langley Transonic Dynamics Tunnel. A parametric study of the nominal roll control system was conducted. This parametric study determined possible control system gain variations which yielded identical closed-loop stability (roll mode pole location) and identical roll response but different maximum control-surface deflections. Comparison of analytical predictions with wind-tunnel results was generally very good.

  4. Mapping the Chevallier-Polarski-Linder parametrization onto physical dark energy Models

    NASA Astrophysics Data System (ADS)

    Scherrer, Robert J.

    2015-08-01

    We examine the Chevallier-Polarski-Linder (CPL) parametrization, in the context of quintessence and barotropic dark energy models, to determine the subset of such models to which it can provide a good fit. The CPL parametrization gives the equation of state parameter w for the dark energy as a linear function of the scale factor a, namely w = w0 + wa(1 - a). In the case of quintessence models, we find that over most of the w0, wa parameter space the CPL parametrization maps onto a fairly narrow form of behavior for the potential V(ϕ), while a one-dimensional subset of parameter space, for which wa = κ(1 + w0) with κ constant, corresponds to a wide range of functional forms for V(ϕ). For barotropic models, we show that the functional dependence of the pressure on the density, up to a multiplicative constant, depends only on wi = wa + w0 and not on w0 and wa separately. Our results suggest that the CPL parametrization may not be optimal for testing either type of model.

  5. Latent component-based gear tooth fault detection filter using advanced parametric modeling

    NASA Astrophysics Data System (ADS)

    Ettefagh, M. M.; Sadeghi, M. H.; Rezaee, M.; Chitsaz, S.

    2009-10-01

    In this paper, a new parametric model-based filter is proposed for gear tooth fault detection. The design of the filter consists of identifying the most appropriate latent component (LC) of the undamaged gearbox signal by analyzing the instantaneous modules (IMs) and instantaneous frequencies (IFs), and then using the component with the lowest IM as the filter output for detecting gearbox faults. The filter parameters are estimated using LC theory, in which an advanced parametric modeling method has been implemented. The proposed method is applied to signals extracted from a simulated gearbox for detection of simulated gear faults. In addition, the method is used for quality inspection of produced Nissan-Junior vehicle gearboxes by gear profile error detection on an industrial test bed. For evaluation purposes, the proposed method is compared with previous parametric TAR/AR-based filters, in which the parametric model residual is taken as the filter output and Yule-Walker and Kalman filters are used to estimate the parameters. The results confirm the high performance of the newly proposed fault detection method.
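
    The LC-based filter itself is specialized, but the baseline it is compared against (an AR model of the healthy signal whose one-step residual serves as the filter output) is easy to sketch; the following assumes a plain Yule-Walker fit and a hypothetical signal.

      import numpy as np

      def yule_walker(x, order):
          # Solve the Yule-Walker equations built from sample autocovariances.
          x = np.asarray(x, float) - np.mean(x)
          r = np.correlate(x, x, "full")[len(x) - 1:] / len(x)
          R = np.array([[r[abs(i - j)] for j in range(order)]
                        for i in range(order)])
          return np.linalg.solve(R, r[1:order + 1])

      def ar_residual(x, a):
          # One-step-ahead prediction residual; large residuals flag
          # departures from the healthy-gearbox signal model.
          x = np.asarray(x, float) - np.mean(x)
          p = len(a)
          pred = np.zeros(len(x) - p)
          for k in range(p):
              pred += a[k] * x[p - 1 - k : len(x) - 1 - k]
          return x[p:] - pred

      rng = np.random.default_rng(1)
      healthy = np.sin(np.arange(2000) * 0.3) + rng.normal(0, .1, 2000)
      coeffs = yule_walker(healthy, order=8)
      print(np.std(ar_residual(healthy, coeffs)))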

  6. Posttest analysis of MIST Test 320201 using TRAC-PF1/MOD1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebe, D.A.; Steiner, J.L.; Boyack, B.E.

    A posttest calculation and analysis of Multi-Loop Integral System Test 320201, a small-break loss-of-coolant accident (SBLOCA) test with a scaled 50-cm² cold-leg pump discharge leak, has been completed and is reported herein. It was one in a series of tests in which leak size was varied parametrically; scaled leak sizes included 5, 10 (the nominal, Test 3109AA), and 50 cm². The test exhibited the major post-SBLOCA phenomena, as expected, including depressurization to saturation, interruption of loop flow, boiler-condenser mode cooling, refill, and postrefill cooldown. Full high-pressure injection and auxiliary feedwater were available, reactor coolant pumps were not available, and reactor-vessel vent valves and guard heaters were automatically controlled. Constant level control in the steam-generator (SG) secondaries was used after SG-secondary refill, and symmetric SG pressure control was also used. The sequence of events seen in this test was similar to that of the nominal test, except that events occurred in a shorter time frame as the system inventory was reduced and the system depressurized at a faster rate. The calculation was performed using TRAC-PF1/MOD1. Agreement between test data and the calculation was generally reasonable, and all major trends and phenomena were correctly predicted. We believe that correct conclusions about trends and phenomena will be reached if the code is used in similar applications.

  8. A New Trend-Following Indicator: Using SSA to Design Trading Rules

    NASA Astrophysics Data System (ADS)

    Leles, Michel Carlo Rodrigues; Mozelli, Leonardo Amaral; Guimarães, Homero Nogueira

    Singular Spectrum Analysis (SSA) is a non-parametric approach that can be used to decompose a time series into trend, oscillatory and noise components. Trend-following strategies rely on the principle that financial markets move in trends for extended periods of time. Moving Averages (MAs) are the standard indicator used to design such strategies. In this study, SSA is used as an alternative method to enhance trend resolution in comparison with the traditional MA, and new trading rules using SSA as the indicator are proposed. This paper shows that, for the Dow Jones Industrial Average (DJIA) and Shanghai Securities Composite Index (SSCI) time series, the SSA trading rules generally provided better results than MA trading rules.
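
    A minimal sketch of the SSA trend-extraction step (embed, truncate the SVD, diagonally average); the window length, rank and toy series are arbitrary choices, and the paper's actual trading rules are not reproduced here.

      import numpy as np

      def ssa_trend(x, window, rank=1):
          # Embed the series into a Hankel trajectory matrix, keep the
          # leading singular components, and reconstruct by averaging
          # over anti-diagonals.
          x = np.asarray(x, float)
          n = len(x)
          k = n - window + 1
          X = np.column_stack([x[i:i + window] for i in range(k)])
          U, s, Vt = np.linalg.svd(X, full_matrices=False)
          Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
          trend = np.zeros(n)
          counts = np.zeros(n)
          for i in range(window):
              for j in range(k):
                  trend[i + j] += Xr[i, j]
                  counts[i + j] += 1
          return trend / counts

      t = np.linspace(0, 10, 500)
      price = t + np.sin(3 * t) + np.random.default_rng(2).normal(0, .3, 500)
      trend = ssa_trend(price, window=60, rank=2)
      # A crossing rule (price vs. SSA trend) would then replace the usual
      # moving-average crossover.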

  9. The measurement of acoustic properties of limited size panels by use of a parametric source

    NASA Astrophysics Data System (ADS)

    Humphrey, V. F.

    1985-01-01

    A method of measuring the acoustic properties of limited size panels immersed in water, with a truncated parametric array used as the acoustic source, is described. The insertion loss and reflection loss of thin metallic panels, typically 0.45 m square, were measured at normal incidence by using this technique. Results were obtained for a wide range of frequencies (10 to 100 kHz) and were found to be in good agreement with the theoretical predictions for plane waves. Measurements were also made of the insertion loss of aluminium, Perspex and G.R.P. panels for angles of incidence up to 50°. The broad bandwidth available from the parametric source permitted detailed measurements to be made over a wide frequency range using a single transmitting transducer. The small spot sizes obtainable with the parametric source also helped to reduce the significance of diffraction from edges of the panel under test.

  10. Observational Signatures of Parametric Instability at 1AU

    NASA Astrophysics Data System (ADS)

    Bowen, T. A.; Bale, S. D.; Badman, S.

    2017-12-01

    Observations and simulations of inertial compressive turbulence in the solar wind are characterized by density structures anti-correlated with magnetic fluctuations parallel to the mean field. This signature has been interpreted as observational evidence for non-propagating pressure balanced structures (PBS), kinetic ion acoustic waves, as well as the MHD slow mode. Recent work, specifically Verscharen et al. (2017), has highlighted the unexpectedly fluid-like nature of the solar wind. Given the high damping rates of parallel propagating compressive fluctuations, their ubiquity in satellite observations is surprising and suggests the presence of a driving process. One possible candidate for the generation of compressive fluctuations in the solar wind is the parametric instability, in which large amplitude Alfvenic fluctuations decay into parallel propagating compressive waves. This work employs 10 years of WIND observations to test the parametric decay process as a source of compressive waves in the solar wind by comparing collisionless damping rates of compressive fluctuations with growth rates of the parametric instability. Preliminary results suggest that generation of compressive waves through parametric decay is overdamped at 1 AU. However, the higher parametric decay rates expected in the inner heliosphere likely allow for growth of the slow mode, the remnants of which could explain the density fluctuations observed at 1 AU.

  11. Efficient Verification of Holograms Using Mobile Augmented Reality.

    PubMed

    Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter

    2016-07-01

    Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobile devices still takes a long time and provides lower accuracy than inspection by humans using appropriate reference information. We aim to address these drawbacks with automatic matching combined with a special parametrization of an efficient, goal-oriented user interface that supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed, based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s, with better decisions regarding the evaluated samples than can be achieved by untrained users.

  12. Stochastic convergence of renewable energy consumption in OECD countries: a fractional integration approach.

    PubMed

    Solarin, Sakiru Adebola; Gil-Alana, Luis Alberiko; Al-Mulali, Usama

    2018-04-13

    In this article, we have examined the hypothesis of convergence of renewable energy consumption in 27 OECD countries. However, instead of relying on classical techniques, which are based on the dichotomy between stationarity I(0) and nonstationarity I(1), we consider a more flexible approach based on fractional integration. We employ both parametric and semiparametric techniques. Using parametric methods, evidence of convergence is found in the cases of Mexico, Switzerland and Sweden along with the USA, Portugal, the Czech Republic, South Korea and Spain, and employing semiparametric approaches, we found evidence of convergence in all these eight countries along with Australia, France, Japan, Greece, Italy and Poland. For the remaining 13 countries, even though the orders of integration of the series are smaller than one in all cases except Germany, the confidence intervals are so wide that we cannot reject the hypothesis of unit roots thus not finding support for the hypothesis of convergence.
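
    The core object in fractional integration is the operator (1 - L)^d with non-integer d; a minimal sketch of applying it to a series (not the estimation procedures used in the paper) follows.

      import numpy as np

      def frac_diff_weights(d, n):
          # Weights of (1 - L)^d from the recursion
          # w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k.
          w = np.empty(n)
          w[0] = 1.0
          for k in range(1, n):
              w[k] = w[k - 1] * (k - 1 - d) / k
          return w

      def frac_diff(x, d):
          # Apply (1 - L)^d; 0 < d < 1 captures "long memory" that the usual
          # I(0)/I(1) dichotomy would force to be either 0 or 1.
          x = np.asarray(x, float)
          w = frac_diff_weights(d, len(x))
          return np.array([w[:t + 1] @ x[t::-1] for t in range(len(x))])

      y = np.cumsum(np.random.default_rng(3).normal(size=200))
      print(frac_diff(y, 0.4)[:5])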

  13. Debt and growth: A non-parametric approach

    NASA Astrophysics Data System (ADS)

    Brida, Juan Gabriel; Gómez, David Matesanz; Seijas, Maria Nela

    2017-11-01

    In this study, we explore the dynamic relationship between public debt and economic growth by using a non-parametric approach based on data symbolization and clustering methods. The study uses annual data on the general government consolidated gross debt-to-GDP ratio and gross domestic product for sixteen countries between 1977 and 2015. Using symbolic sequences, we introduce a notion of distance between the dynamical paths of different countries. Then, a Minimal Spanning Tree and a Hierarchical Tree are constructed from the time series to help detect groups of countries sharing similar economic performance. The main finding of the study emerges for the period 2008-2016, when several countries surpassed the 90% debt-to-GDP threshold. During this period, three groups (clubs) of countries are obtained: high, mid and low indebted countries, suggesting that the employed debt-to-GDP threshold drives economic dynamics for the selected countries.
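
    A toy version of the pipeline (symbolize, compute pairwise distances, build the minimal spanning tree); the sign-based symbolization and random data are stand-ins for the paper's actual scheme.

      import numpy as np
      from scipy.sparse.csgraph import minimum_spanning_tree

      def symbolize(x, thresholds=(0.0,)):
          # Map a numeric series to integer symbols by thresholding.
          return np.digitize(x, thresholds)

      def symbolic_distance(a, b):
          # Normalized Hamming distance between two symbolic sequences.
          return np.mean(np.asarray(a) != np.asarray(b))

      rng = np.random.default_rng(4)
      series = rng.normal(0, 1, (3, 40))       # hypothetical growth series
      seqs = [symbolize(s) for s in series]
      D = np.array([[symbolic_distance(p, q) for q in seqs] for p in seqs])
      print(minimum_spanning_tree(D).toarray())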

  14. Parametric instability in the high power era of Advanced LIGO

    NASA Astrophysics Data System (ADS)

    Hardwick, Terra; Blair, Carl; Kennedy, Ross; Evans, Matthew; Fritschel, Peter; LIGO Virgo Scientific Collaboration

    2017-01-01

    After the first direct detections of gravitational waves, Advanced LIGO aims to increase its detection rate during the upcoming science runs through a series of detector improvements, including increased optical power. Higher circulating power increases the likelihood for three-mode parametric instabilities (PIs), in which mechanical modes of the mirrors scatter light into higher-order optical modes in the cavity and the resulting optical modes reinforce the mechanical modes via radiation pressure. Currently, LIGO uses two PI mitigation methods: thermal tuning to change the cavity g-factor and effectively decrease the frequency overlap between mechanical and optical modes, and active damping of mechanical modes with electrostatic actuation. While the combined methods provide stability at the current operating power, there is evidence that these will be insufficient for the next planned power increase; future suppression methods including acoustic mode dampers and dynamic g-factor modulation are discussed.

  15. Statistical evaluation of rainfall time series in concurrence with agriculture and water resources of Ken River basin, Central India (1901-2010)

    NASA Astrophysics Data System (ADS)

    Meshram, Sarita Gajbhiye; Singh, Sudhir Kumar; Meshram, Chandrashekhar; Deo, Ravinesh C.; Ambade, Balram

    2017-12-01

    Trend analysis of long-term rainfall records can be used to facilitate better agricultural water management decisions and climate risk studies. The main objective of this study was to identify the existing trends in the long-term rainfall time series over the period 1901-2010, utilizing 12 hydrological stations located in the Ken River basin (KRB) in Madhya Pradesh, India. To investigate the different trends, the rainfall time series data were divided into annual and seasonal (i.e., pre-monsoon, monsoon, post-monsoon, and winter season) sub-sets, and a statistical analysis of the data using the non-parametric Mann-Kendall (MK) test and Sen's slope approach was applied to identify the nature of the existing trends in the rainfall series for the Ken River basin. The obtained results were further interpolated with the aid of the Quantum Geographic Information System (QGIS) employing inverse distance weighting. The results showed that the monsoon and the winter season exhibited a negative trend in rainfall changes over the period of study, and this was true for all stations, although the changes during the pre- and post-monsoon seasons were less significant. The outcomes of this research also suggest significant decreases in the seasonal and annual trends of rainfall amounts in the study period. These findings, showing a clear signature of climate change impacts on the KRB region, potentially have implications for climate risk management strategies to be developed during major growing and harvesting seasons, and also for the appropriate water resource management strategies that must be implemented in the decision-making process.
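
    For reference, compact implementations of the two tests named above (ignoring tie corrections, which a production analysis should include; the rainfall values are invented):

      import numpy as np
      from scipy.stats import norm

      def mann_kendall(x):
          # Mann-Kendall S statistic and two-sided p-value (no tie correction).
          x = np.asarray(x, float)
          n = len(x)
          s = sum(np.sign(x[j] - x[i])
                  for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
          return s, 2 * (1 - norm.cdf(abs(z)))

      def sens_slope(x):
          # Sen's slope: the median of all pairwise slopes.
          x = np.asarray(x, float)
          n = len(x)
          return np.median([(x[j] - x[i]) / (j - i)
                            for i in range(n - 1) for j in range(i + 1, n)])

      rain = np.array([810, 742, 905, 688, 654, 701, 620, 633, 580, 602.0])
      print(mann_kendall(rain), sens_slope(rain))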

  16. Linking the Weather Generator with Regional Climate Model: Effect of Higher Resolution

    NASA Astrophysics Data System (ADS)

    Dubrovsky, Martin; Huth, Radan; Farda, Ales; Skalak, Petr

    2014-05-01

    This contribution builds on our last year EGU contribution, which followed two aims: (i) validation of the simulations of the present climate made by the ALADIN-Climate Regional Climate Model (RCM) at 25 km resolution, and (ii) presenting a methodology for linking the parametric weather generator (WG) with RCM output (aiming to calibrate a gridded WG capable of producing realistic synthetic multivariate weather series for weather-ungauged locations). Now we have available new higher-resolution (6.25 km) simulations with the same RCM. The main topic of this contribution is an anser to a following question: What is an effect of using a higher spatial resolution on a quality of simulating the surface weather characteristics? In the first part, the high resolution RCM simulation of the present climate will be validated in terms of selected WG parameters, which are derived from the RCM-simulated surface weather series and compared to those derived from weather series observed in 125 Czech meteorological stations. The set of WG parameters will include statistics of the surface temperature and precipitation series. When comparing the WG parameters from the two sources (RCM vs observations), we interpolate the RCM-based parameters into the station locations while accounting for the effect of altitude. In the second part, we will discuss an effect of using the higher resolution: the results of the validation tests will be compared with those obtained with the lower-resolution RCM. Acknowledgements: The present experiment is made within the frame of projects ALARO-Climate (project P209/11/2405 sponsored by the Czech Science Foundation), WG4VALUE (project LD12029 sponsored by the Ministry of Education, Youth and Sports of CR) and VALUE (COST ES 1102 action).

  17. Total recognition discriminability in Huntington's and Alzheimer's disease.

    PubMed

    Graves, Lisa V; Holden, Heather M; Delano-Wood, Lisa; Bondi, Mark W; Woods, Steven Paul; Corey-Bloom, Jody; Salmon, David P; Delis, Dean C; Gilbert, Paul E

    2017-03-01

    Both the original and second editions of the California Verbal Learning Test (CVLT) provide an index of total recognition discriminability (TRD) but respectively utilize nonparametric and parametric formulas to compute the index. However, the degree to which population differences in TRD may vary across applications of these nonparametric and parametric formulas has not been explored. We evaluated individuals with Huntington's disease (HD), individuals with Alzheimer's disease (AD), healthy middle-aged adults, and healthy older adults who were administered the CVLT-II. Yes/no recognition memory indices were generated, including raw nonparametric TRD scores (as used in CVLT-I) and raw and standardized parametric TRD scores (as used in CVLT-II), as well as false positive (FP) rates. Overall, the patient groups had significantly lower TRD scores than their comparison groups. The application of nonparametric and parametric formulas resulted in comparable effect sizes for all group comparisons on raw TRD scores. Relative to the HD group, the AD group showed comparable standardized parametric TRD scores (despite lower raw nonparametric and parametric TRD scores), whereas the previous CVLT literature has shown that standardized TRD scores are lower in AD than in HD. Possible explanations for the similarity in standardized parametric TRD scores in the HD and AD groups in the present study are discussed, with an emphasis on the importance of evaluating TRD scores in the context of other indices such as FP rates in an effort to fully capture recognition memory function using the CVLT-II.
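
    The CVLT scoring formulas themselves are not given in this record; as a generic illustration of the parametric/nonparametric contrast for discriminability, here are the standard signal-detection d' and a common nonparametric analogue A' (hit and false-positive rates are hypothetical):

      from scipy.stats import norm

      def d_prime(hit_rate, fa_rate):
          # Parametric discriminability: distance between the z-transformed
          # hit and false-positive rates.
          return norm.ppf(hit_rate) - norm.ppf(fa_rate)

      def a_prime(hit_rate, fa_rate):
          # Pollack and Norman's nonparametric A' (valid for H >= F).
          h, f = hit_rate, fa_rate
          return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))

      print(d_prime(0.85, 0.20), a_prime(0.85, 0.20))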

  18. Gallium arsenide (GaAs) solar cell modeling studies

    NASA Technical Reports Server (NTRS)

    Heinbockel, J. H.

    1980-01-01

    Various models were constructed to allow for the variation of system components. Computer studies were then performed using these models in order to study the effects of various system changes. In particular, GaAs and Si flat plate solar power arrays were studied and compared. Series and shunt resistance models were constructed, and models for the chemical kinetics of the annealing process were prepared. For all models constructed, various parametric studies were performed.

  19. Accelerating Imitation Learning in Relational Domains via Transfer by Initialization

    DTIC Science & Technology

    2013-08-28

    Warcraft, regulation of traffic lights, logistics, and a variety of planning domains. A supervised learning method for imitation learning was recently ... some information about the world (traffic density at a signal, distance to the goal, etc.). We assume a functional parametrization over the policy and ... strategy (RTS) game engine written in C, based off the Warcraft series of games. Like all RTS games, it allows multiple agents to be controlled

  20. Development of the Complex General Linear Model in the Fourier Domain: Application to fMRI Multiple Input-Output Evoked Responses for Single Subjects

    PubMed Central

    Rio, Daniel E.; Rawlings, Robert R.; Woltz, Lawrence A.; Gilman, Jodi; Hommer, Daniel W.

    2013-01-01

    A linear time-invariant model based on statistical time series analysis in the Fourier domain for single subjects is further developed and applied to functional MRI (fMRI) blood-oxygen level-dependent (BOLD) multivariate data. This methodology was originally developed to analyze multiple stimulus input evoked response BOLD data. However, to analyze clinical data generated using a repeated measures experimental design, the model has been extended to handle multivariate time series data and demonstrated on control and alcoholic subjects taken from data previously analyzed in the temporal domain. Analysis of BOLD data is typically carried out in the time domain where the data has a high temporal correlation. These analyses generally employ parametric models of the hemodynamic response function (HRF) where prewhitening of the data is attempted using autoregressive (AR) models for the noise. However, this data can be analyzed in the Fourier domain. Here, assumptions made on the noise structure are less restrictive, and hypothesis tests can be constructed based on voxel-specific nonparametric estimates of the hemodynamic transfer function (HRF in the Fourier domain). This is especially important for experimental designs involving multiple states (either stimulus or drug induced) that may alter the form of the response function. PMID:23840281
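
    A voxel-wise nonparametric transfer-function estimate of the kind alluded to can be sketched as the cross-spectrum of stimulus and response divided by the stimulus power spectrum; the synthetic stimulus, response shape and noise level below are purely illustrative.

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(5)
      fs = 0.5                                   # Hz, i.e. TR = 2 s
      stim = (rng.random(512) < 0.1).astype(float)
      hrf = signal.windows.gaussian(15, 2.5)     # stand-in response shape
      bold = np.convolve(stim, hrf, mode="same") + rng.normal(0, .2, 512)

      f, Pxx = signal.welch(stim, fs=fs, nperseg=128)
      f, Pxy = signal.csd(stim, bold, fs=fs, nperseg=128)
      H = Pxy / Pxx        # estimate of the transfer function H(f)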

  2. A note on the correlation between circular and linear variables with an application to wind direction and air temperature data in a Mediterranean climate

    NASA Astrophysics Data System (ADS)

    Lototzis, M.; Papadopoulos, G. K.; Droulia, F.; Tseliou, A.; Tsiros, I. X.

    2018-04-01

    There are several cases where a circular variable is associated with a linear one. A typical example is wind direction, which is often associated with linear quantities such as air temperature and air humidity. A statistical relationship of this kind can be tested with both parametric and non-parametric methods, each of which has its own advantages and drawbacks. This work deals with correlation analysis using both the parametric and the non-parametric procedure on a small set of meteorological data of air temperature and wind direction during a summer period in a Mediterranean climate. Correlations were examined between hourly, daily and maximum-prevailing values, under typical and non-typical meteorological conditions. Both tests indicated a strong correlation between mean hourly wind direction and mean hourly air temperature, whereas mean daily wind direction and mean daily air temperature do not seem to be correlated. In some cases, however, the two procedures were found to give quite dissimilar significance levels for the rejection or non-rejection of the null hypothesis of no correlation. The simple statistical analysis presented in this study, appropriately extended to large sets of meteorological data, may be a useful tool for estimating the effects of wind in local climate studies.
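
    One standard parametric option for this problem is Mardia's circular-linear correlation, which embeds the angle through its sine and cosine; a small sketch (with simulated wind directions and temperatures) follows.

      import numpy as np

      def circular_linear_corr(theta, x):
          # Mardia's correlation between an angle (radians) and a linear
          # variable, built from corr(x, cos), corr(x, sin), corr(cos, sin).
          rxc = np.corrcoef(x, np.cos(theta))[0, 1]
          rxs = np.corrcoef(x, np.sin(theta))[0, 1]
          rcs = np.corrcoef(np.cos(theta), np.sin(theta))[0, 1]
          r2 = (rxc**2 + rxs**2 - 2 * rxc * rxs * rcs) / (1 - rcs**2)
          return np.sqrt(r2)

      rng = np.random.default_rng(6)
      wind_dir = rng.uniform(0, 2 * np.pi, 200)
      temp = 25 + 3 * np.cos(wind_dir - 1.0) + rng.normal(0, 1, 200)
      print(circular_linear_corr(wind_dir, temp))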

  3. Testing the cosmic conservation of photon number with type Ia supernovae and ages of old objects

    NASA Astrophysics Data System (ADS)

    Jesus, J. F.; Holanda, R. F. L.; Dantas, M. A.

    2017-12-01

    In this paper, we obtain luminosity distances by using ages of 32 old passive galaxies distributed over the redshift interval 0.11 < z < 1.84 and test the cosmic conservation of photon number by comparing them with 580 distance moduli of type Ia supernovae (SNe Ia) from the so-called Union 2.1 compilation. Our analyses are based on the fact that the method of obtaining ages of galaxies relies on the detailed shape of galaxy spectra but not on galaxy luminosity. Possible departures from cosmic conservation of photon number are parametrized by τ(z) = 2εz and τ(z) = εz/(1+z) (for ε = 0 the conservation of photon number is recovered). We find ε = 0.016^{+0.078}_{-0.075} for the first parametrization and ε = -0.18^{+0.25}_{-0.24} for the second, both limits at 95% c.l. In this way, no significant departure from cosmic conservation of photon number is verified. In addition, by considering the total age as inferred from the Planck (2015) analysis, we find incubation times t_inc = 1.66 ± 0.29 Gyr and t_inc = 1.23 ± 0.27 Gyr at 68% c.l. for the two parametrizations, respectively.
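
    Under the usual reading of cosmic opacity, photon loss dims the observed flux by exp(-τ), which shifts the SN Ia distance modulus by 2.5 log10(e) τ(z); a sketch of the two parametrizations under that assumption:

      import numpy as np

      def tau_linear(z, eps):       # first parametrization: tau = 2*eps*z
          return 2.0 * eps * z

      def tau_saturating(z, eps):   # second: tau = eps*z/(1+z)
          return eps * z / (1.0 + z)

      def mu_shift(tau):
          # Flux dimmed by exp(-tau) => distance modulus shifted by
          # 2.5 * log10(e) * tau.
          return 2.5 * np.log10(np.e) * tau

      z = np.array([0.11, 0.5, 1.0, 1.84])
      print(mu_shift(tau_linear(z, 0.016)))
      print(mu_shift(tau_saturating(z, -0.18)))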

  4. 4D Cone-beam CT reconstruction using a motion model based on principal component analysis

    PubMed Central

    Staub, David; Docef, Alen; Brock, Robert S.; Vaman, Constantin; Murphy, Martin J.

    2011-01-01

    Purpose: To provide a proof of concept validation of a novel 4D cone-beam CT (4DCBCT) reconstruction algorithm and to determine the best methods to train and optimize the algorithm. Methods: The algorithm animates a patient fan-beam CT (FBCT) with a patient-specific parametric motion model in order to generate a time series of deformed CTs (the reconstructed 4DCBCT) that track the motion of the patient anatomy on a voxel-by-voxel scale. The motion model is constrained by requiring that projections cast through the deformed CT time series match the projections of the raw patient 4DCBCT. The motion model uses a basis of eigenvectors that are generated via principal component analysis (PCA) of a training set of displacement vector fields (DVFs) that approximate patient motion. The eigenvectors are weighted by a parameterized function of the patient breathing trace recorded during 4DCBCT. The algorithm is demonstrated and tested via numerical simulation. Results: The algorithm is shown to produce accurate reconstruction results for the most complicated simulated motion, in which voxels move with a pseudo-periodic pattern and relative phase shifts exist between voxels. The tests show that principal component eigenvectors trained on DVFs from a novel 2D/3D registration method give substantially better results than eigenvectors trained on DVFs obtained by conventionally registering 4DCBCT phases reconstructed via filtered backprojection. Conclusions: Proof of concept testing has validated the 4DCBCT reconstruction approach for the types of simulated data considered. In addition, the authors found the 2D/3D registration approach to be the best choice for generating the DVF training set, and the Nelder-Mead simplex algorithm the most robust optimization routine. PMID:22149852
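
    The PCA step of such a motion model is straightforward to sketch: eigenvectors of mean-centered training DVFs, with a deformation state expressed as the mean plus a weighted sum of modes (dimensions and weights below are hypothetical).

      import numpy as np

      def pca_motion_model(dvfs, n_modes=2):
          # dvfs: (n_samples, n_dofs), each row a flattened DVF.
          mean = dvfs.mean(axis=0)
          U, s, Vt = np.linalg.svd(dvfs - mean, full_matrices=False)
          return mean, Vt[:n_modes]

      def synthesize_dvf(mean, modes, weights):
          # The weights are what the reconstruction would tune so that
          # projections through the deformed CT match the measured ones.
          return mean + np.asarray(weights) @ modes

      rng = np.random.default_rng(7)
      training = rng.normal(0, 1, (10, 3000))   # 10 toy training DVFs
      mean, modes = pca_motion_model(training)
      dvf = synthesize_dvf(mean, modes, [0.7, -0.2])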

  5. Development and fabrication of S-band chip varactor parametric amplifier

    NASA Technical Reports Server (NTRS)

    Kramer, E.

    1974-01-01

    A noncryogenic, S-band parametric amplifier operating in the 2.2 to 2.3 GHz band and having an average input noise temperature of less than 30 K was built and tested. The parametric amplifier module occupies a volume of less than 1-1/4 cubic feet and weighs less than 60 pounds. The module is designed for use in various NASA ground stations to replace larger, more complex cryogenic units which require considerably more maintenance because of the cryogenic refrigeration system employed. The amplifier can be located up to 15 feet from the power supply unit. Optimum performance was achieved through the use of high-quality unpackaged (chip) varactors in the amplifier design.

  6. Parametric study of extended end-plate connection using finite element modeling

    NASA Astrophysics Data System (ADS)

    Mureşan, Ioana Cristina; Bâlc, Roxana

    2017-07-01

    End-plate connections with preloaded high-strength bolts represent a convenient, fast and accurate solution for beam-to-column joints. The behavior of frame joints built up with this type of connection is sensitively dependent on the geometrical and material characteristics of the connected elements. This paper presents the results of parametric analyses of the behavior of a bolted extended end-plate connection using the finite element modeling program Abaqus. This connection was experimentally tested in the laboratory of the Faculty of Civil Engineering in Cluj-Napoca, and the results are briefly reviewed in this paper. The numerical model of the studied connection was described in detail in [1] and provides data for this parametric study.

  7. Bayesian non-parametric inference for stochastic epidemic models using Gaussian Processes.

    PubMed

    Xu, Xiaoguang; Kypraios, Theodore; O'Neill, Philip D

    2016-10-01

    This paper considers novel Bayesian non-parametric methods for stochastic epidemic models. Many standard modeling and data analysis methods use underlying assumptions (e.g. concerning the rate at which new cases of disease will occur) which are rarely challenged or tested in practice. To relax these assumptions, we develop a Bayesian non-parametric approach using Gaussian Processes, specifically to estimate the infection process. The methods are illustrated with both simulated and real data sets, the former illustrating that the methods can recover the true infection process quite well in practice, and the latter illustrating that the methods can be successfully applied in different settings. © The Author 2016. Published by Oxford University Press.
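
    A generic GP-regression sketch (using scikit-learn, and simulated incidence-like data rather than the paper's epidemic models) of the kind of nonparametric function estimate involved:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(12)
      t = np.linspace(0, 10, 40)[:, None]          # observation times
      rate = np.exp(-0.5 * (t.ravel() - 4) ** 2)   # true rate shape (toy)
      y = rate + rng.normal(0, 0.05, t.shape[0])   # noisy observations

      gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                    normalize_y=True)
      gp.fit(t, y)
      mean, std = gp.predict(t, return_std=True)   # estimate + uncertainty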

  8. A non-parametric consistency test of the ΛCDM model with Planck CMB data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aghamousa, Amir; Shafieloo, Arman; Hamann, Jan, E-mail: amir@aghamousa.com, E-mail: jan.hamann@unsw.edu.au, E-mail: shafieloo@kasi.re.kr

    Non-parametric reconstruction methods, such as Gaussian process (GP) regression, provide a model-independent way of estimating an underlying function and its uncertainty from noisy data. We demonstrate how GP reconstruction can be used as a consistency test between a given data set and a specific model by looking for structures in the residuals of the data with respect to the model's best-fit. Applying this formalism to the Planck temperature and polarisation power spectrum measurements, we test their global consistency with the predictions of the base ΛCDM model. Our results do not show any serious inconsistencies, lending further support to the interpretation of the base ΛCDM model as cosmology's gold standard.

  9. Impact Response Comparison Between Parametric Human Models and Postmortem Human Subjects with a Wide Range of Obesity Levels.

    PubMed

    Zhang, Kai; Cao, Libo; Wang, Yulong; Hwang, Eunjoo; Reed, Matthew P; Forman, Jason; Hu, Jingwen

    2017-10-01

    Field data analyses have shown that obesity significantly increases the occupant injury risks in motor vehicle crashes, but the injury assessment tools for people with obesity are largely lacking. The objectives of this study were to use a mesh morphing method to rapidly generate parametric finite element models with a wide range of obesity levels and to evaluate their biofidelity against impact tests using postmortem human subjects (PMHS). Frontal crash tests using three PMHS seated in a vehicle rear seat compartment with body mass index (BMI) from 24 to 40 kg/m² were selected. To develop the human models matching the PMHS geometry, statistical models of external body shape, rib cage, pelvis, and femur were applied to predict the target geometry using age, sex, stature, and BMI. A mesh morphing method based on radial basis functions was used to rapidly morph a baseline human model into the target geometry. The model-predicted body excursions and injury measures were compared to the PMHS tests. Comparisons of occupant kinematics and injury measures between the tests and simulations showed reasonable correlations across the wide range of BMI levels. The parametric human models have the capability to account for the obesity effects on the occupant impact responses and injury risks. © 2017 The Obesity Society.
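
    A toy version of RBF-based mesh morphing (not the authors' pipeline): interpolate landmark displacements with radial basis functions and apply the field to all nodes. Assumes scipy's RBFInterpolator; the geometry is random.

      import numpy as np
      from scipy.interpolate import RBFInterpolator

      def morph_mesh(nodes, landmarks_src, landmarks_dst):
          # Interpolate landmark displacements over space, then move
          # every node of the baseline mesh by the interpolated field.
          disp = RBFInterpolator(landmarks_src,
                                 landmarks_dst - landmarks_src,
                                 kernel="thin_plate_spline")
          return nodes + disp(nodes)

      rng = np.random.default_rng(8)
      nodes = rng.normal(0, 1, (100, 3))   # baseline mesh nodes
      src = rng.normal(0, 1, (5, 3))       # landmarks on the baseline
      dst = src * 1.2                      # target geometry (e.g. higher BMI)
      print(morph_mesh(nodes, src, dst).shape)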

  10. Intratumor distribution and test-retest comparisons of physiological parameters quantified by dynamic contrast-enhanced MRI in rat U251 glioma.

    PubMed

    Aryal, Madhava P; Nagaraja, Tavarekere N; Brown, Stephen L; Lu, Mei; Bagher-Ebadian, Hassan; Ding, Guangliang; Panda, Swayamprava; Keenan, Kelly; Cabral, Glauber; Mikkelsen, Tom; Ewing, James R

    2014-10-01

    The distribution of dynamic contrast-enhanced MRI (DCE-MRI) parametric estimates in a rat U251 glioma model was analyzed. Using Magnevist as contrast agent (CA), 17 nude rats implanted with U251 cerebral glioma were studied by DCE-MRI twice in a 24 h interval. A data-driven analysis selected one of three models to estimate either (1) plasma volume (vp), (2) vp and forward volume transfer constant (K(trans)) or (3) vp, K(trans) and interstitial volume fraction (ve), constituting Models 1, 2 and 3, respectively. CA distribution volume (VD) was estimated in Model 3 regions by Logan plots. Regions of interest (ROIs) were selected by model. In the Model 3 ROI, descriptors of parameter distributions--mean, median, variance and skewness--were calculated and compared between the two time points for repeatability. All distributions of parametric estimates in Model 3 ROIs were positively skewed. Test-retest differences between population summaries for any parameter were not significant (p ≥ 0.10; Wilcoxon signed-rank and paired t tests). These and similar measures of parametric distribution and test-retest variance from other tumor models can be used to inform the choice of biomarkers that best summarize tumor status and treatment effects. Copyright © 2014 John Wiley & Sons, Ltd.
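
    For the Logan step, a minimal sketch: after a linearity onset time t*, the plot of the integrated tissue curve over the instantaneous tissue curve against the analogous plasma term becomes linear with slope VD (the curves and t* below are hypothetical).

      import numpy as np

      def logan_vd(t, c_tissue, c_plasma, t_star=10.0):
          # Trapezoidal running integrals of both curves.
          t, ct, cp = (np.asarray(v, float) for v in (t, c_tissue, c_plasma))
          int_t = np.concatenate(
              [[0], np.cumsum(np.diff(t) * 0.5 * (ct[1:] + ct[:-1]))])
          int_p = np.concatenate(
              [[0], np.cumsum(np.diff(t) * 0.5 * (cp[1:] + cp[:-1]))])
          late = t >= t_star
          x = int_p[late] / ct[late]
          y = int_t[late] / ct[late]
          slope, _ = np.polyfit(x, y, 1)   # slope estimates VD
          return slope

      t = np.linspace(0.1, 60, 120)
      c_plasma = np.exp(-t / 8.0)
      c_tissue = 1.5 * (np.exp(-t / 30.0) - np.exp(-t / 6.0))
      print(logan_vd(t, c_tissue, c_plasma))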

  11. SPM analysis of parametric (R)-[11C]PK11195 binding images: plasma input versus reference tissue parametric methods.

    PubMed

    Schuitemaker, Alie; van Berckel, Bart N M; Kropholler, Marc A; Veltman, Dick J; Scheltens, Philip; Jonker, Cees; Lammertsma, Adriaan A; Boellaard, Ronald

    2007-05-01

    (R)-[11C]PK11195 has been used for quantifying cerebral microglial activation in vivo. In previous studies, both plasma input and reference tissue methods have been used, usually in combination with a region of interest (ROI) approach. Definition of ROIs, however, can be laborious and prone to interobserver variation. In addition, results are only obtained for predefined areas and (unexpected) signals in undefined areas may be missed. On the other hand, standard pharmacokinetic models are too sensitive to noise to calculate (R)-[11C]PK11195 binding on a voxel-by-voxel basis. Linearised versions of both plasma input and reference tissue models have been described, and these are more suitable for parametric imaging. The purpose of this study was to compare the performance of these plasma input and reference tissue parametric methods on the outcome of statistical parametric mapping (SPM) analysis of (R)-[11C]PK11195 binding. Dynamic (R)-[11C]PK11195 PET scans with arterial blood sampling were performed in 7 younger and 11 elderly healthy subjects. Parametric images of volume of distribution (Vd) and binding potential (BP) were generated using linearised versions of plasma input (Logan) and reference tissue (Reference Parametric Mapping) models. Images were compared at the group level using SPM with a two-sample t-test per voxel, both with and without proportional scaling. Parametric BP images without scaling provided the most sensitive framework for determining differences in (R)-[11C]PK11195 binding between younger and elderly subjects. Vd images could only demonstrate differences in (R)-[11C]PK11195 binding when analysed with proportional scaling due to intersubject variation in K1/k2 (blood-brain barrier transport and non-specific binding).

  12. Can color-coded parametric maps improve dynamic enhancement pattern analysis in MR mammography?

    PubMed

    Baltzer, P A; Dietzel, M; Vag, T; Beger, S; Freiberg, C; Herzog, A B; Gajda, M; Camara, O; Kaiser, W A

    2010-03-01

    Post-contrast enhancement characteristics (PEC) are a major criterion for differential diagnosis in MR mammography (MRM). Manual placement of regions of interest (ROIs) to obtain time/signal intensity curves (TSIC) is the standard approach to assess dynamic enhancement data. Computers can automatically calculate the TSIC in every lesion voxel and combine this data to form one color-coded parametric map (CCPM). Thus, the TSIC of the whole lesion can be assessed. This investigation was conducted to compare the diagnostic accuracy (DA) of CCPM with TSIC for the assessment of PEC. 329 consecutive patients with 469 histologically verified lesions were examined. MRM was performed according to a standard protocol (1.5 T, 0.1 mmol/kg bw Gd-DTPA). ROIs were drawn manually within any lesion to calculate the TSIC. CCPMs were created in all patients using dedicated software (CAD Sciences). Both methods were rated by 2 observers in consensus on an ordinal scale. Receiver operating characteristics (ROC) analysis was used to compare both methods. The area under the curve (AUC) was significantly (p=0.026) higher for CCPM (0.829) than TSIC (0.749). The sensitivity was 88.5% (CCPM) vs. 82.8% (TSIC), whereas equal specificity levels were found (CCPM: 63.7%, TSIC: 63.0%). The color-coded parametric maps (CCPMs) showed a significantly higher DA compared to TSIC, in particular the sensitivity could be increased. Therefore, the CCPM method is a feasible approach to assessing dynamic data in MRM and condenses several imaging series into one parametric map. © Georg Thieme Verlag KG Stuttgart · New York.

  13. Parametric Interactions between Alfven waves in LaPD

    NASA Astrophysics Data System (ADS)

    Brugman, B.; Carter, T. A.; Cowley, S. C.; Pribyl, P.; Lybarger, W.

    2004-11-01

    The physics governing interactions between large amplitude Alfvén waves, which are relevant to plasmas in space as well as in the laboratory, is at present not well understood. A major class of such interactions, believed to occur in compressible plasmas, is referred to as parametric decay. We will present the results of a series of experiments on the interactions of large amplitude LHP Alfvén waves conducted on the Large Plasma Device (LaPD), where β ≪ 1, n ≈ 10^12 cm^-3 and B0 ∈ (200, 2500) G. These experiments show strong signs of one form of parametric decay, known as the modulational instability, which represents the interaction of two Alfvén waves and a low frequency density perturbation. This interaction is believed to occur in plasmas with β < 1 as well as β > 1, over a broad range of wavevector space, and for RHP as well as LHP Alfvén waves, distinguishing it from the beat and decay instabilities. Details of this interaction, in particular the structure of the incident waves as well as that of their byproducts, will be shown in physical as well as wavevector space. The generation of large amplitude waves using both an Alfvén wave MASER and high current loop antennas will also be illustrated. Lastly, theoretical descriptions of parametric decay will be presented and compared to observations. Future work will also include comparisons of experimental results with applicable simulations, such as GS2. Work supported by DOE grant number DE-FG03-02ER54688.

  14. Application of Taguchi methods to dual mixture ratio propulsion system optimization for SSTO vehicles

    NASA Technical Reports Server (NTRS)

    Stanley, Douglas O.; Unal, Resit; Joyner, C. R.

    1992-01-01

    The application of advanced technologies to future launch vehicle designs would allow the introduction of a rocket-powered, single-stage-to-orbit (SSTO) launch system early in the next century. For a selected SSTO concept, a dual mixture ratio, staged combustion cycle engine that employs a number of innovative technologies was selected as the baseline propulsion system. A series of parametric trade studies are presented to optimize both a dual mixture ratio engine and a single mixture ratio engine of similar design and technology level. The effect of varying lift-off thrust-to-weight ratio, engine mode transition Mach number, mixture ratios, area ratios, and chamber pressure values on overall vehicle weight is examined. The sensitivity of the advanced SSTO vehicle to variations in each of these parameters is presented, taking into account the interaction of each of the parameters with each other. This parametric optimization and sensitivity study employs a Taguchi design method. The Taguchi method is an efficient approach for determining near-optimum design parameters using orthogonal matrices from design of experiments (DOE) theory. Using orthogonal matrices significantly reduces the number of experimental configurations to be studied. The effectiveness and limitations of the Taguchi method for propulsion/vehicle optimization studies as compared to traditional single-variable parametric trade studies are also discussed.
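
    The heart of the Taguchi approach is the orthogonal array; a minimal sketch with the L4(2^3) array and main-effect estimates (the factors and responses are hypothetical stand-ins for the engine parameters and vehicle weight):

      import numpy as np

      # L4(2^3): 4 runs probe 3 two-level factors instead of 2**3 = 8 runs;
      # every pair of columns contains each level combination exactly once.
      L4 = np.array([[0, 0, 0],
                     [0, 1, 1],
                     [1, 0, 1],
                     [1, 1, 0]])

      def main_effects(design, response):
          # Difference between the mean response at the high and low level
          # of each factor.
          return [response[design[:, f] == 1].mean()
                  - response[design[:, f] == 0].mean()
                  for f in range(design.shape[1])]

      weights = np.array([102.0, 98.0, 95.0, 99.0])  # hypothetical outcomes
      print(main_effects(L4, weights))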

  15. Nonparametric functional data estimation applied to ozone data: prediction and extreme value analysis.

    PubMed

    Quintela-del-Río, Alejandro; Francisco-Fernández, Mario

    2011-02-01

    The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists of fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
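
    The classical parametric baseline mentioned above can be sketched in a few lines: fit a GEV to block maxima and read off a return level (the data here are simulated, not AURN observations).

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(9)
      maxima = genextreme.rvs(c=-0.1, loc=80, scale=10, size=50,
                              random_state=rng)   # toy annual ozone maxima

      c, loc, scale = genextreme.fit(maxima)
      T = 20                                       # return period in years
      return_level = genextreme.ppf(1 - 1.0 / T, c, loc=loc, scale=scale)
      print(return_level)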

  16. Fabrication and Testing of Microfluidic Optomechanical Oscillators

    PubMed Central

    Han, Kewen; Kim, Kyu Hyun; Kim, Junhwan; Lee, Wonsuk; Liu, Jing; Fan, Xudong; Carmon, Tal; Bahl, Gaurav

    2014-01-01

    Cavity optomechanics experiments that parametrically couple the phonon modes and photon modes have been investigated in various optical systems including microresonators. However, because of the increased acoustic radiative losses during direct liquid immersion of optomechanical devices, almost all published optomechanical experiments have been performed in solid phase. This paper discusses a recently introduced hollow microfluidic optomechanical resonator. Detailed methodology is provided to fabricate these ultra-high-Q microfluidic resonators, perform optomechanical testing, and measure radiation pressure-driven breathing mode and SBS-driven whispering gallery mode parametric vibrations. By confining liquids inside the capillary resonator, high mechanical- and optical- quality factors are simultaneously maintained. PMID:24962013

  17. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    PubMed

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic networks events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and particular case of the previous model). For that, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two estimation methods of their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: Chi-square test and discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that those probabilistic models can be useful to describe the road accident blackspots datasets analyzed. Copyright © 2014 Elsevier Ltd. All rights reserved.
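
    One common way to build such a discrete model (a plausible sketch, not necessarily the paper's genesis) is to difference the survival function of the continuous Lomax; fitting by maximum likelihood is then routine:

      import numpy as np
      from scipy.optimize import minimize

      def dlomax_pmf(k, alpha, sigma):
          # P(X = k) = S(k) - S(k + 1), with S(x) = (1 + x/sigma)**(-alpha).
          S = lambda x: (1.0 + x / sigma) ** (-alpha)
          return S(k) - S(k + 1)

      def fit_dlomax(counts):
          # Maximize the log-likelihood over log-parameters (keeps both > 0).
          nll = lambda p: -np.sum(np.log(
              dlomax_pmf(counts, np.exp(p[0]), np.exp(p[1])) + 1e-12))
          res = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
          return np.exp(res.x)   # (alpha, sigma)

      crashes = np.array([0, 1, 1, 2, 0, 3, 5, 1, 0, 2, 8, 1])  # toy counts
      print(fit_dlomax(crashes))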

  18. Parametric versus Cox's model: an illustrative analysis of divorce in Canada.

    PubMed

    Balakrishnan, T R; Rao, K V; Krotki, K J; Lapierre-adamcyk, E

    1988-06-01

    Recent demographic literature clearly recognizes the importance of survival models in the analysis of cross-sectional event histories. Of the various survival models, Cox's (1972) partially parametric model has been very popular due to its simplicity and readily available computer software for estimation, sometimes at the cost of precision and parsimony. This paper focuses on parametric failure time models for event history analysis, such as the Weibull, lognormal, log-logistic, and exponential models. The authors also test the goodness of fit of these parametric models against Cox's proportional hazards model, taking the Kaplan-Meier estimate as the base. As an illustration, the authors reanalyze the Canadian Fertility Survey data on first marriage dissolution with parametric models. Though the parametric model estimates were not very different from each other, the log-logistic model seemed to give a slightly better fit. When 8 covariates were used in the analysis, the coefficients were similar across the models, and the overall conclusions about the relative risks would not have been different. The findings reveal that in marriage dissolution, the differences according to demographic and socioeconomic characteristics may be far more important than is generally found in many studies. Therefore, one should not treat the population as homogeneous when analyzing survival probabilities of marriages, other than for cursory analysis of overall trends.
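
    As a sketch of the parametric side of the comparison, a Weibull failure time model with right censoring can be fitted by maximum likelihood in a few lines (durations and censoring flags are invented):

      import numpy as np
      from scipy.optimize import minimize

      def weibull_nll(params, t, event):
          # Log-likelihood: log hazard for observed events plus log survival
          # for everyone; parameters enter through exp() to stay positive.
          shape, scale = np.exp(params)
          z = t / scale
          log_h = np.log(shape / scale) + (shape - 1) * np.log(z)
          log_S = -z ** shape
          return -np.sum(event * log_h + log_S)

      t = np.array([3.0, 7.5, 12.0, 1.2, 20.0, 9.0, 15.5, 4.4])  # years
      event = np.array([1, 1, 0, 1, 0, 1, 0, 1])  # 1 = dissolution observed
      res = minimize(weibull_nll, [0.0, 2.0], args=(t, event),
                     method="Nelder-Mead")
      print(np.exp(res.x))   # fitted (shape, scale)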

  19. Parametric Inlet Tested in Glenn's 10- by 10-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Davis, David O.; Solano, Paul A.

    2005-01-01

    The Parametric Inlet is an innovative concept for the inlet of a gas-turbine propulsion system for supersonic aircraft. The concept approaches the performance of past inlet concepts, but with less mechanical complexity, lower weight, and greater aerodynamic stability and safety. Potential applications include supersonic cruise aircraft and missiles. The Parametric Inlet uses tailored surfaces to turn the incoming supersonic flow inward toward an axis of symmetry. The terminal shock spans the opening of the subsonic diffuser leading to the engine. The external cowl area is smaller, which reduces cowl drag. The use of only external supersonic compression avoids inlet unstart--an unsafe shock instability present in previous inlet designs that use internal supersonic compression. This eliminates the need for complex mechanical systems to control unstart, which reduces weight. The conceptual design was conceived by TechLand Research, Inc. (North Olmsted, OH), which received funding through NASA's Small-Business Innovation Research program. The Boeing Company (Seattle, WA) also participated in the conceptual design. The NASA Glenn Research Center became involved starting with the preliminary design of a model for testing in Glenn's 10- by 10-Foot Supersonic Wind Tunnel (10×10 SWT). The inlet was sized for a speed of Mach 2.35 while matching requirements of an existing cold pipe used in previous inlet tests. The parametric aspects of the model included interchangeable components for different cowl lip, throat slot, and sidewall leading-edge shapes and different vortex generator configurations. Glenn researchers used computational fluid dynamics (CFD) tools for three-dimensional, turbulent flow analysis to further refine the aerodynamic design.

  20. Parametric correlation functions to model the structure of permanent environmental (co)variances in milk yield random regression models.

    PubMed

    Bignardi, A B; El Faro, L; Cardoso, V L; Machado, P F; Albuquerque, L G

    2009-09-01

    The objective of the present study was to estimate milk yield genetic parameters applying random regression models and parametric correlation functions combined with a variance function to model animal permanent environmental effects. A total of 152,145 test-day milk yields from 7,317 first lactations of Holstein cows belonging to herds located in the southeastern region of Brazil were analyzed. Test-day milk yields were divided into 44 weekly classes of days in milk. Contemporary groups were defined by herd-test-day comprising a total of 2,539 classes. The model included direct additive genetic, permanent environmental, and residual random effects. The following fixed effects were considered: contemporary group, age of cow at calving (linear and quadratic regressions), and the population average lactation curve modeled by fourth-order orthogonal Legendre polynomial. Additive genetic effects were modeled by random regression on orthogonal Legendre polynomials of days in milk, whereas permanent environmental effects were estimated using a stationary or nonstationary parametric correlation function combined with a variance function of different orders. The structure of residual variances was modeled using a step function containing 6 variance classes. The genetic parameter estimates obtained with the model using a stationary correlation function associated with a variance function to model permanent environmental effects were similar to those obtained with models employing orthogonal Legendre polynomials for the same effect. A model using a sixth-order polynomial for additive effects and a stationary parametric correlation function associated with a seventh-order variance function to model permanent environmental effects would be sufficient for data fitting.

  1. Off-line wafer level reliability control: unique measurement method to monitor the lifetime indicator of gate oxide validated within bipolar/CMOS/DMOS technology

    NASA Astrophysics Data System (ADS)

    Gagnard, Xavier; Bonnaud, Olivier

    2000-08-01

    We recently published a paper on a new rapid method for determining the lifetime of the gate oxide used in a Bipolar/CMOS/DMOS (BCD) technology. Because this previous method was based on a current measurement with gate voltage as a parameter and required several stress voltages, it was applied only by lot sampling. We therefore sought an indicator to monitor the gate oxide lifetime during the wafer-level parametric test, involving only one measurement of the device on each wafer test cell. Using the Weibull law and the Crook model, combined with our recent model, we have developed a new test method needing only one electrical measurement of a MOS capacitor to monitor the quality of the gate oxide. Also based on a current measurement, the resulting parameter is a lifetime indicator for the gate oxide. From the analysis of several wafers, we demonstrated the possibility of detecting a low-performance wafer, corresponding to infantile failure on the Weibull plot. In order to insert this new method into the BCD parametric program, a parametric flowchart was established. This type of measurement is an important challenge, because the current measurements, breakdown charge (Qbd) and breakdown electric field (Ebd) at the parametric level, and Ebd and interface state density (Dit) during the process, cannot guarantee the gate oxide lifetime throughout the fabrication process. This indicator measurement is the only one that predicts the lifetime decrease.
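
    The Weibull-plot step is simple to reproduce generically (median-rank plotting positions; the breakdown times are invented): wafers whose points fall below the main line at the low end are the infantile-failure candidates.

      import numpy as np

      def weibull_plot_coords(times_to_breakdown):
          # x = ln(t), y = ln(-ln(1 - F)) with median-rank estimates of F;
          # Weibull-distributed data fall on a straight line in these axes.
          t = np.sort(np.asarray(times_to_breakdown, float))
          n = len(t)
          F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
          return np.log(t), np.log(-np.log(1.0 - F))

      tbd = [12., 45., 58., 71., 80., 95., 110., 0.4]   # one early failure
      x, y = weibull_plot_coords(tbd)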

  2. Detecting correlation changes in multivariate time series: A comparison of four non-parametric change point detection methods.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Grassmann, Mariel; Ceulemans, Eva

    2017-06-01

    Change point detection in multivariate time series is a complex task since, next to the mean, the correlation structure of the monitored variables may also alter when change occurs. DeCon was recently developed to detect such changes in mean and/or correlation by combining a moving window approach and robust PCA. However, in the literature, several other methods have been proposed that employ other non-parametric tools: E-divisive, Multirank, and KCP. Since these methods use different statistical approaches, two issues need to be tackled. First, applied researchers may find it hard to appraise the differences between the methods. Second, a direct comparison of the relative performance of all these methods for capturing change points signaling correlation changes is still lacking. Therefore, we present the basic principles behind DeCon, E-divisive, Multirank, and KCP and the corresponding algorithms, to make them more accessible to readers. We further compared their performance through extensive simulations using the settings of Bulteel et al. (Biological Psychology, 98 (1), 29-42, 2014) implying changes in mean and in correlation structure and those of Matteson and James (Journal of the American Statistical Association, 109 (505), 334-345, 2014) implying different numbers of (noise) variables. KCP emerged as the best method in almost all settings. However, in case of more than two noise variables, only DeCon performed adequately in detecting correlation changes.
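
    For orientation only (this is none of the four methods compared), a naive moving-window screen for correlation changes can be written as the distance between correlation matrices of adjacent windows:

      import numpy as np

      def correlation_change_score(X, window=50):
          # X: (n_time, n_vars). Frobenius distance between correlation
          # matrices of the windows just before and just after each t;
          # peaks suggest candidate correlation change points.
          n = len(X)
          scores = np.full(n, np.nan)
          for t in range(window, n - window):
              c1 = np.corrcoef(X[t - window:t].T)
              c2 = np.corrcoef(X[t:t + window].T)
              scores[t] = np.linalg.norm(c1 - c2)
          return scores

      rng = np.random.default_rng(10)
      a = rng.multivariate_normal([0, 0], [[1, .8], [.8, 1]], 150)
      b = rng.multivariate_normal([0, 0], [[1, -.8], [-.8, 1]], 150)
      s = correlation_change_score(np.vstack([a, b]))
      print(np.nanargmax(s))   # should sit near the true change at t = 150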

  3. Neural mechanisms of planning: A computational analysis using event-related fMRI

    PubMed Central

    Fincham, Jon M.; Carter, Cameron S.; van Veen, Vincent; Stenger, V. Andrew; Anderson, John R.

    2002-01-01

    To investigate the neural mechanisms of planning, we used a novel adaptation of the Tower of Hanoi (TOH) task and event-related functional MRI. Participants were trained in applying a specific strategy to an isomorph of the five-disk TOH task. After training, participants solved novel problems during event-related functional MRI. A computational cognitive model of the task was used to generate a reference time series representing the expected blood oxygen level-dependent response in brain areas involved in the manipulation and planning of goals. This time series was used as one term within a general linear modeling framework to identify brain areas in which the time course of activity varied as a function of goal-processing events. Two distinct time courses of activation were identified, one in which activation varied parametrically with goal-processing operations, and the other in which activation became pronounced only during goal-processing intensive trials. Regions showing the parametric relationship comprised a frontoparietal system and include right dorsolateral prefrontal cortex [Brodmann's area (BA 9)], bilateral parietal (BA 40/7), and bilateral premotor (BA 6) areas. Regions preferentially engaged only during goal-intensive processing include left inferior frontal gyrus (BA 44). The implications of these results for the current model, as well as for our understanding of the neural mechanisms of planning and functional specialization of the prefrontal cortex, are discussed. PMID:11880658

  4. The linear transformation model with frailties for the analysis of item response times.

    PubMed

    Wang, Chun; Chang, Hua-Hua; Douglas, Jeffrey A

    2013-02-01

    The item response times (RTs) collected from computerized testing represent an underutilized source of information about items and examinees. In addition to knowing the examinees' responses to each item, we can investigate the amount of time examinees spend on each item. In this paper, we propose a semi-parametric model for RTs, the linear transformation model with a latent speed covariate, which combines the flexibility of non-parametric modelling and the brevity as well as interpretability of parametric modelling. In this new model, the RTs, after some non-parametric monotone transformation, become a linear model with latent speed as covariate plus an error term. The distribution of the error term implicitly defines the relationship between the RT and examinees' latent speeds; whereas the non-parametric transformation is able to describe various shapes of RT distributions. The linear transformation model represents a rich family of models that includes the Cox proportional hazards model, the Box-Cox normal model, and many other models as special cases. This new model is embedded in a hierarchical framework so that both RTs and responses are modelled simultaneously. A two-stage estimation method is proposed. In the first stage, the Markov chain Monte Carlo method is employed to estimate the parametric part of the model. In the second stage, an estimating equation method with a recursive algorithm is adopted to estimate the non-parametric transformation. Applicability of the new model is demonstrated with a simulation study and a real data application. Finally, methods to evaluate the model fit are suggested. © 2012 The British Psychological Society.
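
    To make the transformation structure concrete, the short simulation below generates data from one member of the family, taking the monotone transformation to be the logarithm, which yields a lognormal RT model. All parameter values are hypothetical, and the sign convention H(T_ij) = beta_j - tau_i + eps_ij (faster examinees produce shorter times) is one plausible reading, not necessarily the paper's exact parametrization:

      import numpy as np

      rng = np.random.default_rng(2)
      n_persons, n_items = 500, 20

      tau = rng.normal(0.0, 0.3, size=n_persons)     # latent speeds
      beta = rng.normal(4.0, 0.5, size=n_items)      # item time intensities
      eps = rng.normal(0.0, 0.4, size=(n_persons, n_items))

      # Linear model on the transformed scale; H = log gives lognormal RTs,
      # one special case of the linear transformation family.
      log_T = beta[None, :] - tau[:, None] + eps
      T = np.exp(log_T)                              # observed response times (s)
      print(f"median RT on item 0: {np.median(T[:, 0]):.1f} s")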

  5. Formation of parametric images using mixed-effects models: a feasibility study.

    PubMed

    Huang, Husan-Ming; Shih, Yi-Yu; Lin, Chieh

    2016-03-01

    Mixed-effects models have been widely used in the analysis of longitudinal data. By presenting the parameters as a combination of fixed effects and random effects, mixed-effects models incorporating both within- and between-subject variations are capable of improving parameter estimation. In this work, we demonstrate the feasibility of using a non-linear mixed-effects (NLME) approach for generating parametric images from medical imaging data of a single study. By assuming that all voxels in the image are independent, we used simulation and animal data to evaluate whether NLME can improve the voxel-wise parameter estimation. For testing purposes, intravoxel incoherent motion (IVIM) diffusion parameters including the perfusion fraction, pseudo-diffusion coefficient and true diffusion coefficient were estimated using diffusion-weighted MR images and NLME through fitting the IVIM model. The conventional method of non-linear least squares (NLLS) was used as the standard approach for comparison of the resulting parametric images. In the simulated data, NLME provides more accurate and precise estimates of diffusion parameters compared with NLLS. Similarly, we found that NLME has the ability to improve the signal-to-noise ratio of parametric images obtained from rat brain data. These data have shown that it is feasible to apply NLME in parametric image generation, and that the parametric image quality can accordingly be improved with the use of NLME. With the flexibility to be adapted to other models or modalities, NLME may become a useful tool for improving parametric image quality in the future. Copyright © 2015 John Wiley & Sons, Ltd.
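
    The NLLS baseline referred to above amounts to fitting the bi-exponential IVIM signal model voxel by voxel. Below is a minimal scipy sketch on one synthetic voxel; the b-values, noise level, and parameter values are illustrative assumptions, not the study's acquisition settings:

      import numpy as np
      from scipy.optimize import curve_fit

      def ivim(b, f, D_star, D):
          """IVIM bi-exponential model for the normalized signal S(b)/S0."""
          return f * np.exp(-b * D_star) + (1.0 - f) * np.exp(-b * D)

      b_values = np.array([0, 10, 20, 40, 80, 150, 300, 500, 800], float)
      rng = np.random.default_rng(3)
      truth = (0.12, 0.015, 0.0008)                 # f, D*, D
      signal = ivim(b_values, *truth) + rng.normal(0, 0.01, b_values.size)

      popt, _ = curve_fit(ivim, b_values, signal,
                          p0=(0.1, 0.01, 0.001),
                          bounds=([0, 0, 0], [1, 1, 0.01]))
      print("f = %.3f, D* = %.4f, D = %.5f" % tuple(popt))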

  6. An at-site flood estimation method in the context of nonstationarity II. Statistical analysis of floods in Quebec

    NASA Astrophysics Data System (ADS)

    Gado, Tamer A.; Nguyen, Van-Thanh-Van

    2016-04-01

    This paper, the second of a two-part series, investigates the nonstationary behaviour of flood peaks in Quebec (Canada) by analyzing the annual maximum flow series (AMS) available for the common 1966-2001 period from a network of 32 watersheds. Temporal trends in the mean of flood peaks were examined by the nonparametric Mann-Kendall test. The significance of the detected trends over the whole province is also assessed by a bootstrap test that preserves the cross-correlation structure of the network. Furthermore, the LM-NS method (introduced in the first part) is used to model the AMS parametrically, accounting for temporal trends in the moments of the time series and investigating its applicability to real data. In this study two probability distributions (GEV & Gumbel) were selected to model four different types of time-varying moments of the historical time series considered, comprising eight competing models. The selected models are: two stationary models (GEV0 & Gumbel0), two nonstationary models in the mean as a linear function of time (GEV1 & Gumbel1), two nonstationary models in the mean as a parabolic function of time (GEV2 & Gumbel2), and two nonstationary models in the mean and the log standard deviation as linear functions of time (GEV11 & Gumbel11). The eight models were applied to flood data available for each watershed and their performance was compared to identify the best model for each location. The comparative methodology involves two phases: (1) a descriptive ability based on likelihood-based optimality criteria such as the Bayesian Information Criterion (BIC) and the deviance statistic; and (2) a predictive ability based on the residual bootstrap. According to the Mann-Kendall test and the LM-NS method, a quarter of the analyzed stations show significant trends in the AMS. All of the significant trends are negative, indicating decreasing flood magnitudes in Quebec. It was found that the LM-NS method could provide accurate flood estimates in the context of nonstationarity. The results have indicated the importance of taking into consideration the nonstationary behaviour of the flood series in order to improve the quality of flood estimation. The results also provided a general impression of the possible impacts of climate change on flood estimation in the Quebec province.
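
    The Mann-Kendall test used for the trend screening is simple enough to spell out. Here is a compact sketch (without the tie correction that production implementations include) applied to a synthetic declining annual-maximum series; the data are invented for illustration:

      import numpy as np
      from scipy.stats import norm

      def mann_kendall(x):
          """Two-sided Mann-Kendall trend test (no tie correction)."""
          x = np.asarray(x, float)
          n = x.size
          s = sum(np.sign(x[j] - x[i])
                  for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
          return z, 2.0 * (1.0 - norm.cdf(abs(z)))

      rng = np.random.default_rng(4)
      ams = 200 + rng.normal(0, 30, 36) - 1.5 * np.arange(36)  # declining peaks
      z, p = mann_kendall(ams)
      print(f"Z = {z:.2f}, p = {p:.4f}")   # negative Z indicates a decreasing trend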

  7. EMISSION TEST REPORT- FIELD TEST OF CARBON INJECTION FOR MERCURY CONTROL, CAMDEN COUNTY MUNICIPAL WASTE COMBUSTOR

    EPA Science Inventory

    The report gives results of parametric tests to evaluate the injection of powdered activated carbon to control volatile pollutants in municipal waste combustor (MWC) flue gas. The tests were conducted at a spray dryer absorber/electrostatic precipitator (SD/ESP)-equipped MWC in Camden...

  8. Chaotic map clustering algorithm for EEG analysis

    NASA Astrophysics Data System (ADS)

    Bellotti, R.; De Carlo, F.; Stramaglia, S.

    2004-03-01

    The non-parametric chaotic map clustering algorithm has been applied to the analysis of electroencephalographic signals in order to recognize Huntington's disease, one of the most dangerous pathologies of the central nervous system. The performance of the method has been compared with those obtained through parametric algorithms, such as K-means and deterministic annealing, and a supervised multi-layer perceptron. While supervised neural networks need a training phase performed by means of data tagged by the genetic test, and the parametric methods require a prior choice of the number of classes to find, chaotic map clustering gives natural evidence of the pathological class, without any training or supervision, thus providing a new efficient methodology for the recognition of patterns associated with Huntington's disease.

  9. An Empirical Study of Eight Nonparametric Tests in Hierarchical Regression.

    ERIC Educational Resources Information Center

    Harwell, Michael; Serlin, Ronald C.

    When normality does not hold, nonparametric tests represent an important data-analytic alternative to parametric tests. However, the use of nonparametric tests in educational research has been limited by the absence of easily performed tests for complex experimental designs and analyses, such as factorial designs and multiple regression analyses,…

  10. An Item Response Theory Model for Test Bias.

    ERIC Educational Resources Information Center

    Shealy, Robin; Stout, William

    This paper presents a conceptualization of test bias for standardized ability tests which is based on multidimensional, non-parametric, item response theory. An explanation of how individually-biased items can combine through a test score to produce test bias is provided. It is contended that bias, although expressed at the item level, should be…

  11. Linkage mapping of beta 2 EEG waves via non-parametric regression.

    PubMed

    Ghosh, Saurabh; Begleiter, Henri; Porjesz, Bernice; Chorlian, David B; Edenberg, Howard J; Foroud, Tatiana; Goate, Alison; Reich, Theodore

    2003-04-01

    Parametric linkage methods for analyzing quantitative trait loci are sensitive to violations in trait distributional assumptions. Non-parametric methods are relatively more robust. In this article, we modify the non-parametric regression procedure proposed by Ghosh and Majumder [2000: Am J Hum Genet 66:1046-1061] to map Beta 2 EEG waves using genome-wide data generated in the COGA project. Significant linkage findings are obtained on chromosomes 1, 4, 5, and 15 with findings at multiple regions on chromosomes 4 and 15. We analyze the data both with and without incorporating alcoholism as a covariate. We also test for epistatic interactions between regions of the genome exhibiting significant linkage with the EEG phenotypes and find evidence of epistatic interactions between a region each on chromosome 1 and chromosome 4 with one region on chromosome 15. While regressing out the effect of alcoholism does not affect the linkage findings, the epistatic interactions become statistically insignificant. Copyright 2003 Wiley-Liss, Inc.

  12. [Diversity and frequency of scientific research design and statistical methods in the "Arquivos Brasileiros de Oftalmologia": a systematic review of the "Arquivos Brasileiros de Oftalmologia"--1993-2002].

    PubMed

    Crosta, Fernando; Nishiwaki-Dantas, Maria Cristina; Silvino, Wilmar; Dantas, Paulo Elias Correa

    2005-01-01

    To verify the frequency of study designs, applied statistical analyses and approval by institutional review offices (Ethics Committees) of articles published in the "Arquivos Brasileiros de Oftalmologia" during a 10-year interval, with a later comparative and critical analysis against some of the main international journals in the field of ophthalmology. A systematic review without meta-analysis was performed. Scientific papers published in the "Arquivos Brasileiros de Oftalmologia" between January 1993 and December 2002 were reviewed by two independent reviewers and classified according to the applied study design, statistical analysis and approval by institutional review offices. To categorize those variables, a descriptive statistical analysis was used. After applying inclusion and exclusion criteria, 584 articles were reviewed for evaluation of statistical analysis and 725 articles for evaluation of study design. Contingency tables (23.10%) were the most frequently applied statistical method, followed by non-parametric tests (18.19%), Student's t test (12.65%), central tendency measures (10.60%) and analysis of variance (9.81%). Of the 584 reviewed articles, 291 (49.82%) presented no statistical analysis. Observational case series (26.48%) was the most frequently used type of study design, followed by interventional case series (18.48%), observational case description (13.37%), non-random clinical study (8.96%) and experimental study (8.55%). We found a higher frequency of observational clinical studies and a lack of statistical analysis in almost half of the published papers. An increase in studies with Ethics Committee approval was noted after it became mandatory in 1996.

  13. Time-related patterns of ventricular shunt failure.

    PubMed

    Kast, J; Duong, D; Nowzari, F; Chadduck, W M; Schiff, S J

    1994-11-01

    Proximal obstruction is reported to be the most common cause of ventriculoperitoneal (VP) shunt failure, suggesting that imperfect ventricular catheter placement and inadequate valve mechanisms are major causes. This study retrospectively examined patterns of shunt failure in 128 consecutive patients with symptoms of shunt malfunction over a 2-year period. Factors analyzed included site of failure, time from shunt placement or last revision to failure, age of patient at time of failure, infections, and primary etiology of the hydrocephalus. One hundred of these patients required revisions; 14 revisions were due to infections. In this series there was a higher incidence of distal (43%) than of proximal (35%) failure. The difference was not statistically significant when the overall series was considered; however, when factoring time to failure as a variable, marked differences were noted regardless of the underlying cause of hydrocephalus or the age of the patient. Of the 49 patients needing a shunt revision or replacement within 2 years of the previous operation, 50% had proximal malfunction, 14% distal, and 10% had malfunctions attributable directly to the valve itself. Also, 12 of the 14 infections occurred during this time interval. In sharp contrast, of the 51 patients having shunt failure from 2 to more than 12 years after the previous procedure, 72% had distal malfunction, 21% proximal, and only 6% had a faulty valve or infection. This difference between time to failure for proximal versus distal failures was statistically significant (P < 0.00001 for both Student's t-test and non-parametric Mann-Whitney U-test).(ABSTRACT TRUNCATED AT 250 WORDS)

  14. Performance, static stability, and control effectiveness of a parametric space shuttle launch vehicle

    NASA Technical Reports Server (NTRS)

    Buchholz, R. E.; Gamble, M.

    1972-01-01

    This test was run as a continuation of a prior investigation of aerodynamic performance and static stability tests for a parametric space shuttle launch vehicle. The purposes of this test were: (1) to obtain a more complete set of data in the transonic flight region, (2) to investigate new H-0 tank nose shapes and tank diameters, (3) to obtain control effectiveness data for the orbiter at 0 degree incidence and with a smaller diameter H-0 tank, and (4) to determine the effects of varying the solid rocket motor-to-H-0 tank gap size. Experimental data were obtained for angles of attack from -10 to +10 degrees and for angles of sideslip from +10 to -10 degrees at Mach numbers ranging from 0.6 to 4.96.

  15. Fatigue analysis of multiple site damage at a row of holes in a wide panel

    NASA Technical Reports Server (NTRS)

    Buhler, Kimberley; Grandt, Alten F., Jr.; Moukawsher, E. J.

    1994-01-01

    This paper is concerned with predicting the fatigue life of unstiffened panels which contain multiple site damage (MSD). The initial damage consists of through-the-thickness cracks emanating from a row of holes in the center of a finite width panel. A fracture mechanics analysis has been developed to predict the growth, interaction, and coalescence of the various cracks which propagate in the panel. A strain-life analysis incorporating Neuber's rule for notches, and Miner's rule for cumulative damage, is also employed to predict crack initiation for holes without initial cracking. This analysis is compared with the results of a series of fatigue tests on 2024-T3 aluminum panels, and is shown to do an excellent job of predicting the influence of MSD on the fatigue life of nine inch wide specimens. Having established confidence in the ability to analyze the influence of MSD on fatigue life, a parametric study is conducted to examine the influence of various MSD scenarios in an unstiffened panel. The numerical study considered 135 cases in all, with the parametric variables being the applied cyclic stress level, the lead crack geometry, and the number and location of MSD cracks. The numerical analysis provides details for the manner in which lead cracks and MSD cracks grow and coalesce leading to final failure. The results indicate that MSD located adjacent to lead cracks is the most damaging configuration, while for cases without lead cracks, MSD clusters which are not separated by uncracked holes are most damaging.

  16. Computational parametric study of a Richtmyer-Meshkov instability for an inclined interface.

    PubMed

    McFarland, Jacob A; Greenough, Jeffrey A; Ranjan, Devesh

    2011-08-01

    A computational study of the Richtmyer-Meshkov instability for an inclined interface is presented. The study covers experiments to be performed in the Texas A&M University inclined shock tube facility. Incident shock wave Mach numbers from 1.2 to 2.5, inclination angles from 30° to 60°, and gas pair Atwood numbers of ∼0.67 and ∼0.95 are used in this parametric study containing 15 unique combinations of these parameters. Qualitative results are examined through a time series of density plots for multiple combinations of these parameters, and the qualitative effects of each of the parameters are discussed. Pressure, density, and vorticity fields are presented in animations available online to supplement the discussion of the qualitative results. These density plots show the evolution of two main regions in the flow field: a mixing region containing driver and test gas that is dominated by large vortical structures, and a more homogeneous region of unmixed fluid which can separate away from the mixing region in some cases. The interface mixing width is determined for various combinations of the parameters listed at the beginning of the Abstract. A scaling method for the mixing width is proposed using the interface geometry and wave velocities calculated using one-dimensional gas dynamic equations. This model uses the transmitted wave velocity for the characteristic velocity and an initial offset time based on the travel time of strong reflected waves. It is compared to an adapted Richtmyer impulsive model scaling and shown to scale the initial mixing width growth rate more effectively for fixed Atwood number.

  17. Pig brain stereotaxic standard space: mapping of cerebral blood flow normative values and effect of MPTP-lesioning.

    PubMed

    Andersen, Flemming; Watanabe, Hideaki; Bjarkam, Carsten; Danielsen, Erik H; Cumming, Paul

    2005-07-15

    The analysis of physiological processes in the brain by positron emission tomography (PET) is facilitated when images are spatially normalized to a standard coordinate system. Thus, PET activation studies of the human brain frequently employ the common stereotaxic coordinates of Talairach. We have developed an analogous stereotaxic coordinate system for the brain of the Gottingen miniature pig, based on automatic co-registration of magnetic resonance (MR) images obtained in 22 male pigs. The origin of the pig brain stereotaxic space (0, 0, 0) was arbitrarily placed in the centroid of the pineal gland as identified on the average MRI template. The orthogonal planes were imposed using the line between stereotaxic zero and the optic chiasm. A series of mean MR images in the coronal, sagittal and horizontal planes was generated. To test the utility of the common coordinate system for functional imaging studies of minipig brain, we calculated cerebral blood flow (CBF) maps from normal minipigs and from minipigs with a syndrome of parkinsonism induced by 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) poisoning. These maps were transformed from the native space into the common stereotaxic space. After global normalization of these maps, an undirected search for differences between the groups was then performed using statistical parametric mapping. Using this method, we detected a statistically significant focal increase in CBF in the left cerebellum of the MPTP-lesioned group. We expect the present approach to be of general use in the statistical parametric mapping of CBF and other physiological parameters in the living pig brain.

  18. Matrix approach to land carbon cycle modeling: A case study with the Community Land Model.

    PubMed

    Huang, Yuanyuan; Lu, Xingjie; Shi, Zheng; Lawrence, David; Koven, Charles D; Xia, Jianyang; Du, Zhenggang; Kluzek, Erik; Luo, Yiqi

    2018-03-01

    The terrestrial carbon (C) cycle has been commonly represented by a series of C balance equations to track C influxes into and effluxes out of individual pools in earth system models (ESMs). This representation matches our understanding of C cycle processes well but makes it difficult to track model behaviors. It is also computationally expensive, limiting the ability to conduct comprehensive parametric sensitivity analyses. To overcome these challenges, we have developed a matrix approach, which reorganizes the C balance equations in the original ESM into one matrix equation without changing any modeled C cycle processes and mechanisms. We applied the matrix approach to the Community Land Model (CLM4.5) with vertically-resolved biogeochemistry. The matrix equation exactly reproduces litter and soil organic carbon (SOC) dynamics of the standard CLM4.5 across different spatial-temporal scales. The matrix approach enables effective diagnosis of system properties such as C residence time and attribution of global change impacts to relevant processes. We illustrated, for example, that the impacts of CO2 fertilization on litter and SOC dynamics can be easily decomposed into the relative contributions from C input, allocation of external C into different C pools, nitrogen regulation, altered soil environmental conditions, and vertical mixing along the soil profile. In addition, the matrix tool can accelerate model spin-up, permit thorough parametric sensitivity tests, enable pool-based data assimilation, and facilitate tracking and benchmarking of model behaviors. Overall, the matrix approach can make a broad range of future modeling activities more efficient and effective. © 2017 John Wiley & Sons Ltd.
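
    The essence of the matrix reorganization can be shown with a toy example. The sketch below writes a three-pool carbon balance as dX/dt = B*u + A*K*X and obtains the steady state by a single linear solve, which is why the matrix form can replace a long spin-up integration and exposes residence times directly; the pool structure, rates, and transfer fractions are invented for illustration and are not CLM4.5 values:

      import numpy as np

      K = np.diag([0.5, 0.05, 0.002])           # turnover rates (1/yr)
      A = np.array([[-1.00,  0.00,  0.00],
                    [ 0.45, -1.00,  0.00],
                    [ 0.05,  0.30, -1.00]])     # transfer fractions between pools
      B = np.array([1.0, 0.0, 0.0])             # all external input enters litter
      u = 2.0                                   # C input (g C m^-2 yr^-1)

      # Steady state from 0 = B*u + A*K*X  =>  X = -(A K)^-1 B u
      X_ss = -np.linalg.solve(A @ K, B * u)
      print("steady-state pool sizes (g C m^-2):", X_ss)
      print("mean C residence time (yr):", X_ss.sum() / u)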

  19. Nonlinear Tides in Close Binary Systems

    NASA Astrophysics Data System (ADS)

    Weinberg, Nevin N.; Arras, Phil; Quataert, Eliot; Burkart, Josh

    2012-06-01

    We study the excitation and damping of tides in close binary systems, accounting for the leading-order nonlinear corrections to linear tidal theory. These nonlinear corrections include two distinct physical effects: three-mode nonlinear interactions, i.e., the redistribution of energy among stellar modes of oscillation, and nonlinear excitation of stellar normal modes by the time-varying gravitational potential of the companion. This paper, the first in a series, presents the formalism for studying nonlinear tides and studies the nonlinear stability of the linear tidal flow. Although the formalism we present is applicable to binaries containing stars, planets, and/or compact objects, we focus on non-rotating solar-type stars with stellar or planetary companions. Our primary results include the following: (1) The linear tidal solution almost universally used in studies of binary evolution is unstable over much of the parameter space in which it is employed. More specifically, resonantly excited internal gravity waves in solar-type stars are nonlinearly unstable to parametric resonance for companion masses M' ≳ 10-100 M⊕ at orbital periods P ≈ 1-10 days. The nearly static "equilibrium" tidal distortion is, however, stable to parametric resonance except for solar binaries with P ≲ 2-5 days. (2) For companion masses larger than a few Jupiter masses, the dynamical tide causes short length scale waves to grow so rapidly that they must be treated as traveling waves, rather than standing waves. (3) We show that the global three-wave treatment of parametric instability typically used in the astrophysics literature does not yield the fastest-growing daughter modes or instability threshold in many cases. We find a form of parametric instability in which a single parent wave excites a very large number of daughter waves (N ≈ 10^3 [P/10 days] for a solar-type star) and drives them as a single coherent unit with growth rates that are a factor of ≈N faster than the standard three-wave parametric instability. These are local instabilities viewed through the lens of global analysis; the coherent global growth rate follows local rates in the regions where the shear is strongest. In solar-type stars, the dynamical tide is unstable to this collective version of the parametric instability for even sub-Jupiter companion masses with P ≲ a month. (4) Independent of the parametric instability, the dynamical and equilibrium tides excite a wide range of stellar p-modes and g-modes by nonlinear inhomogeneous forcing; this coupling appears particularly efficient at draining energy out of the dynamical tide and may be more important than either wave breaking or parametric resonance at determining the nonlinear dissipation of the dynamical tide.

  20. Membrane triangles with corner drilling freedoms. III - Implementation and performance evaluation

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.; Alexander, Scott

    1992-01-01

    This paper completes a three-part series on the formulation of 3-node, 9-dof membrane triangles with corner drilling freedoms based on parametrized variational principles. The first four sections cover element implementation details including determination of optimal parameters and treatment of distributed loads. Then three elements of this type, labeled ALL, FF and EFF-ANDES, are tested on standard plane stress problems. ALL represents numerically integrated versions of Allman's 1988 triangle; FF is based on the free formulation triangle presented by Bergan and Felippa in 1985; and EFF-ANDES represent two different formulations of the optimal triangle derived in Parts I and II. The numerical studies indicate that the ALL, FF and EFF-ANDES elements are comparable in accuracy for elements of unitary aspect ratios. The ALL elements are found to stiffen rapidly in inplane bending for high aspect ratios, whereas the FF and EFF elements maintain accuracy. The EFF and ANDES implementations have a moderate edge in formation speed over the FF.

  1. The link between social cognition and self-referential thought in the medial prefrontal cortex.

    PubMed

    Mitchell, Jason P; Banaji, Mahzarin R; Macrae, C Neil

    2005-08-01

    The medial prefrontal cortex (mPFC) has been implicated in seemingly disparate cognitive functions, such as understanding the minds of other people and processing information about the self. This functional overlap would be expected if humans use their own experiences to infer the mental states of others, a basic postulate of simulation theory. Neural activity was measured while participants attended to either the mental or physical aspects of a series of other people. To permit a test of simulation theory's prediction that inferences based on self-reflection should only be made for similar others, targets were subsequently rated for their degree of similarity to self. Parametric analyses revealed a region of the ventral mPFC--previously implicated in self-referencing tasks--in which activity correlated with perceived self/other similarity, but only for mentalizing trials. These results suggest that self-reflection may be used to infer the mental states of others when they are sufficiently similar to self.

  2. Design and performance of an analysis-by-synthesis class of predictive speech coders

    NASA Technical Reports Server (NTRS)

    Rose, Richard C.; Barnwell, Thomas P., III

    1990-01-01

    The performance of a broad class of analysis-by-synthesis linear predictive speech coders is quantified experimentally. The class of coders includes a number of well-known techniques as well as a very large number of speech coders which have not been named or studied. A general formulation for deriving the parametric representation used in all of the coders in the class is presented. A new coder, named the self-excited vocoder, is discussed because of its good performance with low complexity, and because of the insight this coder gives to analysis-by-synthesis coders in general. The results of a study comparing the performances of different members of this class are presented. The study takes the form of a series of formal subjective and objective speech quality tests performed on selected coders. The results of this study lead to some interesting and important observations concerning the controlling parameters for analysis-by-synthesis speech coders.

  3. Out-of-core Evaluations of Uranium Nitride-fueled Converters

    NASA Technical Reports Server (NTRS)

    Shimada, K.

    1972-01-01

    Two uranium nitride fueled converters were tested parametrically for their initial characterization and are currently being life-tested out of core. The test methods employed for the parametric and diagnostic measurements during the life tests, and the test results, are presented. One converter with a rhenium emitter had an initial output power density of 6.9 W/sq cm at a black-body emitter temperature of 1900 K. The power density remained unchanged for the first 1000 hr of life test but degraded nearly 50 percent during the following 1000 hr. Electrode work function measurements indicated that the uranium fuel was diffusing out of the 0.635 mm emitter clad. The other converter with a tungsten emitter had an initial output power density of 2.2 W/sq cm at 1900 K and a power density of 3.9 W/sq cm at 4300 hr. The power density suddenly degraded within 20 hr to practically zero output at 4735 hr.

  4. Low-cost, disposable microfluidics device for blood plasma extraction using continuously alternating paramagnetic and diamagnetic capture modes

    PubMed Central

    Kim, Pilkee; Ong, Eng Hui; Yoon, Yong-Jin; Ng, Sum Huan Gary; Puttachat, Khuntontong

    2016-01-01

    Blood plasma contains biomarkers and substances that indicate the physiological state of an organism, and it can be used to diagnose various diseases or body conditions. To improve the accuracy of diagnostic tests, it is necessary to obtain blood plasma of high purity. This paper presents a low-cost, disposable microfluidics device for blood plasma extraction using the magnetophoretic behaviors of blood cells. This device uses alternating magnetophoretic capture modes to trap and separate paramagnetic and diamagnetic cells away from blood plasma. The device system is composed of two parts, a disposable microfluidics chip and a non-disposable (reusable) magnetic field source. Such a modularized device dramatically simplifies the structure of the disposable part, which is beneficial for low-cost mass production. A series of numerical simulations and a parametric study have been performed to describe the mechanism of blood cell separation in the microchannel, and the results are discussed. Furthermore, an experimental feasibility test has been carried out in order to demonstrate the blood plasma extraction process of the proposed device. In this experiment, pure blood plasma was successfully extracted with a yield of 21.933% from 75 μl of a 1:10 dilution of deoxygenated blood. PMID:27042252

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johannsen, Tim; Psaltis, Dimitrios, E-mail: timj@physics.arizona.ed, E-mail: dpsaltis@email.arizona.ed

    According to the no-hair theorem, an astrophysical black hole is uniquely described by only two quantities, the mass and the spin. In this series of papers, we investigate a framework for testing the no-hair theorem with observations of black holes in the electromagnetic spectrum. We formulate our approach in terms of a parametric spacetime which contains a quadrupole moment that is independent of both mass and spin. If the no-hair theorem is correct, then any deviation of the black hole quadrupole moment from its Kerr value has to be zero. We analyze in detail the properties of this quasi-Kerr spacetime that are critical to interpreting observations of black holes and demonstrate their dependence on the spin and quadrupole moment. In particular, we show that the location of the innermost stable circular orbit and the gravitational lensing experienced by photons are affected significantly at even modest deviations of the quadrupole moment from the value predicted by the no-hair theorem. We argue that observations of black hole images, of relativistically broadened iron lines, as well as of thermal X-ray spectra from accreting black holes will lead in the near future to an experimental test of the no-hair theorem.

  6. Trend analysis of annual precipitation of Mauritius for the period 1981-2010

    NASA Astrophysics Data System (ADS)

    Raja, Nussaïbah B.; Aydin, Olgu

    2018-04-01

    This study researched the precipitation variability across 53 meteorological stations in Mauritius and different subregions of the island over a 30-year study period (1981-2010). Time series were investigated for each 5-year interval and also for the whole study period. The non-parametric Mann-Kendall and Spearman's rho statistical tests were used to detect trends in annual precipitation. A mix of positive (increasing) and negative (decreasing) trends was highlighted by the 5-year interval analysis. The statistical tests nevertheless agreed on the overall trend for Mauritius and the subregions. Most regions showed a decrease in precipitation during the period 1996-2000. This is attributed to the 1998-2000 drought period, which was brought about by a moderate La Niña event. In general, an increase in precipitation levels was observed across the country during the study period. This increase is the result of an increase in extreme precipitation events in the region. On the other hand, two subregions, both located in the highlands, experienced a decline in precipitation levels. Since most of the reservoirs in Mauritius are located in these two subregions, this implies serious consequences for water availability in the country if existing storage capacities are kept.

  7. Exploring Hot Exoplanet Atmospheres with JWST/NIRSpec and a Hybrid Version of NEMESIS

    NASA Astrophysics Data System (ADS)

    Badhan, Mahmuda A.; Mandell, Avi; Batalha, Natasha; Irwin, Patrick GJ; Barstow, Joanna; Garland, Ryan; Deming, Drake; Hesman, Brigette E.; Nixon, Conor A.

    2016-01-01

    Understanding the formation environments and evolution scenarios of hot Jupiters demands robust measures for constraining their atmospheric physical properties and transit observations at unprecedented resolutions. Here we have utilized a combination of two different approaches, Optimal Estimation (OE) and Markov Chain Monte Carlo (MCMC), as part of the extensively validated NEMESIS atmospheric retrieval code, to infer pressure-temperature (P-T) profiles and gas mixing ratios (VMR) of H2O, CO2, CH4 and CO from a series of tests conducted on JWST/NIRSpec simulations of the dayside thermal emission spectra (secondary eclipse) of H2-dominated hot-Jupiter candidates. To keep the number of parameters low and thereby retrieve more plausible profile shapes, we have used a parametrized form of the temperature profile based upon the analytic radiative-equilibrium derivation in Guillot et al. 2010. For the purpose of testing and validation, we also show some preliminary work on published datasets from the Hubble Space Telescope (HST) and Spitzer missions. Finally, high-temperature (T > 1000 K) spectroscopic line lists are slowly but continually being improved by the atmospheric retrieval community. Since this carries the potential of impacting hot-Jupiter atmospheric models quite significantly, we compare results from different databases.

  8. Test data from small solid propellant rocket motor plume measurements (FA-21)

    NASA Technical Reports Server (NTRS)

    Hair, L. M.; Somers, R. E.

    1976-01-01

    A program is described for obtaining a reliable, parametric set of measurements in the exhaust plumes of solid propellant rocket motors. Plume measurements included pressures, temperatures, forces, heat transfer rates, particle sampling, and high-speed movies. Approximately 210,000 digital data points and 15,000 movie frames were acquired. Measurements were made at points in the plumes via rake-mounted probes, and on the surface of a large plate impinged by the exhaust plume. Parametric variations were made in pressure altitude, propellant aluminum loading, impinged plate incidence angle and distance from nozzle exit to plate or rake. Reliability was incorporated by continual use of repeat runs. The test setup of the various hardware items is described along with an account of test procedures. Test results and data accuracy are discussed. Format of the data presentation is detailed. Complete data are included in the appendix.

  9. The influence of intraocular pressure and air jet pressure on corneal contactless tonometry tests.

    PubMed

    Simonini, Irene; Pandolfi, Anna

    2016-05-01

    The air puff is a dynamic contactless tonometer test used in ophthalmology clinical practice to assess the biomechanical properties of the human cornea and the intraocular pressure due to the filling fluids of the eye. The test is controversial, since the dynamic response of the cornea is governed by the interaction of several factors which cannot be discerned within a single measurement. In this study we describe a numerical model of the air puff test, and perform a parametric analysis on the major action parameters (jet pressure and intraocular pressure) to assess their relevance to the mechanical response of a patient-specific cornea. The particular cornea considered here has been treated with laser reprofiling to correct myopia, and the parametric study has been conducted on both the preoperative and postoperative geometries. The material properties of the cornea have been obtained by means of an identification procedure that compares the static biomechanical response of preoperative and postoperative corneas under the physiological IOP. The parametric study on the intraocular pressure suggests that the displacement of the cornea's apex can be a reliable indicator for tonometry, and the one on the air jet pressure predicts the outcomes of two or more distinct measurements on the same cornea, which can be used in inverse procedures to estimate the material properties of the tissue. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Serum and Plasma Metabolomic Biomarkers for Lung Cancer.

    PubMed

    Kumar, Nishith; Shahjaman, Md; Mollah, Md Nurul Haque; Islam, S M Shahinul; Hoque, Md Aminul

    2017-01-01

    In drug discovery and early disease prediction for lung cancer, metabolomic biomarker detection is very important. The mortality rate can be decreased if cancer is predicted at an earlier stage. Current diagnostic techniques for lung cancer are not prognostic. However, if we know which metabolites' intensity levels change considerably between cancer subjects and control subjects, then it will be easier to diagnose the disease early as well as to discover drugs. Therefore, in this paper we have identified the influential plasma and serum blood sample metabolites for lung cancer and also identified the biomarkers that will be helpful for early disease prediction as well as for drug discovery. To identify the influential metabolites, we considered a parametric and a non-parametric test, namely Student's t-test as the parametric test and the Kruskal-Wallis test as the non-parametric test. We also categorized the up-regulated and down-regulated metabolites by the heatmap plot and identified the biomarkers by a support vector machine (SVM) classifier and pathway analysis. From our analysis, we obtained 27 influential (p-value < 0.05) metabolites from the plasma samples and 13 influential (p-value < 0.05) metabolites from the serum samples. According to the importance plot from the SVM classifier, pathway analysis and correlation network analysis, we declared 4 metabolites (taurine, aspartic acid, glutamine and pyruvic acid) as plasma biomarkers and 3 metabolites (aspartic acid, taurine and inosine) as serum biomarkers.
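
    The two screening tests named above are one-liners in scipy. A hedged sketch on synthetic intensity data (the two tests and the 0.05 cutoff mirror the abstract; the distributions and sample sizes are made up):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      control = rng.lognormal(mean=1.0, sigma=0.4, size=40)  # metabolite intensity
      cancer = rng.lognormal(mean=1.3, sigma=0.4, size=40)

      _, p_t = stats.ttest_ind(cancer, control)    # parametric screen
      _, p_kw = stats.kruskal(cancer, control)     # non-parametric screen
      print(f"t-test p = {p_t:.4g}, Kruskal-Wallis p = {p_kw:.4g}")
      # The metabolite would be flagged as influential when p < 0.05.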

  11. Examining deterrence of adult sex crimes: A semi-parametric intervention time series approach.

    PubMed

    Park, Jin-Hong; Bandyopadhyay, Dipankar; Letourneau, Elizabeth

    2014-01-01

    Motivated by recent developments in dimension reduction (DR) techniques for time series data, the general deterrent effect of South Carolina (SC)'s sex offender registration and notification (SORN) policy for preventing sex crimes was examined. Using adult sex crime arrestee data from 1990 to 2005, the idea of the Central Mean Subspace (CMS) is extended to intervention time series analysis (CMS-ITS) to model the sequential intervention effects of 1995 (the year SC's SORN policy was initially implemented) and 1999 (the year the policy was revised to include online notification) on the time series spectrum. The CMS-ITS model estimation was achieved via kernel smoothing techniques and compared to interrupted autoregressive integrated moving average (ARIMA) models. Simulation studies and application to the real data underscore our model's ability to achieve parsimony and to detect intervention effects not determined earlier via traditional ARIMA models. From a public health perspective, findings from this study draw attention to the potential general deterrent effects of SC's SORN policy. These findings are considered in light of the overall body of research on sex crime arrestee registration and notification policies, which remain controversial.

  12. The 32nd CDC: System identification using interval dynamic models

    NASA Technical Reports Server (NTRS)

    Keel, L. H.; Lew, J. S.; Bhattacharyya, S. P.

    1992-01-01

    Motivated by the recent explosive development of results in the area of parametric robust control, a new technique to identify a family of uncertain systems is presented. The new technique takes the frequency domain input and output data obtained from experimental test signals and produces an 'interval transfer function' that contains the complete frequency domain behavior with respect to the test signals. This interval transfer function is one of the key concepts in the parametric robust control approach, and identification with such an interval model allows one to predict the worst-case performance and stability margins using recent results on interval systems. The algorithm is illustrated by applying it to an 18-bay Mini-Mast truss structure.

  13. Space-Plane Spreadsheet Program

    NASA Technical Reports Server (NTRS)

    Mackall, Dale

    1993-01-01

    The Basic Hypersonic Data and Equations (HYPERDATA) spreadsheet computer program provides data gained from three analyses of the performance of a space plane. The equations used to perform the analyses are derived from Newton's second law; the derivation is included. The first analysis is a parametric study of some basic factors affecting the ability of a space plane to reach orbit. The second includes calculation of the thickness of a spherical fuel tank. The third produces the ratio between fuel volume and total mass for each of various aircraft. HYPERDATA is intended for use on Macintosh(R) series computers running Microsoft Excel 3.0.

  14. (abstract) A Comparison Between Measurements of the F-layer Critical Frequency and Values Derived from the PRISM Adjustment Algorithm Applied to Total Electron Content Data in the Equatorial Region

    NASA Technical Reports Server (NTRS)

    Mannucci, A. J.; Anderson, D. N.; Abdu, A. M.

    1994-01-01

    The Parametrized Real-Time Ionosphere Specification Model (PRISM) is a global ionospheric specification model that can incorporate real-time data to compute accurate electron density profiles. Time series of computed and measured data are compared in this paper. This comparison can be used to suggest methods of optimizing the PRISM adjustment algorithm for TEC data obtained at low latitudes.

  15. An Improved Model of Cryogenic Propellant Stratification in a Rotating, Reduced Gravity Environment

    NASA Technical Reports Server (NTRS)

    Oliveira, Justin; Kirk, Daniel R.; Schallhorn, Paul A.; Piquero, Jorge L.; Campbell, Mike; Chase, Sukhdeep

    2007-01-01

    This paper builds on a series of analytical models from the literature used to predict thermal stratification within rocket propellant tanks. The primary contribution to the literature is to add the effect of tank rotation and to demonstrate the influence of rotation on stratification times and temperatures. This work also examines levels of thermal stratification for generic (cylindrical) propellant tanks over a parametric range of upper-stage coast times, heating levels, rotation rates, and gravity levels.

  16. Forced response of mistuned bladed disk assemblies

    NASA Technical Reports Server (NTRS)

    Watson, Brian C.; Kamat, Manohar P.; Murthy, Durbha V.

    1993-01-01

    A complete analytic model of mistuned bladed disk assemblies, designed to simulate the dynamical behavior of these systems, is analyzed. The model incorporates a generalized method for describing the mistuning of the assembly through the introduction of specific mistuning modes. The model is used to develop a computational bladed disk assembly model for a series of parametric studies. Results are presented demonstrating that the response amplitudes of bladed disk assemblies depend both on the excitation mode and on the mistune mode.

  17. MHD Turbulence Sheared in Fixed and Rotating Frames

    NASA Technical Reports Server (NTRS)

    Kassinos, S. C.; Knaepen, B.; Wray, A.

    2004-01-01

    We consider homogeneous turbulence in a conducting fluid that is exposed to a uniform external magnetic field while being sheared in fixed and rotating frames. We take both the frame-rotation axis and the applied magnetic field to be aligned in the direction normal to the plane of the mean shear. Here a systematic parametric study is carried out in a series of Direct Numerical Simulations (DNS) in order to clarify the main effects determining the structural anisotropy and stability of the flow.

  18. Laser And Nonlinear Optical Materials For Laser Remote Sensing

    NASA Technical Reports Server (NTRS)

    Barnes, Norman P.

    2005-01-01

    NASA remote sensing missions involving laser systems and their economic impact are outlined. Potential remote sensing missions include: greenhouse gases, tropospheric winds, ozone, water vapor, and ice cap thickness. Systems to perform these measurements use lanthanide-series lasers and nonlinear devices, including second harmonic generators and parametric oscillators. The demands these missions place on the laser and nonlinear optical materials are discussed from a materials point of view. Methods of designing new laser and nonlinear optical materials to meet these demands are presented.

  19. Segmentation algorithm for non-stationary compound Poisson processes. With an application to inventory time series of market members in a financial market

    NASA Astrophysics Data System (ADS)

    Tóth, B.; Lillo, F.; Farmer, J. D.

    2010-11-01

    We introduce an algorithm for the segmentation of a class of regime switching processes. The segmentation algorithm is a non-parametric statistical method able to identify the regimes (patches) of a time series. The process is composed of consecutive patches of variable length. In each patch the process is described by a stationary compound Poisson process, i.e., a Poisson process where each count is associated with a fluctuating signal. The parameters of the process are different in each patch and therefore the time series is non-stationary. Our method is a generalization of the algorithm introduced by Bernaola-Galván et al. [Phys. Rev. Lett. 87, 168105 (2001)]. We show that the new algorithm outperforms the original one for regime switching models of compound Poisson processes. As an application we use the algorithm to segment the time series of the inventory of market members of the London Stock Exchange and we observe that our method finds almost three times more patches than the original one.
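
    As a rough illustration of this style of segmentation, the sketch below recursively splits a series at the point maximizing a two-sample t-statistic, a bare-bones variant of the Bernaola-Galván recursion rather than the authors' generalized algorithm; the threshold t_min is an uncalibrated stand-in for their significance criterion:

      import numpy as np
      from scipy import stats

      def segment(x, t_min=4.0, min_len=20, offset=0, cuts=None):
          """Recursively cut x where the Welch t-statistic between the left
          and right parts is maximal and exceeds t_min."""
          if cuts is None:
              cuts = []
          n = len(x)
          if n < 2 * min_len:
              return cuts
          t_vals = np.array([abs(stats.ttest_ind(x[:i], x[i:], equal_var=False)[0])
                             for i in range(min_len, n - min_len)])
          i_best = int(np.argmax(t_vals)) + min_len
          if t_vals[i_best - min_len] > t_min:
              cuts.append(offset + i_best)
              segment(x[:i_best], t_min, min_len, offset, cuts)
              segment(x[i_best:], t_min, min_len, offset + i_best, cuts)
          return sorted(cuts)

      rng = np.random.default_rng(6)
      x = np.concatenate([rng.poisson(3, 150), rng.poisson(8, 100),
                          rng.poisson(2, 120)]).astype(float)
      print("detected patch boundaries:", segment(x))   # near 150 and 250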

  20. A Wind-Tunnel Parametric Investigation of Tiltrotor Whirl-Flutter Stability Boundaries

    NASA Technical Reports Server (NTRS)

    Piatak, David J.; Kvaternik, Raymond G.; Nixon, Mark W.; Langston, Chester W.; Singleton, Jeffrey D.; Bennett, Richard L.; Brown, Ross K.

    2001-01-01

    A wind-tunnel investigation of tiltrotor whirl-flutter stability boundaries has been conducted on a 1/5-size semispan tiltrotor model known as the Wing and Rotor Aeroelastic Test System (WRATS) in the NASA-Langley Transonic Dynamics Tunnel as part of a joint NASA/Army/Bell Helicopter Textron, Inc. (BHTI) research program. The model was first developed by BHTI as part of the JVX (V-22) research and development program in the 1980s and was recently modified to incorporate a hydraulically-actuated swashplate control system for use in active controls research. The modifications have changed the model's pylon mass properties sufficiently to warrant testing to re-establish its baseline stability boundaries. A parametric investigation of the effect of rotor design variables on stability was also conducted. The model was tested in both the on-downstop and off-downstop configurations, at cruise-flight and hover rotor rotational speeds, and in both air and heavy gas (R-134a) test mediums. Heavy gas testing was conducted to quantify Mach number compressibility effects on tiltrotor stability. Experimental baseline stability boundaries in air are presented with comparisons to results from parametric variations of rotor pitch-flap coupling and control system stiffness. Increasing the rotor pitch-flap coupling (δ3 more negative) was found to have a destabilizing effect on stability, while a reduction in control system stiffness was found to have little effect on whirl-flutter stability. Results indicate that testing in R-134a, and thus matching full-scale tip Mach number, has a destabilizing effect, which demonstrates that whirl-flutter stability boundaries in air are unconservative.

  1. An appraisal of statistical procedures used in derivation of reference intervals.

    PubMed

    Ichihara, Kiyoshi; Boyd, James C

    2010-11-01

    When conducting studies to derive reference intervals (RIs), various statistical procedures are commonly applied at each step, from the planning stages to the final computation of RIs. Determination of the necessary sample size is an important consideration, and evaluation of at least 400 individuals in each subgroup has been recommended to establish reliable common RIs in multicenter studies. Multiple regression analysis allows identification of the most important factors contributing to variation in test results, while accounting for possible confounding relationships among these factors. Of the various approaches proposed for judging the necessity of partitioning reference values, nested analysis of variance (ANOVA) is the likely method of choice owing to its ability to handle multiple groups and to adjust for multiple factors. The Box-Cox power transformation has often been used to transform data to a Gaussian distribution for parametric computation of RIs. However, this transformation occasionally fails. Therefore, the non-parametric method, based on determination of the 2.5th and 97.5th percentiles after sorting the data, has been recommended for general use. The performance of the Box-Cox transformation can be improved by introducing an additional parameter representing the origin of transformation. In simulations, the confidence intervals (CIs) of reference limits (RLs) calculated by the parametric method were narrower than those calculated by the non-parametric approach. However, the margin of difference was rather small owing to additional variability in parametrically-determined RLs introduced by estimation of parameters for the Box-Cox transformation. The parametric calculation method may have an advantage over the non-parametric method in allowing identification and exclusion of extreme values during RI computation.
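
    The two computation routes contrasted above are easy to sketch side by side. In the fragment below the synthetic analyte values and the sample size of 400 are hypothetical; note that scipy's boxcox lacks the shift-of-origin parameter the paper proposes as a refinement:

      import numpy as np
      from scipy import stats
      from scipy.special import inv_boxcox

      rng = np.random.default_rng(7)
      values = rng.lognormal(mean=3.0, sigma=0.25, size=400)  # one analyte

      # Non-parametric RI: 2.5th and 97.5th percentiles of the sorted data
      lo_np, hi_np = np.percentile(values, [2.5, 97.5])

      # Parametric RI: Box-Cox to near-Gaussian, mean +/- 1.96 SD, back-transform
      y, lam = stats.boxcox(values)
      m, s = y.mean(), y.std(ddof=1)
      lo_p, hi_p = inv_boxcox(np.array([m - 1.96 * s, m + 1.96 * s]), lam)

      print(f"non-parametric RI: [{lo_np:.2f}, {hi_np:.2f}]")
      print(f"parametric RI:     [{lo_p:.2f}, {hi_p:.2f}]")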

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ray, Jaideep; Lee, Jina; Lefantzi, Sophia

    The estimation of fossil-fuel CO2 emissions (ffCO2) from limited ground-based and satellite measurements of CO2 concentrations will form a key component of the monitoring of treaties aimed at the abatement of greenhouse gas emissions. To that end, we construct a multiresolution spatial parametrization for fossil-fuel CO2 emissions (ffCO2), to be used in atmospheric inversions. Such a parametrization does not currently exist. The parametrization uses wavelets to accurately capture the multiscale, nonstationary nature of ffCO2 emissions and employs proxies of human habitation, e.g., images of lights at night and maps of built-up areas, to reduce the dimensionality of the multiresolution parametrization. The parametrization is used in a synthetic data inversion to test its suitability for use in atmospheric inverse problems. This linear inverse problem is predicated on observations of ffCO2 concentrations collected at measurement towers. We adapt a convex optimization technique, commonly used in the reconstruction of compressively sensed images, to perform sparse reconstruction of the time-variant ffCO2 emission field. We also borrow concepts from compressive sensing to impose boundary conditions, i.e., to limit ffCO2 emissions within an irregularly shaped region (the United States, in our case). We find that the optimization algorithm performs a data-driven sparsification of the spatial parametrization and retains only those wavelets whose weights could be estimated from the observations. Further, our method for the imposition of boundary conditions leads to an approximately tenfold computational saving over conventional means of doing so. We conclude with a discussion of the accuracy of the estimated emissions and the suitability of the spatial parametrization for use in inverse problems with a significant degree of regularization.
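
    The convex sparse-reconstruction step can be illustrated with a generic l1-regularized least-squares solver. The ISTA sketch below is a stand-in under stated assumptions (random sensing matrix, synthetic sparse weights); it is not the authors' code, their wavelet basis, or their boundary-condition treatment:

      import numpy as np

      def ista(A, y, lam=0.02, n_iter=500):
          """Iterative soft-thresholding for min_w 0.5*||A w - y||^2 + lam*||w||_1."""
          step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, L = Lipschitz constant
          w = np.zeros(A.shape[1])
          for _ in range(n_iter):
              w = w - step * (A.T @ (A @ w - y))                      # gradient step
              w = np.sign(w) * np.maximum(np.abs(w) - lam * step, 0)  # shrinkage
          return w

      rng = np.random.default_rng(8)
      n, m, k = 200, 60, 5                            # unknowns, obs, nonzeros
      w_true = np.zeros(n)
      w_true[rng.choice(n, k, replace=False)] = rng.normal(3, 1, k)
      A = rng.normal(size=(m, n)) / np.sqrt(m)
      y = A @ w_true + rng.normal(0, 0.01, m)
      w_hat = ista(A, y)
      print("nonzeros recovered:", int(np.sum(np.abs(w_hat) > 0.1)))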

  3. Diode step stress program, JANTX1N5614

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The reliability of switching diode JANTX1N5614 was tested. The effect of power/temperature step stress on the diode was determined. Control sample units were maintained for verification of the electrical parametric testing. Results are reported.

  4. Chiral symmetry constraints on resonant amplitudes

    NASA Astrophysics Data System (ADS)

    Bruns, Peter C.; Mai, Maxim

    2018-03-01

    We discuss the impact of chiral symmetry constraints on the quark-mass dependence of meson resonance pole positions, which are encoded in non-perturbative parametrizations of meson scattering amplitudes. Model-independent conditions on such parametrizations are derived, which are shown to guarantee the correct functional form of the leading quark-mass corrections to the resonance pole positions. Some model amplitudes for ππ scattering, widely used for the determination of ρ and σ resonance properties from results of lattice simulations, are tested explicitly with respect to these conditions.

  5. Estimating survival of radio-tagged birds

    USGS Publications Warehouse

    Bunck, C.M.; Pollock, K.H.; Lebreton, J.-D.; North, P.M.

    1993-01-01

    Parametric and nonparametric methods for estimating survival of radio-tagged birds are described. The general assumptions of these methods are reviewed. An estimate based on the assumption of constant survival throughout the period is emphasized in the overview of parametric methods. Two nonparametric methods, the Kaplan-Meier estimate of the survival function and the log-rank test, are explained in detail. The link between these nonparametric methods and traditional capture-recapture models is discussed, along with considerations in designing studies that use telemetry techniques to estimate survival.
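
    The Kaplan-Meier estimate mentioned above is straightforward to compute by hand. A small self-contained sketch with invented telemetry data, where status 0 marks a censored bird (e.g., a failed or lost transmitter):

      import numpy as np

      def kaplan_meier(times, events):
          """Kaplan-Meier survival curve; events: 1 = death observed, 0 = censored."""
          t = np.asarray(times, float)
          e = np.asarray(events, int)
          surv_t, surv_s, s = [], [], 1.0
          for ti in np.unique(t):
              d = np.sum((t == ti) & (e == 1))   # deaths at ti
              n = np.sum(t >= ti)                # birds at risk just before ti
              if d > 0:
                  s *= 1.0 - d / n
                  surv_t.append(ti)
                  surv_s.append(s)
          return np.array(surv_t), np.array(surv_s)

      days = [3, 10, 14, 14, 21, 30, 30, 45, 60, 60]
      status = [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]
      for ti, si in zip(*kaplan_meier(days, status)):
          print(f"day {ti:>4.0f}: S = {si:.2f}")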

  6. An algorithm for full parametric solution of problems on the statics of orthotropic plates by the method of boundary states with perturbations

    NASA Astrophysics Data System (ADS)

    Penkov, V. B.; Ivanychev, D. A.; Novikova, O. S.; Levina, L. V.

    2018-03-01

    The article substantiates the possibility of building full parametric analytical solutions of mathematical physics problems in arbitrary regions by means of computer systems. The suggested effective means for such solutions is the method of boundary states with perturbations, which aptly incorporates all parameters of an orthotropic medium in a general solution. We performed check calculations of elastic fields of an anisotropic rectangular region (test and calculation problems) for a generalized plane stress state.

  7. Laser-Based Remote Sensing of Explosives by a Differential Absorption and Scattering Method

    NASA Astrophysics Data System (ADS)

    Ayrapetyan, V. S.

    2018-01-01

    A multifunctional IR parametric laser system is developed and tested for remote detection and identification of atmospheric gases, including explosive and chemically aggressive substances. Calculations and experimental studies of remote determination of the spectroscopic parameters of the best-known explosive substances TNT, RDX, and PETN are carried out. The feasibility of high-sensitivity detection (~1 ppm) of these substances with the aid of a multifunctional IR parametric light source by differential absorption and scattering is demonstrated.

  8. The urban heat island in Rio de Janeiro, Brazil, in the last 30 years using remote sensing data

    NASA Astrophysics Data System (ADS)

    Peres, Leonardo de Faria; Lucena, Andrews José de; Rotunno Filho, Otto Corrêa; França, José Ricardo de Almeida

    2018-02-01

    The aim of this work is to study the urban heat island (UHI) in the Metropolitan Area of Rio de Janeiro (MARJ) based on the analysis of land-surface temperature (LST) and land-use patterns retrieved from Landsat-5/Thematic Mapper (TM), Landsat-7/Enhanced Thematic Mapper Plus (ETM+) and Landsat-8/Operational Land Imager (OLI) and Thermal Infrared Sensors (TIRS) data covering a 32-year period between 1984 and 2015. LST temporal evolution is assessed by comparing the average LST composites for 1984-1999 and 2000-2015, where the parametric Student t-test was conducted at the 5% significance level to map the pixels where LST for the more recent period is statistically significantly greater than for the previous one. The non-parametric Mann-Whitney-Wilcoxon rank-sum test has also confirmed at the same 5% significance level that the more recent period (2000-2015) has higher LST values. UHI intensity between 'urban' and 'rural/urban low density' ('vegetation') areas for 1984-1999 and 2000-2015 was established and confirmed by both parametric and non-parametric tests at the 1% significance level as 3.3 °C (5.1 °C) and 4.4 °C (7.1 °C), respectively. LST has statistically significantly (p-value < 0.01) increased over time in two of three land cover classes ('urban' and 'urban low density'), respectively by 1.9 °C and 0.9 °C, but not in the 'vegetation' class. A spatial analysis was also performed to identify the urban pixels within MARJ where UHI is more intense by subtracting the LST of these pixels from the LST mean value of the 'vegetation' land-use class.
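    A minimal sketch of the paired parametric/non-parametric comparison used in this study, applied to two hypothetical samples of per-pixel LST values (synthetic numbers, not the Landsat data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical per-pixel LST samples (deg C) for the two composites.
lst_1984_1999 = rng.normal(30.0, 2.0, 500)
lst_2000_2015 = rng.normal(31.5, 2.0, 500)

# Parametric: one-sided two-sample Student t-test (is the recent period warmer?).
t_stat, p_t = stats.ttest_ind(lst_2000_2015, lst_1984_1999, alternative="greater")
# Non-parametric check: Mann-Whitney-Wilcoxon rank-sum test.
u_stat, p_u = stats.mannwhitneyu(lst_2000_2015, lst_1984_1999, alternative="greater")

print(f"t-test p = {p_t:.4g}, Mann-Whitney p = {p_u:.4g}")
print("significant at 5% by both tests:", p_t < 0.05 and p_u < 0.05)
```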

  9. Post-Newtonian parameter γ in generalized non-local gravity

    NASA Astrophysics Data System (ADS)

    Zhang, Xue; Wu, YaBo; Yang, WeiQiang; Zhang, ChengYuan; Chen, BoHai; Zhang, Nan

    2017-10-01

    We investigate the post-Newtonian parameter γ and derive its formalism in generalized non-local (GNL) gravity, which is the modified theory of general relativity (GR) obtained by adding a term m^(2n−2) R □^(−n) R to the Einstein-Hilbert action. Concretely, based on parametrizing the generalized non-local action in which gravity is described by a series of dynamical scalar fields ϕ_i in addition to the metric tensor g_μν, the post-Newtonian limit is computed, and the effective gravitational constant as well as the post-Newtonian parameters are directly obtained from the generalized non-local gravity. Moreover, by discussing the values of the parametrized post-Newtonian parameter γ, we can compare our expressions and results with those in Hohmann and Järv et al. (2016), as well as with current observational constraints on the value of γ in Will (2006). Hence, we derive restrictions on the nonminimal coupling terms F̅ around their background values.

  10. Non-classical Signature of Parametric Fluorescence and its Application in Metrology

    NASA Astrophysics Data System (ADS)

    Hamar, M.; Michálek, V.; Pathak, A.

    2014-08-01

    The article provides a short theoretical background on what non-classical light means. We applied the criterion for the existence of non-classical effects derived by C.T. Lee to parametric fluorescence. The criterion was originally derived for the study of two light beams with one mode per beam. We checked through numerical simulations whether the criterion still works for two multimode beams of parametric down-conversion. The theoretical results were tested by measurement of the photon-number statistics of twin beams emitted by a nonlinear BBO crystal pumped by intense femtosecond UV pulses. We used an ICCD camera as the photon detector in both beams. It appears that the criterion can be used for the measurement of the quantum efficiencies of ICCD cameras.

  11. Implicit Priors in Galaxy Cluster Mass and Scaling Relation Determinations

    NASA Technical Reports Server (NTRS)

    Mantz, A.; Allen, S. W.

    2011-01-01

    Deriving the total masses of galaxy clusters from observations of the intracluster medium (ICM) generally requires some prior information, in addition to the assumptions of hydrostatic equilibrium and spherical symmetry. Often, this information takes the form of particular parametrized functions used to describe the cluster gas density and temperature profiles. In this paper, we investigate the implicit priors on hydrostatic masses that result from this fully parametric approach, and the implications of such priors for scaling relations formed from those masses. We show that the application of such fully parametric models of the ICM naturally imposes a prior on the slopes of the derived scaling relations, favoring the self-similar model, and argue that this prior may be influential in practice. In contrast, this bias does not exist for techniques which adopt an explicit prior on the form of the mass profile but describe the ICM non-parametrically. Constraints on the slope of the cluster mass-temperature relation in the literature show a separation based on the approach employed, with the results from fully parametric ICM modeling clustering nearer the self-similar value. Given that a primary goal of scaling relation analyses is to test the self-similar model, the application of methods subject to strong, implicit priors should be avoided. Alternative methods and best practices are discussed.

  12. EMISSION TEST REPORT, OMSS FIELD TEST ON CARBON INJECTION FOR MERCURY CONTROL

    EPA Science Inventory

    The report discusses results of a parametric evaluation of powdered activated carbon for control of mercury (Hg) emission from a municipal waste cornbustor (MWC) equipped with a lime spray dryer absorber/fabric filter (SD/FF). The primary test objectives were to evaluate the effe...

  13. Sample Size Determination for One- and Two-Sample Trimmed Mean Tests

    ERIC Educational Resources Information Center

    Luh, Wei-Ming; Olejnik, Stephen; Guo, Jiin-Huarng

    2008-01-01

    Formulas to determine the necessary sample sizes for parametric tests of group comparisons are available from several sources and appropriate when population distributions are normal. However, in the context of nonnormal population distributions, researchers recommend Yuen's trimmed mean test, but formulas to determine sample sizes have not been…
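    For context, a sketch of Yuen's trimmed-mean test itself (the test whose sample-size formulas this record concerns), following the usual formulation with winsorized variances; the data and trimming level below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def yuen_test(x, y, trim=0.2):
    """Yuen's two-sample test on trimmed means (robust to nonnormal tails)."""
    def winsorized_var(a, trim):
        a = np.sort(a)
        g = int(np.floor(trim * len(a)))
        w = a.copy()
        w[:g] = a[g]                       # replace the lowest g values
        w[len(a) - g:] = a[len(a) - g - 1] # replace the highest g values
        return np.var(w, ddof=1)

    nx, ny = len(x), len(y)
    gx, gy = int(np.floor(trim * nx)), int(np.floor(trim * ny))
    hx, hy = nx - 2 * gx, ny - 2 * gy      # effective (trimmed) sample sizes
    dx = (nx - 1) * winsorized_var(x, trim) / (hx * (hx - 1))
    dy = (ny - 1) * winsorized_var(y, trim) / (hy * (hy - 1))
    t = (stats.trim_mean(x, trim) - stats.trim_mean(y, trim)) / np.sqrt(dx + dy)
    df = (dx + dy) ** 2 / (dx**2 / (hx - 1) + dy**2 / (hy - 1))
    return t, 2 * stats.t.sf(abs(t), df)

rng = np.random.default_rng(2)
x = rng.standard_t(df=3, size=40) + 0.8    # heavy-tailed, shifted sample
y = rng.standard_t(df=3, size=40)
print(yuen_test(x, y))
```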

  14. Parametric Fin-Body and Fin-Plate Database for a Series of 12 Missile Fins

    NASA Technical Reports Server (NTRS)

    Allen, Jerry M.

    2001-01-01

    A cooperative experimental investigation has been performed to obtain a systematic fin-body and fin-plate database for a series of 12 missile fins. These data are intended to complement and extend the information contained in the Triservice missile project and to provide a systematic set of experimental data from which fin-body interference factors can be derived. Data were obtained with the fins mounted on both an axisymmetric body and on a flat plate that was used to simulate fin-alone measurements. The experiments were conducted at Mach numbers from 0.60 to 3.95; fin deflection angles of 0 deg, 10 deg, and -10 deg; and angles of attack up to 30 deg on the body and up to 95 deg on the flat plate. The data were obtained from three-component balances attached to the fins and a six-component balance located in the axisymmetric body. The data obtained in this project are documented in tabular form in this report. In addition, selected data are presented in graphical form to illustrate the effects of the test variables. These variables are configuration angle of attack; Mach number; and fin parameters of deflection angle, planform size, taper ratio, and aspect ratio. A very limited comparison with the Triservice missile data is made to illustrate the consistency between the data from these two projects.

  15. Toward an Empirically-Based Parametric Explosion Spectral Model

    DTIC Science & Technology

    2011-09-01

    …Nevada National Security Site (NNSS, formerly the Nevada Test Site) with data from explosions at the Semipalatinsk Test Site recorded at the Borovoye Geophysical Observatory (BRV). The BRV data archive … explosions at the Semipalatinsk Test Site of the former Soviet Union (Figure 4). As an example, we plot the regional phase spectra of one of …

  16. Nonlinear complexity of random visibility graph and Lempel-Ziv on multitype range-intensity interacting financial dynamics

    NASA Astrophysics Data System (ADS)

    Zhang, Yali; Wang, Jun

    2017-09-01

    In an attempt to investigate the nonlinear complex evolution of financial dynamics, a new financial price model, the multitype range-intensity contact (MRIC) financial model, is developed based on the multitype range-intensity interacting contact system, in which the interaction and transmission of different types of investment attitudes in a stock market are simulated by the spreading of viruses. Two new random visibility graph (VG) based analyses and Lempel-Ziv complexity (LZC) are applied to study the complex behaviors of return time series and the corresponding randomly sorted series. The VG method comes from complex network theory, and LZC is a non-parametric measure of complexity reflecting the rate of new-pattern generation in a series. In this work, real stock market indices are comparatively studied against simulation data from the proposed model. The numerical empirical study shows similar complexity behaviors between the model and the real markets; this confirms that the financial model is reasonable to some extent.
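    A minimal sketch of the Lempel-Ziv complexity measure mentioned above, in its common LZ76 exhaustive-search form, applied to a median-binarized synthetic return series; the binarization rule and normalization are conventional choices, not necessarily the paper's.

```python
import numpy as np

def lempel_ziv_complexity(seq):
    """LZ76 complexity: number of new phrases met while scanning the
    sequence left to right (rate of new-pattern generation)."""
    s = "".join(map(str, seq))
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # Extend the current phrase while it already occurs in the
        # prefix seen so far (overlap allowed).
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

rng = np.random.default_rng(3)
returns = rng.standard_normal(1000)               # stand-in for index returns
binary = (returns > np.median(returns)).astype(int)
n = len(binary)
c = lempel_ziv_complexity(binary)
# Normalize by the random-sequence asymptote n / log2(n).
print("normalized LZC:", c * np.log2(n) / n)
```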

  17. PSFGAN: a generative adversarial network system for separating quasar point sources and host galaxy light

    NASA Astrophysics Data System (ADS)

    Stark, Dominic; Launet, Barthelemy; Schawinski, Kevin; Zhang, Ce; Koss, Michael; Turp, M. Dennis; Sartori, Lia F.; Zhang, Hantian; Chen, Yiru; Weigel, Anna K.

    2018-06-01

    The study of unobscured active galactic nuclei (AGN) and quasars depends on the reliable decomposition of the light from the AGN point source and the extended host galaxy light. The problem is typically approached with parametric fitting routines that use separate models for the host galaxy and the point spread function (PSF). We present a new approach using a Generative Adversarial Network (GAN) trained on galaxy images. We test the method using Sloan Digital Sky Survey r-band images with artificial AGN point sources added, which are then removed using the GAN and with parametric methods using GALFIT. When the AGN point source is more than twice as bright as the host galaxy, we find that our method, PSFGAN, can recover point source and host galaxy magnitudes with smaller systematic error and a lower average scatter (49 per cent). PSFGAN is more tolerant of poor knowledge of the PSF than parametric methods. Our tests show that PSFGAN is robust against a broadening of the PSF width of ±50 per cent if it is trained on multiple PSFs. We demonstrate that while a matched training set does improve performance, we can still subtract point sources using a PSFGAN trained on non-astronomical images. While initial training is computationally expensive, evaluating PSFGAN on data is more than 40 times faster than GALFIT fitting two components. Finally, PSFGAN is more robust and easier to use than parametric methods as it requires no input parameters.

  18. Computational Analysis for Rocket-Based Combined-Cycle Systems During Rocket-Only Operation

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Smith, T. D.; Yungster, S.; Keller, D. J.

    2000-01-01

    A series of Reynolds-averaged Navier-Stokes calculations was employed to study the performance of rocket-based combined-cycle systems operating in an all-rocket mode. This parametric series of calculations was executed within a statistical framework, commonly known as design of experiments. The parametric design space included four geometric and two flowfield variables set at three levels each, for a total of 729 possible combinations. A D-optimal design strategy was selected. It required that only 36 separate computational fluid dynamics (CFD) solutions be performed to develop a full response surface model, which quantified the linear, bilinear, and curvilinear effects of the six experimental variables. The axisymmetric, Reynolds-averaged Navier-Stokes simulations were executed with the NPARC v3.0 code. The response used in the statistical analysis was created from Isp efficiency data integrated from the 36 CFD simulations. The influence of turbulence modeling was analyzed by using both one- and two-equation models. Careful attention was also given to quantifying the influence of mesh dependence, iterative convergence, and artificial viscosity upon the resulting statistical model. Thirteen statistically significant effects were observed to have an influence on rocket-based combined-cycle nozzle performance. It was apparent that the free-expansion process, directly downstream of the rocket nozzle, can influence the Isp efficiency. Numerical schlieren images and particle traces have been used to further understand the physical phenomena behind several of the statistically significant results.

  19. Controlling for seasonal patterns and time varying confounders in time-series epidemiological models: a simulation study.

    PubMed

    Perrakis, Konstantinos; Gryparis, Alexandros; Schwartz, Joel; Le Tertre, Alain; Katsouyanni, Klea; Forastiere, Francesco; Stafoggia, Massimo; Samoli, Evangelia

    2014-12-10

    An important topic when estimating the effect of air pollutants on human health is choosing the best method to control for seasonal patterns and time varying confounders, such as temperature and humidity. Semi-parametric Poisson time-series models include smooth functions of calendar time and weather effects to control for potential confounders. Case-crossover (CC) approaches are considered efficient alternatives that control seasonal confounding by design and allow inclusion of smooth functions of weather confounders through their equivalent Poisson representations. We evaluate both methodological designs with respect to seasonal control and compare spline-based approaches, using natural splines and penalized splines, and two time-stratified CC approaches. For the spline-based methods, we consider fixed degrees of freedom, minimization of the partial autocorrelation function, and general cross-validation as smoothing criteria. Issues of model misspecification with respect to weather confounding are investigated under simulation scenarios, which allow quantifying omitted, misspecified, and irrelevant-variable bias. The simulations are based on fully parametric mechanisms designed to replicate two datasets with different mortality and atmospheric patterns. Overall, minimum partial autocorrelation function approaches provide more stable results for high mortality counts and strong seasonal trends, whereas natural splines with fixed degrees of freedom perform better for low mortality counts and weak seasonal trends followed by the time-season-stratified CC model, which performs equally well in terms of bias but yields higher standard errors. Copyright © 2014 John Wiley & Sons, Ltd.
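    A hedged sketch of the semi-parametric Poisson time-series design described above: a Poisson GLM with natural cubic splines of calendar time and temperature fitted to synthetic daily data. The variable names, spline degrees of freedom (roughly 7 df per year of calendar time, a common choice in this literature), and effect sizes are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_days = 3 * 365
day = np.arange(n_days)
temp = 15 + 10 * np.sin(2 * np.pi * day / 365) + rng.normal(0, 2, n_days)
pm10 = rng.gamma(4, 10, n_days)
# Synthetic daily deaths: seasonal trend + temperature + small PM10 effect.
log_mu = 3.0 + 0.2 * np.sin(2 * np.pi * day / 365) + 0.005 * (temp - 15) + 0.001 * pm10
deaths = rng.poisson(np.exp(log_mu))
df = pd.DataFrame({"deaths": deaths, "day": day, "temp": temp, "pm10": pm10})

# Semi-parametric Poisson model: natural cubic splines (patsy's cr) of
# calendar time and temperature control seasonal and weather confounding.
model = smf.glm(
    "deaths ~ pm10 + cr(day, df=21) + cr(temp, df=4)",
    data=df, family=sm.families.Poisson(),
).fit()
print("PM10 effect estimate:", model.params["pm10"], "+/-", model.bse["pm10"])
```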

  20. Research on respiratory motion correction method based on liver contrast-enhanced ultrasound images of single mode

    NASA Astrophysics Data System (ADS)

    Zhang, Ji; Li, Tao; Zheng, Shiqiang; Li, Yiyong

    2015-03-01

    The aim was to reduce the effects of respiratory motion in quantitative analysis based on single-mode liver contrast-enhanced ultrasound (CEUS) image sequences. An image-gating method and an iterative registration method using a model image were adopted to register single-mode liver CEUS image sequences. The feasibility of the proposed respiratory motion correction method was explored preliminarily using 10 hepatocellular carcinoma CEUS cases. The positions of the lesions in the time series of 2D ultrasound images after correction were visually evaluated. Before and after correction, the quality of the weighted sum of transit time (WSTT) parametric images was also compared in terms of accuracy and spatial resolution. For the corrected and uncorrected sequences, the mean deviation values (mDVs) of time-intensity curve (TIC) fitting derived from the CEUS sequences were measured. After the correction, the positions of the lesions in the time series of 2D ultrasound images were almost invariant. In contrast, the lesions in the uncorrected images all shifted noticeably. The quality of the WSTT parametric maps derived from liver CEUS image sequences was markedly improved. Moreover, the mDVs of TIC fitting derived from the CEUS sequences after correction decreased by an average of 48.48 ± 42.15. The proposed correction method could improve the accuracy of quantitative analysis based on single-mode liver CEUS image sequences, which would help enhance the differential diagnosis efficiency of liver tumors.

  1. Path-space variational inference for non-equilibrium coarse-grained systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harmandaris, Vagelis, E-mail: harman@uoc.gr; Institute of Applied and Computational Mathematics; Kalligiannaki, Evangelia, E-mail: ekalligian@tem.uoc.gr

    In this paper we discuss information-theoretic tools for obtaining optimized coarse-grained molecular models for both equilibrium and non-equilibrium molecular simulations. The latter are ubiquitous in physicochemical and biological applications, where they are typically associated with coupling mechanisms, multi-physics and/or boundary conditions. In general the non-equilibrium steady states are not known explicitly as they do not necessarily have a Gibbs structure. The presented approach can compare microscopic behavior of molecular systems to parametric and non-parametric coarse-grained models using the relative entropy between distributions on the path space and setting up a corresponding path-space variational inference problem. The methods can become entirely data-driven when the microscopic dynamics are replaced with corresponding correlated data in the form of time series. Furthermore, we present connections and generalizations of force matching methods in coarse-graining with path-space information methods. We demonstrate the enhanced transferability of information-based parameterizations to different observables, at a specific thermodynamic point, due to information inequalities. We discuss methodological connections between information-based coarse-graining of molecular systems and variational inference methods primarily developed in the machine learning community. However, we note that the work presented here addresses variational inference for correlated time series due to the focus on dynamics. The applicability of the proposed methods is demonstrated on high-dimensional stochastic processes given by overdamped and driven Langevin dynamics of interacting particles.

  2. Development, Evaluation, and Sensitivity Analysis of Parametric Finite Element Whole-Body Human Models in Side Impacts.

    PubMed

    Hwang, Eunjoo; Hu, Jingwen; Chen, Cong; Klein, Katelyn F; Miller, Carl S; Reed, Matthew P; Rupp, Jonathan D; Hallman, Jason J

    2016-11-01

    Occupant stature and body shape may have significant effects on injury risks in motor vehicle crashes, but the current finite element (FE) human body models (HBMs) only represent occupants with a few sizes and shapes. Our recent studies have demonstrated that, by using a mesh morphing method, parametric FE HBMs can be rapidly developed for representing a diverse population. However, the biofidelity of those models across a wide range of human attributes has not been established. Therefore, the objectives of this study are 1) to evaluate the accuracy of HBMs considering subject-specific geometry information, and 2) to apply the parametric HBMs in a sensitivity analysis for identifying the specific parameters affecting body responses in side impact conditions. Four side-impact tests with two male post-mortem human subjects (PMHSs) were selected to evaluate the accuracy of the geometry and impact responses of the morphed HBMs. For each PMHS test, three HBMs were simulated to compare with the test results: the original Total Human Model for Safety (THUMS) v4.01 (O-THUMS), a parametric THUMS (P-THUMS), and a subject-specific THUMS (S-THUMS). The P-THUMS geometry was predicted from only age, sex, stature, and BMI using our statistical geometry models of skeleton and body shape, while the S-THUMS geometry was based on each PMHS's CT data. The simulation results showed a preliminary trend that the correlations between the P-THUMS-predicted impact responses and the four PMHS tests (mean CORA: 0.84, 0.78, 0.69, 0.70) were better than those between the O-THUMS and the normalized PMHS responses (mean CORA: 0.74, 0.72, 0.55, 0.63), and similar to the correlations between the S-THUMS and the PMHS tests (mean CORA: 0.85, 0.85, 0.67, 0.72). The sensitivity analysis using the P-THUMS showed that, in side impact conditions, the HBM skeleton and body shape geometries as well as the body posture were more important in modeling the occupant impact responses than the bone and soft tissue material properties and the padding stiffness within the given parameter ranges. More investigations are needed to further support these findings.

  3. A Nonparametric K-Sample Test for Equality of Slopes.

    ERIC Educational Resources Information Center

    Penfield, Douglas A.; Koffler, Stephen L.

    1986-01-01

    The development of a nonparametric K-sample test for equality of slopes using Puri's generalized L statistic is presented. The test is recommended when the assumptions underlying the parametric model are violated. This procedure replaces original data with either ranks (for data with heavy tails) or normal scores (for data with light tails).…

  4. Performance of DIMTEST-and NOHARM-Based Statistics for Testing Unidimensionality

    ERIC Educational Resources Information Center

    Finch, Holmes; Habing, Brian

    2007-01-01

    This Monte Carlo study compares the ability of the parametric bootstrap version of DIMTEST with three goodness-of-fit tests calculated from a fitted NOHARM model to detect violations of the assumption of unidimensionality in testing data. The effectiveness of the procedures was evaluated for different numbers of items, numbers of examinees,…

  5. Capabilities of stochastic rainfall models as data providers for urban hydrology

    NASA Astrophysics Data System (ADS)

    Haberlandt, Uwe

    2017-04-01

    For planning of urban drainage systems using hydrological models, long, continuous precipitation series with high temporal resolution are needed. Since observed time series are often too short or not available everywhere, the use of synthetic precipitation is a common alternative. This contribution compares three precipitation models regarding their suitability to provide 5 minute continuous rainfall time series for a) sizing of drainage networks for urban flood protection and b) dimensioning of combined sewage systems for pollution reduction. The rainfall models are a parametric stochastic model (Haberlandt et al., 2008), a non-parametric probabilistic approach (Bárdossy, 1998) and a stochastic downscaling of dynamically simulated rainfall (Berg et al., 2013); all models are operated both as single site and multi-site generators. The models are applied with regionalised parameters assuming that there is no station at the target location. Rainfall and discharge characteristics are utilised for evaluation of the model performance. The simulation results are compared against results obtained from reference rainfall stations not used for parameter estimation. The rainfall simulations are carried out for the federal states of Baden-Württemberg and Lower Saxony in Germany and the discharge simulations for the drainage networks of the cities of Hamburg, Brunswick and Freiburg. Altogether, the results show comparable simulation performance for the three models, good capabilities for single site simulations but low skills for multi-site simulations. Remarkably, there is no significant difference in simulation performance comparing the tasks flood protection with pollution reduction, so the models are finally able to simulate both the extremes and the long term characteristics of rainfall equally well. Bárdossy, A., 1998. Generating precipitation time series using simulated annealing. Wat. Resour. Res., 34(7): 1737-1744. Berg, P., Wagner, S., Kunstmann, H., Schädler, G., 2013. High resolution regional climate model simulations for Germany: part I — validation. Climate Dynamics, 40(1): 401-414. Haberlandt, U., Ebner von Eschenbach, A.-D., Buchwald, I., 2008. A space-time hybrid hourly rainfall model for derived flood frequency analysis. Hydrol. Earth Syst. Sci., 12: 1353-1367.

  6. Applications of the post-Tolman-Oppenheimer-Volkoff formalism

    NASA Astrophysics Data System (ADS)

    Silva, Hector O.; Glampedakis, Kostas; Pappas, George; Berti, Emanuele

    2017-01-01

    Besides their astrophysical interest, neutron stars are promising candidates for testing theories of gravity in the strong-field regime. It is known that, generically, modifications to general relativity affect the bulk properties of neutron stars, e.g. their masses and radii, in a way that depends on the specific choice of theory. In this presentation we review a theory-agnostic approach to modeling relativistic stars, called the post-Tolman-Oppenheimer-Volkoff formalism. Drawing inspiration from the parametrized post-Newtonian formalism, this framework allows us to describe perturbative deviations from general relativity in the structure of neutron stars in a parametrized manner. We show that a variety of astrophysical observables (namely the surface redshift, the apparent radius, the Eddington luminosity and the orbital frequency of particles in geodesic motion around neutron stars) can be parametrized using only two parameters.

  7. Effect of Monovalent Ion Parameters on Molecular Dynamics Simulations of G-Quadruplexes.

    PubMed

    Havrila, Marek; Stadlbauer, Petr; Islam, Barira; Otyepka, Michal; Šponer, Jiří

    2017-08-08

    G-quadruplexes (GQs) are key noncanonical DNA and RNA architectures stabilized by desolvated monovalent cations present in their central channels. We analyze extended atomistic molecular dynamics simulations (∼580 μs in total) of GQs with 11 monovalent cation parametrizations, assessing GQ overall structural stability, dynamics of internal cations, and distortions of the G-tetrad geometries. The majority of simulations were executed with the SPC/E water model; however, test simulations with the TIP3P and OPC water models are also reported. The identity and parametrization of ions strongly affect the behavior of a tetramolecular d[GGG]4 GQ, which is unstable with several ion parametrizations. The remaining studied RNA and DNA GQs are structurally stable, though the G-tetrad geometries are always deformed by bifurcated H-bonding in a parametrization-specific manner. Thus, basic 10-μs-scale simulations of fully folded GQs can be safely done with a number of cation parametrizations. However, there are parametrization-specific differences and basic force-field errors affecting the quantitative description of ion-tetrad interactions, which may significantly affect studies of ion-binding processes and description of the GQ folding landscape. Our d[GGG]4 simulations indirectly suggest that such studies will also be sensitive to the water models. During exchanges with bulk water, the Na+ ions move inside the GQs in a concerted manner, while larger relocations of the K+ ions are typically separated. We suggest that the Joung-Cheatham SPC/E K+ parameters represent a safe choice in simulation studies of GQs, though variation of ion parameters can be used for specific simulation goals.

  8. Impact of state updating and multi-parametric ensemble for streamflow hindcasting in European river basins

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Rakovec, O.; Kumar, R.; Samaniego, L. E.

    2015-12-01

    Accurate and reliable streamflow prediction is essential to mitigate social and economic damage coming from water-related disasters such as flood and drought. Sequential data assimilation (DA) may facilitate improved streamflow prediction using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is often ignored mainly due to practical limitations of methodology to specify modeling uncertainty with limited ensemble members. However, if parametric uncertainty related with routing and runoff components is not incorporated properly, predictive uncertainty by model ensemble may be insufficient to capture dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using a same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) could effectively represent and control uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we evaluate impacts of streamflow data assimilation over European river basins. Especially, a multi-parametric ensemble approach is tested to consider the effects of parametric uncertainty in DA. Because augmentation of parameters is not required within an assimilation window, the approach could be more stable with limited ensemble members and have potential for operational uses. To consider the response times and non-Gaussian characteristics of internal hydrologic processes, lagged particle filtering is utilized. The presentation will be focused on gains and limitations of streamflow data assimilation and multi-parametric ensemble method over large-scale basins.
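    As a toy illustration of sequential data assimilation by particle filtering (here a plain bootstrap filter on a one-bucket storage-discharge model, not mHM or the lagged variant used in the study), with all model equations and noise levels invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(5)

def propagate(s, rain_t):
    """Toy bucket model: storage decays, rain adds, plus process noise."""
    return 0.95 * s + rain_t + rng.normal(0.0, 0.5, s.shape)

def discharge(s):
    """Power-law storage-discharge relation (made up for this sketch)."""
    return 0.1 * np.maximum(s, 0.0) ** 1.2

n_particles, n_steps, obs_sigma = 1000, 50, 0.3
rain = rng.gamma(2.0, 1.0, n_steps)

# Generate a synthetic "truth" and noisy discharge observations.
s_true, y_obs = 10.0, []
for t in range(n_steps):
    s_true = 0.95 * s_true + rain[t]
    y_obs.append(discharge(np.array([s_true]))[0] + rng.normal(0.0, obs_sigma))

# Bootstrap particle filter: forecast, weight by likelihood, resample.
particles = np.full(n_particles, 10.0)
for t in range(n_steps):
    particles = propagate(particles, rain[t])
    w = np.exp(-0.5 * ((y_obs[t] - discharge(particles)) / obs_sigma) ** 2)
    w /= w.sum()
    particles = particles[rng.choice(n_particles, n_particles, p=w)]

print("filtered storage estimate:", particles.mean())
```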

  9. Design and the parametric testing of the space station prototype integrated vapor compression distillation water recovery module

    NASA Technical Reports Server (NTRS)

    Reveley, W. F.; Nuccio, P. P.

    1975-01-01

    Potable water for the Space Station Prototype life support system is generated by the vapor compression technique of vacuum distillation. A description of a complete three-man modular vapor compression water renovation loop that was built and tested is presented; included are all of the pumps, tankage, chemical post-treatment, instrumentation, and controls necessary to make the loop representative of an automatic, self-monitoring, null gravity system. The design rationale is given and the evolved configuration is described. Presented next are the results of an extensive parametric test during which distilled water was generated from urine and urinal flush water with concentration of solids in the evaporating liquid increasing progressively to 60 percent. Water quality, quantity and production rate are shown together with measured energy consumption rate in terms of watt-hours per kilogram of distilled water produced.

  10. Packaging Technology for SiC High Temperature Circuits Operable up to 500 Degrees Centigrade

    NASA Technical Reports Server (NTRS)

    Chen, Lian-Yu

    2002-01-01

    New high temperature low power 8-pin packages have been fabricated using commercial fabrication service. These packages are made of aluminum nitride and 96 percent alumina with Au metallization. The new design of these packages provides the chips inside with EM shielding. Wirebond geometry control has been achieved for precise mechanical tests. Au wirebond samples with 45 degree heel-angle have been tested using wireloop test module. The geometry control improves the consistency of measurement of the wireloop breaking point.Also reported on is a parametric study of the thermomechanical reliability of a Au thick-film based SiC die-attach assembly using nonlinear finite element analysis (FEA) was conducted to optimize the die-attach thermo-mechanical performance for operation at temperatures from room temperature to 500 degrees Centigrade. This parametric study centered on material selection, structure design and process control.

  11. Thermal Testing and Analysis of an Efficient High-Temperature Multi-Screen Internal Insulation

    NASA Technical Reports Server (NTRS)

    Weiland, Stefan; Handrick, Karin; Daryabeigi, Kamran

    2007-01-01

    Conventional multi-layer insulations exhibit excellent insulation performance but they are limited to the temperature range to which their components reflective foils and spacer materials are compatible. For high temperature applications, the internal multi-screen insulation IMI has been developed that utilizes unique ceramic material technology to produce reflective screens with high temperature stability. For analytical insulation sizing a parametric material model is developed that includes the main contributors for heat flow which are radiation and conduction. The adaptation of model-parameters based on effective steady-state thermal conductivity measurements performed at NASA Langley Research Center (LaRC) allows for extrapolation to arbitrary stack configurations and temperature ranges beyond the ones that were covered in the conductivity measurements. Experimental validation of the parametric material model was performed during the thermal qualification test of the X-38 Chin-panel, where test results and predictions showed a good agreement.

  12. Local tests of gravitation with Gaia observations of Solar System Objects

    NASA Astrophysics Data System (ADS)

    Hees, Aurélien; Le Poncin-Lafitte, Christophe; Hestroffer, Daniel; David, Pedro

    2018-04-01

    In this proceeding, we show how observations of Solar System Objects with Gaia can be used to test General Relativity and to constrain modified gravitational theories. The high number of Solar System objects observed and the variety of their orbital parameters associated with the impressive astrometric accuracy will allow us to perform local tests of General Relativity. In this communication, we present a preliminary sensitivity study of the Gaia observations on dynamical parameters such as the Sun quadrupolar moment and on various extensions to general relativity such as the parametrized post-Newtonian parameters, the fifth force formalism and a violation of Lorentz symmetry parametrized by the Standard-Model extension framework. We take into account the time sequences and the geometry of the observations that are particular to Gaia for its nominal mission (5 years) and for an extended mission (10 years).

  13. Assessing noninferiority in a three-arm trial using the Bayesian approach.

    PubMed

    Ghosh, Pulak; Nathoo, Farouk; Gönen, Mithat; Tiwari, Ram C

    2011-07-10

    Non-inferiority trials, which aim to demonstrate that a test product is not worse than a competitor by more than a pre-specified small amount, are of great importance to the pharmaceutical community. As a result, methodology for designing and analyzing such trials is required, and developing new methods for such analysis is an important area of statistical research. The three-arm trial consists of a placebo, a reference and an experimental treatment, and simultaneously tests the superiority of the reference over the placebo along with comparing this reference to an experimental treatment. In this paper, we consider the analysis of non-inferiority trials using Bayesian methods which incorporate both parametric as well as semi-parametric models. The resulting testing approach is both flexible and robust. The benefit of the proposed Bayesian methods is assessed via simulation, based on a study examining home-based blood pressure interventions. Copyright © 2011 John Wiley & Sons, Ltd.
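    A minimal sketch of the three-arm logic under a simple fully parametric (conjugate normal) Bayesian model, checking both reference-versus-placebo superiority and test-versus-reference non-inferiority; the data, margin, and vague prior are hypothetical, and the paper's semi-parametric extension is not shown.

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical blood-pressure reductions (mmHg); larger is better.
placebo   = rng.normal(2.0, 8.0, 120)
reference = rng.normal(8.0, 8.0, 120)
test      = rng.normal(7.5, 8.0, 120)
margin = 2.5   # pre-specified non-inferiority margin

def posterior(sample, n_draws=100_000):
    """Posterior draws of the mean under a vague prior: reduces to a
    scaled t-distribution centered at the sample mean."""
    n, m, s = len(sample), sample.mean(), sample.std(ddof=1)
    return m + s / np.sqrt(n) * rng.standard_t(n - 1, n_draws)

mu_p, mu_r, mu_t = posterior(placebo), posterior(reference), posterior(test)
print("P(reference beats placebo)    :", np.mean(mu_r > mu_p))
print("P(test non-inferior to ref.)  :", np.mean(mu_t - mu_r > -margin))
```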

  14. On the efficacy of procedures to normalize Ex-Gaussian distributions.

    PubMed

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2014-01-01

    Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed. Hence, it is acknowledged by many that the normality assumption is not met. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier-elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than elimination methods in normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are in normalizing such data. Specifically, transformation with parameter λ = −1 leads to the best results.
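    A short sketch of the transformation approach the paper favors: simulate Ex-Gaussian RTs, apply the reciprocal (Box-Cox λ = −1) transform, and compare normality before and after. The parameter values below are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Ex-Gaussian RTs: normal (mu, sigma) plus an exponential (tau) component.
rt = rng.normal(400, 40, 2000) + rng.exponential(200, 2000)

print("raw          W, p:", stats.shapiro(rt[:500]))
# Reciprocal transform (Box-Cox lambda = -1), often expressed as a "speed"
# score; the sign and scale keep ordering and readable magnitudes.
speed = -1000.0 / rt
print("transformed  W, p:", stats.shapiro(speed[:500]))
print("skewness before/after:", stats.skew(rt), stats.skew(speed))
```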

  15. RSRA sixth scale wind tunnel test. Tabulated balance data, volume 2

    NASA Technical Reports Server (NTRS)

    Ruddell, A.; Flemming, R.

    1974-01-01

    Summaries are presented of all the force and moment data acquired during the RSRA Sixth Scale Wind Tunnel Test. These data include and supplement the data presented in curve form in previous reports. Each summary includes the model configuration, wing and empennage incidences and deflections, and recorded balance data. The first group of data in each summary presents the force and moment data in full scale parametric form, the dynamic pressure and velocity in the test section, and the powered nacelle fan speed. The second and third groups of data are the balance data in nondimensional coefficient form. The wind axis coefficient data corresponds to the parametric data divided by the wing area for forces and divided by the product of the wing area and wing span or mean aerodynamic chord for moments. The stability axis data resolves the wind axis data with respect to the angle of yaw.

  16. Head impact mechanisms of a child occupant seated in a child restraint system as determined by impact testing.

    PubMed

    Yoshida, Ryoichi; Okada, Hiroshi; Nomura, Mitsunori; Mizuno, Koji; Tanaka, Yoshinori; Hosokawa, Naruyuki

    2011-11-01

    In side collision accidents, the head is the most frequently injured body region for child occupants seated in a child restraint system (CRS). Accident analyses show that a child's head can move out of the CRS shell, make hard contact with the vehicle interior, and thus sustain serious injuries. In order to improve child head protection in side collisions, it is necessary to understand the injury mechanism of a child in the CRS whose head makes contact with the vehicle interior. In this research, an SUV-to-car oblique side crash test was conducted to reconstruct such head contacts. A Q3s child dummy was seated in a CRS in the rear seat of the target car. The Q3s child dummy's head moved out beyond the CRS side wing, moved laterally, and made contact with the side window glass and the doorsill. It was demonstrated that the hard head contact, which produced a high HIC value, could occur in side collisions. A series of sled tests was carried out to reproduce the dummy kinematic behavior observed in the SUV-to-car crash test, and the sled test conditions such as sled angle, ECE seat slant angle and velocity-time history that duplicated the kinematic behavior were determined. A parametric study also was conducted with the sled tests; and it was found that the impact angle, harness slack, chest clip, and the CRS side wing shape affected the torso motion and head contact with the vehicle interior.

  17. Refining patterns of joint hypermobility, habitus, and orthopedic traits in joint hypermobility syndrome and Ehlers-Danlos syndrome, hypermobility type.

    PubMed

    Morlino, Silvia; Dordoni, Chiara; Sperduti, Isabella; Venturini, Marina; Celletti, Claudia; Camerota, Filippo; Colombi, Marina; Castori, Marco

    2017-04-01

    Joint hypermobility syndrome (JHS) and Ehlers-Danlos syndrome, hypermobility type (EDS-HT) are two overlapping heritable disorders (JHS/EDS-HT) recognized by separate sets of diagnostic criteria and still lacking a confirmatory test. This descriptive research was aimed at better characterizing the clinical phenotype of JHS/EDS-HT with a focus on available diagnostic criteria, in order to propose novel features and assessment strategies. One hundred and eighty-nine (163 females, 26 males; age: 2-73 years) patients from two Italian reference centers were investigated for Beighton score, range of motion in 21 additional joints, rate and sites of dislocations and sprains, recurrent soft-tissue injuries, tendon and muscle ruptures, body mass index, arm span/height ratio, wrist and thumb signs, and 12 additional orthopedic features. Rough rates were compared by age, sex, and handedness with a series of parametric and non-parametric tools. Multiple correspondence analysis was carried out to detect possible co-segregation of features. Beighton score and hypermobility at other joints were influenced by age at diagnosis. Rate and sites of joint instability complications did not vary according to age at diagnosis, except for soft-tissue injuries. No major difference was registered by sex or by dominant versus non-dominant body side. At multiple correspondence analysis, selected features tended to co-segregate in a dichotomous distribution. Dolichostenomelia and arachnodactyly segregated independently. This study pointed out a more protean musculoskeletal phenotype than previously considered according to available diagnostic criteria for JHS/EDS-HT. Our findings corroborate the need for a re-thinking of JHS/EDS-HT on clinical grounds in order to find better therapeutic and research strategies. © 2017 Wiley Periodicals, Inc.

  18. Non-parametric trend analysis of the aridity index for three large arid and semi-arid basins in Iran

    NASA Astrophysics Data System (ADS)

    Ahani, Hossien; Kherad, Mehrzad; Kousari, Mohammad Reza; van Roosmalen, Lieke; Aryanfar, Ramin; Hosseini, Seyyed Mashaallah

    2013-05-01

    Currently, an important scientific challenge that researchers are facing is to gain a better understanding of climate change at the regional scale, which can be especially challenging in an area with low and highly variable precipitation amounts such as Iran. Trend analysis of the medium-term change using ground station observations of meteorological variables can enhance our knowledge of the dominant processes in an area and contribute to the analysis of future climate projections. Generally, studies focus on the long-term variability of temperature and precipitation and to a lesser extent on other important parameters such as moisture indices. In this study the recent 50-year trends (1955-2005) of precipitation (P), potential evapotranspiration (PET), and the aridity index (AI) at a monthly time scale were studied over 14 synoptic stations in three large Iranian basins using the Mann-Kendall non-parametric test. Additionally, an analysis of the monthly, seasonal and annual trend of each parameter was performed. Results showed no significant trends in the monthly time series. However, PET showed significant, mostly decreasing trends, for the seasonal values, which resulted in a significant negative trend in annual PET at five stations. Significant negative trends in seasonal P values were only found at a number of stations in spring and summer and no station showed significant negative trends in annual P. Due to the varied positive and negative trends in annual P and to a lesser extent PET, almost as many stations with negative as positive trends in annual AI were found, indicating that both drying and wetting trends occurred in Iran. Overall, the northern part of the study area showed an increasing trend in annual AI which meant that the region became wetter, while the south showed decreasing trends in AI.
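    A compact implementation of the Mann-Kendall test used in this study, with the standard normal approximation and continuity correction (tie correction omitted for brevity); the synthetic aridity-index series is illustrative only.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic plus the usual normal
    approximation with continuity correction (no tie correction)."""
    x = np.asarray(x)
    n = len(x)
    s = int(sum(np.sign(x[j] - x[i])
                for i in range(n - 1) for j in range(i + 1, n)))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * stats.norm.sf(abs(z))
    return s, z, p

rng = np.random.default_rng(8)
years = np.arange(1955, 2006)
ai = 0.25 - 0.0008 * (years - 1955) + rng.normal(0, 0.02, len(years))
s, z, p = mann_kendall(ai)
print(f"S={s}, Z={z:.2f}, p={p:.4f} ->",
      "significant drying trend" if z < 0 and p < 0.05 else "no significant trend")
```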

  19. Characterization and modelling of the spatially- and spectrally-varying point-spread function in hyperspectral imaging systems for computational correction of axial optical aberrations

    NASA Astrophysics Data System (ADS)

    Špiclin, Žiga; Bürmen, Miran; Pernuš, Franjo; Likar, Boštjan

    2012-03-01

    Spatial resolution of hyperspectral imaging systems can vary significantly due to axial optical aberrations that originate from wavelength-induced index-of-refraction variations of the imaging optics. For systems that have a broad spectral range, the spatial resolution will vary significantly both with respect to the acquisition wavelength and with respect to the spatial position within each spectral image. Variations of the spatial resolution can be effectively characterized as part of the calibration procedure by a local image-based estimation of the point-spread function (PSF) of the hyperspectral imaging system. The estimated PSF can then be used in image deconvolution methods to improve the spatial resolution of the spectral images. We estimated the PSFs from the spectral images of a line-grid geometric caliber. From individual line segments of the line grid, the PSF was obtained by a non-parametric estimation procedure that used an orthogonal series representation of the PSF. By using the non-parametric estimation procedure, the PSFs were estimated at different spatial positions and at different wavelengths. The variations of the spatial resolution were characterized by the radius and the full-width at half-maximum of each PSF and by the modulation transfer function, computed from images of a USAF1951 resolution target. The estimation and characterization of the PSFs and the image-deconvolution-based spatial resolution enhancement were tested on images obtained by a hyperspectral imaging system with an acousto-optic tunable filter in the visible spectral range. The results demonstrate that the spatial resolution of the acquired spectral images can be significantly improved using the estimated PSFs and image deconvolution methods.

  20. Simulating Streamflow and Dissolved Organic Matter Export from small Forested Watersheds

    NASA Astrophysics Data System (ADS)

    Xu, N.; Wilson, H.; Saiers, J. E.

    2010-12-01

    Coupling the rainfall-runoff process and solute transport in catchment models is important for understanding the dynamics of water-quality-relevant constituents in a watershed. To simulate the hydrologic and biogeochemical processes in a parametrically parsimonious way remains challenging. The purpose of this study is to quantify the export of water and dissolved organic matter (DOM) from a forested catchment by developing and testing a coupled model for rainfall-runoff and soil-water flushing of DOM. Natural DOM plays an important role in terrestrial and aquatic systems by affecting nutrient cycling, contaminant mobility and toxicity, and drinking water quality. Stream-water discharge and DOM concentrations were measured in a first-order stream in Harvard Forest, Massachusetts. These measurements show that stream water DOM concentrations are greatest during hydrologic events induced by rainfall or snowmelt and decline to low, steady levels during periods of baseflow. Comparison of the stream-discharge data to calculations of a simple rainfall-runoff model reveals a hysteretic relationship between stream-flow rates and the storage of water within the catchment. A modified version of the rainfall-runoff model that accounts for hysteresis in the storage-discharge relationship in a parametrically simple way is capable of describing much, but not all, of the variation in the time-series data on stream discharge. Our ongoing research is aimed at linking the new rainfall-runoff formulation with coupled equations that predict soil-flushing and stream-water concentrations of DOM as functions of the temporal change in catchment water storage. This model will provide a predictive tool for examining how changes in climatic variables would affect the runoff generation and DOM fluxes from terrestrial landscape.

  1. DFTB Parameters for the Periodic Table: Part 1, Electronic Structure.

    PubMed

    Wahiduzzaman, Mohammad; Oliveira, Augusto F; Philipsen, Pier; Zhechkov, Lyuben; van Lenthe, Erik; Witek, Henryk A; Heine, Thomas

    2013-09-10

    A parametrization scheme for the electronic part of the density-functional based tight-binding (DFTB) method that covers the periodic table is presented. A semiautomatic parametrization scheme has been developed that uses Kohn-Sham energies and band structure curvatures of real and fictitious homoatomic crystal structures as reference data. A confinement potential is used to tighten the Kohn-Sham orbitals, which includes two free parameters that are used to optimize the performance of the method. The method is tested on more than 100 systems and shows excellent overall performance.

  2. A longitudinal model for functional connectivity networks using resting-state fMRI.

    PubMed

    Hart, Brian; Cribben, Ivor; Fiecas, Mark

    2018-06-04

    Many neuroimaging studies collect functional magnetic resonance imaging (fMRI) data in a longitudinal manner. However, the current fMRI literature lacks a general framework for analyzing functional connectivity (FC) networks in fMRI data obtained from a longitudinal study. In this work, we build a novel longitudinal FC model using a variance components approach. First, for all subjects' visits, we account for the autocorrelation inherent in the fMRI time series data using a non-parametric technique. Second, we use a generalized least squares approach to estimate 1) the within-subject variance component shared across the population, 2) the baseline FC strength, and 3) the FC's longitudinal trend. Our novel method for longitudinal FC networks seeks to account for the within-subject dependence across multiple visits, the variability due to the subjects being sampled from a population, and the autocorrelation present in fMRI time series data, while restricting the number of parameters in order to make the method computationally feasible and stable. We develop a permutation testing procedure to draw valid inference on group differences in the baseline FC network and change in FC over longitudinal time between a set of patients and a comparable set of controls. To examine performance, we run a series of simulations and apply the model to longitudinal fMRI data collected from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. Overall, we found no difference in the global FC network between Alzheimer's disease patients and healthy controls, but did find differing local aging patterns in the FC between the left hippocampus and the posterior cingulate cortex. Copyright © 2018 Elsevier Inc. All rights reserved.
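    A minimal sketch of the permutation-testing idea used for group inference above, applied to a single hypothetical FC edge (e.g., hippocampus-posterior cingulate) rather than the full longitudinal network model; group sizes and effect sizes are invented.

```python
import numpy as np

def permutation_test(group_a, group_b, n_perm=10_000, seed=0):
    """Permutation test for a difference in mean FC strength, assuming
    exchangeability of group labels under the null hypothesis."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(group_a), np.asarray(group_b)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # random relabeling
        diff = pooled[:len(a)].mean() - pooled[len(a):].mean()
        count += abs(diff) >= abs(observed)
    return observed, (count + 1) / (n_perm + 1)  # two-sided p-value

rng = np.random.default_rng(9)
fc_patients = rng.normal(0.30, 0.10, 40)
fc_controls = rng.normal(0.36, 0.10, 45)
print(permutation_test(fc_patients, fc_controls))
```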

  3. Hydroelastic analysis of surface wave interaction with concentric porous and flexible cylinder systems

    NASA Astrophysics Data System (ADS)

    Mandal, S.; Datta, N.; Sahoo, T.

    2013-10-01

    The present study deals with the hydroelastic analysis of gravity wave interaction with concentric porous and flexible cylinder systems, in which the inner cylinder is rigid and the outer cylinder is porous and flexible. The problems are analyzed in finite water depth under the assumption of small amplitude water wave theory and structural response. The cylinder configurations in the present study are namely (a) surface-piercing truncated cylinders, (b) bottom-touching truncated cylinders and (c) complete submerged cylinders extended from free surface to bottom. As special cases of the concentric cylinder system, wave diffraction by (i) porous flexible cylinder and (ii) flexible floating cage with rigid bottom are analyzed. The scattering potentials are evaluated using Fourier-Bessel series expansion method and the least square approximation method. The convergence of the double series is tested numerically to determine the number of terms in the Fourier-Bessel series expansion. The effects of porosity and flexibility of the outer cylinder, in attenuating the hydrodynamic forces and dynamic overturning moments, are analyzed for various cylinder configurations and wave characteristics. A parametric study with respect to wave frequency, ratios of inner-to-outer cylinder radii, annular spacing between the two cylinders and porosities is done. In order to understand the flow distribution around the cylinders, contour plots are provided. The findings of the present study are likely to be of immense help in the design of various types of marine structures which can withstand the wave loads of varied nature in the marine environment. The theory can be easily extended to deal with a large class of problems associated with acoustic wave interaction with flexible porous structures.

  4. From Neutron Star Observables to the Equation of State. I. An Optimal Parametrization

    NASA Astrophysics Data System (ADS)

    Raithel, Carolyn A.; Özel, Feryal; Psaltis, Dimitrios

    2016-11-01

    The increasing number and precision of measurements of neutron star masses, radii, and, in the near future, moments of inertia offer the possibility of precisely determining the neutron star equation of state (EOS). One way to facilitate the mapping of observables to the EOS is through a parametrization of the latter. We present here a generic method for optimizing the parametrization of any physically allowed EOS. We use mock EOS that incorporate physically diverse and extreme behavior to test how well our parametrization reproduces the global properties of the stars, by minimizing the errors in the observables of mass, radius, and the moment of inertia. We find that using piecewise polytropes and sampling the EOS with five fiducial densities between ~1-8 times the nuclear saturation density results in optimal errors for the smallest number of parameters. Specifically, it recreates the radii of the assumed EOS to within less than 0.5 km for the extreme mock EOS and to within less than 0.12 km for 95% of a sample of 42 proposed, physically motivated EOS. Such a parametrization is also able to reproduce the maximum mass to within 0.04 M⊙ and the moment of inertia of a 1.338 M⊙ neutron star to within less than 10% for 95% of the proposed sample of EOS.
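    A sketch of the piecewise-polytrope construction described above: pressure follows P = K_i ρ^Γ_i in each density segment, with the constants K_i fixed by continuity at the dividing densities. The Γ values and anchor pressure below are placeholders, not fitted EOS parameters.

```python
import numpy as np

RHO_NUC = 2.7e14  # nuclear saturation density in g/cm^3

def piecewise_polytrope(rho, dividers, gammas, p_anchor, rho_anchor):
    """Pressure P(rho) for a piecewise-polytropic EOS.

    Within segment i, P = K_i * rho**gammas[i]; the K_i follow from
    continuity at the dividing densities, with the overall scale set
    by P(rho_anchor) = p_anchor inside the first segment.
    """
    ks = [p_anchor / rho_anchor ** gammas[0]]
    for i in range(1, len(gammas)):
        ks.append(ks[i - 1] * dividers[i - 1] ** (gammas[i - 1] - gammas[i]))
    rho = np.atleast_1d(np.asarray(rho, dtype=float))
    seg = np.searchsorted(dividers, rho)       # segment index for each density
    return np.array([ks[s] * r ** gammas[s] for s, r in zip(seg, rho)])

# Five fiducial dividing densities between ~1 and 8 rho_nuc, as in the
# abstract; the exponents and anchor pressure are arbitrary placeholders.
dividers = RHO_NUC * np.array([1.4, 2.2, 3.3, 4.9, 7.4])
gammas = [2.0, 2.8, 3.0, 2.9, 2.7, 2.5]        # one exponent per segment
print(piecewise_polytrope(3.0 * RHO_NUC, dividers, gammas,
                          p_anchor=3.5e33, rho_anchor=RHO_NUC))
```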

  5. Selecting a Separable Parametric Spatiotemporal Covariance Structure for Longitudinal Imaging Data

    PubMed Central

    George, Brandon; Aban, Inmaculada

    2014-01-01

    Longitudinal imaging studies allow great insight into how the structure and function of a subject's internal anatomy changes over time. Unfortunately, the analysis of longitudinal imaging data is complicated by inherent spatial and temporal correlation: the temporal from the repeated measures, and the spatial from the outcomes of interest being observed at multiple points in a patient's body. We propose the use of a linear model with a separable parametric spatiotemporal error structure for the analysis of repeated imaging data. The model makes use of spatial (exponential, spherical, and Matérn) and temporal (compound symmetric, autoregressive-1, Toeplitz, and unstructured) parametric correlation functions. A simulation study, inspired by a longitudinal cardiac imaging study on mitral regurgitation patients, compared different information criteria for selecting a particular separable parametric spatiotemporal correlation structure, as well as the effects on Type I and II error rates for inference on fixed effects when the specified model is incorrect. Information criteria were found to be highly accurate at choosing between separable parametric spatiotemporal correlation structures. Misspecification of the covariance structure was found to have the ability to inflate the Type I error or produce an overly conservative test size, which corresponded to decreased power. An example with clinical data is given illustrating how the covariance structure procedure can be done in practice, as well as how covariance structure choice can change inferences about fixed effects. PMID:25293361
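    A minimal sketch of building a separable parametric spatiotemporal covariance as the Kronecker product of an exponential spatial correlation and an AR(1) temporal correlation, two of the functions named above; the sizes and parameter values are illustrative.

```python
import numpy as np

def exponential_spatial(coords, range_):
    """Exponential spatial correlation: exp(-d / range)."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return np.exp(-d / range_)

def ar1_temporal(n_visits, rho):
    """AR(1) temporal correlation: rho ** |t_i - t_j| for equally spaced visits."""
    t = np.arange(n_visits)
    return rho ** np.abs(t[:, None] - t[None, :])

rng = np.random.default_rng(10)
coords = rng.uniform(0, 10, (6, 2))     # 6 imaging locations in a patient
R_s = exponential_spatial(coords, range_=3.0)
R_t = ar1_temporal(4, rho=0.7)          # 4 longitudinal visits
sigma2 = 1.5

# Separable spatiotemporal covariance: Kronecker product of the two pieces.
V = sigma2 * np.kron(R_t, R_s)          # (4*6) x (4*6) covariance matrix
print(V.shape, "positive definite:", bool(np.all(np.linalg.eigvalsh(V) > 0)))
```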

  6. Identification of stress responsive genes by studying specific relationships between mRNA and protein abundance.

    PubMed

    Morimoto, Shimpei; Yahara, Koji

    2018-03-01

    Protein expression is regulated by the production and degradation of mRNAs and proteins, but the specifics of their relationship are controversial. Although technological advances have enabled genome-wide and time-series surveys of mRNA and protein abundance, recent studies have shown paradoxical results, with most statistical analyses being limited to linear correlation, or analysis of variance applied separately to mRNA and protein datasets. Here, using recently analyzed genome-wide time-series data, we have developed a statistical analysis framework for identifying which types of genes or biological gene groups have significant correlation between mRNA and protein abundance after accounting for potential time delays. Our framework stratifies all genes in terms of the extent of time delay, conducts gene clustering in each stratum, and performs a non-parametric statistical test of the correlation between mRNA and protein abundance in a gene cluster. Consequently, we revealed stronger correlations than previously reported between mRNA and protein abundance in two metabolic pathways. Moreover, we identified a pair of stress responsive genes (ADC17 and KIN1) that showed a highly similar time series of mRNA and protein abundance. Furthermore, we confirmed the robustness of the analysis framework by applying it to another genome-wide time-series dataset and identifying a cytoskeleton-related gene cluster (keratin 18, keratin 17, and mitotic spindle positioning) that shows similar correlation. The significant correlation and highly similar changes of mRNA and protein abundance suggest a concerted role of these genes in cellular stress response, which we consider provides an answer to the question of the specific relationships between mRNA and protein in a cell. In addition, our framework for studying the relationship between mRNAs and proteins in a cell will provide a basis for studying specific relationships between mRNA and protein abundance after accounting for potential time delays.
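
    A minimal sketch of the core idea, correlating mRNA and protein series after accounting for a time delay, is shown below using a lag-scanned Spearman correlation; the series, lag range, and helper name are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.stats import spearmanr

def best_lag_spearman(mrna, protein, max_lag=5):
    """Spearman correlation between mRNA and protein series, scanning
    protein delays of 0..max_lag time points and keeping the lag with
    the strongest correlation."""
    best = (0, 0.0, 1.0)
    for lag in range(max_lag + 1):
        r, p = spearmanr(mrna[:len(mrna) - lag], protein[lag:])
        if abs(r) > abs(best[1]):
            best = (lag, r, p)
    return best  # (lag, correlation, p-value)

# Hypothetical series: protein follows mRNA with a 2-step delay.
rng = np.random.default_rng(0)
mrna = np.sin(np.linspace(0, 4 * np.pi, 40)) + 0.1 * rng.normal(size=40)
protein = np.roll(mrna, 2) + 0.1 * rng.normal(size=40)
print(best_lag_spearman(mrna, protein))
```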

  7. Normal Distribution of CD8+ T-Cell-Derived ELISPOT Counts within Replicates Justifies the Reliance on Parametric Statistics for Identifying Positive Responses.

    PubMed

    Karulin, Alexey Y; Caspell, Richard; Dittrich, Marcus; Lehmann, Paul V

    2015-03-02

    Accurate assessment of positive ELISPOT responses for low frequencies of antigen-specific T-cells is controversial. In particular, it is still unknown whether ELISPOT counts within replicate wells follow a theoretical distribution function, and thus whether high power parametric statistics can be used to discriminate between positive and negative wells. We studied experimental distributions of spot counts for up to 120 replicate wells of IFN-γ production by CD8+ T-cells responding to the EBV LMP2A (426-434) peptide in human PBMC. The cells were tested in serial dilutions covering a wide range of average spot counts per condition, from just a few to hundreds of spots per well. Statistical analysis of the data using diagnostic Q-Q plots and the Shapiro-Wilk normality test showed that, across the entire dynamic range, ELISPOT spot counts within replicate wells followed a normal distribution. This result implies that Student's t-test and ANOVA are suited to identify positive responses. We also show experimentally that borderline responses can be reliably detected by involving more replicate wells, plating higher numbers of PBMC, addition of IL-7, or a combination of these. Furthermore, we have experimentally verified that the number of replicates needed for detection of weak responses can be calculated using parametric statistics.
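
    The workflow described above, first checking normality of replicate counts and then applying a parametric test, can be sketched with SciPy as follows; the counts are simulated placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import shapiro, ttest_ind

rng = np.random.default_rng(1)
# Hypothetical replicate spot counts for a negative control and an
# antigen-stimulated condition (24 replicate wells each).
control = rng.normal(loc=12, scale=3, size=24).round()
antigen = rng.normal(loc=20, scale=4, size=24).round()

# Step 1: check normality of the counts within replicates.
for name, counts in [("control", control), ("antigen", antigen)]:
    stat, p = shapiro(counts)
    print(f"{name}: Shapiro-Wilk p = {p:.3f}")

# Step 2: if normality holds, a parametric comparison is justified.
t, p = ttest_ind(antigen, control)
print(f"t-test p = {p:.4f}")
```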

  8. Strong stabilization servo controller with optimization of performance criteria.

    PubMed

    Sarjaš, Andrej; Svečko, Rajko; Chowdhury, Amor

    2011-07-01

    Synthesis of a simple robust controller with a pole placement technique and an H∞ metric is the method used for control of a servo mechanism with BLDC and BDC electric motors. The method includes solving a polynomial equation on the basis of the chosen characteristic polynomial using the Manabe standard polynomial form and parametric solutions. Parametric solutions are introduced directly into the structure of the servo controller. On the basis of the chosen parametric solutions, the robustness of the closed-loop system is assessed through uncertainty models and assessment of the norm ‖·‖∞. The design procedure and the optimization are performed with a genetic algorithm, differential evolution (DE). The DE optimization method determines a suboptimal solution during the optimization on the basis of a spectrally square polynomial and Šiljak's absolute stability test. The stability of the designed controller during the optimization is checked with Lipatov's stability condition. Both approaches, Šiljak's test and Lipatov's condition, check the robustness and stability characteristics on the basis of the polynomial's coefficients, and are very convenient for automated design of closed-loop control and for application in optimization algorithms such as DE. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
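
    As a rough illustration of the optimization step, SciPy's differential evolution can minimize a criterion defined over controller polynomial coefficients. The objective below is a hypothetical placeholder; the paper's actual criterion involves a spectrally square polynomial together with Šiljak/Lipatov stability checks.

```python
import numpy as np
from scipy.optimize import differential_evolution

def objective(coeffs):
    """Placeholder performance criterion over controller polynomial
    coefficients; the paper instead scores a spectrally square
    polynomial and verifies Siljak/Lipatov stability conditions."""
    target = np.array([1.0, 2.5, 2.0, 0.5])  # hypothetical desired polynomial
    return np.sum((np.asarray(coeffs) - target) ** 2)

bounds = [(0.0, 5.0)] * 4                    # search box per coefficient
result = differential_evolution(objective, bounds, seed=0, tol=1e-8)
print(result.x, result.fun)
```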

  9. A modified temporal criterion to meta-optimize the extended Kalman filter for land cover classification of remotely sensed time series

    NASA Astrophysics Data System (ADS)

    Salmon, B. P.; Kleynhans, W.; Olivier, J. C.; van den Bergh, F.; Wessels, K. J.

    2018-05-01

    Humans are transforming land cover at an ever-increasing rate. Accurate geographical maps of land cover, especially rural and urban settlements, are essential to planning sustainable development. Time series extracted from MODerate resolution Imaging Spectroradiometer (MODIS) land surface reflectance products have been used to differentiate land cover classes by analyzing the seasonal patterns in reflectance values. The proper fitting of a parametric model to these time series usually requires several adjustments to the regression method. To reduce this workload, the parameters of the regression method are usually set globally for a whole geographical area. In this work we have modified a meta-optimization approach so that the regression method is set on a per-time-series basis. The standard deviation of the model parameters and the magnitude of the residuals are used as the scoring function. We successfully fitted a triply modulated model to the seasonal patterns of our study area using a non-linear extended Kalman filter (EKF). The approach uses temporal information, which significantly reduces the processing time and storage requirements to process each time series. It also derives reliability metrics for each time series individually. The features extracted using the proposed method are classified with a support vector machine, and the performance of the method is compared to the original approach on our ground truth data.
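
    A simplified sketch of the filtering idea follows. The paper uses a non-linear EKF on a triply modulated model; the sketch below tracks only a mean and one seasonal harmonic, in which case the observation model is linear and a plain Kalman update suffices. All parameter values are hypothetical.

```python
import numpy as np

def kalman_seasonal(y, omega, q=1e-4, r=0.05):
    """Track (mu, a, b) in y_t = mu + a*cos(omega*t) + b*sin(omega*t) + noise.
    The observation model is linear in the state, so the standard Kalman
    update applies; the cited work uses a non-linear EKF on a triply
    modulated model instead."""
    x = np.zeros(3)              # state estimate (mu, a, b)
    P = np.eye(3)                # state covariance
    Q = q * np.eye(3)            # random-walk process noise
    states = []
    for t, yt in enumerate(y):
        H = np.array([1.0, np.cos(omega * t), np.sin(omega * t)])
        P = P + Q                                  # predict
        S = H @ P @ H + r                          # innovation variance
        K = P @ H / S                              # Kalman gain
        x = x + K * (yt - H @ x)                   # update state
        P = P - np.outer(K, H @ P)                 # update covariance
        states.append(x.copy())
    return np.array(states)

# Hypothetical NDVI-like series sampled every 8 days, one cycle per year.
t = np.arange(92)
omega = 2 * np.pi / 46
y = 0.4 + 0.2 * np.cos(omega * t) + 0.05 * np.random.default_rng(2).normal(size=92)
est = kalman_seasonal(y, omega)
```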

  10. A robust multi-kernel change detection framework for detecting leaf beetle defoliation using Landsat 7 ETM+ data

    NASA Astrophysics Data System (ADS)

    Anees, Asim; Aryal, Jagannath; O'Reilly, Małgorzata M.; Gale, Timothy J.; Wardlaw, Tim

    2016-12-01

    A robust non-parametric framework, based on multiple Radial Basis Function (RBF) kernels, is proposed in this study for detecting land/forest cover changes using Landsat 7 ETM+ images. One widely used framework is to compute change vectors (a difference image) and use a supervised classifier to differentiate between change and no-change. Bayesian classifiers, e.g. the Maximum Likelihood Classifier (MLC) and Naive Bayes (NB), are widely used probabilistic classifiers which assume parametric models, e.g. a Gaussian function, for the class conditional distributions. However, their performance can be limited if the data set deviates from the assumed model. The proposed framework exploits the useful properties of the Least Squares Probabilistic Classifier (LSPC) formulation, i.e. its non-parametric and probabilistic nature, to model class posterior probabilities of the difference image using a linear combination of a large number of Gaussian kernels. To this end, a simple technique based on 10-fold cross-validation is also proposed for tuning model parameters automatically, instead of selecting a (possibly) suboptimal combination from pre-specified lists of values. The proposed framework has been tested and compared with the Support Vector Machine (SVM) and NB for detection of defoliation caused by leaf beetles (Paropsisterna spp.) in Eucalyptus nitens and Eucalyptus globulus plantations of two test areas in Tasmania, Australia, using raw bands and band combination indices of Landsat 7 ETM+. It was observed that, owing to its multi-kernel non-parametric formulation and probabilistic nature, the LSPC outperforms the parametric NB with Gaussian assumption in the change detection framework, with Overall Accuracy (OA) ranging from 93.6% (κ = 0.87) to 97.4% (κ = 0.94) against 85.3% (κ = 0.69) to 93.4% (κ = 0.85), and is more robust to changing data distributions. Its performance was comparable to SVM, with the added advantages of being probabilistic and capable of handling multi-class problems naturally in its original formulation.
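
    The cross-validated tuning step can be illustrated with scikit-learn. Since LSPC has no widely available implementation, the sketch below substitutes an RBF-kernel SVM as an analogous kernel method, tuning its parameters by 10-fold cross-validation over a grid; the features and labels are simulated placeholders.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(3)
# Hypothetical change-vector features (e.g. band differences) and labels.
X = rng.normal(size=(200, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=200) > 0).astype(int)

# 10-fold CV over kernel width and regularization, instead of a
# hand-picked (possibly suboptimal) combination.
grid = GridSearchCV(SVC(kernel="rbf"),
                    param_grid={"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]},
                    cv=10)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```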

  11. Estimation of viscoelastic parameters in Prony series from shear wave propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, Jae-Wook; Hong, Jung-Wuk, E-mail: j.hong@kaist.ac.kr, E-mail: jwhong@alum.mit.edu; Lee, Hyoung-Ki

    2016-06-21

    When acquiring accurate ultrasonic images, we must precisely estimate the mechanical properties of the soft tissue. This study investigates and estimates the viscoelastic properties of the tissue by analyzing shear waves generated through an acoustic radiation force. The shear waves are sourced from a localized pushing force acting for a certain duration, and the generated waves travel horizontally. The wave velocities depend on the mechanical properties of the tissue such as the shear modulus and viscoelastic properties; therefore, we can inversely calculate the properties of the tissue through parametric studies.
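
    The Prony series referred to in the title expresses the shear relaxation modulus as G(t) = G_inf + Σ_i G_i exp(-t/τ_i). A minimal sketch, with hypothetical coefficient values:

```python
import numpy as np

def prony_shear_modulus(t, g_inf, g_i, tau_i):
    """Shear relaxation modulus as a Prony series:
    G(t) = G_inf + sum_i G_i * exp(-t / tau_i)."""
    t = np.atleast_1d(t)[:, None]
    return g_inf + np.sum(np.asarray(g_i) * np.exp(-t / np.asarray(tau_i)), axis=1)

# Hypothetical two-term series for a soft-tissue-like material (kPa, s).
t = np.linspace(0, 1.0, 200)
G = prony_shear_modulus(t, g_inf=2.0, g_i=[1.5, 0.8], tau_i=[0.02, 0.3])
```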

  12. Dispersive analysis of ω/Φ → 3π, πγ*

    DOE PAGES

    Danilkin, Igor V.; Fernandez Ramirez, Cesar; Guo, Peng; ...

    2015-05-01

    The decays ω/Φ → 3π are considered in the dispersive framework that is based on the isobar decomposition and subenergy unitarity. The inelastic contributions are parametrized by a power series in a suitably chosen conformal variable that properly accounts for the analytic properties of the amplitude. The Dalitz plot distributions and integrated decay widths are presented. Our results indicate that the final-state interactions may be sizable. As a further application of the formalism we also compute the electromagnetic transition form factors of ω/Φ → π⁰γ*.

  13. Posterior fusion only for thoracic adolescent idiopathic scoliosis of more than 80 degrees: pedicle screws versus hybrid instrumentation.

    PubMed

    Di Silvestre, Mario; Bakaloudis, Georgios; Lolli, Francesco; Vommaro, Francesco; Martikos, Konstantinos; Parisini, Patrizio

    2008-10-01

    The treatment of thoracic adolescent idiopathic scoliosis (AIS) of more than 80 degrees traditionally consisted of a combined procedure, an anterior release performed through an open thoracotomy followed by a posterior fusion. Recently, some studies have reassessed the role of posterior fusion only as treatment for severe thoracic AIS; the correction rate of the thoracic curves was comparable to most series of combined anterior and posterior surgery, with shorter surgery time and without the negative effect on pulmonary function of anterior transthoracic exposure. Compared with other studies published so far on the use of posterior fusion alone for severe thoracic AIS, the present study examines a larger group of patients (52 cases) reviewed at a longer follow-up (average 6.7 years, range 4.5-8.5 years). The aim of the study was to evaluate the clinical and radiographic outcome of surgical treatment for severe thoracic (>80 degrees) AIS treated with posterior spinal fusion alone, and to compare comprehensively the results of posterior fusion with a hybrid construct (proximal hooks and distal pedicle screws) versus pedicle screw instrumentation. All patients (n = 52) with main thoracic AIS curves greater than 80 degrees (Lenke type 1, 2, 3, and 4), surgically treated between 1996 and 2000 at one institution by posterior spinal fusion either with hybrid instrumentation (PSF-H group; n = 27 patients) or with a pedicle screw-only construct (PSF-S group; n = 25 patients), were reviewed. There were no differences between the two groups in terms of age, Risser's sign, Cobb preoperative main thoracic (MT) curve magnitude (PSF-H: 92 degrees vs. PSF-S: 88 degrees), or flexibility on bending films (PSF-H: 27% vs. PSF-S: 25%). Statistical analysis was performed using the t test (paired and unpaired), the Wilcoxon test for non-parametric paired analysis, and the Mann-Whitney test for non-parametric unpaired analysis. At the last follow-up, the PSF-S group, when compared to the PSF-H group, had a final MT correction rate of 52.4% versus 44.52% (P = 0.001), with a loss of correction of -1.9 degrees versus -11.3 degrees (P = 0.0005), a TL/L correction of 50% versus 43% (ns), a greater correction of the lowest instrumented vertebra translation (-1.00 vs. -0.54 cm; P = 0.04), and tilt (-19 degrees vs. -10 degrees; P = 0.005) on the coronal plane. There were no statistically significant differences in sagittal and global coronal alignment between the two groups (C7-S1 offset: PSF-H = 0.5 cm vs. PSF-S = 0 cm). In the hybrid series (27 patients) surgery-related complications necessitated three revision surgeries, whereas in the screw group (25 patients) one revision surgery was performed. No neurological complications or deep wound infection occurred in this series. In conclusion, posterior spinal fusion for severe thoracic AIS with pedicle screws only, when compared to a hybrid construct, allowed a greater coronal correction of both main thoracic and secondary lumbar curves, less loss of the postoperative correction achieved, and fewer revision surgeries. Posterior-only fusion with pedicle screws enabled a good and stable correction of severe scoliosis. However, severe curves may be amenable to hybrid instrumentation, which produced results analogous to the screw-only constructs concerning patient satisfaction; at the latest follow-up, SRS-30 and SF-36 scores did not show any statistical differences between the two groups.

  14. Bootstrap versus Statistical Effect Size Corrections: A Comparison with Data from the Finding Embedded Figures Test.

    ERIC Educational Resources Information Center

    Thompson, Bruce; Melancon, Janet G.

    Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…

  15. Cardiovascular Risk Reduction for African-American Men through Health Empowerment and Anger Management

    ERIC Educational Resources Information Center

    Stephens, Torrance; Braithwaite, Harold; Johnson, Larry; Harris, Catrell; Katkowsky, Steven; Troutman, Adewale

    2008-01-01

    Objective: To examine impact of CVD risk reduction intervention for African-American men in the Atlanta Empowerment Zone (AEZ) designed to target anger management. Design: Wilcoxon Signed-Rank Test was employed as a non-parametric alternative to the t-test for independent samples. This test was employed because the data used in this analysis…

  16. Transistor step stress program for JANTX2N4150

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Reliability analysis of the transistor JANTX2N4150 manufactured by General Semiconductor and Transitron is reported. The discrete devices were subjected to power and temperature step stress tests and then to electrical tests after completing the power/temperature step stress point. Control sample units were maintained for verification of the electrical parametric testing. Results are presented.

  17. Accounting for Non-Gaussian Sources of Spatial Correlation in Parametric Functional Magnetic Resonance Imaging Paradigms II: A Method to Obtain First-Level Analysis Residuals with Uniform and Gaussian Spatial Autocorrelation Function and Independent and Identically Distributed Time-Series.

    PubMed

    Gopinath, Kaundinya; Krishnamurthy, Venkatagiri; Lacey, Simon; Sathian, K

    2018-02-01

    In a recent study Eklund et al. have shown that cluster-wise family-wise error (FWE) rate-corrected inferences made in parametric statistical method-based functional magnetic resonance imaging (fMRI) studies over the past couple of decades may have been invalid, particularly for cluster defining thresholds less stringent than p < 0.001; principally because the spatial autocorrelation functions (sACFs) of fMRI data had been modeled incorrectly to follow a Gaussian form, whereas empirical data suggest otherwise. Hence, the residuals from general linear model (GLM)-based fMRI activation estimates in these studies may not have possessed a homogeneously Gaussian sACF. Here we propose a method based on the assumption that heterogeneity and non-Gaussianity of the sACF of the first-level GLM analysis residuals, as well as temporal autocorrelations in the first-level voxel residual time-series, are caused by unmodeled MRI signal from neuronal and physiological processes as well as motion and other artifacts, which can be approximated by appropriate decompositions of the first-level residuals with principal component analysis (PCA), and removed. We show that application of this method yields GLM residuals with significantly reduced spatial correlation, nearly Gaussian sACF and uniform spatial smoothness across the brain, thereby allowing valid cluster-based FWE-corrected inferences based on assumption of Gaussian spatial noise. We further show that application of this method renders the voxel time-series of first-level GLM residuals independent, and identically distributed across time (which is a necessary condition for appropriate voxel-level GLM inference), without having to fit ad hoc stochastic colored noise models. Furthermore, the detection power of individual subject brain activation analysis is enhanced. This method will be especially useful for case studies, which rely on first-level GLM analysis inferences.
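
    The gist of the proposed cleanup, removing structured noise captured by the leading principal components of the first-level residuals, can be sketched as below. This is a simplified stand-in: the paper's criteria for choosing the decomposition are more involved, and the residual matrix here is simulated.

```python
import numpy as np
from sklearn.decomposition import PCA

def remove_top_components(residuals, n_remove=5):
    """Strip the leading principal components from a (time x voxel)
    residual matrix, a simplified stand-in for the structured-noise
    removal described in the paper."""
    pca = PCA(n_components=n_remove)
    scores = pca.fit_transform(residuals)        # (time, n_remove)
    structured = scores @ pca.components_        # rank-n_remove approximation
    return residuals - structured

rng = np.random.default_rng(4)
resid = rng.normal(size=(150, 1000))             # hypothetical GLM residuals
cleaned = remove_top_components(resid, n_remove=5)
```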

  18. One-dimensional statistical parametric mapping in Python.

    PubMed

    Pataky, Todd C

    2012-01-01

    Statistical parametric mapping (SPM) is a topological methodology for detecting field changes in smooth n-dimensional continua. Many classes of biomechanical data are smooth and contained within discrete bounds and as such are well suited to SPM analyses. The current paper accompanies release of 'SPM1D', a free and open-source Python package for conducting SPM analyses on a set of registered 1D curves. Three example applications are presented: (i) kinematics, (ii) ground reaction forces and (iii) contact pressure distribution in probabilistic finite element modelling. In addition to offering a high-level interface to a variety of common statistical tests like t tests, regression and ANOVA, SPM1D also emphasises fundamental concepts of SPM theory through stand-alone example scripts. Source code and documentation are available at: www.tpataky.net/spm1d/.
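
    A minimal usage sketch follows, assuming the spm1d two-sample t-test interface as documented by the package; the curve arrays are simulated placeholders (one registered 1D curve per row).

```python
import numpy as np
import spm1d  # pip install spm1d

rng = np.random.default_rng(5)
# Hypothetical registered curves: 10 subjects per group, 101 nodes each.
YA = rng.normal(size=(10, 101)) + np.sin(np.linspace(0, np.pi, 101))
YB = rng.normal(size=(10, 101))

t = spm1d.stats.ttest2(YA, YB)                  # SPM{t} over the 1D continuum
ti = t.inference(alpha=0.05, two_tailed=True)   # RFT-based inference
print(ti.h0reject)                              # True if a suprathreshold cluster exists
ti.plot()                                       # visualize the statistic field (matplotlib)
```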

  19. Empirical Prediction of Aircraft Landing Gear Noise

    NASA Technical Reports Server (NTRS)

    Golub, Robert A. (Technical Monitor); Guo, Yue-Ping

    2005-01-01

    This report documents a semi-empirical/semi-analytical method for landing gear noise prediction. The method is based on scaling laws of the theory of aerodynamic noise generation and correlation of these scaling laws with current available test data. The former gives the method a sound theoretical foundation and the latter quantitatively determines the relations between the parameters of the landing gear assembly and the far field noise, enabling practical predictions of aircraft landing gear noise, both for parametric trends and for absolute noise levels. The prediction model is validated by wind tunnel test data for an isolated Boeing 737 landing gear and by flight data for the Boeing 777 airplane. In both cases, the predictions agree well with data, both in parametric trends and in absolute noise levels.

  20. Sensitivity of the Halstead and Wechsler Test Batteries to brain damage: Evidence from Reitan's original validation sample.

    PubMed

    Loring, David W; Larrabee, Glenn J

    2006-06-01

    The Halstead-Reitan Battery has been instrumental in the development of neuropsychological practice in the United States. Although Reitan administered both the Wechsler-Bellevue Intelligence Scale and Halstead's test battery when evaluating Halstead's theory of biologic intelligence, the relative sensitivity of each test battery to brain damage continues to be an area of controversy. Because Reitan did not perform direct parametric analysis to contrast group performances, we reanalyze Reitan's original validation data from both Halstead (Reitan, 1955) and Wechsler batteries (Reitan, 1959a) and calculate effect sizes and probability levels using traditional parametric approaches. Eight of the 10 tests comprising Halstead's original Impairment Index, as well as the Impairment Index itself, statistically differentiated patients with unequivocal brain damage from controls. In addition, 13 of 14 Wechsler measures including Full-Scale IQ also differed statistically between groups (Brain Damage Full-Scale IQ = 96.2; Control Group Full Scale IQ = 112.6). We suggest that differences in the statistical properties of each battery (e.g., raw scores vs. standardized scores) likely contribute to classification characteristics including test sensitivity and specificity.
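
    The reanalysis approach, computing effect sizes and probability levels from group summaries, can be sketched as follows. The means match those quoted above, but the standard deviations and group sizes are hypothetical placeholders, so the output is illustrative only.

```python
import numpy as np
from scipy import stats

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d from summary statistics, using the pooled SD."""
    sp = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

# Full-Scale IQ means from the abstract; SDs and group sizes are made up here.
n1 = n2 = 50
d = cohens_d(112.6, 12.0, n1, 96.2, 14.0, n2)
t = d * np.sqrt(n1 * n2 / (n1 + n2))           # equivalent two-sample t statistic
p = 2 * stats.t.sf(abs(t), df=n1 + n2 - 2)
print(f"d = {d:.2f}, t = {t:.2f}, p = {p:.2g}")
```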

  1. Development and Characterization Testing of an Air Pulsation Valve for a Pulse Detonation Engine Supersonic Parametric Inlet Test Section

    NASA Technical Reports Server (NTRS)

    Tornabene, Robert

    2005-01-01

    In pulse detonation engines, the potential exists for gas pulses from the combustor to travel upstream and adversely affect the inlet performance of the engine. In order to determine the effect of these high frequency pulses on the inlet performance, an air pulsation valve was developed to provide air pulses downstream of a supersonic parametric inlet test section. The purpose of this report is to document the design and characterization tests that were performed on a pulsation valve that was tested at the NASA Glenn Research Center 1x1 Supersonic Wind Tunnel (SWT) test facility. The high air flow pulsation valve design philosophy and analyses performed are discussed and characterization test results are presented. The pulsation valve model was devised based on the concept of using a free spinning ball valve driven from a variable speed electric motor to generate air flow pulses at preset frequencies. In order to deliver the proper flow rate, the flow port was contoured to maximize flow rate and minimize pressure drop. To obtain sharp pressure spikes the valve flow port was designed to be as narrow as possible to minimize port dwell time.

  2. Parametric down-conversion with nonideal and random quasi-phase-matching

    NASA Astrophysics Data System (ADS)

    Yang, Chun-Yao; Lin, Chun; Liljestrand, Charlotte; Su, Wei-Min; Canalias, Carlota; Chuu, Chih-Sung

    2016-05-01

    Quasi-phase-matching (QPM) has enriched the capacity of parametric down-conversion (PDC) in generating biphotons for many fundamental tests and advanced applications. However, it is not clear how the nonidealities and randomness in the QPM grating of a parametric down-converter may affect the quantum properties of the biphotons. This paper intends to provide insights into the interplay between PDC and nonideal or random QPM structures. Using a periodically poled nonlinear crystal with short periodicity, we conduct experimental and theoretical studies of PDC subject to a nonideal duty cycle and random errors in domain lengths. We report the observation of biphotons emerging through noncritical birefringent phase-matching, which cannot occur in PDC with an ideal QPM grating, and a biphoton spectrum determined by the details of the nonidealities and randomness. We also observed QPM biphotons with a diminished strength. These features are both confirmed by our theory. Our work provides new perspectives for biphoton engineering with QPM.

  3. Stress Recovery and Error Estimation for Shell Structures

    NASA Technical Reports Server (NTRS)

    Yazdani, A. A.; Riggs, H. R.; Tessler, A.

    2000-01-01

    The Penalized Discrete Least-Squares (PDLS) stress recovery (smoothing) technique developed for two-dimensional linear elliptic problems is adapted here to three-dimensional shell structures. The surfaces are restricted to those which have a 2-D parametric representation, or which can be built up from such surfaces. The proposed strategy involves mapping the finite element results to the 2-D parametric space which describes the geometry, and smoothing is carried out in the parametric space using the PDLS-based Smoothing Element Analysis (SEA). Numerical results for two well-known shell problems are presented to illustrate the performance of SEA/PDLS for these problems. The recovered stresses are used in the Zienkiewicz-Zhu a posteriori error estimator. The estimated errors are used to demonstrate the performance of SEA-recovered stresses in automated adaptive mesh refinement of shell structures. The numerical results are encouraging. Further testing involving more complex, practical structures is necessary.

  4. Experimental parametric study of servers cooling management in data centers buildings

    NASA Astrophysics Data System (ADS)

    Nada, S. A.; Elfeky, K. E.; Attia, Ali M. A.; Alshaer, W. G.

    2017-06-01

    A parametric study of air flow and cooling management of data center servers is experimentally conducted for different design conditions. A physical scale model of a data center accommodating one rack of four servers was designed and constructed for testing purposes. Front and rear rack and server temperature distributions and supply/return heat indices (SHI/RHI) are used to evaluate data center thermal performance. Experiments were conducted to parametrically study the effects of the perforated tiles' opening ratio, servers' power load variation and rack power density. The results showed that (1) a perforated tile with a 25% opening ratio provides the best results among the opening ratios tested, (2) the optimum benefit of cold air in server cooling is obtained with uniform power loading of the servers, and (3) increasing power density decreases air re-circulation but increases air bypass and server temperatures. The present results are compared with previous experimental and CFD results, and fair agreement was found.

  5. Dorsomedial SCN neuronal subpopulations subserve different functions in human dementia.

    PubMed

    Harper, David G; Stopa, Edward G; Kuo-Leblanc, Victoria; McKee, Ann C; Asayama, Kentaro; Volicer, Ladislav; Kowall, Neil; Satlin, Andrew

    2008-06-01

    The suprachiasmatic nuclei (SCN) are necessary and sufficient for the maintenance of circadian rhythms in primates and other mammalian species. The human dorsomedial SCN contains populations of non-species-specific vasopressin and species-specific neurotensin neurons. We made time-series recordings of core body temperature and locomotor activity in 19 elderly, male, end-stage dementia patients and 8 normal elderly controls. Following the death of the dementia patients, neuropathological diagnostic information and tissue samples from the hypothalamus were obtained. Hypothalamic tissue was also obtained from eight normal control cases that had not had activity or core temperature recordings previously. Core temperature was analysed for parametric, circadian features, and activity was analysed for non-parametric and parametric circadian features. These indices were then correlated with the degree of degeneration seen in the SCN (glia/neuron ratio) and neuronal counts from the dorsomedial SCN (vasopressin, neurotensin). Specific loss of SCN neurotensin neurons was associated with loss of activity and temperature amplitude without increase in activity fragmentation. Loss of SCN vasopressin neurons was associated with increased activity fragmentation but not loss of amplitude. Evidence for a circadian rhythm of vasopressinergic activity was seen in the dementia cases, but no evidence was seen for a circadian rhythm in neurotensinergic activity. These results provide evidence that the SCN is necessary for the maintenance of circadian rhythms in humans, clarify the roles of its neuronal subpopulations in subserving this function, and illustrate the utility of dementia in elaborating brain-behaviour relationships in humans.

  6. Does partial Granger causality really eliminate the influence of exogenous inputs and latent variables?

    PubMed

    Roelstraete, Bjorn; Rosseel, Yves

    2012-04-30

    Partial Granger causality was introduced by Guo et al. (2008), who showed that it could better eliminate the influence of latent variables and exogenous inputs than conditional G-causality. In the recent literature we can find some reviews and applications of this type of Granger causality (e.g. Smith et al., 2011; Bressler and Seth, 2010; Barrett et al., 2010). These articles apparently do not take into account a serious flaw in the original work on partial G-causality, namely the negative F values that were reported and even proven to be plausible. In our opinion, this undermines the credibility of the obtained results and thus the validity of the approach. Our study aims to further validate partial G-causality and to find an answer to why negative partial Granger causality estimates were reported. Time series were simulated from the same toy model as used in the original paper, and partial and conditional causal measures were compared in the presence of confounding variables. Inference was done parametrically and using non-parametric block bootstrapping. We counter the proof that partial Granger F values can be negative, but the main conclusion of the original article remains. In the presence of unknown latent and exogenous influences, it appears that partial G-causality will better eliminate their influence than conditional G-causality, at least when non-parametric inference is used. Copyright © 2012 Elsevier B.V. All rights reserved.
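
    The non-parametric inference mentioned above relies on block bootstrapping, which resamples contiguous blocks of the series so that short-range dependence is preserved. A minimal moving-block-bootstrap sketch with hypothetical data:

```python
import numpy as np

def moving_block_bootstrap(x, block_len, rng):
    """Resample a time series by concatenating randomly chosen
    overlapping blocks, preserving short-range dependence."""
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    blocks = [x[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

rng = np.random.default_rng(6)
x = rng.normal(size=200).cumsum() * 0.1 + rng.normal(size=200)  # autocorrelated
resamples = [moving_block_bootstrap(x, block_len=20, rng=rng) for _ in range(999)]
```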

  7. Recurrent neural network-based modeling of gene regulatory network using elephant swarm water search algorithm.

    PubMed

    Mandal, Sudip; Saha, Goutam; Pal, Rajat Kumar

    2017-08-01

    Correct inference of genetic regulations inside a cell from biological databases such as time-series microarray data is one of the greatest challenges of the post-genomic era for biologists and researchers. The Recurrent Neural Network (RNN) is one of the most popular and simplest approaches to model the dynamics and to infer correct dependencies among genes. Inspired by the behavior of social elephants, we propose a new metaheuristic, the Elephant Swarm Water Search Algorithm (ESWSA), to infer Gene Regulatory Networks (GRNs). This algorithm is mainly based on the water search strategy of intelligent and social elephants during drought, utilizing different types of communication techniques. Initially, the algorithm is tested against benchmark small- and medium-scale artificial genetic networks, without and with the presence of different noise levels, and its efficiency is assessed in terms of parametric error, minimum fitness value, execution time, accuracy of prediction of true regulations, etc. Next, the proposed algorithm is tested against real gene expression data of the Escherichia coli SOS network, and the results are compared with other state-of-the-art optimization methods. The experimental results suggest that ESWSA is very efficient for the GRN inference problem and performs better than other methods in many ways.
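
    In the standard RNN formulation of a GRN, each gene's next expression level is a squashed weighted sum of the current levels, and a metaheuristic searches the weights to minimize the error against the measured series. A minimal sketch of such a fitness function (names and data are hypothetical, and any search method could call it):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rnn_grn_fitness(params, series, n_genes):
    """Mean squared error of a discrete-time RNN gene-network model
    x[t+1] = sigmoid(W @ x[t] + b) against an observed expression series.
    `params` flattens W (n x n) and b (n); a metaheuristic such as
    ESWSA would search over this vector."""
    W = params[:n_genes * n_genes].reshape(n_genes, n_genes)
    b = params[n_genes * n_genes:]
    pred = [series[0]]
    for t in range(len(series) - 1):
        pred.append(sigmoid(W @ pred[-1] + b))
    return np.mean((np.array(pred) - series) ** 2)

# Hypothetical 3-gene series with 20 time points, values scaled to (0, 1).
rng = np.random.default_rng(7)
series = rng.uniform(0.1, 0.9, size=(20, 3))
print(rnn_grn_fitness(rng.normal(size=12), series, n_genes=3))  # 12 = 3*3 + 3
```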

  8. Development of a Radial Deconsolidation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helmreich, Grant W.; Montgomery, Fred C.; Hunn, John D.

    2015-12-01

    A series of experiments have been initiated to determine the retention or mobility of fission products in AGR fuel compacts [Petti, et al. 2010]. This information is needed to refine fission product transport models. The AGR-3/4 irradiation test involved half-inch-long compacts that each contained twenty designed-to-fail (DTF) particles, with 20-μm thick carbon-coated kernels whose coatings were deliberately fabricated such that they would crack under irradiation, providing a known source of post-irradiation isotopes. The DTF particles in these compacts were axially distributed along the compact centerline so that the diffusion of fission products released from the DTF kernels would be radially symmetric [Hunn, et al. 2012; Hunn et al. 2011; Kercher, et al. 2011; Hunn, et al. 2007]. Compacts containing DTF particles were irradiated at Idaho National Laboratory (INL) at the Advanced Test Reactor (ATR) [Collin, 2015]. Analysis of the diffusion of these various post-irradiation isotopes through the compact requires a method to radially deconsolidate the compacts so that nested-annular volumes may be analyzed for post-irradiation isotope inventory in the compact matrix, TRISO outer pyrolytic carbon (OPyC), and DTF kernels. An effective radial deconsolidation method and apparatus appropriate to this application has been developed and parametrically characterized.

  9. Adaptation of a cubic smoothing spline algorithm for multi-channel data stitching at the National Ignition Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, C; Adcock, A; Azevedo, S

    2010-12-28

    Some diagnostics at the National Ignition Facility (NIF), including the Gamma Reaction History (GRH) diagnostic, require multiple channels of data to achieve the required dynamic range. These channels need to be stitched together into a single time series, and they may have non-uniform and redundant time samples. We chose to apply the popular cubic smoothing spline technique to our stitching problem because we needed a general non-parametric method. We adapted one of the algorithms in the literature, by Hutchinson and deHoog, to our needs. The modified algorithm and the resulting code perform a cubic smoothing spline fit to multiple data channels with redundant time samples and missing data points. The data channels can have different, time-varying, zero-mean white noise characteristics. The method we employ automatically determines an optimal smoothing level by minimizing the Generalized Cross Validation (GCV) score. In order to automatically validate the smoothing level selection, the Weighted Sum-Squared Residual (WSSR) and zero-mean tests are performed on the residuals. Further, confidence intervals, both analytical and Monte Carlo, are also calculated. In this paper, we describe the derivation of our cubic smoothing spline algorithm. We outline the algorithm and test it with simulated and experimental data.
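
    For readers who want a quick stand-in, SciPy (version 1.10 and later) ships a GCV-selected cubic smoothing spline. The sketch below shows that routine on simulated data; note that it requires strictly increasing abscissae, so it does not by itself handle the redundant multi-channel samples the adapted Hutchinson-deHoog algorithm was built for.

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline  # SciPy >= 1.10

rng = np.random.default_rng(8)
# Hypothetical stitched record: strictly increasing times, noisy values.
t = np.sort(rng.uniform(0, 10, size=200))
y = np.sin(t) + 0.2 * rng.normal(size=200)

# With lam=None the smoothing parameter is chosen by minimizing the GCV
# score, as in the approach described above. Redundant time samples from
# overlapping channels would first need to be merged or averaged.
spline = make_smoothing_spline(t, y, lam=None)
y_smooth = spline(t)
```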

  10. Identification of trend in long term precipitation and reference evapotranspiration over Narmada river basin (India)

    NASA Astrophysics Data System (ADS)

    Pandey, Brij Kishor; Khare, Deepak

    2018-02-01

    Precipitation and reference evapotranspiration are key parameters in hydro-meteorological studies and are used for agricultural planning, irrigation system design and management. Precipitation and evaporative demand are expected to alter under climate change and affect sustainable development. In this article, the spatial variability and temporal trends of precipitation and reference evapotranspiration (ETo) were investigated over the Narmada river basin (India), a humid tropical climatic region. In the present study, 12 and 28 observatory stations were selected for precipitation and ETo, respectively, over a 102-year period (1901-2002). A rigorous analysis for trend detection was carried out using non-parametric tests such as the Mann-Kendall (MK) and Spearman Rho (SR) tests. Sen's slope estimator was used to analyze the rate of change in the long-term series. All the stations of the basin exhibit a positive trend for annual ETo, while 8% of the stations indicate a significant negative trend for mean annual precipitation. Change points of annual precipitation were identified around the year 1962 by applying Buishand's and Pettitt's tests. Annual mean precipitation decreased by 9% in the upper part of the basin and increased by at most 5% in the lower part due to temporal changes, while annual mean ETo increased by 4-12% in most of the region. The results of the study are helpful in the planning and development of agricultural water resources.
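
    A self-contained sketch of the Mann-Kendall test and Sen's slope estimator is given below (no tie correction, simulated data); it is illustrative rather than the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction) and Sen's slope."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
    p = 2 * norm.sf(abs(z))                                   # two-sided p-value
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
    return z, p, np.median(slopes)

# Hypothetical 102-year annual precipitation series with a weak trend.
rng = np.random.default_rng(9)
years = np.arange(1901, 2003)
precip = 1100 + 0.8 * (years - 1901) + rng.normal(scale=120, size=len(years))
print(mann_kendall(precip))
```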

  11. Trend assessment: applications for hydrology and climate research

    NASA Astrophysics Data System (ADS)

    Kallache, M.; Rust, H. W.; Kropp, J.

    2005-02-01

    The assessment of trends in climatology and hydrology still is a matter of debate. Capturing typical properties of time series, like trends, is highly relevant for the discussion of potential impacts of global warming or flood occurrences. It provides indicators for the separation of anthropogenic signals and natural forcing factors by distinguishing between deterministic trends and stochastic variability. In this contribution river run-off data from gauges in Southern Germany are analysed regarding their trend behaviour by combining a deterministic trend component and a stochastic model part in a semi-parametric approach. In this way the trade-off between trend and autocorrelation structure can be considered explicitly. A test for a significant trend is introduced via three steps: First, a stochastic fractional ARIMA model, which is able to reproduce short-term as well as long-term correlations, is fitted to the empirical data. In a second step, wavelet analysis is used to separate the variability of small and large time-scales assuming that the trend component is part of the latter. Finally, a comparison of the overall variability to that restricted to small scales results in a test for a trend. The extraction of the large-scale behaviour by wavelet analysis provides a clue concerning the shape of the trend.
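
    The scale-separation step can be sketched with PyWavelets: keep only the coarse approximation coefficients as the candidate trend component. This is a minimal sketch of that single step; the fractional ARIMA fit and the final variance comparison are not shown, and the data and wavelet choice are hypothetical.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(14)
t = np.linspace(0, 1, 512)
# Hypothetical run-off-like series: slow trend + fast variability + noise.
series = 0.8 * t + 0.3 * np.sin(2 * np.pi * 30 * t) + rng.normal(scale=0.3, size=512)

# Separate small- and large-scale variability: zero out the detail
# coefficients and reconstruct from the coarse approximation only.
coeffs = pywt.wavedec(series, "db4", level=6)
coeffs_trend = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(coeffs_trend, "db4")[:len(series)]
```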

  12. An introduction to modeling longitudinal data with generalized additive models: applications to single-case designs.

    PubMed

    Sullivan, Kristynn J; Shadish, William R; Steiner, Peter M

    2015-03-01

    Single-case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time in both the presence and absence of treatment. This article introduces a statistical technique for analyzing SCD data that has not been much used in psychological and educational research: generalized additive models (GAMs). In parametric regression, the researcher must choose a functional form to impose on the data, for example, that trend over time is linear. GAMs reverse this process by letting the data inform the choice of functional form. In this article we review the problem that trend poses in SCDs, discuss how current SCD analytic methods approach trend, describe GAMs as a possible solution, suggest a GAM model testing procedure for examining the presence of trend in SCDs, present a small simulation to show the statistical properties of GAMs, and illustrate the procedure on 3 examples of different lengths. Results suggest that GAMs may be very useful both as a form of sensitivity analysis for checking the plausibility of assumptions about trend and as a primary data analysis strategy for testing treatment effects. We conclude with a discussion of some problems with GAMs and some future directions for research on the application of GAMs to SCDs. (c) 2015 APA, all rights reserved.
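
    In Python, one option for fitting a GAM to a short single-case series is the pygam package; the sketch below assumes its documented LinearGAM interface, with a spline smooth of time plus a factor term for the treatment phase, on simulated data.

```python
import numpy as np
from pygam import LinearGAM, s, f  # pip install pygam

# Hypothetical single-case series: 20 sessions, treatment begins at session 10.
t = np.arange(20)
phase = (t >= 10).astype(int)
rng = np.random.default_rng(10)
y = 5 + 0.1 * t + 2.0 * phase + rng.normal(scale=0.8, size=20)

# Let the data inform the trend's functional form via a spline smooth of
# time (column 0), alongside a factor term for phase (column 1).
X = np.column_stack([t, phase])
gam = LinearGAM(s(0) + f(1)).fit(X, y)
gam.summary()                      # prints smoothing and significance info
y_hat = gam.predict(X)
```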

  13. Water and Sediment Output Evaluation Using Cellular Automata on Alpine Catchment: Soana, Italy - Test Case

    NASA Astrophysics Data System (ADS)

    Pasculli, Antonio; Audisio, Chiara; Sciarra, Nicola

    2017-12-01

    In the alpine context, the estimation of rainfall (inflow) and discharge (outflow) data is very important in order to analyse historical time series at the catchment scale, determine hydrological maxima and minima, and estimate flood and drought frequencies. Hydrological research is a precious source of information for various human activities, in particular for land use management and planning. Many rainfall-runoff models have been proposed to reflect steady, gradually-varied flow conditions inside a catchment. In recent years, the application of Reduced Complexity Models (RCM) has been an excellent alternative for evaluating the hydrological response of catchments over periods of up to decades. Hence, this paper discusses the application of the research code CAESAR, based on a cellular automaton (CA) approach, to evaluate the water and sediment outputs from an alpine catchment (Soana, Italy), selected as a test case. The comparison between the predicted numerical results, developed through parametric analysis, and the available measured data is discussed. Finally, the analysis of a numerical estimate of the sediment budget over ten years is presented. The necessity of a fast but reliable numerical tool when measured data are not easily accessible, as in alpine catchments, is highlighted.

  14. Cluster-level statistical inference in fMRI datasets: The unexpected behavior of random fields in high dimensions.

    PubMed

    Bansal, Ravi; Peterson, Bradley S

    2018-06-01

    Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control of false positive findings across these multiple hypothesis tests. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated distributions from those large clusters, and therefore, by construction, rejected the large clusters as false positives at the nominal FWERs. Those rejected clusters were outlying values in the distribution of cluster size but cannot be distinguished from true positive findings without further analyses, including assessing whether fMRI signal in those regions correlates with other clinical, behavioral, or cognitive measures. Rejecting the large clusters, however, significantly reduced the statistical power of nonparametric methods in detecting true findings compared with parametric methods, which would have detected most true findings that are essential for making valid biological inferences in MRI data. Parametric analyses, in contrast, detected most true findings while generating relatively few false positives: on average, less than one of those very large clusters would be deemed a true finding in each brain-wide analysis. We therefore recommend the continued use of parametric methods that model nonstationary smoothness for cluster-level, familywise control of false positives, particularly when using a Cluster Defining Threshold of 2.5 or higher, and subsequently assessing rigorously the biological plausibility of the findings, even for large clusters. Finally, because nonparametric methods yielded a large reduction in statistical power to detect true positive findings, we conclude that the modest reduction in false positive findings that nonparametric analyses afford does not warrant a re-analysis of previously published fMRI studies using nonparametric techniques. Copyright © 2018 Elsevier Inc. All rights reserved.
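
    The simulation setup described above is easy to reproduce in outline: smooth white noise into an approximate GRF, threshold at a cluster-defining level, and tabulate cluster sizes. A minimal sketch (all parameter values hypothetical):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, label

rng = np.random.default_rng(11)
# Simulate a smooth 2D Gaussian random field: smoothed white noise,
# re-standardized to unit variance.
field = gaussian_filter(rng.normal(size=(512, 512)), sigma=4)
field /= field.std()

# Threshold at a cluster-defining level and measure cluster sizes.
cdt = 2.5
clusters, n = label(field > cdt)
sizes = np.bincount(clusters.ravel())[1:]   # drop the background bin
print(n, sizes.max() if n else 0, sizes.mean() if n else 0)
```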

  15. Nonparametric tests for equality of psychometric functions.

    PubMed

    García-Pérez, Miguel A; Núñez-Antón, Vicente

    2017-12-07

    Many empirical studies measure psychometric functions (curves describing how observers' performance varies with stimulus magnitude) because these functions capture the effects of experimental conditions. To assess these effects, parametric curves are often fitted to the data and comparisons are carried out by testing for equality of mean parameter estimates across conditions. This approach is parametric and, thus, vulnerable to violations of the implied assumptions. Furthermore, testing for equality of means of parameters may be misleading: Psychometric functions may vary meaningfully across conditions on an observer-by-observer basis with no effect on the mean values of the estimated parameters. Alternative approaches to assess equality of psychometric functions per se are thus needed. This paper compares three nonparametric tests that are applicable in all situations of interest: The existing generalized Mantel-Haenszel test, a generalization of the Berry-Mielke test that was developed here, and a split variant of the generalized Mantel-Haenszel test also developed here. Their statistical properties (accuracy and power) are studied via simulation and the results show that all tests are indistinguishable as to accuracy but they differ non-uniformly as to power. Empirical use of the tests is illustrated via analyses of published data sets and practical recommendations are given. The computer code in MATLAB and R to conduct these tests is available as Electronic Supplemental Material.

  16. BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs

    PubMed Central

    Eklund, Anders; Dufort, Paul; Villani, Mattias; LaConte, Stephen

    2014-01-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm³ brain template in 4–6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/). PMID:24672471

  17. Nakagami-m parametric imaging for characterization of thermal coagulation and cavitation erosion induced by HIFU.

    PubMed

    Han, Meng; Wang, Na; Guo, Shifang; Chang, Nan; Lu, Shukuan; Wan, Mingxi

    2018-07-01

    Nowadays, both thermal and mechanical HIFU ablation techniques associated with cavitation have been developed for noninvasive treatment. A specific challenge for the successful clinical implementation of HIFU is to achieve real-time imaging for the evaluation and determination of therapy outcomes such as necrosis or homogenization. Ultrasound Nakagami-m parametric imaging highlights the degrading shadowing effects of bubbles and can be used for tissue characterization. The aim of this study is to investigate the performance of Nakagami-m parametric imaging for evaluating and differentiating thermal coagulation and cavitation erosion induced by HIFU. Lesions were induced in basic bovine serum albumin (BSA) phantoms and ex vivo porcine livers using a 1.6 MHz single-element transducer. Thermal and mechanical lesions, induced by two types of HIFU sequences respectively, were evaluated using Nakagami-m parametric imaging and ultrasound B-mode imaging. The lesion sizes estimated using the Nakagami-m parametric imaging technique were all closer to the actual sizes than those from B-mode imaging. The p-value obtained from the t-test between the mean m values of thermal coagulation and cavitation erosion was smaller than 0.05, demonstrating that the m values of thermal lesions were significantly different from those of mechanical lesions. This was confirmed by the ex vivo experiments, and histologic examination showed that different changes result from HIFU exposure: tissue dehydration resulting from the thermal effect, and tissue homogenization resulting from the mechanical effect. This study demonstrated that Nakagami-m parametric imaging is a potential real-time imaging technique for evaluating and differentiating thermal coagulation and cavitation erosion. Copyright © 2018 Elsevier B.V. All rights reserved.
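
    For reference, the Nakagami shape parameter is typically estimated from envelope samples by the method of moments, m = (E[R²])² / Var(R²) with scale Ω = E[R²]; a parametric image applies this estimate in a sliding window. A minimal sketch on simulated samples:

```python
import numpy as np

def nakagami_params(envelope):
    """Moment-based Nakagami parameters from an ultrasound envelope R:
    omega = E[R^2] (scale), m = (E[R^2])^2 / Var(R^2) (shape)."""
    r2 = np.asarray(envelope, dtype=float) ** 2
    omega = r2.mean()
    m = omega**2 / r2.var()
    return m, omega

# Hypothetical envelope samples drawn from a known Nakagami distribution:
# if R ~ Nakagami(m, omega), then R^2 ~ Gamma(shape=m, scale=omega/m).
rng = np.random.default_rng(12)
true_m, true_omega = 1.2, 2.0
env = np.sqrt(rng.gamma(shape=true_m, scale=true_omega / true_m, size=5000))
print(nakagami_params(env))  # should be close to (1.2, 2.0)
```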

  18. Parametric instability, inverse cascade and the range of solar-wind turbulence

    NASA Astrophysics Data System (ADS)

    Chandran, Benjamin D. G.

    2018-02-01

    In this paper, weak-turbulence theory is used to investigate the nonlinear evolution of the parametric instability in three-dimensional low-β plasmas at wavelengths much greater than the ion inertial length, under the assumption that slow magnetosonic waves are strongly damped. It is shown analytically that the parametric instability leads to an inverse cascade of Alfvén wave quanta, and several exact solutions to the wave kinetic equations are presented. The main results of the paper concern the parametric decay of Alfvén waves that initially satisfy e⁺ ≫ e⁻, where e⁺ and e⁻ are the frequency (f) spectra of Alfvén waves propagating in opposite directions along the magnetic field lines. If e⁺ initially has a peak frequency f₀ (at which e⁺ is maximized) and an 'infrared' scaling f^p at smaller f, then e⁺ acquires an f⁻¹ scaling throughout a range of frequencies that spreads out in both directions from f₀. At the same time, e⁻ acquires an f⁻² scaling within this same frequency range. If the plasma parameters and infrared e⁺ spectrum are chosen to match conditions in the fast solar wind at a heliocentric distance of 0.3 astronomical units (AU), then the nonlinear evolution of the parametric instability leads to an e⁺ spectrum that matches fast-wind measurements from the Helios spacecraft at 0.3 AU, including the observed f⁻¹ scaling at f ≳ 10⁻⁴ Hz. The results of this paper suggest that the f⁻¹ spectrum seen by Helios in the fast solar wind at f ≳ 10⁻⁴ Hz is produced in situ by parametric decay and that the f⁻¹ range of e⁺ extends over an increasingly narrow range of frequencies as the heliocentric distance decreases below 0.3 AU. This prediction will be tested by measurements from the Parker Solar Probe.

  19. Parametric fMRI of paced motor responses uncovers novel whole-brain imaging biomarkers in spinocerebellar ataxia type 3.

    PubMed

    Duarte, João Valente; Faustino, Ricardo; Lobo, Mercês; Cunha, Gil; Nunes, César; Ferreira, Carlos; Januário, Cristina; Castelo-Branco, Miguel

    2016-10-01

    Machado-Joseph Disease, inherited spinocerebellar ataxia type 3 (SCA3), is the most common form worldwide. Neuroimaging and neuropathology have consistently demonstrated cerebellar alterations. Here we aimed to discover whole-brain functional biomarkers based on parametric performance-level-dependent signals. We assessed 13 patients with early SCA3 and 14 healthy participants. We used a combined parametric behavioral/functional neuroimaging design to investigate disease fingerprints as a function of performance levels, coupled with structural MRI and voxel-based morphometry. Functional magnetic resonance imaging (fMRI) was designed to parametrically analyze behavior and neural responses to audio-paced bilateral thumb movements at temporal frequencies of 1, 3, and 5 Hz. Our performance-level-based design probing neuronal correlates of motor coordination enabled the discovery that neural activation and behavior show a critical loss of parametric modulation specifically in SCA3, associated with frequency-dependent cortical/subcortical activation/deactivation patterns. Cerebellar/cortical rate-dependent dissociation patterns could clearly differentiate between groups irrespective of grey matter loss. Our findings suggest functional reorganization of the motor network and indicate a possible role for fMRI as a tool to monitor disease progression in SCA3. Accordingly, fMRI patterns proved to be potential biomarkers in early SCA3, as tested by receiver operating characteristic analysis of both behavior and neural activation at different frequencies. Discrimination analysis based on the BOLD signal in response to the applied parametric finger-tapping task frequently reached >80% sensitivity and specificity in single regions of interest. Functional fingerprints based on cerebellar and cortical BOLD performance-dependent signal modulation can thus be combined as diagnostic and/or therapeutic targets in hereditary ataxia. Hum Brain Mapp 37:3656-3668, 2016. © 2016 Wiley Periodicals, Inc.

  20. A parametric model order reduction technique for poroelastic finite element models.

    PubMed

    Lappano, Ettore; Polanz, Markus; Desmet, Wim; Mundo, Domenico

    2017-10-01

    This research presents a parametric model order reduction approach for vibro-acoustic problems in the frequency domain of systems containing poroelastic materials (PEM). The method is applied to the Finite Element (FE) discretization of the weak u-p integral formulation based on the Biot-Allard theory and makes use of reduced basis (RB) methods typically employed for parametric problems. The parametric reduction is obtained rewriting the Biot-Allard FE equations for poroelastic materials using an affine representation of the frequency (therefore allowing for RB methods) and projecting the frequency-dependent PEM system on a global reduced order basis generated with the proper orthogonal decomposition instead of standard modal approaches. This has proven to be better suited to describe the nonlinear frequency dependence and the strong coupling introduced by damping. The methodology presented is tested on two three-dimensional systems: in the first experiment, the surface impedance of a PEM layer sample is calculated and compared with results of the literature; in the second, the reduced order model of a multilayer system coupled to an air cavity is assessed and the results are compared to those of the reference FE model.
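
    The proper-orthogonal-decomposition step can be sketched via an SVD of a snapshot matrix: retain enough left singular vectors to capture a target energy fraction, then Galerkin-project the full system onto that basis. The snapshot data below are random placeholders rather than an actual PEM solution sweep.

```python
import numpy as np

def pod_basis(snapshots, energy=0.9999):
    """Proper orthogonal decomposition of a snapshot matrix
    (dofs x n_snapshots): keep enough left singular vectors to
    capture the requested fraction of the snapshot energy."""
    U, svals, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(svals**2) / np.sum(svals**2)
    k = int(np.searchsorted(cum, energy)) + 1
    return U[:, :k]

# Hypothetical frequency-sweep snapshots of a poroelastic FE solution.
rng = np.random.default_rng(13)
snapshots = rng.normal(size=(2000, 40))
V = pod_basis(snapshots)
# A reduced system then follows by Galerkin projection, e.g. A_r = V.T @ A @ V.
```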

  1. STUDIES IN RESEARCH METHODOLOGY. IV. A SAMPLING STUDY OF THE CENTRAL LIMIT THEOREM AND THE ROBUSTNESS OF ONE-SAMPLE PARAMETRIC TESTS,

    DTIC Science & Technology

    …iconoclastic. Even at N=1024 these departures were quite appreciable at the testing tails, being greatest for chi-square and least for Z, and becoming worse in all cases at increasingly extreme tail areas. (Author)
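
    A small Monte Carlo in the spirit of such sampling studies: empirical rejection rates (test size) of a one-sample t-test when the parent population is skewed (here exponential), checked at several sample sizes and nominal tail areas. The population choice and replication counts are assumptions for illustration only.

```python
# Empirical size of the one-sample t-test under a skewed (exponential) parent.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
reps = 5000
for n in (4, 16, 64, 1024):
    x = rng.exponential(scale=1.0, size=(reps, n))          # true mean = 1
    _, p = stats.ttest_1samp(x, popmean=1.0, axis=1)
    for alpha in (0.05, 0.01):
        print(f"n={n:5d} alpha={alpha}: empirical size = {np.mean(p < alpha):.4f}")
```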

  2. A Simulation Comparison of Parametric and Nonparametric Dimensionality Detection Procedures

    ERIC Educational Resources Information Center

    Mroch, Andrew A.; Bolt, Daniel M.

    2006-01-01

    Recently, nonparametric methods have been proposed that provide a dimensionally based description of test structure for tests with dichotomous items. Because such methods are based on different notions of dimensionality than are assumed when using a psychometric model, it remains unclear whether these procedures might lead to a different…

  3. [Percutaneous surgery for plantar fasciitis due to a calcaneal spur].

    PubMed

    Apóstol-González, Saúl; Herrera, Jesús

    2009-01-01

    To determine the efficacy of percutaneous surgical treatment for talalgia due to a calcaneal spur. This is an observational, descriptive clinical series analyzing the outcomes of 10 patients with a diagnosis of talalgia due to plantar fasciitis with a calcaneal spur, treated with percutaneous foot surgery. The end result was assessed with a visual analog scale (VAS) to measure pain, the patients' opinion, and their return to activities of daily living. Central tendency and scatter measures were calculated. The inferential analysis was done with the non-parametric chi-square (χ²) test. Most patients were females (90%) and mean age was 40.5 years. Follow-up was 12 months. One patient had bleeding at the approach site. Pain was reduced from 8 to 1.5 on the VAS. Nine patients returned to their activities. Two patients had occasional mild pain upon prolonged standing. Ninety percent of results were satisfactory. Percutaneous foot surgery for talalgia caused by plantar fasciitis due to a calcaneal spur is a simple and effective method. It reduces operative time and allows an early return of patients to their activities of daily living.

  4. Impact of meteorology on air quality modeling over the Po valley in northern Italy

    NASA Astrophysics Data System (ADS)

    Pernigotti, D.; Georgieva, E.; Thunis, P.; Bessagnet, B.

    2012-05-01

    A series of sensitivity tests has been performed using both a mesoscale meteorological model (MM5) and a chemical transport model (CHIMERE) to better understand the reasons why all models underestimate particulate matter concentrations in the Po valley in winter. Different options are explored to nudge meteorological observations from regulatory networks into MM5 in order to improve model performance, especially during the low wind speed regimes frequently present in this area. The sensitivity of the CHIMERE-modeled particulate matter concentrations to these different meteorological inputs is then evaluated for the January 2005 time period. A further analysis of the CHIMERE model results revealed the need to improve the parametrization of the in-cloud scavenging and vertical diffusivity schemes; such modifications are relevant especially when the model is applied under mist, fog and low stratus conditions, which frequently occur in the Po valley during winter. The sensitivity of modeled particulate matter concentrations to turbulence parameters, wind, temperature and cloud liquid water content in one of the most polluted and complex areas in Europe is finally discussed.

  5. Output-only modal parameter estimator of linear time-varying structural systems based on vector TAR model and least squares support vector machine

    NASA Astrophysics Data System (ADS)

    Zhou, Si-Da; Ma, Yuan-Chen; Liu, Li; Kang, Jie; Ma, Zhi-Sai; Yu, Lei

    2018-01-01

    Identification of time-varying modal parameters contributes to structural health monitoring, fault detection, vibration control, etc., of operational time-varying structural systems. However, it is a challenging task because no more information is available for identifying time-varying systems than for time-invariant ones. This paper presents a vector time-dependent autoregressive model and least squares support vector machine based modal parameter estimator for linear time-varying structural systems in the case of output-only measurements. To reduce the computational cost, a Wendland's compactly supported radial basis function is used to achieve sparsity of the Gram matrix. A Gamma-test-based non-parametric approach to selecting the regularization factor is adopted for the proposed estimator to replace the time-consuming n-fold cross-validation. A series of numerical examples illustrates the advantages of the proposed modal parameter estimator in suppressing overestimation and in handling short data records. A laboratory experiment has further validated the proposed estimator.

  6. Efficient Posterior Probability Mapping Using Savage-Dickey Ratios

    PubMed Central

    Penny, William D.; Ridgway, Gerard R.

    2013-01-01

    Statistical Parametric Mapping (SPM) is the dominant paradigm for mass-univariate analysis of neuroimaging data. More recently, a Bayesian approach termed Posterior Probability Mapping (PPM) has been proposed as an alternative. PPM offers two advantages: (i) inferences can be made about effect size thus lending a precise physiological meaning to activated regions, (ii) regions can be declared inactive. This latter facility is most parsimoniously provided by PPMs based on Bayesian model comparisons. To date these comparisons have been implemented by an Independent Model Optimization (IMO) procedure which separately fits null and alternative models. This paper proposes a more computationally efficient procedure based on Savage-Dickey approximations to the Bayes factor, and Taylor-series approximations to the voxel-wise posterior covariance matrices. Simulations show the accuracy of this Savage-Dickey-Taylor (SDT) method to be comparable to that of IMO. Results on fMRI data show excellent agreement between SDT and IMO for second-level models, and reasonable agreement for first-level models. This Savage-Dickey test is a Bayesian analogue of the classical SPM-F and allows users to implement model comparison in a truly interactive manner. PMID:23533640
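
    A toy illustration of the Savage-Dickey identity BF01 = p(θ=0|y)/p(θ=0) for a scalar effect with a conjugate normal prior; the data, prior scale, and known noise level are assumptions, and the voxel-wise Taylor approximation of posterior covariances used in the paper is not reproduced here.

```python
# Savage-Dickey Bayes factor for H0: theta = 0 with a conjugate normal prior.
import numpy as np
from scipy import stats

prior_sd = 1.0                                  # theta ~ N(0, prior_sd^2)
y = np.array([0.3, 0.5, 0.1, 0.4, 0.6])         # hypothetical observations
sigma = 0.3                                     # assumed known noise sd

n = y.size
post_var = 1.0 / (1.0 / prior_sd**2 + n / sigma**2)   # conjugate update
post_mean = post_var * y.sum() / sigma**2

# BF01 = posterior density at 0 over prior density at 0.
bf01 = stats.norm.pdf(0, post_mean, np.sqrt(post_var)) / stats.norm.pdf(0, 0, prior_sd)
print(f"BF01 = {bf01:.3f}  (values < 1 favor a nonzero effect)")
```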

  7. Shock tunnel studies of scramjet phenomena, supplement 5

    NASA Technical Reports Server (NTRS)

    Casey, R.; Stalker, R. J.; Brescianini, C. P.; Morgan, R. G.; Jacobs, P. A.; Wendt, M.; Ward, N. R.; Akman, N.; Allen, G. A.; Skinner, K.

    1990-01-01

    A series of reports is presented on SCRAMjet studies, shock tunnel studies, and expansion tube studies. The SCRAMjet studies include: (1) Investigation of a Supersonic Combustion Layer; (2) Wall Injected SCRAMjet Experiments; (3) Supersonic Combustion with Transverse, Circular Wall Jets; (4) Dissociated Test Gas Effects on SCRAMjet Combustors; (5) Use of Silane as a Fuel Additive for Hypersonic Thrust Production; (6) Pressure-length Correlations in Supersonic Combustion; (7) Hot Hydrogen Injection Technique for Shock Tunnels; (8) Heat Release - Wave Interaction Phenomena in Hypersonic Flows; (9) A Study of the Wave Drag in Hypersonic SCRAMjets; (10) Parametric Study of Thrust Production in the Two Dimensional SCRAMjet; (11) The Design of a Mass Spectrometer for use in Hypersonic Impulse Facilities; and (12) Development of a Skin Friction Gauge for use in an Impulse Facility. The shock tunnel studies include: (1) Hypervelocity Flow in Axisymmetric Nozzles; (2) Shock Tunnel Development; and (3) Real Gas Effects in Hypervelocity Flows over an Inclined Cone. The expansion tube studies include: (1) Investigation of Flow Characteristics in the TQ Expansion Tube; and (2) Disturbances in the Driver Gas of a Shock Tube.

  8. Air Leakage and Air Transfer Between Garage and Living Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudd, A.

    2014-09-01

    This research project focused on evaluation of air transfer between the garage and living space in a single-family detached home constructed by a production homebuilder in compliance with the 2009 International Residential Code and the 2009 International Energy Conservation Code. The project gathered important information about the performance of whole-building ventilation systems and garage ventilation systems as they relate to minimizing flow of contaminated air from garage to living space. A series of 25 multi-point fan pressurization tests and additional zone pressure diagnostic testing characterized the garage and house air leakage, the garage-to-house air leakage, and garage and house pressure relationships to each other and to outdoors using automated fan pressurization and pressure monitoring techniques. While the relative characteristics of this house may not represent the entire population of new construction configurations and air tightness levels (house and garage) throughout the country, the technical approach was conservative and should reasonably extend the usefulness of the results to a large spectrum of house configurations from this set of parametric tests in this one house. Based on the results of this testing, the two-step garage-to-house air leakage test protocol described above is recommended where whole-house exhaust ventilation is employed. For houses employing whole-house supply ventilation (positive pressure) or balanced ventilation (same pressure effect as the Baseline condition), adherence to the EPA Indoor airPLUS house-to-garage air sealing requirements should be sufficient to expect little to no garage-to-house air transfer.

  9. Technology Solutions Case Study: Air Leakage and Air Transfer Between Garage and Living Space, Waldorf, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2014-11-01

    In this project, Building Science Corporation worked with production homebuilder K. Hovnanian to evaluate air transfer between the garage and living space in a single-family detached home constructed by a production homebuilder in compliance with the 2009 International Residential Code and the 2009 International Energy Conservation Code. The project gathered important information about the performance of whole-building ventilation systems and garage ventilation systems as they relate to minimizing flow of contaminated air from garage to living space. A series of 25 multipoint fan pressurization tests and additional zone pressure diagnostic testing measured the garage and house air leakage, the garage-to-house air leakage, and garage and house pressure relationships to each other and to outdoors using automated fan pressurization and pressure monitoring techniques. While the relative characteristics of this house may not represent the entire population of new construction configurations and air tightness levels (house and garage) throughout the country, the technical approach was conservative and should reasonably extend the usefulness of the results to a large spectrum of house configurations from this set of parametric tests in this one house. Based on the results of this testing, the two-step garage-to-house air leakage test protocol described above is recommended where whole-house exhaust ventilation is employed. For houses employing whole-house supply ventilation (positive pressure) or balanced ventilation (same pressure effect as the baseline condition), adherence to the EPA Indoor airPLUS house-to-garage air sealing requirements should be sufficient to expect little to no garage-to-house air transfer.

  10. Building America Case Study: Air Leakage and Air Transfer Between Garage and Living Space, Waldorf, Maryland (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2014-11-01

    This research project focused on evaluation of air transfer between the garage and living space in a single-family detached home constructed by a production homebuilder in compliance with the 2009 International Residential Code and the 2009 International Energy Conservation Code. The project gathered important information about the performance of whole-building ventilation systems and garage ventilation systems as they relate to minimizing flow of contaminated air from garage to living space. A series of 25 multi-point fan pressurization tests and additional zone pressure diagnostic testing characterized the garage and house air leakage, the garage-to-house air leakage, and garage and house pressure relationships to each other and to outdoors using automated fan pressurization and pressure monitoring techniques. While the relative characteristics of this house may not represent the entire population of new construction configurations and air tightness levels (house and garage) throughout the country, the technical approach was conservative and should reasonably extend the usefulness of the results to a large spectrum of house configurations from this set of parametric tests in this one house. Based on the results of this testing, the two-step garage-to-house air leakage test protocol described above is recommended where whole-house exhaust ventilation is employed. For houses employing whole-house supply ventilation (positive pressure) or balanced ventilation (same pressure effect as the Baseline condition), adherence to the EPA Indoor airPLUS house-to-garage air sealing requirements should be sufficient to expect little to no garage-to-house air transfer.

  11. Evaluating Parametrization Protocols for Hydration Free Energy Calculations with the AMOEBA Polarizable Force Field.

    PubMed

    Bradshaw, Richard T; Essex, Jonathan W

    2016-08-09

    Hydration free energy (HFE) calculations are often used to assess the performance of biomolecular force fields and the quality of assigned parameters. The AMOEBA polarizable force field moves beyond traditional pairwise additive models of electrostatics and may be expected to improve upon predictions of thermodynamic quantities such as HFEs over and above fixed-point-charge models. The recent SAMPL4 challenge evaluated the AMOEBA polarizable force field in this regard but showed substantially worse results than those using the fixed-point-charge GAFF model. Starting with a set of automatically generated AMOEBA parameters for the SAMPL4 data set, we evaluate the cumulative effects of a series of incremental improvements in parametrization protocol, including both solute and solvent model changes. Ultimately, the optimized AMOEBA parameters give a set of results that are not statistically significantly different from those of GAFF in terms of signed and unsigned error metrics. This allows us to propose a number of guidelines for new molecule parameter derivation with AMOEBA, which we expect to have benefits for a range of biomolecular simulation applications such as protein-ligand binding studies.

  12. Non-parametric and least squares Langley plot methods

    NASA Astrophysics Data System (ADS)

    Kiedron, P. W.; Michalsky, J. J.

    2016-01-01

    Langley plots are used to calibrate sun radiometers primarily for the measurement of the aerosol component of the atmosphere that attenuates (scatters and absorbs) incoming direct solar radiation. In principle, the calibration of a sun radiometer is a straightforward application of the Bouguer-Lambert-Beer law V = V0 e^(-τ·m): a plot of the logged voltage ln(V) vs. air mass m yields a straight line with intercept ln(V0). This ln(V0) subsequently can be used to solve for τ for any measurement of V and calculation of m. This calibration works well on some high mountain sites, but the application of the Langley plot calibration technique is more complicated at other, more interesting, locales. This paper is concerned with ferreting out calibrations at difficult sites and examining and comparing a number of conventional and non-conventional methods for obtaining successful Langley plots. The 11 techniques discussed indicate that both least squares and various non-parametric techniques produce satisfactory calibrations with no significant differences among them when the time series of ln(V0)'s are smoothed and interpolated with median and mean moving window filters.
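
    A minimal sketch of the least-squares Langley fit, together with a Theil-Sen fit as one example of a non-parametric alternative; the synthetic air masses, optical depth, and noise level are assumptions.

```python
# Langley calibration on synthetic data: intercept of ln(V) vs. m gives ln(V0).
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(3)
m = np.linspace(2.0, 6.0, 40)                  # air masses over a morning
tau_true, lnV0_true = 0.12, 1.5
lnV = lnV0_true - tau_true * m + rng.normal(0, 0.01, m.size)

slope, intercept = np.polyfit(m, lnV, 1)       # ordinary least squares
print(f"OLS:       ln(V0) = {intercept:.3f}, tau = {-slope:.3f}")

ts_slope, ts_intercept, _, _ = theilslopes(lnV, m)   # median-of-slopes fit
print(f"Theil-Sen: ln(V0) = {ts_intercept:.3f}, tau = {-ts_slope:.3f}")
```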

  13. Effect of Impact Location on the Response of Shuttle Wing Leading Edge Panel 9

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Spellman, Regina L.; Hardy, Robin C.; Fasanella, Edwin L.; Jackson, Karen E.

    2005-01-01

    The objective of this paper is to compare the results of several simulations performed to determine the worst-case location for a foam impact on the Space Shuttle wing leading edge. The simulations were performed using the commercial non-linear transient dynamic finite element code LS-DYNA. These simulations represent the first in a series of parametric studies performed to support the selection of the worst-case impact scenario. Panel 9 was selected for this study to enable comparisons with previous simulations performed during the Columbia Accident Investigation. The projectile for this study is a 5.5-in cube of typical external tank foam weighing 0.23 lb. Seven locations spanning the panel surface were impacted with the foam cube. For each of these cases, the foam was traveling at 1000 ft/s directly aft, along the orbiter X-axis. Results compared across the parametric studies included strains, contact forces, and material energies for the various simulations. The results show that the worst-case impact location was on the top surface, near the apex.

  14. An approach to trial design and analysis in the era of non-proportional hazards of the treatment effect.

    PubMed

    Royston, Patrick; Parmar, Mahesh K B

    2014-08-07

    Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology in real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that when a treatment effect either increases or decreases over time, the joint test can outperform the logrank test in the presence of both patterns of non-proportional hazards. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for the design of the trial retains the tools familiar in the standard methodology based on the logrank test, and extends it to incorporate a joint test of the null hypothesis with power against non-proportional hazards. For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
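
    For context, the standard logrank power calculation that such designs start from is Schoenfeld's events formula; the sketch below implements it under an assumed 1:1 allocation and does not reproduce the authors' joint test.

```python
# Schoenfeld's formula: events required by a two-sided logrank test, 1:1 arms.
import numpy as np
from scipy.stats import norm

def logrank_events(hr, alpha=0.05, power=0.9):
    za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return 4 * (za + zb) ** 2 / np.log(hr) ** 2

d = logrank_events(hr=0.75)
print(f"events needed for HR=0.75: {np.ceil(d):.0f}")
# Sample size follows by dividing d by the anticipated overall event probability.
```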

  15. Some Advances in Downscaling Probabilistic Climate Forecasts for Agricultural Decision Support

    NASA Astrophysics Data System (ADS)

    Han, E.; Ines, A.

    2015-12-01

    Seasonal climate forecasts, commonly provided in tercile-probabilities format (below-, near- and above-normal), need to be translated into more meaningful information for decision support of practitioners in agriculture. In this paper, we will present two novel approaches to temporally downscale probabilistic seasonal climate forecasts: one non-parametric and one parametric method. First, the non-parametric downscaling approach called FResampler1 uses the concept of 'conditional block sampling' of weather data to create daily weather realizations of a tercile-based seasonal climate forecast. FResampler1 randomly draws time series of daily weather parameters (e.g., rainfall, maximum and minimum temperature and solar radiation) from historical records for the season of interest, from years that belong to a given rainfall tercile category (below-, near- or above-normal). In this way, FResampler1 preserves the covariance between rainfall and the other weather parameters, in effect conditionally sampling maximum and minimum temperature and solar radiation according to whether a day is wet or dry. The second approach, called predictWTD, is a parametric method based on a conditional stochastic weather generator. The tercile-based seasonal climate forecast is converted into a theoretical forecast cumulative probability curve. Then the deviates for each percentile are converted into rainfall amount, frequency or intensity to downscale the 'full' distribution of probabilistic seasonal climate forecasts. Those seasonal deviates are then disaggregated on a monthly basis and used to constrain the downscaling of forecast realizations at different percentile values of the theoretical forecast curve. In addition to the theoretical basis of the approaches, we will discuss sensitivity analyses of them (length of data and size of samples). Their potential applications for managing climate-related risks in agriculture will also be shown through a couple of case studies based on actual seasonal climate forecasts: rice cropping in the Philippines and maize cropping in India and Kenya.
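
    A hedged sketch of the conditional block sampling idea attributed to FResampler1: seasons are resampled whole from historical years in the forecast tercile, preserving within-day covariance across weather variables. The historical arrays and forecast probabilities below are invented for illustration.

```python
# Conditional block sampling: draw whole seasons from years in the forecast
# tercile, keeping all weather variables of a day together.
import numpy as np

rng = np.random.default_rng(4)
n_years, n_days = 30, 90
hist = {                                   # hypothetical daily records
    "rain": rng.gamma(0.5, 8.0, (n_years, n_days)),
    "tmax": rng.normal(32, 3, (n_years, n_days)),
    "tmin": rng.normal(22, 2, (n_years, n_days)),
    "srad": rng.normal(18, 4, (n_years, n_days)),
}

seasonal = hist["rain"].sum(axis=1)
category = np.digitize(seasonal, np.quantile(seasonal, [1 / 3, 2 / 3]))  # 0/1/2

def fresampler1_like(forecast_probs, n_real=100):
    """Season-long realizations; year picked per tercile forecast probabilities."""
    out = []
    for _ in range(n_real):
        cat = rng.choice(3, p=forecast_probs)                # pick a tercile
        year = rng.choice(np.flatnonzero(category == cat))   # a year within it
        out.append({v: hist[v][year] for v in hist})         # whole block kept
    return out

realizations = fresampler1_like([0.15, 0.35, 0.50])   # wet-leaning forecast
print(len(realizations), "seasonal realizations drawn")
```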

  16. Parametric and Non-Parametric Vibration-Based Structural Identification Under Earthquake Excitation

    NASA Astrophysics Data System (ADS)

    Pentaris, Fragkiskos P.; Fouskitakis, George N.

    2014-05-01

    The problem of modal identification in civil structures is of crucial importance, and thus has been receiving increasing attention in recent years. Vibration-based methods are quite promising as they are capable of identifying the structure's global characteristics, they are relatively easy to implement and they tend to be time-effective and less expensive than most alternatives [1]. This paper focuses on the off-line structural/modal identification of civil (concrete) structures subjected to low-level earthquake excitations, under which they remain within their linear operating regime. Earthquakes and their details are recorded and provided by the seismological network of Crete [2], which 'monitors' the broad region of the south Hellenic arc, an active seismic region which functions as a natural laboratory for earthquake engineering of this kind. A sufficient number of seismic events are analyzed in order to reveal the modal characteristics of the structures under study, which are the two concrete buildings of the School of Applied Sciences, Technological Education Institute of Crete, located in Chania, Crete, Hellas. Both buildings are equipped with high-sensitivity and high-accuracy seismographs - providing acceleration measurements - established at the basement (the structure's foundation), presently considered as the ground's acceleration (excitation), and at all levels (ground floor, 1st floor, 2nd floor and terrace). Further details regarding the instrumentation setup and data acquisition may be found in [3]. The present study invokes stochastic, both non-parametric (frequency-based) and parametric, methods for structural/modal identification (natural frequencies and/or damping ratios). Non-parametric methods include Welch-based spectrum and Frequency Response Function (FRF) estimation, while parametric methods include AutoRegressive (AR), AutoRegressive with eXogenous input (ARX) and AutoRegressive Moving-Average with eXogenous input (ARMAX) models [4, 5]. Preliminary results indicate that parametric methods are capable of sufficiently providing the structural/modal characteristics such as natural frequencies and damping ratios. The study also aims - at a further level of investigation - to provide a reliable statistically-based methodology for structural health monitoring after major seismic events which can potentially have damaging consequences for structures. Acknowledgments: This work was supported by the State Scholarships Foundation of Hellas. References: [1] J. S. Sakellariou and S. D. Fassois, "Stochastic output error vibration-based damage detection and assessment in structures under earthquake excitation," Journal of Sound and Vibration, vol. 297, pp. 1048-1067, 2006. [2] G. Hloupis, I. Papadopoulos, J. P. Makris, and F. Vallianatos, "The South Aegean seismological network - HSNC," Adv. Geosci., vol. 34, pp. 15-21, 2013. [3] F. P. Pentaris, J. Stonham, and J. P. Makris, "A review of the state-of-the-art of wireless SHM systems and an experimental set-up towards an improved design," presented at EUROCON 2013, IEEE, Zagreb, 2013. [4] S. D. Fassois, "Parametric Identification of Vibrating Structures," in Encyclopedia of Vibration, S. G. Braun, D. J. Ewins, and S. S. Rao, Eds. London: Academic Press, 2001. [5] S. D. Fassois and J. S. Sakellariou, "Time-series methods for fault detection and identification in vibrating structures," Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, vol. 365, pp. 411-448, February 15, 2007.
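
    As a minimal example of the parametric (AR-type) identification route, the sketch below fits an AR model to a synthetic response record and reads natural frequencies and damping ratios from the poles; the signal content, sampling rate, and model order are assumptions, not the instrumented buildings' data.

```python
# Output-only AR identification: frequencies/damping from AR poles.
import numpy as np

fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(5)
x = (np.sin(2 * np.pi * 3.2 * t) * np.exp(-0.05 * t)          # "mode" 1
     + 0.5 * np.sin(2 * np.pi * 7.8 * t) * np.exp(-0.08 * t)  # "mode" 2
     + 0.1 * rng.standard_normal(t.size))                     # measurement noise

p = 8                                           # AR order (assumed)
Phi = np.column_stack([x[p - i - 1:-i - 1] for i in range(p)])
a, *_ = np.linalg.lstsq(Phi, x[p:], rcond=None) # x[k] = sum_i a_i x[k-i] + e[k]

poles = np.roots(np.r_[1.0, -a])                # discrete-time AR poles
lam = np.log(poles[np.imag(poles) > 0]) * fs    # continuous-time eigenvalues
freqs = np.abs(lam) / (2 * np.pi)
zetas = -np.real(lam) / np.abs(lam)
for f, z in sorted(zip(freqs, zetas)):
    print(f"f = {f:5.2f} Hz, zeta = {z:.4f}")
```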

  17. Ladder beam and camera video recording system for evaluating forelimb and hindlimb deficits after sensorimotor cortex injury in rats.

    PubMed

    Soblosky, J S; Colgin, L L; Chorney-Lane, D; Davidson, J F; Carey, M E

    1997-12-30

    Hindlimb and forelimb deficits in rats caused by sensorimotor cortex lesions are frequently tested by using the narrow flat beam (hindlimb), the narrow pegged beam (hindlimb and forelimb) or the grid-walking (forelimb) tests. Although these are excellent tests, the narrow flat beam generates non-parametric data, so that the use of more powerful parametric statistical analyses is precluded. All these tests can be difficult to score if the rat is moving rapidly. Foot misplacements, especially on the grid-walking test, are indicative of an ongoing deficit, but have not previously been reliably and accurately described and quantified. In this paper we present an easy-to-construct and easy-to-use horizontal ladder beam with a camera system on rails, which can be used to evaluate both hindlimb and forelimb deficits in a single test. By slow-motion videotape playback we were able to quantify and demonstrate foot misplacements which persist beyond the recovery period usually seen using more conventional measures (i.e., footslips and footfaults). This convenient system provides a rapid and reliable method for recording and evaluating rat performance on any type of beam and may be useful for measuring sensorimotor recovery following brain injury.

  18. Selecting a separable parametric spatiotemporal covariance structure for longitudinal imaging data.

    PubMed

    George, Brandon; Aban, Inmaculada

    2015-01-15

    Longitudinal imaging studies allow great insight into how the structure and function of a subject's internal anatomy change over time. Unfortunately, the analysis of longitudinal imaging data is complicated by inherent spatial and temporal correlation: the temporal from the repeated measures, and the spatial from the outcomes of interest being observed at multiple points in a patient's body. We propose the use of a linear model with a separable parametric spatiotemporal error structure for the analysis of repeated imaging data. The model makes use of spatial (exponential, spherical, and Matérn) and temporal (compound symmetric, autoregressive-1, Toeplitz, and unstructured) parametric correlation functions. A simulation study, inspired by a longitudinal cardiac imaging study on mitral regurgitation patients, compared different information criteria for selecting a particular separable parametric spatiotemporal correlation structure, as well as the effects on type I and type II error rates for inference on fixed effects when the specified model is incorrect. Information criteria were found to be highly accurate at choosing between separable parametric spatiotemporal correlation structures. Misspecification of the covariance structure was found to be able to inflate the type I error rate or produce an overly conservative test size, the latter corresponding to decreased power. An example with clinical data is given illustrating how the covariance structure procedure can be performed in practice, as well as how covariance structure choice can change inferences about fixed effects. Copyright © 2014 John Wiley & Sons, Ltd.
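
    A short sketch of what 'separable' means operationally: the full spatiotemporal covariance is the Kronecker product of a spatial and a temporal correlation matrix. The exponential/AR(1) choices and parameter values below are assumptions.

```python
# Separable covariance: Kronecker product of temporal AR(1) and spatial
# exponential correlation matrices.
import numpy as np

coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])   # spatial locations
times = np.arange(4)                                      # visit indices

phi, rho, sigma2 = 1.5, 0.7, 2.0        # assumed range, AR(1) corr, variance

d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
R_space = np.exp(-d / phi)                                   # exponential
R_time = rho ** np.abs(times[:, None] - times[None, :])      # AR(1)

Sigma = sigma2 * np.kron(R_time, R_space)
print(Sigma.shape)   # (12, 12); space varies fastest within each time block
```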

  19. Two-dimensional time dependent hurricane overwash and erosion modeling at Santa Rosa Island

    USGS Publications Warehouse

    McCall, R.T.; Van Theil de Vries, J. S. M.; Plant, N.G.; Van Dongeren, A. R.; Roelvink, J.A.; Thompson, D.M.; Reniers, A.J.H.M.

    2010-01-01

    A 2DH numerical model, which is capable of computing nearshore circulation and morphodynamics, including dune erosion, breaching and overwash, is used to simulate overwash caused by Hurricane Ivan (2004) on a barrier island. The model is forced using parametric wave and surge time series based on field data and large-scale numerical model results. The model predicted beach-face and dune erosion reasonably well, as well as the development of washover fans. Furthermore, the model demonstrated considerable quantitative skill (upwards of 66% of variance explained, maximum bias -0.21 m) in hindcasting the post-storm shape and elevation of the subaerial barrier island when a sheet flow sediment transport limiter was applied. The prediction skill ranged between 0.66 and 0.77 in a series of sensitivity tests in which several hydraulic forcing parameters were varied. The sensitivity studies showed that variations in the incident wave height and wave period affected the entire simulated island morphology, while variations in the surge level gradient between the ocean and the back-barrier bay affected the amount of deposition on the back barrier and in the back-barrier bay. The model sensitivity to the sheet flow sediment transport limiter, which served as a proxy for unknown factors controlling the resistance to erosion, was significantly greater than the sensitivity to the hydraulic forcing parameters. If no limiter was applied, the simulated morphological response of the barrier island was an order of magnitude greater than the measured morphological response.

  20. [Principal reasons for extraction of permanent teeth in a sample of Mexican adults].

    PubMed

    Medina-Solís, Carlo Eduardo; Pontigo-Loyola, América Patricia; Pérez-Campos, Eduardo; Hernández-Cruz, Pedro; De la Rosa-Santillana, Ruben; Navarete-Hernández, José de Jesús; Maupomé, Gerardo

    2013-01-01

    Tooth extractions are one of the most common procedures in oral surgery. The objective of this study was to identify the reasons for tooth extraction in adult patients seeking care at teaching dental clinics. A cross-sectional study was carried out in 331 subjects between 18 and 85 (45.37 +/- 13.85) years of age seeking dental care in dental clinics of the Universidad Autónoma del Estado de Hidalgo, from January 2009 to December 2009. Data pertaining to age, sex, tooth number, and the reason for extraction according to Kay & Blinkhorn were analyzed with non-parametric tests. A total of 779 extractions were undertaken. The main reasons for extraction were dental caries (43.1%), periodontal disease (PD) (27.9%), and prosthetic reasons (21.5%). There was no significant difference across sex in the reasons for extraction (p > 0.05). Significant differences (p < 0.001) were found for age (extraction due to periodontal disease increased with age); in patients attending a single visit vs. patients attending a series of dental appointments (caries was the more common reason in patients having a single appointment vs. PD in those attending a series of appointments); and for type of teeth (upper, posterior teeth and molars were extracted primarily because of caries, while lower, anterior teeth and incisors were more often extracted because of PD). Dental caries was the most common reason for tooth extraction, followed by periodontal disease. Differences in the reasons for extraction were observed across patient characteristics and type of tooth.

  1. Changes in neck pain and active range of motion after a single thoracic spine manipulation in subjects presenting with mechanical neck pain: a case series.

    PubMed

    Fernández-de-las-Peñas, César; Palomeque-del-Cerro, Luis; Rodríguez-Blanco, Cleofás; Gómez-Conesa, Antonia; Miangolarra-Page, Juan C

    2007-05-01

    Our aim was to report changes in neck pain at rest, active cervical range of motion, and neck pain at end-range of cervical motion after a single thoracic spine manipulation in a case series of patients with mechanical neck pain. Seven patients with mechanical neck pain (2 men, 5 women), 20 to 33 years old, were included. All patients received a single thoracic manipulation by an experienced manipulative therapist. The outcome measures of this case series were neck pain at rest, as measured by a numerical pain rating scale; active cervical range of motion; and neck pain at the end of each neck motion (e.g., flexion or extension). These outcomes were assessed pre-treatment, 5 minutes post-manipulation, and 48 hours after the intervention. A repeated-measures analysis was conducted with parametric tests. Within-group effect sizes were calculated using Cohen d coefficients. A significant (P < .001) decrease in neck pain at rest, with large within-group effect sizes (d > 1), was found after the thoracic spinal manipulation. A trend toward an increase in all cervical motions (flexion, extension, right or left lateral flexion, and right or left rotation) and a trend toward a decrease in neck pain at the end of each cervical motion were also found, although the differences did not reach significance (P > .05). Nevertheless, medium to large within-group effect sizes (0.5 < d < 1) were found between preintervention data and both postintervention assessments in both active range of motion and neck pain at the end of each neck motion. The present results demonstrated a clinically significant reduction in pain at rest in subjects with mechanical neck pain immediately and 48 hours following a thoracic manipulation. Although increases in all tested ranges of motion were obtained, none of them reached statistical significance at either posttreatment point. The same was found for pain at the end of range of motion for all tested ranges, with the exception of pain at the end of forward flexion at 48 hours. More than one mechanism likely explains the effects of thoracic spinal manipulation. Future controlled studies comparing spinal manipulation vs spinal mobilization of the thoracic spine are required.

  2. Test Series 2.4: detailed test plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Test Series 2.4 comprises the fourth sub-series of tests to be scheduled as a part of Test Series 2, the second stage of the combustion research program to be carried out at the Grimethorpe Experimental Pressurized Fluidized Bed Combustion Facility. Test Series 2.1, the first sub-series of tests, was completed in February 1983, and the first part of the second sub-series, Test Series 2.3, in October 1983. Test Series 2.2 was completed in February 1984, after which the second part of Test Series 2.3 commenced. The plan for Test Series 2.4 consists of 350 data-gathering hours to be completed within 520 coal-burning hours. This document provides a brief description of the Facility and modifications which have been made following the completion of Test Series 2.1. No further modifications were made following the completion of the first part of Test Series 2.3 or Test Series 2.2. The operating requirements for Test Series 2.4 are specified. The tests will be performed using a UK coal (Lady Windsor) and a UK limestone (Middleton), both nominated by the FRG. Seven objectives are proposed, which are to be fulfilled by thirteen test conditions. Six part-load tests based on input supplied by Kraftwerk Union AG are included. The cascade is expected to be on line for each test condition and total cascade exposure is expected to be in excess of 450 hours. Details of sampling and special measurements are given. A test plan schedule envisages the full test series being completed within a two-month calendar period. Finally, a number of contingency strategies are proposed. 3 figures, 14 tables.

  3. Inhibition of Orthopaedic Implant Infections by Immunomodulatory Effects of Host Defense Peptides

    DTIC Science & Technology

    2014-12-01

    significance was determined by t-tests or by one-way analysis of variance (ANOVA) followed by Bonferroni post hoc tests in experiments with multiple...groups. Non-parametric Mann-Whitney tests, Kruskal-Wallis ANOVA followed by Newman-Keuls post hoc tests, or van Elteren's two-way tests were applied to...in D, and black symbols in A), statistical analysis was by one-way ANOVA followed by Bonferroni versus control post hoc tests. Otherwise, statistical

  4. A user oriented computer program for the analysis of microwave mixers, and a study of the effects of the series inductance and diode capacitance on the performance of some simple mixers

    NASA Technical Reports Server (NTRS)

    Siegel, P. H.; Kerr, A. R.

    1979-01-01

    A user oriented computer program for analyzing microwave and millimeter wave mixers with a single Schottky barrier diode of known I-V and C-V characteristics is described. The program first performs a nonlinear analysis to determine the diode conductance and capacitance waveforms produced by the local oscillator. A small signal linear analysis is then used to find the conversion loss, port impedances, and input noise temperature of the mixer. Thermal noise from the series resistance of the diode and shot noise from the periodically pumped current in the diode conductance are considered. The effects of the series inductance and diode capacitance on the performance of some simple mixer circuits using a conventional Schottky diode, a Schottky diode in which there is no capacitance variation, and a Mott diode are studied. It is shown that the parametric effects of the voltage dependent capacitance of a conventional Schottky diode may be either detrimental or beneficial depending on the diode and circuit parameters.

  5. Disentangling the stochastic behavior of complex time series

    NASA Astrophysics Data System (ADS)

    Anvari, Mehrnaz; Tabar, M. Reza Rahimi; Peinke, Joachim; Lehnertz, Klaus

    2016-10-01

    Complex systems involving a large number of degrees of freedom generally exhibit non-stationary dynamics, which can result in either continuous or discontinuous sample paths of the corresponding time series. The latter sample paths may be caused by discontinuous events - or jumps - with some distributed amplitudes, and disentangling effects caused by such jumps from effects caused by normal diffusion processes is a main problem for a detailed understanding of the stochastic dynamics of complex systems. Here we introduce a non-parametric method to address this general problem. By means of a stochastic dynamical jump-diffusion modelling, we separate deterministic drift terms from different stochastic behaviors, namely diffusive and jumpy ones, and show that all of the unknown functions and coefficients of this modelling can be derived directly from measured time series. We demonstrate the applicability of our method to empirical observations by a data-driven inference of the deterministic drift term and of the diffusive and jumpy behavior in brain dynamics from ten epilepsy patients. In particular, these different stochastic behaviors provide extra information that can be regarded as valuable for diagnostic purposes.
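
    A minimal data-driven sketch of the conditional-moment (Kramers-Moyal) estimation that underlies such methods, applied to a simulated Ornstein-Uhlenbeck process; the jump part of the published modelling is omitted here for brevity.

```python
# Conditional-moment (Kramers-Moyal) estimates of drift and diffusion from a
# simulated Ornstein-Uhlenbeck path dx = -x dt + sqrt(2) dW (no jumps here).
import numpy as np

dt, n = 1e-3, 200_000
rng = np.random.default_rng(6)
x = np.empty(n); x[0] = 0.0
noise = rng.standard_normal(n - 1) * np.sqrt(2 * dt)
for k in range(n - 1):
    x[k + 1] = x[k] - x[k] * dt + noise[k]

bins = np.linspace(-2, 2, 21)
idx = np.digitize(x[:-1], bins)
dx = np.diff(x)
for b in range(6, 16):                      # central, well-populated bins
    sel = idx == b
    if sel.sum() < 200:
        continue
    xc = 0.5 * (bins[b - 1] + bins[b])
    drift = dx[sel].mean() / dt                 # D1(x) ~ <dx | x> / dt
    diff2 = (dx[sel] ** 2).mean() / (2 * dt)    # D2(x) ~ <dx^2 | x> / (2 dt)
    print(f"x={xc:+.2f}: drift={drift:+.2f} (true {-xc:+.2f}), D2={diff2:.2f} (true 1.00)")
```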

  6. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons.

    PubMed

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D

    In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.

  7. Assessing the performance of eight real-time updating models and procedures for the Brosna River

    NASA Astrophysics Data System (ADS)

    Goswami, M.; O'Connor, K. M.; Bhattarai, K. P.; Shamseldin, A. Y.

    2005-10-01

    The flow forecasting performance of eight updating models, incorporated in the Galway River Flow Modelling and Forecasting System (GFMFS), was assessed using daily data (rainfall, evaporation and discharge) of the Irish Brosna catchment (1207 km2), considering their one- to six-day lead-time discharge forecasts. The Perfect Forecast of Input over the Forecast Lead-time scenario was adopted, where required, in place of actual rainfall forecasts. The eight updating models were: (i) the standard linear Auto-Regressive (AR) model, applied to the forecast errors (residuals) of a simulation (non-updating) rainfall-runoff model; (ii) the Neural Network Updating (NNU) model, also using such residuals as input; (iii) the Linear Transfer Function (LTF) model, applied to the simulated and the recently observed discharges; (iv) the Non-linear Auto-Regressive eXogenous-Input Model (NARXM), also a neural network-type structure, but having wide options of using recently observed values of one or more of the three data series, together with non-updated simulated outflows, as inputs; (v) the Parametric Simple Linear Model (PSLM), of LTF-type, using recent rainfall and observed discharge data; (vi) the Parametric Linear Perturbation Model (PLPM), also of LTF-type, using recent rainfall and observed discharge data; (vii) n-AR, an AR model applied to the observed discharge series only, as a naïve updating model; and (viii) n-NARXM, a naïve form of the NARXM, using only the observed discharge data, excluding exogenous inputs. The five GFMFS simulation (non-updating) models used were the non-parametric and parametric forms of the Simple Linear Model and of the Linear Perturbation Model, the Linearly-Varying Gain Factor Model, the Artificial Neural Network Model, and the conceptual Soil Moisture Accounting and Routing (SMAR) model. As the SMAR model performance was found to be the best among these models, in terms of the Nash-Sutcliffe R2 value, both in calibration and in verification, the simulated outflows of this model alone were selected for the subsequent exercise of producing updated discharge forecasts. All eight forms of updating models for producing lead-time discharge forecasts were found capable of producing relatively good lead-1 (1-day-ahead) forecasts, with R2 values of almost 90% or above. However, for longer lead times, only three updating models, viz., NARXM, LTF, and NNU, were found to be suitable, with lead-6 values of R2 about 90% or higher. Graphical comparisons were made of the lead-time forecasts for the two largest floods, one in the calibration period and the other in the verification period.
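
    As an example of the simplest scheme in the suite, the AR error-updating idea, the sketch below fits an AR model to simulation residuals and propagates it over the lead time to correct the raw forecast; the synthetic series and AR order are assumptions, not the GFMFS implementation.

```python
# AR error updating: correct the raw simulation with an AR model of residuals.
import numpy as np

rng = np.random.default_rng(7)
n = 500
obs = 10 + 0.1 * np.cumsum(rng.normal(0, 0.5, n)) + rng.normal(0, 0.3, n)
sim = obs + np.convolve(rng.normal(0, 0.4, n), np.ones(5) / 5, "same")  # model output

e = obs - sim                              # simulation-mode residuals
p = 2                                      # AR order (assumed)
Phi = np.column_stack([e[p - i - 1:-i - 1] for i in range(p)])
a, *_ = np.linalg.lstsq(Phi, e[p:], rcond=None)

def updated_forecast(t, lead):
    """Propagate the AR error model from time t to correct sim[t+lead]."""
    hist = list(e[t - p + 1:t + 1][::-1])      # [e_t, e_{t-1}, ...]
    for _ in range(lead):
        hist.insert(0, float(np.dot(a, hist[:p])))
    return sim[t + lead] + hist[0]

t0 = 400
for lead in (1, 3, 6):
    raw = obs[t0 + lead] - sim[t0 + lead]
    upd = obs[t0 + lead] - updated_forecast(t0, lead)
    print(f"lead {lead}: raw error {raw:+.3f}, updated error {upd:+.3f}")
```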

  8. Hamiltonian Systems and Optimal Control in Computational Anatomy: 100 Years Since D'Arcy Thompson.

    PubMed

    Miller, Michael I; Trouvé, Alain; Younes, Laurent

    2015-01-01

    The Computational Anatomy project is the morphome-scale study of shape and form, which we model as an orbit under diffeomorphic group action. Metric comparison calculates the geodesic length of the diffeomorphic flow connecting one form to another. Geodesic connection provides a positioning system for coordinatizing the forms and positioning their associated functional information. This article reviews progress since the Euler-Lagrange characterization of the geodesics a decade ago. Geodesic positioning is posed as a series of problems in Hamiltonian control, which emphasize the key reduction from the Eulerian momentum with dimension of the flow of the group, to the parametric coordinates appropriate to the dimension of the submanifolds being positioned. The Hamiltonian viewpoint provides important extensions of the core setting to new, object-informed positioning systems. Several submanifold mapping problems are discussed as they apply to metamorphosis, multiple shape spaces, and longitudinal time series studies of growth and atrophy via shape splines.

  9. Empirical estimation of a distribution function with truncated and doubly interval-censored data and its application to AIDS studies.

    PubMed

    Sun, J

    1995-09-01

    In this paper we discuss the non-parametric estimation of a distribution function based on incomplete data for which the measurement origin of a survival time or the date of enrollment in a study is known only to belong to an interval. Also, the survival time of interest itself is observed from a truncated distribution and is known only to lie in an interval. To estimate the distribution function, a simple self-consistency algorithm, a generalization of Turnbull's (1976, Journal of the Royal Statistical Society, Series B 38, 290-295) self-consistency algorithm, is proposed. This method is then used to analyze two AIDS cohort studies, for which direct use of the EM algorithm (Dempster, Laird and Rubin, 1977, Journal of the Royal Statistical Society, Series B 39, 1-38), which is computationally complicated, has previously been the usual method of analysis.
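
    A compact sketch of a Turnbull-type self-consistency (EM) iteration for interval-censored observations, without the truncation handling discussed in the paper; the intervals below are hypothetical.

```python
# Turnbull-type self-consistency for interval-censored event times.
import numpy as np

intervals = np.array([[1, 3], [2, 2], [2, 5], [4, 6], [5, 5], [3, 6]], float)

support = np.unique(intervals)             # candidate mass points
A = (intervals[:, :1] <= support) & (support <= intervals[:, 1:])  # alpha_ij

p = np.full(support.size, 1.0 / support.size)
for _ in range(1000):
    w = A * p
    w /= w.sum(axis=1, keepdims=True)      # each interval's mass shares
    p_new = w.mean(axis=0)                 # self-consistency update
    if np.max(np.abs(p_new - p)) < 1e-10:
        break
    p = p_new

S = 1 - np.cumsum(p)                       # estimated survival function
for t_pt, mass, surv in zip(support, p, S):
    print(f"t={t_pt:.0f}: mass={mass:.3f}, S(t)={surv:.3f}")
```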

  10. Space shuttle: Stability and control effectiveness of the MDAC parametric delta canard booster at Mach 0.38. Volume 1: Canard parametric variations

    NASA Technical Reports Server (NTRS)

    Bradley, D.; Buchholz, R. E.

    1971-01-01

    A 0.015-scale model of a modified version of the MDAC space shuttle booster was tested in the Naval Ship Research and Development Center 7 x 10 foot transonic wind tunnel to obtain force, static stability, and control effectiveness data. Data were obtained for a cruise Mach number of 0.38, an altitude of 10,000 ft, and a Reynolds number per foot of approximately 2 x 10^6. The model was tested through an angle-of-attack range of -4 deg to 15 deg at zero degrees angle of sideslip, and through an angle-of-sideslip range of -6 deg to 6 deg at fixed angles of attack of 0 deg, 6 deg, and 15 deg. Other test variables were elevon deflections, canard deflections, aileron deflections, rudder deflections, wing dihedral angle, canard incidence angle, wing incidence angle, canard position, wing position, wing and canard control flap size, and dorsal fin size.

  11. On the efficacy of procedures to normalize Ex-Gaussian distributions

    PubMed Central

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2015-01-01

    Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed. Hence, it is acknowledged by many that the normality assumption is not met. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier-elimination and transformation procedures were tested against the level of normality they provide. The results suggest that transformation methods are better than elimination methods at normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are at normalizing it. Specifically, transformation with parameter lambda = -1 leads to the best results. PMID:25709588
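
    A quick simulation in the spirit of the study: sample Ex-Gaussian 'reaction times' (normal plus exponential), apply the lambda = -1 (reciprocal) transformation, and compare Shapiro-Wilk normality before and after; the distribution parameters are assumptions.

```python
# Ex-Gaussian sample and the reciprocal (lambda = -1) transformation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
rt = rng.normal(400, 40, 500) + rng.exponential(150, 500)   # skewed "RTs" (ms)

w_raw, p_raw = stats.shapiro(rt)
w_inv, p_inv = stats.shapiro(1.0 / rt)       # lambda = -1 transform

print(f"raw RT: W={w_raw:.3f}, p={p_raw:.2e}")
print(f"1/RT:   W={w_inv:.3f}, p={p_inv:.2e}  (W closer to 1 = more normal)")
```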

  12. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    ERIC Educational Resources Information Center

    Finch, Holmes; Monahan, Patrick

    2008-01-01

    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…

  13. Five-Point Likert Items: t Test versus Mann-Whitney-Wilcoxon

    ERIC Educational Resources Information Center

    de Winter, Joost C. F.; Dodou, Dimitra

    2010-01-01

    Likert questionnaires are widely used in survey research, but it is unclear whether the item data should be investigated by means of parametric or nonparametric procedures. This study compared the Type I and II error rates of the "t" test versus the Mann-Whitney-Wilcoxon (MWW) for five-point Likert items. Fourteen population…
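
    A miniature version of such a comparison: simulate two samples from one hypothetical five-point Likert population (so the null hypothesis is true) and tally the empirical Type I error of the t test against Mann-Whitney-Wilcoxon. The population probabilities and sample size are assumptions.

```python
# Type I error of t test vs. Mann-Whitney-Wilcoxon on five-point Likert items.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
probs = [0.10, 0.20, 0.30, 0.25, 0.15]     # hypothetical Likert population
reps, n, alpha = 5000, 30, 0.05
rej_t = rej_mww = 0
for _ in range(reps):
    a = rng.choice(5, size=n, p=probs) + 1
    b = rng.choice(5, size=n, p=probs) + 1   # same population: H0 true
    rej_t += stats.ttest_ind(a, b).pvalue < alpha
    rej_mww += stats.mannwhitneyu(a, b, alternative="two-sided").pvalue < alpha
print(f"Type I error: t test {rej_t / reps:.3f}, MWW {rej_mww / reps:.3f}")
```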

  14. Bootstrapping in Applied Linguistics: Assessing Its Potential Using Shared Data

    ERIC Educational Resources Information Center

    Plonsky, Luke; Egbert, Jesse; Laflair, Geoffrey T.

    2015-01-01

    Parametric analyses such as t tests and ANOVAs are the norm--if not the default--statistical tests found in quantitative applied linguistics research (Gass 2009). Applied statisticians and one applied linguist (Larson-Hall 2010, 2012; Larson-Hall and Herrington 2010), however, have argued that this approach may not be appropriate for small samples…

  15. PILOT-SCALE PARAMETRIC TESTING OF SPRAY DRYER SO2 SCRUBBER FOR LOW-TO-MODERATE SULFUR COAL UTILITY APPLICATIONS

    EPA Science Inventory

    The report gives results of a comprehensive, pilot, dry, SO2 scrubbing test program to determine the effects of process variables on SO2 removal. In the spray dryer, stoichiometric ratio, flue gas temperature approach to adiabatic saturation, and temperature drop across the spray...

  16. Assessment of Adolescent Perceptions on Parental Attitudes on Different Variables

    ERIC Educational Resources Information Center

    Ersoy, Evren

    2015-01-01

    The purpose of this study is to examine secondary school student perceptions of parental attitudes with regards to specific variables. Independent samples t test for parametric distributions and one-way variance analysis (ANOVA) was used for analyzing the data, when the ANOVA analyses were significant Scheffe test was conducted on homogeneous…

  17. Evaluation of model-based versus non-parametric monaural noise-reduction approaches for hearing aids.

    PubMed

    Harlander, Niklas; Rosenkranz, Tobias; Hohmann, Volker

    2012-08-01

    Single-channel noise reduction has been well investigated and seems to have reached its limits in terms of speech intelligibility improvement; however, the quality of such schemes can still be advanced. This study tests to what extent novel model-based processing schemes might improve performance, in particular for non-stationary noise conditions. Two prototype model-based algorithms, one speech-model-based and one auditory-model-based, were compared to a state-of-the-art non-parametric minimum statistics algorithm. A speech intelligibility test, preference rating, and listening effort scaling were performed. Additionally, three objective quality measures for the signal, background, and overall distortions were applied. For a better comparison of all algorithms, particular attention was given to the usage of a similar Wiener-based gain rule. The perceptual investigation was performed with fourteen hearing-impaired subjects. The results revealed that the non-parametric algorithm and the auditory-model-based algorithm did not affect speech intelligibility, whereas the speech-model-based algorithm slightly decreased intelligibility. In terms of subjective quality, both model-based algorithms performed better than the unprocessed condition and the reference, in particular for highly non-stationary noise environments. The data support the hypothesis that model-based algorithms are promising for improving performance in non-stationary noise conditions.

  18. Evaluation of Second-Level Inference in fMRI Analysis

    PubMed Central

    Roels, Sanne P.; Loeys, Tom; Moerkerke, Beatrijs

    2016-01-01

    We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of 3 phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subjects) variability and models that do not take this variability into account. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate 3 commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with minimal cluster size. Based on a simulation study and real data, we find that the two-step procedure with minimal cluster size yields the most stable results, followed by the familywise error rate correction. The FDR correction yields the most variable results, for both permutation-based and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference. PMID:26819578
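
    As one concrete piece of the comparison, the sketch below implements the Benjamini-Hochberg FDR step-up procedure on a vector of p-values; the mixture of null and 'active' p-values is simulated and purely illustrative.

```python
# Benjamini-Hochberg step-up FDR procedure on a vector of p-values.
import numpy as np

def bh_fdr(pvals, q=0.05):
    """Boolean mask of discoveries under BH at level q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = p.size
    below = p[order] <= q * np.arange(1, m + 1) / m
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    mask = np.zeros(m, bool)
    mask[order[:k]] = True
    return mask

rng = np.random.default_rng(10)
p = np.concatenate([rng.uniform(0, 1, 900),      # null "voxels"
                    rng.beta(0.5, 15, 100)])     # simulated "active" voxels
print(f"discoveries: {bh_fdr(p).sum()} of 1000")
```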

  19. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

    A novel technique has been developed for anomaly detection in rocket engine test stand (RETS) data. The objective was to develop a system that postprocesses a CSV file containing the sensor readings and activities (time series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective and, in general, met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploits the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data are a mixture of multivariate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation-Maximization (EM) algorithm. The parameters thus learned are used for calculating the joint distribution of the observations. However, this GMM assumption is essentially an approximation and signals the potential viability of non-parametric density estimators. This is the key idea underlying the new approach.
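
    A minimal sketch of the GMM baseline described above: fit a Gaussian mixture to nominal multi-sensor data with EM (via scikit-learn) and flag low-likelihood test samples as anomalies; the two-regime data and threshold quantile are assumptions, not RETS data.

```python
# GMM anomaly scoring: fit on nominal data, flag low-likelihood test samples.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(11)
nominal = np.vstack([rng.normal([0, 0], 0.5, (500, 2)),
                     rng.normal([3, 3], 0.5, (500, 2))])   # two operating regimes
gmm = GaussianMixture(n_components=2, random_state=0).fit(nominal)

test = np.vstack([rng.normal([0, 0], 0.5, (5, 2)),
                  [[1.5, -2.0]]])                          # last row: anomaly
scores = gmm.score_samples(test)                           # log-likelihoods
thresh = np.quantile(gmm.score_samples(nominal), 0.01)     # 1% nominal tail
for s in scores:
    print(f"log-likelihood = {s:7.2f}, anomaly = {s < thresh}")
```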

  20. Comparison of Salmonella enteritidis phage types isolated from layers and humans in Belgium in 2005.

    PubMed

    Welby, Sarah; Imberechts, Hein; Riocreux, Flavien; Bertrand, Sophie; Dierick, Katelijne; Wildemauwe, Christa; Hooyberghs, Jozef; Van der Stede, Yves

    2011-08-01

    The aim of this study was to investigate the available results for Belgium of the European Union coordinated monitoring program (2004/665 EC) on Salmonella in layers in 2005, as well as the results of the monthly outbreak reports of Salmonella Enteritidis in humans in 2005, to identify possible statistically significant trends in both populations. Separate descriptive statistics and univariate analyses were carried out, and parametric and/or non-parametric hypothesis tests were conducted. A time cluster analysis was performed for all Salmonella Enteritidis phage types (PTs) isolated. The proportions of each Salmonella Enteritidis PT in layers and in humans were compared, and the monthly distribution of the most common PT isolated in both populations was evaluated. The time cluster analysis revealed significant clusters during the months of May and June for layers and May, July, August, and September for humans. PT21, the most frequently isolated PT in both populations in 2005, seemed to be responsible for these significant clusters. PT4 was the second most frequently isolated PT. No significant difference was found in the monthly trend evolution of either PT in either population based on parametric and non-parametric methods. A similar monthly trend of PT distribution in humans and layers during the year 2005 was observed. The time cluster analysis and the statistical significance testing confirmed these results. Moreover, the time cluster analysis showed significant clusters during the summer, slightly delayed in time (humans after layers). These results suggest a common link between the prevalence of Salmonella Enteritidis in layers and the occurrence of the pathogen in humans. Phage typing was confirmed to be a useful tool for identifying temporal trends.

  1. Covariate analysis of bivariate survival data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators, which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values, where the expected values are determined from a specified parametric distribution. Model estimation is based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey were analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models were compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.
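
    The core imputation step of the parametric variant can be sketched in isolation: a right-censored observation is replaced by its conditional expectation under an assumed distribution. The sketch below uses a normal model and the inverse Mills ratio; the bivariate conditional expectations and the semiparametric Burke-based estimator are beyond this fragment, and all values are illustrative.

    ```python
    from scipy.stats import norm

    def impute_right_censored(c, mu, sigma):
        """Parametric Buckley-James-style step: replace a right-censored
        value c by E[Y | Y > c] under Y ~ Normal(mu, sigma)."""
        a = (c - mu) / sigma
        # Mean of a normal truncated below at c (inverse Mills ratio).
        return mu + sigma * norm.pdf(a) / norm.sf(a)

    print(impute_right_censored(1.0, 0.0, 1.0))  # ~1.525
    ```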

  2. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and computation of related confidence limits. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation or interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
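
    The standard trick for applying K-M to left-censored concentrations is to flip the data about a constant larger than the maximum, so "below detection limit" becomes right-censoring. The sketch below uses the Python lifelines package as an analogue of the authors' S-language routines; the data are hypothetical.

    ```python
    import numpy as np
    from lifelines import KaplanMeierFitter

    # Left-censored concentrations: value and a flag meaning "< detection limit".
    conc = np.array([0.5, 0.5, 1.0, 1.2, 2.0, 2.0, 3.1, 4.8])
    censored = np.array([1, 1, 0, 0, 1, 0, 0, 0], dtype=bool)  # 1 = "<DL"

    # Flipping trick: subtract from a constant larger than the maximum so
    # left-censored values become right-censored, then apply standard K-M.
    M = conc.max() + 1.0
    kmf = KaplanMeierFitter()
    kmf.fit(durations=M - conc, event_observed=~censored)

    # Flip back: the survival estimate at M - x gives P(concentration <= x).
    print(kmf.survival_function_)
    ```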

  3. Parametrization of Drag and Turbulence for Urban Neighbourhoods with Trees

    NASA Astrophysics Data System (ADS)

    Krayenhoff, E. S.; Santiago, J.-L.; Martilli, A.; Christen, A.; Oke, T. R.

    2015-08-01

    Urban canopy parametrizations designed to be coupled with mesoscale models must predict the integrated effect of urban obstacles on the flow at each height in the canopy. To assess these neighbourhood-scale effects, results of microscale simulations may be horizontally averaged. Obstacle-resolving computational fluid dynamics (CFD) simulations of neutrally-stratified flow through canopies of blocks (buildings) with varying distributions and densities of porous media (tree foliage) are conducted, and the spatially-averaged impacts on the flow of these building-tree combinations are assessed. The accuracy with which a one-dimensional (column) model with a one-equation (k-ℓ) turbulence scheme represents spatially-averaged CFD results is evaluated. Individual physical mechanisms by which trees and buildings affect flow in the column model are evaluated in terms of relative importance. For the treed urban configurations considered, the effects of buildings and trees may be considered independently. Building drag coefficients and length-scale effects need not be altered due to the presence of tree foliage; therefore, parametrization of spatially-averaged flow through urban neighbourhoods with trees is greatly simplified. The new parametrization includes only source and sink terms significant for the prediction of spatially-averaged flow profiles: momentum drag due to buildings and trees (and the associated wake production of turbulent kinetic energy), modification of length scales by buildings, and enhanced dissipation of turbulent kinetic energy due to the small scale of tree foliage elements. Coefficients for the Santiago and Martilli (Boundary-Layer Meteorol 137: 417-439, 2010) parametrization of building drag coefficients and length scales are revised. Inclusion of foliage terms from the new parametrization in addition to the Santiago and Martilli building terms reduces the root-mean-square difference (RMSD) of the column-model streamwise velocity component and turbulent kinetic energy relative to the CFD model by 89 % in the canopy and 71 % above the canopy on average for the highest leaf area density scenarios tested. RMSD values with the new parametrization are less than 20 % of mean layer magnitude for the streamwise velocity component within and above the canopy, and for above-canopy turbulent kinetic energy; RMSD values for within-canopy turbulent kinetic energy are negligible for most scenarios. The foliage-related portion of the new parametrization is required for scenarios with tree foliage of equal or greater height than the buildings, and for scenarios with foliage below roof height for building plan area densities less than approximately 0.25.
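
    For orientation, the source and sink terms listed above generically take the following form in a k-ℓ column model; the coefficients c_{d,b}, c_{d,t} and the dissipation closure shown here are illustrative of the canopy-flow literature, not the revised values derived in the paper.

    ```latex
    % Generic canopy source/sink terms in a 1-D column model (illustrative):
    \frac{\partial \langle u \rangle}{\partial t} = \cdots
      \; - \; c_{d,b}\,\lambda_f(z)\,\lvert\langle u \rangle\rvert\,\langle u \rangle
      \; - \; c_{d,t}\,a(z)\,\lvert\langle u \rangle\rvert\,\langle u \rangle,
    \qquad
    \frac{\partial k}{\partial t} = \cdots
      \; + \; c_{d,t}\,a(z)\,\lvert\langle u \rangle\rvert^{3}
      \; - \; \varepsilon_{t},
    ```

    where λ_f(z) is the building sectional drag-area density, a(z) the leaf area density, the cubic term the wake production of turbulent kinetic energy by foliage, and ε_t the enhanced dissipation at foliage scales.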

  4. Near-Earth and near-Mars asteroids: Prognosis of pyroxene types

    NASA Technical Reports Server (NTRS)

    Shestopalov, D. I.; Golubeva, L. F.

    1991-01-01

    The work is based on the diagnostic signs of the ferrous absorption band at 505 nm and the color index (u-x) found for main-belt asteroids, together with a 6-parameter classification of light stone meteorites. The colorimetric data of light near-Earth and near-Mars asteroids from TRIAD and ECAS were analyzed. Composition fields of pyroxenes were obtained for these asteroids from the value of (u-x) and the position of the 505-nm ferrous absorption band within the pyroxene quadrilateral. Pyroxenes of the S-asteroids of the Apollo-Amor group which have spectral parameters similar to achondrites may be represented by the diopside series.

  5. Cardiovascular oscillations: in search of a nonlinear parametric model

    NASA Astrophysics Data System (ADS)

    Bandrivskyy, Andriy; Luchinsky, Dmitry; McClintock, Peter V.; Smelyanskiy, Vadim; Stefanovska, Aneta; Timucin, Dogan

    2003-05-01

    We suggest a fresh approach to the modeling of the human cardiovascular system. Taking advantage of a new Bayesian inference technique able to deal with stochastic nonlinear systems, we show that one can estimate parameters for models of the cardiovascular system directly from measured time series. We present preliminary results of inferring the parameters of a coupled-oscillator model from measured cardiovascular data, addressing cardiorespiratory interaction. We argue that the inference technique offers a very promising modeling tool, able to contribute significantly towards the solution of a long-standing challenge: the development of new diagnostic techniques based on noninvasive measurements.

  6. Advanced propulsion system concept for hybrid vehicles

    NASA Technical Reports Server (NTRS)

    Bhate, S.; Chen, H.; Dochat, G.

    1980-01-01

    A series hybrid system, utilizing a free-piston Stirling engine with a linear alternator, and a parallel hybrid system, incorporating a kinematic Stirling engine, are analyzed for various specified reference missions/vehicles ranging from a small two-passenger commuter vehicle to a van. Parametric studies for each configuration, detailed tradeoff studies to determine engine, battery and system definition, short-term energy storage evaluation, and detailed life-cycle cost studies were performed. Results indicate that selecting a parallel Stirling-engine/electric hybrid propulsion system can reduce petroleum consumption by 70 percent relative to present conventional vehicles.

  7. Supercomputer analysis of purine and pyrimidine metabolism leading to DNA synthesis.

    PubMed

    Heinmets, F

    1989-06-01

    A model system is established to analyze purine and pyrimidine metabolism leading to DNA synthesis. The principal aim is to explore the flow and regulation of terminal deoxynucleoside triphosphates (dNTPs) under various input and parametric conditions. A series of flow equations is established and subsequently converted to differential equations. These are programmed in Fortran and analyzed on a Cray X-MP/48 supercomputer. The pool concentrations are presented as a function of time under conditions in which various pertinent parameters of the system are modified. The system is formulated as 100 differential equations.
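
    The flow-equation-to-ODE construction can be sketched in miniature: each pool's rate of change is inflow minus first-order outflow, and the resulting system is integrated numerically. The two-pool model and rate constants below are hypothetical stand-ins for the paper's 100-equation system.

    ```python
    from scipy.integrate import solve_ivp

    # Influx and rate constants (illustrative values).
    k_in, k1, k2 = 1.0, 0.5, 0.3

    def pools(t, y):
        precursor, dntp = y
        return [k_in - k1 * precursor,        # precursor pool
                k1 * precursor - k2 * dntp]   # dNTP pool feeding DNA synthesis

    sol = solve_ivp(pools, (0.0, 20.0), [0.0, 0.0])
    print(sol.y[:, -1])   # near-steady-state pool concentrations
    ```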

  8. Non-Grey Radiation Modeling using Thermal Desktop/Sindaworks TFAWS06-1009

    NASA Technical Reports Server (NTRS)

    Anderson, Kevin R.; Paine, Chris

    2006-01-01

    This paper provides an overview of the non-grey radiation modeling capabilities of Cullimore and Ring's Thermal Desktop (Registered Trademark) Version 4.8 SindaWorks software. The non-grey radiation analysis theory implemented by SindaWorks and the methodology used by the software are outlined. Representative results from a parametric trade study of a radiation shield composed of a series of V-groove shaped deployable panels are used to illustrate the capabilities of the SindaWorks non-grey radiation thermal analysis software, using emissivities with temperature and wavelength dependency modeled via a Hagen-Rubens relationship.
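
    For reference, one common form of the Hagen-Rubens relation ties the spectral normal emissivity of a metal to its electrical resistivity, which is how a temperature and wavelength dependency arises; the coefficient below assumes SI units and may differ from the exact form the software uses.

    ```latex
    % Hagen-Rubens relation for spectral normal emissivity
    % (\rho_e in ohm-metres, \lambda in metres):
    \varepsilon_n(\lambda, T) \approx 0.365 \sqrt{\frac{\rho_e(T)}{\lambda}}
    ```

    Since resistivity grows with temperature, emissivity rises with temperature and falls with wavelength, which is the dependence such a trade study captures.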

  9. Optimal control of parametric oscillations of compressed flexible bars

    NASA Astrophysics Data System (ADS)

    Alesova, I. M.; Babadzanjanz, L. K.; Pototskaya, I. Yu.; Pupysheva, Yu. Yu.; Saakyan, A. T.

    2018-05-01

    In this paper, the problem of damping the oscillations of linear systems with piecewise-constant control is solved. The motion of the bar structure is reduced to the form described by Hill's differential equation using the Bubnov-Galerkin method. To calculate the switching moments of the one-sided control, the method of sequential linear programming is used. The elements of the fundamental matrix of Hill's equation are approximated by trigonometric series. Examples of the optimal control of the systems for various initial conditions and different numbers of control stages have been calculated. The corresponding phase trajectories and transient processes are presented.
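
    For readers unfamiliar with it, Hill's equation is a linear oscillator with periodic parametric forcing; the notation below is the standard textbook form, not necessarily the paper's.

    ```latex
    % Hill's equation: p(t + T) = p(t) is periodic; the Mathieu equation
    % is the special case p(t) = \cos t.
    \ddot{x}(t) + \bigl(\delta + \varepsilon\, p(t)\bigr)\, x(t) = 0
    ```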

  10. Dynamical simulation of electron transfer processes in self-assembled monolayers at metal surfaces using a density matrix approach.

    PubMed

    Prucker, V; Bockstedte, M; Thoss, M; Coto, P B

    2018-03-28

    A single-particle density matrix approach is introduced to simulate the dynamics of heterogeneous electron transfer (ET) processes at interfaces. The characterization of the systems is based on a model Hamiltonian parametrized by electronic structure calculations and a partitioning method. The method is applied to investigate ET in a series of nitrile-substituted (poly)(p-phenylene)thiolate self-assembled monolayers adsorbed at the Au(111) surface. The results show a significant dependence of the ET on the orbital symmetry of the donor state and on the molecular and electronic structure of the spacer.

  11. A Compilation of Hazard and Test Data for Pyrotechnic Compositions

    DTIC Science & Technology

    1980-10-01

    heated. These changes may be related to dehydration, decomposition, crystalline transition, melting, boiling, vaporization, polymerization, oxidation... [fragmented table of autoignition and decomposition temperatures (°C, mean ± spread) lost in extraction] ...of compatibility or classification. The following tests are included in the parametric tests: 1. Autoignition Temperature 2. Decomposition

  12. The comparison between science virtual and paper based test in measuring grade 7 students’ critical thinking

    NASA Astrophysics Data System (ADS)

    Dhitareka, P. H.; Firman, H.; Rusyati, L.

    2018-05-01

    This research compares a science virtual test and a paper-based test in measuring grade 7 students' critical thinking, considering Multiple Intelligences and gender. A quasi-experimental method with a within-subjects design was used to obtain the data. The population was all seventh-grade students in ten classes of one public secondary school in Bandung. A sample of 71 students from two randomly selected classes took part. The data were obtained through 28 questions on the topic of living things and environmental sustainability, constructed according to the eight critical thinking elements proposed by Inch, with the questions administered as both a science virtual test and a paper-based test. The data were analysed using a paired-samples t test where parametric assumptions held and the Wilcoxon signed-rank test where they did not. In the overall comparison, the p-value for the difference between the science virtual and paper-based test scores was 0.506, indicating no significant difference between the two formats. This result is further supported by the students' mean attitude score of 3.15 on a scale of 1 to 4, indicating positive attitudes towards the Science Virtual Test.
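
    The test-selection logic described above (paired t when the paired differences look normal, Wilcoxon signed-rank otherwise) can be sketched as follows; the scores are hypothetical and the normality check via Shapiro-Wilk is one reasonable choice, not necessarily the study's.

    ```python
    import numpy as np
    from scipy.stats import shapiro, ttest_rel, wilcoxon

    # Hypothetical paired scores for the same 71 students on the two modes.
    rng = np.random.default_rng(7)
    virtual = rng.normal(70, 10, size=71)
    paper = virtual + rng.normal(0, 5, size=71)

    diff = virtual - paper
    # Paired t test if the differences pass a normality check,
    # Wilcoxon signed-rank test otherwise.
    if shapiro(diff).pvalue > 0.05:
        stat, p = ttest_rel(virtual, paper)
    else:
        stat, p = wilcoxon(virtual, paper)
    print(p)   # p > 0.05 -> no significant difference between modes
    ```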

  13. Testing primates with joystick-based automated apparatus - Lessons from the Language Research Center's Computerized Test System

    NASA Technical Reports Server (NTRS)

    Washburn, David A.; Rumbaugh, Duane M.

    1992-01-01

    Nonhuman primates provide useful models for studying a variety of medical, biological, and behavioral topics. Four years of joystick-based automated testing of monkeys using the Language Research Center's Computerized Test System (LRC-CTS) are examined to derive hints and principles for comparable testing with other species - including humans. The results of multiple parametric studies are reviewed, and reliability data are presented to reveal the surprises and pitfalls associated with video-task testing of performance.

  14. Assessment of Impact of the Rheological Parameters Change on Sensitivity of the Asphalt Strain Based on the Test Results / Ocena Wpływu Zmiany Parametrów Reologicznych Na Wrażliwość Deformacji Mieszanek Mineralno - Asfaltowych Na Podstawie Wyników Badań

    NASA Astrophysics Data System (ADS)

    Kurpiel, Artur; Wysokowski, Adam

    2015-03-01

    The creep test under static loading, which allows the rheological properties of asphalt mixtures to be determined from the creep curve, is currently the most effective such test. The applied loads are non-destructive and allow observation of the course of the strain over time, including after unloading. The test can be carried out in compression, shear, tension and bending, as well as in a triaxial configuration, depending on the apparatus used to apply the prescribed stress scheme [1, 2, 3, 4, 5, 6]. Based on the creep test, parameters founded on different creep theories can be determined, including particularly valuable rheological parameters based on selected viscoelastic models [1]. The parameters of the viscoelastic models are reliable indexes of a mixture's resistance to deformation, and they can be used to forecast rut depth in the adopted rheological model [1]. This article shows the impact of different values of the rheological parameters of the analysed viscoelastic model on rut depth, as well as the impact of these parameters on the shape and course of the creep curve. The asphalt mixtures presented in this article are characterized by variable rheological parameters, so it is difficult to determine which parameter most affects the magnitude of the strain of a given mixture. The authors therefore attempt to analyse the change in the asphalt mixture strain value when one or two parameters of the particular rheological model, in this case the Bürgers model, are varied.
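
    For reference, the constant-stress creep strain of the four-parameter Bürgers (Burgers) model has the standard closed form below; the symbols follow the common convention (instantaneous spring E_1, free dashpot η_1, Kelvin-Voigt element E_2, η_2), not necessarily the paper's notation.

    ```latex
    % Creep strain of the Burgers model under a constant stress \sigma_0
    % applied at t = 0:
    \varepsilon(t) = \sigma_0 \left[ \frac{1}{E_1} + \frac{t}{\eta_1}
      + \frac{1}{E_2}\left(1 - e^{-E_2 t / \eta_2}\right) \right]
    ```

    The three terms correspond to the instantaneous elastic strain, the unbounded viscous flow that drives permanent rutting, and the delayed (recoverable) elastic response that shapes the early creep curve.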

  15. Testing of the Trim Tab Parametric Model in NASA Langley's Unitary Plan Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Murphy, Kelly J.; Watkins, Anthony N.; Korzun, Ashley M.; Edquist, Karl T.

    2013-01-01

    In support of NASA's Entry, Descent, and Landing technology development efforts, testing of Langley's Trim Tab Parametric Models was conducted in Test Section 2 of NASA Langley's Unitary Plan Wind Tunnel. The objectives of these tests were to generate quantitative aerodynamic data and qualitative surface pressure data for experimental and computational validation and aerodynamic database development. Six-component force-and-moment data were measured on 38 unique, blunt body trim tab configurations at Mach numbers of 2.5, 3.5, and 4.5, angles of attack from -4deg to +20deg, and angles of sideslip from 0deg to +8deg. Configuration parameters investigated in this study were forebody shape, tab area, tab cant angle, and tab aspect ratio. Pressure Sensitive Paint was used to provide qualitative surface pressure mapping for a subset of these flow and configuration variables. Over the range of parameters tested, the effects of varying tab area and tab cant angle were found to be much more significant than varying tab aspect ratio relative to key aerodynamic performance requirements. Qualitative surface pressure data supported the integrated aerodynamic data and provided information to aid in future analyses of localized phenomena for trim tab configurations.

  16. Selected Parametric Effects on Materials Flammability Limits

    NASA Technical Reports Server (NTRS)

    Hirsch, David B.; Juarez, Alfredo; Peyton, Gary J.; Harper, Susana A.; Olson, Sandra L.

    2011-01-01

    NASA-STD-(I)-6001B Test 1 is currently used to evaluate the flammability of materials intended for use in habitable environments of U.S. spacecraft. The method is a pass/fail upward flame propagation test conducted in the worst-case configuration, which is defined as the combination of a material's thickness, test pressure, oxygen concentration, and temperature that makes the material most flammable. Although simple parametric effects may be intuitive (such as increasing oxygen concentrations resulting in increased flammability), combinations of multi-parameter effects could be more complex. In addition, there are a variety of material configurations used in spacecraft. Such configurations could include, for example, exposed free edges where fire propagation may be different when compared to configurations commonly employed in standard testing. Studies involving combined oxygen concentration, pressure, and temperature effects on flammability limits have been conducted and are summarized in this paper. Additional effects on flammability limits of a material's thickness, mode of ignition, burn-length criteria, and exposed edges are presented. The information obtained will allow proper selection of ground flammability test conditions, support further studies comparing flammability in 1-g with microgravity and reduced gravity environments, and contribute to persuasive scientific cases for rigorous space system fire risk assessments.

  17. Multiple Hypothesis Testing for Experimental Gingivitis Based on Wilcoxon Signed Rank Statistics

    PubMed Central

    Preisser, John S.; Sen, Pranab K.; Offenbacher, Steven

    2011-01-01

    Dental research often involves repeated multivariate outcomes on a small number of subjects for which there is interest in identifying outcomes that exhibit change in their levels over time as well as to characterize the nature of that change. In particular, periodontal research often involves the analysis of molecular mediators of inflammation for which multivariate parametric methods are highly sensitive to outliers and deviations from Gaussian assumptions. In such settings, nonparametric methods may be favored over parametric ones. Additionally, there is a need for statistical methods that control an overall error rate for multiple hypothesis testing. We review univariate and multivariate nonparametric hypothesis tests and apply them to longitudinal data to assess changes over time in 31 biomarkers measured from the gingival crevicular fluid in 22 subjects whereby gingivitis was induced by temporarily withholding tooth brushing. To identify biomarkers that can be induced to change, multivariate Wilcoxon signed rank tests for a set of four summary measures based upon area under the curve are applied for each biomarker and compared to their univariate counterparts. Multiple hypothesis testing methods with choice of control of the false discovery rate or strong control of the family-wise error rate are examined. PMID:21984957
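
    The univariate per-biomarker step with family-level error control can be sketched as follows: one signed-rank test per biomarker, then adjustment across the family with either FDR or strong FWER control. The data are hypothetical, and this omits the multivariate four-measure version the paper develops.

    ```python
    import numpy as np
    from scipy.stats import wilcoxon
    from statsmodels.stats.multitest import multipletests

    # Hypothetical changes in each of 31 biomarkers for 22 subjects
    # (e.g., post-induction summary minus baseline).
    rng = np.random.default_rng(3)
    changes = rng.normal(0.2, 1.0, size=(22, 31))

    # One signed-rank test per biomarker, then adjust across the family.
    pvals = np.array([wilcoxon(changes[:, j]).pvalue for j in range(31)])
    reject_fdr, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
    reject_fwer, _, _, _ = multipletests(pvals, alpha=0.05, method="holm")
    print(reject_fdr.sum(), reject_fwer.sum())  # FDR is typically less strict
    ```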

  18. Shuttle payload vibroacoustic test plan evaluation. Free flyer payload applications and sortie payload parametric variations

    NASA Technical Reports Server (NTRS)

    Stahle, C. V.; Gongloff, H. R.

    1977-01-01

    A preliminary assessment of vibroacoustic test plan optimization for free-flyer STS payloads is presented, and the effects of the number of missions on alternate test plans for Spacelab sortie payloads are also examined. The component vibration failure probability and the number of components in the housekeeping subassemblies are provided. Decision models are used to evaluate the cost effectiveness of seven alternate test plans using protoflight hardware.

  19. Parametric analyses of summative scores may lead to conflicting inferences when comparing groups: A simulation study.

    PubMed

    Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S

    2015-04-01

    To investigate whether using a parametric statistic in comparing groups leads to different conclusions when using summative scores from rating scales compared with using their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in the change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined, using varying sample sizes, scale difficulties and person ability conditions. This simulation study revealed scaling artefacts that could arise from using summative scores rather than Rasch-based measures for determining the changes between groups. The group differences in the change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in the change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of the inference on group differences of summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which can allow valid parametric analyses of rating scale data.
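
    As a rough sketch of the simulation's comparison skeleton (hypothetical data; the Rasch conversion itself requires dedicated estimation software and is omitted here), a single Monte Carlo draw with a 1-way ANOVA on summative change scores looks like this:

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    # One hypothetical draw: 10 five-category Likert items summed into a
    # summative change score for each of two groups.
    rng = np.random.default_rng(11)
    group_a = rng.integers(0, 5, size=(100, 10)).sum(axis=1)
    group_b = group_a + rng.integers(0, 2, size=100)   # small ordinal shift

    # The study's comparison: run the same 1-way ANOVA on summative scores
    # and again on the corresponding Rasch-based measures, and contrast the
    # conclusions across many such draws.
    stat, p = f_oneway(group_a, group_b)
    print(stat, p)
    ```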

  20. A review of recent developments in parametric based acoustic emission techniques applied to concrete structures

    NASA Astrophysics Data System (ADS)

    Vidya Sagar, R.; Raghu Prasad, B. K.

    2012-03-01

    This article presents a review of recent developments in parametric based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers including various methods and models developed in AE testing of concrete structures. The aim is to provide an overview of the specific features of parametric based AE techniques of concrete structures carried out over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published and considerable attention has been given to those publications. Some recent studies such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams have also been discussed. The formation of fracture process zone and the AE energy released during the fracture process in concrete beam specimens have been summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parametric based AE techniques applied to concrete structures may be helpful to the concerned researchers and engineers to better understand the failure mechanism of concrete and evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.
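
    The b-value analysis mentioned above adapts the Gutenberg-Richter relation to AE amplitudes: the b-value is the (negated) slope of the log cumulative frequency of events versus amplitude. The sketch below uses a simple least-squares fit; binning conventions and improved variants (e.g., the Ib-value) differ between studies, and the amplitudes are hypothetical.

    ```python
    import numpy as np

    def ae_b_value(amplitudes_db):
        """b-value of AE amplitudes: slope of log10 N(>=A) versus A_dB/20,
        fitted by least squares (one common convention among several)."""
        a = np.sort(np.asarray(amplitudes_db))
        n_exceed = np.arange(a.size, 0, -1)      # N(>= A) for each amplitude
        slope, _ = np.polyfit(a / 20.0, np.log10(n_exceed), 1)
        return -slope

    # Hypothetical amplitudes (dB); falling b-values over a load history are
    # commonly read as a shift from micro- to macro-cracking.
    rng = np.random.default_rng(5)
    print(ae_b_value(rng.exponential(10.0, size=500) + 40.0))
    ```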
