Fuel cell on-site integrated energy system parametric analysis of a residential complex
NASA Technical Reports Server (NTRS)
Simons, S. N.
1977-01-01
A parametric energy-use analysis was performed for a large apartment complex served by a fuel cell on-site integrated energy system (OS/IES). The parameterized variables include operating characteristics for four phosphoric acid fuel cells, eight OS/IES energy recovery systems, and four climatic locations. Annual fuel consumption is presented for selected parametric combinations, and a breakeven economic analysis is presented for one combination. The results show that fuel cell electrical efficiency and system component choice have the greatest effect on annual fuel consumption; fuel cell thermal efficiency and geographic location have less of an effect.
NASA Technical Reports Server (NTRS)
Leininger, G.; Jutila, S.; King, J.; Muraco, W.; Hansell, J.; Lindeen, J.; Franckowiak, E.; Flaschner, A.
1975-01-01
Appendices are presented which include discussions of interest formulas, factors in regionalization, parametric modeling of discounted benefit-sacrifice streams, engineering economic calculations, and product innovation. For Volume 1, see .
NASA Technical Reports Server (NTRS)
1972-01-01
The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.
NASA Astrophysics Data System (ADS)
Alfieri, Luisa
2015-12-01
Power quality (PQ) disturbances are becoming an important issue in smart grids (SGs) due to the significant economic consequences they can have on sensitive loads. However, SGs include several distributed energy resources (DERs) that can be interconnected to the grid through static converters, which can lower PQ levels. Among DERs, wind turbines and photovoltaic systems are expected to be used extensively due to the forecast reduction in investment costs and other economic incentives. These systems can introduce significant time-varying voltage and current waveform distortions that require advanced spectral analysis methods. This paper provides an application of advanced parametric methods for assessing waveform distortions in SGs with dispersed generation. In particular, the standard International Electrotechnical Commission (IEC) method, some parametric methods (such as Prony and the Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT)), and some hybrid methods are critically compared on the basis of their accuracy and the computational effort required.
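As a rough illustration of the parametric class of estimators compared above, the following sketch applies Prony's method (a close relative of ESPRIT) to a synthetic distorted waveform; the signal, sampling rate, and model order are illustrative assumptions, not data from the paper.

```python
import numpy as np

def prony(x, p):
    """Estimate p damped complex exponentials from samples x (Prony's method)."""
    N = len(x)
    # Linear-prediction step: x[n] = -sum_k a[k] x[n-k] for n = p..N-1
    A = np.column_stack([x[p - k: N - k] for k in range(1, p + 1)])
    b = -x[p:N]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Poles are the roots of z^p + a1 z^(p-1) + ... + ap
    poles = np.roots(np.concatenate(([1.0], a)))
    # Amplitude step: least-squares fit of the Vandermonde system
    V = np.vander(poles, N, increasing=True).T          # shape (N, p)
    amps, *_ = np.linalg.lstsq(V, x, rcond=None)
    return poles, amps

# Synthetic distorted waveform: 50 Hz fundamental plus a decaying 250 Hz component
fs = 5000.0
t = np.arange(0, 0.2, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.3 * np.exp(-5 * t) * np.sin(2 * np.pi * 250 * t)

poles, amps = prony(x.astype(complex), p=4)
freqs = np.angle(poles) * fs / (2 * np.pi)     # component frequencies (Hz)
damps = np.log(np.abs(poles)) * fs             # damping factors (1/s)
print(np.round(sorted(abs(freqs)), 1))         # approximately [50, 50, 250, 250]
```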
Single-stage-to-orbit versus two-stage-to-orbit: A cost perspective
NASA Astrophysics Data System (ADS)
Hamaker, Joseph W.
1996-03-01
This paper considers the possible life-cycle costs of single-stage-to-orbit (SSTO) and two-stage-to-orbit (TSTO) reusable launch vehicles (RLVs). The analysis addresses the issue parametrically, showing that the preferred economic choice comes down to the relative complexity of the TSTO compared with the SSTO. The analysis defines the boundary complexity conditions at which the two configurations have equal life-cycle costs and, finally, makes a case for the economic preference of SSTO over TSTO.
Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik
2017-12-15
Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation study and a case study. The approaches were compared based on point estimates and distributions of time-to-event and health economic outcomes. To assess the impact of sample size on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
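A minimal sketch of the first (bootstrap) approach, assuming a Weibull time-to-event distribution fitted to hypothetical patient-level data; the sample size, random seed, and parameter values are illustrative only, not the study's data.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma

rng = np.random.default_rng(1)
times = 12.0 * rng.weibull(1.4, size=25)      # hypothetical patient-level time-to-event data

# Point estimate of the parametric distribution representing stochastic (first-order) uncertainty
shape_hat, _, scale_hat = stats.weibull_min.fit(times, floc=0)

# Parameter (second-order) uncertainty: non-parametric bootstrap of the fitted parameters
boot = []
for _ in range(2000):
    resample = rng.choice(times, size=times.size, replace=True)
    c, _, s = stats.weibull_min.fit(resample, floc=0)
    boot.append((c, s))
boot = np.array(boot)

# Propagate the parameter uncertainty to a model outcome (here the mean time-to-event)
mean_tte = boot[:, 1] * gamma(1 + 1 / boot[:, 0])
print("point estimate:", round(float(scale_hat * gamma(1 + 1 / shape_hat)), 2))
print("95% interval:  ", np.percentile(mean_tte, [2.5, 97.5]).round(2))
```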
Parametric sensitivity analysis of an agro-economic model of management of irrigation water
NASA Astrophysics Data System (ADS)
El Ouadi, Ihssan; Ouazar, Driss; El Menyari, Younesse
2015-04-01
The current work aims to build an analysis and decision-support tool for policy options concerning the optimal allocation of water resources, while allowing a better reflection on the issue of valuation of water by the agricultural sector in particular. A model disaggregated by farm type was developed for the rural town of Ait Ben Yacoub, located in eastern Morocco. This model integrates economic, agronomic and hydraulic data and simulates the agricultural gross margin in this area, taking into consideration changes in public policy and climatic conditions as well as the competition for collective resources. To identify the model input parameters that influence the results of the model, a parametric sensitivity analysis is performed with the "One-Factor-At-A-Time" approach within the "Screening Designs" method. Preliminary results of this analysis show that, among the 10 parameters analyzed, 6 significantly affect the objective function of the model; in order of influence these are: i) coefficient of crop yield response to water, ii) average daily weight gain of livestock, iii) rate of livestock reproduction, iv) maximum yield of crops, v) supply of irrigation water and vi) precipitation. These 6 parameters register sensitivity indexes ranging between 0.22 and 1.28. These results indicate high uncertainty in these parameters, which can dramatically skew the results of the model, and the need to pay particular attention to their estimates. Keywords: water, agriculture, modeling, optimal allocation, parametric sensitivity analysis, Screening Designs, One-Factor-At-A-Time, agricultural policy, climate change.
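The screening procedure itself is simple to reproduce. The sketch below applies a One-Factor-At-A-Time perturbation to a hypothetical stand-in objective function; the parameter names, values, and functional form are invented for illustration and are not the authors' agro-economic model.

```python
import numpy as np

def gross_margin(p):
    """Hypothetical stand-in for the agro-economic model's objective function (not the authors' model)."""
    return (p["yield_response"] * p["water_supply"]
            + 100.0 * p["livestock_gain"] * p["herd_growth"]
            - 0.5 * p["water_supply"])

base = {"yield_response": 1.2, "water_supply": 100.0,
        "livestock_gain": 0.8, "herd_growth": 1.1}

# One-Factor-At-A-Time screening: perturb each parameter by +10% with the others
# held at their base values, and report a normalised (elasticity-like) sensitivity index.
y0, delta = gross_margin(base), 0.10
for name in base:
    perturbed = dict(base, **{name: base[name] * (1 + delta)})
    index = abs(gross_margin(perturbed) - y0) / abs(y0) / delta
    print(f"{name:15s} sensitivity index = {index:.2f}")
```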
NASA Astrophysics Data System (ADS)
Wibowo, Wahyu; Wene, Chatrien; Budiantara, I. Nyoman; Permatasari, Erma Oktania
2017-03-01
Multiresponse semiparametric regression is a simultaneous-equation regression model that fuses parametric and nonparametric components. The regression model comprises several equations, each with a parametric and a nonparametric component. The model used here has a linear function as the parametric component and a truncated polynomial spline as the nonparametric component, so it can handle both linear and nonlinear relationships between the responses and the sets of predictor variables. The aim of this paper is to demonstrate the application of this regression model to modeling the effect of regional socio-economic factors on the use of information technology. Specifically, the response variables are the percentage of households with internet access and the percentage of households with a personal computer, and the predictor variables are the percentage of literate people, the percentage of electrification and the percentage of economic growth. Based on the identified relationships between the responses and the predictors, economic growth is treated as the nonparametric predictor and the others as parametric predictors. The results show that multiresponse semiparametric regression can be applied well here, as indicated by the high coefficient of determination of 90 percent.
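The truncated spline component described above can be illustrated with a truncated power basis fitted by least squares; the data, knots, degree, and variable roles below are synthetic assumptions rather than the study's survey data, and only a single response is shown for brevity.

```python
import numpy as np

def truncated_spline_basis(x, knots, degree=1):
    """Design matrix: polynomial terms plus truncated power terms (x - k)_+^degree."""
    cols = [x ** d for d in range(degree + 1)]                  # 1, x, ..., x^degree
    cols += [np.clip(x - k, 0, None) ** degree for k in knots]  # (x - k)_+^degree
    return np.column_stack(cols)

rng = np.random.default_rng(0)
growth = rng.uniform(0, 10, 200)                  # nonparametric predictor (economic growth)
literacy = rng.uniform(60, 100, 200)              # parametric predictor
internet = 0.3 * literacy + 2 * np.sin(growth) + rng.normal(0, 1, 200)

# Semiparametric design: linear part for literacy, truncated spline part for growth
X = np.column_stack([literacy, truncated_spline_basis(growth, knots=[2.5, 5.0, 7.5])])
beta, *_ = np.linalg.lstsq(X, internet, rcond=None)
fitted = X @ beta
r2 = 1 - np.sum((internet - fitted) ** 2) / np.sum((internet - internet.mean()) ** 2)
print(f"coefficient of determination: {r2:.2f}")
```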
An economic systems analysis of land mobile radio telephone services
NASA Technical Reports Server (NTRS)
Leroy, B. E.; Stevenson, S. M.
1980-01-01
The economic interaction of the terrestrial and satellite systems is considered. Parametric equations are formulated to allow examination of necessary user thresholds and growth rates as a function of system costs. Conversely, first order allowable systems costs are found as a function of user thresholds and growth rates. Transitions between satellite and terrestrial service systems are examined. User growth rate density (user/year/sq km) is shown to be a key parameter in the analysis of systems compatibility. The concept of system design matching the price/demand curves is introduced and examples are given. The role of satellite systems is critically examined and the economic conditions necessary for the introduction of satellite service are identified.
Analysis and assessment of STES technologies
NASA Astrophysics Data System (ADS)
Brown, D. R.; Blahnik, D. E.; Huber, H. D.
1982-12-01
Technical and economic assessments completed in FY 1982 in support of the Seasonal Thermal Energy Storage (STES) segment of the Underground Energy Storage Program included: (1) a detailed economic investigation of the cost of heat storage in aquifers, (2) documentation for AQUASTOR, a computer model for analyzing aquifer thermal energy storage (ATES) coupled with district heating or cooling, and (3) a technical and economic evaluation of several ice storage concepts. This paper summarizes the research efforts and main results of each of these three activities. In addition, a detailed economic investigation of the cost of chill storage in aquifers is currently in progress. The work parallels that done for ATES heat storage with technical and economic assumptions being varied in a parametric analysis of the cost of ATES delivered chill. The computer model AQUASTOR is the principal analytical tool being employed.
Parametric analysis of ATT configurations.
NASA Technical Reports Server (NTRS)
Lange, R. H.
1972-01-01
This paper describes the results of a Lockheed parametric analysis of the performance, environmental factors, and economics of an advanced commercial transport envisioned for operation in the post-1985 time period. The design parameters investigated include cruise speeds from Mach 0.85 to Mach 1.0, passenger capacities from 200 to 500, ranges of 2800 to 5500 nautical miles, and noise level criteria. NASA high performance configurations and alternate configurations are operated over domestic and international route structures. Indirect and direct costs and return on investment are determined for approximately 40 candidate aircraft configurations. The candidate configurations are input to an aircraft sizing and performance program which includes a subroutine for noise criteria. Comparisons are made between preferred configurations on the basis of maximum return on investment as a function of payload, range, and design cruise speed.
An economic systems analysis of land mobile radio telephone services
NASA Technical Reports Server (NTRS)
Leroy, B. E.; Stevenson, S. M.
1980-01-01
This paper deals with the economic interaction of the terrestrial and satellite land-mobile radio service systems. The cellular, trunked and satellite land-mobile systems are described. Parametric equations are formulated to allow examination of necessary user thresholds and growth rates as functions of system costs. Conversely, first order allowable systems costs are found as a function of user thresholds and growth rates. Transitions between satellite and terrestrial service systems are examined. User growth rate density (user/year/km squared) is shown to be a key parameter in the analysis of systems compatibility. The concept of system design matching the price demand curves is introduced and examples are given. The role of satellite systems is critically examined and the economic conditions necessary for the introduction of satellite service are identified.
NASA Technical Reports Server (NTRS)
Watters, H.; Steadman, J.
1976-01-01
A modular training approach for Spacelab payload crews is described. Representative missions are defined for training requirements analysis, training hardware, and simulations. Training times are projected for each experiment of each representative flight. A parametric analysis of the various flights defines resource requirements for a modular training facility at different flight frequencies. The modular approach is believed to be more flexible, time saving, and economical than previous single high fidelity trainer concepts. Block diagrams of training programs are shown.
Illiquidity premium and expected stock returns in the UK: A new approach
NASA Astrophysics Data System (ADS)
Chen, Jiaqi; Sherif, Mohamed
2016-09-01
This study examines the relative importance of liquidity risk for the time series and cross-section of stock returns in the UK. We propose a simple way to capture the multidimensionality of illiquidity. Our analysis indicates that existing illiquidity measures have considerable asset-specific components, which justifies our new approach. Further, we use an alternative test of the Amihud (2002) measure and both parametric and non-parametric methods to investigate whether liquidity risk is priced in the UK. We find that the inclusion of the illiquidity factor in the capital asset pricing model plays a significant role in explaining the cross-sectional variation in stock returns, in particular with the Fama-French three-factor model. Further, using Hansen-Jagannathan non-parametric bounds, we find that the illiquidity-augmented capital asset pricing models yield a small distance error, whereas other, non-liquidity-based models fail to yield economically plausible distance values. Our findings have important implications for managing the liquidity risk of equity portfolios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emery, A.F.; Heerwage, D.R.; Kippehan, C.J.
A parametric study has been conducted of passive heating devices that are to be used to provide environmental conditioning for a single-family house. This study has been performed using the thermal simulation computer program UWENSOL. Climatic data used in this analysis were for Yokohama, Japan, which has a subtropical humid climate similar to Washington, D.C. (in terms of winter air temperatures and useful radiation). Initial studies considered the use of different wall thicknesses, glazing types, and orientations for a Trombe wall and alternate storage quantities for a walk-in greenhouse. Employing a number of comparative parametric studies, an economical and efficient combination of devices was selected. Then, using the computer routine COMFORT, which is based on the Fanger Comfort Equation, another series of parametric analyses was performed to evaluate the degree of thermal comfort for the occupants of the house. The results of these analyses demonstrated that an averaged Predicted Mean Vote of less than 0.3 from a thermally-neutral condition could be maintained and that less than 10% of all occupants of such a passively-heated house would be thermally uncomfortable.
Bulsei, Julie; Darlington, Meryl; Durand-Zaleski, Isabelle; Azizi, Michel
2018-04-01
Whilst much uncertainty exists as to the efficacy of renal denervation (RDN), the positive results of the DENERHTN study in France confirmed the value of an economic evaluation to assess the efficiency of RDN and to inform local decision makers about the costs and benefits of this intervention. The uncertainty surrounding both the outcomes and the costs can be described using health economic methods such as the non-parametric bootstrap. Internationally, numerous health economic studies have used cost-effectiveness models to assess the impact of RDN in terms of cost and effectiveness compared with antihypertensive medical treatment. The DENERHTN cost-effectiveness study was the first health economic evaluation specifically designed to assess the cost-effectiveness of RDN using individual patient data. Using the DENERHTN results as an example, we provide here a summary of the principal methods used to perform a cost-effectiveness analysis.
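A minimal sketch of the non-parametric bootstrap mentioned above, applied to hypothetical per-patient cost and effect data; the arm sizes, costs, and QALY values are invented placeholders, not DENERHTN results.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical per-patient costs (EUR) and effects (QALYs) for the two trial arms
cost_rdn,  eff_rdn  = rng.normal(6000, 1500, 53), rng.normal(0.80, 0.15, 53)
cost_ctrl, eff_ctrl = rng.normal(2500, 1000, 53), rng.normal(0.74, 0.15, 53)

def boot_means(cost, eff):
    """Resample patients (cost-effect pairs together) with replacement and return arm means."""
    idx = rng.integers(0, cost.size, cost.size)
    return cost[idx].mean(), eff[idx].mean()

# Non-parametric bootstrap of the incremental cost and incremental effect
B = 5000
deltas = np.array([np.subtract(boot_means(cost_rdn, eff_rdn),
                               boot_means(cost_ctrl, eff_ctrl)) for _ in range(B)])
d_cost, d_eff = deltas[:, 0], deltas[:, 1]
print("mean incremental cost / effect:", round(float(d_cost.mean())), round(float(d_eff.mean()), 3))

# Cost-effectiveness acceptability curve: P(net monetary benefit > 0) at several willingness-to-pay values
for wtp in (10_000, 30_000, 50_000):
    print(f"WTP {wtp:>6} per QALY: P(cost-effective) = {np.mean(wtp * d_eff - d_cost > 0):.2f}")
```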
Technical and Economic Assessment of Span-Loaded Cargo Aircraft Concepts
NASA Technical Reports Server (NTRS)
1976-01-01
The benefits of span-distributed loading concepts as applied to future commercial air cargo operations are assessed. A two-phase program is used to perform this assessment. The first phase consists of selected parametric studies to define significant configuration, performance, and economic trends. The second phase consists of more detailed engineering design, analysis, and economic evaluations to define the technical and economic feasibility of a selected spanloader design. A conventional all-cargo aircraft of comparable technology and size is used as a comparator system. The technical feasibility of the spanloader concept is demonstrated, with no major new technology efforts required to implement the system. However, certain high-payoff technologies such as winglets, airfoil design, and advanced structural materials and manufacturing techniques need refinement and definition prior to application. In addition, further structural design analysis could establish the techniques and criteria necessary to fully capitalize upon the high degree of structural commonality and simplicity inherent in the spanloader concept.
Askin, Amanda Christine; Barter, Garrett; West, Todd H.; ...
2015-02-14
Here, we present a parametric analysis of factors that can influence advanced fuel and technology deployments in U.S. Class 7–8 trucks through 2050. The analysis focuses on the competition between traditional diesel trucks, natural gas vehicles (NGVs), and ultra-efficient powertrains. Underlying the study is a vehicle choice and stock model of the U.S. heavy-duty vehicle market, segmented by vehicle class, body type, powertrain, fleet size, and operational type. We find that conventional diesel trucks will dominate the market through 2050, but NGVs could have significant market penetration depending on key technological and economic uncertainties. Compressed natural gas trucks conducting urban trips in fleets that can support private infrastructure are economically viable now and will continue to gain market share. Ultra-efficient diesel trucks, exemplified by the U.S. Department of Energy's SuperTruck program, are the preferred alternative in the long-haul segment, but could compete with liquefied natural gas (LNG) trucks if the fuel price differential between LNG and diesel increases. However, the greatest reductions in petroleum consumption and pollutant emissions come from investing in efficiency technologies that benefit all powertrains, especially the conventional diesels that comprise the majority of the stock, rather than from incentivizing specific alternatives.
Feasibility study of modern airships, phase 1. Volume 3: Historical overview (task 1)
NASA Technical Reports Server (NTRS)
Faurote, G. L.
1975-01-01
The history of lighter-than-air vehicles is reviewed to provide background for the mission analysis and parametric analysis tasks. Data from past airships and airship operations are presented in the following areas: (1) parameterization of design characteristics; (2) markets, missions, costs, and operating procedures; (3) indices of efficiency for comparison; (4) identification of critical design and operational characteristics; and (5) definition of the 1930 state of the art and the 1974 state of the art from a technical and economic standpoint.
Study of aircraft in intraurban transportation systems
NASA Technical Reports Server (NTRS)
Stout, E. G.
1972-01-01
A systems analysis was conducted to define the technical, economic, and operational characteristics of an aircraft transportation system for short-range intracity commuter operations. The analysis was performed for 1975 and 1985 in the seven-county Detroit, Michigan, area. STOL and VTOL aircraft were studied in sizes from 40 to 120 passengers. The preferred vehicle for the Detroit area was the deflected-slipstream STOL. Since the study was parametric in nature, its results can be generalized, and it was concluded that a feasible intraurban air transportation system could be developed in many viable situations.
Improved heliostat field design for solar tower plants
NASA Astrophysics Data System (ADS)
Collado, Francisco J.; Guallar, Jesús
2017-06-01
In solar power tower (SPT) systems, selecting the optimum locations of thousands of heliostats and the most profitable tower height and receiver size remains a challenge. The Campo code is intended for the detailed design of such plants, in particular the optimum layout, provided that the plant size is known. Therefore, less exhaustive codes, such as DELSOL3, are also needed to perform the preliminary parametric analysis that narrows down the most economic size of the plant.
Research on AutoCAD secondary development and function expansion based on VBA technology
NASA Astrophysics Data System (ADS)
Zhang, Runmei; Gu, Yehuan
2017-06-01
AutoCAD is the most widely used drawing tool among similar design drawing products. Producing different types of design drawings of the same product involves a large amount of repetitive, monotonous work, and the traditional manual approach of drawing graphics in AutoCAD suffers from low efficiency, a high error rate, high input cost, and other shortcomings. To solve these problems, a parametric drawing system for hot-rolled I-beam (steel beam) cross-sections was developed using the VBA secondary development tool together with the Access database software for large-capacity data storage, and this paper analyzes the resulting functional extension of plane drawing and parametric drawing design. Through this secondary development of AutoCAD functions, the drawing work is simplified and work efficiency is greatly improved. Introducing parametric design into the AutoCAD drawing system supports the industrial mass production of standard hot-rolled I-beam products and economic growth in the related industries.
NASA Technical Reports Server (NTRS)
Stow, W. K.; Cheeseman, C.; Dallam, W.; Dietrich, D.; Dorfman, G.; Fleming, R.; Fries, R.; Guard, W.; Jackson, F.; Jankowski, H.
1975-01-01
Economic benefits studies regarding the application of remote sensing to resource management and the Total Earth Resources System for the Shuttle Era (TERSSE) study outlining the structure and development of future systems are used, along with experience from LANDSAT and LACIE, to define the system performance and economics of an operational Earth Resources system. The system is to be based on current (LANDSAT follow-on) technology and its application to high-priority resource management missions, such as global crop inventory. The TERSSE Operational System Study (TOSS) investigated system-level design alternatives using economic performance as the evaluation criterion. As such, the TOSS effort represented a significant step forward in the systems engineering and economic analysis of Earth Resources programs. By parametrically relating engineering design parameters, such as sensor performance details, to the economic benefit mechanisms, a new level of confidence can be reached in conclusions concerning the implementation of such systems.
Carbon accounting and economic model uncertainty of emissions from biofuels-induced land use change.
Plevin, Richard J; Beckman, Jayson; Golub, Alla A; Witcover, Julie; O'Hare, Michael
2015-03-03
Few of the numerous published studies of the emissions from biofuels-induced "indirect" land use change (ILUC) attempt to propagate and quantify uncertainty, and those that have done so have restricted their analysis to a portion of the modeling systems used. In this study, we pair a global, computable general equilibrium model with a model of greenhouse gas emissions from land-use change to quantify the parametric uncertainty in the paired modeling system's estimates of greenhouse gas emissions from ILUC induced by expanded production of three biofuels. We find that for the three fuel systems examined (US corn ethanol, Brazilian sugar cane ethanol, and US soybean biodiesel), 95% of the results occurred within ±20 g CO2e MJ⁻¹ of the mean (coefficient of variation of 20-45%), with economic model parameters related to crop yield and the productivity of newly converted cropland (from forestry and pasture) contributing most of the variance in estimated ILUC emissions intensity. Although the experiments performed here allow us to characterize parametric uncertainty, changes to the model structure have the potential to shift the mean by tens of grams of CO2e per megajoule and further broaden the distributions of ILUC emission intensities.
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1995-01-01
Parametric cost analysis is a mathematical approach to estimating cost. Parametric cost analysis uses non-cost parameters, such as quality characteristics, to estimate the cost to bring forth, sustain, and retire a product. This paper reviews parametric cost analysis and shows how it can be used within the cost deployment process.
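As a concrete, though hypothetical, illustration of a parametric cost estimating relationship of the kind described above, the sketch below fits a power-law relation linking a non-cost parameter (mass) to cost; the data points and the resulting coefficients are invented for illustration.

```python
import numpy as np

# Hypothetical historical projects: dry mass (kg) as the non-cost driver, cost in $M
mass = np.array([150, 320, 480, 900, 1500, 2200], dtype=float)
cost = np.array([ 45,  78, 110, 170,  260,  340], dtype=float)

# Power-law cost estimating relationship: cost = a * mass^b, fitted in log-log space
b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
a = np.exp(log_a)
print(f"CER: cost ~ {a:.1f} * mass^{b:.2f}")

# Use the CER to estimate the cost of a new 1,200 kg design
print(f"estimated cost of a 1200 kg design: ${a * 1200 ** b:.0f}M")
```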
Analysis of operational requirements for medium density air transportation, volume 2
NASA Technical Reports Server (NTRS)
1975-01-01
The medium density air travel market is examined and defined in terms of numbers of people transported per route per day and frequency of service. The operational characteristics for aircraft to serve this market are determined and a basepoint aircraft is designed from which tradeoff studies and parametric variations can be conducted. The impact of the operational characteristics on the air travel system is evaluated along with the economic viability of the study aircraft. Research and technology programs for future study consideration are identified.
Debt and growth: A non-parametric approach
NASA Astrophysics Data System (ADS)
Brida, Juan Gabriel; Gómez, David Matesanz; Seijas, Maria Nela
2017-11-01
In this study, we explore the dynamic relationship between public debt and economic growth by using a non-parametric approach based on data symbolization and clustering methods. The study uses annual data on the general government consolidated gross debt-to-GDP ratio and gross domestic product for sixteen countries between 1977 and 2015. Using symbolic sequences, we introduce a notion of distance between the dynamical paths of different countries. A Minimal Spanning Tree and a Hierarchical Tree are then constructed from the time series to help detect the existence of groups of countries sharing similar economic performance. The main finding of the study appears for the period 2008-2016, when several countries surpassed the 90% debt-to-GDP threshold. During this period, three groups (clubs) of countries are obtained: high-, mid- and low-indebted countries, suggesting that the employed debt-to-GDP threshold drives economic dynamics for the selected countries.
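A compact sketch of the pipeline described above: symbolize each debt path, define a distance between symbolic sequences, then build the Minimal Spanning Tree and a Hierarchical Tree. The simulated country paths, the extra 60% symbolization threshold, the Hamming-type distance, and the cluster count are stand-in assumptions, not the paper's data or exact definitions.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(3)
countries = [f"C{i}" for i in range(8)]

# Synthetic debt-to-GDP paths, 1977-2015, with country-specific starting levels and drifts
start = rng.uniform(30, 110, 8)
drift = rng.uniform(-1.0, 3.0, 8)
debt = start + np.cumsum(rng.normal(drift, 2.0, size=(39, 8)), axis=0)

# Symbolisation of each path: 0 below 60%, 1 between 60% and 90%, 2 above the 90% threshold
symbols = np.digitize(debt, [60.0, 90.0])

# Distance between two countries = fraction of years spent in different debt regimes
n = len(countries)
D = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        D[i, j] = np.mean(symbols[:, i] != symbols[:, j])

mst_edges = minimum_spanning_tree(D)                            # Minimal Spanning Tree over the distances
tree = linkage(squareform(D, checks=False), method="average")   # Hierarchical Tree
clubs = fcluster(tree, t=3, criterion="maxclust")               # three clubs: high, mid, low indebted
print(dict(zip(countries, clubs)))
```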
Exergy & economic analysis of biogas fueled solid oxide fuel cell systems
NASA Astrophysics Data System (ADS)
Siefert, Nicholas S.; Litster, Shawn
2014-12-01
We present an exergy and an economic analysis of a power plant that uses biogas produced from a thermophilic anaerobic digester (AD) to fuel a solid oxide fuel cell (SOFC). We performed a 4-variable parametric analysis of the AD-SOFC system in order to determine the optimal design operating conditions, depending on the objective function of interest. We present results for the exergy efficiency (%), the power-normalized capital cost ($ kW⁻¹), and the internal rate of return on investment, IRR (% yr⁻¹), as functions of the current density, the stack pressure, the fuel utilization, and the total air stoichiometric ratio. To the authors' knowledge, this is the first AD-SOFC paper to include the cost of the AD when conducting economic optimization of the AD-SOFC plant. Our calculations show that adding a new AD-SOFC system to an existing waste water treatment (WWT) plant could yield positive values of IRR at today's average electricity prices and could significantly out-compete other options for using biogas to generate electricity. AD-SOFC systems could likely convert WWT plants into net generators of electricity rather than net consumers of electricity, while generating economically viable rates of return on investment, if the costs of SOFC systems are within a factor of two of the DOE/SECA cost targets.
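The internal rate of return used as one of the objective functions above can be computed as the discount rate at which the net present value of the project cash flows is zero. The sketch below does this for a hypothetical capital outlay and revenue stream; the figures are placeholders, not values from the paper.

```python
import numpy as np
from scipy.optimize import brentq

def npv(rate, cashflows):
    """Net present value of annual cash flows, with cashflows[0] occurring at year 0."""
    years = np.arange(len(cashflows))
    return np.sum(np.asarray(cashflows) / (1 + rate) ** years)

def irr(cashflows):
    """Internal rate of return: the discount rate at which NPV = 0 (bracketed between ~0 and 100%/yr)."""
    return brentq(npv, 1e-6, 1.0, args=(cashflows,))

# Hypothetical AD-SOFC retrofit: capital outlay followed by 20 years of net electricity revenue
capital, annual_net_revenue, lifetime = 2.0e6, 0.25e6, 20
cashflows = [-capital] + [annual_net_revenue] * lifetime
print(f"IRR = {irr(cashflows):.1%} per year")
```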
Economic analysis and assessment of syngas production using a modeling approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hakkwan; Parajuli, Prem B.; Yu, Fei
Economic analysis and modeling are essential and important issues for the development of current feedstock and process technology for bio-gasification. The objective of this study was to develop an economic model and apply it to predict the unit cost of syngas production from a micro-scale bio-gasification facility. The economic model was programmed in the C++ computer programming language and developed using a parametric cost approach, which included processes to calculate the total capital costs and the total operating costs. The model used measured economic data from the bio-gasification facility at Mississippi State University. The modeling results showed that the unit cost of syngas production was $1.217 for a bio-gasifier with a capacity of 60 Nm³ h⁻¹. The operating cost was the major part of the total production cost. The equipment purchase cost and the labor cost were the largest parts of the total capital cost and the total operating cost, respectively. Sensitivity analysis indicated that labor cost ranks highest, followed by equipment cost, loan life, feedstock cost, interest rate, utility cost, and waste treatment cost. The unit cost of syngas production increased with increases in all parameters, with the exception of loan life. The annual costs for equipment, labor, feedstock, waste treatment, and utilities showed a linear relationship with percent changes, while loan life and annual interest rate showed a non-linear relationship. This study provides useful information for the economic analysis and assessment of syngas production using a modeling approach.
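A minimal sketch of the parametric cost approach described above: capital cost annualized with a capital recovery factor, added to annual operating cost, and divided by annual output. All numbers are placeholders, not the Mississippi State facility's data, and the actual model is far more detailed.

```python
def capital_recovery_factor(interest, years):
    """Annualises a capital cost over the loan life at the given interest rate."""
    return interest * (1 + interest) ** years / ((1 + interest) ** years - 1)

def syngas_unit_cost(capital, annual_operating, interest, loan_years,
                     capacity_nm3_h, operating_hours):
    """Unit production cost in $ per Nm3 (all inputs hypothetical)."""
    annual_capital = capital * capital_recovery_factor(interest, loan_years)
    annual_output = capacity_nm3_h * operating_hours
    return (annual_capital + annual_operating) / annual_output

# Placeholder inputs: $250k capital, $180k/yr operating, 6% interest, 10-yr loan, 60 Nm3/h, 6000 h/yr
print(f"${syngas_unit_cost(250_000, 180_000, 0.06, 10, 60, 6000):.3f} per Nm3")
```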
NASA Technical Reports Server (NTRS)
Stokes, B. O.; Wallace, C. J.
1978-01-01
Ammonia production by Klebsiella pneumoniae is not economical with present strains, and improving nitrogen fixation to its theoretical limits in this organism is not sufficient to achieve economic viability. Because the value of the hydrogen produced by this organism and the methane value of the carbon source required both greatly exceed the value of the ammonia formed, ammonia (fixed nitrogen) should be considered the by-product. The production of hydrogen by Klebsiella or other anaerobic nitrogen fixers should receive additional study, because the activity of nitrogenase offers a significant improvement in hydrogen production. The production of fixed nitrogen in the form of cell mass by Azotobacter is also uneconomical, and the methane value of the carbon substrate exceeds the value of the nitrogen fixed. Parametric studies indicate that as efficiencies approach the theoretical limits, the economics may become competitive. The use of nif-derepressed microorganisms, particularly blue-green algae, may have significant potential for in situ fertilization in the environment.
The ASAC Air Carrier Investment Model (Third Generation)
NASA Technical Reports Server (NTRS)
Wingrove, Earl R., III; Gaier, Eric M.; Santmire, Tara E.
1998-01-01
To meet its objective of assisting the U.S. aviation industry with the technological challenges of the future, NASA must identify research areas that have the greatest potential for improving the operation of the air transportation system. To accomplish this, NASA is building an Aviation System Analysis Capability (ASAC). The ASAC differs from previous NASA modeling efforts in that the economic behavior of buyers and sellers in the air transportation and aviation industries is central to its conception. To link the economics of flight with the technology of flight, ASAC requires a parametrically based model with extensions that link airline operations and investments in aircraft with aircraft characteristics. This model also must provide a mechanism for incorporating air travel demand and profitability factors into the airlines' investment decisions. Finally, the model must be flexible and capable of being incorporated into the wide-ranging suite of economic and technical models that are envisioned for ASAC.
To Invest or Not to Invest, That Is the Question: Analysis of Firm Behavior under Anticipated Shocks
Kovac, Dejan; Vukovic, Vuk; Kleut, Nikola; Podobnik, Boris
2016-01-01
When companies are faced with an upcoming and expected economic shock some of them tend to react better than others. They adapt by initiating investments thus successfully weathering the storm, while others, even though they possess the same information set, fail to adopt the same business strategy and eventually succumb to the crisis. We use a unique setting of the recent financial crisis in Croatia as an exogenous shock that hit the country with a time lag, allowing the domestic firms to adapt. We perform a survival analysis on the entire population of 144,000 firms in Croatia during the period from 2003 to 2015, and test whether investment prior to the anticipated shock makes firms more likely to survive the recession. We find that small and micro firms, which decided to invest, had between 60 and 70% higher survival rates than similar firms that chose not to invest. This claim is supported by both non-parametric and parametric tests in the survival analysis. From a normative perspective this finding could be important in mitigating the negative effects on aggregate demand during strong recessionary periods. PMID:27508896
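The non-parametric part of such a survival analysis can be sketched with a Kaplan-Meier estimator comparing investing and non-investing firms; the simulated lifetimes, group sizes, and censoring window below are purely illustrative, not the Croatian firm registry data.

```python
import numpy as np

def kaplan_meier(time, event):
    """Non-parametric Kaplan-Meier survival estimate; event = 1 for firm exit, 0 for censoring."""
    time, event = np.asarray(time), np.asarray(event)
    surv, at_risk = 1.0, time.size
    out_t, out_s = [], []
    for t in np.unique(time):                       # unique observed times, ascending
        exits = np.sum((time == t) & (event == 1))
        if exits:
            surv *= 1.0 - exits / at_risk
            out_t.append(t)
            out_s.append(surv)
        at_risk -= np.sum(time == t)                # remove exits and censored firms from the risk set
    return np.array(out_t), np.array(out_s)

# Simulated firm lifetimes observed within a 13-year window; investors assumed to exit less often
rng = np.random.default_rng(5)
t_invest = np.minimum(rng.exponential(15.0, 400), 13.0)
t_noinv  = np.minimum(rng.exponential(9.0, 400), 13.0)
e_invest = (t_invest < 13.0).astype(int)            # exits before the end of the window are events
e_noinv  = (t_noinv  < 13.0).astype(int)            # firms still alive at the end are censored

for label, t, e in [("invested", t_invest, e_invest), ("did not invest", t_noinv, e_noinv)]:
    times, surv = kaplan_meier(t, e)
    print(f"{label:15s} S(10 years) = {surv[times <= 10.0][-1]:.2f}")
```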
Commercial D-T FRC Power Plant Systems Analysis
NASA Astrophysics Data System (ADS)
Nguyen, Canh; Santarius, John; Emmert, Gilbert; Steinhauer, Loren; Stubna, Michael
1998-11-01
Results of an engineering issues scoping study of a Field-Reversed Configuration (FRC) burning D-T fuel will be presented. The study primarily focuses on engineering issues, such as tritium-breeding blanket design, radiation shielding, neutron damage, activation, safety, and environment. This presentation will concentrate on plasma physics, current drive, economics, and systems integration, which are important for the overall systems analysis. A systems code serves as the key tool in defining a reference point for detailed physics and engineering calculations plus parametric variations, and typical cases will be presented. Advantages of the cylindrical geometry and high beta (plasma pressure/magnetic-field pressure) are evident.
Evaluation of automobiles with alternative fuels utilizing multicriteria techniques
NASA Astrophysics Data System (ADS)
Brey, J. J.; Contreras, I.; Carazo, A. F.; Brey, R.; Hernández-Díaz, A. G.; Castro, A.
This work applies the non-parametric technique of Data Envelopment Analysis (DEA) to conduct a multicriteria comparison of some existing and under-development technologies in the automotive sector. The results indicate that some of the technologies under development, such as hydrogen fuel cell vehicles, can be classified as efficient when evaluated against environmental and economic criteria, with greater importance given to the environmental criteria. The article also demonstrates the need to improve hydrogen-based technology relative to the others in aspects such as vehicle sale cost and fuel price.
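A minimal input-oriented, constant-returns-to-scale DEA sketch in the spirit of the comparison above; the vehicle technologies, inputs, and outputs are hypothetical placeholders, and a real application would use the study's own environmental and economic criteria and weighting.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency for each DMU. X: (m inputs, n DMUs), Y: (s outputs, n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1..lambda_n]; minimise theta
        c = np.r_[1.0, np.zeros(n)]
        A_in = np.hstack([-X[:, [o]], X])            # inputs:  X @ lam - theta * x_o <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y])    # outputs: Y @ lam >= y_o
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[:, o]],
                      bounds=[(None, None)] + [(0, None)] * n)
        scores.append(res.x[0])
    return np.array(scores)

# Hypothetical technologies: inputs = [purchase cost (kEUR), fuel cost per 100 km (EUR)],
# outputs = [range (km), 1 / CO2 emissions (km/g)] so that "more output" is always better
X = np.array([[22.0, 28.0, 35.0, 48.0],
              [ 8.0,  6.0,  4.0,  3.0]])
Y = np.array([[600.0, 700.0, 450.0, 500.0],
              [1/120, 1/90, 1/20, 1/5]])
for name, e in zip(["gasoline", "hybrid", "battery EV", "hydrogen FC"], dea_ccr_input(X, Y)):
    print(f"{name:12s} efficiency = {e:.2f}")
```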
ERIC Educational Resources Information Center
Osler, James Edward
2014-01-01
This monograph provides an epistemological rationale for the design of a novel post hoc statistical measure called "Tri-Center Analysis". This new statistic is designed to analyze the post hoc outcomes of the Tri-Squared Test. In Tri-Center Analysis, trichotomous parametric inferential statistical measures are calculated from…
Potassium topping cycles for stationary power. [conceptual analysis
NASA Technical Reports Server (NTRS)
Rossbach, R. J.
1975-01-01
A design study was made of the potassium topping cycle powerplant for central station use. Initially, powerplant performance and economics were studied parametrically by using an existing steam plant as the bottom part of the cycle. Two distinct powerplants were identified which had good thermodynamic and economic performance. Conceptual designs were made of these two powerplants in the 1200 MWe size, and capital and operating costs were estimated for these powerplants. A technical evaluation of these plants was made including conservation of fuel resources, environmental impact, technology status, and degree of development risk. It is concluded that the potassium topping cycle could have a significant impact on national goals such as air and water pollution control and conservation of natural resources because of its higher energy conversion efficiency.
Economic policy optimization based on both one stochastic model and the parametric control theory
NASA Astrophysics Data System (ADS)
Ashimov, Abdykappar; Borovskiy, Yuriy; Onalbekov, Mukhit
2016-06-01
A nonlinear dynamic stochastic general equilibrium model with financial frictions is developed to describe two interacting national economies in the environment of the rest of the world. The parameters of the nonlinear model are estimated from its log-linearization by the Bayesian approach. The nonlinear model is verified by retroprognosis, by estimation of the stability indicators of the mappings specified by the model, and by estimation of the degree of coincidence between the effects of internal and external shocks on macroeconomic indicators obtained from the estimated nonlinear model and from its log-linearization. On the basis of the nonlinear model, parametric control problems for economic growth and for the volatility of macroeconomic indicators of Kazakhstan are formulated and solved for two exchange rate regimes (free floating and managed floating exchange rates).
Siciliani, Luigi
2006-01-01
Policy makers are increasingly interested in developing performance indicators that measure hospital efficiency. These indicators may give the purchasers of health services an additional regulatory tool to contain health expenditure. Using panel data, this study compares different parametric (econometric) and non-parametric (linear programming) techniques for the measurement of a hospital's technical efficiency. This comparison was made using a sample of 17 Italian hospitals in the years 1996-9. The highest correlations in the efficiency scores are found between the non-parametric data envelopment analysis under the constant returns to scale assumption (DEA-CRS) and several parametric models. Correlation reduces markedly when using more flexible non-parametric specifications such as data envelopment analysis under the variable returns to scale assumption (DEA-VRS) and the free disposal hull (FDH) model. Correlation also generally reduces when moving from one-output to two-output specifications. This analysis suggests that there is scope for developing performance indicators at hospital level using panel data, but it is important that extensive sensitivity analysis is carried out if purchasers wish to make use of these indicators in practice.
Prospects for reduced energy transports: A preliminary analysis
NASA Technical Reports Server (NTRS)
Ardema, M. D.; Harper, M.; Smith, C. L.; Waters, M. H.; Williams, L. J.
1974-01-01
The recent energy crisis and subsequent substantial increase in fuel prices have provided increased incentive to reduce the fuel consumption of civil transport aircraft. At the present time many changes in operational procedures have been introduced to decrease fuel consumption of the existing fleet. In the future, however, it may become desirable or even necessary to introduce new fuel-conservative aircraft designs. This paper reports the results of a preliminary study of new near-term fuel conservative aircraft. A parametric study was made to determine the effects of cruise Mach number and fuel cost on the optimum configuration characteristics and on economic performance. For each design, the wing geometry was optimized to give maximum return on investment at a particular fuel cost. Based on the results of the parametric study, a nominal reduced energy configuration was selected. Compared with existing transport designs, the reduced energy design has a higher aspect ratio wing with lower sweep, and cruises at a lower Mach number. It has about 30% less fuel consumption on a seat-mile basis.
Zhang, Chen; Wang, Yuan; Song, Xiaowei; Kubota, Jumpei; He, Yanmin; Tojo, Junji; Zhu, Xiaodong
2017-12-31
This paper concentrates on the Chinese context and develops an integrated process to explicitly elucidate the relationship between economic growth and water pollution discharge, namely chemical oxygen demand (COD) discharge and ammonia nitrogen (NH3-N) discharge, using two unbalanced panel data sets covering the periods 1990 to 2014 and 2001 to 2014, respectively. In the present study, panel unit root tests, cointegration tests, and Granger causality tests allowing for cross-sectional dependence, nonstationarity, and heterogeneity are conducted to examine the causal effects of economic growth on COD/NH3-N discharge. Further, we simultaneously apply semi-parametric fixed effects estimation and parametric fixed effects estimation to investigate the environmental Kuznets curve relationship for COD/NH3-N discharge. Our empirical results show a long-term bidirectional causality between economic growth and COD/NH3-N discharge in China. Within the Stochastic Impacts by Regression on Population, Affluence and Technology framework, we find evidence in support of an inverted U-shaped relationship between economic growth and COD/NH3-N discharge. To the best of our knowledge, no previous efforts have investigated the nexus of economic growth and water pollution in such an integrated manner; this study therefore takes a fresh look at the topic.
The quantile regression approach to efficiency measurement: insights from Monte Carlo simulations.
Liu, Chunping; Laporte, Audrey; Ferguson, Brian S
2008-09-01
In the health economics literature there is an ongoing debate over approaches used to estimate the efficiency of health systems at various levels, from the level of the individual hospital - or nursing home - up to that of the health system as a whole. The two most widely used approaches to evaluating the efficiency with which various units deliver care are non-parametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Productivity researchers tend to have very strong preferences over which methodology to use for efficiency estimation. In this paper, we use Monte Carlo simulation to compare the performance of DEA and SFA in terms of their ability to accurately estimate efficiency. We also evaluate quantile regression as a potential alternative approach. A Cobb-Douglas production function, random error terms and a technical inefficiency term with different distributions are used to calculate the observed output. The results, based on these experiments, suggest that neither DEA nor SFA can be regarded as clearly dominant, and that, depending on the quantile estimated, the quantile regression approach may be a useful addition to the armamentarium of methods for estimating technical efficiency.
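A small sketch of the comparison idea on simulated Cobb-Douglas data with one-sided inefficiency, contrasting ordinary least squares (average practice) with a high-quantile regression (near-frontier). The statsmodels package is assumed to be available, the data-generating values are arbitrary, and the "efficiency proxy" is only illustrative, not the DEA or SFA scores computed in the paper.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 500
ln_x = rng.normal(5.0, 0.5, n)                      # log input (e.g. beds or staff)
v = rng.normal(0, 0.1, n)                           # symmetric noise
u = np.abs(rng.normal(0, 0.3, n))                   # one-sided technical inefficiency
ln_y = 1.0 + 0.7 * ln_x + v - u                     # Cobb-Douglas frontier minus inefficiency

X = sm.add_constant(ln_x)
ols = sm.OLS(ln_y, X).fit()                         # average-practice line
q90 = sm.QuantReg(ln_y, X).fit(q=0.9)               # near-frontier line

print("OLS slope:   %.3f (intercept %.3f)" % (ols.params[1], ols.params[0]))
print("q=0.9 slope: %.3f (intercept %.3f)" % (q90.params[1], q90.params[0]))

# Unit-level efficiency proxy: distance below the estimated 0.9-quantile line
eff = np.exp(ln_y - q90.predict(X))
print("mean efficiency proxy: %.2f" % eff.mean())
```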
Study of aircraft in intraurban transportation systems, volume 1
NASA Technical Reports Server (NTRS)
Stout, E. G.; Kesling, P. H.; Matteson, H. C.; Sherwood, D. E.; Tuck, W. R., Jr.; Vaughn, L. A.
1971-01-01
An analysis of an effective short-range, high-density commuter transportation system for intraurban use is presented. The seven-county Detroit, Michigan, metropolitan area was chosen as the scenario for the analysis. The study consisted of an analysis and forecast of the Detroit market through 1985, a parametric analysis of appropriate short-haul aircraft concepts and associated ground systems, and a preliminary overall economic analysis of a simplified total system designed to evaluate the candidate vehicles and select the most promising VTOL and STOL aircraft. Data are also included on the impact of advanced technology on the system, the sensitivity of mission performance to changes in aircraft characteristics and system operations, and the identification of key problem areas that may be improved by additional research. The approach, logic, and computer models used are adaptable to other intraurban or interurban areas.
Conceptual design of reduced energy transports
NASA Technical Reports Server (NTRS)
Ardema, M. D.; Harper, M.; Smith, C. L.; Waters, M. H.; Williams, L. J.
1975-01-01
This paper reports the results of a conceptual design study of new, near-term fuel-conservative aircraft. A parametric study was made to determine the effects of cruise Mach number and fuel cost on the 'optimum' configuration characteristics and on economic performance. Supercritical wing technology and advanced engine cycles were assumed. For each design, the wing geometry was optimized to give maximum return on investment at a particular fuel cost. Based on the results of the parametric study, a reduced energy configuration was selected. Compared with existing transport designs, the reduced energy design has a higher aspect ratio wing with lower sweep, and cruises at a lower Mach number. It yields about 30% more seat-miles/gal than current wide-body aircraft. At the higher fuel costs anticipated in the future, the reduced energy design has about the same economic performance as existing designs.
A Cartesian parametrization for the numerical analysis of material instability
Mota, Alejandro; Chen, Qiushi; Foulk, III, James W.; ...
2016-02-25
We examine four parametrizations of the unit sphere in the context of material stability analysis by means of the singularity of the acoustic tensor. We then propose a Cartesian parametrization for vectors that lie in a cube of side length two and use these vectors in lieu of unit normals to test for the loss of the ellipticity condition. This parametrization is then used to construct a tensor akin to the acoustic tensor. It is shown that both of these tensors become singular at the same time and in the same planes in the presence of a material instability. Furthermore, the performance of the Cartesian parametrization is compared against the other parametrizations, with the results of these comparisons showing that, in general, the Cartesian parametrization is more robust and more numerically efficient than the others.
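An illustrative sketch of the underlying idea: scan directions on the surface of the cube [-1, 1]^3 in lieu of the unit sphere and monitor the determinant of the acoustic tensor. The isotropic elastic tangent used here is a simple stand-in (for which no instability occurs), not the constitutive models or the exact tensor construction examined in the paper.

```python
import numpy as np

def isotropic_tangent(lam, mu):
    """Fourth-order isotropic elasticity tensor C_ijkl (a stand-in constitutive tangent)."""
    I = np.eye(3)
    return (lam * np.einsum("ij,kl->ijkl", I, I)
            + mu * (np.einsum("ik,jl->ijkl", I, I) + np.einsum("il,jk->ijkl", I, I)))

def min_det_acoustic(C, n_grid=25):
    """Minimum det(A) over directions, with A_ik(m) = C_ijkl m_j m_l and m on the cube surface."""
    t = np.linspace(-1.0, 1.0, n_grid)
    u, v = np.meshgrid(t, t)
    faces = []
    for axis in range(3):                 # two faces per axis: that component fixed at +/-1
        for sign in (-1.0, 1.0):
            m = np.zeros((3, n_grid, n_grid))
            m[axis] = sign
            m[(axis + 1) % 3], m[(axis + 2) % 3] = u, v
            faces.append(m.reshape(3, -1).T)
    dets = []
    for m in np.vstack(faces):
        A = np.einsum("ijkl,j,l->ik", C, m, m)
        dets.append(np.linalg.det(A))
    return min(dets)

C = isotropic_tangent(lam=80.0, mu=30.0)
print("min det(A) over the cube surface: %.1f" % min_det_acoustic(C))
# A softened (e.g. damaged or non-associative plastic) tangent would drive this minimum
# toward zero, signalling loss of ellipticity, i.e. a material instability.
```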
Energy efficient engine: Preliminary design and integration studies
NASA Technical Reports Server (NTRS)
Johnston, R. P.; Hirschkron, R.; Koch, C. C.; Neitzel, R. E.; Vinson, P. W.
1978-01-01
Parametric design and mission evaluations of advanced turbofan configurations were conducted for future transport aircraft applications. Economics, environmental suitability and fuel efficiency were investigated and compared with goals set by NASA. Of the candidate engines, which included mixed- and separate-flow, direct-drive and geared configurations, an advanced mixed-flow, direct-drive configuration was selected for further design and evaluation. All goals were judged to have been met except the acoustic goal. Also conducted were a performance risk analysis and a preliminary aerodynamic design of the 10-stage, 23:1-pressure-ratio compressor used in the study engines.
Parametric Methods for Dynamic 11C-Phenytoin PET Studies.
Mansor, Syahir; Yaqub, Maqsood; Boellaard, Ronald; Froklage, Femke E; de Vries, Anke; Bakker, Esther D M; Voskuyl, Rob A; Eriksson, Jonas; Schwarte, Lothar A; Verbeek, Joost; Windhorst, Albert D; Lammertsma, Adriaan A
2017-03-01
In this study, the performance of various methods for generating quantitative parametric images from dynamic 11C-phenytoin PET studies was evaluated. Methods: Double-baseline 60-min dynamic 11C-phenytoin PET studies, including online arterial sampling, were acquired for 6 healthy subjects. Parametric images were generated using Logan plot analysis, a basis function method, and spectral analysis. Parametric distribution volume (VT) and influx rate (K1) were compared with those obtained from nonlinear regression analysis of time-activity curves. In addition, global and regional test-retest (TRT) variability was determined for parametric K1 and VT values. Results: Biases in VT observed with all parametric methods were less than 5%. For K1, spectral analysis showed a negative bias of 16%. The mean TRT variabilities of VT and K1 were less than 10% for all methods. Shortening the scan duration to 45 min provided similar VT and K1 values with comparable TRT performance relative to the 60-min data. Conclusion: Among the various parametric methods tested, the basis function method provided parametric VT and K1 values with the least bias compared with nonlinear regression data and showed TRT variabilities lower than 5%, also for smaller volume-of-interest sizes (i.e., higher noise levels) and shorter scan durations.
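Logan plot analysis, one of the parametric methods compared above, estimates VT as the slope of a transformed plot once the data become linear. The sketch below applies it to synthetic input and tissue curves; the rate constants, input function, and the linearization start time t* are assumptions for illustration, not the 11C-phenytoin data.

```python
import numpy as np

def cumtrapz(y, t):
    """Cumulative trapezoidal integral of y(t), starting at 0."""
    return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(t))))

def logan_vt(t, c_plasma, c_tissue, t_star=20.0):
    """Logan analysis: V_T is the slope of int(C_T)/C_T versus int(C_p)/C_T for t > t*."""
    x = cumtrapz(c_plasma, t) / c_tissue
    y = cumtrapz(c_tissue, t) / c_tissue
    late = t > t_star
    slope, _ = np.polyfit(x[late], y[late], 1)
    return slope

# Synthetic 60-min study: bi-exponential plasma input and a one-tissue-compartment response
t = np.linspace(0.25, 60.0, 240)                               # minutes
c_p = 12.0 * np.exp(-0.35 * t) + 1.5 * np.exp(-0.02 * t)
K1, k2 = 0.15, 0.05                                            # true V_T = K1 / k2 = 3.0
dt = t[1] - t[0]
c_t = K1 * dt * np.convolve(c_p, np.exp(-k2 * t))[:t.size]     # C_T = K1 * exp(-k2 t) convolved with C_p

print("Logan V_T estimate: %.2f (true value 3.00)" % logan_vt(t, c_p, c_t))
```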
Housing price prediction: parametric versus semi-parametric spatial hedonic models
NASA Astrophysics Data System (ADS)
Montero, José-María; Mínguez, Román; Fernández-Avilés, Gema
2018-01-01
House price prediction is a hot topic in the economic literature. House price prediction has traditionally been approached using a-spatial linear (or intrinsically linear) hedonic models. It has been shown, however, that spatial effects are inherent in house pricing. This article considers parametric and semi-parametric spatial hedonic model variants that account for spatial autocorrelation, spatial heterogeneity and (smooth and nonparametrically specified) nonlinearities using penalized splines methodology. The models are represented as a mixed model that allow for the estimation of the smoothing parameters along with the other parameters of the model. To assess the out-of-sample performance of the models, the paper uses a database containing the price and characteristics of 10,512 homes in Madrid, Spain (Q1 2010). The results obtained suggest that the nonlinear models accounting for spatial heterogeneity and flexible nonlinear relationships between some of the individual or areal characteristics of the houses and their prices are the best strategies for house price prediction.
Biophysics and economic potential analysis of vertisols for maize in the humid tropics of Indonesia
NASA Astrophysics Data System (ADS)
Neswati, R.; Lopulisa, C.; Ahmad, A.; Nathan, M.
2018-05-01
The main objective of this study is to establish the biophysical and economic potential for maize development of the Vertisols found in the relatively dry humid tropics of the southern part of Sulawesi. The study used a spatial approach to establish the research sites, totalling 3 soil profile observation units, and involved 30 farmers as respondents. Land potential analysis was conducted using a parametric approach, while the economic analysis used the benefit-cost (B-C) ratio. The results show that the growing period at the study sites runs from November to June, or 240 days, and the climate is classified as type E3. The locations are potentially moderately suitable (S2s) for maize, with soil texture as the limiting factor, in the growing season from November to June. In the growing season from July to October, the locations are not suitable (Nc), with limiting factors such as very limited rainfall during the period of crop ripening. The land suitability index of the study sites during the growing season from November to June ranged from 52 to 72, with an average maize productivity obtained by farmers of 4.3 to 5.7 tons of dry grain per hectare. Analysis of the B-C ratio indicates that cultivation of maize in this growing period at the study locations is feasible, with B-C ratio values ranging from 1.8 to 2.5. These results show that Vertisols in the humid tropics of Indonesia have physical and economic potential for maize development.
NASA Astrophysics Data System (ADS)
Rounaghi, Mohammad Mahdi; Abbaszadeh, Mohammad Reza; Arashi, Mohammad
2015-11-01
One of the topics of greatest interest to investors is stock price changes. Investors with long-term goals are sensitive to the stock price and its changes and react to them. In this study, we used the multivariate adaptive regression splines (MARS) model and a semi-parametric splines technique for predicting stock prices. The MARS model, a nonparametric method, is an adaptive regression method suited to problems with high dimensions and several variables. The semi-parametric technique used here is based on smoothing splines, a nonparametric regression method. We used 40 variables (30 accounting variables and 10 economic variables) to predict stock prices with the MARS model and with the semi-parametric splines technique. After investigating the models, we selected 4 accounting variables (book value per share, predicted earnings per share, P/E ratio and risk) as the variables influencing stock price prediction in the MARS model. After fitting the semi-parametric splines technique, only 4 accounting variables (dividends, net EPS, EPS forecast and P/E ratio) were selected as effective in forecasting stock prices.
Problems of the design of low-noise input devices. [parametric amplifiers
NASA Technical Reports Server (NTRS)
Manokhin, V. M.; Nemlikher, Y. A.; Strukov, I. A.; Sharfov, Y. A.
1974-01-01
An analysis is given of the requirements placed on the elements of parametric centimeter-waveband amplifiers for achieving minimal noise temperatures. A low-noise semiconductor parametric amplifier using germanium parametric diodes for a receiver operating in the 4 GHz band was developed and tested, confirming the possibility of satisfying all requirements.
Analysis of operational requirements for medium density air transportation. Volume 1: Summary
NASA Technical Reports Server (NTRS)
1975-01-01
The medium density air travel market was studied to determine the aircraft design and operational requirements. The impact of operational characteristics on the air travel system and the economic viability of the study aircraft were also evaluated. Medium density is defined in terms of the number of people transported (20 to 500 passengers per day on round trip routes) and frequency of service (a minimum of two and a maximum of eight round trips per day) for 10 regional carriers. The operational characteristics of aircraft best suited to serve the medium density air transportation market are determined, and a basepoint aircraft is designed from which tradeoff studies and parametric variations could be conducted. The impact of selected aircraft on the medium density market, economics, and operations is ascertained. Research and technology objectives for future programs in medium density air transportation are identified and ranked.
NASA Technical Reports Server (NTRS)
Wetzler, E.; Sand, F.; Stevenson, P.; Putnam, M.
1975-01-01
A case study analysis is presented of the relationships between improvements in the accuracy, frequency, and timeliness of information used in making hydrological forecasts and economic benefits in the areas of hydropower and irrigation. The area chosen for the case study is the Oroville Dam and Reservoir. Emphasis is placed on the use of timely and accurate mapping of the areal extent of snow in the basin by earth resources survey systems such as LANDSAT. The subject of benefits resulting from improved runoff forecasts is treated in a generalized way without specifying the source of the improvements.
Economic geology of lunar Helium-3
NASA Technical Reports Server (NTRS)
Schmitt, Harrison H.
1988-01-01
Economic geology evaluation of lunar He-3 should answer the question: Can lunar He-3 be sold on Earth with sufficient profit margins and low enough risk to attract capital investment in the enterprise? Concepts that relate to the economic geology of recovering He-3 from the lunar maria are not new to human experience. A parametric cost and technology evaluation scheme, based on existing and future data, is required to qualitatively and quantitatively assess the comprehensive economic feasibility and return on investment of He-3 recovery from the lunar maria. There are also many political issues which must be considered as a result of nuclear fusion and lunar mining.
Schmidt, K; Witte, H
1999-11-01
Recently the assumption of the independence of individual frequency components in a signal has been rejected, for example, for the EEG during defined physiological states such as sleep or sedation [9, 10]. Thus, the use of higher-order spectral analysis capable of detecting interrelations between individual signal components has proved useful. The aim of the present study was to investigate the quality of various non-parametric and parametric estimation algorithms using simulated as well as true physiological data. We employed standard algorithms available for MATLAB. The results clearly show that parametric bispectral estimation is superior to non-parametric estimation in terms of the quality of peak localisation and the discrimination from other peaks.
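A bare-bones direct (non-parametric) bispectrum estimator, of the kind contrasted here with parametric estimation, can be written in a few lines. The sketch below is not the MATLAB routines used in the study, and the quadratically coupled test signal is synthetic.

```python
# Direct (non-parametric) bispectrum estimate by segment averaging.
import numpy as np

def bispectrum(x, nfft=128):
    """Average X(f1)*X(f2)*conj(X(f1+f2)) over non-overlapping Hann-windowed segments."""
    half = nfft // 2
    idx = np.add.outer(np.arange(half), np.arange(half))   # f1 + f2 index grid
    B = np.zeros((half, half), dtype=complex)
    starts = range(0, len(x) - nfft + 1, nfft)
    for s in starts:
        X = np.fft.fft(x[s:s + nfft] * np.hanning(nfft))
        B += np.outer(X[:half], X[:half]) * np.conj(X[idx])
    return np.abs(B) / len(starts)

# A quadratically coupled triplet (bins 8, 12 and 8+12=20) gives a bispectral peak.
t = np.arange(4096)
x = (np.cos(2 * np.pi * 8 / 128 * t) + np.cos(2 * np.pi * 12 / 128 * t)
     + 0.5 * np.cos(2 * np.pi * 20 / 128 * t))
B = bispectrum(x)
print("bispectral peak at frequency bins (f1, f2):", np.unravel_index(np.argmax(B), B.shape))
```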
First-Order Parametric Model of Reflectance Spectra for Dyed Fabrics
2016-02-19
This report describes a first-order parametric model of reflectance spectra for dyed fabrics, which provides for both their inverse and direct modeling. The dyes considered contain spectral features that are of interest to the U.S. Navy for ... (Report keywords: parametric modeling; inverse/direct analysis. Appendix: Dielectric Response Functions for Dyes Obtained by Inverse Analysis.)
Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-06-01
Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is a feasible approach for EIT images of neural activity.
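The non-parametric validation step can be illustrated with a permutation-style (sign-flipping) maximum-statistic correction for the voxel-wise multiple testing problem. The sketch below uses synthetic "images" and a one-sample design, not the rat EIT data.

```python
# Non-parametric family-wise error correction via sign-flipping and the max statistic.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_voxels = 22, 5000
images = rng.normal(0, 1, (n_subjects, n_voxels))
images[:, :50] += 0.8                      # a small, truly "active" region

def t_stat(d):
    return d.mean(0) / (d.std(0, ddof=1) / np.sqrt(d.shape[0]))

observed = t_stat(images)

# Null distribution of the maximum statistic under random sign flips.
max_null = []
for _ in range(1000):
    signs = rng.choice([-1.0, 1.0], size=(n_subjects, 1))
    max_null.append(t_stat(images * signs).max())
threshold = np.quantile(max_null, 0.95)    # family-wise corrected, p < 0.05

print("corrected threshold:", round(float(threshold), 2))
print("significant voxels:", int((observed > threshold).sum()))
```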
Parametric versus Cox's model: an illustrative analysis of divorce in Canada.
Balakrishnan, T R; Rao, K V; Krotki, K J; Lapierre-adamcyk, E
1988-06-01
Recent demographic literature clearly recognizes the importance of survival models in the analysis of cross-sectional event histories. Of the various survival models, Cox's (1972) partially parametric model has been very popular due to its simplicity and readily available computer software for estimation, sometimes at the cost of precision and parsimony of the model. This paper focuses on parametric failure time models for event history analysis such as the Weibull, lognormal, loglogistic, and exponential models. The authors also test the goodness of fit of these parametric models versus Cox's proportional hazards model, taking the Kaplan-Meier estimate as the baseline. As an illustration, the authors reanalyze the Canadian Fertility Survey data on 1st marriage dissolution with parametric models. Though these parametric model estimates were not very different from each other, there seemed to be a slightly better fit with the loglogistic. When 8 covariates were used in the analysis, it was found that the coefficients were similar across the models, and the overall conclusions about the relative risks would not have been different. The findings reveal that in marriage dissolution, the differences according to demographic and socioeconomic characteristics may be far more important than is generally found in many studies. Therefore, one should not treat the population as homogeneous in analyzing survival probabilities of marriages, other than for cursory analysis of overall trends.
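A sketch of this kind of parametric-versus-Cox comparison is given below. It assumes the third-party lifelines package and uses its bundled Rossi recidivism dataset as a stand-in for the Canadian Fertility Survey marriage-dissolution histories.

```python
# Comparing parametric failure-time fits with Cox regression using lifelines.
from lifelines import (CoxPHFitter, ExponentialFitter, LogLogisticFitter,
                       LogNormalFitter, WeibullFitter)
from lifelines.datasets import load_rossi

df = load_rossi()                       # columns: week (duration), arrest (event), covariates
T, E = df["week"], df["arrest"]

# Univariate parametric fits, compared by AIC (lower is better).
for Fitter in (WeibullFitter, LogNormalFitter, LogLogisticFitter, ExponentialFitter):
    f = Fitter().fit(T, event_observed=E)
    print(f"{Fitter.__name__:<18} AIC = {f.AIC_:.1f}")

# Semi-parametric Cox model with covariates, analogous to the 8-covariate analysis.
cox = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")
cox.print_summary(decimals=2)
```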
EEG Correlates of Fluctuation in Cognitive Performance in an Air Traffic Control Task
2014-11-01
... using non-parametric statistical analysis to identify neurophysiological patterns due to the time-on-task effect. Significant changes in EEG power ... (Report keywords: EEG, cognitive performance, power spectral analysis, non-parametric analysis. Document is available to the public through the Internet.)
Fan, Zhen; Dani, Melanie; Femminella, Grazia D; Wood, Melanie; Calsolaro, Valeria; Veronese, Mattia; Turkheimer, Federico; Gentleman, Steve; Brooks, David J; Hinz, Rainer; Edison, Paul
2018-07-01
Neuroinflammation and microglial activation play an important role in amnestic mild cognitive impairment (MCI) and Alzheimer's disease. In this study, we investigated the spatial distribution of neuroinflammation in MCI subjects, using spectral analysis (SA) to generate parametric maps and quantify 11C-PBR28 PET, and compared these with compartmental and other kinetic models of quantification. Thirteen MCI and nine healthy controls were enrolled in this study. Subjects underwent 11C-PBR28 PET scans with arterial cannulation. Spectral analysis with an arterial plasma input function was used to generate 11C-PBR28 parametric maps. These maps were then compared with regional 11C-PBR28 VT (volume of distribution) using a two-tissue compartment model and Logan graphic analysis. Amyloid load was also assessed with 18F-Flutemetamol PET. With SA, three component peaks were identified in addition to blood volume. The 11C-PBR28 impulse response function (IRF) at 90 min produced the lowest coefficient of variation. Single-subject analysis using this IRF demonstrated microglial activation in five out of seven amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake revealed a group-wise significant increase in neuroinflammation in amyloid-positive MCI subjects versus healthy controls in multiple cortical association areas, and particularly in the temporal lobe. Interestingly, compartmental analysis detected a group-wise increase in 11C-PBR28 binding in the thalamus of amyloid-positive MCI subjects, while Logan parametric maps did not perform well. This study demonstrates for the first time that spectral analysis can be used to generate parametric maps of 11C-PBR28 uptake, and is able to detect microglial activation in amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake allow voxel-wise single-subject analysis and could be used to evaluate microglial activation in individual subjects.
Flood protection diversification to reduce probabilities of extreme losses.
Zhou, Qian; Lambert, James H; Karvetski, Christopher W; Keisler, Jeffrey M; Linkov, Igor
2012-11-01
Recent catastrophic losses because of floods require developing resilient approaches to flood risk protection. This article assesses how diversification of a system of coastal protections might decrease the probabilities of extreme flood losses. The study compares the performance of portfolios each consisting of four types of flood protection assets in a large region of dike rings. A parametric analysis suggests conditions in which diversifications of the types of included flood protection assets decrease extreme flood losses. Increased return periods of extreme losses are associated with portfolios where the asset types have low correlations of economic risk. The effort highlights the importance of understanding correlations across asset types in planning for large-scale flood protection. It allows explicit integration of climate change scenarios in developing flood mitigation strategy. © 2012 Society for Risk Analysis.
Gao, Lan; Hu, Hao; Zhao, Fei-Li; Li, Shu-Chuen
2016-01-01
Objectives To systematically review cost of illness studies for schizophrenia (SC), epilepsy (EP) and type 2 diabetes mellitus (T2DM) and explore the transferability of direct medical cost across countries. Methods A comprehensive literature search was performed to yield studies that estimated direct medical costs. A generalized linear model (GLM) with gamma distribution and log link was utilized to explore the variation in costs that is accounted for by the included factors. Both parametric (random-effects model) and non-parametric (bootstrapping) meta-analyses were performed to pool the converted raw cost data (expressed as a percentage of GDP/capita of the country where the study was conducted). Results In total, 93 articles were included (40 studies for T2DM, 34 studies for EP and 19 studies for SC). Significant variances were detected inter- and intra-disease classes for the direct medical costs. Multivariate analysis identified that GDP/capita (p<0.05) was a significant factor contributing to the large variance in the cost results. Bootstrapping meta-analysis generated more conservative estimations with slightly wider 95% confidence intervals (CI) than the parametric meta-analysis, yielding a mean (95%CI) of 16.43% (11.32, 21.54) for T2DM, 36.17% (22.34, 50.00) for SC and 10.49% (7.86, 13.41) for EP. Conclusions Converting the raw cost data into a percentage of GDP/capita of each country was demonstrated to be a feasible approach to transfer direct medical costs across countries. The approach from our study, obtaining an estimated direct cost value along with the size of the specific disease population from each jurisdiction, could be used for a quick check on the economic burden of a particular disease for countries without such data. PMID:26814959
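The bootstrapping pooling step amounts to resampling the converted cost values and reading off percentile limits; a sketch with hypothetical inputs follows.

```python
# Non-parametric (bootstrap) pooling of costs expressed as % of GDP/capita.
# The study-level values below are hypothetical, not data from this review.
import numpy as np

rng = np.random.default_rng(42)
cost_pct_gdp = np.array([9.5, 14.2, 21.0, 11.8, 18.3, 25.6, 13.1])  # hypothetical studies

boot_means = [rng.choice(cost_pct_gdp, size=cost_pct_gdp.size, replace=True).mean()
              for _ in range(10_000)]
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"pooled mean {cost_pct_gdp.mean():.1f}% of GDP/capita, 95% CI ({lo:.1f}, {hi:.1f})")
```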
Parametric Modelling of As-Built Beam Framed Structure in Bim Environment
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.
2017-02-01
A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, attributional and dynamic information management, and the results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that integrates traditional geometry modelling, parametric element management and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, a typical BIM software package, provides a platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin under development is introduced in the paper; it can obtain the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data. The results show the potential of automating parametric modelling by interactive API development in a BIM environment. It also integrates the separate data processing steps and different platforms into the uniform Revit software.
Linking Physical Climate Research and Economic Assessments of Mitigation Policies
NASA Astrophysics Data System (ADS)
Stainforth, David; Calel, Raphael
2017-04-01
Evaluating climate change policies requires economic assessments which balance the costs and benefits of climate action. A certain class of Integrated Assessment Models (IAMs) is widely used for this type of analysis; DICE, PAGE and FUND are three of the most influential. In the economics community there has been much discussion and debate about the economic assumptions implemented within these models. Two aspects in particular have gained much attention: i) the costs of damages resulting from climate change, the so-called damage function, and ii) the choice of discount rate applied to future costs and benefits. There has, however, been rather little attention given to the consequences of the choices made in the physical climate models within these IAMs. Here we discuss the practical aspects of the implementation of the physical models in these IAMs, as well as the implications of choices made in these physical science components for economic assessments [1]. We present a simple breakdown of how these IAMs differently represent the climate system as a consequence of differing underlying physical models, different parametric assumptions (for parameters representing, for instance, feedbacks and ocean heat uptake) and different numerical approaches to solving the models. We present the physical and economic consequences of these differences and reflect on how we might better incorporate the latest physical science understanding in economic models of this type. [1] Calel, R. and Stainforth, D.A., "On the Physics of Three Integrated Assessment Models", Bulletin of the American Meteorological Society, in press.
A new simple form of quark mixing matrix
NASA Astrophysics Data System (ADS)
Qin, Nan; Ma, Bo-Qiang
2011-01-01
Although different parametrizations of the quark mixing matrix are mathematically equivalent, the consequences of experimental analysis may be distinct. Based on the triminimal expansion of the Kobayashi-Maskawa matrix around the unit matrix, we propose a new simple parametrization. Compared with the Wolfenstein parametrization, we find that the new form is not only consistent with the original one in the hierarchical structure, but also more convenient for numerical analysis and measurement of the CP-violating phase. By discussing the relation between our new form and the unitarity boomerang, we point out that, along with the unitarity boomerang, this new parametrization is useful in hunting for new physics.
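For reference, the standard Wolfenstein expansion against which the new form is compared is, to order λ³ (textbook form, not quoted from this abstract):

```latex
V_{\mathrm{CKM}} \simeq
\begin{pmatrix}
1-\tfrac{\lambda^{2}}{2} & \lambda & A\lambda^{3}(\rho - i\eta) \\
-\lambda & 1-\tfrac{\lambda^{2}}{2} & A\lambda^{2} \\
A\lambda^{3}(1-\rho - i\eta) & -A\lambda^{2} & 1
\end{pmatrix}
+ \mathcal{O}(\lambda^{4})
```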
Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A
2015-05-01
Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong Yuli; Zou Xubo; Guo Guangcan
We investigate the economical Gaussian cloning of coherent states with known phase, which produces M copies from N input replicas and can be implemented with degenerate parametric amplifiers and beam splitters. The achievable single-copy fidelity is given by $2M\sqrt{N}/[\sqrt{N}(M-1)+\sqrt{(1+N)(M^{2}+N)}]$, which is larger than the optimal fidelity of the universal Gaussian cloning. The cloning machine presented here works without ancillary optical modes and can be regarded as the continuous-variable generalization of the economical cloning machine for qudits.
Parametric Mass Modeling for Mars Entry, Descent and Landing System Analysis Study
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.; Komar, D. R.
2011-01-01
This paper provides an overview of the parametric mass models used for the Entry, Descent, and Landing Systems Analysis study conducted by NASA in FY2009-2010. The study examined eight unique exploration class architectures that included elements such as a rigid mid-L/D aeroshell, a lifting hypersonic inflatable decelerator, a drag supersonic inflatable decelerator, a lifting supersonic inflatable decelerator implemented with a skirt, and subsonic/supersonic retro-propulsion. Parametric models used in this study relate the component mass to vehicle dimensions and mission key environmental parameters such as maximum deceleration and total heat load. The use of a parametric mass model allows the simultaneous optimization of trajectory and mass sizing parameters.
Pretest uncertainty analysis for chemical rocket engine tests
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.
1987-01-01
A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.
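A pretest uncertainty budget of this kind typically propagates measurement uncertainties through Isp = F/(mdot·g0); the sketch below uses hypothetical thrust and flow-rate uncertainties and a root-sum-square combination, not the facility's actual error limits.

```python
# Propagation of hypothetical measurement uncertainties into specific impulse.
import math

g0 = 9.80665                        # standard gravity, m/s^2
F, sigma_F = 50_000.0, 250.0        # thrust (N) and 1-sigma uncertainty (hypothetical)
mdot, sigma_mdot = 11.0, 0.05       # propellant mass flow (kg/s) and uncertainty (hypothetical)

isp = F / (mdot * g0)
# Root-sum-square combination of the independent relative uncertainties.
rel_u = math.sqrt((sigma_F / F) ** 2 + (sigma_mdot / mdot) ** 2)
print(f"Isp = {isp:.1f} s  +/- {100 * rel_u:.2f} %")
# To hold a 1% limit on Isp, the thrust and flow-rate error limits must be
# tightened until rel_u <= 0.01.
```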
A Parametric Rosetta Energy Function Analysis with LK Peptides on SAM Surfaces.
Lubin, Joseph H; Pacella, Michael S; Gray, Jeffrey J
2018-05-08
Although structures have been determined for many soluble proteins and an increasing number of membrane proteins, experimental structure determination methods are limited for complexes of proteins and solid surfaces. An economical alternative or complement to experimental structure determination is molecular simulation. Rosetta is one software suite that models protein-surface interactions, but Rosetta is normally benchmarked on soluble proteins. For surface interactions, the validity of the energy function is uncertain because it is a combination of independent parameters from energy functions developed separately for solution proteins and mineral surfaces. Here, we assess the performance of the RosettaSurface algorithm and test the accuracy of its energy function by modeling the adsorption of leucine/lysine (LK)-repeat peptides on methyl- and carboxy-terminated self-assembled monolayers (SAMs). We investigated how RosettaSurface predictions for this system compare with the experimental results, which showed that on both surfaces, LK-α peptides folded into helices and LK-β peptides held extended structures. Utilizing this model system, we performed a parametric analysis of Rosetta's Talaris energy function and determined that adjusting solvation parameters offered improved predictive accuracy. Simultaneously increasing lysine carbon hydrophilicity and the hydrophobicity of the surface methyl head groups yielded computational predictions most closely matching the experimental results. De novo models still should be interpreted skeptically unless bolstered in an integrative approach with experimental data.
A Semi-parametric Transformation Frailty Model for Semi-competing Risks Survival Data
Jiang, Fei; Haneuse, Sebastien
2016-01-01
In the analysis of semi-competing risks data interest lies in estimation and inference with respect to a so-called non-terminal event, the observation of which is subject to a terminal event. Multi-state models are commonly used to analyse such data, with covariate effects on the transition/intensity functions typically specified via the Cox model and dependence between the non-terminal and terminal events specified, in part, by a unit-specific shared frailty term. To ensure identifiability, the frailties are typically assumed to arise from a parametric distribution, specifically a Gamma distribution with mean 1.0 and variance, say, σ2. When the frailty distribution is misspecified, however, the resulting estimator is not guaranteed to be consistent, with the extent of asymptotic bias depending on the discrepancy between the assumed and true frailty distributions. In this paper, we propose a novel class of transformation models for semi-competing risks analysis that permit the non-parametric specification of the frailty distribution. To ensure identifiability, the class restricts to parametric specifications of the transformation and the error distribution; the latter are flexible, however, and cover a broad range of possible specifications. We also derive the semi-parametric efficient score under the complete data setting and propose a non-parametric score imputation method to handle right censoring; consistency and asymptotic normality of the resulting estimators is derived and small-sample operating characteristics evaluated via simulation. Although the proposed semi-parametric transformation model and non-parametric score imputation method are motivated by the analysis of semi-competing risks data, they are broadly applicable to any analysis of multivariate time-to-event outcomes in which a unit-specific shared frailty is used to account for correlation. Finally, the proposed model and estimation procedures are applied to a study of hospital readmission among patients diagnosed with pancreatic cancer. PMID:28439147
NASA Astrophysics Data System (ADS)
Noh, S. J.; Rakovec, O.; Kumar, R.; Samaniego, L. E.
2015-12-01
Accurate and reliable streamflow prediction is essential to mitigate social and economic damage from water-related disasters such as flood and drought. Sequential data assimilation (DA) may facilitate improved streamflow prediction using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is often ignored, mainly due to practical limitations in specifying modeling uncertainty with limited ensemble members. However, if parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of the model ensemble may be insufficient to capture the dynamics of the observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) could effectively represent and control uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we evaluate the impacts of streamflow data assimilation over European river basins. In particular, a multi-parametric ensemble approach is tested to consider the effects of parametric uncertainty in DA. Because augmentation of parameters is not required within an assimilation window, the approach could be more stable with limited ensemble members and has potential for operational use. To consider the response times and non-Gaussian characteristics of internal hydrologic processes, lagged particle filtering is utilized. The presentation will focus on gains and limitations of streamflow data assimilation and the multi-parametric ensemble method over large-scale basins.
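A minimal sequential importance resampling (SIR) particle filter, of the general family referred to here, is sketched below with a toy storage model and synthetic observations; it is not the mHM/MPR implementation, and the model, noise levels and "gauge" data are hypothetical.

```python
# Bare-bones SIR particle filter for a scalar storage-type state.
import numpy as np

rng = np.random.default_rng(7)
n_particles, n_steps = 500, 60
obs_sigma = 0.3

def model_step(state, forcing):
    """Toy linear-reservoir update; stands in for the hydrologic model."""
    return 0.9 * state + forcing + rng.normal(0, 0.1, state.shape)

truth, obs = 1.0, []
states = rng.normal(1.0, 0.5, n_particles)
for t in range(n_steps):
    forcing = 0.2 * np.sin(t / 5.0)
    truth = 0.9 * truth + forcing
    obs.append(truth + rng.normal(0, obs_sigma))      # synthetic streamflow observation

    states = model_step(states, forcing)
    # Gaussian likelihood of each particle given the latest observation.
    w = np.exp(-0.5 * ((obs[-1] - states) / obs_sigma) ** 2)
    w /= w.sum()
    # Resample particles in proportion to their weights (the "SIR" step).
    states = states[rng.choice(n_particles, size=n_particles, p=w)]

print("final analysis mean vs truth:", round(float(states.mean()), 3), round(truth, 3))
```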
NASA Astrophysics Data System (ADS)
Kazmi, K. R.; Khan, F. A.
2008-01-01
In this paper, using the proximal-point mapping technique of P-η-accretive mappings and the property of the fixed-point set of set-valued contractive mappings, we study the behavior and sensitivity analysis of the solution set of a parametric generalized implicit quasi-variational-like inclusion involving a P-η-accretive mapping in a real uniformly smooth Banach space. Further, under suitable conditions, we discuss the Lipschitz continuity of the solution set with respect to the parameter. The technique and results presented in this paper can be viewed as an extension of the techniques and corresponding results given in [R.P. Agarwal, Y.-J. Cho, N.-J. Huang, Sensitivity analysis for strongly nonlinear quasi-variational inclusions, Appl. Math. Lett. 13 (2002) 19-24; S. Dafermos, Sensitivity analysis in variational inequalities, Math. Oper. Res. 13 (1988) 421-434; X.-P. Ding, Sensitivity analysis for generalized nonlinear implicit quasi-variational inclusions, Appl. Math. Lett. 17 (2) (2004) 225-235; X.-P. Ding, Parametric completely generalized mixed implicit quasi-variational inclusions involving h-maximal monotone mappings, J. Comput. Appl. Math. 182 (2) (2005) 252-269; X.-P. Ding, C.L. Luo, On parametric generalized quasi-variational inequalities, J. Optim. Theory Appl. 100 (1999) 195-205; Z. Liu, L. Debnath, S.M. Kang, J.S. Ume, Sensitivity analysis for parametric completely generalized nonlinear implicit quasi-variational inclusions, J. Math. Anal. Appl. 277 (1) (2003) 142-154; R.N. Mukherjee, H.L. Verma, Sensitivity analysis of generalized variational inequalities, J. Math. Anal. Appl. 167 (1992) 299-304; M.A. Noor, Sensitivity analysis framework for general quasi-variational inclusions, Comput. Math. Appl. 44 (2002) 1175-1181; M.A. Noor, Sensitivity analysis for quasivariational inclusions, J. Math. Anal. Appl. 236 (1999) 290-299; J.Y. Park, J.U. Jeong, Parametric generalized mixed variational inequalities, Appl. Math. Lett. 17 (2004) 43-48].
Ng, S K; McLachlan, G J
2003-04-15
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright 2003 John Wiley & Sons, Ltd.
Kramer, Gerbrand Maria; Frings, Virginie; Heijtel, Dennis; Smit, E F; Hoekstra, Otto S; Boellaard, Ronald
2017-06-01
The objective of this study was to validate several parametric methods for quantification of 3'-deoxy-3'-18F-fluorothymidine (18F-FLT) PET in advanced-stage non-small cell lung carcinoma (NSCLC) patients with an activating epidermal growth factor receptor mutation who were treated with gefitinib or erlotinib. Furthermore, we evaluated the impact of noise on accuracy and precision of the parametric analyses of dynamic 18F-FLT PET/CT to assess the robustness of these methods. Methods: Ten NSCLC patients underwent dynamic 18F-FLT PET/CT at baseline and 7 and 28 d after the start of treatment. Parametric images were generated using plasma input Logan graphic analysis and 2 basis functions-based methods: a 2-tissue-compartment basis function model (BFM) and spectral analysis (SA). Whole-tumor-averaged parametric pharmacokinetic parameters were compared with those obtained by nonlinear regression of the tumor time-activity curve using a reversible 2-tissue-compartment model with blood volume fraction. In addition, 2 statistically equivalent datasets were generated by countwise splitting the original list-mode data, each containing 50% of the total counts. Both new datasets were reconstructed, and parametric pharmacokinetic parameters were compared between the 2 replicates and the original data. Results: After the settings of each parametric method were optimized, distribution volumes (VT) obtained with Logan graphic analysis, BFM, and SA all correlated well with those derived using nonlinear regression at baseline and during therapy (R2 ≥ 0.94; intraclass correlation coefficient > 0.97). SA-based VT images were most robust to increased noise on a voxel level (repeatability coefficient, 16% vs. >26%). Yet BFM generated the most accurate K1 values (R2 = 0.94; intraclass correlation coefficient, 0.96). Parametric K1 data showed a larger variability in general; however, no differences were found in robustness between methods (repeatability coefficient, 80%-84%). Conclusion: Both BFM and SA can generate quantitatively accurate parametric 18F-FLT VT images in NSCLC patients before and during therapy. SA was more robust to noise, yet BFM provided more accurate parametric K1 data. We therefore recommend BFM as the preferred parametric method for analysis of dynamic 18F-FLT PET/CT studies; however, SA can also be used. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
Parametric analysis of ATM solar array.
NASA Technical Reports Server (NTRS)
Singh, B. K.; Adkisson, W. B.
1973-01-01
The paper discusses the methods used for the calculation of ATM solar array performance characteristics and provides the parametric analysis of solar panels used in SKYLAB. To predict the solar array performance under conditions other than test conditions, a mathematical model has been developed. Four computer programs have been used to convert the solar simulator test data to the parametric curves. The first performs module summations, the second determines average solar cell characteristics which will cause a mathematical model to generate a curve matching the test data, the third is a polynomial fit program which determines the polynomial equations for the solar cell characteristics versus temperature, and the fourth program uses the polynomial coefficients generated by the polynomial curve fit program to generate the parametric data.
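The polynomial-fit step can be illustrated with numpy.polyfit; the temperature and cell-voltage samples below are hypothetical, not SKYLAB/ATM test data.

```python
# Fitting a solar-cell parameter (e.g. open-circuit voltage) against temperature
# and evaluating the resulting parametric curve. Sample points are hypothetical.
import numpy as np

temp_c = np.array([-50.0, -20.0, 0.0, 25.0, 60.0, 100.0])
voc_v = np.array([0.72, 0.66, 0.62, 0.58, 0.51, 0.43])    # hypothetical cell Voc

coeffs = np.polyfit(temp_c, voc_v, deg=2)      # polynomial coefficients vs temperature
voc_at_80c = np.polyval(coeffs, 80.0)          # evaluate the parametric curve at 80 C
print("fit coefficients:", np.round(coeffs, 6))
print("Voc(80 C) ~", round(float(voc_at_80c), 3), "V")
```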
The parametric resonance—from LEGO Mindstorms to cold atoms
NASA Astrophysics Data System (ADS)
Kawalec, Tomasz; Sierant, Aleksandra
2017-07-01
We show an experimental setup based on a popular LEGO Mindstorms set, allowing us to both observe and investigate the parametric resonance phenomenon. The presented method is simple but covers a variety of student activities like embedded software development, conducting measurements, data collection and analysis. It may be used during science shows, as part of student projects and to illustrate parametric resonance in mechanics or even quantum physics, during lectures or classes. The parametrically driven LEGO pendulum gains energy in a spectacular way, increasing its amplitude from 10° to about 100° within a few tens of seconds. We also provide a short description of a wireless absolute orientation sensor that may be used in quantitative analysis of driven or free pendulum movement.
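A numerical companion to the demonstration is easy to set up: integrating a pendulum whose pivot oscillates vertically at about twice the natural frequency shows the amplitude growth characteristic of parametric resonance. The parameter values below are illustrative, not measurements from the LEGO rig.

```python
# Parametrically driven pendulum (vertically oscillating pivot) integrated with SciPy.
import numpy as np
from scipy.integrate import solve_ivp

g, L, b = 9.81, 0.30, 0.05        # gravity, pendulum length (m), damping coefficient
a = 0.02                          # pivot oscillation amplitude (m), illustrative
w0 = np.sqrt(g / L)               # small-angle natural frequency
w_drive = 2.0 * w0                # parametric resonance condition: drive near 2*w0

def rhs(t, y):
    theta, omega = y
    # Effective gravity is modulated by the pivot acceleration a*w^2*cos(w*t).
    g_eff = g + a * w_drive**2 * np.cos(w_drive * t)
    return [omega, -b * omega - (g_eff / L) * np.sin(theta)]

sol = solve_ivp(rhs, (0.0, 60.0), [np.radians(10.0), 0.0], max_step=0.01)
peak_deg = np.degrees(np.max(np.abs(sol.y[0])))
print(f"initial amplitude 10 deg, maximum amplitude reached: {peak_deg:.1f} deg")
```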
Investigating Sustainability Impacts of Bioenergy Usage Within the Eisenwurzen Region in Austria
NASA Astrophysics Data System (ADS)
Putzhuber, F.; Hasenauer, H.
2009-04-01
Within the past few years sustainability and bioenergy usage have become key terms in emphasizing the relationship between economic progress and the protection of the environment. One key difficulty is the definition of criteria and indicators for assessing sustainability issues and their change over time. This work introduces methods to create linear parametric models of the sustainability impact issues relevant to the establishment of new bio-energetic heating systems. Our application example is the Eisenwurzen region in Austria. The total area covers 5743 km² and includes 99 municipalities. A total of 11 impact issues covering the economic, social and environmental areas are proposed for developing the linear parametric models. The indicator selection for deriving the impact issues is based on public official data from 68 indicators, as well as stakeholder interviews and the impact assessment framework. In total we obtained 415 variables from the 99 municipalities to create the 68 indicators for the Local Administration Unit 2 (LAU2) over the last (if available) 25 years. The 68 indicators are on a relative scale to address the size differences of the municipalities. The idea of the analysis is to create linear models which derive the 11 defined impact issues related to the establishment of new bio-energetic heating systems. Each analysis follows a strict statistical procedure based on (i) independent indicator selection, (ii) removal of indicators with a VIF value greater than 6, (iii) removal of indicators with α higher than 0.05, (iv) possible linear transformation, (v) removal of non-significant indicators (p-value > 0.05), (vi) model evaluation, (vii) removal of outliers and (viii) a Kolmogorov-Smirnov test of the normal distribution of the residuals. The results suggest that for the 11 sustainability impact issues, 21 of the 68 indicators are significant drivers. The models revealed that it is possible to create tools for assessing impact issues at the municipality level, in this case impact issues related to bio-energy usage in a rural mountain region.
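Step (ii) of the screening procedure, dropping collinear indicators by their variance inflation factor, can be sketched with statsmodels; the indicator matrix below is synthetic, not the Eisenwurzen data.

```python
# Iteratively drop indicators whose VIF exceeds a threshold (here 6, as in the study).
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
X = pd.DataFrame(rng.normal(size=(99, 5)), columns=[f"ind_{i}" for i in range(5)])
X["ind_5"] = X["ind_0"] * 0.9 + rng.normal(0, 0.1, 99)   # nearly collinear indicator

def drop_high_vif(df, threshold=6.0):
    df = df.copy()
    while True:
        vifs = pd.Series(
            [variance_inflation_factor(df.values, i) for i in range(df.shape[1])],
            index=df.columns)
        if vifs.max() <= threshold:
            return df, vifs
        df = df.drop(columns=vifs.idxmax())   # remove the worst offender and repeat

kept, vifs = drop_high_vif(X)
print("retained indicators:", list(kept.columns))
```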
Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.
Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben
2017-06-06
Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns arising in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in omics data for studying their association with disease and health.
Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette
2014-08-15
The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques and HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other, invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.
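The non-parametric inter-method correlation step corresponds to a rank test such as Spearman's; a sketch with hypothetical paired concentrations follows.

```python
# Spearman rank correlation between concentrations reported by two methods.
# The paired values below are hypothetical, not the study's measurements.
import numpy as np
from scipy.stats import spearmanr

hplc = np.array([0.52, 1.01, 2.05, 3.98, 7.95, 16.1])      # reference assay (mg/mL)
raman = np.array([0.49, 1.05, 1.98, 4.10, 8.02, 15.8])     # non-invasive assay (mg/mL)

rho, p_value = spearmanr(hplc, raman)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
```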
The Problem of Size in Robust Design
NASA Technical Reports Server (NTRS)
Koch, Patrick N.; Allen, Janet K.; Mistree, Farrokh; Mavris, Dimitri
1997-01-01
To facilitate the effective solution of multidisciplinary, multiobjective complex design problems, a departure from the traditional parametric design analysis and single objective optimization approaches is necessary in the preliminary stages of design. A necessary tradeoff becomes one of efficiency vs. accuracy as approximate models are sought to allow fast analysis and effective exploration of a preliminary design space. In this paper we apply a general robust design approach for efficient and comprehensive preliminary design to a large complex system: a high speed civil transport (HSCT) aircraft. Specifically, we investigate the HSCT wing configuration design, incorporating life cycle economic uncertainties to identify economically robust solutions. The approach is built on the foundation of statistical experimentation and modeling techniques and robust design principles, and is specialized through incorporation of the compromise Decision Support Problem for multiobjective design. For large problems however, as in the HSCT example, this robust design approach developed for efficient and comprehensive design breaks down with the problem of size (combinatorial explosion in experimentation and model building with the number of variables), and both efficiency and accuracy are sacrificed. Our focus in this paper is on identifying and discussing the implications and open issues associated with the problem of size for the preliminary design of large complex systems.
Frequency Analysis Using Bootstrap Method and SIR Algorithm for Prevention of Natural Disasters
NASA Astrophysics Data System (ADS)
Kim, T.; Kim, Y. S.
2017-12-01
The frequency analysis of hydrometeorological data is one of the most important factors in responding to natural disaster damage and in setting design standards for disaster prevention facilities. Frequency analysis of hydrometeorological data assumes that the observations have statistical stationarity, and a parametric method considering the parameters of a probability distribution is applied. For a parametric method, it is necessary to collect sufficient reliable data; however, snowfall observations in Korea need to be supplemented because the number of days with snowfall observations and the mean maximum daily snowfall depth are decreasing due to climate change. In this study, we conducted a frequency analysis for snowfall using the bootstrap method and the SIR algorithm, resampling methods that can overcome the problem of insufficient data. For the 58 meteorological stations distributed evenly in Korea, the probability of snowfall depth was estimated by non-parametric frequency analysis using the maximum daily snowfall depth data. The results show that the probabilistic daily snowfall depth obtained by frequency analysis decreases at most stations, and the rates of change at most stations were found to be consistent between the parametric and non-parametric frequency analyses. This study shows that resampling methods can support frequency analysis of snowfall depth with insufficient observed samples, which can be applied to the interpretation of other natural disasters such as summer typhoons with seasonal characteristics. Acknowledgment: This research was supported by a grant (MPSS-NH-2015-79) from the Disaster Prediction and Mitigation Technology Development Program funded by the Korean Ministry of Public Safety and Security (MPSS).
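One way to combine a parametric fit with bootstrap uncertainty for a return level is sketched below with SciPy's GEV distribution and a synthetic annual-maximum series; it illustrates the general idea, not the authors' exact procedure.

```python
# GEV-based return level with a bootstrap confidence interval for annual maximum
# daily snowfall depth. The annual-maximum series below is synthetic, not station data.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)
annual_max_cm = genextreme.rvs(c=-0.1, loc=20, scale=8, size=40, random_state=1)
T = 50                                            # return period in years

def return_level(sample, T):
    c, loc, scale = genextreme.fit(sample)
    return genextreme.isf(1.0 / T, c, loc=loc, scale=scale)

boot = [return_level(rng.choice(annual_max_cm, annual_max_cm.size, replace=True), T)
        for _ in range(500)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"{T}-year snowfall: {return_level(annual_max_cm, T):.1f} cm "
      f"(bootstrap 95% CI {lo:.1f}-{hi:.1f} cm)")
```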
Power flow analysis of two coupled plates with arbitrary characteristics
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1990-01-01
In the last progress report (Feb. 1988) some results were presented for a parametric analysis of the vibrational power flow between two coupled plate structures using the mobility power flow approach. The results reported then were for changes in the structural parameters of the two plates, but with the two plates identical in their structural characteristics. Here, this limitation is removed. The vibrational power input and output are evaluated for different values of the structural damping loss factor for the source and receiver plates. In performing this parametric analysis, the source plate characteristics are kept constant. The purpose of this parametric analysis is to determine the most critical parameters that influence the flow of vibrational power from the source plate to the receiver plate. In the case of the structural damping parametric analysis, the influence of changes in the source plate damping is also investigated. The results obtained from the mobility power flow approach are compared to results obtained using a statistical energy analysis (SEA) approach. The significance of the power flow results is discussed, together with a comparison between the SEA results and the mobility power flow results. Furthermore, the benefits derived from using the mobility power flow approach are examined.
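For context, the SEA comparison rests on the standard two-subsystem power-balance relations (textbook SEA, with the usual loss factors η, coupling loss factors η12 and η21, modal densities n and subsystem energies E; this is not the paper's mobility formulation):

```latex
P_{1,\mathrm{in}} = \omega\left(\eta_{1}E_{1} + \eta_{12}E_{1} - \eta_{21}E_{2}\right),
\qquad
P_{2,\mathrm{in}} = \omega\left(\eta_{2}E_{2} + \eta_{21}E_{2} - \eta_{12}E_{1}\right),
\qquad
n_{1}\eta_{12} = n_{2}\eta_{21}.
```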
Multi-parametric centrality method for graph network models
NASA Astrophysics Data System (ADS)
Ivanov, Sergei Evgenievich; Gorlushkina, Natalia Nikolaevna; Ivanova, Lubov Nikolaevna
2018-04-01
Graph network models are investigated to determine the centrality, weights and significance of vertices. Typical centrality analysis applies a method based on any single property of the graph vertices. In graph theory, centrality is analyzed in terms of degree, closeness, betweenness, radiality, eccentricity, page-rank, status, Katz centrality and the eigenvector. We propose a new method of multi-parametric centrality, which includes a number of basic properties of a network member simultaneously. The mathematical model of the multi-parametric centrality method is developed. The results of the presented method are compared with those of the standard centrality methods. To evaluate the results of the multi-parametric centrality method, a graph model with hundreds of vertices is analyzed. The comparative analysis showed the accuracy of the presented method, which simultaneously includes a number of basic properties of the vertices.
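A weighted combination of standard centrality measures conveys the idea; the sketch below uses networkx with equal weights on Zachary's karate-club graph and does not reproduce the paper's exact weighting scheme.

```python
# Multi-parametric centrality as a weighted sum of standard centrality measures.
import networkx as nx

G = nx.karate_club_graph()
measures = {
    "degree": nx.degree_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
}
weights = {"degree": 0.25, "closeness": 0.25, "betweenness": 0.25, "eigenvector": 0.25}

multi = {v: sum(weights[m] * measures[m][v] for m in measures) for v in G.nodes}
top = sorted(multi, key=multi.get, reverse=True)[:5]
print("top-5 vertices by multi-parametric centrality:", top)
```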
Winfield, Jessica M.; Payne, Geoffrey S.; Weller, Alex; deSouza, Nandita M.
2016-01-01
Abstract Multi-parametric magnetic resonance imaging (mpMRI) offers a unique insight into tumor biology by combining functional MRI techniques that inform on cellularity (diffusion-weighted MRI), vascular properties (dynamic contrast-enhanced MRI), and metabolites (magnetic resonance spectroscopy) and has scope to provide valuable information for prognostication and response assessment. Challenges in the application of mpMRI in the clinic include the technical considerations in acquiring good quality functional MRI data, development of robust techniques for analysis, and clinical interpretation of the results. This article summarizes the technical challenges in acquisition and analysis of multi-parametric MRI data before reviewing the key applications of multi-parametric MRI in clinical research and practice. PMID:27748710
A Parametric Analysis of HELSTAR
1983-12-01
AFIT/GSO/OS/83D-7. A Parametric Analysis of HELSTAR. Thesis, James Miklasevich, Captain, USAF. (Recoverable table-of-contents entries: Statement of Problem; Objectives of the Research; Launch Scenarios; Launch Sequence 1; Launch Sequence 2; ...)
A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods
ERIC Educational Resources Information Center
Ritter, Nicola L.
2012-01-01
Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…
NASA Technical Reports Server (NTRS)
Shaw, Eric J.
2001-01-01
This paper will report on the activities of the IAA Launcher Systems Economics Working Group in preparations for its Launcher Systems Development Cost Behavior Study. The Study goals include: improve launcher system and other space system parametric cost analysis accuracy; improve launcher system and other space system cost analysis credibility; and provide launcher system and technology development program managers and other decisionmakers with useful information on development cost impacts of their decisions. The Working Group plans to explore at least the following five areas in the Study: define and explain development cost behavior terms and concepts for use in the Study; identify and quantify sources of development cost and cost estimating uncertainty; identify and quantify significant influences on development cost behavior; identify common barriers to development cost understanding and reduction; and recommend practical, realistic strategies to accomplish reductions in launcher system development cost.
NASA Astrophysics Data System (ADS)
Etemadi, Halimeh; Samadi, S. Zahra; Sharifikia, Mohammad; Smoak, Joseph M.
2016-10-01
Mangrove wetlands exist in the transition zone between terrestrial and marine environments and have remarkable ecological and socio-economic value. This study uses climate change downscaling to address the question of non-stationarity influences on mangrove variations (expansion and contraction) within an arid coastal region. Our two-step approach includes downscaling models and uncertainty assessment, followed by a non-stationary and trend procedure using the Extreme Value Analysis (extRemes code). The Long Ashton Research Station Weather Generator (LARS-WG) model, along with two different general circulation models (GCMs) (MIRH and HadCM3), was used to downscale climatic variables during current (1968-2011) and future (2011-2030, 2045-2065, and 2080-2099) periods. Parametric and non-parametric bootstrapping uncertainty tests demonstrated that the LARS-WG model skillfully downscaled climatic variables at the 95% significance level. Downscaling results using the MIRH model show that minimum and maximum temperatures will increase in the future (2011-2030, 2045-2065, and 2080-2099) during winter and summer in a range of +4.21 and +4.7 °C, and +3.62 and +3.55 °C, respectively. HadCM3 analysis also revealed an increase in minimum (~+3.03 °C) and maximum (~+3.3 °C) temperatures during wet and dry seasons. In addition, we examined how much mangrove area has changed during the past decades and, thus, whether climate change non-stationarity impacts mangrove ecosystems. Our results using remote sensing techniques and the non-parametric Mann-Whitney two-sample test indicated a sharp decline in mangrove area during the 1972, 1987, and 1997 periods (p-value = 0.002). Non-stationary assessment using the generalized extreme value (GEV) distributions, by including mangrove area as a covariate, further indicated that the null hypothesis of a stationary climate (no trend) should be rejected due to the very low p-values for precipitation (p-value = 0.0027), minimum (p-value = 0.000000029) and maximum (p-value = 0.00016) temperatures. Based on the non-stationary analysis and an upward trend in downscaled temperature extremes, climate change may control mangrove development in the future.
Parametrically excited non-linear multidegree-of-freedom systems with repeated natural frequencies
NASA Astrophysics Data System (ADS)
Tezak, E. G.; Nayfeh, A. H.; Mook, D. T.
1982-12-01
A method for analyzing multidegree-of-freedom systems having a repeated natural frequency subjected to a parametric excitation is presented. Attention is given to the ordering of the various terms (linear and non-linear) in the governing equations. The analysis is based on the method of multiple scales. As a numerical example involving a parametric resonance, panel flutter is discussed in detail in order to illustrate the type of results one can expect to obtain with this analysis. Some of the analytical results are verified by a numerical integration of the governing equations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1984-08-01
The initial objective of this work was to develop a methodology for analyzing the impact of technological advances as a tool to help establish priorities for R and D options in the field of biocatalysis. As an example of a biocatalyzed process, butanol/acetone fermentation (ABE process) was selected as the specific topic of study. A base case model characterizing the technology and economics associated with the ABE process was developed in the previous first phase of study. The project objectives were broadened in this second phase of work to provide parametric estimates of the economic and energy impacts of a variety of research advances in the hydrolysis, fermentation and purification sections of the process. The research advances analyzed in this study were based on a comprehensive literature review. The six process options analyzed were: continuous ABE fermentation; vacuum ABE fermentation; Baelene solvent extraction; HRI's Lignol process; improved prehydrolysis/dual enzyme hydrolysis; and improved microorganism tolerance to butanol toxicity. Of the six options analyzed, only improved microorganism tolerance to butanol toxicity had a significant positive effect on energy efficiency and economics. This particular process option reduced the base case production cost (including 10% DCF return) by 20% and energy consumption by 16%. Figures and tables.
Belcour, Laurent; Pacanowski, Romain; Delahaie, Marion; Laville-Geay, Aude; Eupherte, Laure
2014-12-01
We compare the performance of various analytical retroreflecting bidirectional reflectance distribution function (BRDF) models to assess how they reproduce accurately measured data of retroreflecting materials. We introduce a new parametrization, the back vector parametrization, to analyze retroreflecting data, and we show that this parametrization better preserves the isotropy of data. Furthermore, we update existing BRDF models to improve the representation of retroreflective data.
Coupled parametric design of flow control and duct shape
NASA Technical Reports Server (NTRS)
Florea, Razvan (Inventor); Bertuccioli, Luca (Inventor)
2009-01-01
A method for designing gas turbine engine components using a coupled parametric analysis of part geometry and flow control is disclosed. Included are the steps of parametrically defining the geometry of the duct wall shape, parametrically defining one or more flow control actuators in the duct wall, measuring a plurality of performance parameters or metrics (e.g., flow characteristics) of the duct and comparing the results of the measurement with desired or target parameters, and selecting the optimal duct geometry and flow control for at least a portion of the duct, the selection process including evaluating the plurality of performance metrics in a Pareto analysis. The use of this method in the design of inter-turbine transition ducts, serpentine ducts, inlets, diffusers, and similar components provides a design which reduces pressure losses and flow profile distortions.
Economic opportunity in Mexico and return migration from the United States.
Lindstrom, D P
1996-08-01
I analyze the influence of the economic characteristics of origin area on trip duration for Mexican migrants in the United States. I argue that migrants from economically dynamic areas in Mexico with favorable opportunities for employment and small capital investment have a larger incentive to stay in the United States longer and to withstand the psychic costs of separation from family and friends than do migrants from economically stagnant areas in Mexico, where the productive uses of savings are severely limited. In line with this argument we should expect investment opportunities in migrants' origin areas to be associated positively with migrants' trip duration in the United States. To test this hypothesis I use individual- and household-level data on U.S. migration experience collected in 13 Mexican communities. Evidence from parametric hazards models supports the idea that economic characteristics of origin areas influence the motivations and strategies of Mexican migrants in the United States.
Preliminary design study of advanced multistage axial flow core compressors
NASA Technical Reports Server (NTRS)
Wisler, D. C.; Koch, C. C.; Smith, L. H., Jr.
1977-01-01
A preliminary design study was conducted to identify an advanced core compressor for use in new high-bypass-ratio turbofan engines to be introduced into commercial service in the 1980's. An evaluation of anticipated compressor and related component 1985 state-of-the-art technology was conducted. A parametric screening study covering a large number of compressor designs was conducted to determine the influence of the major compressor design features on efficiency, weight, cost, blade life, aircraft direct operating cost, and fuel usage. The trends observed in the parametric screening study were used to develop three high-efficiency, high-economic-payoff compressor designs. These three compressors were studied in greater detail to better evaluate their aerodynamic and mechanical feasibility.
NASA Astrophysics Data System (ADS)
Hastuti, S.; Harijono; Murtini, E. S.; Fibrianto, K.
2018-03-01
This study investigates the use of parametric and non-parametric approaches for the sensory RATA (Rate-All-That-Apply) method. Ledre, a unique local food product of Bojonegoro, was used as the point of interest, and 319 panelists were involved in the study. The results showed that ledre is characterized by an easily crushed texture, stickiness in the mouth, a stingy sensation and ease of swallowing. It also has a strong banana flavour and a brown colour. Compared to eggroll and semprong, ledre shows more variation in taste as well as in roll length. As the RATA questionnaire is designed to collect categorical data, a non-parametric approach is the common statistical procedure. However, similar results were obtained with the parametric approach, despite the non-normal distribution of the data. This suggests that a parametric approach can be applicable to consumer studies with a large number of respondents, even though the data may not satisfy the assumptions of ANOVA (Analysis of Variance).
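A comparison of this kind can be sketched with standard routines; the following minimal Python example (using made-up ratings, not the ledre data) contrasts one-way ANOVA with its non-parametric counterpart, the Kruskal-Wallis test, on a single RATA attribute.

# Sketch: parametric (ANOVA) vs non-parametric (Kruskal-Wallis) analysis of one
# RATA attribute rated for three products. The ratings below are made up for
# illustration; they are not the ledre/eggroll/semprong data from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ledre    = rng.integers(0, 4, size=100)   # 0 = not applicable, 1-3 = low/medium/high
eggroll  = rng.integers(0, 3, size=100)
semprong = rng.integers(0, 3, size=100)

f_stat, p_anova = stats.f_oneway(ledre, eggroll, semprong)   # parametric
h_stat, p_kw    = stats.kruskal(ledre, eggroll, semprong)    # non-parametric

print(f"ANOVA:          F = {f_stat:.2f}, p = {p_anova:.4f}")
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.4f}")
# With large panels (n ~ 319 in the study) the two tests typically agree,
# which is the point made in the abstract.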
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-06-01
The conceptual design of an advanced central receiver power system using liquid sodium as a heat transport medium has been completed by a team consisting of the Energy Systems Group (prime contractor), McDonnell Douglas, Stearns-Roger, The University of Houston, and Salt River Project. The purpose of this study was to determine the technical and economic advantages of this concept for commercial-scale power plants. This final report covers all tasks of the project. These tasks were as follows: (1) review and analysis of preliminary specification; (2) parametric analysis; (3) select commercial configuration; (4) commercial plant conceptual design; (5) assessment of commercial plant; (6) advanced central receiver power system development plan; (7) program plan; (8) reports and data; (9) program management; and (10) safety analysis. A programmatic overview of the accomplishments of this program is given. The 100-MW conceptual commercial plant, the 281-MW optimum plant, and the 10-MW pilot plant are described. (WHK)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Looney, J.H.; Im, C.J.
Under the sponsorship of DOE/METC, UCC Research completed a program in 1984 concerned with the development, testing, and manufacture of an ultra-clean coal-water mixture fuel using the UCC two-stage physical beneficiation and coal-water mixture preparation process. Several gallons of ultra-clean coal-water slurry produced at the UCC Research pilot facility were supplied to DOE/METC for combustion testing. The finalization of this project resulted in the presentation of a conceptual design and economic analysis of an ultra-clean coal-water mixture processing facility sufficient in size to continuously supply fuel to a 100 MW turbine power generation system. Upon completion of the above program, it became evident that substantial technological and economic improvement could be realized through further laboratory and engineering investigation of the UCC two-stage physical beneficiation process. Therefore, as an extension to the previous work, the purpose of the present program was to define the relationship between the controlling technical parameters as related to coal-water slurry quality and product price, and to determine the areas of improvement in the existing flow-scheme, associated cost savings, and the overall effect of these savings on final coal-water slurry price. Contents of this report include: (1) introduction; (2) process refinement (improvement of coal beneficiation process, different source coals and related cleanability, dispersants and other additives); (3) coal beneficiation and cost parametrics summary; (4) revised conceptual design and economic analysis; (5) operating and capital cost reduction; (6) conclusion; and (7) appendices. 24 figs., 12 tabs.
Parametric Analysis of Light Truck and Automobile Maintenance
DOT National Transportation Integrated Search
1979-05-01
Utilizing the Automotive and Light Truck Service and Repair Data Base developed in the companion report, parametric analyses were made of the relationships between maintenance costs, scheduled and unscheduled, and vehicle parameters: body class, manufa...
Parametric resonance in the early Universe—a fitting analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Figueroa, Daniel G.; Torrentí, Francisco, E-mail: daniel.figueroa@cern.ch, E-mail: f.torrenti@csic.es
Particle production via parametric resonance in the early Universe is a non-perturbative, non-linear and out-of-equilibrium phenomenon. Although it is a well studied topic, whenever a new scenario exhibits parametric resonance, a full re-analysis is normally required. To avoid this tedious task, many works often present only a simplified linear treatment of the problem. To remedy this in the future, we provide a fitting analysis of parametric resonance through all its relevant stages: initial linear growth, non-linear evolution, and relaxation towards equilibrium. Using lattice simulations in an expanding grid in 3+1 dimensions, we parametrize the dynamics' outcome scanning over the relevant ingredients: role of the oscillatory field, particle coupling strength, initial conditions, and background expansion rate. We emphasize the inaccuracy of the linear calculation of the decay time of the oscillatory field, and propose a more appropriate definition of this scale based on the subsequent non-linear dynamics. We provide simple fits to the relevant time scales and particle energy fractions at each stage. Our fits can be applied to post-inflationary preheating scenarios, where the oscillatory field is the inflaton, or to spectator-field scenarios, where the oscillatory field can be, e.g., a curvaton or the Standard Model Higgs.
Park, Taeyoung; Krafty, Robert T; Sánchez, Alvaro I
2012-07-27
A Poisson regression model with an offset assumes a constant baseline rate after accounting for measured covariates, which may lead to biased estimates of coefficients in an inhomogeneous Poisson process. To correctly estimate the effect of time-dependent covariates, we propose a Poisson change-point regression model with an offset that allows a time-varying baseline rate. When the nonconstant pattern of a log baseline rate is modeled with a nonparametric step function, the resulting semi-parametric model involves a model component of varying dimension and thus requires a sophisticated varying-dimensional inference to obtain correct estimates of model parameters of fixed dimension. To fit the proposed varying-dimensional model, we devise a state-of-the-art MCMC-type algorithm based on partial collapse. The proposed model and methods are used to investigate an association between daily homicide rates in Cali, Colombia and policies that restrict the hours during which the legal sale of alcoholic beverages is permitted. While simultaneously identifying the latent changes in the baseline homicide rate which correspond to the incidence of sociopolitical events, we explore the effect of policies governing the sale of alcohol on homicide rates and seek a policy that balances the economic and cultural dependencies on alcohol sales to the health of the public.
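As a simplified illustration of the modeling idea (not the authors' partially collapsed MCMC sampler, and with a single change point treated as known rather than estimated), a Poisson regression with an offset and a baseline shift can be fitted as follows; all data below are simulated.

# Simplified sketch of a Poisson regression with an offset and a single *known*
# change point in the baseline rate. The paper treats the change points as
# unknown and samples them with a partially collapsed MCMC; that step is omitted.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_days   = 1000
exposure = np.full(n_days, 2.3)                            # e.g. population at risk (millions)
policy   = (np.arange(n_days) % 7 >= 5).astype(float)      # hypothetical restriction indicator
after_cp = (np.arange(n_days) >= 600).astype(float)        # assumed baseline shift at day 600

log_rate = np.log(0.8) + 0.5 * after_cp - 0.3 * policy     # true model used to simulate counts
y = rng.poisson(exposure * np.exp(log_rate))

X = sm.add_constant(np.column_stack([after_cp, policy]))
fit = sm.GLM(y, X, family=sm.families.Poisson(),
             offset=np.log(exposure)).fit()
print(fit.summary())   # the coefficient on `policy` estimates the log rate ratio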
Combined non-parametric and parametric approach for identification of time-variant systems
NASA Astrophysics Data System (ADS)
Dziedziech, Kajetan; Czop, Piotr; Staszewski, Wieslaw J.; Uhl, Tadeusz
2018-03-01
Identification of systems, structures and machines with variable physical parameters is a challenging task especially when time-varying vibration modes are involved. The paper proposes a new combined, two-step - i.e. non-parametric and parametric - modelling approach in order to determine time-varying vibration modes based on input-output measurements. Single-degree-of-freedom (SDOF) vibration modes from multi-degree-of-freedom (MDOF) non-parametric system representation are extracted in the first step with the use of time-frequency wavelet-based filters. The second step involves time-varying parametric representation of extracted modes with the use of recursive linear autoregressive-moving-average with exogenous inputs (ARMAX) models. The combined approach is demonstrated using system identification analysis based on the experimental mass-varying MDOF frame-like structure subjected to random excitation. The results show that the proposed combined method correctly captures the dynamics of the analysed structure, using minimum a priori information on the model.
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates
NASA Technical Reports Server (NTRS)
Peffley, Al F.
1991-01-01
The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs with the parametric cost estimates data. This is accomplished using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated on the proprietary parametric cost model (PCM) with inputs organized by a project WBS. Preliminary life cycle schedules are also included.
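A hedged sketch of the kind of parametric cost-estimating relationship (CER) that underlies such models is shown below; the subsystem masses, costs, and the power-law form are illustrative assumptions, not values from the proprietary PCM mentioned in the abstract.

# Minimal sketch of a parametric cost-estimating relationship (CER):
# cost = a * mass**b, fitted in log-log space to analogous hardware items.
# The masses and costs below are invented for illustration only.
import numpy as np

mass_kg   = np.array([120.0, 310.0, 540.0, 880.0, 1500.0])   # analogous hardware items
cost_musd = np.array([ 14.0,  28.0,  41.0,  60.0,   90.0])

b, log_a = np.polyfit(np.log(mass_kg), np.log(cost_musd), 1)
a = np.exp(log_a)
print(f"CER: cost [$M] ~ {a:.2f} * mass^{b:.2f}")

# Apply the CER to a new design point (e.g. a hypothetical 700 kg tank module).
print(f"Estimated cost of a 700 kg item: {a * 700.0**b:.1f} $M")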
Four photon parametric amplification. [in unbiased Josephson junction
NASA Technical Reports Server (NTRS)
Parrish, P. T.; Feldman, M. J.; Ohta, H.; Chiao, R. Y.
1974-01-01
An analysis is presented describing four-photon parametric amplification in an unbiased Josephson junction. Central to the theory is the model of the Josephson effect as a nonlinear inductance. Linear, small signal analysis is applied to the two-fluid model of the Josephson junction. The gain, gain-bandwidth product, high frequency limit, and effective noise temperature are calculated for a cavity reflection amplifier. The analysis is extended to multiple (series-connected) junctions and subharmonic pumping.
Evaluation of advanced lift concepts and fuel conservative short-haul aircraft, volume 1
NASA Technical Reports Server (NTRS)
Renshaw, J. H.; Bowden, M. K.; Narucki, C. W.; Bennett, J. A.; Smith, P. R.; Ferrill, R. S.; Randall, C. C.; Tibbetts, J. G.; Patterson, R. W.; Meyer, R. T.
1974-01-01
The performance and economics of a twin-engine augmentor wing airplane were evaluated in two phases. Design aspects of the over-the-wing/internally blown flap hybrid, augmentor wing, and mechanical flap aircraft were investigated for 910 m. field length with parametric extension to other field lengths. Fuel savings achievable by application of advanced lift concepts to short-haul aircraft were evaluated and the effect of different field lengths, cruise requirements, and noise levels on fuel consumption and airplane economics at higher fuel prices were determined. Conclusions and recommendations are presented.
Technical and economic assessment of span-distributed loading cargo aircraft concepts
NASA Technical Reports Server (NTRS)
Whitlow, D. H.; Whitner, P. C.
1976-01-01
A preliminary design study of the performance and economics resulting from the application of the distributed load concept to large freighter aircraft was made. The study was limited to configurations having the payload entirely contained in unswept wings of constant chord with conventional tail surfaces supported from the wing by twin booms. A parametric study based on current technology showed that increases in chord had a similar effect on the economics as increases in span. Increases in both span and chord or airplane size had the largest and most favorable effect. At 600,000 lbs payload a configuration was selected and refined to incorporate advanced technology that could be in production by 1990 and compared with a reference conventional airplane having similar technology.
Antle, John M.; Stoorvogel, Jetse J.; Valdivia, Roberto O.
2014-01-01
This article presents conceptual and empirical foundations for new parsimonious simulation models that are being used to assess future food and environmental security of farm populations. The conceptual framework integrates key features of the biophysical and economic processes on which the farming systems are based. The approach represents a methodological advance by coupling important behavioural processes, for example, self-selection in adaptive responses to technological and environmental change, with aggregate processes, such as changes in market supply and demand conditions or environmental conditions such as climate. Suitable biophysical and economic data are a critical limiting factor in modelling these complex systems, particularly for the characterization of out-of-sample counterfactuals in ex ante analyses. Parsimonious, population-based simulation methods are described that exploit available observational, experimental, modelled and expert data. The analysis makes use of a new scenario design concept called representative agricultural pathways. A case study illustrates how these methods can be used to assess food and environmental security. The concluding section addresses generalizations of parametric forms and linkages of regional models to global models. PMID:24535388
Schuitemaker, Alie; van Berckel, Bart N M; Kropholler, Marc A; Veltman, Dick J; Scheltens, Philip; Jonker, Cees; Lammertsma, Adriaan A; Boellaard, Ronald
2007-05-01
(R)-[11C]PK11195 has been used for quantifying cerebral microglial activation in vivo. In previous studies, both plasma input and reference tissue methods have been used, usually in combination with a region of interest (ROI) approach. Definition of ROIs, however, can be laborious and prone to interobserver variation. In addition, results are only obtained for predefined areas and (unexpected) signals in undefined areas may be missed. On the other hand, standard pharmacokinetic models are too sensitive to noise to calculate (R)-[11C]PK11195 binding on a voxel-by-voxel basis. Linearised versions of both plasma input and reference tissue models have been described, and these are more suitable for parametric imaging. The purpose of this study was to compare the performance of these plasma input and reference tissue parametric methods on the outcome of statistical parametric mapping (SPM) analysis of (R)-[11C]PK11195 binding. Dynamic (R)-[11C]PK11195 PET scans with arterial blood sampling were performed in 7 younger and 11 elderly healthy subjects. Parametric images of volume of distribution (Vd) and binding potential (BP) were generated using linearised versions of plasma input (Logan) and reference tissue (Reference Parametric Mapping) models. Images were compared at the group level using SPM with a two-sample t-test per voxel, both with and without proportional scaling. Parametric BP images without scaling provided the most sensitive framework for determining differences in (R)-[11C]PK11195 binding between younger and elderly subjects. Vd images could only demonstrate differences in (R)-[11C]PK11195 binding when analysed with proportional scaling due to intersubject variation in K1/k2 (blood-brain barrier transport and non-specific binding).
Towards the generation of a parametric foot model using principal component analysis: A pilot study.
Scarton, Alessandra; Sawacha, Zimi; Cobelli, Claudio; Li, Xinshan
2016-06-01
There have been many recent developments in patient-specific models with their potential to provide more information on the human pathophysiology and the increase in computational power. However they are not yet successfully applied in a clinical setting. One of the main challenges is the time required for mesh creation, which is difficult to automate. The development of parametric models by means of Principal Component Analysis (PCA) represents an appealing solution. In this study PCA has been applied to the feet of a small cohort of diabetic and healthy subjects, in order to evaluate the possibility of developing parametric foot models, and to use them to identify variations and similarities between the two populations. Both the skin and the first metatarsal bones have been examined. Despite the limited sample of subjects considered in the analysis, the results demonstrated that the method adopted herein constitutes a first step towards the realization of parametric foot models for biomechanical analysis. Furthermore the study showed that the methodology can successfully describe features in the foot, and evaluate differences in the shape of healthy and diabetic subjects. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
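The modeling step can be illustrated with a minimal sketch: assuming registered landmark coordinates for each foot (random stand-ins below, not the study's meshes), principal component analysis yields a small set of shape modes from which new, parametric foot shapes can be generated.

# Sketch of a PCA-based parametric shape model: each foot surface is represented
# by a fixed set of corresponding landmark coordinates, flattened to one row.
# The shapes here are random stand-ins; real use requires registered meshes.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_subjects, n_landmarks = 20, 300
mean_shape = rng.normal(size=n_landmarks * 3)
shapes = mean_shape + 0.05 * rng.normal(size=(n_subjects, n_landmarks * 3))

pca = PCA(n_components=5)
scores = pca.fit_transform(shapes)          # subject-wise shape scores
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))

# A new, synthetic foot is generated by moving along the first shape mode.
new_shape = pca.mean_ + 2.0 * np.sqrt(pca.explained_variance_[0]) * pca.components_[0]
new_shape = new_shape.reshape(n_landmarks, 3)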
NASA Technical Reports Server (NTRS)
1973-01-01
Parametric studies and subsystem comparisons for the orbital radar mapping mission to planet Venus are presented. Launch vehicle requirements and primary orbiter propulsion system requirements are evaluated. The systems parametric analysis indicated that orbit size and orientation interrelated with almost all of the principal spacecraft systems and influenced significantly the definition of orbit insertion propulsion requirements, weight in orbit capability, radar system design, and mapping strategy.
Technical and Economical Feasibility of SSTO and TSTO Launch Vehicles
NASA Astrophysics Data System (ADS)
Lerch, Jens
This paper discusses whether it is more cost-effective to launch to low earth orbit in one or two stages, assuming current or near-future technologies. First, the paper provides an overview of the current state of the launch market and the hurdles to introducing new launch vehicles capable of significantly lowering the cost of access to space and discusses possible routes to solve those problems. It is assumed that reducing the complexity of launchers by reducing the number of stages and engines, and introducing reusability, will result in lower launch costs. A number of operational and historic launch vehicle stages capable of near single stage to orbit (SSTO) performance are presented and the necessary steps to modify them into an expendable SSTO launcher and an optimized two stage to orbit (TSTO) launcher are shown, through parametric analysis. Then a ballistic reentry and recovery system is added to show that reusable SSTO and TSTO vehicles are also within the current state of the art. The development and recurring costs of the SSTO and the TSTO systems are estimated and compared. This analysis shows whether it is more economical to develop and operate expendable or reusable SSTO or TSTO systems under different assumptions for launch rate and initial investment.
Parametric Analysis and Safety Concepts of CWR Track Buckling.
DOT National Transportation Integrated Search
1993-12-01
The report presents a comprehensive study of continuous welded rail (CWR) track buckling strength as influenced by the range of all key parameters such as the lateral, torsional and longitudinal resistance, vehicle loads, etc. The parametric study pr...
Parametric Shape Optimization of Lens-Focused Piezoelectric Ultrasound Transducers.
Thomas, Gilles P L; Chapelon, Jean-Yves; Bera, Jean-Christophe; Lafon, Cyril
2018-05-01
Focused transducers composed of flat piezoelectric ceramic coupled with an acoustic lens present an economical alternative to curved piezoelectric ceramics and are already in use in a variety of fields. Using a displacement/pressure (u/p) mixed finite element formulation combined with parametric level-set functions to implicitly define the boundaries between the materials and the fluid-structure interface, a method to optimize the shape of acoustic lens made of either one or multiple materials is presented. From that method, two 400 kHz focused transducers using acoustic lens were designed and built with different rapid prototyping methods, one of them made with a combination of two materials, and experimental measurements of the pressure field around the focal point are in good agreement with the presented model.
Bridge maintenance to enhance corrosion resistance and performance of steel girder bridges
NASA Astrophysics Data System (ADS)
Moran Yanez, Luis M.
The integrity and efficiency of any national highway system relies on the condition of the various components. Bridges are fundamental elements of a highway system, representing an important investment and a strategic link that facilitates the transport of persons and goods. The cost to rehabilitate or replace a highway bridge represents an important expenditure to the owner, who needs to evaluate the correct time to assume that cost. Among the several factors that affect the condition of steel highway bridges, corrosion is identified as the main problem. In the USA corrosion is the primary cause of structurally deficient steel bridges. The benefits of regular high-pressure superstructure washing and spot painting were evaluated as effective maintenance activities to reduce the corrosion process. The effectiveness of steel girder washing was assessed by developing models of corrosion deterioration of composite steel girders and analyzing steel coupons in the laboratory under atmospheric corrosion for two alternatives: when high-pressure washing was performed and when washing was not considered. The effectiveness of spot painting was assessed by analyzing the corrosion on steel coupons with small areas of damage, both unprotected and protected by spot painting. A parametric analysis of corroded steel girder bridges was performed under two alternatives: (a) when steel bridge girder washing is performed according to a particular frequency, and (b) when no bridge washing is performed to the girders. The reduction of structural capacity was observed for both alternatives along the structure service life, estimated at 100 years. An economic analysis, using the Life-Cycle Cost Analysis method, demonstrated that it is more cost-effective to perform steel girder washing as a scheduled maintenance activity in contrast to the no washing alternative.
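The life-cycle cost comparison can be illustrated with a minimal present-value sketch; the washing cost, frequency, rehabilitation cost and discount rate below are assumed values chosen only to show the discounting mechanics, not results from this work.

# Illustrative life-cycle cost comparison (present value) of scheduled girder
# washing vs. no washing. All costs, the discount rate, and the year in which a
# deficient girder must be rehabilitated are assumed values.
import numpy as np

rate  = 0.03      # assumed real discount rate
years = 100       # service life considered in the study

def present_value(cost, year, r=rate):
    return cost / (1.0 + r) ** year

# Alternative A: wash every 2 years at an assumed $8k per event.
wash_pv = sum(present_value(8_000, t) for t in range(2, years + 1, 2))

# Alternative B: no washing, but faster section loss forces an assumed $1.5M
# rehabilitation at year 55.
rehab_pv = present_value(1_500_000, 55)

print(f"PV of washing programme: ${wash_pv:,.0f}")
print(f"PV of early rehab      : ${rehab_pv:,.0f}")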
NASA Astrophysics Data System (ADS)
Wu, Bing-Fei; Ma, Li-Shan; Perng, Jau-Woei
This study analyzes the absolute stability in P and PD type fuzzy logic control systems with both certain and uncertain linear plants. Stability analysis includes the reference input, actuator gain and interval plant parameters. For certain linear plants, the stability (i.e. the stable equilibriums of error) in P and PD types is analyzed with the Popov or linearization methods under various reference inputs and actuator gains. The steady state errors of fuzzy control systems are also addressed in the parameter plane. The parametric robust Popov criterion for parametric absolute stability based on Lur'e systems is also applied to the stability analysis of P type fuzzy control systems with uncertain plants. The PD type fuzzy logic controller in our approach is a single-input fuzzy logic controller and is transformed into the P type for analysis. In our work, the absolute stability analysis of fuzzy control systems is given with respect to a non-zero reference input and an uncertain linear plant with the parametric robust Popov criterion unlike previous works. Moreover, a fuzzy current controlled RC circuit is designed with PSPICE models. Both numerical and PSPICE simulations are provided to verify the analytical results. Furthermore, the oscillation mechanism in fuzzy control systems is specified with various equilibrium points of view in the simulation example. Finally, the comparisons are also given to show the effectiveness of the analysis method.
Yang, Li; Wang, Guobao; Qi, Jinyi
2016-04-01
Detecting cancerous lesions is a major clinical application of emission tomography. In a previous work, we studied penalized maximum-likelihood (PML) image reconstruction for lesion detection in static PET. Here we extend our theoretical analysis of static PET reconstruction to dynamic PET. We study both the conventional indirect reconstruction and direct reconstruction for Patlak parametric image estimation. In indirect reconstruction, Patlak parametric images are generated by first reconstructing a sequence of dynamic PET images, and then performing Patlak analysis on the time activity curves (TACs) pixel-by-pixel. In direct reconstruction, Patlak parametric images are estimated directly from raw sinogram data by incorporating the Patlak model into the image reconstruction procedure. PML reconstruction is used in both the indirect and direct reconstruction methods. We use a channelized Hotelling observer (CHO) to assess lesion detectability in Patlak parametric images. Simplified expressions for evaluating the lesion detectability have been derived and applied to the selection of the regularization parameter value to maximize detection performance. The proposed method is validated using computer-based Monte Carlo simulations. Good agreements between the theoretical predictions and the Monte Carlo results are observed. Both theoretical predictions and Monte Carlo simulation results show the benefit of the indirect and direct methods under optimized regularization parameters in dynamic PET reconstruction for lesion detection, when compared with the conventional static PET reconstruction.
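For the indirect route, the pixel-wise Patlak step reduces to a linear fit per voxel once the dynamic frames are reconstructed; a minimal sketch with simulated frames, a simulated plasma input, and an assumed equilibration time t* is given below.

# Sketch of the *indirect* Patlak step: given reconstructed dynamic frames, the
# parametric image of the Patlak slope Ki is obtained by a pixel-wise linear fit
# of C_t(t)/Cp(t) against int(Cp)dt / Cp(t) for frames after an assumed t*.
# Frame data and the input function are simulated for illustration only.
import numpy as np

t  = np.array([1, 2, 4, 6, 10, 15, 20, 30, 40, 50, 60], dtype=float)  # frame mid-times (min)
cp = 100.0 * np.exp(-0.1 * t) + 5.0                                    # plasma input (a.u.)
int_cp = np.concatenate([[0.0], np.cumsum(np.diff(t) * 0.5 * (cp[1:] + cp[:-1]))])

rng = np.random.default_rng(3)
ki_true, v0_true = 0.02, 0.4
ct = ki_true * int_cp + v0_true * cp                  # irreversible-tracer approximation
frames = ct[None, :] * np.ones((64 * 64, 1)) + rng.normal(0, 1.0, (64 * 64, len(t)))

late = t >= 15.0                                      # t* assumed at 15 min
x = (int_cp / cp)[late]
y = frames[:, late] / cp[late]                        # shape (n_pixels, n_late_frames)
X = np.column_stack([x, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(X, y.T, rcond=None)        # one least-squares solve for all pixels
ki_image = coef[0].reshape(64, 64)                    # Patlak slope (Ki) parametric image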
Ghaffari, Mahsa; Tangen, Kevin; Alaraj, Ali; Du, Xinjian; Charbel, Fady T; Linninger, Andreas A
2017-12-01
In this paper, we present a novel technique for automatic parametric mesh generation of subject-specific cerebral arterial trees. This technique generates high-quality and anatomically accurate computational meshes for fast blood flow simulations extending the scope of 3D vascular modeling to a large portion of cerebral arterial trees. For this purpose, a parametric meshing procedure was developed to automatically decompose the vascular skeleton, extract geometric features and generate hexahedral meshes using a body-fitted coordinate system that optimally follows the vascular network topology. To validate the anatomical accuracy of the reconstructed vasculature, we performed statistical analysis to quantify the alignment between parametric meshes and raw vascular images using receiver operating characteristic curve. Geometric accuracy evaluation showed an agreement with area under the curves value of 0.87 between the constructed mesh and raw MRA data sets. Parametric meshing yielded on-average, 36.6% and 21.7% orthogonal and equiangular skew quality improvement over the unstructured tetrahedral meshes. The parametric meshing and processing pipeline constitutes an automated technique to reconstruct and simulate blood flow throughout a large portion of the cerebral arterial tree down to the level of pial vessels. This study is the first step towards fast large-scale subject-specific hemodynamic analysis for clinical applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.
2018-03-01
We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
Kargarian-Marvasti, Sadegh; Rimaz, Shahnaz; Abolghasemi, Jamileh; Heydari, Iraj
2017-01-01
The Cox proportional hazards model is the most common method for analyzing the effects of several variables on survival time. However, under certain circumstances, parametric models give more precise estimates for survival data than Cox regression. The purpose of this study was to investigate the comparative performance of Cox and parametric models in a survival analysis of factors affecting the event time of neuropathy in patients with type 2 diabetes. This study included 371 patients with type 2 diabetes without neuropathy who were registered at the Fereydunshahr diabetes clinic. Subjects were followed up for the development of neuropathy between 2006 and March 2016. To investigate the factors influencing the event time of neuropathy, significant variables in the univariate model (P < 0.20) were entered into the multivariate Cox and parametric models (P < 0.05). In addition, the Akaike information criterion (AIC) and the area under ROC curves were used to evaluate the relative goodness of fit of the models and the efficiency of each procedure, respectively. Statistical computing was performed using R software version 3.2.3 (UNIX platforms, Windows and MacOS). Using Kaplan-Meier estimation, the survival time to neuropathy was computed as 76.6 ± 5 months after initial diagnosis of diabetes. After multivariate analysis of the Cox and parametric models, ethnicity, high-density lipoprotein and family history of diabetes were identified as predictors of the event time of neuropathy (P < 0.05). According to the AIC, the log-normal model had the lowest value and was the best-fitting model among the Cox and parametric models. According to the comparison of survival receiver operating characteristic curves, the log-normal model was considered the most efficient and best-fitting model.
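A comparable workflow can be sketched in Python with the lifelines package (the study itself used R); the data below are simulated, the covariates are stand-ins, and the AIC attribute names assume a recent lifelines release.

# Sketch: fit Cox and parametric (AFT) survival models to the same data and
# compare AIC values. The dataset is simulated; it is not the registry data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, WeibullAFTFitter, LogNormalAFTFitter

rng = np.random.default_rng(4)
n = 371
df = pd.DataFrame({
    "hdl": rng.normal(45, 10, n),
    "family_history": rng.integers(0, 2, n),
})
true_t = rng.lognormal(mean=4.3 - 0.01 * df["hdl"] + 0.2 * df["family_history"], sigma=0.6)
cens_t = rng.uniform(12, 120, n)                    # administrative censoring times
df["time"]  = np.minimum(true_t, cens_t)
df["event"] = (true_t <= cens_t).astype(int)

cox = CoxPHFitter().fit(df, duration_col="time", event_col="event")
wei = WeibullAFTFitter().fit(df, duration_col="time", event_col="event")
lgn = LogNormalAFTFitter().fit(df, duration_col="time", event_col="event")

print("Weibull AIC    :", wei.AIC_)
print("Log-normal AIC :", lgn.AIC_)        # lowest AIC -> best-fitting parametric model
print("Cox partial AIC:", cox.AIC_partial_)  # partial-likelihood AIC, indicative only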
Diaby, Vakaramoko; Adunlin, Georges; Montero, Alberto J
2014-02-01
Survival modeling techniques are increasingly being used as part of decision modeling for health economic evaluations. As many models are available, it is imperative for interested readers to know about the steps in selecting and using the most suitable ones. The objective of this paper is to propose a tutorial for the application of appropriate survival modeling techniques to estimate transition probabilities, for use in model-based economic evaluations, in the absence of individual patient data (IPD). An illustration of the use of the tutorial is provided based on the final progression-free survival (PFS) analysis of the BOLERO-2 trial in metastatic breast cancer (mBC). An algorithm was adopted from Guyot and colleagues, and was then run in the statistical package R to reconstruct IPD, based on the final PFS analysis of the BOLERO-2 trial. It should be emphasized that the reconstructed IPD represent an approximation of the original data. Afterwards, we fitted parametric models to the reconstructed IPD in the statistical package Stata. Both statistical and graphical tests were conducted to verify the relative and absolute validity of the findings. Finally, the equations for transition probabilities were derived using the general equation for transition probabilities used in model-based economic evaluations, and the parameters were estimated from fitted distributions. The results of the application of the tutorial suggest that the log-logistic model best fits the reconstructed data from the latest published Kaplan-Meier (KM) curves of the BOLERO-2 trial. Results from the regression analyses were confirmed graphically. An equation for transition probabilities was obtained for each arm of the BOLERO-2 trial. In this paper, a tutorial was proposed and used to estimate the transition probabilities for model-based economic evaluation, based on the results of the final PFS analysis of the BOLERO-2 trial in mBC. The results of our study can serve as a basis for any model (Markov) that needs the parameterization of transition probabilities, and only has summary KM plots available.
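The last step of such a tutorial, turning a fitted parametric survival curve into per-cycle transition probabilities via tp(t) = 1 - S(t)/S(t - u), can be sketched as follows; the log-logistic parameters below are placeholders, not BOLERO-2 estimates.

# Sketch: converting a fitted parametric survival curve into per-cycle
# transition probabilities for a Markov model, using tp(t) = 1 - S(t)/S(t - u).
import numpy as np

alpha, beta = 8.0, 1.6        # assumed log-logistic scale/shape (months)
cycle = 1.0                   # Markov cycle length (months)

def surv(t):
    """Log-logistic survival function S(t) = 1 / (1 + (t/alpha)**beta)."""
    return 1.0 / (1.0 + (t / alpha) ** beta)

t = np.arange(cycle, 36.0 + cycle, cycle)       # cycle end-points over 3 years
tp = 1.0 - surv(t) / surv(t - cycle)            # probability of progressing in each cycle
for ti, pi in zip(t[:5], tp[:5]):
    print(f"cycle ending at {ti:4.1f} months: P(progression) = {pi:.3f}")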
Parametric number covariance in quantum chaotic spectra.
Vinayak; Kumar, Sandeep; Pandey, Akhilesh
2016-03-01
We study spectral parametric correlations in quantum chaotic systems and introduce the number covariance as a measure of such correlations. We derive analytic results for the classical random matrix ensembles using the binary correlation method and obtain compact expressions for the covariance. We illustrate the universality of this measure by presenting the spectral analysis of the quantum kicked rotors for the time-reversal invariant and time-reversal noninvariant cases. A local version of the parametric number variance introduced earlier is also investigated.
Parametric models of reflectance spectra for dyed fabrics
NASA Astrophysics Data System (ADS)
Aiken, Daniel C.; Ramsey, Scott; Mayo, Troy; Lambrakos, Samuel G.; Peak, Joseph
2016-05-01
This study examines parametric modeling of NIR reflectivity spectra for dyed fabrics, which provides for both their inverse and direct modeling. The dye considered for prototype analysis is triarylamine dye. The fabrics considered are camouflage textiles characterized by color variations. The results of this study provide validation of the constructed parametric models, within reasonable error tolerances for practical applications, including NIR spectral characteristics in camouflage textiles, for purposes of simulating NIR spectra corresponding to various dye concentrations in host fabrics, and potentially to mixtures of dyes.
Schwalenberg, Simon
2005-06-01
The present work represents a first attempt to perform computations of output intensity distributions for different parametric holographic scattering patterns. Based on the model for parametric four-wave mixing processes in photorefractive crystals and taking into account realistic material properties, we present computed images of selected scattering patterns. We compare these calculated light distributions to the corresponding experimental observations. Our analysis is especially devoted to dark scattering patterns as they make high demands on the underlying model.
Numerical prediction of 3-D ejector flows
NASA Technical Reports Server (NTRS)
Roberts, D. W.; Paynter, G. C.
1979-01-01
The use of parametric flow analysis, rather than parametric scale testing, to support the design of an ejector system offers a number of potential advantages. The application of available 3-D flow analyses to the design of ejectors can be subdivided into several key elements. These are numerics, turbulence modeling, data handling and display, and testing in support of analysis development. Experimental and predicted jet exhaust flows for the Boeing 727 aircraft are examined.
Crowther, Michael J; Look, Maxime P; Riley, Richard D
2014-09-28
Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Yin, Hui; Yu, Dejie; Yin, Shengwen; Xia, Baizhan
2016-10-01
This paper introduces mixed fuzzy and interval parametric uncertainties into the FE components of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model for mid-frequency analysis of built-up systems, so that an uncertain ensemble combining non-parametric uncertainty with mixed fuzzy and interval parametric uncertainties is obtained. A fuzzy interval Finite Element/Statistical Energy Analysis (FIFE/SEA) framework is proposed to obtain the uncertain responses of built-up systems, which are described as intervals with fuzzy bounds, termed fuzzy-bounded intervals (FBIs) in this paper. Based on the level-cut technique, a first-order fuzzy interval perturbation FE/SEA (FFIPFE/SEA) and a second-order fuzzy interval perturbation FE/SEA method (SFIPFE/SEA) are developed to handle the mixed parametric uncertainties efficiently. FFIPFE/SEA approximates the response functions by the first-order Taylor series, while SFIPFE/SEA improves the accuracy by considering the second-order terms of the Taylor series, in which all the mixed second-order terms are neglected. To further improve the accuracy, a Chebyshev fuzzy interval method (CFIM) is proposed, in which Chebyshev polynomials are used to approximate the response functions. The FBIs are eventually reconstructed by assembling the extrema solutions at all cut levels. Numerical results on two built-up systems verify the effectiveness of the proposed methods.
Simulation of parametric model towards the fixed covariate of right censored lung cancer data
NASA Astrophysics Data System (ADS)
Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila
2017-09-01
In this study, a simulation procedure was applied to assess the fixed covariate of right-censored data using a parametric survival model. The scale and shape parameters were varied to differentiate the analyses of the parametric regression survival model. Statistically, the biases, mean biases and the coverage probability were used in this analysis. Different sample sizes of 50, 100, 150 and 200 were employed to distinguish the impact of the parametric regression model on right-censored data. The R statistical software was used to develop the simulation code for right-censored data. In addition, the final model of the right-censored simulation was compared with right-censored lung cancer data from Malaysia. It was found that different values of the shape and scale parameters, together with different sample sizes, help to improve the simulation strategy for right-censored data, and that the Weibull regression survival model provides a suitable fit for the survival data of lung cancer patients in Malaysia.
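A minimal sketch of this simulation design (in Python rather than R, with arbitrary parameter values and the lifelines Weibull fitter) is shown below; it reports the bias of the shape estimate across the four sample sizes.

# Sketch: generate right-censored Weibull data for several sample sizes, refit a
# Weibull model, and summarise the bias of the shape estimate. Parameter values
# are illustrative, not those of the study (which also reports coverage).
import numpy as np
from lifelines import WeibullFitter

true_lambda, true_rho = 24.0, 1.5     # lifelines parametrisation: S(t) = exp(-(t/lambda)**rho)
rng = np.random.default_rng(5)

for n in (50, 100, 150, 200):
    rho_hat = []
    for _ in range(200):                                   # 200 replications per sample size
        t_event = true_lambda * rng.weibull(true_rho, n)   # latent event times
        t_cens  = rng.uniform(5, 60, n)                    # independent right-censoring times
        obs     = np.minimum(t_event, t_cens)
        event   = (t_event <= t_cens).astype(int)
        wf = WeibullFitter().fit(obs, event_observed=event)
        rho_hat.append(wf.rho_)
    print(f"n={n:3d}  mean bias of shape = {np.mean(rho_hat) - true_rho:+.3f}  "
          f"empirical SD = {np.std(rho_hat):.3f}")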
Chaotic map clustering algorithm for EEG analysis
NASA Astrophysics Data System (ADS)
Bellotti, R.; De Carlo, F.; Stramaglia, S.
2004-03-01
The non-parametric chaotic map clustering algorithm has been applied to the analysis of electroencephalographic signals, in order to recognize the Huntington's disease, one of the most dangerous pathologies of the central nervous system. The performance of the method has been compared with those obtained through parametric algorithms, as K-means and deterministic annealing, and supervised multi-layer perceptron. While supervised neural networks need a training phase, performed by means of data tagged by the genetic test, and the parametric methods require a prior choice of the number of classes to find, the chaotic map clustering gives a natural evidence of the pathological class, without any training or supervision, thus providing a new efficient methodology for the recognition of patterns affected by the Huntington's disease.
Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung
2014-10-01
Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping that produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent of any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly-used compartment models and major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.
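As an illustration of one GA method, a minimal Logan analysis for a reversible tracer is sketched below on a simulated one-tissue time-activity curve; the input function, rate constants and linearity time t* are all assumptions made for the example.

# Sketch of Logan graphical analysis for a reversible tracer: after time t*,
# plotting int(Ct)dt / Ct(t) against int(Cp)dt / Ct(t) gives a line whose slope
# estimates the total volume of distribution VT.
import numpy as np

t  = np.linspace(0.1, 90.0, 400)                             # minutes
cp = 80.0 * np.exp(-0.15 * t) + 4.0 * np.exp(-0.01 * t)      # simulated plasma input (a.u.)

K1, k2 = 0.3, 0.1                                            # one-tissue model, VT = K1/k2 = 3.0
dt = t[1] - t[0]
ct = np.zeros_like(t)
for i in range(1, len(t)):            # simple Euler integration of dCt/dt = K1*Cp - k2*Ct
    ct[i] = ct[i - 1] + dt * (K1 * cp[i - 1] - k2 * ct[i - 1])

int_cp = np.cumsum(cp) * dt
int_ct = np.cumsum(ct) * dt
late = t > 30.0                                              # assumed linear region, t* = 30 min

x = int_cp[late] / ct[late]
y = int_ct[late] / ct[late]
slope, intercept = np.polyfit(x, y, 1)
print(f"Logan VT estimate: {slope:.2f} (true K1/k2 = {K1 / k2:.2f})")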
An appraisal of statistical procedures used in derivation of reference intervals.
Ichihara, Kiyoshi; Boyd, James C
2010-11-01
When conducting studies to derive reference intervals (RIs), various statistical procedures are commonly applied at each step, from the planning stages to final computation of RIs. Determination of the necessary sample size is an important consideration, and evaluation of at least 400 individuals in each subgroup has been recommended to establish reliable common RIs in multicenter studies. Multiple regression analysis allows identification of the most important factors contributing to variation in test results, while accounting for possible confounding relationships among these factors. Of the various approaches proposed for judging the necessity of partitioning reference values, nested analysis of variance (ANOVA) is the likely method of choice owing to its ability to handle multiple groups and being able to adjust for multiple factors. Box-Cox power transformation often has been used to transform data to a Gaussian distribution for parametric computation of RIs. However, this transformation occasionally fails. Therefore, the non-parametric method based on determination of the 2.5 and 97.5 percentiles following sorting of the data, has been recommended for general use. The performance of the Box-Cox transformation can be improved by introducing an additional parameter representing the origin of transformation. In simulations, the confidence intervals (CIs) of reference limits (RLs) calculated by the parametric method were narrower than those calculated by the non-parametric approach. However, the margin of difference was rather small owing to additional variability in parametrically-determined RLs introduced by estimation of parameters for the Box-Cox transformation. The parametric calculation method may have an advantage over the non-parametric method in allowing identification and exclusion of extreme values during RI computation.
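The two computation routes can be sketched side by side; the example below uses simulated right-skewed data and standard SciPy routines (a plain Box-Cox transform, without the extra origin parameter proposed in the paper).

# Sketch comparing non-parametric (percentile) and parametric (Box-Cox + Gaussian)
# derivation of a reference interval. Data are simulated; n = 400 follows the
# recommendation quoted in the abstract.
import numpy as np
from scipy import stats, special

rng = np.random.default_rng(6)
values = rng.lognormal(mean=1.0, sigma=0.4, size=400)      # hypothetical analyte results

# Non-parametric RI: 2.5th and 97.5th percentiles of the sorted data.
np_low, np_high = np.percentile(values, [2.5, 97.5])

# Parametric RI: Box-Cox transform, Gaussian 95% limits, back-transform.
transformed, lam = stats.boxcox(values)
m, s = transformed.mean(), transformed.std(ddof=1)
p_low, p_high = special.inv_boxcox(np.array([m - 1.96 * s, m + 1.96 * s]), lam)

print(f"non-parametric RI: [{np_low:.2f}, {np_high:.2f}]")
print(f"parametric RI    : [{p_low:.2f}, {p_high:.2f}]")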
Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H.; Maurits, Natasha M.
2016-01-01
In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a better insight into tremor. Typically, routine clinical assessment of accelerometry and electromyography data involves visual inspection by clinicians and occasionally computational analysis to obtain objective characteristics of tremor. However, for some tremor disorders these characteristics may be different during daily activity. This variability in presentation between the clinic and daily life makes a differential diagnosis more difficult. A long-term recording of tremor by accelerometry and/or electromyography in the home environment could help to give a better insight into the tremor disorder. However, an evaluation of such recordings using routine clinical standards would take too much time. We evaluated a range of techniques that automatically detect tremor segments in accelerometer data, as accelerometer data is more easily obtained in the home environment than electromyography data. Time can be saved if clinicians only have to evaluate the tremor characteristics of segments that have been automatically detected in longer daily activity recordings. We tested four non-parametric methods and five parametric methods on clinical accelerometer data from 14 patients with different tremor disorders. The consensus between two clinicians regarding the presence or absence of tremor on 3943 segments of accelerometer data was employed as reference. The nine methods were tested against this reference to identify their optimal parameters. Non-parametric methods generally performed better than parametric methods on our dataset when optimal parameters were used. However, one parametric method, employing the high frequency content of the tremor bandwidth under consideration (High Freq) performed similarly to non-parametric methods, but had the highest recall values, suggesting that this method could be employed for automatic tremor detection. PMID:27258018
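A band-power detector in the spirit of the high-frequency-content method can be sketched as follows; the sampling rate, tremor band and decision threshold are illustrative assumptions, not the settings used in the study.

# Sketch of a band-power tremor detector: a 3 s accelerometer segment is flagged
# as tremor when the relative power in an assumed 4-12 Hz band exceeds a threshold.
import numpy as np
from scipy import signal

fs = 100.0                                      # assumed sampling rate (Hz)
t = np.arange(0, 3.0, 1.0 / fs)
rng = np.random.default_rng(7)
segment = 0.4 * np.sin(2 * np.pi * 6.0 * t) + 0.2 * rng.normal(size=t.size)  # 6 Hz "tremor"

f, psd = signal.welch(segment, fs=fs, nperseg=256)
band = (f >= 4.0) & (f <= 12.0)
relative_power = np.trapz(psd[band], f[band]) / np.trapz(psd, f)

is_tremor = relative_power > 0.5                # assumed decision threshold
print(f"relative 4-12 Hz power = {relative_power:.2f}  ->  tremor segment: {is_tremor}")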
Optimization of space manufacturing systems
NASA Technical Reports Server (NTRS)
Akin, D. L.
1979-01-01
Four separate analyses are detailed: transportation to low earth orbit, orbit-to-orbit optimization, parametric analysis of SPS logistics based on earth and lunar source locations, and an overall program option optimization implemented with linear programming. It is found that smaller vehicles are favored for earth launch, with the current Space Shuttle being right at optimum payload size. Fully reusable launch vehicles represent a savings of 50% over the Space Shuttle; increased reliability with less maintenance could further double the savings. An optimization of orbit-to-orbit propulsion systems using lunar oxygen for propellants shows that ion propulsion is preferable by a 3:1 cost margin over a mass driver reaction engine at optimum values; however, ion engines cannot yet operate in the lower exhaust velocity range where the optimum lies, and total program costs between the two systems are ambiguous. Heavier payloads favor the use of a MDRE. A parametric model of a space manufacturing facility is proposed, and used to analyze recurring costs, total costs, and net present value discounted cash flows. Parameters studied include productivity, effects of discounting, materials source tradeoffs, economic viability of closed-cycle habitats, and effects of varying degrees of nonterrestrial SPS materials needed from earth. Finally, candidate optimal scenarios are chosen, and implemented in a linear program with external constraints in order to arrive at an optimum blend of SPS production strategies in order to maximize returns.
Unemployment and subsequent depression: A mediation analysis using the parametric G-formula.
Bijlsma, Maarten J; Tarkiainen, Lasse; Myrskylä, Mikko; Martikainen, Pekka
2017-12-01
The effects of unemployment on depression are difficult to establish because of confounding and limited understanding of the mechanisms at the population level. In particular, due to longitudinal interdependencies between exposures, mediators and outcomes, intermediate confounding is an obstacle for mediation analyses. Using longitudinal Finnish register data on socio-economic characteristics and medication purchases, we extracted individuals who entered the labor market between ages 16 and 25 in the period 1996 to 2001 and followed them until the year 2007 (n = 42,172). With the parametric G-formula we estimated the population-averaged effect on first antidepressant purchase of a simulated intervention which set all unemployed person-years to employed. In the data, 74% of person-years were employed and 8% unemployed, the rest belonging to studying or other status. In the intervention scenario, employment rose to 85% and the hazard of first antidepressant purchase decreased by 7.6%. Of this reduction 61% was mediated, operating primarily through changes in income and household status, while mediation through other health conditions was negligible. These effects were negligible for women and particularly prominent among less educated men. By taking complex interdependencies into account in a framework of observed repeated measures data, we found that eradicating unemployment raises income levels, promotes family formation, and thereby reduces antidepressant consumption at the population-level. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Yadage and Packtivity - analysis preservation using parametrized workflows
NASA Astrophysics Data System (ADS)
Cranmer, Kyle; Heinrich, Lukas
2017-10-01
Preserving data analyses produced by the collaborations at LHC in a parametrized fashion is crucial in order to maintain reproducibility and re-usability. We argue for a declarative description in terms of individual processing steps - “packtivities” - linked through a dynamic directed acyclic graph (DAG) and present an initial set of JSON schemas for such a description and an implementation - “yadage” - capable of executing workflows of analysis preserved via Linux containers.
Climate change and vector-borne diseases: an economic impact analysis of malaria in Africa.
Egbendewe-Mondzozo, Aklesso; Musumba, Mark; McCarl, Bruce A; Wu, Ximing
2011-03-01
A semi-parametric econometric model is used to study the relationship between malaria cases and climatic factors in 25 African countries. Results show that a marginal change in temperature and precipitation levels would lead to a significant change in the number of malaria cases for most countries by the end of the century. Consistent with the existing biophysical malaria model results, the projected effects of climate change are mixed. Our model projects that some countries will see an increase in malaria cases but others will see a decrease. We estimate projected malaria inpatient and outpatient treatment costs as a proportion of annual 2000 health expenditures per 1,000 people. We found that even under minimal climate change scenario, some countries may see their inpatient treatment cost of malaria increase more than 20%.
Modeling integrated water user decisions in intermittent supply systems
NASA Astrophysics Data System (ADS)
Rosenberg, David E.; Tarawneh, Tarek; Abdel-Khaleq, Rania; Lund, Jay R.
2007-07-01
We apply systems analysis to estimate household water use in an intermittent supply system considering numerous interdependent water user behaviors. Some 39 household actions include conservation; improving local storage or water quality; and accessing sources having variable costs, availabilities, reliabilities, and qualities. A stochastic optimization program with recourse decisions identifies the infrastructure investments and short-term coping actions a customer can adopt to cost-effectively respond to a probability distribution of piped water availability. Monte Carlo simulations show effects for a population of customers. Model calibration reproduces the distribution of billed residential water use in Amman, Jordan. Parametric analyses suggest economic and demand responses to increased availability and alternative pricing. They also suggest the potential market penetration for conservation actions, the associated water savings, and the subsidies needed to entice further adoption. We discuss new insights to size, target, and finance conservation.
NASA Technical Reports Server (NTRS)
Bradford, D. F.; Kelejian, H. H.; Brusch, R.; Gross, J.; Fishman, H.; Feenberg, D.
1974-01-01
The value of improving information for forecasting future crop harvests was investigated. Emphasis was placed upon establishing practical evaluation procedures firmly based in economic theory. The analysis was applied to the case of U.S. domestic wheat consumption. Estimates for a cost of storage function and a demand function for wheat were calculated. A model of market determinations of wheat inventories was developed for inventory adjustment. The carry-over horizon is computed by the solution of a nonlinear programming problem, and related variables such as spot and future price at each stage are determined. The model is adaptable to other markets. Results are shown to depend critically on the accuracy of current and proposed measurement techniques. The quantitative results are presented parametrically, in terms of various possible values of current and future accuracies.
NASA Astrophysics Data System (ADS)
Yang, Yang; Peng, Zhike; Dong, Xingjian; Zhang, Wenming; Clifton, David A.
2018-03-01
A challenge in analysing non-stationary multi-component signals is to isolate nonlinearly time-varying components, especially when they overlap in the time-frequency plane. In this paper, a framework integrating time-frequency analysis-based demodulation and a non-parametric Gaussian latent feature model is proposed to isolate and recover the components of such signals. The former aims to remove high-order frequency modulation (FM) so that the latter can infer the demodulated components while simultaneously discovering the number of target components. The proposed method is effective in isolating multiple components that share the same FM behavior. In addition, the results show that the proposed method is superior to a generalised-demodulation method based on singular-value decomposition, a parametric time-frequency analysis method based on filtering, and an empirical mode decomposition-based method in recovering the amplitude and phase of superimposed components.
Network structure of multivariate time series.
Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito
2015-10-21
Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
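A minimal sketch of the series-to-network mapping is given below, assuming the horizontal visibility graph as the per-layer construction (a common choice in this literature) and showing one simple per-layer descriptor; the simulated series and descriptor choice are illustrative.

```python
# Sketch of the time-series-to-network idea: each component of a multivariate
# series is mapped to one layer of a multiplex network. Here the per-layer
# mapping is the horizontal visibility graph (HVG): time indices i and j are
# linked if every value strictly between them is lower than min(x[i], x[j]).
import numpy as np
import networkx as nx

def horizontal_visibility_graph(x):
    n = len(x)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n - 1):
        highest_between = -np.inf          # running max of values between i and j
        for j in range(i + 1, n):
            if highest_between < min(x[i], x[j]):
                g.add_edge(i, j)
            highest_between = max(highest_between, x[j])
            if x[j] >= x[i]:               # nothing further right can "see" i
                break
    return g

rng = np.random.default_rng(1)
series = rng.normal(size=(3, 500))                 # 3-component multivariate series
layers = [horizontal_visibility_graph(x) for x in series]

# Simple structural descriptors per layer (e.g., mean degree) can then be
# compared across layers or tracked over time.
print([2 * g.number_of_edges() / g.number_of_nodes() for g in layers])
```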
BLIND EXTRACTION OF AN EXOPLANETARY SPECTRUM THROUGH INDEPENDENT COMPONENT ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waldmann, I. P.; Tinetti, G.; Hollis, M. D. J.
2013-03-20
Blind-source separation techniques are used to extract the transmission spectrum of the hot-Jupiter HD189733b recorded by the Hubble/NICMOS instrument. Such a 'blind' analysis of the data is based on the concept of independent component analysis. The detrending of Hubble/NICMOS data using the sole assumption that non-Gaussian systematic noise is statistically independent from the desired light-curve signals is presented. By not assuming any prior or auxiliary information but the data themselves, it is shown that spectroscopic errors only about 10%-30% larger than parametric methods can be obtained for 11 spectral bins with bin sizes of approximately 0.09 μm. This represents a reasonable trade-off between a higher degree of objectivity for the non-parametric methods and smaller standard errors for the parametric de-trending. Results are discussed in light of previous analyses published in the literature. The fact that three very different analysis techniques yield comparable spectra is a strong indication of the stability of these results.
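The following is a minimal sketch of ICA-based detrending in the same spirit, using scikit-learn's FastICA on synthetic spectrophotometric light curves; the component indices retained for reconstruction are purely hypothetical, and real analyses select them far more carefully.

```python
# A minimal sketch of ICA-based detrending, assuming `lightcurves` is an
# (n_channels x n_time) array of spectrophotometric time series. FastICA
# separates statistically independent components; the systematic-noise
# components would then be identified (e.g., by correlation with auxiliary
# vectors) and removed before re-projecting. Component selection here is
# purely illustrative.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_channels, n_time = 11, 600
transit = -0.01 * (np.abs(np.arange(n_time) - 300) < 60)      # toy transit signal
systematics = 0.005 * np.sin(np.arange(n_time) / 15.0)        # toy instrumental trend
lightcurves = (1.0 + transit + systematics
               + 0.001 * rng.standard_normal((n_channels, n_time)))

ica = FastICA(n_components=4, random_state=0)
sources = ica.fit_transform(lightcurves.T)        # (n_time x n_components)
mixing = ica.mixing_                              # (n_channels x n_components)

keep = [0, 1, 2]   # hypothetical: indices of astrophysical components to keep
cleaned = sources[:, keep] @ mixing[:, keep].T + ica.mean_   # (n_time x n_channels)
```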
Adam, Stéphane; Bonsang, Eric; Grotz, Catherine; Perelman, Sergio
2013-01-01
This paper investigates the relationship between the concept of activity (including both professional and nonprofessional activity) and cognitive functioning among older European individuals. In this research, we used data collected during the first wave of SHARE (Survey on Health, Ageing and Retirement in Europe), and a measurement approach known as stochastic frontier analysis, derived from the economic literature. SHARE includes a large population (n > 25,000) geographically distributed across Europe, and analyzes several dimensions simultaneously, including physical and mental health activity. The main advantages of stochastic frontier analysis are that it allows estimation of a parametric function relating cognitive scores to driving factors at the boundary, disentangles frontier noise from distance-to-frontier components, and simultaneously tests the effect of potential factors on these distances. The analysis reveals that all activities are positively related to cognitive functioning in elderly people. Our results are discussed in terms of prevention of cognitive aging and Alzheimer's disease, and regarding the potential impact that some retirement programs might have on cognitive functioning in individuals across Europe. PMID:23671387
Craven, Michael P; Allsop, Matthew J; Morgan, Stephen P; Martin, Jennifer L
2012-09-03
With increased governmental interest in value assessment of technologies and where medical device manufacturers are finding it increasingly necessary to become more familiar with economic evaluation methods, the study sought to explore the levels of health economics knowledge within small and medium-sized enterprises (SMEs) and to scope strategies they employ to demonstrate the value of their products to purchasers. A short questionnaire was completed by participants attending one of five workshops on product development in the medical device sector that took place in England between 2007 and 2011. From all responses obtained, a large proportion of participants were based in SMEs (N = 43), and these responses were used for the analysis. Statistical analysis using non-parametric tests was performed on questions with approximately interval scales. Qualitative data from participant responses were analysed to reveal emerging themes. The questionnaire results revealed that 60% of SME participants (mostly company directors or managers, including product or project managers) rated themselves as having low or no knowledge of health economics prior to the workshops but the rest professed at least medium knowledge. Clinical trials and cost analyses or cost-effectiveness studies were the most highly cited means by which SMEs aim to demonstrate value of products to purchasers. Purchasers were perceived to place most importance on factors of safety, expert opinion, cost-effectiveness and price. However many companies did not utilise formal decision-making tools to prioritise these factors. There was no significant dependence of the use of decision-making tools in general with respect to professed knowledge of health economics methods. SMEs did not state a preference for any particular aspect of potential value when deciding whether to develop a product. A majority of SMEs stated they would use a health economics tool. Research and development teams or marketing and sales departments would most likely use one. This study points to the need for further research into the education requirements of SMEs in the area of Health Technology Assessment (HTA) and also for investigation into how SMEs engage with existing HTA processes as required by assessors such as NICE.
Assessment of Dimensionality in Social Science Subtest
ERIC Educational Resources Information Center
Ozbek Bastug, Ozlem Yesim
2012-01-01
Most of the literature on dimensionality has focused either on comparing parametric and nonparametric dimensionality detection procedures or on showing the effectiveness of one type of procedure. No known study has shown how to perform a combined parametric and nonparametric dimensionality analysis on real data. The current study aims to fill…
Sengupta Chattopadhyay, Amrita; Hsiao, Ching-Lin; Chang, Chien Ching; Lian, Ie-Bin; Fann, Cathy S J
2014-01-01
Identifying susceptibility genes that influence complex diseases is extremely difficult because loci often influence the disease state through genetic interactions. Numerous approaches to detect disease-associated SNP-SNP interactions have been developed, but none consistently generates high-quality results under different disease scenarios. Using summarizing techniques to combine a number of existing methods may provide a solution to this problem. Here we used three popular non-parametric methods (Gini, absolute probability difference (APD), and entropy) to develop two novel summary scores, namely the principal component score (PCS) and the Z-sum score (ZSS), with which to predict disease-associated genetic interactions. We used a simulation study to compare the performance of the non-parametric scores, the summary scores, the scaled-sum score (SSS; used in polymorphism interaction analysis (PIA)), and multifactor dimensionality reduction (MDR). The non-parametric methods achieved high power, but no non-parametric method outperformed all others under a variety of epistatic scenarios. PCS and ZSS, however, outperformed MDR. PCS, ZSS and SSS displayed controlled type I errors (<0.05), in contrast to the Gini, APD and entropy scores (GS, APDS, ES; >0.05). A real data study using the Genetic Analysis Workshop 16 (GAW 16) rheumatoid arthritis dataset identified a number of interesting SNP-SNP interactions. © 2013 Elsevier B.V. All rights reserved.
Parametrically excited helicopter ground resonance dynamics with high blade asymmetries
NASA Astrophysics Data System (ADS)
Sanches, L.; Michon, G.; Berlioz, A.; Alazard, D.
2012-07-01
The present work is aimed at verifying the influence of high asymmetries in the variation of the in-plane lead-lag stiffness of one blade on the ground resonance phenomenon in helicopters. The periodic equations of motion are analyzed using Floquet's Theory (FM) and the boundaries of the instabilities are predicted. The stability chart obtained as a function of the asymmetry parameters and rotor speed reveals a complex evolution of critical zones and the existence of bifurcation points at low rotor speed values. Additionally, it is known that, when treated as parametric excitations, periodic terms may cause parametric resonances in dynamic systems, some of which can become unstable. Therefore, the helicopter is later considered as a parametrically excited system and the equations are treated analytically by applying the Method of Multiple Scales (MMS). A stability analysis is used to verify the existence of unstable parametric resonances with first- and second-order sets of equations. The results are compared and validated with those obtained by Floquet's Theory. Moreover, an explanation is given for the presence of unstable motion at low rotor speeds due to parametric instabilities of the second order.
Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E
2013-06-01
Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. © 2013, The International Biometric Society.
Model selection criterion in survival analysis
NASA Astrophysics Data System (ADS)
Karabey, Uǧur; Tutkun, Nihal Ata
2017-07-01
Survival analysis deals with the time until occurrence of an event of interest, such as death, recurrence of an illness, the failure of a piece of equipment, or divorce. There are various survival models with semi-parametric or parametric approaches used in the medical, natural, and social sciences. The decision on the most appropriate model for the data is an important point of the analysis. In the literature, the Akaike information criterion or the Bayesian information criterion is used to select among nested models. In this study, the behavior of these information criteria is discussed for a real data set.
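A minimal sketch of this kind of criterion-based comparison is shown below for uncensored toy survival times, fitting exponential and Weibull models by maximum likelihood with SciPy and computing AIC and BIC by hand; real survival data would also require handling censoring.

```python
# A minimal sketch of information-criterion-based model selection for survival
# times, assuming (for simplicity) uncensored observations. Exponential and
# Weibull models are fit by maximum likelihood and compared via AIC and BIC.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
t = 10.0 * rng.weibull(1.5, size=200)            # toy survival times (scale 10, shape 1.5)

# Exponential model: single rate parameter; MLE of the scale is the sample mean.
scale_exp = t.mean()
ll_exp = stats.expon.logpdf(t, scale=scale_exp).sum()

# Weibull model: shape and scale (location fixed at 0).
c_hat, _, scale_hat = stats.weibull_min.fit(t, floc=0)
ll_wei = stats.weibull_min.logpdf(t, c_hat, scale=scale_hat).sum()

n = len(t)
for name, ll, k in [("exponential", ll_exp, 1), ("weibull", ll_wei, 2)]:
    aic = 2 * k - 2 * ll
    bic = k * np.log(n) - 2 * ll
    print(f"{name:12s}  logL={ll:8.2f}  AIC={aic:8.2f}  BIC={bic:8.2f}")
```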
Likert scales, levels of measurement and the "laws" of statistics.
Norman, Geoff
2010-12-01
Reviewers of research reports frequently criticize the choice of statistical methods. While some of these criticisms are well-founded, frequently the use of various parametric methods such as analysis of variance, regression, and correlation is faulted because: (a) the sample size is too small, (b) the data may not be normally distributed, or (c) the data are from Likert scales, which are ordinal, so parametric statistics cannot be used. In this paper, I dissect these arguments and show that many studies, dating back to the 1930s, consistently show that parametric statistics are robust with respect to violations of these assumptions. Hence, challenges like those above are unfounded, and parametric methods can be utilized without concern for "getting the wrong answer".
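A small simulation in the spirit of this argument is sketched below: two groups of 5-point Likert responses are drawn from the same distribution, and the empirical type I error rates of a t-test and a Mann-Whitney U test are compared; the sample size and response distribution are arbitrary choices.

```python
# Draw two groups of 5-point Likert responses from the same discrete
# distribution (so the null hypothesis is true) and compare how often a t-test
# versus a Mann-Whitney U test rejects at alpha = 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
probs = [0.1, 0.2, 0.4, 0.2, 0.1]          # ordinal response distribution on 1..5
n_per_group, n_sims, alpha = 15, 5000, 0.05

rej_t = rej_u = 0
for _ in range(n_sims):
    a = rng.choice([1, 2, 3, 4, 5], size=n_per_group, p=probs)
    b = rng.choice([1, 2, 3, 4, 5], size=n_per_group, p=probs)
    rej_t += stats.ttest_ind(a, b).pvalue < alpha
    rej_u += stats.mannwhitneyu(a, b).pvalue < alpha

print(f"t-test type I error:        {rej_t / n_sims:.3f}")
print(f"Mann-Whitney type I error:  {rej_u / n_sims:.3f}")
```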
NASA Astrophysics Data System (ADS)
Perez Altimar, Roderick
Brittleness is a key characteristic for effective reservoir stimulation and is mainly controlled by mineralogy in unconventional reservoirs. Unfortunately, there is no universally accepted means of predicting brittleness from measures made in wells or from surface seismic data. Brittleness indices (BI) are based on mineralogy, while brittleness average estimations are based on Young's modulus and Poisson's ratio. I evaluate two of the more popular brittleness estimation techniques and apply them to a Barnett Shale seismic survey in order to estimate its geomechanical properties. Using specialized logging tools such as an elemental capture tool, density, and P- and S-wave sonic logs calibrated to previous core descriptions and laboratory measurements, I create a survey-specific BI template in Young's modulus versus Poisson's ratio or, alternatively, lambda-rho versus mu-rho space. I use this template to predict BI from elastic parameters computed from surface seismic data, providing a continuous BI estimate across the Barnett Shale survey. Extracting lambda-rho and mu-rho values at microseismic event locations, I compute the brittleness index from the template and find that most microseismic events occur in the more brittle part of the reservoir. My template is validated through a suite of microseismic experiments that shows most events occurring in brittle zones, fewer events in the ductile shale, and fewer events still in the limestone fracture barriers. Estimated ultimate recovery (EUR) is an estimate of the expected total production of oil and/or gas over the economic life of a well and is widely used in the evaluation of resource play reserves. In the literature it is possible to find several approaches for forecasting purposes and economic analyses. However, the extension to newer infill wells is somewhat challenging because production forecasts in unconventional reservoirs are a function of both completion effectiveness and reservoir quality. For shale gas reservoirs, completion effectiveness is a function not only of the length of the horizontal wells, but also of the number and size of the hydraulic fracture treatments in a multistage completion. These considerations also include the volume of proppant placed, proppant concentration, total perforation length, and number of clusters, while reservoir quality is dependent on properties such as the spatial variations in permeability, porosity, stress, and mechanical properties. I evaluate parametric methods such as multi-linear regression and compare them to non-parametric alternating conditional expectations (ACE) to better correlate production with engineering attributes for two datasets in the Haynesville Shale play and the Barnett Shale. I find that the parametric methods are useful for an exploratory analysis of the relationship among several variables and for guiding the selection of a more sophisticated parametric functional form when the underlying functional relationship is unknown. Non-parametric regression, on the other hand, is entirely data-driven and does not rely on a pre-specified functional form. The transformations generated by the ACE algorithm facilitate the identification of appropriate, and possibly meaningful, functional forms.
Application of artificial neural network to fMRI regression analysis.
Misaki, Masaya; Miyauchi, Satoru
2006-01-15
We used an artificial neural network (ANN) to detect correlations between event sequences and fMRI (functional magnetic resonance imaging) signals. The layered feed-forward neural network, given a series of events as inputs and the fMRI signal as a supervised signal, performed a non-linear regression analysis. This type of ANN is capable of approximating any continuous function, and thus this analysis method can detect any fMRI signals that correlated with corresponding events. Because of the flexible nature of ANNs, fitting to autocorrelation noise is a problem in fMRI analyses. We avoided this problem by using cross-validation and an early stopping procedure. The results showed that the ANN could detect various responses with different time courses. The simulation analysis also indicated an additional advantage of ANN over non-parametric methods in detecting parametrically modulated responses, i.e., it can detect various types of parametric modulations without a priori assumptions. The ANN regression analysis is therefore beneficial for exploratory fMRI analyses in detecting continuous changes in responses modulated by changes in input values.
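As a rough stand-in for the approach described (not the authors' implementation), the sketch below uses scikit-learn's MLPRegressor with early stopping to regress a synthetic voxel time course on event regressors; the design and response shapes are invented.

```python
# Event regressors are the inputs, the voxel time course is the target, and
# early stopping (an internal validation split) guards against fitting noise.
# The synthetic design and hemodynamic-like response here are purely illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n_scans = 400
events = rng.binomial(1, 0.1, size=(n_scans, 2)).astype(float)   # two event types
# Toy "response": nonlinear, saturating function of a lagged event regressor.
lagged = np.roll(events[:, 0], 3)
signal = np.tanh(2.0 * lagged) + 0.3 * events[:, 1] + 0.2 * rng.standard_normal(n_scans)

net = MLPRegressor(hidden_layer_sizes=(8,), early_stopping=True,
                   validation_fraction=0.2, max_iter=2000, random_state=0)
net.fit(events, signal)
print("held-out validation score:", net.best_validation_score_)
```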
Evaluation of grid generation technologies from an applied perspective
NASA Technical Reports Server (NTRS)
Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.
1995-01-01
An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation and use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turn around time for specific grid generation and CFD projects. The conclusion was made that a single grid generation methodology is not universally suited for all CFD applications due to both limitations in grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process to the various grid generation methodologies including structured, unstructured, and hybrid procedures. The full integration of the geometric modeling and grid generation allows implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.
Uncertainty importance analysis using parametric moment ratio functions.
Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen
2014-02-01
This article presents a new importance analysis framework, called parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of inputs are changed, and the emphasis is put on the mean and variance ratio functions with respect to the variances of model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction on the model output mean and variance by operating on the variances of model inputs. The unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a set of samples is needed for implementing the proposed importance analysis by the proposed estimators, thus the computational cost is free of input dimensionality. An analytical test example with highly nonlinear behavior is introduced for illustrating the engineering significance of the proposed importance analysis technique and verifying the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure for achieving a targeted 50% reduction of the model output variance. © 2013 Society for Risk Analysis.
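A brute-force illustration of the underlying concept (not the paper's single-sample unbiased estimators) is sketched below: the variance of one input of a toy model is scaled down and the resulting output variance ratio is recorded by plain Monte Carlo.

```python
# Brute-force illustration of a variance ratio function: shrink the variance of
# one input of a toy model and record the reduction in output variance.
import numpy as np

def model(x1, x2, x3):
    # Hypothetical nonlinear test model.
    return x1**2 + 0.5 * np.sin(3 * x2) + 0.2 * x1 * x3

rng = np.random.default_rng(11)
n = 200_000
base_var = 1.0
v_out_ref = np.var(model(*rng.normal(0, np.sqrt(base_var), (3, n))))

for ratio in (1.0, 0.5, 0.25):
    # Reduce the variance of input x1 only; keep the others at the baseline.
    x1 = rng.normal(0, np.sqrt(ratio * base_var), n)
    x2, x3 = rng.normal(0, np.sqrt(base_var), (2, n))
    v_out = np.var(model(x1, x2, x3))
    print(f"var(x1) scaled by {ratio:>4}: output variance ratio {v_out / v_out_ref:.3f}")
```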
Power flow analysis of two coupled plates with arbitrary characteristics
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1988-01-01
The limitation of keeping the two plates identical is removed, and the vibrational power input and output are evaluated for different area ratios, plate thickness ratios, and different values of the structural damping loss factor for the source plate (the plate with excitation) and the receiver plate. In performing this parametric analysis, the source plate characteristics are kept constant. The purpose of this parametric analysis is to determine the most critical parameters that influence the flow of vibrational power from the source plate to the receiver plate. In the case of the structural damping parametric analysis, the influence of changes in the source plate damping is also investigated. As was done previously, results obtained from the mobility power flow approach will be compared to results obtained using a statistical energy analysis (SEA) approach. The significance of the power flow results is discussed, together with a comparison between the SEA results and the mobility power flow results. Furthermore, the benefits that can be derived from using the mobility power flow approach are also examined.
Ilan, Ezgi; Sandström, Mattias; Velikyan, Irina; Sundin, Anders; Eriksson, Barbro; Lubberink, Mark
2017-05-01
68Ga-DOTATOC and 68Ga-DOTATATE are radiolabeled somatostatin analogs used for the diagnosis of somatostatin receptor-expressing neuroendocrine tumors (NETs), and SUV measurements are suggested for treatment monitoring. However, changes in the net influx rate (Ki) may better reflect treatment effects than those of the SUV, and accordingly there is a need to compute parametric images showing Ki at the voxel level. The aim of this study was to evaluate parametric methods for computation of parametric Ki images by comparison to volume of interest (VOI)-based methods and to assess image contrast in terms of tumor-to-liver ratio. Methods: Ten patients with metastatic NETs underwent a 45-min dynamic PET examination followed by whole-body PET/CT at 1 h after injection of 68Ga-DOTATOC and 68Ga-DOTATATE on consecutive days. Parametric Ki images were computed using a basis function method (BFM) implementation of the 2-tissue-irreversible-compartment model and the Patlak method using a descending aorta image-derived input function, and mean tumor Ki values were determined for 50% isocontour VOIs and compared with Ki values based on nonlinear regression (NLR) of the whole-VOI time-activity curve. A subsample of healthy liver was delineated in the whole-body and Ki images, and tumor-to-liver ratios were calculated to evaluate image contrast. Correlation (R2) and agreement between VOI-based and parametric Ki values were assessed using regression and Bland-Altman analysis. Results: The R2 between NLR-based and parametric image-based (BFM) tumor Ki values was 0.98 (slope, 0.81) and 0.97 (slope, 0.88) for 68Ga-DOTATOC and 68Ga-DOTATATE, respectively. For Patlak analysis, the R2 between NLR-based and parametric-based (Patlak) tumor Ki was 0.95 (slope, 0.71) and 0.92 (slope, 0.74) for 68Ga-DOTATOC and 68Ga-DOTATATE, respectively. There was no bias between NLR- and parametric-based Ki values. Tumor-to-liver contrast was 1.6 and 2.0 times higher in the parametric BFM Ki images and 2.3 and 3.0 times higher in the Patlak images than in the whole-body images for 68Ga-DOTATOC and 68Ga-DOTATATE, respectively. Conclusion: A high R2 and agreement between NLR- and parametric-based Ki values was found, showing that Ki images are quantitatively accurate. In addition, tumor-to-liver contrast was superior in the parametric Ki images compared with whole-body images for both 68Ga-DOTATOC and 68Ga-DOTATATE. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
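For reference, a minimal sketch of the Patlak estimation of Ki from a tissue time-activity curve and an image-derived input function is shown below; the array names, frame times, and the start of the linear phase are all illustrative assumptions.

```python
# Minimal sketch of Patlak analysis for Ki, assuming `tac` (tissue time-activity
# curve), `cp` (image-derived plasma input) and `t` (frame mid-times, minutes)
# are available; names and the linear-phase start time are illustrative.
import numpy as np

def patlak_ki(tac, cp, t, t_star=10.0):
    """Fit C_t/C_p = Ki * (int_0^t C_p dtau)/C_p + V0 on frames with t >= t_star."""
    int_cp = np.concatenate([[0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))])
    mask = (t >= t_star) & (cp > 0)
    x = int_cp[mask] / cp[mask]          # "Patlak time"
    y = tac[mask] / cp[mask]
    ki, v0 = np.polyfit(x, y, 1)         # slope = Ki, intercept = V0
    return ki, v0

# Toy data: exponential-ish input function and an irreversibly trapping tissue.
t = np.linspace(0.25, 45, 90)
cp = 100 * np.exp(-0.3 * t) + 5 * np.exp(-0.01 * t)
true_ki = 0.05
int_cp = np.concatenate([[0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))])
tac = true_ki * int_cp + 0.3 * cp
print(patlak_ki(tac, cp, t))   # slope should recover roughly true_ki
```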
NASA Astrophysics Data System (ADS)
Harré, Michael S.
2013-02-01
Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Creasy, John T
2015-05-12
This project has the objective to reduce and/or eliminate the use of HEU in commerce. Steps in the process include developing a target testing methodology that is bounding for all Mo-99 target irradiators, establishing a maximum target LEU-foil mass, developing a LEU-foil target qualification document, developing a bounding target failure analysis methodology (failure in reactor containment), optimizing safety vs. economics (goal is to manufacture a safe, but relatively inexpensive target to offset the inherent economic disadvantage of using LEU in place of HEU), and developing target material specifications and manufacturing QC test criteria. The slide presentation is organized under the following topics: Objective, Process Overview, Background, Team Structure, Key Achievements, Experiment and Activity Descriptions, and Conclusions. The High Density Target project has demonstrated: approx. 50 targets irradiated through domestic and international partners; proof of concept for two front end processing methods; fabrication of uranium foils for target manufacture; quality control procedures and steps for manufacture; multiple target assembly techniques; multiple target disassembly devices; welding of targets; thermal, hydraulic, and mechanical modeling; robust target assembly parametric studies; and target qualification analysis for insertion into very high flux environment. The High Density Target project has tested and proven several technologies that will benefit current and future Mo-99 producers.
Economic study of multipurpose advanced high-speed transport configurations
NASA Technical Reports Server (NTRS)
1979-01-01
A nondimensional economic examination of a parametrically derived set of supersonic transport aircraft was conducted. The measure of economic value was the surcharge relative to subsonic airplane tourist-class yield. Ten airplanes were defined according to size, payload, and speed. The price, range capability, fuel burned, and block time were determined for each configuration; then operating costs and surcharges were calculated. The parameter with the most noticeable influence on nominal surcharge was found to be a real (constant dollars) fuel price increase. A change in SST design Mach number from Mach 2.4 to Mach 2.7 showed a very small surcharge advantage (on the order of 1 percent for the faster aircraft). Configuration design compromises required for an airplane to operate overland at supersonic speeds without causing sonic boom annoyance result in severe performance penalties and require high (more than 100 percent) surcharges.
Zeng, Li-ping; Hu, Zheng-mao; Mu, Li-li; Mei, Gui-sen; Lu, Xiu-ling; Zheng, Yong-jun; Li, Pei-jian; Zhang, Ying-xue; Pan, Qian; Long, Zhi-gao; Dai, He-ping; Zhang, Zhuo-hua; Xia, Jia-hui; Zhao, Jing-ping; Xia, Kun
2011-06-01
The aim of this study was to investigate the relationship between susceptibility loci in chromosome regions 1q21-25 and 6p21-25 and schizophrenia subtypes in a Chinese population. A genomic scan and parametric and non-parametric analyses were performed on 242 individuals from 36 schizophrenia pedigrees, including 19 paranoid schizophrenia and 17 undifferentiated schizophrenia pedigrees, from Henan province of China, using 5 microsatellite markers in the chromosome region 1q21-25 and 8 microsatellite markers in the chromosome region 6p21-25, which were candidate regions identified in previous studies. All affected subjects were diagnosed and typed according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revised (DSM-IV-TR; American Psychiatric Association, 2000). All subjects signed informed consent. For chromosome 1, parametric analysis of all 36 pedigrees under the dominant inheritance mode showed a maximum multi-point heterogeneity log of odds (HLOD) score of 1.33 (α = 0.38). The non-parametric analysis and the single-point and multi-point nonparametric linkage (NPL) scores suggested linkage at D1S484, D1S2878, and D1S196. In the 19 paranoid schizophrenia pedigrees, linkage was not observed for any of the 5 markers. In the 17 undifferentiated schizophrenia pedigrees, the multi-point NPL score was 1.60 (P = 0.0367) at D1S484. The single-point NPL score was 1.95 (P = 0.0145) and the multi-point NPL score was 2.39 (P = 0.0041) at D1S2878. Additionally, the multi-point NPL score was 1.74 (P = 0.0255) at D1S196. These same three loci showed suggestive linkage in the integrative analysis of all 36 pedigrees. For chromosome 6, in the parametric linkage analyses under dominant and recessive inheritance and in the non-parametric linkage analyses of all 36 pedigrees and of the 17 undifferentiated schizophrenia pedigrees, linkage was not observed for any of the 8 markers. In the 19 paranoid schizophrenia pedigrees, parametric analysis showed that, under the recessive inheritance mode, the maximum single-point HLOD score was 1.26 (α = 0.40) and the multi-point HLOD score was 1.12 (α = 0.38) at D6S289 in chromosome region 6p23. In the nonparametric analysis, the single-point NPL score was 1.52 (P = 0.0402) and the multi-point NPL score was 1.92 (P = 0.0206) at D6S289. Susceptibility genes for undifferentiated schizophrenia are likely present near the D1S484, D1S2878, and D1S196 loci in chromosome regions 1q23.3 and 1q24.2, and a susceptibility gene for paranoid schizophrenia is likely present near the D6S289 locus in chromosome region 6p23.
Small-window parametric imaging based on information entropy for ultrasound tissue characterization
Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean
2017-01-01
Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging. PMID:28106118
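A simplified sketch of the small-window entropy computation is given below: Shannon entropy of the local envelope histogram is assigned to each pixel from a small sliding window; the window size, bin count, and simulated speckle pattern are illustrative only.

```python
# Shannon entropy of the envelope amplitude histogram is computed inside a
# small sliding window and assigned to the window's center pixel.
import numpy as np

def entropy_map(envelope, win=9, bins=32):
    half = win // 2
    padded = np.pad(envelope, half, mode="reflect")
    out = np.zeros_like(envelope, dtype=float)
    for i in range(envelope.shape[0]):
        for j in range(envelope.shape[1]):
            patch = padded[i:i + win, j:j + win]
            hist, _ = np.histogram(patch, bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]
            out[i, j] = -(p * np.log2(p)).sum()
    return out

rng = np.random.default_rng(5)
envelope = rng.rayleigh(scale=1.0, size=(128, 128))              # fully developed speckle
envelope[40:90, 40:90] = rng.rayleigh(scale=2.5, size=(50, 50))  # region with different scatterers
ent_img = entropy_map(envelope)
print(ent_img[64, 64], ent_img[5, 5])
```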
Cost Risk Analysis Based on Perception of the Engineering Process
NASA Technical Reports Server (NTRS)
Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.
1986-01-01
In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering process parameters are elicited from the engineer/expert on the project and are based on that expert's technical knowledge. These are converted by a parametric cost model into a cost estimate. The method discussed makes no assumptions about the distribution underlying the distribution of possible costs, and is not tied to the analysis of previous projects, except through the expert calibrations performed by the parametric cost analyst.
Lausch, Anthony; Yeung, Timothy Pok-Chi; Chen, Jeff; Law, Elton; Wang, Yong; Urbini, Benedetta; Donelli, Filippo; Manco, Luigi; Fainardi, Enrico; Lee, Ting-Yim; Wong, Eugene
2017-11-01
Parametric response map (PRM) analysis of functional imaging has been shown to be an effective tool for early prediction of cancer treatment outcomes and may also be well-suited toward guiding personalized adaptive radiotherapy (RT) strategies such as sub-volume boosting. However, the PRM method was primarily designed for analysis of longitudinally acquired pairs of single-parameter image data. The purpose of this study was to demonstrate the feasibility of a generalized parametric response map analysis framework, which enables analysis of multi-parametric data while maintaining the key advantages of the original PRM method. MRI-derived apparent diffusion coefficient (ADC) and relative cerebral blood volume (rCBV) maps acquired at 1 and 3-months post-RT for 19 patients with high-grade glioma were used to demonstrate the algorithm. Images were first co-registered and then standardized using normal tissue image intensity values. Tumor voxels were then plotted in a four-dimensional Cartesian space with coordinate values equal to a voxel's image intensity in each of the image volumes and an origin defined as the multi-parametric mean of normal tissue image intensity values. Voxel positions were orthogonally projected onto a line defined by the origin and a pre-determined response vector. The voxels are subsequently classified as positive, negative or nil, according to whether projected positions along the response vector exceeded a threshold distance from the origin. The response vector was selected by identifying the direction in which the standard deviation of tumor image intensity values was maximally different between responding and non-responding patients within a training dataset. Voxel classifications were visualized via familiar three-class response maps and then the fraction of tumor voxels associated with each of the classes was investigated for predictive utility analogous to the original PRM method. Independent PRM and MPRM analyses of the contrast-enhancing lesion (CEL) and a 1 cm shell of surrounding peri-tumoral tissue were performed. Prediction using tumor volume metrics was also investigated. Leave-one-out cross validation (LOOCV) was used in combination with permutation testing to assess preliminary predictive efficacy and estimate statistically robust P-values. The predictive endpoint was overall survival (OS) greater than or equal to the median OS of 18.2 months. Single-parameter PRM and multi-parametric response maps (MPRMs) were generated for each patient and used to predict OS via the LOOCV. Tumor volume metrics (P ≥ 0.071 ± 0.01) and single-parameter PRM analyses (P ≥ 0.170 ± 0.01) were not found to be predictive of OS within this study. MPRM analysis of the peri-tumoral region but not the CEL was found to be predictive of OS with a classification sensitivity, specificity and accuracy of 80%, 100%, and 89%, respectively (P = 0.001 ± 0.01). The feasibility of a generalized MPRM analysis framework was demonstrated with improved prediction of overall survival compared to the original single-parameter method when applied to a glioblastoma dataset. The proposed algorithm takes the spatial heterogeneity in multi-parametric response into consideration and enables visualization. MPRM analysis of peri-tumoral regions was shown to have predictive potential supporting further investigation of a larger glioblastoma dataset. © 2017 American Association of Physicists in Medicine.
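A schematic version of the voxel projection and three-class labeling described above is sketched below in NumPy; the response vector, threshold, and simulated voxel values are placeholders rather than the study's calibrated choices.

```python
# Each tumor voxel is a point in the standardized multi-parametric space, its
# position is projected onto a chosen response vector, and the signed
# projection distance from the origin is thresholded into three classes.
import numpy as np

def mprm_classify(voxels, response_vector, threshold):
    """voxels: (n_voxels, n_params) standardized intensities; returns -1/0/+1."""
    unit = response_vector / np.linalg.norm(response_vector)
    proj = voxels @ unit                     # signed distance along the response axis
    labels = np.zeros(len(voxels), dtype=int)
    labels[proj > threshold] = 1             # "positive" response class
    labels[proj < -threshold] = -1           # "negative" response class
    return labels

rng = np.random.default_rng(2)
voxels = rng.standard_normal((5000, 2))      # e.g., standardized (ADC, rCBV) pairs
labels = mprm_classify(voxels, response_vector=np.array([1.0, -1.0]), threshold=1.5)
fractions = [(labels == k).mean() for k in (-1, 0, 1)]
print(dict(zip(["negative", "nil", "positive"], fractions)))
```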
NASA Astrophysics Data System (ADS)
Lototzis, M.; Papadopoulos, G. K.; Droulia, F.; Tseliou, A.; Tsiros, I. X.
2018-04-01
There are several cases where a circular variable is associated with a linear one. A typical example is wind direction, which is often associated with linear quantities such as air temperature and air humidity. A statistical relationship of this kind can be tested with parametric and non-parametric methods, each of which has its own advantages and drawbacks. This work deals with correlation analysis using both the parametric and the non-parametric procedure on a small set of meteorological data of air temperature and wind direction during a summer period in a Mediterranean climate. Correlations were examined between hourly, daily and maximum-prevailing values, under typical and non-typical meteorological conditions. Both tests indicated a strong correlation between mean hourly wind direction and mean hourly air temperature, whereas mean daily wind direction and mean daily air temperature did not appear to be correlated. In some cases, however, the two procedures were found to give quite dissimilar levels of significance for the rejection or not of the null hypothesis of no correlation. The simple statistical analysis presented in this study, appropriately extended to large sets of meteorological data, may be a useful tool for estimating the effects of wind in local climate studies.
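As a concrete reference for this kind of analysis, the sketch below computes the standard parametric circular-linear association measure (as given, for example, by Mardia and Jupp) between wind direction and a linear variable, and pairs it with a simple permutation test as a non-parametric check; the simulated data and number of permutations are arbitrary.

```python
# Parametric circular-linear association between wind direction theta (radians)
# and a linear variable x (e.g., air temperature): the squared multiple
# correlation of x on (cos theta, sin theta). A permutation version of the same
# statistic gives a simple non-parametric significance check.
import numpy as np

def circular_linear_r2(theta, x):
    rxc = np.corrcoef(x, np.cos(theta))[0, 1]
    rxs = np.corrcoef(x, np.sin(theta))[0, 1]
    rcs = np.corrcoef(np.cos(theta), np.sin(theta))[0, 1]
    return (rxc**2 + rxs**2 - 2 * rxc * rxs * rcs) / (1 - rcs**2)

rng = np.random.default_rng(8)
theta = rng.uniform(0, 2 * np.pi, 500)                              # hourly wind directions
temp = 25 + 3 * np.cos(theta - np.pi / 4) + rng.normal(0, 1, 500)   # toy dependence

r2_obs = circular_linear_r2(theta, temp)
perm = np.array([circular_linear_r2(theta, rng.permutation(temp)) for _ in range(2000)])
p_value = (np.sum(perm >= r2_obs) + 1) / (len(perm) + 1)
print(f"R^2 = {r2_obs:.3f}, permutation p = {p_value:.4f}")
```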
A parametric ribcage geometry model accounting for variations among the adult population.
Wang, Yulong; Cao, Libo; Bai, Zhonghao; Reed, Matthew P; Rupp, Jonathan D; Hoff, Carrie N; Hu, Jingwen
2016-09-06
The objective of this study is to develop a parametric ribcage model that can account for morphological variations among the adult population. Ribcage geometries, including 12 pairs of ribs, the sternum, and the thoracic spine, were collected from CT scans of 101 adult subjects through image segmentation, landmark identification (1016 landmarks for each subject), symmetry adjustment, and template mesh mapping (26,180 elements for each subject). Generalized Procrustes analysis (GPA), principal component analysis (PCA), and regression analysis were used to develop a parametric ribcage model, which can predict nodal locations of the template mesh according to age, sex, height, and body mass index (BMI). Two regression models, a quadratic model for estimating the ribcage size and a linear model for estimating the ribcage shape, were developed. The results showed that the ribcage size was dominated by height (p=0.000) and the age-sex interaction (p=0.007), and that the ribcage shape was significantly affected by age (p=0.0005), sex (p=0.0002), height (p=0.0064) and BMI (p=0.0000). Along with proper assignment of cortical bone thickness, material properties, and failure properties, this parametric ribcage model can directly serve as the mesh of finite element ribcage models for quantifying the effects of human characteristics on thoracic injury risks. Copyright © 2016 Elsevier Ltd. All rights reserved.
Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G
2016-04-01
This work presents a non-parametric method based on principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline from a set of sampled basis vectors obtained by applying PCA to a previously composed learning matrix of continuous spectra. The parametric method, in contrast, uses an ANN to filter out the baseline; previous studies have demonstrated that this method is one of the most effective for baseline removal. The evaluation of both methods was carried out using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratios (SBR), signal-to-noise ratios (SNR), and baseline slopes. In addition, to demonstrate the utility of the proposed methods and to compare them in a real application, a spectral data set measured from a flame radiation process was used. Several performance metrics such as the correlation coefficient, chi-square value, and goodness-of-fit coefficient were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. © The Author(s) 2016.
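The sketch below illustrates one way the PCA-based idea can work, assuming a learning matrix of baseline-only spectra: the leading principal components span a baseline subspace, and the projection of a measured spectrum onto that subspace is subtracted; the synthetic baselines, peak shape, and number of components are assumptions rather than the paper's exact procedure.

```python
# Learn a low-dimensional baseline subspace from baseline-only training spectra,
# project a measured spectrum onto that subspace, and subtract the reconstruction.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
wavelength = np.linspace(300, 800, 500)

# Training matrix of continuous baselines (varying slopes and curvatures).
baselines = np.array([a * (wavelength / 800) + b * (wavelength / 800) ** 2
                      for a, b in rng.uniform(0.5, 2.0, (100, 2))])
pca = PCA(n_components=3).fit(baselines)

# Measured spectrum = baseline + narrow emission peak + noise.
peak = 1.5 * np.exp(-0.5 * ((wavelength - 589) / 3) ** 2)
measured = baselines[0] + peak + 0.02 * rng.standard_normal(wavelength.size)

baseline_est = pca.inverse_transform(pca.transform(measured[None, :]))[0]
corrected = measured - baseline_est          # baseline-removed spectrum
```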
Breast-Lesion Characterization using Textural Features of Quantitative Ultrasound Parametric Maps.
Sadeghi-Naini, Ali; Suraweera, Harini; Tran, William Tyler; Hadizad, Farnoosh; Bruni, Giancarlo; Rastegar, Rashin Fallah; Curpen, Belinda; Czarnota, Gregory J
2017-10-20
This study evaluated, for the first time, the efficacy of quantitative ultrasound (QUS) spectral parametric maps in conjunction with texture-analysis techniques to non-invasively differentiate benign from malignant breast lesions. Ultrasound B-mode images and radiofrequency data were acquired from 78 patients with suspicious breast lesions. QUS spectral-analysis techniques were performed on the radiofrequency data to generate parametric maps of mid-band fit, spectral slope, spectral intercept, spacing among scatterers, average scatterer diameter, and average acoustic concentration. Texture-analysis techniques were applied to determine imaging biomarkers consisting of the mean, contrast, correlation, energy and homogeneity features of the parametric maps. These biomarkers were utilized to classify benign versus malignant lesions with leave-one-patient-out cross-validation. Results were compared to histopathology findings from biopsy specimens and radiology reports on MR images to evaluate the accuracy of the technique. Among the biomarkers investigated, one mean-value parameter and 14 textural features demonstrated statistically significant differences (p < 0.05) between the two lesion types. A hybrid biomarker developed using a stepwise feature selection method could classify the lesions with a sensitivity of 96%, a specificity of 84%, and an AUC of 0.97. Findings from this study pave the way towards adapting novel QUS-based frameworks for breast cancer screening and rapid diagnosis in the clinic.
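A minimal sketch of extracting such texture features from a parametric map is shown below using scikit-image's gray-level co-occurrence matrix utilities (graycomatrix/graycoprops in recent versions); the simulated map and the quantization to 32 gray levels are illustrative.

```python
# Extract GLCM texture features (contrast, correlation, energy, homogeneity)
# from a quantized parametric map; the study's biomarkers also include the
# plain mean of each map.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(6)
param_map = rng.normal(0, 1, (64, 64))            # stand-in for, e.g., a mid-band-fit map

levels = 32
cuts = np.quantile(param_map, np.linspace(0, 1, levels + 1)[1:-1])
quantized = np.digitize(param_map, cuts).astype(np.uint8)   # values in 0..levels-1
glcm = graycomatrix(quantized, distances=[1], angles=[0, np.pi / 2],
                    levels=levels, symmetric=True, normed=True)

features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "correlation", "energy", "homogeneity")}
features["mean"] = param_map.mean()
print(features)
```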
Hydrogen turbine power conversion system assessment
NASA Technical Reports Server (NTRS)
Wright, D. E.; Lucci, A. D.; Campbell, J.; Lee, J. C.
1978-01-01
A three part technical study was conducted whereby parametric technical and economic feasibility data were developed on several power conversion systems suitable for the generation of central station electric power through the combustion of hydrogen and the use of the resulting heat energy in turbogenerator equipment. The study assessed potential applications of hydrogen-fueled power conversion systems and identified the three most promising candidates: (1) Ericsson Cycle, (2) gas turbine, and (3) direct steam injection system for fossil fuel as well as nuclear powerplants. A technical and economic evaluation was performed on the three systems from which the direct injection system (fossil fuel only) was selected for a preliminary conceptual design of an integrated hydrogen-fired power conversion system.
NASA Astrophysics Data System (ADS)
Cisneros, Anselmo Tomas, Jr.
The Fluoride salt cooled High temperature Reactor (FHR) is a class of advanced nuclear reactors that combine the robust coated particle fuel form from high temperature gas cooled reactors, the direct reactor auxiliary cooling system (DRACS) passive decay heat removal of liquid metal fast reactors, and the transparent, high volumetric heat capacity liquid fluoride salt working fluid---flibe (33%7Li2F-67%BeF)---from molten salt reactors. This combination of fuel and coolant enables FHRs to operate in a high-temperature, low-pressure design space that has beneficial safety and economic implications. In 2012, UC Berkeley was charged with developing a pre-conceptual design of a commercial prototype FHR---the Pebble Bed Fluoride Salt Cooled High Temperature Reactor (PB-FHR)---as part of the Nuclear Energy University Programs' (NEUP) integrated research project. The Mark 1 design of the PB-FHR (Mk1 PB-FHR) is a 236 MWt flibe-cooled pebble bed nuclear heat source that drives an open-air Brayton combined-cycle power conversion system. The PB-FHR's pebble bed consists of a 19.8% enriched uranium fuel core surrounded by an inert graphite pebble reflector that shields the outer solid graphite reflector, core barrel and reactor vessel. The fuel reaches an average burnup of 178000 MWt-d/MT. The Mk1 PB-FHR exhibits strong negative temperature reactivity feedback from the fuel, graphite moderator and the flibe coolant, but small positive temperature reactivity feedbacks from the inner reflector and from the outer graphite pebble reflector. A novel neutronics and depletion methodology---the multiple burnup state methodology---was developed for an accurate and efficient search for the equilibrium composition of an arbitrary continuously refueled pebble bed reactor core. The Burnup Equilibrium Analysis Utility (BEAU) computer program was developed to implement this methodology. BEAU was successfully benchmarked against published results generated with the existing equilibrium depletion codes VSOP and PEBBED for a high temperature gas cooled pebble bed reactor. Three parametric studies were performed to explore the design space of the PB-FHR: to select a fuel design for the PB-FHR; to select a core configuration; and to optimize the PB-FHR design. These parametric studies investigated trends in the dependence of important reactor performance parameters, such as burnup, temperature reactivity feedback, and radiation damage, on the reactor design variables and attempted to understand the underlying reactor physics responsible for these trends. A pebble fuel parametric study determined that pebble fuel should be designed with a carbon to heavy metal ratio (C/HM) less than 400 to maintain negative coolant temperature reactivity coefficients. Seed and thorium blanket, seed and inert pebble reflector, and seed-only core configurations were investigated for annular FHR PBRs; the C/HM ratio of the blanket pebbles and the discharge burnup of the thorium blanket pebbles were additional design variables for core configurations with thorium blankets. Either a thorium blanket or a graphite pebble reflector is required to shield the outer graphite reflector enough to extend its service lifetime to 60 EFPY. The fuel fabrication costs and long cycle lengths of the thorium blanket fuel limit the potential economic advantages of using a thorium blanket. Therefore, the seed and pebble reflector core configuration was adopted as the baseline core configuration.
Multi-objective optimization with respect to economics was performed for the PB-FHR, accounting for safety and other physical design constraints derived from the high-level safety regulatory criteria. These physical constraints were applied in a design tool, the Nuclear Application Value Estimator, which evaluated the economic implications of design decisions with a simplified cash-flow economics model; the model used estimates of reactor performance parameters, calculated from correlations fitted to the parametric design study results for a specific PB-FHR design, together with a set of economic assumptions about the electricity market. The optimal PB-FHR design---the Mark 1 PB-FHR---is described along with a detailed summary of its performance characteristics, including the burnup, the burnup evolution, temperature reactivity coefficients, the power distribution, radiation damage distributions, control element worths, decay heat curves and tritium production rates. The Mk1 PB-FHR satisfies the PB-FHR safety criteria. The fuel, moderator (pebble core, pebble shell, graphite matrix, TRISO layers) and coolant have global negative temperature reactivity coefficients, and the fuel temperatures are well within their limits.
Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis
Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.
2006-01-01
In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
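The voxel-wise idea can be illustrated outside the toolbox itself. The sketch below (Python/NumPy, not the MATLAB BPM code; the data and variable names are simulated and hypothetical) fits a per-voxel GLM in which a second imaging modality enters as a regressor alongside a subject-level covariate and records the t-statistic of that imaging regressor.

```python
# Illustrative sketch of a voxel-wise multimodal GLM (not the BPM toolbox itself).
import numpy as np

rng = np.random.default_rng(0)
n_subj, n_vox = 24, 500                               # hypothetical sample and voxel counts

gm_density = rng.normal(0.6, 0.1, (n_subj, n_vox))    # modality 1: e.g. VBM grey-matter maps
age = rng.uniform(20, 60, n_subj)                     # subject-level covariate
fmri = 0.8 * gm_density + 0.01 * age[:, None] + rng.normal(0, 0.2, (n_subj, n_vox))

t_map = np.empty(n_vox)
for v in range(n_vox):
    X = np.column_stack([np.ones(n_subj), gm_density[:, v], age])  # per-voxel design matrix
    beta, res, *_ = np.linalg.lstsq(X, fmri[:, v], rcond=None)
    dof = n_subj - X.shape[1]
    sigma2 = res[0] / dof
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t_map[v] = beta[1] / np.sqrt(cov[1, 1])           # t-statistic for the imaging regressor

print("voxels with |t| > 3:", int((np.abs(t_map) > 3).sum()))
```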
Ruiz-Sanchez, Eduardo
2015-12-01
The Neotropical woody bamboo genus Otatea is one of five genera in the subtribe Guaduinae. Of the eight described Otatea species, seven are endemic to Mexico and one is also distributed in Central and South America. Otatea acuminata has the widest geographical distribution of the eight species, and two of its recently collected populations do not match the known species morphologically. Parametric and non-parametric methods were used to delimit the species in Otatea using five chloroplast markers, one nuclear marker, and morphological characters. The parametric coalescent method and the non-parametric analysis supported the recognition of two distinct evolutionary lineages. Molecular clock estimates were used to date divergence times in Otatea; the results place the origin of the speciation events between the Late Miocene and the Late Pleistocene. The species delimitation analyses (parametric and non-parametric) identified the two populations of O. acuminata from Chiapas and Hidalgo as two separate evolutionary lineages, and these new species have morphological characters that separate them from O. acuminata s.s. The geological activity of the Trans-Mexican Volcanic Belt and the Isthmus of Tehuantepec may have isolated populations and limited the gene flow between Otatea species, driving speciation. Based on the results found here, I describe Otatea rzedowskiorum and Otatea victoriae as two new species, morphologically different from O. acuminata. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Vittal, H.; Singh, Jitendra; Kumar, Pankaj; Karmakar, Subhankar
2015-06-01
In watershed management, flood frequency analysis (FFA) is performed to quantify the risk of flooding at different spatial locations and to provide guidelines for determining the design periods of flood control structures. Traditional FFA was extensively performed under a univariate scenario for both at-site and regional estimation of return periods. However, due to the inherent mutual dependence of the flood variables or characteristics [i.e., peak flow (P), flood volume (V) and flood duration (D), which are random in nature], the analysis has been further extended to the multivariate scenario, with some restrictive assumptions. To overcome the assumption of the same family of marginal density function for all flood variables, the concept of the copula has been introduced. Although the advancement from univariate to multivariate analyses drew considerable attention from the FFA research community, the basic limitation was that the analyses were performed with only parametric families of distributions. The aim of the current study is to emphasize the importance of nonparametric approaches in the field of multivariate FFA; however, a nonparametric distribution may not always be a good fit or capable of replacing well-implemented multivariate parametric and multivariate copula-based applications. Nevertheless, the potential of obtaining a best fit using nonparametric distributions might be improved because such distributions reproduce the sample's characteristics, resulting in more accurate estimations of the multivariate return period. Hence, the current study shows the importance of conjugating the multivariate nonparametric approach with multivariate parametric and copula-based approaches, thereby resulting in a comprehensive framework for complete at-site FFA. Although the proposed framework is designed for at-site FFA, this approach can also be applied to regional FFA because regional estimations ideally include at-site estimations. The framework is based on the following steps: (i) comprehensive trend analysis to assess nonstationarity in the observed data; (ii) selection of the best-fit univariate marginal distribution from a comprehensive set of parametric and nonparametric distributions for the flood variables; (iii) multivariate frequency analyses with parametric, copula-based and nonparametric approaches; and (iv) estimation of joint and various conditional return periods. The proposed framework for frequency analysis is demonstrated using 110 years of observed data from the Allegheny River at Salamanca, New York, USA. The results show that for both the univariate and multivariate cases, the nonparametric Gaussian kernel provides the best estimate. Further, we perform FFA for twenty major rivers over the continental USA, which shows that for seven rivers all the flood variables follow the nonparametric Gaussian kernel, whereas for the other rivers parametric distributions provide the best fit for either one or two flood variables. Thus the results show that the nonparametric method cannot substitute for the parametric and copula-based approaches, but should be considered during any at-site FFA to provide the broadest choice for best estimation of the flood return periods.
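Step (ii) of such a framework, choosing between parametric candidates and the nonparametric Gaussian kernel for a single flood variable, can be sketched as follows. The snippet uses synthetic peak flows rather than the Allegheny River record, and the candidate set and comparison criterion are illustrative assumptions.

```python
# Minimal sketch: compare parametric marginals with a Gaussian kernel for annual peak flow.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
peaks = stats.genextreme.rvs(c=-0.1, loc=800.0, scale=250.0, size=110, random_state=rng)

candidates = {"GEV": stats.genextreme, "Gumbel": stats.gumbel_r, "Lognormal": stats.lognorm}
for name, dist in candidates.items():
    params = dist.fit(peaks)
    ll = np.sum(dist.logpdf(peaks, *params))
    aic = 2 * len(params) - 2 * ll
    print(f"{name:10s} log-lik = {ll:8.1f}  AIC = {aic:8.1f}")

kde = stats.gaussian_kde(peaks)                 # nonparametric Gaussian kernel
ll_kde = np.sum(np.log(kde(peaks)))
print(f"{'Kernel':10s} log-lik = {ll_kde:8.1f}  (in-sample; cross-validation is a fairer comparison)")
```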
NASA Technical Reports Server (NTRS)
Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.; Brown, Richard W.
2000-01-01
This paper describes the development of parametric models for estimating operational reliability and maintainability (R&M) characteristics for reusable vehicle concepts, based on vehicle size and technology support level. An R&M analysis tool (RMAT) and response surface methods are utilized to build parametric approximation models for rapidly estimating operational R&M characteristics such as mission completion reliability. These models, which approximate RMAT, can then be utilized for fast analysis of operational requirements, for life-cycle cost estimating and for multidisciplinary design optimization.
A Lunar Surface Operations Simulator
NASA Technical Reports Server (NTRS)
Nayar, H.; Balaram, J.; Cameron, J.; Jain, A.; Lim, C.; Mukherjee, R.; Peters, S.; Pomerantz, M.; Reder, L.; Shakkottai, P.;
2008-01-01
The Lunar Surface Operations Simulator (LSOS) is being developed to support planning and design of space missions to return astronauts to the moon. Vehicles, habitats, dynamic and physical processes and related environment systems are modeled and simulated in LSOS to assist in the visualization and design optimization of systems for lunar surface operations. A parametric analysis tool and a data browser were also implemented to provide an intuitive interface to run multiple simulations and review their results. The simulator and parametric analysis capability are described in this paper.
Selecting a Separable Parametric Spatiotemporal Covariance Structure for Longitudinal Imaging Data
George, Brandon; Aban, Inmaculada
2014-01-01
Longitudinal imaging studies allow great insight into how the structure and function of a subject's internal anatomy change over time. Unfortunately, the analysis of longitudinal imaging data is complicated by inherent spatial and temporal correlation: the temporal from the repeated measures, and the spatial from the outcomes of interest being observed at multiple points in a patient's body. We propose the use of a linear model with a separable parametric spatiotemporal error structure for the analysis of repeated imaging data. The model makes use of spatial (exponential, spherical, and Matérn) and temporal (compound symmetric, autoregressive-1, Toeplitz, and unstructured) parametric correlation functions. A simulation study, inspired by a longitudinal cardiac imaging study on mitral regurgitation patients, compared different information criteria for selecting a particular separable parametric spatiotemporal correlation structure as well as the effects on Type I and II error rates for inference on fixed effects when the specified model is incorrect. Information criteria were found to be highly accurate at choosing between separable parametric spatiotemporal correlation structures. Misspecification of the covariance structure could inflate the Type I error rate or yield an overly conservative test size, which corresponded to decreased power. An example with clinical data is given illustrating how the covariance structure procedure can be done in practice, as well as how covariance structure choice can change inferences about fixed effects. PMID:25293361
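A minimal sketch of the separable structure, assuming an AR(1) temporal and an exponential spatial correlation with illustrative parameter values: the full spatiotemporal covariance is their Kronecker product scaled by a common variance.

```python
# Hedged sketch of a separable spatiotemporal covariance (illustrative parameters and coordinates).
import numpy as np

def ar1_corr(n_times, rho):
    """Temporal AR(1) correlation matrix: rho**|i-j|."""
    idx = np.arange(n_times)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def exponential_corr(coords, range_):
    """Spatial exponential correlation: exp(-d / range)."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return np.exp(-d / range_)

coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [2.0, 1.0]])  # 4 spatial locations
T = ar1_corr(n_times=5, rho=0.7)
S = exponential_corr(coords, range_=1.5)
sigma2 = 2.0

cov = sigma2 * np.kron(T, S)          # separable (time x space) covariance, 20 x 20
print(cov.shape, "symmetric:", np.allclose(cov, cov.T))
```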
Procedural Semantics as a Theory of Meaning.
1981-03-01
Excerpt from the report's front matter and table of contents: acknowledgments to Aaron Sloman and John Lyons; chapter headings include Meanings, Parametric Ambiguity, The Economic Necessity of Ambiguity, Semantic Interpretation, and Semantics of the Internal Language. A surviving fragment of the text notes that for sufficiently low-order organisms, the behavioral characteristics of the organism in response to stimuli are essentially "wired in" by its genes.
Cost Estimation of Naval Ship Acquisition.
1983-12-01
Two cost models were developed: a nine-subsystem model and a single total-cost model, both built using the linear least-squares regression technique. Cited references in the excerpt include a text on linear statistical models (McGraw-Hill, 1961) and Helmer, F. T., Bibliography on Pricing Methodology and Cost Estimating, Dept. of Economics. Keywords: cost estimation; acquisition; parametric cost estimate; linear ...
Incorporating parametric uncertainty into population viability analysis models
McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.
2011-01-01
Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
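A minimal sketch of the two-step simulation idea, with assumed (not piping plover) vital-rate values: the outer loop draws a mean growth rate to represent parametric uncertainty, and the inner loop adds year-to-year temporal variance.

```python
# Illustrative two-loop population projection under assumed demographic values.
import numpy as np

rng = np.random.default_rng(42)
n_reps, n_years, n0 = 1000, 50, 200
quasi_ext_threshold = 20

extinct = 0
finals = np.empty(n_reps)
for rep in range(n_reps):
    # Outer (replication) loop: one draw of the mean growth rate = parametric uncertainty.
    mean_log_lambda = rng.normal(loc=-0.01, scale=0.03)
    n = float(n0)
    for _ in range(n_years):
        # Inner (time-step) loop: temporal variance around that mean = environmental stochasticity.
        log_lambda = rng.normal(loc=mean_log_lambda, scale=0.10)
        n *= np.exp(log_lambda)
    finals[rep] = n
    extinct += n < quasi_ext_threshold

print(f"median final N = {np.median(finals):.0f}, P(quasi-extinction) = {extinct / n_reps:.2f}")
```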
NASA Astrophysics Data System (ADS)
Yu, Miao; Huang, Deqing; Yang, Wanqiu
2018-06-01
In this paper, we address the problem of unknown periodicity for a class of discrete-time nonlinear parametric systems without assuming any growth conditions on the nonlinearities. The unknown periodicity hides in the parametric uncertainties, which makes it difficult to estimate with existing techniques. By incorporating a logic-based switching mechanism, we identify the period and the bound of the unknown parameter simultaneously. A Lyapunov-based analysis demonstrates that a finite number of switchings can guarantee asymptotic tracking for the nonlinear parametric systems. Simulation results also show the efficacy of the proposed switching periodic adaptive control approach.
Parametric Sensitivity Analysis of Oscillatory Delay Systems with an Application to Gene Regulation.
Ingalls, Brian; Mincheva, Maya; Roussel, Marc R
2017-07-01
A parametric sensitivity analysis for periodic solutions of delay-differential equations is developed. Because phase shifts cause the sensitivity coefficients of a periodic orbit to diverge, we focus on sensitivities of the extrema, from which amplitude sensitivities are computed, and of the period. Delay-differential equations are often used to model gene expression networks. In these models, the parametric sensitivities of a particular genotype define the local geometry of the evolutionary landscape. Thus, sensitivities can be used to investigate directions of gradual evolutionary change. An oscillatory protein synthesis model whose properties are modulated by RNA interference is used as an example. This model consists of a set of coupled delay-differential equations involving three delays. Sensitivity analyses are carried out at several operating points. Comments on the evolutionary implications of the results are offered.
NASA Astrophysics Data System (ADS)
Pan, Wenyong; Innanen, Kristopher A.; Geng, Yu
2018-06-01
Seismic full-waveform inversion (FWI) methods hold strong potential to recover multiple subsurface elastic properties for hydrocarbon reservoir characterization. Simultaneously updating multiple physical parameters introduces the problem of interparameter trade-off, arising from the simultaneous variations of different physical parameters, which increases the nonlinearity and uncertainty of multiparameter FWI. The coupling effects of different physical parameters are significantly influenced by model parametrization and acquisition arrangement. An appropriate choice of model parametrization is important for successful field data applications of multiparameter FWI. The objective of this paper is to examine the performance of various model parametrizations in isotropic-elastic FWI with walk-away vertical seismic profile (W-VSP) data for unconventional heavy oil reservoir characterization. Six model parametrizations are considered: velocity-density (α, β and ρ'), modulus-density (κ, μ and ρ), Lamé-density (λ, μ' and ρ'''), impedance-density (I_P, I_S and ρ''), velocity-impedance-I (α', β' and I_P') and velocity-impedance-II (α'', β'' and I_S'). We begin analysing the interparameter trade-off by making use of scattering radiation patterns, which is a common strategy for qualitative parameter resolution analysis. We discuss the advantages and limitations of the scattering radiation patterns and recommend that interparameter trade-offs be evaluated using interparameter contamination kernels, which provide quantitative, second-order measurements of the interparameter contaminations and can be constructed efficiently with an adjoint-state approach. Synthetic W-VSP isotropic-elastic FWI experiments in the time domain verify our conclusions about interparameter trade-offs for the various model parametrizations. Density profiles are most strongly influenced by the interparameter contaminations; depending on model parametrization, the inverted density profile can be overestimated, underestimated or spatially distorted. Among the six cases, only the velocity-density parametrization provides stable and informative density features not included in the starting model. Field data applications of multicomponent W-VSP isotropic-elastic FWI in the time domain were also carried out. The heavy oil reservoir target zone, characterized by low α-to-β ratios and low Poisson's ratios, can be identified clearly with the inverted isotropic-elastic parameters.
Wang, Monan; Zhang, Kai; Yang, Ning
2018-04-09
To help doctors choose a treatment on the basis of mechanical analysis, this work built a computer-assisted optimization system for the treatment of femoral neck fracture oriented to clinical application. The system encompassed three parts: a preprocessing module, a finite element mechanical analysis module and a post-processing module. The preprocessing module included parametric modeling of the bone, parametric modeling of the fracture face, parametric modeling of the fixation screws and their positions, and input and transmission of model parameters. The finite element mechanical analysis module included mesh generation, element type setting, material property setting, contact setting, constraint and load setting, analysis method setting and batch processing. The post-processing module included extraction and display of batch processing results, image generation from batch processing, operation of the optimization program and display of the optimal result. The system implemented the whole workflow from input of fracture parameters to output of the optimal fixation plan according to a specific patient's real fracture parameters and the optimization rules, which demonstrated the effectiveness of the system. The system also had a friendly interface and simple operation, and its functionality can be extended quickly by modifying individual modules.
2010-02-01
Several important issues associated with the proposed parametric model are discussed, including model order selection, training screening, and time ... parameters associated with the NS-AR model. In addition, we develop model order selection, training screening, and time-series-based whitening and ...
USDA-ARS?s Scientific Manuscript database
This study reports the use of crude glycerine from biodiesel production in the glycerolysis process and presents the associated parametric and energy analyses. The potential of glycerolysis as an alternative pretreatment method for high free fatty acid (FFA) containing fats, oils and greases (FOGs) ...
1980-06-01
... problems, a parametric model was built which uses the TI-59 programmable calculator as its vehicle. Although the calculator has many disadvantages for ... previous experience using the TI-59 programmable calculator. For example, explicit instructions for reading cards into the memory set will not be given.
Research on the Applicable Method of Valuation of Pure Electric Used Vehicles
NASA Astrophysics Data System (ADS)
Cai, yun; Tan, zhengping; Wang, yidong; Mao, pan
2018-03-01
With the rapid growth in the ownership of pure electric vehicles, research on the valuation of used electric vehicles has become key to the development of the pure electric used vehicle market. The paper analyzed the application of three valuation methods (the current market price method, the capitalized earnings method and the replacement cost method) to pure electric used vehicles and concluded that the replacement cost method is more suitable for pure electric used cars. The article also explored parametric corrections to the replacement cost method, aimed at the characteristics of pure electric vehicles and the constituent factors of replacement cost. Through analysis of the applicable parameters for physical devaluation, functional devaluation and economic devaluation, the revised replacement cost method can be used for the valuation of pure electric used vehicles for private use.
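The replacement-cost logic can be written down directly; the factors and numbers below are hypothetical placeholders, not values from the paper.

```python
# Sketch of the replacement-cost method:
# value = replacement cost new x physical x functional x economic devaluation factors.
def used_ev_value(replacement_cost_new: float,
                  physical_factor: float,
                  functional_factor: float,
                  economic_factor: float) -> float:
    """All factors are fractions of remaining value (1.0 = no devaluation)."""
    return replacement_cost_new * physical_factor * functional_factor * economic_factor

# Example with hypothetical figures: battery ageing dominates the physical factor for an EV.
value = used_ev_value(replacement_cost_new=30000.0,
                      physical_factor=0.70,    # e.g. battery capacity retention and mileage
                      functional_factor=0.90,  # e.g. outdated charging/driver-assist features
                      economic_factor=0.95)    # e.g. subsidy and electricity-price changes
print(f"estimated value: {value:,.0f}")
```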
Modeling Integrated Water-User Decisions with Intermittent Supplies
NASA Astrophysics Data System (ADS)
Lund, J. R.; Rosenberg, D.
2006-12-01
We present an economic-engineering method to estimate urban water use demands with intermittent water supplies. A two-stage, probabilistic optimization formulation includes a wide variety of water supply enhancement and conservation actions that individual households can adopt to meet multiple water quality uses with uncertain water availability. We embed the optimization in Monte-Carlo simulations to show aggregate effects at a utility (citywide) scale for a population of user conditions and decisions. Parametric analysis provides derivations of supply curves to subsidize conservation, demand responses to alternative pricing, and customer willingness-to-pay to avoid shortages. Results show a good empirical fit for the average and distribution of billed residential water use in Amman, Jordan. Additional outputs give likely market penetration rates for household conservation actions, associated water savings, and subsidies required to entice further adoption. We discuss new insights to size, target, market, and finance conservation programs and interpret a demand curve with block pricing.
NASA Technical Reports Server (NTRS)
Haj-Ali, Rami; Aboudi, Jacob
2012-01-01
The recent two-dimensional (2-D) parametric formulation of the high fidelity generalized method of cells (HFGMC) reported by the authors is generalized for the micromechanical analysis of three-dimensional (3-D) multiphase composites with periodic microstructure. Arbitrary hexahedral subcell geometry is developed to discretize a triply periodic repeating unit-cell (RUC). Linear parametric-geometric mapping is employed to transform the arbitrary hexahedral subcell shapes from the physical space to an auxiliary orthogonal shape, where a complete quadratic displacement expansion is performed. Previously, in the 2-D case, three additional equations were needed in the form of average moments of equilibrium as a result of the inclusion of the bilinear terms. However, the present 3-D parametric HFGMC formulation eliminates the need for such additional equations. This is achieved by expressing the coefficients of the full quadratic polynomial expansion of the subcell in terms of the side or face average-displacement vectors. The 2-D parametric and orthogonal HFGMC are special cases of the present 3-D formulation. The continuity of displacements and tractions, as well as the equilibrium equations, are imposed in the average (integral) sense as in the original HFGMC formulation. Each of the six sides (faces) of a subcell has an independent average displacement micro-variable vector which forms an energy-conjugate pair with the transformed average-traction vector. This allows generating symmetric stiffness matrices along with internal resisting vectors for the subcells, which enhances computational efficiency. The established new parametric 3-D HFGMC equations are formulated and solution implementations are addressed. Several applications for triply periodic 3-D composites are presented to demonstrate the general capability and versatility of the present parametric HFGMC method for refined micromechanical analysis by generating the spatial distributions of local stress fields. These applications include triply periodic composites with inclusions in the form of a cavity, a spherical inclusion, an ellipsoidal inclusion, and discontinuous aligned short fibers. A 3-D repeating unit-cell for a foam material composite is also simulated.
Analysing child mortality in Nigeria with geoadditive discrete-time survival models.
Adebayo, Samson B; Fahrmeir, Ludwig
2005-03-15
Child mortality reflects a country's level of socio-economic development and quality of life. In developing countries, mortality rates are not only influenced by socio-economic, demographic and health variables but they also vary considerably across regions and districts. In this paper, we analysed child mortality in Nigeria with flexible geoadditive discrete-time survival models. This class of models allows us to measure small-area district-specific spatial effects simultaneously with possibly non-linear or time-varying effects of other factors. Inference is fully Bayesian and uses computationally efficient Markov chain Monte Carlo (MCMC) simulation techniques. The application is based on the 1999 Nigeria Demographic and Health Survey. Our method assesses effects at a high level of temporal and spatial resolution not available with traditional parametric models, and the results provide some evidence on how to reduce child mortality by improving socio-economic and public health conditions. Copyright (c) 2004 John Wiley & Sons, Ltd.
Modeling personnel turnover in the parametric organization
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1991-01-01
A model is developed for simulating the dynamics of a newly formed organization, credible during all phases of organizational development. The model development process is broken down into the activities of determining the tasks required for parametric cost analysis (PCA), determining the skills required for each PCA task, determining the skills available in the applicant marketplace, determining the structure of the model, implementing the model, and testing it. The model, parameterized by the likelihood of job function transition, has demonstrated the capability to represent the transition of personnel across functional boundaries within a parametric organization using a linear dynamical system, and the ability to predict required staffing profiles to meet functional needs at the desired time. The model can be extended by revising the state and transition structure to provide refinements in functional definition for the parametric and extended organization.
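A minimal sketch of the linear dynamical system idea, with hypothetical job functions and transition likelihoods: staffing by function evolves as x[t+1] = A x[t] + hires, where A holds assumed transition/retention likelihoods.

```python
# Hedged sketch of a staffing projection as a linear dynamical system (values are illustrative).
import numpy as np

functions = ["analyst", "engineer", "manager"]          # hypothetical job functions
A = np.array([[0.80, 0.05, 0.00],                       # column j -> row i transition likelihoods
              [0.10, 0.85, 0.00],                       # (remainder of each column is attrition)
              [0.02, 0.05, 0.90]])
x = np.array([10.0, 20.0, 5.0])                          # initial staffing profile
hires = np.array([2.0, 1.0, 0.5])                        # hires per period by function

for t in range(8):
    x = A @ x + hires                                    # one period of transitions plus hiring
print(dict(zip(functions, np.round(x, 1))))
```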
Zilverstand, Anna; Sorger, Bettina; Kaemingk, Anita; Goebel, Rainer
2017-06-01
We employed a novel parametric spider picture set in the context of a parametric fMRI anxiety provocation study, designed to tease apart brain regions involved in threat monitoring from regions representing an exaggerated anxiety response in spider phobics. For the stimulus set, we systematically manipulated perceived proximity of threat by varying a depicted spider's context, size, and posture. All stimuli were validated in a behavioral rating study (phobics n = 20; controls n = 20; all female). An independent group participated in a subsequent fMRI anxiety provocation study (phobics n = 7; controls n = 7; all female), in which we compared a whole-brain categorical to a whole-brain parametric analysis. Results demonstrated that the parametric analysis provided a richer characterization of the functional role of the involved brain networks. In three brain regions-the mid insula, the dorsal anterior cingulate, and the ventrolateral prefrontal cortex-activation was linearly modulated by perceived proximity specifically in the spider phobia group, indicating a quantitative representation of an exaggerated anxiety response. In other regions (e.g., the amygdala), activation was linearly modulated in both groups, suggesting a functional role in threat monitoring. Prefrontal regions, such as dorsolateral prefrontal cortex, were activated during anxiety provocation but did not show a stimulus-dependent linear modulation in either group. The results confirm that brain regions involved in anxiety processing hold a quantitative representation of a pathological anxiety response and more generally suggest that parametric fMRI designs may be a very powerful tool for clinical research in the future, particularly when developing novel brain-based interventions (e.g., neurofeedback training). Hum Brain Mapp 38:3025-3038, 2017. © 2017 Wiley Periodicals, Inc. © 2017 Wiley Periodicals, Inc.
Duarte, João Valente; Faustino, Ricardo; Lobo, Mercês; Cunha, Gil; Nunes, César; Ferreira, Carlos; Januário, Cristina; Castelo-Branco, Miguel
2016-10-01
Machado-Joseph Disease, inherited type 3 spinocerebellar ataxia (SCA3), is the most common form of spinocerebellar ataxia worldwide. Neuroimaging and neuropathology have consistently demonstrated cerebellar alterations. Here we aimed to discover whole-brain functional biomarkers, based on parametric performance-level-dependent signals. We assessed 13 patients with early SCA3 and 14 healthy participants. We used a combined parametric behavioral/functional neuroimaging design to investigate disease fingerprints, as a function of performance levels, coupled with structural MRI and voxel-based morphometry. Functional magnetic resonance imaging (fMRI) was designed to parametrically analyze behavior and neural responses to audio-paced bilateral thumb movements at temporal frequencies of 1, 3, and 5 Hz. Our performance-level-based design probing neuronal correlates of motor coordination enabled the discovery that neural activation and behavior show critical loss of parametric modulation specifically in SCA3, associated with frequency-dependent cortico/subcortical activation/deactivation patterns. Cerebellar/cortical rate-dependent dissociation patterns could clearly differentiate between groups irrespective of grey matter loss. Our findings suggest functional reorganization of the motor network and indicate a possible role of fMRI as a tool to monitor disease progression in SCA3. Accordingly, fMRI patterns proved to be potential biomarkers in early SCA3, as tested by receiver operating characteristic analysis of both behavior and neural activation at different frequencies. Discrimination analysis based on the BOLD signal in response to the applied parametric finger-tapping task often reached >80% sensitivity and specificity in single regions of interest. Functional fingerprints based on cerebellar and cortical performance-dependent BOLD signal modulation can thus be combined as diagnostic and/or therapeutic targets in hereditary ataxia. Hum Brain Mapp 37:3656-3668, 2016. © 2016 Wiley Periodicals, Inc.
Waveform inversion for orthorhombic anisotropy with P waves: feasibility and resolution
NASA Astrophysics Data System (ADS)
Kazei, Vladimir; Alkhalifah, Tariq
2018-05-01
Various parametrizations have been suggested to simplify inversions of first arrivals, or P waves, in orthorhombic anisotropic media, but the number and type of retrievable parameters have not been decisively determined. We show that only six parameters can be retrieved from the dynamic linearized inversion of P waves. These parameters are different from the six parameters needed to describe the kinematics of P waves. Reflection-based radiation patterns from the P-P scattered waves are remapped into the spectral domain to allow for our resolution analysis based on the effective angle of illumination concept. Singular value decomposition of the spectral sensitivities from various azimuths, offset coverage scenarios and data bandwidths allows us to quantify the resolution of different parametrizations, taking into account the signal-to-noise ratio in a given experiment. According to our singular value analysis, when the primary goal of inversion is determining the velocity of the P waves, gradually adding anisotropy of lower orders (isotropic, vertically transversally isotropic and orthorhombic) in hierarchical parametrization is the best choice. Hierarchical parametrization reduces the trade-off between the parameters and makes gradual introduction of lower anisotropy orders straightforward. When all the anisotropic parameters affecting P-wave propagation need to be retrieved simultaneously, the classic parametrization of orthorhombic medium with elastic stiffness matrix coefficients and density is a better choice for inversion. We provide estimates of the number and set of parameters that can be retrieved from surface seismic data in different acquisition scenarios. To set up an inversion process, the singular values determine the number of parameters that can be inverted and the resolution matrices from the parametrizations can be used to ascertain the set of parameters that can be resolved.
Multiresolution and Explicit Methods for Vector Field Analysis and Visualization
NASA Technical Reports Server (NTRS)
1996-01-01
We first report on our current progress in the area of explicit methods for tangent curve computation. The basic idea of this method is to decompose the domain into a collection of triangles (or tetrahedra) and assume linear variation of the vector field over each cell. With this assumption, the equations which define a tangent curve become a system of linear, constant coefficient ODE's which can be solved explicitly. There are five different representations of the solution, depending on the eigenvalues of the Jacobian. The analysis of these five cases is somewhat similar to the phase plane analysis often associated with critical point classification within the context of topological methods, but it is not exactly the same; there are some critical differences. Moving from one cell to the next as a tangent curve is tracked requires the computation of the exit point, which is the intersection of the solution of the constant coefficient ODE and an edge of the triangle. There are two possible approaches to this root computation problem: we can express the tangent curve in parametric form and substitute into an implicit form of the edge, or we can express the edge in parametric form and substitute into an implicit form of the tangent curve. Normally the solution of a system of ODE's is given in parametric form, so the first approach is the most accessible and straightforward. The second approach requires the 'implicitization' of these parametric curves. The implicitization of parametric curves can often be rather difficult, but in this case we have been successful and have been able to develop algorithms and subsequent computer programs for both approaches. We will give these details along with some comparisons in a forthcoming research paper on this topic.
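The cell-local closed-form solution can be sketched as follows, assuming an invertible Jacobian and an illustrative triangle and field; for brevity the exit point is bracketed and bisected numerically rather than found by the analytic edge intersection discussed above.

```python
# Hedged sketch: inside one cell the field is v(x) = J x + b, so the tangent curve through x0 is
# x(t) = xc + expm(J t) (x0 - xc) with xc = -J^{-1} b (assuming J invertible). Geometry and field
# values below are illustrative only.
import numpy as np
from scipy.linalg import expm

J = np.array([[0.5, -1.0],
              [1.0,  0.2]])
b = np.array([0.1, -0.3])
xc = -np.linalg.solve(J, b)            # critical point of the linear field

def tangent_curve(x0, t):
    return xc + expm(J * t) @ (x0 - xc)

tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # triangle cell
def inside(p):
    """Point-in-triangle test via barycentric coordinates."""
    T = np.column_stack([tri[1] - tri[0], tri[2] - tri[0]])
    lam = np.linalg.solve(T, p - tri[0])
    return lam.min() >= -1e-12 and lam.sum() <= 1 + 1e-12

x0 = np.array([0.2, 0.2])
t_lo, t_hi = 0.0, 0.1
while inside(tangent_curve(x0, t_hi)):   # grow the bracket until the curve has left the cell
    t_lo, t_hi = t_hi, 2.0 * t_hi
for _ in range(60):                      # bisect for the exit time
    t_mid = 0.5 * (t_lo + t_hi)
    if inside(tangent_curve(x0, t_mid)):
        t_lo = t_mid
    else:
        t_hi = t_mid
print("approximate exit point:", np.round(tangent_curve(x0, t_lo), 4))
```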
Comparison of thawing and freezing dark energy parametrizations
NASA Astrophysics Data System (ADS)
Pantazis, G.; Nesseris, S.; Perivolaropoulos, L.
2016-05-01
Dark energy equation of state w(z) parametrizations with two parameters and given monotonicity are generically either convex or concave functions. This makes them suitable for fitting either freezing or thawing quintessence models but not both simultaneously. Fitting a data set based on a freezing model with an unsuitable (concave when increasing) w(z) parametrization [like Chevallier-Polarski-Linder (CPL)] can lead to significant misleading features like crossing of the phantom divide line, incorrect w(z=0), incorrect slope, etc., that are not present in the underlying cosmological model. To demonstrate this fact we generate scattered cosmological data at both the level of w(z) and the luminosity distance D_L(z) based on either thawing or freezing quintessence models and fit them using parametrizations of convex and of concave type. We then compare statistically significant features of the best fit w(z) with actual features of the underlying model. We thus verify that the use of unsuitable parametrizations can lead to misleading conclusions. In order to avoid these problems it is important to either use both convex and concave parametrizations and select the one with the best χ² or use principal component analysis, thus splitting the redshift range into independent bins. In the latter case, however, significant information about the slope of w(z) at high redshifts is lost. Finally, we propose a new family of parametrizations w(z) = w0 + wa (z/(1+z))^n which generalizes the CPL and interpolates between thawing and freezing parametrizations as the parameter n increases to values larger than 1.
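A small sketch of the proposed family, which reduces to CPL at n = 1; the parameter values are illustrative only.

```python
# Sketch of w(z) = w0 + wa * (z/(1+z))**n; n = 1 recovers the CPL parametrization.
import numpy as np

def w_generalized(z, w0=-0.9, wa=0.3, n=1.0):
    return w0 + wa * (z / (1.0 + z)) ** n

z = np.linspace(0.0, 2.0, 5)
for n in (1.0, 2.0, 4.0):                      # larger n delays the evolution of w with redshift
    print(f"n = {n}:", np.round(w_generalized(z, n=n), 3))
```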
Le, Quang A; Bae, Yuna H; Kang, Jenny H
2016-10-01
The EMILIA trial demonstrated that trastuzumab emtansine (T-DM1) significantly increased median progression-free and overall survival relative to combination therapy with lapatinib plus capecitabine (LC) in patients with HER2-positive advanced breast cancer (ABC) previously treated with trastuzumab and a taxane. We performed an economic analysis of T-DM1 as a second-line therapy compared to LC and monotherapy with capecitabine (C) from both the US payer and societal perspectives. We developed four possible Markov models for ABC to compare the projected lifetime costs and outcomes of T-DM1, LC, and C. Model transition probabilities were estimated from the EMILIA and EGF100151 clinical trials. Direct costs of the therapies, major adverse events, laboratory tests, and disease progression, indirect costs (productivity losses due to morbidity and mortality), and health utilities were obtained from published sources. The models used a 3% discount rate and are reported in 2015 US dollars. Probabilistic sensitivity analysis and model averaging were used to account for model parametric and structural uncertainty. When incorporating both model parametric and structural uncertainty, the resulting incremental cost-effectiveness ratios (ICER) comparing T-DM1 to LC and T-DM1 to C were $183,828 per quality-adjusted life year (QALY) and $126,001/QALY from the societal perspective, respectively. From the payer's perspective, the ICERs were $220,385/QALY (T-DM1 vs. LC) and $168,355/QALY (T-DM1 vs. C). From both the US payer and societal perspectives, T-DM1 is not cost-effective when compared to the LC combination therapy at a willingness-to-pay threshold of $150,000/QALY. T-DM1 might have a better chance of being cost-effective compared to capecitabine monotherapy from the US societal perspective.
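The ICER arithmetic underlying these comparisons is simple; the sketch below uses placeholder cost and QALY values, not the trial-based model outputs.

```python
# Minimal sketch of the incremental cost-effectiveness ratio (ICER) calculation.
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

tdm1 = {"cost": 250_000.0, "qaly": 1.80}      # hypothetical lifetime discounted values
lc   = {"cost": 160_000.0, "qaly": 1.35}
wtp  = 150_000.0                               # willingness-to-pay threshold ($/QALY)

ratio = icer(tdm1["cost"], tdm1["qaly"], lc["cost"], lc["qaly"])
print(f"ICER (T-DM1 vs LC) = ${ratio:,.0f}/QALY ->",
      "cost-effective" if ratio <= wtp else "not cost-effective at this threshold")
```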
Pluripotency gene network dynamics: System views from parametric analysis.
Akberdin, Ilya R; Omelyanchuk, Nadezda A; Fadeev, Stanislav I; Leskova, Natalya E; Oschepkova, Evgeniya A; Kazantsev, Fedor V; Matushkin, Yury G; Afonnikov, Dmitry A; Kolchanov, Nikolay A
2018-01-01
Multiple experimental data sets demonstrated that the core gene network orchestrating self-renewal and differentiation of mouse embryonic stem cells involves activity of the Oct4, Sox2 and Nanog genes by means of a number of positive feedback loops among them. However, recent studies indicated that the architecture of the core gene network should also incorporate negative Nanog autoregulation and might not include positive feedbacks from Nanog to Oct4 and Sox2. Thorough parametric analysis of the mathematical model based on this revisited core regulatory circuit identified substantial changes in model dynamics depending on the strength of Oct4 and Sox2 activation and the molecular complexity of Nanog autorepression. The analysis showed the existence of four dynamical domains with different numbers of stable and unstable steady states. We hypothesize that these domains can constitute the checkpoints in a developmental progression from naïve to primed pluripotency and vice versa. During this transition, parametric conditions exist that generate oscillatory behavior of the system, explaining the heterogeneity in expression of pluripotency and differentiation factors in serum ESC cultures. Finally, simulations showed that the addition of positive feedbacks from Nanog to Oct4 and Sox2 mainly increases the parametric space for the naïve ESC state, in which pluripotency factors are strongly expressed while differentiation factors are repressed.
Kattner, Florian; Cochrane, Aaron; Green, C Shawn
2017-09-01
The majority of theoretical models of learning consider learning to be a continuous function of experience. However, most perceptual learning studies use thresholds estimated by fitting psychometric functions to independent blocks, sometimes then fitting a parametric function to these block-wise estimated thresholds. Critically, such approaches tend to violate the basic principle that learning is continuous through time (e.g., by aggregating trials into large "blocks" for analysis that each assume stationarity, then fitting learning functions to these aggregated blocks). To address this discrepancy between base theory and analysis practice, here we instead propose fitting a parametric function to thresholds from each individual trial. In particular, we implemented a dynamic psychometric function whose parameters were allowed to change continuously with each trial, thus parameterizing nonstationarity. We fit the resulting continuous time parametric model to data from two different perceptual learning tasks. In nearly every case, the quality of the fits derived from the continuous time parametric model outperformed the fits derived from a nonparametric approach wherein separate psychometric functions were fit to blocks of trials. Because such a continuous trial-dependent model of perceptual learning also offers a number of additional advantages (e.g., the ability to extrapolate beyond the observed data; the ability to estimate performance on individual critical trials), we suggest that this technique would be a useful addition to each psychophysicist's analysis toolkit.
Creating A Data Base For Design Of An Impeller
NASA Technical Reports Server (NTRS)
Prueger, George H.; Chen, Wei-Chung
1993-01-01
Report describes use of Taguchi method of parametric design to create data base facilitating optimization of design of impeller in centrifugal pump. Data base enables systematic design analysis covering all significant design parameters. Reduces time and cost of parametric optimization of design: for the particular impeller considered, one can cover 4,374 designs with computational simulations of performance for only 18 cases.
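A rough sketch of the bookkeeping: one 2-level factor and seven 3-level factors give 2 x 3^7 = 4,374 full-factorial combinations, which a Taguchi L18 array samples with 18 runs. The design matrix below is a random stand-in rather than the true L18 orthogonal array, and the response is simulated, purely to illustrate the main-effects summary; factor names are hypothetical.

```python
# Illustrative main-effects analysis for an 18-run parametric design (not the actual L18 array).
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
factors = {"blade_count": [5, 6]}                                  # hypothetical 2-level factor
factors.update({f"param_{i}": [1, 2, 3] for i in range(1, 8)})     # seven 3-level factors

full_factorial = np.prod([len(v) for v in factors.values()])
print("full factorial size:", full_factorial)                      # 4374

runs = pd.DataFrame({name: rng.choice(levels, size=18) for name, levels in factors.items()})
runs["head_rise"] = 100 + 3 * runs["blade_count"] + 2 * runs["param_1"] + rng.normal(0, 1, 18)

# Main-effect estimate for each factor: mean response at each level.
for name in factors:
    print(name, runs.groupby(name)["head_rise"].mean().round(1).to_dict())
```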
Nonlinear Analysis of Mechanical Systems Under Combined Harmonic and Stochastic Excitation
1993-05-27
N. Sri Namachchivaya and Naresh Malhotra, Department of Aeronautical and Astronautical Engineering, University of Illinois at Urbana-Champaign, Urbana, Illinois. Related publications cited in the excerpt include: N. Sri Namachchivaya and N. Malhotra, Parametrically Excited Hopf Bifurcation with Non-semisimple 1:1 Resonance, Nonlinear Vibrations, ASME-AMD, Vol. 114, 1992.
Efficient Characterization of Parametric Uncertainty of Complex (Bio)chemical Networks.
Schillings, Claudia; Sunnåker, Mikael; Stelling, Jörg; Schwab, Christoph
2015-08-01
Parametric uncertainty is a particularly challenging and relevant aspect of systems analysis in domains such as systems biology where, both for inference and for assessing prediction uncertainties, it is essential to characterize the system behavior globally in the parameter space. However, current methods based on local approximations or on Monte-Carlo sampling cope only insufficiently with high-dimensional parameter spaces associated with complex network models. Here, we propose an alternative deterministic methodology that relies on sparse polynomial approximations. We propose a deterministic computational interpolation scheme which identifies most significant expansion coefficients adaptively. We present its performance in kinetic model equations from computational systems biology with several hundred parameters and state variables, leading to numerical approximations of the parametric solution on the entire parameter space. The scheme is based on adaptive Smolyak interpolation of the parametric solution at judiciously and adaptively chosen points in parameter space. As Monte-Carlo sampling, it is "non-intrusive" and well-suited for massively parallel implementation, but affords higher convergence rates. This opens up new avenues for large-scale dynamic network analysis by enabling scaling for many applications, including parameter estimation, uncertainty quantification, and systems design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charbonneau-Lefort, Mathieu; Afeyan, Bedros; Fejer, M. M.
Optical parametric amplifiers using chirped quasi-phase-matching (QPM) gratings offer the possibility of engineering the gain and group delay spectra. We give practical formulas for the design of such amplifiers. We consider linearly chirped QPM gratings providing constant gain over a broad bandwidth, sinusoidally modulated profiles for selective frequency amplification and a pair of QPM gratings working in tandem to ensure constant gain and constant group delay at the same time across the spectrum. Finally, the analysis is carried out in the frequency domain using Wentzel–Kramers–Brillouin analysis.
Ground-Based Telescope Parametric Cost Model
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Rowell, Ginger Holmes
2004-01-01
A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
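A hedged sketch of a multi-variable cost model of the usual power-law form, fitted by ordinary least squares in log space; the observations below are synthetic placeholders, not the historical telescope data set.

```python
# Sketch: fit cost ~ a * D**b1 * lambda**b2 by linear regression on log-transformed data.
import numpy as np

rng = np.random.default_rng(3)
n = 30
diameter = rng.uniform(1.0, 10.0, n)                 # aperture diameter, m
wavelength = rng.uniform(0.5, 10.0, n)               # diffraction-limited wavelength, um
cost = 2.0 * diameter**2.5 * wavelength**-0.3 * np.exp(rng.normal(0, 0.2, n))

X = np.column_stack([np.ones(n), np.log(diameter), np.log(wavelength)])
coef, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
a, b_diam, b_wave = np.exp(coef[0]), coef[1], coef[2]
print(f"cost ~ {a:.2f} * D^{b_diam:.2f} * lambda^{b_wave:.2f}")
```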
Solid state SPS microwave generation and transmission study. Volume 1: Phase 2
NASA Technical Reports Server (NTRS)
Maynard, O. E.
1980-01-01
The solid state sandwich concept for the Solar Power Station (SPS) was investigated. The design effort concentrated on the spacetenna, but did include some system analysis for parametric comparison purposes. The study specifically included definition and math modeling of basic solid state microwave devices, an initial conceptual subsystem and system design, sidelobe control and system selection, an assessment of the selected system concept, and parametric solid state microwave power transmission system data relevant to the SPS concept. Although improving device efficiency was not a goal, the sensitivity of the design to this efficiency was treated parametrically. Sidelobe control consisted of various single-step tapers, multistep tapers, and Gaussian tapers. A preliminary assessment of a hybrid concept using tubes and solid state is also included. A considerable amount of thermal analysis is provided, with emphasis on sensitivities to waste heat radiator form factor, emissivity, absorptivity, amplifier efficiency, material and junction temperature.
The use of analysis of variance procedures in biological studies
Williams, B.K.
1987-01-01
The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
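The dependence of tests on the chosen parametrization can be seen directly with standard software; the sketch below simulates an unbalanced two-factor design and compares Type II and Type III ANOVA tables (the factors, effects, and cell counts are made up for illustration).

```python
# Illustrative unbalanced two-factor fixed-effects ANOVA with different sums-of-squares types.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
# Deliberately unbalanced cell counts for a 2 x 3 design.
cells = [("a", "x", 8), ("a", "y", 3), ("a", "z", 6), ("b", "x", 2), ("b", "y", 9), ("b", "z", 4)]
rows = []
for A, B, n in cells:
    effect = (A == "b") * 1.5 + {"x": 0.0, "y": 0.8, "z": -0.5}[B]
    rows += [{"A": A, "B": B, "y": effect + rng.normal(0, 1)} for _ in range(n)]
df = pd.DataFrame(rows)

model = smf.ols("y ~ C(A) * C(B)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # Type II sums of squares
print(sm.stats.anova_lm(model, typ=3))   # Type III; can differ for unbalanced data
                                         # (and is itself contrast-coding dependent)
```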
Schwabe, Inga; Boomsma, Dorret I; van den Berg, Stéphanie M
2017-12-01
Genotype by environment interaction in behavioral traits may be assessed by estimating the proportion of variance that is explained by genetic and environmental influences conditional on a measured moderating variable, such as a known environmental exposure. Behavioral traits of interest are often measured by questionnaires and analyzed as sum scores on the items. However, statistical results on genotype by environment interaction based on sum scores can be biased due to the properties of a scale. This article presents a method that makes it possible to analyze the actually observed (phenotypic) item data rather than a sum score by simultaneously estimating the genetic model and an item response theory (IRT) model. In the proposed model, the estimation of genotype by environment interaction is based on an alternative parametrization that is uniquely identified and therefore to be preferred over standard parametrizations. A simulation study shows good performance of our method compared to analyzing sum scores in terms of bias. Next, we analyzed data of 2,110 12-year-old Dutch twin pairs on mathematical ability. Genetic models were evaluated and genetic and environmental variance components estimated as a function of a family's socio-economic status (SES). Results suggested that common environmental influences are less important in creating individual differences in mathematical ability in families with a high SES than in creating individual differences in mathematical ability in twin pairs with a low or average SES.
Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca
2016-10-15
The Expected Value of Perfect Partial Information (EVPPI) is a decision-theoretic measure of the 'cost' of parametric uncertainty in decision making used principally in health economic decision making. Despite this decision-theoretic grounding, the uptake of EVPPI calculations in practice has been slow. This is in part due to the prohibitive computational time required to estimate the EVPPI via Monte Carlo simulations. However, recent developments have demonstrated that the EVPPI can be estimated by non-parametric regression methods, which have significantly decreased the computation time required to approximate the EVPPI. Under certain circumstances, high-dimensional Gaussian Process (GP) regression is suggested, but this can still be prohibitively expensive. Applying fast computation methods developed in spatial statistics using Integrated Nested Laplace Approximations (INLA) and projecting from a high-dimensional into a low-dimensional input space allows us to decrease the computation time for fitting these high-dimensional GP, often substantially. We demonstrate that the EVPPI calculated using our method for GP regression is in line with the standard GP regression method and that despite the apparent methodological complexity of this new method, R functions are available in the package BCEA to implement it simply and efficiently. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
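The regression idea behind fast EVPPI estimation can be sketched with a toy two-option decision problem; a low-order polynomial stands in here for the GP/INLA regression used in the paper, and all net-benefit values are simulated placeholders.

```python
# Hedged sketch of regression-based EVPPI: EVPPI = E_phi[max_d E(NB_d | phi)] - max_d E(NB_d).
import numpy as np

rng = np.random.default_rng(11)
S = 5000
phi = rng.normal(0.7, 0.1, S)                      # parameter of (partial) interest
noise = rng.normal(0.0, 0.05, (S, 2))              # remaining parametric uncertainty
nb = np.column_stack([20 * phi, 12 + 5 * phi]) + 100 * noise   # net benefit per option

# Approximate E[NB_d | phi] by a simple polynomial regression, then apply the EVPPI identity.
fitted = np.column_stack([
    np.polyval(np.polyfit(phi, nb[:, d], deg=2), phi) for d in range(nb.shape[1])
])
evppi = np.mean(fitted.max(axis=1)) - nb.mean(axis=0).max()
print(f"EVPPI estimate: {evppi:.2f} (same monetary units as net benefit)")
```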
Kumemura, Momoko; Odake, Tamao; Korenaga, Takashi
2005-06-01
A laser-induced fluorescence microscopic system based on optical parametric oscillation has been constructed as a tunable detector for microchip analysis. The detection limit of sulforhodamine B (Ex. 520 nm, Em. 570 nm) was 0.2 μmol, which was approximately eight orders of magnitude better than with a conventional fluorophotometer. The system was applied to the determination of fluorescence-labeled DNA (Ex. 494 nm, Em. 519 nm) in a microchannel and the detection limit reached a single molecule. These results showed the feasibility of this system as a highly sensitive and tunable fluorescence detector for microchip analysis.
NASA Astrophysics Data System (ADS)
Thomas, Mark Andrew
Throughout the 1970s and 1980s the Soviet Union sold oil shipments to the member-states of the Council of Mutual Economic Assistance (CMEA) at a fraction of the world market price (wmp). Contrary to arguments made by previous scholars that it paid a subsidy, namely the difference between the wmp and the CMEA price, either as a reward for material contributions to Soviet foreign policy objectives or as a consequence of membership in a customs union, the Soviet Union provided subsidized oil shipments as a form of economic assistance in maintaining its hegemony. Using non-parametric statistical analysis of previous scholars' data and comparative case studies based on interviews of Soviet decision-makers and on archival research, this study shows that the Soviet Union acted as a hegemon that created a protectionist trade regime and used oil policy as a means of hegemonic maintenance. The CMEA, the embodiment of values espoused in the Soviet trade regime identified as "embedded supranationalism", stood as the institutional antithesis of a customs union, which embodied the values of the Western liberal trade regime. Soviet leaders did not use oil subsidies or trade relations in general as a means of calibrating CMEA member-states' domestic or foreign policy behavior. Soviet leaders used subsidized oil as a means of supporting East European national economic development, with the ultimate goal of creating politically legitimate governments and thereby ensuring political stability in its cordon sanitaire with the West.
The ASAC Air Carrier Investment Model (Second Generation)
NASA Technical Reports Server (NTRS)
Wingrove, Earl R., III; Johnson, Jesse P.; Sickles, Robin C.; Good, David H.
1997-01-01
To meet its objective of assisting the U.S. aviation industry with the technological challenges of the future, NASA must identify research areas that have the greatest potential for improving the operation of the air transportation system. To accomplish this, NASA is building an Aviation System Analysis Capability (ASAC). The ASAC differs from previous NASA modeling efforts in that the economic behavior of buyers and sellers in the air transportation and aviation industries is central to its conception. To link the economics of flight with the technology of flight, ASAC requires a parametrically based model with extensions that link airline operations and investments in aircraft with aircraft characteristics. This model also must provide a mechanism for incorporating air travel demand and profitability factors into the airlines' investment decisions. Finally, the model must be flexible and capable of being incorporated into a wide-ranging suite of economic and technical models that are envisioned for ASAC. We describe a second-generation Air Carrier Investment Model that meets these requirements. The enhanced model incorporates econometric results from the supply and demand curves faced by U.S. scheduled passenger air carriers. It uses detailed information about their fleets in 1995 to make predictions about future aircraft purchases. It enables analysts to project revenue passenger-miles flown, airline industry employment, airline operating profit margins, numbers and types of aircraft in the fleet, and changes in aircraft manufacturing employment under various user-defined scenarios.
Evolution of spherical cavitation bubbles: Parametric and closed-form solutions
NASA Astrophysics Data System (ADS)
Mancas, Stefan C.; Rosu, Haret C.
2016-02-01
We present an analysis of the Rayleigh-Plesset equation for a three dimensional vacuous bubble in water. In the simplest case when the effects of surface tension are neglected, the known parametric solutions for the radius and time evolution of the bubble in terms of a hypergeometric function are briefly reviewed. By including the surface tension, we show the connection between the Rayleigh-Plesset equation and Abel's equation, and obtain the parametric rational Weierstrass periodic solutions following the Abel route. In the same Abel approach, we also provide a discussion of the nonintegrable case of nonzero viscosity for which we perform a numerical integration.
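A minimal numerical sketch of the Rayleigh-Plesset dynamics discussed above is given below for the nonintegrable case with surface tension and viscosity. The fluid properties, initial radius, and integration settings are illustrative assumptions (water at ambient conditions), not values taken from the paper.

```python
# Minimal sketch: numerical integration of the Rayleigh-Plesset equation for a
# vacuous bubble (zero internal pressure) in water, including surface tension
# and viscosity. All parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

rho   = 998.0      # water density, kg/m^3
p_inf = 101325.0   # ambient pressure, Pa
sigma = 0.0728     # surface tension, N/m (set to 0 for the classical vacuous case)
mu    = 1.0e-3     # dynamic viscosity, Pa*s (set to 0 for the integrable cases)
R0    = 1.0e-3     # initial radius, m

def rayleigh_plesset(t, y):
    R, Rdot = y
    # R*Rddot + 1.5*Rdot^2 = (p_B - p_inf - 2*sigma/R - 4*mu*Rdot/R)/rho, with p_B = 0
    Rddot = (-p_inf / rho - 2.0 * sigma / (rho * R)
             - 4.0 * mu * Rdot / (rho * R) - 1.5 * Rdot**2) / R
    return [Rdot, Rddot]

# Stop the integration when the bubble has (numerically) collapsed.
collapse = lambda t, y: y[0] - 1.0e-6 * R0
collapse.terminal = True

sol = solve_ivp(rayleigh_plesset, (0.0, 2.0e-4), [R0, 0.0],
                events=collapse, max_step=1.0e-8, rtol=1e-9, atol=1e-12)
print(f"collapse time ~ {sol.t[-1]:.3e} s "
      f"(Rayleigh estimate ~ {0.915 * R0 * np.sqrt(rho / p_inf):.3e} s)")
```

The printed Rayleigh estimate is the classical collapse time for the zero-surface-tension, inviscid case and serves only as a sanity check on the integration.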
Analysis of a Rocket Based Combined Cycle Engine during Rocket Only Operation
NASA Technical Reports Server (NTRS)
Smith, T. D.; Steffen, C. J., Jr.; Yungster, S.; Keller, D. J.
1998-01-01
The all rocket mode of operation is a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. However, outside of performing experiments or a full three dimensional analysis, there are no first order parametric models to estimate performance. As a result, an axisymmetric RBCC engine was used to analytically determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and statistical regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, percent of injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inject diameter ratio. A perfect gas computational fluid dynamics analysis was performed to obtain values of vacuum specific impulse. Statistical regression analysis was performed based on both full flow and gas generator engine cycles. Results were also found to be dependent upon the entire cycle assumptions. The statistical regression analysis determined that there were five significant linear effects, six interactions, and one second-order effect. Two parametric models were created to provide performance assessments of an RBCC engine in the all rocket mode of operation.
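The regression step described above can be sketched generically as fitting a quadratic response surface (linear, interaction, and second-order terms) to tabulated design-of-experiments results. In the sketch below, the input variables, the synthetic "vacuum Isp efficiency" response, and all coefficients are made-up placeholders standing in for the paper's CFD output.

```python
# Sketch of a design-of-experiments regression: fit a quadratic response surface
# with interaction terms to tabulated results. Data are synthetic placeholders,
# not the paper's CFD values; variable names are illustrative.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Columns: chamber pressure, area ratio, % secondary flow (scaled to [-1, 1])
X = rng.uniform(-1.0, 1.0, size=(30, 3))
# Hypothetical response with one interaction and one quadratic effect plus noise.
y = (0.92 + 0.03 * X[:, 0] - 0.02 * X[:, 1] + 0.015 * X[:, 0] * X[:, 2]
     - 0.01 * X[:, 1] ** 2 + rng.normal(0.0, 0.002, size=30))

poly = PolynomialFeatures(degree=2, include_bias=False)
Xp = poly.fit_transform(X)                 # linear + interaction + quadratic terms
model = LinearRegression().fit(Xp, y)

for name, coef in zip(poly.get_feature_names_out(["p_c", "AR", "sec_flow"]),
                      model.coef_):
    print(f"{name:>15s}: {coef:+.4f}")
print("R^2 =", round(model.score(Xp, y), 4))
```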
Buffer thermal energy storage for an air Brayton solar engine
NASA Technical Reports Server (NTRS)
Strumpf, H. J.; Barr, K. P.
1981-01-01
The application of latent-heat buffer thermal energy storage to a point-focusing solar receiver equipped with an air Brayton engine was studied. To demonstrate the effect of buffer thermal energy storage on engine operation, a computer program was written which models the recuperator, receiver, and thermal storage device as finite-element thermal masses. Actual operating or predicted performance data are used for all components, including the rotating equipment. Based on insolation input and a specified control scheme, the program predicts the Brayton engine operation, including flows, temperatures, and pressures for the various components, along with the engine output power. An economic parametric study indicates that the economic viability of buffer thermal energy storage is largely a function of the achievable engine life.
Direct 4D reconstruction of parametric images incorporating anato-functional joint entropy.
Tang, Jing; Kuwabara, Hiroto; Wong, Dean F; Rahmim, Arman
2010-08-07
We developed an anatomy-guided 4D closed-form algorithm to directly reconstruct parametric images from projection data for (nearly) irreversible tracers. Conventional methods consist of individually reconstructing 2D/3D PET data, followed by graphical analysis on the sequence of reconstructed image frames. The proposed direct reconstruction approach maintains the simplicity and accuracy of the expectation-maximization (EM) algorithm by extending the system matrix to include the relation between the parametric images and the measured data. A closed-form solution was achieved using a different hidden complete-data formulation within the EM framework. Furthermore, the proposed method was extended to maximum a posteriori reconstruction via incorporation of MR image information, taking the joint entropy between MR and parametric PET features as the prior. Using realistic simulated noisy [(11)C]-naltrindole PET and MR brain images/data, the quantitative performance of the proposed methods was investigated. Significant improvements in terms of noise versus bias performance were demonstrated when performing direct parametric reconstruction, and additionally upon extending the algorithm to its Bayesian counterpart using the MR-PET joint entropy measure.
Direct Parametric Reconstruction With Joint Motion Estimation/Correction for Dynamic Brain PET Data.
Jiao, Jieqing; Bousse, Alexandre; Thielemans, Kris; Burgos, Ninon; Weston, Philip S J; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Markiewicz, Pawel; Ourselin, Sebastien
2017-01-01
Direct reconstruction of parametric images from raw photon counts has been shown to improve the quantitative analysis of dynamic positron emission tomography (PET) data. However, it suffers from subject motion, which is inevitable during the typical acquisition time of 1-2 hours. In this work we propose a framework to jointly estimate subject head motion and reconstruct the motion-corrected parametric images directly from raw PET data, so that the effects of distorted tissue-to-voxel mapping due to subject motion can be reduced in reconstructing the parametric images with motion-compensated attenuation correction and spatially aligned temporal PET data. The proposed approach is formulated within the maximum likelihood framework, and efficient solutions are derived for estimating subject motion and kinetic parameters from raw PET photon count data. Results from evaluations on simulated [11C]raclopride data using the Zubal brain phantom and real clinical [18F]florbetapir data of a patient with Alzheimer's disease show that the proposed joint direct parametric reconstruction and motion correction approach can improve the accuracy of quantifying dynamic PET data with large subject motion.
Selecting a separable parametric spatiotemporal covariance structure for longitudinal imaging data.
George, Brandon; Aban, Inmaculada
2015-01-15
Longitudinal imaging studies allow great insight into how the structure and function of a subject's internal anatomy changes over time. Unfortunately, the analysis of longitudinal imaging data is complicated by inherent spatial and temporal correlation: the temporal from the repeated measures and the spatial from the outcomes of interest being observed at multiple points in a patient's body. We propose the use of a linear model with a separable parametric spatiotemporal error structure for the analysis of repeated imaging data. The model makes use of spatial (exponential, spherical, and Matérn) and temporal (compound symmetric, autoregressive-1, Toeplitz, and unstructured) parametric correlation functions. A simulation study, inspired by a longitudinal cardiac imaging study on mitral regurgitation patients, compared different information criteria for selecting a particular separable parametric spatiotemporal correlation structure as well as the effects on types I and II error rates for inference on fixed effects when the specified model is incorrect. Information criteria were found to be highly accurate at choosing between separable parametric spatiotemporal correlation structures. Misspecification of the covariance structure was found to have the ability to inflate the type I error or have an overly conservative test size, which corresponded to decreased power. An example with clinical data is given illustrating how the covariance structure procedure can be performed in practice, as well as how covariance structure choice can change inferences about fixed effects. Copyright © 2014 John Wiley & Sons, Ltd.
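A separable spatiotemporal covariance of the kind described above can be illustrated as the Kronecker product of a spatial correlation function and a temporal one, with candidate structures compared by a Gaussian log-likelihood or AIC. The sketch below uses a toy layout of locations and time points and made-up parameter values; it is not the paper's simulation design.

```python
# Sketch of a separable spatiotemporal covariance: Kronecker product of a spatial
# exponential correlation and a temporal AR(1) correlation, with two candidate
# structures scored by Gaussian log-likelihood / AIC. All values are illustrative.
import numpy as np
from scipy.stats import multivariate_normal

def spatial_exponential(dists, phi):
    return np.exp(-dists / phi)

def temporal_ar1(n_times, rho):
    idx = np.arange(n_times)
    return rho ** np.abs(idx[:, None] - idx[None, :])

# Toy layout: 4 spatial locations on a line, 5 repeated measures per subject.
coords = np.array([0.0, 1.0, 2.0, 3.0])
dists = np.abs(coords[:, None] - coords[None, :])
n_t = 5

def separable_cov(phi, rho, sigma2):
    return sigma2 * np.kron(temporal_ar1(n_t, rho), spatial_exponential(dists, phi))

rng = np.random.default_rng(1)
true_cov = separable_cov(phi=1.5, rho=0.6, sigma2=1.0)
data = rng.multivariate_normal(np.zeros(true_cov.shape[0]), true_cov, size=50)

for label, cov in [("exp x AR(1)", separable_cov(1.5, 0.6, 1.0)),
                   ("exp x independent", separable_cov(1.5, 0.0, 1.0))]:
    ll = multivariate_normal(mean=np.zeros(cov.shape[0]), cov=cov).logpdf(data).sum()
    aic = -2.0 * ll + 2 * 3        # 3 covariance parameters in this toy comparison
    print(f"{label:>18s}: logL = {ll:9.1f}, AIC = {aic:9.1f}")
```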
Can you trust the parametric standard errors in nonlinear least squares? Yes, with provisos.
Tellinghuisen, Joel
2018-04-01
Questions about the reliability of parametric standard errors (SEs) from nonlinear least squares (LS) algorithms have led to a general mistrust of these precision estimators that is often unwarranted. The importance of non-Gaussian parameter distributions is illustrated by converting linear models to nonlinear by substituting e^A, ln A, and 1/A for a linear parameter a. Monte Carlo (MC) simulations characterize parameter distributions in more complex cases, including when data have varying uncertainty and should be weighted, but weights are neglected. This situation leads to loss of precision and erroneous parametric SEs, as is illustrated for the Lineweaver-Burk analysis of enzyme kinetics data and the analysis of isothermal titration calorimetry data. Non-Gaussian parameter distributions are generally asymmetric and biased. However, when the parametric SE is <10% of the magnitude of the parameter, both the bias and the asymmetry can usually be ignored. Sometimes nonlinear estimators can be redefined to give more normal distributions and better convergence properties. Variable data uncertainty, or heteroscedasticity, can sometimes be handled by data transforms but more generally requires weighted LS, which in turn require knowledge of the data variance. Parametric SEs are rigorously correct in linear LS under the usual assumptions, and are a trustworthy approximation in nonlinear LS provided they are sufficiently small - a condition favored by the abundant, precise data routinely collected in many modern instrumental methods. Copyright © 2018 Elsevier B.V. All rights reserved.
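The Monte Carlo check described above can be sketched as follows: repeatedly simulate noisy data, refit by nonlinear least squares, and compare the parametric SE reported by the fitter with the empirical spread of the estimates. The model y = e^A x (a linear model made nonlinear by substituting e^A for the slope) and the noise level are illustrative choices, not the paper's examples.

```python
# Compare the parametric SE from nonlinear least squares (sqrt of the covariance
# diagonal) with the empirical spread of estimates over repeated noisy data sets.
import numpy as np
from scipy.optimize import curve_fit

def model(x, A):
    return np.exp(A) * x

rng = np.random.default_rng(42)
x = np.linspace(1.0, 10.0, 20)
A_true, noise_sd, n_mc = 0.5, 0.3, 2000

estimates, reported_se = [], []
for _ in range(n_mc):
    y = model(x, A_true) + rng.normal(0.0, noise_sd, size=x.size)
    popt, pcov = curve_fit(model, x, y, p0=[0.0])
    estimates.append(popt[0])
    reported_se.append(np.sqrt(pcov[0, 0]))

estimates = np.asarray(estimates)
print("mean reported parametric SE :", round(float(np.mean(reported_se)), 4))
print("empirical SD of A estimates :", round(float(np.std(estimates)), 4))
print("bias of A estimates         :", round(float(np.mean(estimates) - A_true), 4))
```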
Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data
Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao
2012-01-01
Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, often there is prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic variance of the estimates can be reduced significantly while keeping the asymptotic variance same as the unguided estimator. We observe the performance of our method via a simulation study and demonstrate our method by applying to a real data set on mergers and acquisitions. PMID:23645976
Falk, Carl F; Cai, Li
2016-06-01
We present a semi-parametric approach to estimating item response functions (IRF) useful when the true IRF does not strictly follow commonly used functions. Our approach replaces the linear predictor of the generalized partial credit model with a monotonic polynomial. The model includes the regular generalized partial credit model at the lowest order polynomial. Our approach extends Liang's (A semi-parametric approach to estimate IRFs, Unpublished doctoral dissertation, 2007) method for dichotomous item responses to the case of polytomous data. Furthermore, item parameter estimation is implemented with maximum marginal likelihood using the Bock-Aitkin EM algorithm, thereby facilitating multiple group analyses useful in operational settings. Our approach is demonstrated on both educational and psychological data. We present simulation results comparing our approach to more standard IRF estimation approaches and other non-parametric and semi-parametric alternatives.
THz-wave parametric sources and imaging applications
NASA Astrophysics Data System (ADS)
Kawase, Kodo
2004-12-01
We have studied the generation of terahertz (THz) waves by optical parametric processes based on laser light scattering from the polariton mode of nonlinear crystals. Using parametric oscillation of MgO-doped LiNbO3 crystal pumped by a nano-second Q-switched Nd:YAG laser, we have realized a widely tunable coherent THz-wave source with a simple configuration. We have also developed a novel basic technology for THz imaging, which allows detection and identification of chemicals by introducing the component spatial pattern analysis. The spatial distributions of the chemicals were obtained from terahertz multispectral transillumination images, using absorption spectra previously measured with a widely tunable THz-wave parametric oscillator. Further, we have applied this technique to the detection and identification of illicit drugs concealed in envelopes. The samples we used were methamphetamine and MDMA, two of the most widely consumed illegal drugs in Japan, and aspirin as a reference.
Parametrization study of the land multiparameter VTI elastic waveform inversion
NASA Astrophysics Data System (ADS)
He, W.; Plessix, R.-É.; Singh, S.
2018-06-01
Multiparameter inversion of seismic data remains challenging due to the trade-off between the different elastic parameters and the non-uniqueness of the solution. The sensitivity of the seismic data to a given subsurface elastic parameter depends on the source and receiver ray/wave path orientations at the subsurface point. In a high-frequency approximation, this is commonly analysed through the study of the radiation patterns that indicate the sensitivity of each parameter versus the incoming (from the source) and outgoing (to the receiver) angles. In practice, this means that the inversion result becomes sensitive to the choice of parametrization, notably because the null-space of the inversion depends on this choice. We can use a least-overlapping parametrization that minimizes the overlaps between the radiation patterns, in which case each parameter is only sensitive in a restricted angle domain, or an overlapping parametrization that contains a parameter sensitive to all angles, in which case overlaps between the radiation patterns occur. Considering a multiparameter inversion in an elastic vertically transverse isotropic medium and a complex land geological setting, we show that the inversion with the least-overlapping parametrization gives less satisfactory results than with the overlapping parametrization. The difficulties come from the complex wave paths that make it difficult to predict the areas of sensitivity of each parameter. This shows that the parametrization choice should not only be based on the radiation pattern analysis but also on the angular coverage at each subsurface point, which depends on geology and the acquisition layout.
Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob
2016-08-01
The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimate directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions. It is applied to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel non-parametric alternative for estimating directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data. Copyright © 2016 Elsevier B.V. All rights reserved.
Comparison of Salmonella enteritidis phage types isolated from layers and humans in Belgium in 2005.
Welby, Sarah; Imberechts, Hein; Riocreux, Flavien; Bertrand, Sophie; Dierick, Katelijne; Wildemauwe, Christa; Hooyberghs, Jozef; Van der Stede, Yves
2011-08-01
The aim of this study was to investigate the available results for Belgium of the European Union coordinated monitoring program (2004/665 EC) on Salmonella in layers in 2005, as well as the results of the monthly outbreak reports of Salmonella Enteritidis in humans in 2005, to identify a possible statistically significant trend in both populations. Separate descriptive statistics and univariate analysis were carried out and parametric and/or non-parametric hypothesis tests were conducted. A time cluster analysis was performed for all Salmonella Enteritidis phage types (PTs) isolated. The proportions of each Salmonella Enteritidis PT in layers and in humans were compared and the monthly distribution of the most common PT, isolated in both populations, was evaluated. The time cluster analysis revealed significant clusters during the months May and June for layers and May, July, August, and September for humans. PT21, the most frequently isolated PT in both populations in 2005, seemed to be responsible for these significant clusters. PT4 was the second most frequently isolated PT. No significant difference was found for the monthly trend evolution of either PT in the two populations based on parametric and non-parametric methods. A similar monthly trend of PT distribution in humans and layers during the year 2005 was observed. The time cluster analysis and the statistical significance testing confirmed these results. Moreover, the time cluster analysis showed significant clusters during the summer time, slightly delayed in time (humans after layers). These results suggest a common link between the prevalence of Salmonella Enteritidis in layers and the occurrence of the pathogen in humans. Phage typing was confirmed to be a useful tool for identifying temporal trends.
Inverse Thermal Analysis of Titanium GTA Welds Using Multiple Constraints
NASA Astrophysics Data System (ADS)
Lambrakos, S. G.; Shabaev, A.; Huang, L.
2015-06-01
Inverse thermal analysis of titanium gas-tungsten-arc welds using multiple constraint conditions is presented. This analysis employs a methodology based on numerical-analytical basis functions for inverse thermal analysis of steady-state energy deposition in plate structures. The results of this type of analysis provide parametric representations of weld temperature histories that can be adopted as input data to various types of computational procedures, such as those for prediction of solid-state phase transformations. In addition, these temperature histories can be used to construct parametric function representations for inverse thermal analysis of welds corresponding to other process parameters or welding processes whose process conditions are within similar regimes. The present study applies an inverse thermal analysis procedure that provides for the inclusion of constraint conditions associated with both solidification and phase transformation boundaries.
Potency control of modified live viral vaccines for veterinary use.
Terpstra, C; Kroese, A H
1996-04-01
This paper reviews various aspects of efficacy, and methods for assaying the potency of modified live viral vaccines. The pros and cons of parametric versus non-parametric methods for analysis of potency assays are discussed and critical levels of protection, as determined by the target(s) of vaccination, are exemplified. Recommendations are presented for designing potency assays on master virus seeds and vaccine batches.
GASP- GENERAL AVIATION SYNTHESIS PROGRAM
NASA Technical Reports Server (NTRS)
Galloway, T. L.
1994-01-01
The General Aviation Synthesis Program, GASP, was developed to perform tasks generally associated with the preliminary phase of aircraft design. GASP gives the analyst the capability of performing parametric studies in a rapid manner during preliminary design efforts. During the development of GASP, emphasis was placed on small fixed-wing aircraft employing propulsion systems varying from a single piston engine with a fixed pitch propeller through twin turboprop/turbofan systems as employed in business or transport type aircraft. The program is comprised of modules representing the various technical disciplines of design, integrated into a computational flow which ensures that the interacting effects of design variables are continuously accounted for in the aircraft sizing procedures. GASP provides a useful tool for comparing configurations, assessing aircraft performance and economics, and performing tradeoff and sensitivity studies. By utilizing GASP, the impact of various aircraft requirements and design factors may be studied in a systematic manner, with benefits being measured in terms of overall aircraft performance and economics. The GASP program consists of a control module and six "technology" submodules which perform the various independent studies required in the design of general aviation or small transport type aircraft. The six technology modules include geometry, aerodynamics, propulsion, weight and balance, mission analysis, and economics. The geometry module calculates the dimensions of the synthesized aircraft components based on such input parameters as number of passengers, aspect ratio, taper ratio, sweep angles, and thickness of wing and tail surfaces. The aerodynamics module calculates the various lift and drag coefficients of the synthesized aircraft based on inputs concerning configuration geometry, flight conditions, and type of high lift device. The propulsion module determines the engine size and performance for the synthesized aircraft. Both cruise and take-off requirements for the aircraft may be specified. This module can currently simulate turbojet, turbofan, turboprop, and reciprocating or rotating combustion engines. The weight and balance module accepts as input gross weight, payload, aircraft geometry, and weight trend coefficients for use in calculating the size of tip tanks and wing location required such that the synthesized aircraft is in balance for center of gravity travel. In the mission analysis module, the taxi, take-off, climb, cruise, and landing segments of a specified mission are analyzed to compute the total range, and the aircraft size required to provide this range is determined. In the economic module both the flyaway and operating costs are determined from estimated resources and services cost. The six technology modules are integrated into a single synthesis system by the control module. This integrated approach ensures that the results from each module contain the effect of design interactions among all the modules. Starting from a set of simple input quantities concerning aircraft type, size, and performance, the synthesis is extended to the point where all of the important aircraft characteristics have been analyzed quantitatively. Together, the synthesis model and procedure develops aircraft configurations in a manner useful in parametric analysis and provides a useful step toward more detailed analytical and experimental studies. 
The GASP program is written in FORTRAN IV for batch execution and has been implemented on a CDC CYBER 170 series computer with a central memory requirement of approximately 200K(octal) of 60 bit words. The GASP program was developed in 1978.
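The control-module integration described above can be pictured as an iterative sizing loop that chains the technology modules until the gross weight converges. The sketch below is purely hypothetical: the module relations (a Breguet-style cruise fuel fraction, a fixed empty-weight fraction, a parabolic drag polar) and every constant are illustrative stand-ins, not GASP's actual methods, units, or equations.

```python
# Hypothetical sketch of a synthesis control loop: placeholder "technology
# modules" are chained and the gross weight is iterated to convergence.
# Every relation and constant below is an illustrative assumption.
import math

def geometry(gross_weight, aspect_ratio=7.5, wing_loading=90.0):
    wing_area = gross_weight / wing_loading
    span = math.sqrt(aspect_ratio * wing_area)
    return {"wing_area": wing_area, "span": span}

def aerodynamics(geom, cd0=0.025, e=0.8):
    # Crude best lift-to-drag estimate from a parabolic drag polar.
    ar = geom["span"] ** 2 / geom["wing_area"]
    k = 1.0 / (math.pi * e * ar)
    return {"l_over_d": 1.0 / (2.0 * math.sqrt(cd0 * k))}

def mission(gross_weight, aero, sfc=0.5, range_nm=700.0, speed_kt=180.0):
    # Breguet-style cruise fuel fraction (illustrative only).
    frac = 1.0 - math.exp(-range_nm * sfc / (speed_kt * aero["l_over_d"]))
    return {"fuel_weight": frac * gross_weight}

def weights(payload, fuel_weight, gross_weight, empty_fraction=0.58):
    return payload + fuel_weight + empty_fraction * gross_weight

payload, gross = 1200.0, 6000.0            # lb, initial guess
for _ in range(50):
    geom = geometry(gross)
    aero = aerodynamics(geom)
    fuel = mission(gross, aero)["fuel_weight"]
    new_gross = weights(payload, fuel, gross)
    if abs(new_gross - gross) < 1.0:       # converged
        break
    gross = new_gross
print(f"converged gross weight ~ {gross:.0f} lb, cruise fuel ~ {fuel:.0f} lb")
```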
Entangled-photon compressive ghost imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zerom, Petros; Chan, Kam Wai Clifford; Howell, John C.
2011-12-15
We have experimentally demonstrated high-resolution compressive ghost imaging at the single-photon level using entangled photons produced by a spontaneous parametric down-conversion source and using single-pixel detectors. For a given mean-squared error, the number of photons needed to reconstruct a two-dimensional image is found to be much smaller than that in quantum ghost imaging experiments employing a raster scan. This procedure not only shortens the data acquisition time, but also suggests a more economical use of photons for low-light-level and quantum image formation.
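The computational side of compressive ghost imaging can be sketched, under simplifying assumptions, as l1-regularised recovery of a sparse object from far fewer bucket measurements than pixels. The sketch below uses random ±1 illumination patterns and iterative soft thresholding (ISTA); it illustrates only the reconstruction step, not the entangled-photon optics or the detectors used in the experiment.

```python
# Generic compressive-sensing reconstruction sketch: random +/-1 patterns, one
# bucket value per pattern, and l1-regularised recovery via ISTA. Illustrative
# sizes and regularisation weight; not the paper's experimental parameters.
import numpy as np

rng = np.random.default_rng(0)
n_side = 16                         # image is n_side x n_side
n_pix = n_side * n_side
n_meas = n_pix // 3                 # fewer measurements than pixels

# Sparse test object: a few bright pixels on a dark background.
x_true = np.zeros(n_pix)
x_true[rng.choice(n_pix, size=8, replace=False)] = 1.0

A = rng.choice([-1.0, 1.0], size=(n_meas, n_pix))     # illumination patterns
y = A @ x_true + rng.normal(0.0, 0.01, size=n_meas)   # bucket detector values

# ISTA: x <- soft(x + (1/L) A^T (y - A x), lam/L)
L = np.linalg.norm(A, 2) ** 2                         # Lipschitz constant
lam = 0.1 * np.max(np.abs(A.T @ y))                   # heuristic l1 weight
x = np.zeros(n_pix)
for _ in range(1000):
    grad_step = x + (A.T @ (y - A @ x)) / L
    x = np.sign(grad_step) * np.maximum(np.abs(grad_step) - lam / L, 0.0)

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```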
Laser And Nonlinear Optical Materials For Laser Remote Sensing
NASA Technical Reports Server (NTRS)
Barnes, Norman P.
2005-01-01
NASA remote sensing missions involving laser systems and their economic impact are outlined. Potential remote sensing missions include: green house gasses, tropospheric winds, ozone, water vapor, and ice cap thickness. Systems to perform these measurements use lanthanide series lasers and nonlinear devices including second harmonic generators and parametric oscillators. Demands these missions place on the laser and nonlinear optical materials are discussed from a materials point of view. Methods of designing new laser and nonlinear optical materials to meet these demands are presented.
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various basic statistics topics along with parametric statistical data analysis. The output of the system is parametric statistical analysis that can be used by students, lecturers, and other users who need statistical results that are quick to obtain and easy to understand. The Android application is developed using the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and make it easier for students to understand statistical analysis on mobile devices.
NASA Astrophysics Data System (ADS)
Viswanath, Satish; Bloch, B. Nicholas; Chappelow, Jonathan; Patel, Pratik; Rofsky, Neil; Lenkinski, Robert; Genega, Elizabeth; Madabhushi, Anant
2011-03-01
Currently, there is significant interest in developing methods for quantitative integration of multi-parametric (structural, functional) imaging data with the objective of building automated meta-classifiers to improve disease detection, diagnosis, and prognosis. Such techniques are required to address the differences in dimensionalities and scales of individual protocols, while deriving an integrated multi-parametric data representation which best captures all disease-pertinent information available. In this paper, we present a scheme called Enhanced Multi-Protocol Analysis via Intelligent Supervised Embedding (EMPrAvISE); a powerful, generalizable framework applicable to a variety of domains for multi-parametric data representation and fusion. Our scheme utilizes an ensemble of embeddings (via dimensionality reduction, DR); thereby exploiting the variance amongst multiple uncorrelated embeddings in a manner similar to ensemble classifier schemes (e.g. Bagging, Boosting). We apply this framework to the problem of prostate cancer (CaP) detection on twelve 3-Tesla pre-operative in vivo multi-parametric (T2-weighted, Dynamic Contrast Enhanced, and Diffusion-weighted) magnetic resonance imaging (MRI) studies, in turn comprising a total of 39 2D planar MR images. We first align the different imaging protocols via automated image registration, followed by quantification of image attributes from individual protocols. Multiple embeddings are generated from the resultant high-dimensional feature space which are then combined intelligently to yield a single stable solution. Our scheme is employed in conjunction with graph embedding (for DR) and probabilistic boosting trees (PBTs) to detect CaP on multi-parametric MRI. Finally, a probabilistic pairwise Markov Random Field algorithm is used to apply spatial constraints to the result of the PBT classifier, yielding a per-voxel classification of CaP presence. Per-voxel evaluation of detection results against ground truth for CaP extent on MRI (obtained by spatially registering pre-operative MRI with available whole-mount histological specimens) reveals that EMPrAvISE yields a statistically significant improvement (AUC=0.77) over classifiers constructed from individual protocols (AUC=0.62, 0.62, 0.65, for T2w, DCE, DWI respectively) as well as one trained using multi-parametric feature concatenation (AUC=0.67).
Feasibilities of a Coal-Biomass to Liquids Plant in Southern West Virginia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharyya, Debangsu; DVallance, David; Henthorn, Greg
This project has generated comprehensive and realistic feasibility results for a coal-biomass to liquids (CBTL) plant in southern West Virginia and evaluated the sensitivity of the analyses to various anticipated scenarios and parametric uncertainties. Specifically, the project has addressed economic feasibility, technical feasibility, market feasibility, and financial feasibility. In the economic feasibility study, a multi-objective siting model was developed and then used to identify and rank suitable facility sites. Spatial models were also developed to assess biomass and coal feedstock availabilities and economics. Environmental impact analysis was conducted mainly to assess life cycle impacts and greenhouse gas emissions. Uncertainty and sensitivity analyses were also investigated in this study. Sensitivity analyses on the required selling price (RSP) and greenhouse gas (GHG) emissions of CBTL fuels were conducted according to feedstock availability and price, biomass to coal mix ratio, conversion rate, internal rate of return (IRR), capital cost, and operational and maintenance cost. The study of siting and capacity showed that the feedstock mix ratio limited CBTL production. The price of coal had a more dominant effect on RSP than that of biomass. Different feedstock mix ratios and conversion rates led to RSPs ranging from $104.3 to $157.9/bbl. LCA results indicated that GHG emissions ranged from 80.62 to 101.46 kg CO2 eq/1,000 MJ of liquid fuel at various biomass to coal mix ratios and conversion rates if carbon capture and storage (CCS) was applied. Most of the water and fossil energy was consumed in the conversion process. Compared to petroleum-derived liquid fuels, the reduction in GHG emissions could be between -2.7% and 16.2% with CBTL substitution. In the technical study, three coal-and-biomass-to-liquids approaches (direct, indirect, and hybrid) were considered in the analysis. Process models, including conceptual design, process modeling, and process validation, were developed and validated for the different cases. Equipment design and capital costs were investigated for capital cost estimation and economic model validation. Material and energy balances and a techno-economic analysis of the base case were conducted for project evaluation. Sensitivity studies of both the direct and indirect approaches were also used to evaluate the CBTL plant economic performance. In this study, techno-economic analyses were conducted in the Aspen Process Economic Analyzer (APEA) environment for indirect, direct, and hybrid CBTL plants with CCS, based on high fidelity process models developed in Aspen Plus and Excel. The process thermal efficiency ranges from 45% to 67%. The break-even oil price ranges from $86.1 to $100.6 per barrel for small scale (10000 bbl/day) CBTL plants and from $65.3 to $80.5 per barrel for large scale (50000 bbl/day) CBTL plants. Increasing the biomass/coal ratio from 8/92 to 20/80 would increase the break-even oil price of the indirect CBTL plant by $3/bbl and decrease the break-even oil price of the direct CBTL plant by about $1/bbl. The order of carbon capture penalty is direct > indirect > hybrid. The order of capital investment is hybrid (with or without shale gas utilization) > direct (without shale gas utilization) > indirect > direct (with shale gas utilization). The order of thermal efficiency is direct > hybrid > indirect.
The order of break-even oil price is hybrid (without shale gas utilization) > direct (without shale gas utilization) > hybrid (with shale gas utilization) > indirect > direct (with shale gas utilization).
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capability of VSP is demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
NASA Astrophysics Data System (ADS)
Dolev, A.; Bucher, I.
2018-04-01
Mechanical or electromechanical amplifiers can exploit the high-Q and low noise features of mechanical resonance, in particular when parametric excitation is employed. Multi-frequency parametric excitation introduces tunability and is able to project weak input signals on a selected resonance. The present paper addresses multi degree of freedom mechanical amplifiers or resonators whose analysis and features require treatment of the spatial as well as temporal behavior. In some cases, virtual electronic coupling can alter the given topology of the resonator to better amplify specific inputs. An analytical development is followed by a numerical and experimental sensitivity and performance verifications, illustrating the advantages and disadvantages of such topologies.
Rayleigh-type parametric chemical oscillation.
Ghosh, Shyamolina; Ray, Deb Shankar
2015-09-28
We consider a nonlinear chemical dynamical system of two phase space variables in a stable steady state. When the system is driven by a time-dependent sinusoidal forcing of a suitable scaling parameter at a frequency twice the output frequency and the strength of perturbation exceeds a threshold, the system undergoes sustained Rayleigh-type periodic oscillation, well known for parametric oscillation in pipe organs and distinct from the usual forced quasiperiodic oscillation of a damped nonlinear system, where the system is oscillatory even in the absence of any external forcing. Our theoretical analysis of the parametric chemical oscillation is corroborated by full numerical simulation of two well known models of chemical dynamics, the chlorite-iodide-malonic acid and iodine-clock reactions.
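The threshold behaviour of parametric excitation at twice the natural frequency can be illustrated with a generic damped oscillator whose stiffness is modulated sinusoidally, with a cubic term to saturate the growth. The sketch below is a stand-in for the chemical models in the paper, not their actual kinetics; the parameter values are illustrative.

```python
# Generic parametric resonance illustration: stiffness modulated at twice the
# natural frequency, with damping and cubic saturation. For this linearised
# problem the instability threshold is roughly eps > 4*gamma/omega0.
import numpy as np
from scipy.integrate import solve_ivp

omega0, gamma, beta = 1.0, 0.05, 0.1    # natural frequency, damping, saturation

def parametric(t, y, eps):
    x, v = y
    # x'' + 2*gamma*x' + omega0^2*(1 + eps*cos(2*omega0*t))*x + beta*x^3 = 0
    a = (-2.0 * gamma * v
         - omega0**2 * (1.0 + eps * np.cos(2.0 * omega0 * t)) * x
         - beta * x**3)
    return [v, a]

t_eval = np.linspace(0.0, 400.0, 4000)
for eps in (0.05, 0.4):                 # below and above the ~0.2 threshold
    sol = solve_ivp(parametric, (0.0, 400.0), [0.01, 0.0],
                    t_eval=t_eval, args=(eps,), rtol=1e-8, atol=1e-10)
    late = np.abs(sol.y[0][t_eval > 300.0]).max()
    print(f"eps = {eps:>4}: late-time amplitude ~ {late:.3f}")
```

Below threshold the small initial disturbance decays; above threshold the response grows and settles on a sustained oscillation limited by the cubic term.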
Bayesian non-parametric inference for stochastic epidemic models using Gaussian Processes.
Xu, Xiaoguang; Kypraios, Theodore; O'Neill, Philip D
2016-10-01
This paper considers novel Bayesian non-parametric methods for stochastic epidemic models. Many standard modeling and data analysis methods use underlying assumptions (e.g. concerning the rate at which new cases of disease will occur) which are rarely challenged or tested in practice. To relax these assumptions, we develop a Bayesian non-parametric approach using Gaussian Processes, specifically to estimate the infection process. The methods are illustrated with both simulated and real data sets, the former illustrating that the methods can recover the true infection process quite well in practice, and the latter illustrating that the methods can be successfully applied in different settings. © The Author 2016. Published by Oxford University Press.
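For flavour only, the non-parametric ingredient (a Gaussian Process over a time-varying rate) can be sketched with standard tools; the paper's actual method embeds the GP inside a stochastic epidemic model with MCMC, which this sketch does not reproduce. The simulated case counts and kernel settings below are illustrative assumptions.

```python
# Illustrative GP smooth of a time-varying "infection rate" from noisy daily
# case counts, using standard scikit-learn tools. Simulated data only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
days = np.arange(0, 60, dtype=float)
true_rate = 5.0 + 4.0 * np.exp(-0.5 * ((days - 25.0) / 8.0) ** 2)  # hypothetical curve
counts = rng.poisson(true_rate)

kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(days[:, None], counts.astype(float))

mean, sd = gp.predict(days[:, None], return_std=True)
peak = int(np.argmax(mean))
print("estimated peak day:", days[peak])
print("peak rate estimate: %.2f +/- %.2f" % (mean[peak], sd[peak]))
```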
Parametric analysis of closed cycle magnetohydrodynamic (MHD) power plants
NASA Technical Reports Server (NTRS)
Owens, W.; Berg, R.; Murthy, R.; Patten, J.
1981-01-01
A parametric analysis of closed cycle MHD power plants was performed which studied the technical feasibility, associated capital cost, and cost of electricity for the direct combustion of coal or coal derived fuel. Three reference plants, differing primarily in the method of coal conversion utilized, were defined. Reference Plant 1 used direct coal fired combustion while Reference Plants 2 and 3 employed on site integrated gasifiers. Reference Plant 2 used a pressurized gasifier while Reference Plant 3 used a 'state of the art' atmospheric gasifier. Thirty plant configurations were considered by using parametric variations from the Reference Plants. Parametric variations include the type of coal (Montana Rosebud or Illinois No. 6), cleanup systems (hot or cold gas cleanup), one- or two-stage atmospheric or pressurized direct fired coal combustors, and six different gasifier systems. Plant sizes ranged from 100 to 1000 MWe. Overall plant performance was calculated using two methodologies. In one task, the channel performance was assumed and the MHD topping cycle efficiencies were based on the assumed values. A second task involved rigorous calculations of channel performance (enthalpy extraction, isentropic efficiency and generator output) that verified the original (task one) assumptions. Closed cycle MHD capital costs were estimated for the task one plants; task two cost estimates were made for the channel and magnet only.
Martina, R; Kay, R; van Maanen, R; Ridder, A
2015-01-01
Clinical studies in overactive bladder have traditionally used analysis of covariance or nonparametric methods to analyse the number of incontinence episodes and other count data. It is known that if the underlying distributional assumptions of a particular parametric method do not hold, an alternative parametric method may be more efficient than a nonparametric one, which makes no assumptions regarding the underlying distribution of the data. Therefore, there are advantages in using methods based on the Poisson distribution or extensions of that method, which incorporate specific features that provide a modelling framework for count data. One challenge with count data is overdispersion, but methods are available that can account for this through the introduction of random effect terms in the modelling, and it is this modelling framework that leads to the negative binomial distribution. These models can also provide clinicians with a clearer and more appropriate interpretation of treatment effects in terms of rate ratios. In this paper, the previously used parametric and non-parametric approaches are contrasted with those based on Poisson regression and various extensions in trials evaluating solifenacin and mirabegron in patients with overactive bladder. In these applications, negative binomial models are seen to fit the data well. Copyright © 2014 John Wiley & Sons, Ltd.
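The modelling approach described above can be sketched on simulated overdispersed count data: fit Poisson and negative binomial regressions and report the exponentiated treatment coefficient as a rate ratio. The effect sizes, dispersion, and sample sizes below are made-up placeholders, not results from the solifenacin or mirabegron trials.

```python
# Sketch: Poisson vs negative binomial regression for overdispersed episode
# counts in two arms; the exponentiated treatment coefficient is the rate ratio.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 200
treat = np.repeat([0, 1], n)                      # 0 = control, 1 = active
rate = 6.0 * np.exp(-0.3 * treat)                 # true rate ratio exp(-0.3) ~ 0.74

# Overdispersed counts: gamma-mixed Poisson (i.e., negative binomial) draws.
frailty = rng.gamma(shape=2.0, scale=0.5, size=2 * n)
counts = rng.poisson(rate * frailty)

X = sm.add_constant(treat)
poisson_fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()

for name, fit in [("Poisson", poisson_fit), ("NegBin", negbin_fit)]:
    rr = np.exp(fit.params[1])
    lo, hi = np.exp(fit.conf_int()[1])
    print(f"{name:>8s}: rate ratio = {rr:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

The point estimates agree, but the negative binomial interval reflects the overdispersion that the Poisson model ignores.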
NASA Astrophysics Data System (ADS)
Vidya Sagar, R.; Raghu Prasad, B. K.
2012-03-01
This article presents a review of recent developments in parametric based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers including various methods and models developed in AE testing of concrete structures. The aim is to provide an overview of the specific features of parametric based AE techniques of concrete structures carried out over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published and considerable attention has been given to those publications. Some recent studies such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams have also been discussed. The formation of fracture process zone and the AE energy released during the fracture process in concrete beam specimens have been summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parametric based AE techniques applied to concrete structures may be helpful to the concerned researchers and engineers to better understand the failure mechanism of concrete and evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.
NASA Astrophysics Data System (ADS)
Dai, Xiaoqian; Tian, Jie; Chen, Zhe
2010-03-01
Parametric images can represent both spatial distribution and quantification of the biological and physiological parameters of tracer kinetics. The linear least square (LLS) method is a well-established linear regression method for generating parametric images by fitting compartment models with good computational efficiency. However, bias exists in LLS-based parameter estimates, owing to the noise present in tissue time activity curves (TTACs) that propagates as correlated error in the LLS linearized equations. To address this problem, a volume-wise principal component analysis (PCA) based method is proposed. In this method, firstly, dynamic PET data are properly pre-transformed to standardize noise variance, as PCA is a data-driven technique and cannot itself separate signals from noise. Secondly, the volume-wise PCA is applied on the PET data. The signals can be mostly represented by the first few principal components (PCs) and the noise is left in the subsequent PCs. Then the noise-reduced data are obtained using the first few PCs by applying 'inverse PCA'. The data should also be transformed back according to the pre-transformation method used in the first step to maintain the scale of the original data set. Finally, the obtained new data set is used to generate parametric images using the linear least squares (LLS) estimation method. Compared with other noise-removal methods, the proposed method can achieve high statistical reliability in the generated parametric images. The effectiveness of the method is demonstrated both with computer simulation and with a clinical dynamic FDG PET study.
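The pipeline can be sketched generically as: reshape the dynamic data to a voxels-by-frames matrix, keep the leading principal components, reconstruct, and then run a voxel-wise linear least-squares fit. The simulated time-activity curves and the simple linear (Patlak-like) model below are placeholders for the paper's kinetic model and noise pre-transformation.

```python
# Sketch: volume-wise PCA denoising followed by voxel-wise linear least squares.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(11)
n_voxels, n_frames = 2000, 24
t = np.linspace(1.0, 60.0, n_frames)              # minutes (illustrative)

slopes = rng.uniform(0.01, 0.05, size=n_voxels)   # "true" uptake slopes
intercepts = rng.uniform(0.5, 1.5, size=n_voxels)
clean = intercepts[:, None] + slopes[:, None] * t[None, :]
noisy = clean + rng.normal(0.0, 0.4, size=clean.shape)

# Volume-wise PCA denoising: keep the leading components only.
pca = PCA(n_components=3)
denoised = pca.inverse_transform(pca.fit_transform(noisy))

def voxelwise_lls(data):
    # Design matrix [1, t]; returns the fitted slope per voxel.
    X = np.column_stack([np.ones_like(t), t])
    coefs, *_ = np.linalg.lstsq(X, data.T, rcond=None)
    return coefs[1]

for label, data in [("noisy", noisy), ("PCA-denoised", denoised)]:
    err = np.sqrt(np.mean((voxelwise_lls(data) - slopes) ** 2))
    print(f"{label:>13s}: slope RMSE = {err:.4f}")
```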
Outcome of temporal lobe epilepsy surgery predicted by statistical parametric PET imaging.
Wong, C Y; Geller, E B; Chen, E Q; MacIntyre, W J; Morris, H H; Raja, S; Saha, G B; Lüders, H O; Cook, S A; Go, R T
1996-07-01
PET is useful in the presurgical evaluation of temporal lobe epilepsy. The purpose of this retrospective study is to assess the clinical use of statistical parametric imaging in predicting surgical outcome. Interictal 18FDG-PET scans in 17 patients with surgically-treated temporal lobe epilepsy (Group A: 13 seizure-free; Group B: 4 not seizure-free at 6 months) were transformed into statistical parametric images, with each pixel representing a z-score value computed using the mean and s.d. of the count distribution in each individual patient, for both visual and quantitative analysis. Mean z-scores were significantly more negative in the anterolateral (AL) and mesial (M) regions on the operated side than the nonoperated side in Group A (AL: p < 0.00005, M: p = 0.0097), but not in Group B (AL: p = 0.46, M: p = 0.08). Statistical parametric imaging correctly lateralized 16 out of 17 patients. Only the AL region, however, was significant in predicting surgical outcome (F = 29.03, p < 0.00005). Using a cut-off z-score value of -1.5, statistical parametric imaging correctly classified 92% of temporal lobes from Group A and 88% of those from Group B. The preliminary results indicate that statistical parametric imaging provides both clinically useful information for lateralization in temporal lobe epilepsy and a reliable predictive indicator of clinical outcome following surgical treatment.
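The z-score transformation itself is simple: each pixel is expressed in units of standard deviations from the mean count of the patient's own image, and regions below the cut-off are flagged. The sketch below uses a synthetic image and a trivial mask standing in for a real brain mask.

```python
# Sketch of the z-score ("statistical parametric") transformation with the
# z < -1.5 cut-off mentioned above. Synthetic image; the mask is a placeholder.
import numpy as np

rng = np.random.default_rng(5)
image = rng.normal(100.0, 10.0, size=(64, 64))       # synthetic interictal counts
image[20:30, 40:52] -= 25.0                          # hypothetical hypometabolic focus

brain_mask = np.ones_like(image, dtype=bool)         # stand-in for a real brain mask
mu, sd = image[brain_mask].mean(), image[brain_mask].std()

z_map = (image - mu) / sd                            # statistical parametric image
focus = z_map < -1.5                                 # cut-off used in the study

print(f"pixels below z = -1.5: {focus.sum()} ({100.0 * focus.mean():.1f}% of the image)")
print(f"mean z inside flagged region: {z_map[focus].mean():.2f}")
```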
The linear transformation model with frailties for the analysis of item response times.
Wang, Chun; Chang, Hua-Hua; Douglas, Jeffrey A
2013-02-01
The item response times (RTs) collected from computerized testing represent an underutilized source of information about items and examinees. In addition to knowing the examinees' responses to each item, we can investigate the amount of time examinees spend on each item. In this paper, we propose a semi-parametric model for RTs, the linear transformation model with a latent speed covariate, which combines the flexibility of non-parametric modelling and the brevity as well as interpretability of parametric modelling. In this new model, the RTs, after some non-parametric monotone transformation, become a linear model with latent speed as covariate plus an error term. The distribution of the error term implicitly defines the relationship between the RT and examinees' latent speeds; whereas the non-parametric transformation is able to describe various shapes of RT distributions. The linear transformation model represents a rich family of models that includes the Cox proportional hazards model, the Box-Cox normal model, and many other models as special cases. This new model is embedded in a hierarchical framework so that both RTs and responses are modelled simultaneously. A two-stage estimation method is proposed. In the first stage, the Markov chain Monte Carlo method is employed to estimate the parametric part of the model. In the second stage, an estimating equation method with a recursive algorithm is adopted to estimate the non-parametric transformation. Applicability of the new model is demonstrated with a simulation study and a real data application. Finally, methods to evaluate the model fit are suggested. © 2012 The British Psychological Society.
Formation of parametric images using mixed-effects models: a feasibility study.
Huang, Husan-Ming; Shih, Yi-Yu; Lin, Chieh
2016-03-01
Mixed-effects models have been widely used in the analysis of longitudinal data. By presenting the parameters as a combination of fixed effects and random effects, mixed-effects models incorporating both within- and between-subject variations are capable of improving parameter estimation. In this work, we demonstrate the feasibility of using a non-linear mixed-effects (NLME) approach for generating parametric images from medical imaging data of a single study. By assuming that all voxels in the image are independent, we used simulation and animal data to evaluate whether NLME can improve the voxel-wise parameter estimation. For testing purposes, intravoxel incoherent motion (IVIM) diffusion parameters including perfusion fraction, pseudo-diffusion coefficient and true diffusion coefficient were estimated using diffusion-weighted MR images and NLME through fitting the IVIM model. The conventional method of non-linear least squares (NLLS) was used as the standard approach for comparison of the resulting parametric images. In the simulated data, NLME provides more accurate and precise estimates of diffusion parameters compared with NLLS. Similarly, we found that NLME has the ability to improve the signal-to-noise ratio of parametric images obtained from rat brain data. These data have shown that it is feasible to apply NLME in parametric image generation, and the parametric image quality can be accordingly improved with the use of NLME. With the flexibility to be adapted to other models or modalities, NLME may become a useful tool to improve parametric image quality in the future. Copyright © 2015 John Wiley & Sons, Ltd.
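The conventional voxel-wise NLLS baseline referred to above can be sketched as fitting the bi-exponential IVIM model S(b) = S0*(f*exp(-b*D*) + (1-f)*exp(-b*D)) to a single voxel's diffusion-weighted signal. The b-values, noise level, and parameter values below are typical illustrative numbers, not taken from the paper's rat data.

```python
# Sketch of a voxel-wise IVIM fit by nonlinear least squares (the NLLS baseline).
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, s0, f, d_star, d):
    return s0 * (f * np.exp(-b * d_star) + (1.0 - f) * np.exp(-b * d))

b_values = np.array([0, 10, 20, 40, 80, 150, 300, 500, 800, 1000], dtype=float)
true = dict(s0=1.0, f=0.10, d_star=0.020, d=0.0008)   # D, D* in mm^2/s (illustrative)

rng = np.random.default_rng(21)
signal = ivim(b_values, **true) + rng.normal(0.0, 0.01, size=b_values.size)

p0 = [1.0, 0.1, 0.01, 0.001]
bounds = ([0.0, 0.0, 0.003, 1e-5], [2.0, 0.5, 0.5, 0.003])
popt, pcov = curve_fit(ivim, b_values, signal, p0=p0, bounds=bounds)

for name, est, tru in zip(["S0", "f", "D*", "D"], popt, true.values()):
    print(f"{name:>3s}: fit = {est:.4f}  true = {tru:.4f}")
```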
Kero, Tanja; Lindsjö, Lars; Sörensen, Jens; Lubberink, Mark
2016-08-01
(11)C-PIB PET is a promising non-invasive diagnostic tool for cardiac amyloidosis. Semiautomatic analysis of PET data is now available but it is not known how accurate these methods are for amyloid imaging. The aim of this study was to evaluate the feasibility of one semiautomatic software tool for analysis and visualization of (11)C-PIB left ventricular retention index (RI) in cardiac amyloidosis. Patients with systemic amyloidosis and cardiac involvement (n = 10) and healthy controls (n = 5) were investigated with dynamic (11)C-PIB PET. Two observers analyzed the PET studies with semiautomatic software to calculate the left ventricular RI of (11)C-PIB and to create parametric images. The mean RI at 15-25 min from the semiautomatic analysis was compared with RI based on manual analysis and showed comparable values (0.056 vs 0.054 min(-1) for amyloidosis patients and 0.024 vs 0.025 min(-1) in healthy controls; P = .78) and the correlation was excellent (r = 0.98). Inter-reader reproducibility also was excellent (intraclass correlation coefficient, ICC > 0.98). Parametric polarmaps and histograms made visual separation of amyloidosis patients and healthy controls fast and simple. Accurate semiautomatic analysis of cardiac (11)C-PIB RI in amyloidosis patients is feasible. Parametric polarmaps and histograms make visual interpretation fast and simple.
A general framework for parametric survival analysis.
Crowther, Michael J; Lambert, Paul C
2014-12-30
Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
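A minimal sketch of a parametric survival fit is given below: maximum likelihood for a Weibull model with right censoring, the kind of restrictive baseline the framework above generalises with restricted cubic splines on the log-hazard scale. The simulated data and starting values are illustrative, and this does not reproduce the paper's Stata implementation.

```python
# Weibull survival fit with right censoring by direct maximum likelihood.
# log-likelihood contribution: event * log h(t) + log S(t), with
# log h(t) = log(k/lam) + (k-1)*log(t/lam) and log S(t) = -(t/lam)^k.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
n = 500
true_shape, true_scale = 1.4, 10.0
event_times = true_scale * rng.weibull(true_shape, size=n)
censor_times = rng.uniform(2.0, 25.0, size=n)
time = np.minimum(event_times, censor_times)
event = (event_times <= censor_times).astype(float)   # 1 = event, 0 = censored

def neg_log_lik(params):
    log_shape, log_scale = params
    k, lam = np.exp(log_shape), np.exp(log_scale)
    log_h = np.log(k) - np.log(lam) + (k - 1.0) * np.log(time / lam)
    log_s = -(time / lam) ** k
    return -np.sum(event * log_h + log_s)

fit = minimize(neg_log_lik, x0=[0.0, np.log(np.mean(time))], method="BFGS")
shape_hat, scale_hat = np.exp(fit.x)
print(f"Weibull shape: {shape_hat:.3f} (true {true_shape}), "
      f"scale: {scale_hat:.3f} (true {true_scale})")
```

Working on the log scale keeps both parameters positive without explicit bounds, which is one reason flexible parametric models are usually specified on the log-hazard scale.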
Parametric-Studies and Data-Plotting Modules for the SOAP
NASA Technical Reports Server (NTRS)
2008-01-01
"Parametric Studies" and "Data Table Plot View" are the names of software modules in the Satellite Orbit Analysis Program (SOAP). Parametric Studies enables parameterization of as many as three satellite or ground-station attributes across a range of values and computes the average, minimum, and maximum of a specified metric, the revisit time, or 21 other functions at each point in the parameter space. This computation produces a one-, two-, or three-dimensional table of data representing statistical results across the parameter space. Inasmuch as the output of a parametric study in three dimensions can be a very large data set, visualization is a paramount means of discovering trends in the data (see figure). Data Table Plot View enables visualization of the data table created by Parametric Studies or by another data source: this module quickly generates a display of the data in the form of a rotatable three-dimensional-appearing plot, making it unnecessary to load the SOAP output data into a separate plotting program. The rotatable three-dimensionalappearing plot makes it easy to determine which points in the parameter space are most desirable. Both modules provide intuitive user interfaces for ease of use.
Parametric and experimental analysis using a power flow approach
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1990-01-01
A structural power flow approach for the analysis of structure-borne transmission of vibrations is used to analyze the influence of structural parameters on transmitted power. The parametric analysis is also performed using the Statistical Energy Analysis approach and the results are compared with those obtained using the power flow approach. The advantages of structural power flow analysis are demonstrated by comparing the type of results that are obtained by the two analytical methods. Also, to demonstrate that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental study of structural power flow is presented. This experimental study presents results for an L-shaped beam for which a solution was already available. Various methods to measure vibrational power flow are compared to study their advantages and disadvantages.
Linear and nonlinear analysis of fluid slosh dampers
NASA Astrophysics Data System (ADS)
Sayar, B. A.; Baumgarten, J. R.
1982-11-01
A vibrating structure and a container partially filled with fluid are considered coupled in a free vibration mode. To simplify the mathematical analysis, a pendulum model to duplicate the fluid motion and a mass-spring dashpot representing the vibrating structure are used. The equations of motion are derived by Lagrange's energy approach and expressed in parametric form. For a wide range of parametric values the logarithmic decrements of the main system are calculated from theoretical and experimental response curves in the linear analysis. However, for the nonlinear analysis the theoretical and experimental response curves of the main system are compared. Theoretical predictions are justified by experimental observations with excellent agreement. It is concluded finally that for a proper selection of design parameters, containers partially filled with viscous fluids serve as good vibration dampers.
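The logarithmic decrement used above to quantify damping can be computed directly from a free-decay response by picking successive peaks. The sketch below uses a synthetic damped sine in place of a measured response; the damping ratio and frequency are illustrative.

```python
# Logarithmic decrement from a free-decay response:
# delta = (1/m) * ln(x_k / x_{k+m}), zeta = delta / sqrt(4*pi^2 + delta^2).
import numpy as np
from scipy.signal import find_peaks

zeta_true, omega_n = 0.04, 2.0 * np.pi * 1.5          # damping ratio, rad/s
t = np.linspace(0.0, 10.0, 5000)
omega_d = omega_n * np.sqrt(1.0 - zeta_true**2)
x = np.exp(-zeta_true * omega_n * t) * np.cos(omega_d * t)   # synthetic response

peaks, _ = find_peaks(x)
amps = x[peaks]
m = len(amps) - 1
delta = np.log(amps[0] / amps[-1]) / m                 # logarithmic decrement
zeta_est = delta / np.sqrt(4.0 * np.pi**2 + delta**2)

print(f"log decrement = {delta:.4f}, "
      f"estimated damping ratio = {zeta_est:.4f} (true {zeta_true})")
```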
Supercritical nonlinear parametric dynamics of Timoshenko microbeams
NASA Astrophysics Data System (ADS)
Farokhi, Hamed; Ghayesh, Mergen H.
2018-06-01
The nonlinear supercritical parametric dynamics of a Timoshenko microbeam subject to an axial harmonic excitation force is examined theoretically, by means of different numerical techniques, and employing a high-dimensional analysis. The time-variant axial load is assumed to consist of a mean value along with harmonic fluctuations. In terms of modelling, a continuous expression for the elastic potential energy of the system is developed based on the modified couple stress theory, taking into account small-size effects; the kinetic energy of the system is also modelled as a continuous function of the displacement field. Hamilton's principle is employed to balance the energies and to obtain the continuous model of the system. Employing the Galerkin scheme along with an assumed-mode technique, the energy terms are reduced, yielding a second-order reduced-order model with a finite number of degrees of freedom. A transformation is carried out to convert the second-order reduced-order model into a first-order one of twice the dimension. A bifurcation analysis is performed for the system in the absence of the axial load fluctuations. Moreover, a mean value for the axial load is selected in the supercritical range, and the principal parametric resonant response, due to the time-variant component of the axial load, is obtained; as opposed to transversely excited systems, for parametrically excited systems (such as our problem here) the nonlinear resonance occurs in the vicinity of twice a natural frequency of the linear system. This is accomplished via use of the pseudo-arclength continuation technique, a direct time integration, an eigenvalue analysis, and the Floquet theory for stability. The natural frequencies of the system prior to and beyond buckling are also determined. Moreover, the effect of different system parameters on the nonlinear supercritical parametric dynamics of the system is analysed, with special consideration to the effect of the length-scale parameter.
Kamboj, Laveena; Oh, Paul; Levine, Mitchell; Kammila, Srinu; Casey, William; Harterre, Don; Goeree, Ron
2016-01-15
In Ontario, Canada, the Comprehensive Vascular Disease Prevention and Management Initiative (CVDPMI) was undertaken to improve the vascular health in communities. The CVDPMI significantly improved cardiovascular (CV) risk factor profiles from baseline to follow-up visits including the 10 year Framingham Risk Score (FRS). Although the CVDPMI improved CV risk, the economic value of this program had not been evaluated. We examined the cost effectiveness of the CVDPMI program compared to no CVDPMI program in adult patients identified at risk for an initial or subsequent vascular event in a primary care setting. One-year and ten-year cost-effectiveness analyses were conducted. To determine the uncertainty around the cost per life year gained ratio, a non-parametric bootstrap analysis was conducted. The overall population base case analysis at one year resulted in a cost per CV event avoided of $70,423. FRS subgroup analyses showed the high risk cohort (FRS >20%) had an incremental cost effectiveness ratio (ICER) that was dominant. In the moderate risk subgroup (FRS 10%-20%) the ICER was $47,439 per CV event avoided and the low risk subgroup (FRS <10%) showed a highly cost ineffective result of greater than $5 million per CV event avoided. The ten year analysis resulted in a dominant ICER. At one year, the CVDPMI program is economically acceptable for patients at moderate to high risk for CV events. The CVDPMI results in increased life expectancy at an incremental cost saving to the healthcare system over a ten year period. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Parametric sensitivity study for solar-assisted heat-pump systems
NASA Astrophysics Data System (ADS)
White, N. M.; Morehouse, J. H.
1981-07-01
The engineering and economic parameters affecting life-cycle costs for solar-assisted heat pump systems are investigated. The change in energy usage resulting from each engineering parameter varied was developed from computer simulations, and is compared with results from a stand-alone heat pump system. Three geographical locations are considered: Washington, DC, Fort Worth, TX, and Madison, WI. Results indicate that most engineering changes to the systems studied do not provide significant energy savings. The most promising parameters to vary are the solar collector parameters tau and U_L, the heat pump capacity at design point, and the minimum utilizable evaporator temperature. Costs associated with each change are estimated, and life-cycle costs computed for both engineering parameters and economic variations in interest rate, discount rate, tax credits, fuel unit costs and fuel inflation rates. Results indicate that none of the feasible engineering changes for the system configuration studied will make these systems economically competitive with the stand-alone heat pump without a considerable tax credit.
Analysis of Lateral Rail Restraint.
DOT National Transportation Integrated Search
1983-09-01
This report deals with the analysis of lateral rail strength using the results of experimental investigations and a nonlinear rail response model. Part of the analysis involves the parametric study of the influence of track parameters on lateral rail...
Parametrically excited oscillation of stay cable and its control in cable-stayed bridges.
Sun, Bing-nan; Wang, Zhi-gang; Ko, J M; Ni, Y Q
2003-01-01
This paper presents a nonlinear dynamic model for simulation and analysis of a kind of parametrically excited vibration of stay cable caused by support motion in cable-stayed bridges. The sag and inclination angle of the stay cable are considered in the model, based on which the oscillation mechanism and dynamic response characteristics of this kind of vibration are analyzed through numerical calculation. It is noted that parametrically excited oscillation of a stay cable with certain sag, inclination angle and initial static tension force may occur in cable-stayed bridges due to deck vibration under the condition that the natural frequency of the cable approaches about half of the first modal frequency of the bridge deck system. A new vibration control system installed on the cable anchorage is proposed as a possible damping system to suppress the cable parametric oscillation. The numerical calculation results showed that with the use of this damping system, the cable oscillation due to the vibration of the deck and/or towers will be considerably reduced.
Parametric Cost Models for Space Telescopes
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtney
2010-01-01
Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And, they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescope cost models. An effort is underway to develop single variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.
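A minimal sketch of how such a cost estimating relationship is typically applied follows. The coefficient and exponent are hypothetical placeholders, not the published CER values; only the functional form (cost scaling with aperture diameter, discounted by a 50%-per-17-years technology factor) follows the abstract.

```python
# Hypothetical single-variable CER sketch: cost scales as a power of aperture
# diameter, discounted by a technology-improvement factor of 50% per 17 years.
# A and b below are placeholders, not the study's estimated values.

def telescope_cost(aperture_m, year, A=100.0, b=1.3, ref_year=2010):
    tech_factor = 0.5 ** ((year - ref_year) / 17.0)   # 50% cost reduction per 17 years
    return A * aperture_m ** b * tech_factor          # cost in arbitrary units

for d in (1.0, 2.0, 4.0):
    print(d, "m aperture:", round(telescope_cost(d, 2027), 1))
```

With an exponent below 2, cost per square meter of collecting aperture falls as the telescope grows, consistent with the trend stated in the abstract.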
Definition of NASTRAN sets by use of parametric geometry
NASA Technical Reports Server (NTRS)
Baughn, Terry V.; Tiv, Mehran
1989-01-01
Many finite element preprocessors describe finite element model geometry with points, lines, surfaces and volumes. One method for describing these basic geometric entities is by use of parametric cubics which are useful for representing complex shapes. The lines, surfaces and volumes may be discretized for follow on finite element analysis. The ability to limit or selectively recover results from the finite element model is extremely important to the analyst. Equally important is the ability to easily apply boundary conditions. Although graphical preprocessors have made these tasks easier, model complexity may not lend itself to easily identify a group of grid points desired for data recovery or application of constraints. A methodology is presented which makes use of the assignment of grid point locations in parametric coordinates. The parametric coordinates provide a convenient ordering of the grid point locations and a method for retrieving the grid point ID's from the parent geometry. The selected grid points may then be used for the generation of the appropriate set and constraint cards.
NASA Technical Reports Server (NTRS)
1975-01-01
Transportation mass requirements are developed for various mission and transportation modes based on vehicle systems sized to fit the exact needs of each mission. The parametric data used to derive the mass requirements for each mission and transportation mode are presented to enable accommodation of possible changes in mode options or payload definitions. The vehicle sizing and functional requirements used to derive the parametric data are described.
Parametrically excited multidegree-of-freedom systems with repeated frequencies
NASA Astrophysics Data System (ADS)
Nayfeh, A. H.
1983-05-01
An analysis is presented of the linear response of multidegree-of-freedom systems with a repeated frequency of order three to a harmonic parametric excitation. The method of multiple scales is used to determine the modulation of the amplitudes and phases for two cases: fundamental resonance of the modes with the repeated frequency and combination resonance involving these modes and another mode. Conditions are then derived for determining the stability of the motion.
Advanced oxygen-hydrocarbon rocket engine study
NASA Technical Reports Server (NTRS)
Obrien, C. J.; Salkeld, R.
1980-01-01
The advantages and disadvantages, system performance and operating limits, engine parametric data, and technology requirements for candidate high pressure LO2/Hydrocarbon engine systems are summarized. These summaries of parametric analysis and design provide a consistent engine system data base. Power balance data were generated for the eleven engine cycles. Engine cycle rating parameters were established and the desired condition and the effect of the parameter on the engine and/or vehicle are described.
SEC sensor parametric test and evaluation system
NASA Technical Reports Server (NTRS)
1978-01-01
This system provides the necessary automated hardware required to carry out, in conjunction with the existing 70 mm SEC television camera, the sensor evaluation tests which are described in detail. The Parametric Test Set (PTS) was completed and is used in a semiautomatic data acquisition and control mode to test the development of the 70 mm SEC sensor, WX 32193. Data analysis of raw data is performed on the Princeton IBM 360-91 computer.
CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions
Script for computing nonparametric regression analysis. Overview of using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, statistical scripts.
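The appendix itself supplies R scripts; as a rough Python analogue, the sketch below estimates a species-environment relationship with a Gaussian-kernel (Nadaraya-Watson) nonparametric regression on synthetic placeholder data.

```python
import numpy as np

# Minimal Nadaraya-Watson (Gaussian kernel) nonparametric regression, a Python
# analogue of the kind of species-environment relationship estimated by the
# CADDIS R scripts. Data below are synthetic placeholders.

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 150))              # e.g. an environmental gradient
y = np.sin(x) + rng.normal(0, 0.3, x.size)        # e.g. a biological response

def kernel_regression(x_grid, x, y, bandwidth=0.5):
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

x_grid = np.linspace(0, 10, 50)
y_hat = kernel_regression(x_grid, x, y)
print(y_hat[:5].round(2))
```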
NASA Astrophysics Data System (ADS)
Zhen, Xing-wei; Huang, Yi
2017-10-01
This study focuses on a new technology of Subsurface Tension Leg Platform (STLP), which utilizes the shallowwater rated well completion equipment and technology for the development of large oil and gas fields in ultra-deep water (UDW). Thus, the STLP concept offers attractive advantages over conventional field development concepts. STLP is basically a pre-installed Subsurface Sea-star Platform (SSP), which supports rigid risers and shallow-water rated well completion equipment. The paper details the results of the parametric study on the behavior of STLP at a water depth of 3000 m. At first, a general description of the STLP configuration and working principle is introduced. Then, the numerical models for the global analysis of the STLP in waves and current are presented. After that, extensive parametric studies are carried out with regarding to SSP/tethers system analysis, global dynamic analysis and riser interference analysis. Critical points are addressed on the mooring pattern and riser arrangement under the influence of ocean current, to ensure that the requirements on SSP stability and riser interference are well satisfied. Finally, conclusions and discussions are made. The results indicate that STLP is a competitive well and riser solution in up to 3000 m water depth for offshore petroleum production.
Quintela-del-Río, Alejandro; Francisco-Fernández, Mario
2011-02-01
The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists in fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
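For contrast with the nonparametric functional approach, the following sketch shows the classical parametric step the paper compares against: fitting a GEV distribution to block maxima and computing a return level. The maxima are synthetic placeholders, not AURN data.

```python
import numpy as np
from scipy.stats import genextreme

# Classical extreme value step: fit a GEV to a sample of block maxima and
# compute a return level (the level exceeded once per "return_period" blocks
# on average). Data are synthetic placeholders.

rng = np.random.default_rng(2)
annual_maxima = genextreme.rvs(c=-0.1, loc=80, scale=10, size=40, random_state=rng)

c_hat, loc_hat, scale_hat = genextreme.fit(annual_maxima)
return_period = 10
level = genextreme.isf(1.0 / return_period, c_hat, loc=loc_hat, scale=scale_hat)
print(f"{return_period}-block return level ~ {level:.1f}")
```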
An Interactive Software for Conceptual Wing Flutter Analysis and Parametric Study
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek
1996-01-01
An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate the flutter instability boundary of a flexible cantilever wing, when well-defined structural and aerodynamic data are not available, and then study the effect of change in Mach number, dynamic pressure, torsional frequency, sweep, mass ratio, aspect ratio, taper ratio, center of gravity, and pitch inertia, to guide the development of the concept. The software was developed for Macintosh or IBM compatible personal computers, on MathCad application software with integrated documentation, graphics, data base and symbolic mathematics. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely Regier number and Flutter number, with normalization factors based on torsional stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch inertia radius of gyration. The parametric plots were compiled in a Vought Corporation report from a vast data base of past experiments and wind-tunnel tests. The computer program was utilized for flutter analysis of the outer wing of a Blended-Wing-Body concept, proposed by McDonnell Douglas Corp. Using a set of assumed data, preliminary flutter boundary and flutter dynamic pressure variation with altitude, Mach number and torsional stiffness were determined.
Benchmark dose analysis via nonparametric regression modeling
Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen
2013-01-01
Estimation of benchmark doses (BMDs) in quantitative risk assessment traditionally is based upon parametric dose-response modeling. It is a well-known concern, however, that if the chosen parametric model is uncertain and/or misspecified, inaccurate and possibly unsafe low-dose inferences can result. We describe a nonparametric approach for estimating BMDs with quantal-response data based on an isotonic regression method, and also study use of corresponding, nonparametric, bootstrap-based confidence limits for the BMD. We explore the confidence limits’ small-sample properties via a simulation study, and illustrate the calculations with an example from cancer risk assessment. It is seen that this nonparametric approach can provide a useful alternative for BMD estimation when faced with the problem of parametric model uncertainty. PMID:23683057
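A minimal sketch of the nonparametric idea follows, assuming quantal data and a 10% benchmark response; the isotonic fit, BMD interpolation, and bootstrap lower limit are illustrative, not the authors' exact procedure.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Sketch: fit a monotone (isotonic) dose-response curve to quantal data, read
# off the dose at which extra risk reaches the benchmark response (BMR), and
# bootstrap for a lower confidence limit. Data and BMR are placeholders.

doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
n = np.array([50, 50, 50, 50, 50, 50])
cases = np.array([2, 3, 6, 11, 20, 33])
BMR = 0.10

def bmd_estimate(doses, cases, n, bmr=BMR):
    p_hat = IsotonicRegression(increasing=True).fit_transform(doses, cases / n)
    extra_risk = (p_hat - p_hat[0]) / (1.0 - p_hat[0])
    if extra_risk[-1] < bmr:
        return np.nan
    return float(np.interp(bmr, extra_risk, doses))  # extra_risk is monotone

bmd = bmd_estimate(doses, cases, n)

rng = np.random.default_rng(3)
boot = [bmd_estimate(doses, rng.binomial(n, cases / n), n) for _ in range(500)]
bmdl = np.nanpercentile(boot, 5)                     # one-sided lower limit
print(f"BMD ~ {bmd:.2f}, BMDL ~ {bmdl:.2f}")
```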
Parametric Covariance Model for Horizon-Based Optical Navigation
NASA Technical Reports Server (NTRS)
Hikes, Jacob; Liounis, Andrew J.; Christian, John A.
2016-01-01
This Note presents an entirely parametric version of the covariance for horizon-based optical navigation measurements. The covariance can be written as a function of only the spacecraft position, two sensor design parameters, the illumination direction, the size of the observed planet, the size of the lit arc to be used, and the total number of observed horizon points. As a result, one may now more clearly understand the sensitivity of horizon-based optical navigation performance as a function of these key design parameters, which is insight that was obscured in previous (and nonparametric) versions of the covariance. Finally, the new parametric covariance is shown to agree with both the nonparametric analytic covariance and results from a Monte Carlo analysis.
Parametric interactions in presence of different size colloids in semiconductor quantum plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vanshpal, R., E-mail: ravivanshpal@gmail.com; Sharma, Uttam; Dubey, Swati
2015-07-31
The present work investigates the effect of different-size colloids on parametric interaction in semiconductor quantum plasma. The quantum effect is included in this analysis through a quantum correction term in the classical hydrodynamic model of homogeneous semiconductor plasma. The effect is of purely quantum origin, arising from the quantum Bohm potential and quantum statistics. Colloidal size and the quantum correction term modify the parametric dispersion characteristics of the ion-implanted semiconductor plasma medium. It is found that the quantum effect on colloids is inversely proportional to their size. Moreover, the critical size of implanted colloids for effective quantum correction is determined and is found to be equal to the lattice spacing of the crystal.
Multi Response Optimization of Laser Micro Marking Process: A Grey-Fuzzy Approach
NASA Astrophysics Data System (ADS)
Shivakoti, I.; Das, P. P.; Kibria, G.; Pradhan, B. B.; Mustafa, Z.; Ghadai, R. K.
2017-07-01
The selection of an optimal parametric combination for efficient machining has always been a challenging issue for manufacturing researchers. The optimal parametric combination provides better machining, which improves productivity and product quality and subsequently reduces production cost and time. The paper presents a hybrid approach of Grey relational analysis and Fuzzy logic to obtain the optimal parametric combination for better laser beam micro marking on the Gallium Nitride (GaN) work material. The response surface methodology has been implemented for design of experiments considering three parameters at five levels. Parameters such as current, frequency and scanning speed have been considered, and mark width, mark depth and mark intensity have been taken as the process responses.
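A minimal numeric sketch of the grey relational step of such a hybrid approach follows; the response matrix and preference directions are invented placeholders, and the fuzzy-logic stage is omitted.

```python
import numpy as np

# Grey relational analysis sketch. Rows are parameter combinations, columns are
# the three responses (mark width, mark depth, mark intensity). Which responses
# are larger-the-better is an assumption for illustration only.

responses = np.array([
    [0.42, 12.0, 0.80],
    [0.38, 15.0, 0.72],
    [0.45, 10.0, 0.90],
    [0.40, 14.0, 0.85],
])
larger_is_better = np.array([False, True, True])     # assumed preference directions

# normalise each response to [0, 1] according to its preference direction
lo, hi = responses.min(axis=0), responses.max(axis=0)
norm = np.where(larger_is_better, (responses - lo) / (hi - lo),
                                  (hi - responses) / (hi - lo))

# grey relational coefficients and grade (distinguishing coefficient 0.5)
delta = 1.0 - norm
zeta = 0.5
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = grc.mean(axis=1)
print("grey relational grades:", grade.round(3))
print("best combination (row index):", int(grade.argmax()))
```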
Applying complex models to poultry production in the future--economics and biology.
Talpaz, H; Cohen, M; Fancher, B; Halley, J
2013-09-01
The ability to determine the optimal broiler feed nutrient density that maximizes margin over feeding cost (MOFC) has obvious economic value. To determine optimal feed nutrient density, one must consider ingredient prices, meat values, the product mix being marketed, and the projected biological performance. A series of 8 feeding trials was conducted to estimate biological responses to changes in ME and amino acid (AA) density. Eight different genotypes of sex-separate reared broilers were fed diets varying in ME (2,723-3,386 kcal of ME/kg) and AA (0.89-1.65% digestible lysine with all essential AAs indexed to lysine) levels. Broilers were processed to determine carcass component yield at many different BW (1.09-4.70 kg). Trial data generated were used in a model constructed to discover the dietary levels of ME and AA that maximize MOFC on a per broiler or per broiler annualized basis (bird × number of cycles/year). The model was designed to estimate the effects of dietary nutrient concentration on broiler live weight, feed conversion, mortality, and carcass component yield. Estimated coefficients from the step-wise regression process are subsequently used to predict the optimal ME and AA concentrations that maximize MOFC. The effects of changing feed or meat prices across a wide spectrum on optimal ME and AA levels can be evaluated via parametric analysis. The model can rapidly compare both biological and economic implications of changing from current practice to the simulated optimal solution. The model can be exploited to enhance decision making under volatile market conditions.
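A toy sketch of the optimisation step described above, with an invented quadratic response surface and invented prices standing in for the trial-estimated regression coefficients.

```python
import numpy as np
from scipy.optimize import minimize

# Toy MOFC optimisation sketch: given a fitted response surface for live weight
# and feed intake as functions of scaled dietary ME and AA density, find the
# combination that maximises margin over feeding cost. All coefficients and
# prices below are invented placeholders, not the trial-estimated values.

meat_price = 1.9        # $/kg live weight (assumed)
feed_price_base = 0.30  # $/kg feed at low nutrient density (assumed)

def live_weight(me, aa):                       # kg, toy response surface
    return 2.0 + 0.9 * me + 1.1 * aa - 0.25 * me**2 - 0.30 * aa**2 + 0.05 * me * aa

def feed_intake(me, aa):                       # kg, falls as nutrient density rises
    return 4.8 - 0.6 * me - 0.3 * aa

def neg_mofc(x):
    me, aa = x                                 # scaled ME and AA densities
    feed_price = feed_price_base * (1 + 0.25 * me + 0.35 * aa)
    return -(meat_price * live_weight(me, aa) - feed_price * feed_intake(me, aa))

res = minimize(neg_mofc, x0=[1.0, 1.0], bounds=[(0.0, 2.0), (0.0, 2.0)])
print("optimal scaled (ME, AA):", np.round(res.x, 2), "MOFC:", round(-res.fun, 2))
```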
Calhelha, Ricardo C; Martínez, Mireia A; Prieto, M A; Ferreira, Isabel C F R
2017-10-23
The development of convenient tools for describing and quantifying the effects of standard and novel therapeutic agents is essential for the research community, to perform more precise evaluations. Although mathematical models and quantification criteria have been exchanged in the last decade between different fields of study, there are relevant methodologies that lack proper mathematical descriptions and standard criteria to quantify their responses. Therefore, part of the relevant information that can be drawn from the experimental results obtained and the quantification of its statistical reliability are lost. Despite its relevance, there is not a standard form for the in vitro endpoint tumor cell lines' assays (TCLA) that enables the evaluation of the cytotoxic dose-response effects of anti-tumor drugs. The analysis of all the specific problems associated with the diverse nature of the available TCLA used is unfeasible. However, since most TCLA share the main objectives and similar operative requirements, we have chosen the sulforhodamine B (SRB) colorimetric assay for cytotoxicity screening of tumor cell lines as an experimental case study. In this work, the common biological and practical non-linear dose-response mathematical models are tested against experimental data and, following several statistical analyses, the model based on the Weibull distribution was confirmed as the convenient approximation to test the cytotoxic effectiveness of anti-tumor compounds. Then, the advantages and disadvantages of all the different parametric criteria derived from the model, which enable the quantification of the dose-response drug-effects, are extensively discussed. Therefore, model and standard criteria for easily performing the comparisons between different compounds are established. The advantages include a simple application, provision of parametric estimations that characterize the response as standard criteria, economization of experimental effort and enabling rigorous comparisons among the effects of different compounds and experimental approaches. In all experimental data fitted, the calculated parameters were always statistically significant, the equations proved to be consistent and the correlation coefficient of determination was, in most of the cases, higher than 0.98.
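A hedged sketch of fitting a cumulative-Weibull dose-response curve follows; it uses a common parameterisation that may differ from the authors' exact model, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch: fit a cumulative-Weibull dose-response curve to SRB-type cytotoxicity
# data. The form W(d) = K*(1 - exp(-(d/m)**a)) is a common Weibull
# parameterisation assumed here for illustration only; data are placeholders.

def weibull_response(dose, K, m, a):
    return K * (1.0 - np.exp(-(dose / m) ** a))

dose = np.array([0.5, 1, 2, 4, 8, 16, 32, 64], dtype=float)
inhibition = np.array([0.04, 0.08, 0.17, 0.33, 0.55, 0.74, 0.86, 0.92])

popt, pcov = curve_fit(weibull_response, dose, inhibition,
                       p0=[1.0, 8.0, 1.0], maxfev=10000)
K, m, a = popt
perr = np.sqrt(np.diag(pcov))                        # parametric standard errors
print(f"K={K:.2f}(+/-{perr[0]:.2f}), m={m:.2f}, a={a:.2f}")
print("dose for 50% of maximal response:", round(m * np.log(2) ** (1 / a), 2))
```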
Varabyova, Yauheniya; Schreyögg, Jonas
2013-09-01
There is a growing interest in the cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Internal and external validity of findings is assessed by estimating the Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries, which have higher health care expenditure per capita, tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
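A minimal sketch of the consistency check mentioned above (Spearman rank correlation between efficiency scores from two model specifications); the scores are placeholders.

```python
from scipy.stats import spearmanr

# Spearman rank correlation between hospital-sector efficiency scores obtained
# from two model specifications (e.g. DEA vs. SFA). Scores are placeholders.

dea_scores = [0.92, 0.85, 0.78, 0.95, 0.70, 0.88, 0.81]
sfa_scores = [0.90, 0.83, 0.80, 0.93, 0.74, 0.86, 0.79]

rho, p_value = spearmanr(dea_scores, sfa_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```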
NASA Astrophysics Data System (ADS)
Izmaylov, R.; Lebedev, A.
2015-08-01
Centrifugal compressors are complex energy equipment. Automatic control and protection systems should meet the requirements of operational reliability and durability. In turbocompressors there are at least two dangerous regimes: surge and rotating stall. Anti-surge protection systems usually use parametric or feature-based methods; as a rule, industrial systems are parametric. The main disadvantage of parametric anti-surge systems is the difficulty of mass flow measurement in natural gas pipeline compressors. The principal idea of the feature-based method rests on the experimental fact that, as a rule, rotating or precursor stall becomes established in the compressor just before the onset of surge. In this case the problem consists in detecting the characteristic signals of unsteady pressure or velocity fluctuations. Wavelet analysis is the best method for detecting the onset of rotating stall in spite of the high level of spurious signals (rotating wakes, turbulence, etc.), and the method is compatible with state-of-the-art DSP systems for industrial control. Examples of applying wavelet analysis to detect the onset of rotating stall in typical centrifugal compressor stages are presented. The experimental investigations include unsteady pressure measurements and a sophisticated data acquisition system. The wavelet transforms used biorthogonal wavelets in Matlab.
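An illustrative sketch of the detection idea follows, assuming a synthetic pressure record, a biorthogonal wavelet decomposition, and an ad hoc energy threshold; it is not the authors' acquisition or processing chain.

```python
import numpy as np
import pywt

# Illustrative stall-precursor detection sketch (assumed signal and threshold):
# decompose an unsteady pressure record with a biorthogonal wavelet and watch
# the energy of one detail band as a rotating-stall precursor indicator.

fs = 5000.0                                           # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
pressure = np.random.default_rng(4).normal(0, 1, t.size)
pressure[t > 1.5] += 3.0 * np.sin(2 * np.pi * 120 * t[t > 1.5])  # synthetic precursor

coeffs = pywt.wavedec(pressure, "bior3.5", level=5)
detail = coeffs[1]                                    # coarsest detail band (~78-156 Hz here)
window = 200
energy = np.convolve(detail ** 2, np.ones(window) / window, mode="same")

threshold = 5.0 * np.median(energy)                   # assumed alarm threshold
if energy.max() > threshold:
    print("possible rotating-stall precursor detected")
```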
The composition of M-type asteroids: Synthesis of spectroscopic and radar observations
NASA Astrophysics Data System (ADS)
Neeley, J. R.; Ockert-Bell, M. E.; Clark, B. E.; Shepard, M. K.; Cloutis, E. A.; Fornasier, S.; Bus, S. J.
2011-10-01
This work updates and expands our long-term radar-driven observational campaign of 27 main-belt asteroids (MBAs), focused on Bus-DeMeo Xc- and Xk-type objects (Tholen X and M class asteroids), using the Arecibo radar and the NASA Infrared Telescope Facility (IRTF). Seventeen of our targets were near-simultaneously observed with radar and those observations are described in a companion paper (Shepard et al., 2010). We utilized visible-wavelength data for a more complete compositional analysis of our targets. Compositional evidence is derived from our target asteroid spectra using three different methods: 1) a χ2 search for spectral matches in the RELAB database, 2) parametric comparisons with meteorites and 3) linear discriminant analysis. This paper synthesizes the results of the RELAB search, parametric comparisons, and linear discriminant analysis with compositional suggestions based on radar observations. We find that for six of seventeen targets with radar data, our spectral results are consistent with their radar analog (16 Psyche, 21 Lutetia, 69 Hesperia, 135 Hertha, 216 Kleopatra, and 497 Iva). For twenty out of twenty-seven objects our statistical comparisons with RELAB meteorites result in consistent analog identification, providing a degree of confidence in our parametric methods.
NASA Technical Reports Server (NTRS)
Howlett, R. A.
1975-01-01
A continuation of the NASA/P and WA study to evaluate various types of propulsion systems for advanced commercial supersonic transports has resulted in the identification of two very promising engine concepts. They are the Variable Stream Control Engine which provides independent temperature and velocity control for two coannular exhaust streams, and a derivative of this engine, a Variable Cycle Engine that employs a rear flow-inverter valve to vary the bypass ratio of the cycle. Both concepts are based on advanced engine technology and have the potential for significant improvements in jet noise, exhaust emissions and economic characteristics relative to current technology supersonic engines. Extensive research and technology programs are required in several critical areas that are unique to these supersonic Variable Cycle Engines to realize these potential improvements. Parametric cycle and integration studies of conventional and Variable Cycle Engines are reviewed, features of the two most promising engine concepts are described, and critical technology requirements and required programs are summarized.
Parameterization of a Conventional and Regenerated UHB Turbofan
NASA Astrophysics Data System (ADS)
Oliveira, Fábio; Brójo, Francisco
2015-09-01
The attempt to improve aircraft engine efficiency resulted in the evolution from turbojets to the first generation of low bypass ratio turbofans. Today, high bypass ratio turbofans are the most traditional type of engine in commercial aviation. Following many years of technological developments and improvements, this type of engine has proved to be the most reliable in meeting commercial aviation requirements. In search of more efficiency, the engine manufacturers tend to increase the bypass ratio, leading to ultra-high bypass ratio (UHB) engines. Increased bypass ratio has clear benefits for the propulsion system, such as reduced specific fuel consumption. This study is aimed at a parametric analysis of a UHB turbofan engine focused on short haul flights. Two cycle configurations (conventional and regenerated) were studied, and estimated values of their specific fuel consumption (TSFC) and specific thrust (Fs) were determined. Results demonstrate that the regenerated cycle may contribute towards more economical and environmentally friendly aero engines at higher bypass ratios.
NASA Technical Reports Server (NTRS)
1976-01-01
In the conceptual design task, several feasible wind generator systems (WGS) configurations were evaluated, and the concept offering the lowest energy cost potential and minimum technical risk for utility applications was selected. In the optimization task, the selected concept was optimized utilizing a parametric computer program prepared for this purpose. In the preliminary design task, the optimized selected concept was designed and analyzed in detail. The utility requirements evaluation task examined the economic, operational, and institutional factors affecting the WGS in a utility environment, and provided additional guidance for the preliminary design effort. Results of the conceptual design task indicated that a rotor operating at constant speed, driving an AC generator through a gear transmission is the most cost effective WGS configuration. The optimization task results led to the selection of a 500 kW rating for the low power WGS and a 1500 kW rating for the high power WGS.
Near-term hybrid vehicle program, phase 1. Appendix D: Sensitivity analysis report
NASA Technical Reports Server (NTRS)
1979-01-01
Parametric analyses, using a hybrid vehicle synthesis and economics program (HYVELD) are described investigating the sensitivity of hybrid vehicle cost, fuel usage, utility, and marketability to changes in travel statistics, energy costs, vehicle lifetime and maintenance, owner use patterns, internal combustion engine (ICE) reference vehicle fuel economy, and drive-line component costs and type. The lowest initial cost of the hybrid vehicle would be $1200 to $1500 higher than that of the conventional vehicle. For nominal energy costs ($1.00/gal for gasoline and 4.2 cents/kWh for electricity), the ownership cost of the hybrid vehicle is projected to be 0.5 to 1.0 cents/mi less than the conventional ICE vehicle. To attain this ownership cost differential, the lifetime of the hybrid vehicle must be extended to 12 years and its maintenance cost reduced by 25 percent compared with the conventional vehicle. The ownership cost advantage of the hybrid vehicle increases rapidly as the price of fuel increases from $1 to $2/gal.
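A back-of-envelope sketch of the ownership-cost comparison follows, with placeholder inputs loosely echoing the figures quoted above; it is not the HYVELD model.

```python
# Back-of-envelope ownership-cost comparison. All inputs are illustrative
# placeholders, not the report's data; only the structure is intended:
# cost per mile = amortised purchase premium + energy cost + maintenance cost.

def ownership_cost_per_mile(premium, life_years, miles_per_year,
                            energy_cost_per_mile, maint_cost_per_mile):
    amortised_premium = premium / (life_years * miles_per_year)
    return amortised_premium + energy_cost_per_mile + maint_cost_per_mile

conventional = ownership_cost_per_mile(0.0, 10, 10000, 0.05, 0.03)
hybrid = ownership_cost_per_mile(1350.0, 12, 10000, 0.035, 0.0225)
print(f"difference ~ {100 * (conventional - hybrid):.2f} cents/mile")
```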
B → X_s γ rate and CP asymmetry within the aligned two-Higgs-doublet model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jung, Martin; Pich, Antonio; Tuzon, Paula
In the two-Higgs-doublet model the alignment of the Yukawa matrices in flavor space guarantees the absence of flavor-changing neutral currents at tree level, while introducing new sources for CP violation parametrized in a very economical way [Antonio Pich and Paula Tuzon, Phys. Rev. D 80, 091702 (2009)]. This implies a potentially large influence on a number of processes, b → s γ being a prominent example where rather high experimental and theoretical precision meet. We analyze the CP rate asymmetry in this inclusive decay and determine the resulting constraints on the model parameters. We demonstrate the compatibility with previously obtained limits [Martin Jung, Antonio Pich, and Paula Tuzon, J. High Energy Phys. 11 (2010) 003]. Moreover, we extend the phenomenological analysis of the branching ratio, and examine the influence of resulting correlations on the like-sign dimuon charge asymmetry in B decays.
A Cobb Douglas Stochastic Frontier Model on Measuring Domestic Bank Efficiency in Malaysia
Hasan, Md. Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md. Azizul
2012-01-01
Banking system plays an important role in the economic development of any country. Domestic banks, which are the main components of the banking system, have to be efficient; otherwise, they may create obstacle in the process of development in any economy. This study examines the technical efficiency of the Malaysian domestic banks listed in the Kuala Lumpur Stock Exchange (KLSE) market over the period 2005–2010. A parametric approach, Stochastic Frontier Approach (SFA), is used in this analysis. The findings show that Malaysian domestic banks have exhibited an average overall efficiency of 94 percent, implying that sample banks have wasted an average of 6 percent of their inputs. Among the banks, RHBCAP is found to be highly efficient with a score of 0.986 and PBBANK is noted to have the lowest efficiency with a score of 0.918. The results also show that the level of efficiency has increased during the period of study, and that the technical efficiency effect has fluctuated considerably over time. PMID:22900009
Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging
NASA Astrophysics Data System (ADS)
Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.
2014-04-01
The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér-Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques, though the proposed methodology can be used to evaluate any other ABI parametric image estimation technique.
Stability analysis of a time-periodic 2-dof MEMS structure
NASA Astrophysics Data System (ADS)
Kniffka, Till Jochen; Welte, Johannes; Ecker, Horst
2012-11-01
Microelectromechanical systems (MEMS) are becoming important for all kinds of industrial applications. Among them are filters in communication devices, due to the growing demand for efficient and accurate filtering of signals. In recent developments single degree of freedom (1-dof) oscillators that are operated at a parametric resonance are employed for such tasks. Typically vibration damping is low in such MEM systems. While parametric excitation (PE) has so far been used to take advantage of a parametric resonance, this contribution suggests also exploiting parametric anti-resonances in order to improve the damping behavior of such systems. Modeling aspects of a 2-dof MEM system and first results of the analysis of the non-linear and the linearized system are the focus of this paper. In principle the investigated system is an oscillating mechanical system with two degrees of freedom x = [x1 x2]^T that can be described by M ẍ + C ẋ + K1 x + K3(x²) x + Fes(x, V(t)) = 0. The system is inherently non-linear because of the cubic mechanical stiffness K3 of the structure, but also because of electrostatic forces (1+cos(ωt))Fes(x) that act on the system. Electrostatic forces are generated by comb drives and are proportional to the applied time-periodic voltage V(t). These drives also provide the means to introduce time-periodic coefficients, i.e. parametric excitation (1+cos(ωt)) with frequency ω. For a realistic MEM system the coefficients of the non-linear set of differential equations need to be scaled for efficient numerical treatment. The final mathematical model is a set of four non-linear time-periodic homogeneous differential equations of first order. Numerical results are obtained from two different methods. The linearized time-periodic (LTP) system is studied by calculating the monodromy matrix of the system. The eigenvalues of this matrix decide on the stability of the LTP system. To study the unabridged non-linear system, the bifurcation software ManLab is employed. Continuation analyses including stability evaluations are executed and show the frequency ranges for which the 2-dof system becomes unstable due to parametric resonances. Moreover, the existence of frequency intervals is shown where enhanced damping is observed for this MEMS. The results from the stability studies are confirmed by simulation results.
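A minimal sketch of the monodromy-matrix/Floquet check mentioned above, applied to a single-dof damped Mathieu oscillator with assumed coefficients as a stand-in for the linearized time-periodic MEMS model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Floquet/monodromy stability check for a linear time-periodic system, here a
# damped Mathieu-type oscillator with assumed parameters:
#   x'' + c x' + (delta + eps*cos(omega*t)) x = 0

c, delta, eps, omega = 0.02, 1.0, 0.4, 2.05           # assumed parameters
T = 2 * np.pi / omega                                 # excitation period

def rhs(t, y):
    x, v = y
    return [v, -c * v - (delta + eps * np.cos(omega * t)) * x]

# integrate the columns of the identity matrix over one period
monodromy = np.column_stack([
    solve_ivp(rhs, (0.0, T), col, rtol=1e-9, atol=1e-12).y[:, -1]
    for col in np.eye(2)
])
multipliers = np.linalg.eigvals(monodromy)
stable = np.all(np.abs(multipliers) < 1.0)            # all multipliers inside unit circle
print("Floquet multipliers:", np.round(multipliers, 4), "stable:", stable)
```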
NASA Astrophysics Data System (ADS)
Flores, Robert Joseph
Distributed generation can provide many benefits over traditional central generation such as increased reliability and efficiency while reducing emissions. Despite these potential benefits, distributed generation is generally not purchased unless it reduces energy costs. Economic dispatch strategies can be designed such that distributed generation technologies reduce overall facility energy costs. In this thesis, a microturbine generator is dispatched using different economic control strategies, reducing the cost of energy to the facility. Several industrial and commercial facilities are simulated using acquired electrical, heating, and cooling load data. Industrial and commercial utility rate structures are modeled after Southern California Edison and Southern California Gas Company tariffs and used to find energy costs for the simulated buildings and corresponding microturbine dispatch. Using these control strategies, building models, and utility rate models, a parametric study examining various generator characteristics is performed. An economic assessment of the distributed generation is then performed for both the microturbine generator and parametric study. Without the ability to export electricity to the grid, the economic value of distributed generation is limited to reducing the individual costs that make up the cost of energy for a building. Any economic dispatch strategy must be built to reduce these individual costs. While the ability of distributed generation to reduce cost depends on factors such as electrical efficiency and operations and maintenance cost, the building energy demand being serviced has a strong effect on cost reduction. Buildings with low load factors can accept distributed generation with higher operating costs (low electrical efficiency and/or high operations and maintenance cost) due to the value of demand reduction. As load factor increases, lower operating cost generators are desired due to a larger portion of the building load being met in an effort to reduce demand. In addition, buildings with large thermal demand have access to the least expensive natural gas, lowering the cost of operating distributed generation. Recovery of exhaust heat from DG reduces cost only if the building's thermal demand coincides with the electrical demand. Capacity limits exist where annual savings from operation of distributed generation decrease if further generation is installed. For low operating cost generators, the approximate limit is the average building load. This limit decreases as operating costs increase. In addition, a high capital cost of distributed generation can be accepted if generator operating costs are low. As generator operating costs increase, capital cost must decrease if a positive economic performance is desired.
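A minimal sketch of an energy-cost-reduction dispatch rule in the spirit described above; prices, costs, and loads are placeholders, and demand charges (a key part of real tariffs) are ignored.

```python
import numpy as np

# Toy dispatch rule: run the generator in each hour only when its marginal cost
# of generation is below the utility import price, capped at the building load.
# All numbers are illustrative placeholders, not utility tariff data.

hours = 24
load_kw = 300 + 150 * np.sin(np.linspace(0, 2 * np.pi, hours))      # building load
import_price = np.where((np.arange(hours) >= 12) & (np.arange(hours) < 18),
                        0.25, 0.12)                                  # $/kWh, assumed TOU
gen_capacity_kw = 250.0
gen_marginal_cost = 0.14                                             # $/kWh (fuel + O&M)

gen_output = np.where(import_price > gen_marginal_cost,
                      np.minimum(gen_capacity_kw, load_kw), 0.0)
baseline_cost = np.sum(load_kw * import_price)
dg_cost = np.sum((load_kw - gen_output) * import_price + gen_output * gen_marginal_cost)
print(f"daily savings ~ ${baseline_cost - dg_cost:.2f}")
```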
Zou, Kelly H; Resnic, Frederic S; Talos, Ion-Florin; Goldberg-Zimring, Daniel; Bhagwat, Jui G; Haker, Steven J; Kikinis, Ron; Jolesz, Ferenc A; Ohno-Machado, Lucila
2005-10-01
Medical classification accuracy studies often yield continuous data based on predictive models for treatment outcomes. A popular method for evaluating the performance of diagnostic tests is the receiver operating characteristic (ROC) curve analysis. The main objective was to develop a global statistical hypothesis test for assessing the goodness-of-fit (GOF) for parametric ROC curves via the bootstrap. A simple log (or logit) transformation and a more flexible Box-Cox normality transformation were applied to untransformed or transformed data from two clinical studies, to predict complications following percutaneous coronary interventions (PCIs) and image-guided neurosurgical resection results predicted by tumor volume, respectively. We compared a non-parametric with a parametric binormal estimate of the underlying ROC curve. To construct such a GOF test, we used the non-parametric and parametric areas under the curve (AUCs) as the metrics, with a resulting p value reported. In the interventional cardiology example, logit and Box-Cox transformations of the predictive probabilities led to satisfactory AUCs (AUC=0.888; p=0.78, and AUC=0.888; p=0.73, respectively), while in the brain tumor resection example, log and Box-Cox transformations of the tumor size also led to satisfactory AUCs (AUC=0.898; p=0.61, and AUC=0.899; p=0.42, respectively). In contrast, significant departures from GOF were observed without applying any transformation prior to assuming a binormal model (AUC=0.766; p=0.004, and AUC=0.831; p=0.03), respectively. In both studies the p values suggested that transformations were important to consider before applying any binormal model to estimate the AUC. Our analyses also demonstrated and confirmed the predictive values of different classifiers for determining the interventional complications following PCIs and resection outcomes in image-guided neurosurgery.
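A hedged sketch of the bootstrap comparison of nonparametric and binormal AUCs follows; the scores and the p-value construction are illustrative, not the study's exact procedure.

```python
import numpy as np
from scipy.stats import norm

# Compare the nonparametric (Mann-Whitney) AUC with the binormal AUC on
# resampled data and report a two-sided bootstrap p-value for their difference.
# Scores below are synthetic placeholders for model-predicted values.

rng = np.random.default_rng(5)
scores_neg = rng.normal(0.0, 1.0, 120)               # e.g. logit scores, no event
scores_pos = rng.normal(1.2, 1.3, 60)                # e.g. logit scores, event

def auc_nonparametric(neg, pos):
    return np.mean(pos[:, None] > neg[None, :]) + 0.5 * np.mean(pos[:, None] == neg[None, :])

def auc_binormal(neg, pos):
    return norm.cdf((pos.mean() - neg.mean()) / np.sqrt(pos.var(ddof=1) + neg.var(ddof=1)))

observed_diff = auc_nonparametric(scores_neg, scores_pos) - auc_binormal(scores_neg, scores_pos)

boot_diffs = []
for _ in range(1000):
    neg_b = rng.choice(scores_neg, scores_neg.size, replace=True)
    pos_b = rng.choice(scores_pos, scores_pos.size, replace=True)
    boot_diffs.append(auc_nonparametric(neg_b, pos_b) - auc_binormal(neg_b, pos_b))
boot_diffs = np.asarray(boot_diffs)

# proportion of centered bootstrap differences at least as extreme as observed
p_value = np.mean(np.abs(boot_diffs - boot_diffs.mean()) >= np.abs(observed_diff))
print(f"AUC difference = {observed_diff:.3f}, bootstrap p ~ {p_value:.2f}")
```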
Anastasiadis, K; Fragoulakis, V; Antonitsis, P; Maniadakis, N
2013-10-15
This study aims to develop a methodological framework for the comparative economic evaluation between Minimal Extracorporeal Circulation (MECC) versus conventional Extracorporeal Circulation (CECC) in patients undergoing coronary artery bypass grafting (CABG) in different healthcare systems. Moreover, we evaluate the cost-effectiveness ratio of alternative comparators in the healthcare setting of Greece, Germany, the Netherlands and Switzerland. The effectiveness data utilized were derived from a recent meta-analysis which incorporated 24 randomized clinical trials. Total therapy cost per patient reflects all resources expensed in delivery of therapy and the management of any adverse events, including drugs, diagnostics tests, materials, devices, blood units, the utilization of operating theaters, intensive care units, and wards. Perioperative mortality was used as the primary health outcome to estimate life years gained in treatment arms. Bias-corrected uncertainty intervals were calculated using the percentile method of non-parametric Monte-Carlo simulation. The MECC circuit was more expensive than CECC, with a difference ranging from €180 to €600 depending on the country. However, in terms of total therapy cost per patient the comparison favored MECC in all countries. Specifically it was associated with a reduction of €635 in Greece, €297 in Germany, €1590 in the Netherlands and €375 in Switzerland. In terms of effectiveness, the total life-years gained were slightly higher in favor of MECC. Surgery with MECC may be dominant (lower cost and higher effectiveness) compared to CECC in coronary revascularization procedures and therefore it represents an attractive new option relative to conventional extracorporeal circulation for CABG. © 2013.
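A minimal sketch of a non-parametric bootstrap percentile interval for an ICER follows; the per-patient costs and life-years are synthetic placeholders, not trial data.

```python
import numpy as np

# Bootstrap percentile interval for an incremental cost-effectiveness ratio:
# resample per-patient costs and effects in the two arms and take percentile
# limits of the resulting ratios. All inputs are placeholders.

rng = np.random.default_rng(6)
cost_a = rng.normal(9500, 1500, 200)                 # assumed per-patient cost, arm A
cost_b = rng.normal(9900, 1500, 200)                 # assumed per-patient cost, arm B
ly_a = rng.normal(9.3, 1.0, 200)                     # assumed life-years, arm A
ly_b = rng.normal(9.0, 1.0, 200)                     # assumed life-years, arm B

icers = []
for _ in range(2000):
    ca = rng.choice(cost_a, cost_a.size, replace=True).mean()
    cb = rng.choice(cost_b, cost_b.size, replace=True).mean()
    la = rng.choice(ly_a, ly_a.size, replace=True).mean()
    lb = rng.choice(ly_b, ly_b.size, replace=True).mean()
    icers.append((ca - cb) / (la - lb))

lo_lim, hi_lim = np.percentile(icers, [2.5, 97.5])
print(f"ICER 95% percentile interval: ({lo_lim:.0f}, {hi_lim:.0f}) per life-year")
```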
Cost analysis of a coal-fired power plant using the NPV method
NASA Astrophysics Data System (ADS)
Kumar, Ravinder; Sharma, Avdhesh Kr.; Tewari, P. C.
2015-12-01
The present study investigates the impact of various factors affecting the economics of a 210 MW subcritical coal-fired unit situated in north India for electricity generation. In this paper, the cost data of various units of a thermal power plant, in terms of power output capacity, have been fitted using a power law with the help of data collected from a literature search. To have a realistic estimate of primary components or equipment, it is necessary to include the latest cost of these components. The cost analysis of the plant was carried out on the basis of total capital investment, operating cost and revenue. The total capital investment includes the total direct plant cost and total indirect plant cost. Total direct plant cost involves the cost of equipment (i.e. boiler, steam turbine, condenser, generator and auxiliary equipment including condensate extraction pump, feed water pump, etc.) and other costs associated with piping, electrical, civil works, direct installation cost, auxiliary services, instrumentation and controls, and site preparation. The total indirect plant cost includes the cost of engineering and set-up. The net present value method was adopted for the present study. The work presented in this paper is an endeavour to study the influence of some of the important parameters on the lifetime costs of a coal-fired power plant. For this purpose, a parametric study with and without escalation rates over a 35-year plant life was carried out. The results indicate that plant economics is most sensitive to plant life, interest rate and escalation rate in comparison to the other factors under study.
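A minimal NPV sketch with optional escalation follows, using invented inputs rather than the 210 MW plant's data.

```python
# NPV sketch: NPV = -capital + sum of discounted (revenue - operating cost),
# with optional escalation of revenue and cost. All inputs are illustrative
# placeholders, not the plant data from the study.

def npv(capital, annual_revenue, annual_cost, rate, years,
        revenue_escalation=0.0, cost_escalation=0.0):
    value = -capital
    for year in range(1, years + 1):
        rev = annual_revenue * (1 + revenue_escalation) ** (year - 1)
        cost = annual_cost * (1 + cost_escalation) ** (year - 1)
        value += (rev - cost) / (1 + rate) ** year
    return value

# e.g. a 35-year plant life, 10% discount rate, assumed escalation rates
result = npv(capital=1.2e9, annual_revenue=2.8e8, annual_cost=1.9e8,
             rate=0.10, years=35, revenue_escalation=0.03, cost_escalation=0.05)
print(round(result / 1e6, 1), "million (arbitrary currency)")
```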
Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 4, Appendix C
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
The electrical characterization and qualification test results are presented for the RCA MWS5001D random access memory. The tests included functional tests, AC and DC parametric tests, AC parametric worst-case pattern selection test, determination of worst-case transition for setup and hold times, and a series of schmoo plots. Statistical analysis data is supplied along with write pulse width, read cycle time, write cycle time, and chip enable time data.
Multivariable Parametric Cost Model for Ground Optical Telescope Assembly
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Rowell, Ginger Holmes; Reese, Gayle; Byberg, Alicia
2005-01-01
A parametric cost model for ground-based telescopes is developed using multivariable statistical analysis of both engineering and performance parameters. While diameter continues to be the dominant cost driver, diffraction-limited wavelength is found to be a secondary driver. Other parameters such as radius of curvature are examined. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived.
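A minimal sketch of fitting a two-variable power-law cost model by least squares in log space follows; the telescope data are synthetic placeholders, not the study database.

```python
import numpy as np

# Fit a multivariable parametric cost model of the form
#   cost = A * D**b1 * lambda**b2
# by ordinary least squares in log space (log cost on log diameter and log
# diffraction-limited wavelength). Data are synthetic placeholders.

rng = np.random.default_rng(7)
diameter = rng.uniform(1, 10, 30)                    # m
wavelength = rng.uniform(0.5, 10, 30)                # um, diffraction-limited
cost = 5.0 * diameter ** 1.7 * wavelength ** -0.3 * rng.lognormal(0, 0.2, 30)

X = np.column_stack([np.ones(30), np.log(diameter), np.log(wavelength)])
coef, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
logA, b1, b2 = coef
print(f"cost ~ {np.exp(logA):.2f} * D^{b1:.2f} * lambda^{b2:.2f}")
```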
Comparison of Survival Models for Analyzing Prognostic Factors in Gastric Cancer Patients
Habibi, Danial; Rafiei, Mohammad; Chehrei, Ali; Shayan, Zahra; Tafaqodi, Soheil
2018-03-27
Objective: There are a number of models for determining risk factors for survival of patients with gastric cancer. This study was conducted to select the model showing the best fit with available data. Methods: Cox regression and parametric models (Exponential, Weibull, Gompertz, Log normal, Log logistic and Generalized Gamma) were utilized in unadjusted and adjusted forms to detect factors influencing mortality of patients. Comparisons were made with the Akaike Information Criterion (AIC) by using STATA 13 and R 3.1.3 software. Results: The results of this study indicated that all parametric models outperform the Cox regression model. The Log normal, Log logistic and Generalized Gamma provided the best performance in terms of AIC values (179.2, 179.4 and 181.1, respectively). On unadjusted analysis, the results of the Cox regression and parametric models indicated stage, grade, largest diameter of metastatic nest, largest diameter of LM, number of involved lymph nodes and the largest ratio of metastatic nests to lymph nodes, to be variables influencing the survival of patients with gastric cancer. On adjusted analysis, according to the best model (log normal), grade was found to be the significant variable. Conclusion: The results suggested that all parametric models outperform the Cox model. The log normal model provides the best fit and is a good substitute for Cox regression. Creative Commons Attribution License
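A simple illustration of AIC-based model comparison on synthetic uncensored survival times (the study's fits additionally handle censoring and covariates).

```python
import numpy as np
from scipy import stats

# AIC comparison of candidate parametric survival distributions on synthetic
# uncensored survival times; lower AIC indicates a better trade-off between
# fit and parameter count.

rng = np.random.default_rng(8)
times = stats.lognorm.rvs(s=0.8, scale=20, size=150, random_state=rng)

def aic(dist, data, n_params, **fit_kwargs):
    params = dist.fit(data, **fit_kwargs)
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * n_params - 2 * loglik

print("Weibull AIC:     ", round(aic(stats.weibull_min, times, 2, floc=0), 1))
print("Log-normal AIC:  ", round(aic(stats.lognorm, times, 2, floc=0), 1))
print("Log-logistic AIC:", round(aic(stats.fisk, times, 2, floc=0), 1))
```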
40 CFR 264.97 - General ground-water monitoring requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... paragraph (i) of this section. (1) A parametric analysis of variance (ANOVA) followed by multiple... mean levels for each constituent. (2) An analysis of variance (ANOVA) based on ranks followed by...
Intercity Passenger Parametric Analysis: Overview: Maglev Analysis
DOT National Transportation Integrated Search
1990-04-02
This document provides information intended to clarify consideration of some of the technically-based questions which arise in connection with intercity passenger transportation, and to provide insight into the characteristics and potential roles o...
Comprehensive Analysis Modeling of Small-Scale UAS Rotors
NASA Technical Reports Server (NTRS)
Russell, Carl R.; Sekula, Martin K.
2017-01-01
Multicopter unmanned aircraft systems (UAS), or drones, have continued their explosive growth in recent years. With this growth comes demand for increased performance as the limits of existing technologies are reached. In order to better design multicopter UAS aircraft, better performance prediction tools are needed. This paper presents the results of a study aimed at using the rotorcraft comprehensive analysis code CAMRAD II to model a multicopter UAS rotor in hover. Parametric studies were performed to determine the level of fidelity needed in the analysis code inputs to achieve results that match test data. Overall, the results show that CAMRAD II is well suited to model small-scale UAS rotors in hover. This paper presents the results of the parametric studies as well as recommendations for the application of comprehensive analysis codes to multicopter UAS rotors.
Parametric analysis of a down-scaled turbo jet engine suitable for drone and UAV propulsion
NASA Astrophysics Data System (ADS)
Wessley, G. Jims John; Chauhan, Swati
2018-04-01
This paper presents a detailed study on the need for downscaling gas turbine engines for UAV and drone propulsion, along with the downscaling procedure and a parametric analysis of the downscaled engine using the Gas Turbine Simulation Program (GSP 11) software. The need for a micro gas turbine engine in the thrust range of 0.13 to 4.45 kN to power UAVs and drones weighing 4.5 to 25 kg is considered, and to meet this requirement a parametric analysis of the scaled-down Allison J33-A-35 turbojet engine is performed. The analysis shows that the thrust developed by the scaled engine and the thrust specific fuel consumption (TSFC) depend on pressure ratio, air mass flow rate, and Mach number. A scaling factor of 0.195, corresponding to an air mass flow rate of 7.69 kg/s, produces a thrust in the range of 4.57 to 5.6 kN while operating at a Mach number of 0.3 at altitudes of 5000 to 9000 m. The thermal and overall efficiencies of the scaled engine are found to be 67% and 75%, respectively, for a pressure ratio of 2. The outcomes of this analysis form a strong base for further analysis, design, and fabrication of micro gas turbine engines to propel future UAVs and drones.
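For readers who want to reproduce the qualitative trends, the following sketch implements a textbook ideal (loss-free) on-design turbojet cycle rather than the GSP 11 model used in the paper; the ambient temperature, turbine inlet temperature, and fuel heating value are assumed values, and the results only illustrate how specific thrust and TSFC respond to pressure ratio and Mach number.

```python
# Minimal ideal (loss-free) on-design turbojet cycle sketch, not the GSP 11
# model used in the paper. All constants (T0, Tt4, fuel heating value) are
# assumed for illustration only.
import math

def ideal_turbojet(M0, pi_c, T0=255.7, Tt4=1600.0, gamma=1.4, cp=1004.5, h_pr=42.8e6):
    R = (gamma - 1.0) / gamma * cp
    a0 = math.sqrt(gamma * R * T0)                    # ambient speed of sound, m/s
    tau_r = 1.0 + 0.5 * (gamma - 1.0) * M0**2         # ram temperature ratio
    tau_lam = Tt4 / T0                                # cycle temperature ratio
    tau_c = pi_c ** ((gamma - 1.0) / gamma)           # ideal compressor temperature ratio
    tau_t = 1.0 - tau_r / tau_lam * (tau_c - 1.0)     # turbine ratio from work balance
    v9_a0 = math.sqrt(2.0 / (gamma - 1.0) * tau_lam / (tau_r * tau_c)
                      * (tau_r * tau_c * tau_t - 1.0))
    spec_thrust = a0 * (v9_a0 - M0)                   # N per (kg/s) of air
    f = cp * T0 * (tau_lam - tau_r * tau_c) / h_pr    # fuel-air ratio
    tsfc = f / spec_thrust                            # kg of fuel per N·s
    return spec_thrust, tsfc

for pi_c in (2, 4, 8, 16):
    fs, s = ideal_turbojet(M0=0.3, pi_c=pi_c)
    print(f"pi_c={pi_c:2d}: F/mdot = {fs:6.1f} N/(kg/s), TSFC = {s*1e6:5.1f} mg/(N*s)")
```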
NASA Technical Reports Server (NTRS)
1972-01-01
Propulsion system characteristics for a long range, high subsonic (Mach 0.90 - 0.98), jet commercial transport aircraft are studied to identify the most desirable cycle and engine configuration and to assess the payoff of advanced engine technologies applicable to the time frame of the late 1970s to the mid 1980s. An engine parametric study phase examines major cycle trends on the basis of aircraft economics. This is followed by the preliminary design of two advanced mixed exhaust turbofan engines pointed at two different technology levels (1970 and 1985 commercial certification for engines No. 1 and No. 2, respectively). The economic penalties of environmental constraints - noise and exhaust emissions - are assessed. The highest specific thrust engine (lowest bypass ratio for a given core technology) achievable with a single-stage fan yields the best economics for a Mach 0.95 - 0.98 aircraft and can meet the noise objectives specified, but with significant economic penalties. Advanced technologies which would allow high temperature and cycle pressure ratios to be used effectively are shown to provide significant improvement in mission performance which can partially offset the economic penalties incurred to meet lower noise goals. Advanced technology needs are identified; and, in particular, the initiation of an integrated fan and inlet aero/acoustic program is recommended.
Validation of a Parametric Approach for 3d Fortification Modelling: Application to Scale Models
NASA Astrophysics Data System (ADS)
Jacquot, K.; Chevrier, C.; Halin, G.
2013-02-01
The parametric modelling approach applied to the virtual representation of cultural heritage is a field of research that has been explored for years, since it can address many limitations of digitising tools. For example, essential historical sources for fortification virtual reconstructions, such as plans-reliefs, have several shortcomings when they are scanned. To overcome these problems, knowledge-based modelling can be used: knowledge models based on the analysis of the theoretical literature of a specific domain, such as bastioned fortification treatises, can be the cornerstone of a parametric library of fortification components. Implemented in Grasshopper, these components are manually adjusted to the data available (i.e., 3D surveys of plans-reliefs or scanned maps). Most of the fortification area is now modelled, and the question of accuracy assessment is raised. A specific method is used to evaluate the accuracy of the parametric components. The results of the assessment process will allow us to validate the parametric approach. The automation of the adjustment process can finally be planned. The virtual model of the fortification is part of a larger project aimed at valorising and disseminating a very unique cultural heritage item: the collection of plans-reliefs. As such, knowledge models are precious assets when automation and semantic enhancements are considered.
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1978-01-01
A FORTRAN computer program is described for predicting the flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuel of varying end point and hydrogen content specifications. The program has provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case.
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1978-01-01
The FORTRAN computing program predicts flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuels of varying end point and hydrogen content specifications. The program has a provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case.
GASP- General Aviation Synthesis Program. Volume 1: Main program. Part 1: Theoretical development
NASA Technical Reports Server (NTRS)
Hague, D.
1978-01-01
The General Aviation Synthesis Program performs tasks generally associated with aircraft preliminary design and allows an analyst the capability of performing parametric studies in a rapid manner. GASP emphasizes small fixed-wing aircraft employing propulsion systems varying from a single piston engine with fixed-pitch propeller through twin turboprop/turbofan powered business or transport type aircraft. The program, which may be operated from a computer terminal in either the batch or interactive graphic mode, is comprised of modules representing the various technical disciplines integrated into a computational flow which ensures that the interacting effects of design variables are continuously accounted for in the aircraft sizing procedure. The model is a useful tool for comparing configurations, assessing aircraft performance and economics, performing tradeoff and sensitivity studies, and assessing the impact of advanced technologies on aircraft performance and economics.
Thermal energy storage systems using fluidized bed heat exchangers
NASA Technical Reports Server (NTRS)
Weast, T.; Shannon, L.
1980-01-01
A rotary cement kiln and an electric arc furnace were chosen for evaluation to determine the applicability of a fluid bed heat exchanger (FBHX) for thermal energy storage (TES). Multistage shallow bed FBHX's operating with high temperature differences were identified as the most suitable for TES applications. Analysis of the two selected conceptual systems included establishing a plant process flow configuration, an operational scenario, a preliminary FBHX/TES design, and parametric analysis. A computer model was developed to determine the effects of the number of stages, gas temperatures, gas flows, bed materials, charge and discharge time, and parasitic power required for operation. The maximum national energy conservation potential of the cement plant application with TES is 15.4 million barrels of oil or 3.9 million tons of coal per year. For the electric arc furnace application, the maximum national conservation potential with TES is 4.5 million barrels of oil or 1.1 million tons of coal per year. Present time-of-day utility rates are near the breakeven point required for the TES system. Escalation of on-peak energy due to critical fuel shortages could make the FBHX/TES applications economically attractive in the future.
Thermal energy storage systems using fluidized bed heat exchangers
NASA Astrophysics Data System (ADS)
Weast, T.; Shannon, L.
1980-06-01
A rotary cement kiln and an electric arc furnace were chosen for evaluation to determine the applicability of a fluid bed heat exchanger (FBHX) for thermal energy storage (TES). Multistage shallow bed FBHX's operating with high temperature differences were identified as the most suitable for TES applications. Analysis of the two selected conceptual systems included establishing a plant process flow configuration, an operational scenario, a preliminary FBHX/TES design, and parametric analysis. A computer model was developed to determine the effects of the number of stages, gas temperatures, gas flows, bed materials, charge and discharge time, and parasitic power required for operation. The maximum national energy conservation potential of the cement plant application with TES is 15.4 million barrels of oil or 3.9 million tons of coal per year. For the electric arc furnace application, the maximum national conservation potential with TES is 4.5 million barrels of oil or 1.1 million tons of coal per year. Present time-of-day utility rates are near the breakeven point required for the TES system. Escalation of on-peak energy due to critical fuel shortages could make the FBHX/TES applications economically attractive in the future.
Vitikainen, Kirsi; Street, Andrew; Linna, Miika
2009-02-01
Hospital efficiency has been the subject of numerous health economics studies, but there is little evidence on how the chosen output and casemix measures affect the efficiency results. The aim of this study is to examine the robustness of efficiency results due to these factors. Comparison is made between activities and episode output measures, and two different output grouping systems (Classic and FullDRG). Non-parametric data envelopment analysis is used as an analysis technique. The data consist of all public acute care hospitals in Finland in 2005 (n=40). Efficiency estimates were not found to be highly sensitive to the choice between episode and activity descriptions of output, but more so to the choice of DRG grouping system. Estimates are most sensitive to scale assumptions, with evidence of decreasing returns to scale in larger hospitals. Episode measures are generally to be preferred to activity measures because these better capture the patient pathway, while FullDRGs are preferred to Classic DRGs particularly because of the better description of outpatient output in the former grouping system. Attention should be paid to reducing the extent of scale inefficiency in Finland.
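A minimal sketch of the non-parametric efficiency technique used in this study is given below: an input-oriented, constant-returns-to-scale DEA (CCR) model solved as a linear program with SciPy. The hospital inputs and outputs are invented, and the paper's analysis additionally examines alternative output definitions and returns-to-scale assumptions.

```python
# Minimal input-oriented, constant-returns-to-scale DEA sketch (CCR envelopment
# form) solved as a linear program; the hospital inputs/outputs are made up.
import numpy as np
from scipy.optimize import linprog

X = np.array([[120.0,  90.0, 200.0, 150.0],       # inputs, one row per input (e.g. beds)
              [300.0, 260.0, 510.0, 400.0]])      #                             (e.g. staff)
Y = np.array([[5000.0, 4200.0, 8100.0, 5900.0]])  # outputs (e.g. DRG-weighted episodes)

n_dmu = X.shape[1]
for k in range(n_dmu):
    c = np.r_[1.0, np.zeros(n_dmu)]               # minimise theta
    # inputs:  sum_j lambda_j * x_ij <= theta * x_ik
    A_in = np.hstack([-X[:, [k]], X])
    b_in = np.zeros(X.shape[0])
    # outputs: sum_j lambda_j * y_rj >= y_rk
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])
    b_out = -Y[:, k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n_dmu, method="highs")
    print(f"Hospital {k}: efficiency score = {res.x[0]:.3f}")
```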
NASA Astrophysics Data System (ADS)
Linden, Sebastian; Virey, Jean-Marc
2008-07-01
We test the robustness and flexibility of the Chevallier-Polarski-Linder (CPL) parametrization of the dark energy equation of state, w(z) = w0 + wa z/(1+z), in recovering a four-parameter steplike fiducial model. We constrain the parameter space region of the underlying fiducial model where the CPL parametrization offers a reliable reconstruction. It turns out that non-negligible biases leak into the results for recent (z<2.5) rapid transitions, but that CPL yields a good reconstruction in all other cases. The presented analysis is performed with supernova Ia data as forecasted for a space mission like SNAP/JDEM, combined with future expectations for the cosmic microwave background shift parameter R and the baryonic acoustic oscillation parameter A.
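For reference, the CPL form and its closed-form dark-energy density evolution can be evaluated as below; the (w0, wa) values are illustrative only and are not fit results from the paper.

```python
# Sketch of the CPL equation of state and the corresponding dark-energy density
# evolution (closed form), useful when comparing a fitted (w0, wa) pair against
# a steplike fiducial w(z). The parameter values below are illustrative.
import numpy as np

def w_cpl(z, w0, wa):
    a = 1.0 / (1.0 + z)
    return w0 + wa * (1.0 - a)            # equivalent to w0 + wa * z / (1 + z)

def rho_de_ratio_cpl(a, w0, wa):
    # rho_DE(a)/rho_DE(a=1) for CPL, from exp(3 * integral of (1+w)/a' da')
    return a ** (-3.0 * (1.0 + w0 + wa)) * np.exp(-3.0 * wa * (1.0 - a))

z = np.linspace(0.0, 2.5, 6)
print("w_CPL(z)       :", np.round(w_cpl(z, w0=-1.0, wa=0.3), 3))
print("rho_DE/rho_DE,0:", np.round(rho_de_ratio_cpl(1.0 / (1.0 + z), -1.0, 0.3), 3))
```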
Parametric spatiotemporal oscillation in reaction-diffusion systems.
Ghosh, Shyamolina; Ray, Deb Shankar
2016-03-01
We consider a reaction-diffusion system in a homogeneous stable steady state. On perturbation by a time-dependent sinusoidal forcing of a suitable scaling parameter the system exhibits parametric spatiotemporal instability beyond a critical threshold frequency. We have formulated a general scheme to calculate the threshold condition for oscillation and the range of unstable spatial modes lying within a V-shaped region reminiscent of Arnold's tongue. Full numerical simulations show that depending on the specificity of nonlinearity of the models, the instability may result in time-periodic stationary patterns in the form of standing clusters or spatially localized breathing patterns with characteristic wavelengths. Our theoretical analysis of the parametric oscillation in reaction-diffusion system is corroborated by full numerical simulation of two well-known chemical dynamical models: chlorite-iodine-malonic acid and Briggs-Rauscher reactions.
Parametric spatiotemporal oscillation in reaction-diffusion systems
NASA Astrophysics Data System (ADS)
Ghosh, Shyamolina; Ray, Deb Shankar
2016-03-01
We consider a reaction-diffusion system in a homogeneous stable steady state. On perturbation by a time-dependent sinusoidal forcing of a suitable scaling parameter the system exhibits parametric spatiotemporal instability beyond a critical threshold frequency. We have formulated a general scheme to calculate the threshold condition for oscillation and the range of unstable spatial modes lying within a V-shaped region reminiscent of Arnold's tongue. Full numerical simulations show that depending on the specificity of nonlinearity of the models, the instability may result in time-periodic stationary patterns in the form of standing clusters or spatially localized breathing patterns with characteristic wavelengths. Our theoretical analysis of the parametric oscillation in reaction-diffusion system is corroborated by full numerical simulation of two well-known chemical dynamical models: chlorite-iodine-malonic acid and Briggs-Rauscher reactions.
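The following sketch shows the kind of numerical experiment described here, using the Brusselator as a generic stand-in for the chlorite-iodine-malonic acid and Briggs-Rauscher chemistries studied in the paper: a 1D reaction-diffusion system is integrated with an explicit finite-difference scheme while one kinetic parameter is modulated sinusoidally about a value for which the unforced homogeneous state is linearly stable. All parameter values, including the forcing amplitude and frequency, are assumed for illustration.

```python
# 1D finite-difference sketch of a reaction-diffusion model under sinusoidal
# parametric forcing. The Brusselator stands in for the chemical models studied
# in the paper; all parameter values are assumed for illustration.
import numpy as np

nx, L, dt, steps = 128, 50.0, 0.01, 30000
dx = L / nx
Du, Dv = 1.0, 4.0                             # diffusion coefficients
a, b0, eps, omega = 2.0, 3.6, 0.3, 3.7        # b(t) = b0 * (1 + eps*sin(omega*t))

rng = np.random.default_rng(0)
u = a + 0.01 * rng.standard_normal(nx)        # small perturbation of the
v = b0 / a + 0.01 * rng.standard_normal(nx)   # homogeneous steady state (a, b0/a)

def lap(f):
    # periodic 1D Laplacian by central differences
    return (np.roll(f, 1) - 2.0 * f + np.roll(f, -1)) / dx**2

for n in range(steps):
    b = b0 * (1.0 + eps * np.sin(omega * n * dt))   # sinusoidal parametric forcing
    ru = a - (b + 1.0) * u + u**2 * v               # Brusselator kinetics
    rv = b * u - u**2 * v
    u = u + dt * (Du * lap(u) + ru)                 # explicit Euler update
    v = v + dt * (Dv * lap(v) + rv)

print("final range of u:", float(u.min()), float(u.max()))
```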
Quasi-phase-matched χ(3)-parametric interactions in sinusoidally tapered waveguides
NASA Astrophysics Data System (ADS)
Saleh, Mohammed F.
2018-01-01
In this article, I show how periodically tapered waveguides can be employed as efficient quasi-phase-matching schemes for four-wave mixing parametric processes in third-order nonlinear materials. As an example, a thorough study of enhancing third-harmonic generation in sinusoidally tapered fibers has been conducted. The quasi-phase-matching condition has been obtained for nonlinear parametric interactions in these structures using Fourier-series analysis. The dependencies of the conversion efficiency of the third harmonic on the modulation amplitude, tapering period, longitudinal-propagation direction, and pump wavelength have been studied. In comparison to uniform waveguides, the conversion efficiency has been enhanced by orders of magnitude. I envisage that this work will have a great impact in the field of guided nonlinear optics using centrosymmetric materials.
Parametric, nonparametric and parametric modelling of a chaotic circuit time series
NASA Astrophysics Data System (ADS)
Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.
2000-09-01
The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and model output are caused by an inappropriate model or by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and if rejected suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit where we obtain an extremely accurate reconstruction of the observed attractor.
The Propensity Score Analytical Framework: An Overview and Institutional Research Example
ERIC Educational Resources Information Center
Herzog, Serge
2014-01-01
Estimating the effect of campus math tutoring support, this study demonstrates the use of propensity score weighted and matched-data analysis and examines the correspondence with results from parametric regression analysis.
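A compact sketch of the propensity-score-weighted comparison described above, on simulated data: a logistic model estimates the probability of receiving tutoring from covariates, and inverse probability weights adjust the outcome difference. The covariates, outcome model, and true effect are all invented.

```python
# Sketch of a propensity-score-weighted effect estimate on simulated data:
# a logistic model predicts receipt of tutoring from covariates, and inverse
# probability weights adjust the outcome comparison. All data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 4000
x = rng.normal(size=(n, 2))                          # covariates (e.g. HS GPA, test score)
p_treat = 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
t = rng.binomial(1, p_treat)                         # tutoring indicator
y = 1.0 * t + 1.5 * x[:, 0] + rng.normal(size=n)     # outcome with true effect = 1.0

ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]
w = np.where(t == 1, 1 / ps, 1 / (1 - ps))           # inverse probability weights
naive = y[t == 1].mean() - y[t == 0].mean()
ipw = (np.sum(w * t * y) / np.sum(w * t)
       - np.sum(w * (1 - t) * y) / np.sum(w * (1 - t)))
print(f"naive difference: {naive:.2f}, IPW estimate: {ipw:.2f} (true effect 1.0)")
```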
Parallel runway requirement analysis study. Volume 1: The analysis
NASA Technical Reports Server (NTRS)
Ebrahimi, Yaghoob S.
1993-01-01
The correlation of increased flight delays with the level of aviation activity is well recognized. A main contributor to these flight delays has been the capacity of airports. Though new airport and runway construction would significantly increase airport capacity, few programs of this type are currently underway, let alone planned, because of the high cost associated with such endeavors. Therefore, it is necessary to achieve the most efficient and cost effective use of existing fixed airport resources through better planning and control of traffic flows. In fact, during the past few years the FAA has initiated such an airport capacity program designed to provide additional capacity at existing airports. Some of the improvements that the program has generated thus far have been based on new Air Traffic Control procedures, terminal automation, additional Instrument Landing Systems, improved controller display aids, and improved utilization of multiple runways/Instrument Meteorological Conditions (IMC) approach procedures. A useful element to understanding potential operational capacity enhancements at high demand airports has been the development and use of an analysis tool called The PLAND_BLUNDER (PLB) Simulation Model. The objective for building this simulation was to develop a parametric model that could be used for analysis in determining the minimum safety level of parallel runway operations for various parameters representing the airplane, navigation, surveillance, and ATC system performance. This simulation is useful as: a quick and economical evaluation of existing environments that are experiencing IMC delays, an efficient way to study and validate proposed procedure modifications, an aid in evaluating requirements for new airports or new runways in old airports, a simple, parametric investigation of a wide range of issues and approaches, an ability to trade off air and ground technology and procedure contributions, and a way of considering probable blunder mechanisms and the range of blunder scenarios. This study describes the steps of building the simulation and considers the input parameters, assumptions and limitations, and available outputs. Validation results and sensitivity analysis are addressed as well as outlining some IMC and Visual Meteorological Conditions (VMC) approaches to parallel runways. Also, present and future applicable technologies (e.g., Digital Autoland Systems, Traffic Collision and Avoidance System II, Enhanced Situational Awareness System, Global Positioning Systems for Landing, etc.) are assessed and recommendations made.
NASA Astrophysics Data System (ADS)
Yoo, Yeon-Jong
The purpose of this study is to investigate the performance and stability of gas-injection enhanced natural circulation in heavy-liquid-metal-cooled systems. The target system is STAR-LM, which is a 400-MWt-class advanced lead-cooled fast reactor under development by Argonne National Laboratory and Oregon State University. The primary loop of STAR-LM relies on natural circulation to eliminate main circulation pumps for enhancement of passive safety. To significantly increase the natural circulation flow rate for the incorporation of potential future power uprates, the injection of noncondensable gas into the coolant above the core is envisioned ("gas lift pump"). Reliance upon gas-injection enhanced natural circulation raises the concern of flow instability due to the relatively high temperature change in the reactor core and the two-phase flow condition in the riser. For this study, the one-dimensional flow field equations were applied to each flow section, and the mixture models of two-phase flow, i.e., both the homogeneous and drift-flux equilibrium models, were used in the two-phase region of the riser. For the stability analysis, the linear perturbation technique based on the frequency-domain approach was used by employing the Nyquist stability criterion and a numerical root search method. It has been shown from the steady-state performance analysis that the thermal power of the STAR-LM natural circulation system could be increased from 400 up to 1152 MW with gas injection under the limiting void fraction of 0.30 and limiting coolant velocity of 2.0 m/s. As the result of the linear stability analysis, it has turned out that the STAR-LM natural circulation system would be stable even with gas injection. In addition, through the parametric study, it has been found that the thermal inertia effects of solid structures such as the fuel rod and heat exchanger tube should be considered in the stability analysis model. The results of this study will be a part of the optimized stable design of the gas-injection enhanced natural circulation of STAR-LM with substantially improved power level and economic competitiveness. Furthermore, combined with the parametric study, this research could provide a guideline for the design of other similar heavy-liquid-metal-cooled natural circulation systems with gas injection.
Parametric amplification in quasi-PT symmetric coupled waveguide structures
NASA Astrophysics Data System (ADS)
Zhong, Q.; Ahmed, A.; Dadap, J. I.; Osgood, R. M., Jr.; El-Ganainy, R.
2016-12-01
The concept of non-Hermitian parametric amplification was recently proposed as a means to achieve an efficient energy conversion throughout the process of nonlinear three wave mixing in the absence of phase matching. Here we investigate this effect in a waveguide coupler arrangement whose characteristics are tailored to introduce passive PT symmetry only for the idler component. By means of analytical solutions and numerical analysis, we demonstrate the utility of these novel schemes and obtain the optimal design conditions for these devices.
Parametric study on the performance of automotive MR shock absorbers
NASA Astrophysics Data System (ADS)
Gołdasz, J.; Dzierżek, S.
2016-09-01
The paper contains the results of a parametric study to explore the influence of various quantities on the performance range of semi-active automotive shock absorbers using magnetorheological (MR) fluid under steady-state and transient excitations. The analysis was performed with simulated data using a standard single-tube shock absorber configuration with a single-gap MR valve. Additionally, the impact of material variables and valve geometry was examined as the parameters were varied and the resulting dynamic range studied.
Multivariable Parametric Cost Model for Ground Optical Telescope Assembly
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Rowell, Ginger Holmes; Reese, Gayle; Byberg, Alicia
2004-01-01
A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis of both engineering and performance parameters. While diameter continues to be the dominant cost driver, diffraction limited wavelength is found to be a secondary driver. Other parameters such as radius of curvature were examined. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e. multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter were derived.
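Such parametric cost models are typically power laws fitted in log space; the sketch below shows the mechanics on synthetic telescope data (the exponents recovered here are not the paper's estimates).

```python
# Sketch of a multivariable power-law cost model, cost ~ D^a * lambda^b, fitted
# by ordinary least squares in log space. The telescope "data" are synthetic;
# the fitted exponents are not those of the paper.
import numpy as np

rng = np.random.default_rng(2)
D = rng.uniform(1.0, 10.0, 40)                  # aperture diameter, m
lam = rng.uniform(0.5, 10.0, 40)                # diffraction-limited wavelength, um
cost = 3.0 * D**1.7 * lam**-0.3 * rng.lognormal(sigma=0.2, size=40)  # $M, synthetic

A = np.column_stack([np.ones_like(D), np.log(D), np.log(lam)])
coef, *_ = np.linalg.lstsq(A, np.log(cost), rcond=None)
print(f"cost ~ {np.exp(coef[0]):.2f} * D^{coef[1]:.2f} * lambda^{coef[2]:.2f}")
```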
A probabilistic strategy for parametric catastrophe insurance
NASA Astrophysics Data System (ADS)
Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin
2017-04-01
Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss events. Due to the nature of parametric programmes, it is still necessary to clearly define when a payout is due or not, and so a decision threshold probability above which a loss event is considered to occur must be set, effectively converting the issued probabilities into deterministic binary outcomes. Model skill and value are evaluated over the range of possible threshold probabilities, with the objective of defining the optimal one. The predictive ability of the model is assessed. In terms of value assessment, a decision model is proposed, allowing users to quantify monetarily their expected expenses when different combinations of model event triggering and actual event occurrence take place, directly tackling the problem of basis risk.
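A minimal sketch of the strategy described above: a logistic regression maps a loss-index variable to a probability of a loss event, and a decision threshold converts that probability into a binary payout trigger. The simulated data, index definition, and threshold value are illustrative; the counts of false alarms and misses are the two faces of basis risk discussed in the abstract.

```python
# Minimal sketch of the probabilistic trigger idea: a logistic model maps a
# rainfall-based loss index to a probability of a loss event, and a decision
# threshold converts that probability into a binary payout. Data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000
loss_index = rng.gamma(shape=2.0, scale=20.0, size=n)        # transformed rainfall
p_event = 1 / (1 + np.exp(-(loss_index - 80.0) / 10.0))
event = rng.binomial(1, p_event)                             # observed loss events

model = LogisticRegression().fit(loss_index.reshape(-1, 1), event)
prob = model.predict_proba(loss_index.reshape(-1, 1))[:, 1]

threshold = 0.6                                              # candidate decision threshold
payout = prob >= threshold
hits = np.sum(payout & (event == 1))
false_alarms = np.sum(payout & (event == 0))                 # payout without a loss
misses = np.sum(~payout & (event == 1))                      # loss without a payout
print(f"threshold={threshold}: hits={hits}, false alarms={false_alarms}, misses={misses}")
```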
Castillo, Maria Isabel; Larsen, Emily; Cooke, Marie; Marsh, Nicole M; Wallis, Marianne C; Finucane, Julie; Brown, Peter; Mihala, Gabor; Byrnes, Joshua; Walker, Rachel; Cable, Prudence; Zhang, Li; Sear, Candi; Jackson, Gavin; Rowsome, Anna; Ryan, Alison; Humphries, Julie C; Sivyer, Susan; Flanigan, Kathy; Rickard, Claire M
2018-01-01
Introduction: Peripheral intravenous catheters (PIVCs) are frequently used in hospitals. However, PIVC complications are common, with failures leading to treatment delays, additional procedures, patient pain and discomfort, increased clinician workload and substantially increased healthcare costs. Recent evidence suggests integrated PIVC systems may be more effective than traditional non-integrated PIVC systems in reducing phlebitis, infiltration and costs and increasing functional dwell time. The study aim is to determine the efficacy, cost–utility and acceptability to patients and professionals of an integrated PIVC system compared with a non-integrated PIVC system. Methods and analysis: Two-arm, multicentre, randomised controlled superiority trial of integrated versus non-integrated PIVC systems to compare effectiveness on clinical and economic outcomes. Recruitment of 1560 patients over 2 years, with randomisation by a centralised service ensuring allocation concealment. Primary outcomes: catheter failure (composite endpoint) for reasons of: occlusion, infiltration/extravasation, phlebitis/thrombophlebitis, dislodgement, localised or catheter-associated bloodstream infections. Secondary outcomes: first time insertion success, types of PIVC failure, device colonisation, insertion pain, functional dwell time, adverse events, mortality, cost–utility and consumer acceptability. One PIVC per patient will be included, with intention-to-treat analysis. Baseline group comparisons will be made for potentially clinically important confounders. The proportional hazards assumption will be checked, and Cox regression will test the effect of group, patient, device and clinical variables on failure. An as-treated analysis will assess the effect of protocol violations. Kaplan-Meier survival curves with log-rank tests will compare failure by group over time. Secondary endpoints will be compared between groups using parametric/non-parametric techniques. Ethics and dissemination: Ethical approval from the Royal Brisbane and Women’s Hospital Human Research Ethics Committee (HREC/16/QRBW/527), Griffith University Human Research Ethics Committee (Ref No. 2017/002) and the South Metropolitan Health Services Human Research Ethics Committee (Ref No. 2016–239). Results will be published in peer-reviewed journals. Trial registration number: ACTRN12617000089336. PMID:29764876
Parametric Model Based On Imputations Techniques for Partly Interval Censored Data
NASA Astrophysics Data System (ADS)
Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah
2017-12-01
The term ‘survival analysis’ has been used in a broad sense to describe a collection of statistical procedures for data analysis in which the outcome variable of interest is the time until an event occurs, where the time to failure of a specific experimental unit may be censored: right, left, interval, or partly interval censored (PIC) data. In this paper, analysis of this model was conducted based on a parametric Cox model via PIC data. Moreover, several imputation techniques were used: midpoint, left & right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, such as the Turnbull and Cox models, based on clinical trial data (breast cancer data), which showed the validity of the proposed model. Results for the data set indicated that the parametric Cox model is superior in terms of estimation of survival functions, likelihood ratio tests, and their P-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median showed better results with respect to the estimation of the survival function.
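The simpler of the imputation schemes named above can be sketched directly; the intervals below are invented, and after imputation any parametric survival model can be fitted to the completed times.

```python
# Sketch of simple imputation schemes for interval-censored failure times,
# where each time is known only to lie in [left, right]. The intervals are
# made up; the paper also considers mean and median imputation schemes.
import numpy as np

rng = np.random.default_rng(4)
left  = np.array([2.0, 5.0, 1.0, 7.0, 3.0])
right = np.array([4.0, 9.0, 3.0, 12.0, 6.0])

schemes = {
    "midpoint":    (left + right) / 2.0,
    "left point":  left,
    "right point": right,
    "random":      rng.uniform(left, right),
}
for name, times in schemes.items():
    print(f"{name:12s}: {np.round(times, 2)}")
```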
Wavelet Filtering to Reduce Conservatism in Aeroservoelastic Robust Stability Margins
NASA Technical Reports Server (NTRS)
Brenner, Marty; Lind, Rick
1998-01-01
Wavelet analysis for filtering and system identification was used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins was reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data was used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability were also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrated improved robust stability prediction by extension of the stability boundary beyond the flight regime.
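A generic wavelet-thresholding filter of the kind referred to here can be sketched with PyWavelets; the decaying sinusoid, wavelet choice, and universal-threshold rule are illustrative and are not the flight-data processing used in the study.

```python
# Generic wavelet-denoising sketch (not the paper's flight-data pipeline):
# decompose a noisy signal, soft-threshold the detail coefficients, and
# reconstruct. The test signal and threshold rule are illustrative only.
import numpy as np
import pywt

t = np.linspace(0.0, 1.0, 1024)
clean = np.sin(2 * np.pi * 12 * t) * np.exp(-2 * t)   # decaying "modal" response
noisy = clean + 0.3 * np.random.default_rng(5).standard_normal(t.size)

coeffs = pywt.wavedec(noisy, "db6", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest level
thr = sigma * np.sqrt(2 * np.log(noisy.size))         # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db6")

rms = np.sqrt(np.mean((denoised[:clean.size] - clean) ** 2))
print("residual RMS after denoising:", float(rms))
```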
NASA Technical Reports Server (NTRS)
Kamhawi, Hilmi N.
2012-01-01
This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML) as a framework and supporting the configuration design and parametric CFD grid generation. This report will focus on describing the work in the area of parametric CFD grid generation using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.
Two-sample tests and one-way MANOVA for multivariate biomarker data with nondetects.
Thulin, M
2016-09-10
Testing whether the mean vector of a multivariate set of biomarkers differs between several populations is an increasingly common problem in medical research. Biomarker data is often left censored because some measurements fall below the laboratory's detection limit. We investigate how such censoring affects multivariate two-sample and one-way multivariate analysis of variance tests. Type I error rates, power and robustness to increasing censoring are studied, under both normality and non-normality. Parametric tests are found to perform better than non-parametric alternatives, indicating that the current recommendations for analysis of censored multivariate data may have to be revised. Copyright © 2016 John Wiley & Sons, Ltd.
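A simplified, univariate illustration of the censoring issue studied above is sketched below: nondetects are replaced by half the detection limit and a parametric t-test is compared with a nonparametric rank test. The data are simulated, and the paper's actual analysis is multivariate (two-sample tests and one-way MANOVA).

```python
# Simplified univariate illustration of the nondetect problem: values below the
# detection limit are replaced by DL/2, then a parametric t-test and a
# nonparametric rank test are compared. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
dl = 0.5                                               # detection limit
group1 = rng.lognormal(mean=-0.5, sigma=0.6, size=40)  # biomarker concentrations
group2 = rng.lognormal(mean=-0.1, sigma=0.6, size=40)

def substitute(x):
    # replace values below the detection limit by half the limit
    return np.where(x < dl, dl / 2.0, x)

g1, g2 = substitute(group1), substitute(group2)
t_stat, p_t = stats.ttest_ind(np.log(g1), np.log(g2))   # parametric, on log scale
u_stat, p_u = stats.mannwhitneyu(g1, g2)                # nonparametric rank test
print(f"t-test p = {p_t:.4f}, Mann-Whitney p = {p_u:.4f}")
```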
Trajectories for High Specific Impulse High Specific Power Deep Space Exploration
NASA Technical Reports Server (NTRS)
Polsgrove, T.; Adams, R. B.; Brady, Hugh J. (Technical Monitor)
2002-01-01
Preliminary results are presented for two methods to approximate the mission performance of high specific impulse high specific power vehicles. The first method is based on an analytical approximation derived by Williams and Shepherd and can be used to approximate mission performance to outer planets and interstellar space. The second method is based on a parametric analysis of trajectories created using the well known trajectory optimization code, VARITOP. This parametric analysis allows the reader to approximate payload ratios and optimal power requirements for both one-way and round-trip missions. While this second method only addresses missions to and from Jupiter, future work will encompass all of the outer planet destinations and some interstellar precursor missions.
Service, Susan; Molina, Julio; Deyoung, Joseph; Jawaheer, Damini; Aldana, Ileana; Vu, Thuy; Araya, Carmen; Araya, Xinia; Bejarano, Julio; Fournier, Eduardo; Ramirez, Magui; Mathews, Carol A; Davanzo, Pablo; Macaya, Gabriel; Sandkuijl, Lodewijk; Sabatti, Chiara; Reus, Victor; Freimer, Nelson
2006-06-05
We have ascertained in the Central Valley of Costa Rica a new kindred (CR201) segregating for severe bipolar disorder (BP-I). The family was identified by tracing genealogical connections among eight persons initially independently ascertained for a genome wide association study of BP-I. For the genome screen in CR201, we trimmed the family down to 168 persons (82 of whom are genotyped), containing 25 individuals with a best-estimate diagnosis of BP-I. A total of 4,690 SNP markers were genotyped. Analysis of the data was hampered by the size and complexity of the pedigree, which prohibited using exact multipoint methods on the entire kindred. Two-point parametric linkage analysis, using a conservative model of transmission, produced a maximum LOD score of 2.78 on chromosome 6, and a total of 39 loci with LOD scores >1.0. Multipoint parametric and non-parametric linkage analysis was performed separately on four sections of CR201, and interesting (nominal P-value from either analysis <0.01), although not statistically significant, regions were highlighted on chromosomes 1, 2, 3, 12, 16, 19, and 22, in at least one section of the pedigree, or when considering all sections together. The difficulties of analyzing genome wide SNP data for complex disorders in large, potentially informative, kindreds are discussed.
Serum and Plasma Metabolomic Biomarkers for Lung Cancer.
Kumar, Nishith; Shahjaman, Md; Mollah, Md Nurul Haque; Islam, S M Shahinul; Hoque, Md Aminul
2017-01-01
In drug invention and early disease prediction for lung cancer, metabolomic biomarker detection is very important. The mortality rate can be decreased if cancer is predicted at an earlier stage. Recent diagnostic techniques for lung cancer are not prognostic. However, if we know which metabolites have intensity levels that change considerably between cancer subjects and control subjects, then it will be easier to diagnose the disease early as well as to discover drugs. Therefore, in this paper we have identified the influential plasma and serum blood sample metabolites for lung cancer and also identified the biomarkers that will be helpful for early disease prediction as well as for drug invention. To identify the influential metabolites, we considered a parametric and a nonparametric test, namely Student's t-test and the Kruskal-Wallis test. We also categorized the up-regulated and down-regulated metabolites by the heatmap plot and identified the biomarkers by a support vector machine (SVM) classifier and pathway analysis. From our analysis, we obtained 27 influential (p-value < 0.05) metabolites from the plasma sample and 13 influential (p-value < 0.05) metabolites from the serum sample. According to the importance plot from the SVM classifier, pathway analysis and correlation network analysis, we declared 4 metabolites (taurine, aspartic acid, glutamine and pyruvic acid) as plasma biomarkers and 3 metabolites (aspartic acid, taurine and inosine) as serum biomarkers.
Geometric Model for a Parametric Study of the Blended-Wing-Body Airplane
NASA Technical Reports Server (NTRS)
Mastin, C. Wayne; Smith, Robert E.; Sadrehaghighi, Ideen; Wiese, Michael R.
1996-01-01
A parametric model is presented for the blended-wing-body airplane, one concept being proposed for the next generation of large subsonic transports. The model is defined in terms of a small set of parameters which facilitates analysis and optimization during the conceptual design process. The model is generated from a preliminary CAD geometry. From this geometry, airfoil cross sections are cut at selected locations and fitted with analytic curves. The airfoils are then used as boundaries for surfaces defined as the solution of partial differential equations. Both the airfoil curves and the surfaces are generated with free parameters selected to give a good representation of the original geometry. The original surface is compared with the parametric model, and solutions of the Euler equations for compressible flow are computed for both geometries. The parametric model is a good approximation of the CAD model and the computed solutions are qualitatively similar. An optimal NURBS approximation is constructed and can be used by a CAD model for further refinement or modification of the original geometry.
Parametric Study and Design of Tab Shape for Improving Aerodynamic Performance of Rotor Blade
NASA Astrophysics Data System (ADS)
Han, Jaeseong; Kwon, Oh Joon
2018-04-01
In the present study, a parametric study was performed to analyze the effect of a tab on the aerodynamic performance and characteristics of rotor blades, and a tab shape was designed to improve the aerodynamic performance of the blades. A computational fluid dynamics solver based on the three-dimensional Reynolds-averaged Navier-Stokes equations using an unstructured mesh was used for the parametric study and the tab design. For airfoils, the effect of tab length and angle on the aerodynamic characteristics was studied. In addition, including those parameters, the effect of tab span was studied for rotor blades in hovering flight. The results of the parametric study were analyzed in terms of changes in aerodynamic performance and characteristics to understand the effect of a tab. Based on this analysis, the tab shape was designed to improve the aerodynamic performance of the rotor blades. A tab simply attached to the trailing edge of the rotor blades increases the thrust of the rotor blades without significantly changing their aerodynamic characteristics in hovering and forward flight.
Investigating the possibility of a turning point in the dark energy equation of state
NASA Astrophysics Data System (ADS)
Hu, YaZhou; Li, Miao; Li, XiaoDong; Zhang, ZhenHui
2014-08-01
We investigate a second-order parabolic parametrization, w(a) = wt + wa (at − a)², which is a direct characterization of a possible turning in w. The cosmological consequence of this parametrization is explored by using the observational data of the SNLS3 type Ia supernovae sample, the CMB measurements from WMAP9 and Planck, the Hubble parameter measurement from HST, and the baryon acoustic oscillation (BAO) measurements from 6dFGS, BOSS DR11 and improved WiggleZ. We found that the existence of a turning point in w at a ≈ 0.7 is favored at the 1σ CL. In the epoch 0.55 < a < 0.9, w < −1 is favored at the 1σ CL, and this significance increases near a = 0.8, reaching the 2σ CL. The parabolic parametrization achieves performance equivalent to the ΛCDM and Chevallier-Polarski-Linder (CPL) models when the Akaike information criterion is used to assess them. Our analysis shows the value of considering higher-order parametrizations when studying the cosmological constraints on w.
Parametric study of potential early commercial MHD power plants
NASA Technical Reports Server (NTRS)
Hals, F. A.
1979-01-01
Three different reference power plant configurations were considered with parametric variations of the various design parameters for each plant. Two of the reference plant designs were based on the use of high temperature regenerative air preheaters separately fired by a low Btu gas produced from a coal gasifier which was integrated with the power plant. The third reference plant design was based on the use of oxygen enriched combustion air preheated to a more moderate temperature in a tubular type metallic recuperative heat exchanger which is part of the bottoming plant heat recovery system. Comparative information was developed on plant performance and economics. The highest net plant efficiency of about 45 percent was attained by the reference plant design with the use of a high temperature air preheater separately fired with the advanced entrained bed gasifier. The use of oxygen enrichment of the combustion air yielded the lowest cost of generating electricity at a slightly lower plant efficiency. Both of these two reference plant designs are identified as potentially attractive for early MHD power plant applications.
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has a small sample size limitation. We used a pooled method in the nonparametric bootstrap test that may overcome the problem related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method corresponding to parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test while maintaining the type I error probability under all conditions except Cauchy and extreme variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
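A minimal pooled-resampling bootstrap test for a difference in means is sketched below; it follows the general idea described in the abstract but is not necessarily identical to the authors' procedure, and the small skewed samples are simulated.

```python
# Minimal pooled-resampling bootstrap test for a difference in means: under the
# null hypothesis both groups are resampled from the pooled sample, and the
# observed difference is compared with the bootstrap distribution. This is a
# generic sketch, not necessarily identical to the authors' procedure.
import numpy as np

def pooled_bootstrap_test(x, y, n_boot=10000, seed=0):
    rng = np.random.default_rng(seed)
    observed = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(pooled, size=x.size, replace=True)
        yb = rng.choice(pooled, size=y.size, replace=True)
        diffs[b] = xb.mean() - yb.mean()
    return np.mean(np.abs(diffs) >= abs(observed))   # two-sided p-value

rng = np.random.default_rng(7)
x = rng.exponential(scale=1.0, size=8)               # small, skewed samples
y = rng.exponential(scale=2.0, size=8)
print("pooled bootstrap p-value:", pooled_bootstrap_test(x, y))
```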
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most updated information and newly added models.
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
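The standard trick for applying a right-censoring Kaplan-Meier routine to left-censored concentrations is to flip the data about a constant larger than all observations; the sketch below does this with the Python lifelines package rather than the authors' S-language software, and the concentrations and detection limits are invented.

```python
# Sketch of the "flipping" trick for left-censored concentrations: subtracting
# every value from a constant larger than the data turns left-censoring into
# right-censoring, so an ordinary Kaplan-Meier routine can be used.
# Concentrations and detection limits below are made up.
import numpy as np
from lifelines import KaplanMeierFitter

values   = np.array([0.5, 1.0, 1.2, 0.8, 2.5, 3.1, 0.5, 1.0])  # value or detection limit
detected = np.array([0,   1,   1,   0,   1,   1,   0,   1])    # 0 = nondetect (< limit)

flip = values.max() + 1.0
kmf = KaplanMeierFitter()
kmf.fit(flip - values, event_observed=detected)     # right-censored after flipping
median_flipped = kmf.median_survival_time_          # median of the flipped variable
print("estimated median concentration:", flip - median_flipped)
```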
Model-independent fit to Planck and BICEP2 data
NASA Astrophysics Data System (ADS)
Barranco, Laura; Boubekeur, Lotfi; Mena, Olga
2014-09-01
Inflation is the leading theory to describe elegantly the initial conditions that led to structure formation in our Universe. In this paper, we present a novel phenomenological fit to the Planck, WMAP polarization (WP) and the BICEP2 data sets using an alternative parametrization. Instead of starting from inflationary potentials and computing the inflationary observables, we use a phenomenological parametrization due to Mukhanov, describing inflation by an effective equation of state, in terms of the number of e-folds and two phenomenological parameters α and β. Within such a parametrization, which captures the different inflationary models in a model-independent way, the values of the scalar spectral index ns, its running and the tensor-to-scalar ratio r are predicted, given a set of parameters (α, β). We perform a Markov Chain Monte Carlo analysis of these parameters, and we show that the combined analysis of Planck and WP data favors the Starobinsky and Higgs inflation scenarios. Assuming that the BICEP2 signal is not entirely due to foregrounds, the addition of this last data set prefers instead the ϕ² chaotic models. The constraint we get from Planck and WP data alone on the derived tensor-to-scalar ratio is r < 0.18 at 95% C.L., a value which is consistent with the one quoted from the BICEP2 Collaboration analysis, r = 0.16 (+0.06, −0.05), after foreground subtraction. This is not necessarily at odds with the 2σ tension found between Planck and BICEP2 measurements when analyzing data in terms of the usual ns and r parameters, given that the parametrization used here, for the preferred value ns ≃ 0.96, allows only for a restricted parameter space in the usual (ns, r) plane.
Nonparametric Bayesian models for a spatial covariance.
Reich, Brian J; Fuentes, Montserrat
2012-01-01
A crucial step in the analysis of spatial data is to estimate the spatial correlation function that determines the relationship between a spatial process at two locations. The standard approach to selecting the appropriate correlation function is to use prior knowledge or exploratory analysis, such as a variogram analysis, to select the correct parametric correlation function. Rather than selecting a particular parametric correlation function, we treat the covariance function as an unknown function to be estimated from the data. We propose a flexible prior for the correlation function to provide robustness to the choice of correlation function. We specify the prior for the correlation function using spectral methods and the Dirichlet process prior, which is a common prior for an unknown distribution function. Our model does not require Gaussian data or spatial locations on a regular grid. The approach is demonstrated using a simulation study as well as an analysis of California air pollution data.
Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.
2015-01-01
Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has also become increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. Particularly, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
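A minimal sketch of multiple-imputation ROC analysis on simulated data is shown below: a partly missing biomarker is imputed several times, the AUC is computed on each completed data set, and the point estimates are averaged. The imputer, the missingness rate, and the simple pooling are illustrative and do not reproduce the authors' estimators.

```python
# Minimal sketch of multiple-imputation ROC analysis: a biomarker with missing
# values is imputed several times, the AUC is computed on each completed data
# set, and the point estimates are averaged. Data are simulated; the pooling
# shown here is only the simplest part of Rubin's rules.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)
n = 500
disease = rng.binomial(1, 0.4, size=n)
marker = 1.2 * disease + rng.normal(size=n)
covar = 0.8 * marker + rng.normal(size=n)            # auxiliary variable for imputation

marker_obs = marker.copy()
marker_obs[rng.random(n) < 0.3] = np.nan             # ~30% missing at random

aucs = []
for m in range(5):                                   # five imputed data sets
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    completed = imp.fit_transform(np.column_stack([marker_obs, covar]))
    aucs.append(roc_auc_score(disease, completed[:, 0]))
print("pooled AUC estimate:", round(float(np.mean(aucs)), 3))
```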
NASA Technical Reports Server (NTRS)
Hilburger, Mark W.; Starnes, James H., Jr.
2004-01-01
The results of a parametric study of the effects of initial imperfections on the buckling and postbuckling response of three unstiffened thin-walled compression-loaded graphite-epoxy cylindrical shells with different orthotropic and quasi-isotropic shell-wall laminates are presented. The imperfections considered include initial geometric shell-wall midsurface imperfections, shell-wall thickness variations, local shell-wall ply-gaps associated with the fabrication process, shell-end geometric imperfections, nonuniform applied end loads, and variations in the boundary conditions including the effects of elastic boundary conditions. A high-fidelity nonlinear shell analysis procedure that accurately accounts for the effects of these imperfections on the nonlinear responses and buckling loads of the shells is described. The analysis procedure includes a nonlinear static analysis that predicts stable response characteristics of the shells and a nonlinear transient analysis that predicts unstable response characteristics.
Parametric Thermal Soak Model for Earth Entry Vehicles
NASA Technical Reports Server (NTRS)
Agrawal, Parul; Samareh, Jamshid; Doan, Quy D.
2013-01-01
The analysis and design of an Earth Entry Vehicle (EEV) is multidisciplinary in nature, requiring the application of many disciplines. An integrated tool called Multi Mission System Analysis for Planetary Entry Descent and Landing, or M-SAPE, is being developed as part of the Entry Vehicle Technology project under the In-Space Technology program. Integration of a multidisciplinary problem is a challenging task. Automation of the execution process and data transfer among disciplines can be accomplished to provide significant benefits. Thermal soak analysis and temperature predictions of various interior components of the entry vehicle, including the impact foam and payload container, are part of the solution that M-SAPE will offer to spacecraft designers. The present paper focuses on the thermal soak analysis of an entry vehicle design based on the Mars Sample Return entry vehicle geometry and discusses a technical approach to develop parametric models for thermal soak analysis that will be integrated into M-SAPE. One of the main objectives is to identify the important parameters and to develop correlation coefficients so that, for a given trajectory, the peak payload temperature can be estimated from relevant trajectory parameters and vehicle geometry. The models are being developed for two primary thermal protection system (TPS) materials: 1) carbon phenolic, which was used for the Galileo and Pioneer Venus probes, and 2) Phenolic Impregnated Carbon Ablator (PICA), the TPS material for the Mars Science Laboratory mission. Several representative trajectories were selected from a very large trade space to include in the thermal analysis in order to develop an effective parametric thermal soak model. The selected trajectories covered a wide range of heatload and heatflux combinations. Non-linear, fully transient, thermal finite element simulations were performed for the selected trajectories to generate the temperature histories at the interior of the vehicle. Figure 1 shows the finite element model that was used for the simulations. The results indicate that it takes several hours for the thermal energy to soak into the interior of the vehicle and achieve maximum payload temperatures. In addition, a strong correlation between the heatload and peak payload container temperature is observed that will help establish the parametric thermal soak model.
Staid, Andrea; Watson, Jean -Paul; Wets, Roger J. -B.; ...
2017-07-11
Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured. We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Here, our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.
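The scenario-generation idea can be sketched as follows, with a Gaussian kernel density estimate standing in for the epi-spline error densities used in the paper; the historical errors and point forecast are simulated, and the equal scenario probabilities are a simplification.

```python
# Sketch of scenario generation from a non-parametric forecast-error density.
# A Gaussian kernel density estimate stands in for the epi-spline basis used in
# the paper; historical errors and the point forecast below are simulated.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(9)
hist_errors = rng.gamma(2.0, 15.0, size=1000) - 30.0   # skewed historical errors (MW)
error_density = gaussian_kde(hist_errors)              # non-parametric error density

point_forecast = 250.0                                 # MW, for the hour of interest
n_scenarios = 20
scenarios = point_forecast + error_density.resample(n_scenarios)[0]
scenarios = np.clip(scenarios, 0.0, None)              # wind power cannot be negative
probabilities = np.full(n_scenarios, 1.0 / n_scenarios)
print(np.round(np.sort(scenarios), 1))
```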
New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences
NASA Astrophysics Data System (ADS)
Lungaroni, M.; Murari, A.; Peluso, E.; Gelfusa, M.; Malizia, A.; Vega, J.; Talebzadeh, S.; Gaudio, P.
2016-04-01
In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Initial data processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Even though Support Vector Regression (SVR) is a method devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a method of filtering is investigated first, comparing it with the most popular alternative techniques. Then Support Vector Regression is applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives on the interpretation of the data.
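A minimal comparison of SVR-based filtering against a traditional moving-average filter, in the spirit of the tests described, might look like the following; the signal is synthetic, scikit-learn is assumed, and the hyperparameters are illustrative rather than those tuned in the paper.

```python
import numpy as np
from sklearn.svm import SVR

# Noisy synthetic signal standing in for a Lidar range-corrected profile.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 400)
clean = np.exp(-0.3 * x) * np.sin(2 * x)
noisy = clean + rng.normal(0, 0.08, x.size)

# SVR used as a non-parametric filter.
svr = SVR(kernel='rbf', C=10.0, epsilon=0.05, gamma=2.0)
svr_fit = svr.fit(x.reshape(-1, 1), noisy).predict(x.reshape(-1, 1))

# Traditional alternative: a simple moving-average filter.
kernel = np.ones(15) / 15.0
moving_avg = np.convolve(noisy, kernel, mode='same')

for name, est in [('SVR', svr_fit), ('moving average', moving_avg)]:
    print(name, np.sqrt(np.mean((est - clean) ** 2)))   # RMSE against the clean signal
```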
Quantification of soil water retention parameters using multi-section TDR-waveform analysis
NASA Astrophysics Data System (ADS)
Baviskar, S. M.; Heimovaara, T. J.
2017-06-01
Soil water retention parameters are important for describing flow in variably saturated soils. TDR is one of the standard methods used for determining water content in soil samples. In this study, we present an approach to estimate the water retention parameters of a sample which is initially saturated and then subjected to incremental decreases in boundary head, causing it to drain in a multi-step fashion. TDR waveforms are measured along the height of the sample, at assumed hydrostatic conditions, at daily intervals. The cumulative discharge outflow drained from the sample is also recorded. The saturated water content is obtained by volumetric analysis after the final step of the multi-step drainage. The equation obtained by coupling the unsaturated parametric function and the apparent dielectric permittivity is fitted to a TDR wave propagation forward model. The unsaturated parametric function is used to spatially interpolate the water contents along the TDR probe. The cumulative discharge outflow data are fitted with the cumulative discharge estimated using the unsaturated parametric function. The weight of water inside the sample estimated at the first and final boundary heads of the multi-step drainage is fitted with the corresponding weights calculated using the unsaturated parametric function. A Bayesian optimization scheme is used to obtain optimized water retention parameters for these different objective functions. This approach can be used for tall samples and is especially suitable for characterizing sands with a uniform particle size distribution at low capillary heads.
Solid state SPS microwave generation and transmission study. Volume 2, phase 2: Appendices
NASA Technical Reports Server (NTRS)
Maynard, O. E.
1980-01-01
The solid state sandwich concept for SPS was further defined. The design effort concentrated on the spacetenna, but did include some system analysis for parametric comparison reasons. Basic solid state microwave devices were defined and modeled. An initial conceptual subsystems and system design was performed as well as sidelobe control and system selection. The selected system concept and parametric solid state microwave power transmission system data were assessed relevant to the SPS concept. Although device efficiency was not a goal, the sensitivities to design of this efficiency were parametrically treated. Sidelobe control consisted of various single step tapers, multistep tapers and Gaussian tapers. A hybrid concept using tubes and solid state was evaluated. Thermal analyses are included with emphasis on sensitivities to waste heat radiator form factor, emissivity, absorptivity, amplifier efficiency, material and junction temperature.
Keeping nurses at work: a duration analysis.
Holmås, Tor Helge
2002-09-01
A shortage of nurses is currently a problem in several countries, and an important question is therefore how one can increase the supply of nursing labour. In this paper, we focus on the issue of nurses leaving the public health sector by utilising a unique data set containing information on both the supply and demand side of the market. To describe the exit rate from the health sector we apply a semi-parametric hazard rate model. In the estimations, we correct for unobserved heterogeneity by both a parametric (Gamma) and a non-parametric approach. We find that both wages and working conditions have an impact on nurses' decision to quit. Furthermore, failing to correct for the fact that nurses' income partly consists of compensation for inconvenient working hours results in a considerable downward bias of the wage effect. Copyright 2002 John Wiley & Sons, Ltd.
THz-wave parametric source and its imaging applications
NASA Astrophysics Data System (ADS)
Kawase, Kodo
2004-08-01
Widely tunable coherent terahertz (THz) wave generation has been demonstrated based on the parametric oscillation using MgO doped LiNbO3 crystal pumped by a Q-switched Nd:YAG laser. This method exhibits multiple advantages like wide tunability, coherency and compactness of its system. We have developed a novel basic technology for terahertz (THz) imaging, which allows detection and identification of chemicals by introducing the component spatial pattern analysis. The spatial distributions of the chemicals were obtained from terahertz multispectral transillumination images, using absorption spectra previously measured with a widely tunable THz-wave parametric oscillator. Further we have applied this technique to the detection and identification of illicit drugs concealed in envelopes. The samples we used were methamphetamine and MDMA, two of the most widely consumed illegal drugs in Japan, and aspirin as a reference.
Stick balancing with reflex delay in case of parametric forcing
NASA Astrophysics Data System (ADS)
Insperger, Tamas
2011-04-01
The effect of parametric forcing on the PD control of an inverted pendulum is analyzed in the presence of feedback delay. The stability of the time-periodic and time-delayed system is determined numerically using the first-order semi-discretization method in the 5-dimensional parameter space of the pendulum's length, the forcing frequency, the forcing amplitude, and the proportional and differential gains. It is shown that the critical length of the pendulum (the shortest length that can just be balanced in the presence of the time delay) can be significantly decreased by parametric forcing, even if the maximum forcing acceleration is limited. The numerical analysis shows that the critical stick length of about 30 cm for the unforced system with a reflex delay of 0.1 s can be decreased to 18 cm while keeping the maximum acceleration below the gravitational acceleration.
NASA Astrophysics Data System (ADS)
Castellanos, P.; Boersma, F. F.
2011-12-01
We present a trend analysis of tropospheric NO2 for the period 2004-2010. NO2 trend analyses, which are necessary for monitoring pollution abatement strategies, are often based on surface networks, which suffer from high NO2 biases and spatial representativity issues inherent to the standard monitoring method (thermal reduction of NO2 followed by reaction with ozone and chemiluminescence). Space-based NO2 trends are unbiased and self-consistent, but over Europe they have not been as obvious as those observed over North America and East Asia. In this work we exploit the daily NO2 column observations from the Ozone Monitoring Instrument (OMI) in order to isolate long-term (timescales greater than one year) variability in NO2 over Europe without imposing a parametric fit to the data. In general, we find between 2005 and 2008, 1-5% per year declines in NO2 concentration in many polluted regions (e.g. Germany, Netherlands, Belgium, Italy, Spain), but also 1-5% per year increases over the English Channel and the southern North Sea (a major shipping channel), as well as the United Kingdom, northern France and Eastern Europe. In 2009, NO2 almost exclusively decreased over Europe at a rate of 5-10% per year, coinciding with the abrupt decrease in industrial production and construction prompted by the global economic crisis. By 2010, the NO2 rate of change had returned to pre-2009 levels in many areas, suggesting economic recovery. We employ a simple fitting model to separate the forcing by meteorological variability, which can influence apparent NO2 trends, from that of NOx emissions. We calculate 1-3% per year NOx emissions reduction rates over most of Europe and an additional 15-30% per year decrease in NOx emissions during the economic crisis period.
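A simple fitting model of the kind mentioned, combining a linear trend with an annual harmonic, can be sketched as below; the monthly series is synthetic, and in the study meteorological proxies would enter as additional regressors to separate their influence from emission changes.

```python
import numpy as np

# Synthetic monthly NO2 column series: a weak downward trend plus an annual cycle.
months = np.arange(72)                                   # six years of monthly data
rng = np.random.default_rng(0)
no2 = (10 - 0.02 * months + 1.5 * np.cos(2 * np.pi * months / 12)
       + rng.normal(0, 0.3, months.size))                # 10^15 molecules cm^-2

design = np.column_stack([
    np.ones_like(months, dtype=float),                   # mean level
    months,                                              # linear trend
    np.cos(2 * np.pi * months / 12),                     # annual cycle
    np.sin(2 * np.pi * months / 12),
])
coef, *_ = np.linalg.lstsq(design, no2, rcond=None)
trend_per_year = 12 * coef[1]
print(f"trend: {trend_per_year:.3f} per year ({100 * trend_per_year / coef[0]:.1f}% of the mean level)")
```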
Islas-Granillo, H; Borges-Yañez, SA; Medina-Solís, CE; Galan-Vidal, CA; Navarrete-Hernández, JJ; Escoffié-Ramirez, M; Maupomé, G
2014-01-01
ABSTRACT Objective: To compare a limited array of chewing-stimulated saliva features (salivary flow, pH and buffer capacity) in a sample of elderly Mexicans with clinical, sociodemographic and socio-economic variables. Subjects and Methods: A cross-sectional study was carried out in 139 adults, 60 years old and older, from two retirement homes and a senior day care centre in the city of Pachuca, Mexico. Socio-demographic, socio-economic and behavioural variables were collected through a questionnaire. A trained and standardized examiner obtained the oral clinical variables. Chewing-stimulated saliva (paraffin method) was collected and the salivary flow rate, pH and buffer capacity were measured. The analysis was performed using non-parametric tests in Stata 9.0. Results: Mean age was 79.1 ± 9.8 years. Most of the subjects included were women (69.1%). Mean chewing-stimulated salivary flow was 0.75 ± 0.80 mL/minute, and the pH and buffer capacity were 7.88 ± 0.83 and 4.20 ± 1.24, respectively. Mean chewing-stimulated salivary flow varied (p < 0.05) across type of retirement home, tooth brushing frequency, number of missing teeth and use of dental prostheses. pH varied across the type of retirement home (p < 0.05) and marginally by age (p = 0.087); buffer capacity (p < 0.05) varied across type of retirement home, tobacco consumption and the number of missing teeth. Conclusions: These exploratory data add to the body of knowledge with regard to chewing-stimulated salivary features (salivary flow rate, pH and buffer capacity) and outline the variability of those features across selected sociodemographic, socio-economic and behavioural variables in a group of Mexican elders. PMID:25867562
Islas-Granillo, H; Borges-Yañez, S A; Medina-Solís, C E; Galan-Vidal, C A; Navarrete-Hernández, J J; Escoffié-Ramirez, M; Maupomé, G
2014-12-01
To compare a limited array of chewing-stimulated saliva features (salivary flow, pH and buffer capacity) in a sample of elderly Mexicans with clinical, sociodemographic and socio-economic variables. A cross-sectional study was carried out in 139 adults, 60 years old and older, from two retirement homes and a senior day care centre in the city of Pachuca, Mexico. Sociodemographic, socio-economic and behavioural variables were collected through a questionnaire. A trained and standardized examiner obtained the oral clinical variables. Chewing-stimulated saliva (paraffin method) was collected and the salivary flow rate, pH and buffer capacity were measured. The analysis was performed using non-parametric tests in Stata 9.0. Mean age was 79.1 ± 9.8 years. Most of the subjects included were women (69.1%). Mean chewing-stimulated salivary flow was 0.75 ± 0.80 mL/minute, and the pH and buffer capacity were 7.88 ± 0.83 and 4.20 ± 1.24, respectively. Mean chewing-stimulated salivary flow varied (p < 0.05) across type of retirement home, tooth brushing frequency, number of missing teeth and use of dental prostheses. pH varied across the type of retirement home (p < 0.05) and marginally by age (p = 0.087); buffer capacity (p < 0.05) varied across type of retirement home, tobacco consumption and the number of missing teeth. These exploratory data add to the body of knowledge with regard to chewing-stimulated salivary features (salivary flow rate, pH and buffer capacity) and outline the variability of those features across selected sociodemographic, socio-economic and behavioural variables in a group of Mexican elders.
NASA Astrophysics Data System (ADS)
Reynerson, Charles Martin
This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered as revenue producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates potential return on investments, initial investment requirements and number of years to return on the initial investment. Example cases are analyzed for both performance and cost driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability for multiple space business park markets.
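A deliberately simplified sketch of a mass-based life-cycle cost model and payback calculation of the kind described is shown below; every coefficient and revenue figure is invented for illustration and is not taken from the dissertation.

```python
# Minimal, purely illustrative mass-based cost model for a space facility.
def facility_cost(mass_kg, dev_cost_per_kg=0.15e6, launch_cost_per_kg=20e3):
    development = dev_cost_per_kg * mass_kg       # design, build and test (assumed rate)
    launch = launch_cost_per_kg * mass_kg         # transportation to orbit (assumed rate)
    return development + launch

def payback_years(mass_kg, annual_revenue, annual_ops_cost):
    """Years needed for net revenue to return the initial investment."""
    investment = facility_cost(mass_kg)
    net = annual_revenue - annual_ops_cost
    return float('inf') if net <= 0 else investment / net

# Hypothetical 80-tonne space business park with assumed revenues and costs.
print(payback_years(80e3, annual_revenue=3.5e9, annual_ops_cost=1.2e9))
```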
NASA Technical Reports Server (NTRS)
Smith, S. D.; Tevepaugh, J. A.; Penny, M. M.
1975-01-01
The exhaust plumes of the space shuttle solid rocket motors can have a significant effect on the base pressure and base drag of the shuttle vehicle. A parametric analysis was conducted to assess the sensitivity of the initial plume expansion angle of analytical solid rocket motor flow fields to various analytical input parameters and operating conditions. The results of the analysis are presented and conclusions reached regarding the sensitivity of the initial plume expansion angle to each parameter investigated. Operating conditions parametrically varied were chamber pressure, nozzle inlet angle, nozzle throat radius of curvature ratio and propellant particle loading. Empirical particle parameters investigated were mean size, local drag coefficient and local heat transfer coefficient. Sensitivity of the initial plume expansion angle to gas thermochemistry model and local drag coefficient model assumptions were determined.
Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Peng, E-mail: peng@ices.utexas.edu; Schwab, Christoph, E-mail: christoph.schwab@sam.math.ethz.ch
2016-07-01
We extend the reduced basis (RB) accelerated Bayesian inversion methods for affine-parametric, linear operator equations which are considered in [16,17] to non-affine, nonlinear parametric operator equations. We generalize the analysis of sparsity of parametric forward solution maps in [20] and of Bayesian inversion in [48,49] to the fully discrete setting, including Petrov–Galerkin high-fidelity (“HiFi”) discretization of the forward maps. We develop adaptive, stochastic collocation based reduction methods for the efficient computation of reduced bases on the parametric solution manifold. The nonaffinity and nonlinearity with respect to (w.r.t.) the distributed, uncertain parameters and the unknown solution is collocated; specifically, by the so-called Empirical Interpolation Method (EIM). For the corresponding Bayesian inversion problems, computational efficiency is enhanced in two ways: first, expectations w.r.t. the posterior are computed by adaptive quadratures with dimension-independent convergence rates proposed in [49]; the present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI for short). Second, we propose to perform the Bayesian estimation only w.r.t. a parsimonious, RB approximation of the posterior density. Based on the approximation results in [49], the infinite-dimensional parametric, deterministic forward map and operator admit N-term RB and EIM approximations which converge at rates which depend only on the sparsity of the parametric forward map. In several numerical experiments, the proposed algorithms exhibit dimension-independent convergence rates which equal, at least, the currently known rate estimates for N-term approximation. We propose to accelerate Bayesian estimation by first offline construction of reduced basis surrogates of the Bayesian posterior density. The parsimonious surrogates can then be employed for online data assimilation and for Bayesian estimation. They also open a perspective for optimal experimental design.
Uncertainty in determining extreme precipitation thresholds
NASA Astrophysics Data System (ADS)
Liu, Bingjun; Chen, Junfan; Chen, Xiaohong; Lian, Yanqing; Wu, Lili
2013-10-01
Extreme precipitation events are rare and occur mostly on a relatively small and local scale, which makes it difficult to set thresholds for extreme precipitation in a large basin. Based on long-term daily precipitation data from 62 observation stations in the Pearl River Basin, this study has assessed the applicability of the non-parametric, parametric, and detrended fluctuation analysis (DFA) methods for determining extreme precipitation thresholds (EPTs) and the certainty of the EPTs obtained from each method. Analyses from this study show the non-parametric absolute critical value method is easy to use, but unable to reflect differences in the spatial rainfall distribution. The non-parametric percentile method can account for the spatial distribution of precipitation, but its threshold value is sensitive to the size of the rainfall data series and to the selected percentile, which makes it difficult to determine reasonable threshold values for a large basin. The parametric method can provide the most apt description of extreme precipitation by fitting extreme precipitation distributions with probability distribution functions; however, the selection of probability distribution functions, the goodness-of-fit tests, and the size of the rainfall data series can greatly affect the fitting accuracy. In contrast to the non-parametric and parametric methods, which are unable to provide EPTs with certainty, the DFA method, although involving complicated computational processes, has proven to be the most appropriate method, able to provide a unique set of EPTs for a large basin with an uneven spatio-temporal precipitation distribution. The consistency of the spatial distribution of the DFA-based thresholds with the annual average precipitation, the coefficient of variation (CV), and the coefficient of skewness (CS) of the daily precipitation further shows that EPTs determined by the DFA method are more reasonable and applicable for the Pearl River Basin.
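The contrast between a non-parametric percentile threshold and a parametric threshold from a fitted distribution can be illustrated as follows; the wet-day rainfall sample is synthetic and the gamma family is an assumption of this sketch (the study itself weighs several candidate distributions and the DFA alternative).

```python
import numpy as np
from scipy import stats

# Synthetic wet-day rainfall amounts standing in for a station record (mm).
rng = np.random.default_rng(0)
wet_days = rng.gamma(shape=0.8, scale=12.0, size=5000)

# Non-parametric threshold: a chosen percentile of the empirical distribution.
percentile_ept = np.percentile(wet_days, 95)

# Parametric threshold: quantile of a fitted distribution (gamma assumed here).
shape, loc, scale = stats.gamma.fit(wet_days, floc=0)
parametric_ept = stats.gamma.ppf(0.95, shape, loc=loc, scale=scale)

# Both values depend on the chosen probability level and, in the parametric
# case, on the distribution family and its goodness of fit.
print(percentile_ept, parametric_ept)
```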
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
Determinants of gender differences in health among the elderly in Latin America.
Trujillo, Antonio J; Mroz, Thomas A; Piras, Claudia; Vernon, John A; Angeles, Gustavo
2010-01-01
This paper identifies the main gender differences in health and socio-economic characteristics of the elderly in four Latin American cities. Using locally weighted regressions as well as a flexible model specification that treats age non-parametrically, we investigate whether these unadjusted gender gaps in health are due to gender differences in the distribution of age and other explanatory variables. Interestingly, for all cities, the analyses show a gender gap in health in favour of males at each age. The gaps are larger when one uses functional impairment in mobility and personal self-care as indicators of an individual's health instead of self-reported health. Furthermore, controlling for demographic characteristics, baseline health and the availability of family support do little to change the disadvantage for women in measured health outcomes. Controlling for socio-economic variables does, however, reduce most of the gender differences in health.
A Conceptual Wing Flutter Analysis Tool for Systems Analysis and Parametric Design Study
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek
2003-01-01
An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate flutter instability boundaries of a typical wing when detailed structural and aerodynamic data are not available. Effects of changes in key flutter parameters can also be estimated in order to guide the conceptual design. This user-friendly software was developed using MathCad and Matlab codes. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely the Regier number and the Flutter number, with normalization factors based on wing torsion stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch-inertia radius of gyration. These parametric plots were compiled in a Chance-Vought Corporation report from a database of past experiments and wind tunnel test results. An example was presented for conceptual flutter analysis of the outer wing of a Blended-Wing-Body aircraft.
Parametric and experimental analysis using a power flow approach
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1988-01-01
Having defined and developed a structural power flow approach for the analysis of structure-borne transmission of structural vibrations, the technique is used to perform an analysis of the influence of structural parameters on the transmitted energy. As a base for comparison, the parametric analysis is first performed using a Statistical Energy Analysis approach and the results compared with those obtained using the power flow approach. The advantages of using structural power flow are thus demonstrated by comparing the type of results obtained by the two methods. Additionally, to demonstrate the advantages of using the power flow method and to show that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental investigation of structural power flow is also presented. Results are presented for an L-shaped beam for which an analytical solution has already been obtained. Furthermore, the various methods available to measure vibrational power flow are compared to investigate the advantages and disadvantages of each method.
Efficiency Analysis of Public Universities in Thailand
ERIC Educational Resources Information Center
Kantabutra, Saranya; Tang, John C. S.
2010-01-01
This paper examines the performance of Thai public universities in terms of efficiency, using a non-parametric approach called data envelopment analysis. Two efficiency models, the teaching efficiency model and the research efficiency model, are developed and the analysis is conducted at the faculty level. Further statistical analyses are also…
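As an illustration of the underlying technique rather than the authors' models, the sketch below solves an input-oriented CCR data envelopment analysis programme for a handful of hypothetical faculties with made-up inputs and outputs.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: columns are decision-making units (e.g. faculties).
X = np.array([[35, 20, 50, 28],        # input 1: academic staff
              [5.0, 2.5, 7.0, 3.0]])   # input 2: budget
Y = np.array([[900, 400, 1100, 700],   # output 1: graduates
              [60, 15, 80, 20]])       # output 2: publications
n = X.shape[1]

def efficiency(j0):
    """Input-oriented CCR efficiency of unit j0 (1.0 marks an efficient unit)."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.hstack([-X[:, [j0]], X])                 # sum_j lam_j x_ij <= theta x_i,j0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y]) # sum_j lam_j y_rj >= y_r,j0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method='highs')
    return res.x[0]

print([round(efficiency(j), 3) for j in range(n)])
```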
Parametric optimization of the MVC desalination plant with thermomechanical compressor
NASA Astrophysics Data System (ADS)
Blagin, E. V.; Biryuk, V. V.; Anisimov, M. Y.; Shimanov, A. A.; Gorshkalev, A. A.
2018-03-01
This article deals with parametric optimization of the Mechanical Vapour Compression (MVC) desalination plant with thermomechanical compressor. In this plants thermocompressor is used instead of commonly used centrifugal compressor. Influence of two main parameters was studied. These parameters are: inlet pressure and number of stages. Analysis shows that it is possible to achieve better plant performance in comparison with traditional MVC plant. But is required reducing the number of stages and utilization of low or high initial pressure with power consumption maximum at approximately 20-30 kPa.
Effects of cosmic rays on single event upsets
NASA Technical Reports Server (NTRS)
Venable, D. D.; Zajic, V.; Lowe, C. W.; Olidapupo, A.; Fogarty, T. N.
1989-01-01
Assistance was provided to the Brookhaven Single Event Upset (SEU) Test Facility. Computer codes were developed for fragmentation and secondary radiation affecting Very Large Scale Integration (VLSI) in space. A computer controlled CV (HP4192) test was developed for Terman analysis. Also developed were high speed parametric tests which are independent of operator judgment and a charge pumping technique for measurement of D(sub it) (E). The X-ray secondary effects, and parametric degradation as a function of dose rate were simulated. The SPICE simulation of static RAMs with various resistor filters was tested.
Four modes of optical parametric operation for squeezed state generation
NASA Astrophysics Data System (ADS)
Andersen, U. L.; Buchler, B. C.; Lam, P. K.; Wu, J. W.; Gao, J. R.; Bachor, H.-A.
2003-11-01
We report a versatile instrument, based on a monolithic optical parametric amplifier, which reliably generates four different types of squeezed light. We obtained vacuum squeezing, low power amplitude squeezing, phase squeezing and bright amplitude squeezing. We show a complete analysis of this light, including a full quantum state tomography. In addition we demonstrate the direct detection of the squeezed state statistics without the aid of a spectrum analyser. This technique makes the nonclassical properties directly visible and allows complete measurement of the statistical moments of the squeezed quadrature.
Optical realization of optimal symmetric real state quantum cloning machine
NASA Astrophysics Data System (ADS)
Hu, Gui-Yu; Zhang, Wen-Hai; Ye, Liu
2010-01-01
We present an experimentally uniform linear optical scheme to implement the optimal 1→2 symmetric and optimal 1→3 symmetric economical real state quantum cloning machine of the polarization state of the single photon. This scheme requires single-photon sources and two-photon polarization entangled state as input states. It also involves linear optical elements and three-photon coincidence. Then we consider the realistic realization of the scheme by using the parametric down-conversion as photon resources. It is shown that under certain condition, the scheme is feasible by current experimental technology.
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1978-01-01
The FORTRAN computing program predicts the flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuel of varying end point and hydrogen content specifications. The program has provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case. The report has sufficient detail for the information of most readers.
NASA Astrophysics Data System (ADS)
Romero, C.; McWilliam, M.; Macías-Pérez, J.-F.; Adam, R.; Ade, P.; André, P.; Aussel, H.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; de Petris, M.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Lagache, G.; Leclercq, S.; Lestrade, J.-F.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Revéret, V.; Ritacco, A.; Roussel, H.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.
2018-04-01
Context: In the past decade, sensitive, resolved Sunyaev-Zel'dovich (SZ) studies of galaxy clusters have become common. Whereas many previous SZ studies have parameterized the pressure profiles of galaxy clusters, non-parametric reconstructions will provide insights into the thermodynamic state of the intracluster medium. Aims: We seek to recover the non-parametric pressure profiles of the high redshift (z = 0.89) galaxy cluster CLJ 1226.9+3332 as inferred from SZ data from the MUSTANG, NIKA, Bolocam, and Planck instruments, which all probe different angular scales. Methods: Our non-parametric algorithm makes use of logarithmic interpolation, which under the assumption of ellipsoidal symmetry is analytically integrable. For MUSTANG, NIKA, and Bolocam we derive a non-parametric pressure profile independently and find good agreement among the instruments. In particular, we find that the non-parametric profiles are consistent with a fitted generalized Navarro-Frenk-White (gNFW) profile. Given the ability of Planck to constrain the total signal, we include a prior on the integrated Compton Y parameter as determined by Planck. Results: For a given instrument, constraints on the pressure profile diminish rapidly beyond the field of view. The overlap in spatial scales probed by these four datasets is therefore critical in checking for consistency between instruments. By using multiple instruments, our analysis of CLJ 1226.9+3332 covers a large radial range, from the central regions to the cluster outskirts: 0.05 R500 < r < 1.1 R500. This is a wider range of spatial scales than is typically recovered by SZ instruments. Similar analyses will be possible with the new generation of SZ instruments such as NIKA2 and MUSTANG2.
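The two ingredients named in the methods, a gNFW pressure profile and logarithmic (power-law) interpolation of binned pressures, can be sketched as follows; the parameter values are placeholders rather than the CLJ 1226.9+3332 fit.

```python
import numpy as np

def gnfw(r, P0=8.0, rs=0.3, a=1.05, b=5.5, c=0.3):
    """gNFW pressure profile P(r) = P0 / [(r/rs)^c (1 + (r/rs)^a)^((b-c)/a)]."""
    x = r / rs
    return P0 / (x**c * (1.0 + x**a) ** ((b - c) / a))

def log_interp(r, r_bins, p_bins):
    """Power-law interpolation of a binned pressure profile (linear in log-log)."""
    return np.exp(np.interp(np.log(r), np.log(r_bins), np.log(p_bins)))

r_bins = np.logspace(-2, 0.5, 8)      # Mpc, non-parametric pressure bins
p_bins = gnfw(r_bins)                 # pretend these were fitted to SZ data
r = np.logspace(-2, 0.5, 200)
# Maximum relative error of the log-interpolated profile against the smooth one.
print(np.max(np.abs(log_interp(r, r_bins, p_bins) / gnfw(r) - 1)))
```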
Sustainability, collapse and oscillations in a simple World-Earth model
NASA Astrophysics Data System (ADS)
Nitzbon, Jan; Heitzig, Jobst; Parlitz, Ulrich
2017-07-01
The Anthropocene is characterized by close interdependencies between the natural Earth system and the global human society, posing novel challenges to model development. Here we present a conceptual model describing the long-term co-evolution of the natural and socio-economic subsystems of Earth. While the climate is represented via a global carbon cycle, we use economic concepts to model socio-metabolic flows of biomass and fossil fuels between nature and society. A well-being-dependent parametrization of fertility and mortality governs human population dynamics. Our analysis focuses on assessing possible asymptotic states of the Earth system for a qualitative understanding of its complex dynamics rather than quantitative predictions. Low dimension and simple equations enable a parameter-space analysis allowing us to identify preconditions of several asymptotic states and hence fates of humanity and planet. These include a sustainable co-evolution of nature and society, a global collapse and everlasting oscillations. We consider different scenarios corresponding to different socio-cultural stages of human history. The necessity of accounting for the ‘human factor’ in Earth system models is highlighted by the finding that carbon stocks during the past centuries evolved contrary to what would ‘naturally’ be expected on a planet without humans. The intensity of biomass use and the contribution of ecosystem services to human well-being are found to be crucial determinants of the asymptotic state in a (pre-industrial) biomass-only scenario without capital accumulation. The capitalistic, fossil-based scenario reveals that trajectories with fundamentally different asymptotic states might still be almost indistinguishable during even a centuries-long transient phase. Given current human population levels, our study also supports the claim that besides reducing the global demand for energy, only the extensive use of renewable energies may pave the way into a sustainable future.
NASA Astrophysics Data System (ADS)
Perry, Dan; Nakamoto, Mark; Verghese, Nishath; Hurat, Philippe; Rouse, Rich
2007-03-01
Model-based hotspot detection and silicon-aware parametric analysis help designers optimize their chips for yield, area and performance without the high cost of applying foundries' recommended design rules. This set of DFM/ recommended rules is primarily litho-driven, but cannot guarantee a manufacturable design without imposing overly restrictive design requirements. This rule-based methodology of making design decisions based on idealized polygons that no longer represent what is on silicon needs to be replaced. Using model-based simulation of the lithography, OPC, RET and etch effects, followed by electrical evaluation of the resulting shapes, leads to a more realistic and accurate analysis. This analysis can be used to evaluate intelligent design trade-offs and identify potential failures due to systematic manufacturing defects during the design phase. The successful DFM design methodology consists of three parts: 1. Achieve a more aggressive layout through limited usage of litho-related recommended design rules. A 10% to 15% area reduction is achieved by using more aggressive design rules. DFM/recommended design rules are used only if there is no impact on cell size. 2. Identify and fix hotspots using a model-based layout printability checker. Model-based litho and etch simulation are done at the cell level to identify hotspots. Violations of recommended rules may cause additional hotspots, which are then fixed. The resulting design is ready for step 3. 3. Improve timing accuracy with a process-aware parametric analysis tool for transistors and interconnect. Contours of diffusion, poly and metal layers are used for parametric analysis. In this paper, we show the results of this physical and electrical DFM methodology at Qualcomm. We describe how Qualcomm was able to develop more aggressive cell designs that yielded a 10% to 15% area reduction using this methodology. Model-based shape simulation was employed during library development to validate architecture choices and to optimize cell layout. At the physical verification stage, the shape simulator was run at full-chip level to identify and fix residual hotspots on interconnect layers, on poly or metal 1 due to interaction between adjacent cells, or on metal 1 due to interaction between routing (via and via cover) and cell geometry. To determine an appropriate electrical DFM solution, Qualcomm developed an experiment to examine various electrical effects. After reporting the silicon results of this experiment, which showed sizeable delay variations due to lithography-related systematic effects, we also explain how contours of diffusion, poly and metal can be used for silicon-aware parametric analysis of transistors and interconnect at the cell-, block- and chip-level.
Parametric Methods for Determining the Characteristics of Long-Term Metal Strength
NASA Astrophysics Data System (ADS)
Nikitin, V. I.; Rybnikov, A. I.
2018-06-01
A large number of parametric methods have been proposed for calculating the characteristics of the long-term strength of metals. All of them are based on the fact that temperature and time are mutually compensating factors in the processes of metal degradation at high temperature under the action of a constant stress. The well-known Larson-Miller, Sherby-Dorn, Manson-Haferd, Graham-Walles, and Trunin parametric equations are analyzed. The widely used Larson-Miller parameter is subjected to a detailed analysis. The application of this parameter to the calculation of ultimate long-term strength for steels and alloys is substantiated, provided that the heat resistance follows an exponential dependence on temperature and a power-law dependence on stress. It is established that the coefficient C in the Larson-Miller equation is a characteristic of the heat resistance and is different for each material. Therefore, the use of a universal constant C = 20 in parametric calculations, as well as an a priori presetting of numerical C values for each individual group of materials, is unacceptable. It is shown how an exact value of the coefficient C can be determined for any material of interest, and how C can be obtained as a function of stress in cases where such a dependence is manifested. At present, the calculation of long-term strength characteristics can be performed to sufficient accuracy using the Larson-Miller parameter and the refinements described here, provided that the log σ-P dependence is linear and the calculations are performed within the interpolation range. The use of the presented recommendations makes it possible to obtain a linear parametric log σ-P dependence, which allows the values of ultimate long-term strength for different materials to be determined to sufficient accuracy.
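For reference, the Larson-Miller parameter is P = T (C + log10 t_r), with T the absolute temperature and t_r the rupture time; the sketch below shows how a value of P obtained at one temperature can be used to extrapolate rupture life at another, with C treated as a material-specific constant as the paper recommends (the C = 20 used here is purely a placeholder).

```python
import numpy as np

def larson_miller(T_kelvin, t_hours, C):
    """Larson-Miller parameter P = T (C + log10 t_r)."""
    return T_kelvin * (C + np.log10(t_hours))

def rupture_time(T_kelvin, P, C):
    """Invert P = T (C + log10 t_r) for the rupture time in hours."""
    return 10.0 ** (P / T_kelvin - C)

# Hypothetical example: rupture after 1e4 h at 873 K under a given stress.
# C is assumed to be 20 here purely for illustration; the paper stresses that
# C must be fitted to each material (and possibly to stress) from test data.
C = 20.0
P = larson_miller(873.0, 1.0e4, C)
print(rupture_time(823.0, P, C))   # extrapolated life at 823 K for the same stress
```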
Linkage analysis of high myopia susceptibility locus in 26 families.
Paget, Sandrine; Julia, Sophie; Vitezica, Zulma G; Soler, Vincent; Malecaze, François; Calvas, Patrick
2008-01-01
We conducted a linkage analysis in high myopia families to replicate suggestive results from chromosome 7q36 using a model of autosomal dominant inheritance and genetic heterogeneity. We also performed a genome-wide scan to identify novel loci. Twenty-six families, with at least two high-myopic subjects (ie. refractive value in the less affected eye of -5 diopters) in each family, were included. Phenotypic examination included standard autorefractometry, ultrasonographic eye length measurement, and clinical confirmation of the non-syndromic character of the refractive disorder. Nine families were collected de novo including 136 available members of whom 34 were highly myopic subjects. Twenty new subjects were added in 5 of the 17 remaining families. A total of 233 subjects were submitted to a genome scan using ABI linkage mapping set LMSv2-MD-10, additional markers in all regions where preliminary LOD scores were greater than 1.5 were used. Multipoint parametric and non-parametric analyses were conducted with the software packages Genehunter 2.0 and Merlin 1.0.1. Two autosomal recessive, two autosomal dominant, and four autosomal additive models were used in the parametric linkage analyses. No linkage was found using the subset of nine newly collected families. Study of the entire population of 26 families with a parametric model did not yield a significant LOD score (>3), even for the previously suggestive locus on 7q36. A non-parametric model demonstrated significant linkage to chromosome 7p15 in the entire population (Z-NPL=4.07, p=0.00002). The interval is 7.81 centiMorgans (cM) between markers D7S2458 and D7S2515. The significant interval reported here needs confirmation in other cohorts. Among possible susceptibility genes in the interval, certain candidates are likely to be involved in eye growth and development.
Borri, Marco; Schmidt, Maria A; Powell, Ceri; Koh, Dow-Mu; Riddell, Angela M; Partridge, Mike; Bhide, Shreerang A; Nutting, Christopher M; Harrington, Kevin J; Newbold, Katie L; Leach, Martin O
2015-01-01
To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes. To evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4), determined with cluster validation, produced the best separation between reducing and non-reducing clusters. The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes.
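The clustering workflow, scaling multi-parametric voxel features, choosing the number of clusters with an independent validation index and then partitioning, can be sketched as below; the feature matrix is random placeholder data, and silhouette scoring stands in for whichever validation criterion was used in the study.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Hypothetical voxel-wise feature matrix: one row per tumour voxel, columns are
# multi-parametric measures (e.g. a DCE-MRI parameter and an ADC value from DWI).
rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 2))            # placeholder data, not patient data

X = StandardScaler().fit_transform(features)     # put parameters on a common scale

# Cluster validation: pick k over a small candidate range.
scores = {}
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels, sample_size=2000, random_state=0)
best_k = max(scores, key=scores.get)

labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)
print(best_k, np.bincount(labels))               # number of voxels per sub-region
```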
Catastrophe risk data scoping for disaster risk finance in Asia
NASA Astrophysics Data System (ADS)
Millinship, Ian; Revilla-Romero, Beatriz
2017-04-01
Developing countries across Latin America, Africa, and Asia are some of the most exposed to natural catastrophes in the world. Over the last 20 years, Asia has borne almost half the estimated global economic cost of natural disasters - around $53 billion annually. Losses from natural disasters can damage growth and hamper economic development and, unlike in developed countries where risk is reallocated through re/insurance, these countries typically rely on budget reallocations and donor assistance in order to attempt to meet financing needs. There is currently an active international dialogue on the need to increase access to disaster risk financing solutions in Asia. The World Bank-GFDRR Disaster Risk Financing and Insurance Program, with financial support from the Rockefeller Foundation, is currently working to develop regional options for disaster risk financing for developing countries in Asia. The first stage of this process has been to evaluate available catastrophe data suitable to support the design and implementation of disaster risk financing mechanisms in selected Asian countries. This project was carried out by a consortium of JBA Risk Management, JBA Consulting, ImageCat and Cat Risk Intelligence. The project focuses on investigating potential data sources for fourteen selected countries in Asia, for flood, tropical cyclone, earthquake and drought perils. The project was carried out in four stages. The first phase focused on identifying and cataloguing live/dynamic hazard data sources, such as hazard gauging networks or Earth observation datasets, which could be used to inform a parametric trigger. Live data sources were identified that provide credibility, transparency, independence, frequent reporting, consistency and stability. Data were catalogued at regional level, and prioritised at local level for five countries: Bangladesh, Indonesia, Pakistan, Sri Lanka and Viet Nam. The second phase was to identify, catalogue and evaluate catastrophe risk models that could quantify risk and provide a view of risk to support the design and pricing of parametric disaster risk financing mechanisms. The third stage was to evaluate the usability of data sources and catastrophe models, and to develop index prototypes to outline how data and catastrophe models could be combined using local, regional and global data sources. Finally, the project identified priorities for investment to support the collection, analysis and evaluation of natural catastrophe data in order to support disaster risk financing.
Stochastic Modelling, Analysis, and Simulations of the Solar Cycle Dynamic Process
NASA Astrophysics Data System (ADS)
Turner, Douglas C.; Ladde, Gangaram S.
2018-03-01
Analytical solutions, discretization schemes and simulation results are presented for the time-delay deterministic differential equation model of the solar dynamo presented by Wilmot-Smith et al. In addition, this model is extended to include stochastic Gaussian white noise parametric fluctuations. The introduction of stochastic fluctuations incorporates variables affecting the dynamo process in the solar interior, estimation error of parameters, and uncertainty of the α-effect mechanism. Simulation results are presented and analyzed to exhibit the effects of stochastic parametric volatility-dependent perturbations. The results generalize and extend the work of Hazra et al. In fact, some of these results exhibit the oscillatory dynamic behavior generated by the stochastic parametric additive perturbations in the absence of time delay. In addition, the simulation results of the modified stochastic models influence the change in behavior of the very recently developed stochastic model of Hazra et al.
Spectral and energy characteristics of four-photon parametric scattering in sodium vapor
NASA Astrophysics Data System (ADS)
Vaicaitis, V.; Ignatavicius, M.; Kudriashov, V. A.; Pimenov, Iu. N.; Jakyte, R.
1987-04-01
Consideration is given to processes of four-photon interaction upon two-photon resonance excitation of the 3d level of sodium by two-frequency radiation from a monopulse picosecond YAG:Nd laser with frequency doubling and an optical parametric oscillator utilizing KDP crystals. The spatial and frequency spectra of the four-photon parametric scattering (FPS) are recorded and studied at different sodium vapor concentrations (10^15 to 10^17 per cu cm) and upon both collinear and noncollinear excitation. It is shown that the observed conical structure of the FPS radiation can be interpreted from an analysis of the realization of the frequency and spatial phase-matching conditions. The dependences of the FPS radiation intensity on the exciting radiation intensity, the sodium vapor concentration, and the detuning of the exciting radiation from the two-photon resonance are obtained.
Stress Recovery and Error Estimation for Shell Structures
NASA Technical Reports Server (NTRS)
Yazdani, A. A.; Riggs, H. R.; Tessler, A.
2000-01-01
The Penalized Discrete Least-Squares (PDLS) stress recovery (smoothing) technique developed for two dimensional linear elliptic problems is adapted here to three-dimensional shell structures. The surfaces are restricted to those which have a 2-D parametric representation, or which can be built-up of such surfaces. The proposed strategy involves mapping the finite element results to the 2-D parametric space which describes the geometry, and smoothing is carried out in the parametric space using the PDLS-based Smoothing Element Analysis (SEA). Numerical results for two well-known shell problems are presented to illustrate the performance of SEA/PDLS for these problems. The recovered stresses are used in the Zienkiewicz-Zhu a posteriori error estimator. The estimated errors are used to demonstrate the performance of SEA-recovered stresses in automated adaptive mesh refinement of shell structures. The numerical results are encouraging. Further testing involving more complex, practical structures is necessary.
Entangled Parametric Hierarchies: Problems for an Overspecified Universal Grammar
Boeckx, Cedric; Leivada, Evelina
2013-01-01
This study addresses the feasibility of the classical notion of parameter in linguistic theory from the perspective of parametric hierarchies. A novel program-based analysis is implemented in order to show certain empirical problems related to these hierarchies. The program was developed on the basis of an enriched data base spanning 23 contemporary and 5 ancient languages. The empirical issues uncovered cast doubt on classical parametric models of language acquisition as well as on the conceptualization of an overspecified Universal Grammar that has parameters among its primitives. Pinpointing these issues leads to the proposal that (i) the (bio)logical problem of language acquisition does not amount to a process of triggering innately pre-wired values of parameters and (ii) it paves the way for viewing language, epigenetic (‘parametric’) variation as an externalization-related epiphenomenon, whose learning component may be more important than what sometimes is assumed. PMID:24019867
Advanced extravehicular protective systems study, volume 2
NASA Technical Reports Server (NTRS)
Sutton, J. G.; Heimlich, P. F.; Tepper, E. H.
1972-01-01
The results of the subsystem studies are presented. Initial identification and evaluation of candidate subsystem concepts in the area of thermal control, humidity control, CO2 control/O2 supply, contaminant control and power supply are discussed. The candidate concepts that were judged to be obviously noncompetitive were deleted from further consideration and the remaining candidate concepts were carried into the go/no go evaluation. A detailed parametric analysis of each of the thermal/humidity control and CO2 control/O2 supply subsystem concepts which passed the go/no go evaluation is described. Based upon the results of the parametric analyses, primary and secondary evaluations of the remaining candidate concepts were conducted. These results and the subsystem recommendations emanating from these results are discussed. In addition, the parametric analyses of the recommended subsystem concepts were updated to reflect the final AEPS specification requirements. A detailed discussion regarding the selection of the AEPS operating pressure level is presented.
NASA Astrophysics Data System (ADS)
Dai, Jun; Zhou, Haigang; Zhao, Shaoquan
2017-01-01
This paper considers a multi-scale futures hedging strategy that minimizes lower partial moments (LPM). To do this, wavelet analysis is adopted to decompose time series data into different components. Next, different parametric estimation methods with known distributions are applied to calculate the LPM of hedged portfolios, which is the key to determining multi-scale hedge ratios over different time scales. These parametric methods are then compared with the prevailing nonparametric kernel metric method. Empirical results indicate that in the China Securities Index 300 (CSI 300) index futures and spot markets, hedge ratios and hedge efficiency estimated by the nonparametric kernel metric method are inferior to those estimated by the parametric hedging models based on the features of the sequence distributions. In addition, when minimum LPM is selected as the hedge target, the hedging period, the degree of risk aversion, and the target return can all affect the multi-scale hedge ratios and hedge efficiency.
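The core calculation, a hedge ratio chosen to minimize the lower partial moment of the hedged return, can be sketched as follows for a single series; in the multi-scale setting the same computation would be repeated on each wavelet component (e.g. obtained with PyWavelets). The return series below are synthetic, not CSI 300 data.

```python
import numpy as np

def lpm(returns, target=0.0, order=2):
    """Lower partial moment of the given order about a target return."""
    shortfall = np.clip(target - returns, 0.0, None)
    return np.mean(shortfall ** order)

def min_lpm_hedge_ratio(spot, futures, target=0.0, order=2):
    """Grid-search the hedge ratio h that minimizes the LPM of spot - h * futures."""
    grid = np.linspace(0.0, 2.0, 2001)
    lpms = [lpm(spot - h * futures, target, order) for h in grid]
    return grid[int(np.argmin(lpms))]

# Synthetic spot and futures returns standing in for market data.
rng = np.random.default_rng(1)
f = rng.normal(0, 0.012, 1000)
s = 0.9 * f + rng.normal(0, 0.004, 1000)
print(min_lpm_hedge_ratio(s, f))   # close to 0.9 for this synthetic example
```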
Latest astronomical constraints on some non-linear parametric dark energy models
NASA Astrophysics Data System (ADS)
Yang, Weiqiang; Pan, Supriya; Paliathanasis, Andronikos
2018-04-01
We consider non-linear redshift-dependent equation of state parameters as dark energy models in a spatially flat Friedmann-Lemaître-Robertson-Walker universe. To depict the expansion history of the universe in such cosmological scenarios, we take into account the large-scale behaviour of such parametric models and fit them using a set of the latest observational data of distinct origins, including cosmic microwave background radiation, Type Ia Supernovae, baryon acoustic oscillations, redshift space distortion, weak gravitational lensing, Hubble parameter measurements from cosmic chronometers, and finally the local Hubble constant from the Hubble Space Telescope. The fitting technique uses the publicly available code Cosmological Monte Carlo (CosmoMC) to extract the cosmological information from these parametric dark energy models. From our analysis, it follows that these models can describe the late-time accelerating phase of the universe, while they are distinguished from the Λ-cosmology.
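For a flat FLRW universe, any redshift-dependent w(z) enters the expansion rate through an integral factor; the sketch below uses one illustrative non-linear parametrization (an assumption of this sketch, not necessarily a form fitted in the paper) to compute E(z) = H(z)/H0.

```python
import numpy as np
from scipy.integrate import quad

def w(z, w0=-0.95, wa=0.2):
    """Illustrative non-linear equation of state: w(z) = w0 + wa * z / (1 + z)^2."""
    return w0 + wa * z / (1.0 + z) ** 2

def dark_energy_factor(z, w0, wa):
    """rho_DE(z)/rho_DE(0) = exp(3 * int_0^z (1 + w(z'))/(1 + z') dz')."""
    integral, _ = quad(lambda zp: (1.0 + w(zp, w0, wa)) / (1.0 + zp), 0.0, z)
    return np.exp(3.0 * integral)

def E(z, Om=0.3, w0=-0.95, wa=0.2):
    """Dimensionless expansion rate H(z)/H0 for a spatially flat universe."""
    return np.sqrt(Om * (1 + z) ** 3 + (1 - Om) * dark_energy_factor(z, w0, wa))

print(E(0.0), E(1.0))   # E(0) = 1 by construction
```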
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, L.T.; Hickey, M.
This paper summarizes the progress to date by CH2M HILL and the UKAEA in development of a parametric modelling capability for estimating the costs of large nuclear decommissioning projects in the United Kingdom (UK) and Europe. The ability to successfully apply parametric cost estimating techniques will be a key factor to commercial success in the UK and European multi-billion dollar waste management, decommissioning and environmental restoration markets. The most useful parametric models will be those that incorporate individual components representing major elements of work: reactor decommissioning, fuel cycle facility decommissioning, waste management facility decommissioning and environmental restoration. Models must be sufficiently robust to estimate indirect costs and overheads, permit pricing analysis and adjustment, and accommodate the intricacies of international monetary exchange, currency fluctuations and contingency. The development of a parametric cost estimating capability is also a key component in building a forward estimating strategy. The forward estimating strategy will enable the preparation of accurate and cost-effective out-year estimates, even when work scope is poorly defined or as yet indeterminate. Preparation of cost estimates for work outside the organization's current sites, for which detailed measurement is not possible and historical cost data does not exist, will also be facilitated.
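A minimal sketch of a power-law parametric cost-estimating relationship with overhead, currency and contingency adjustments is given below; the coefficients, cost driver and adjustment factors are all invented placeholders, not CH2M HILL or UKAEA data.

```python
import numpy as np

def fit_cer(drivers, costs):
    """Fit a cost-estimating relationship cost = a * driver**b by least squares in log space."""
    b, log_a = np.polyfit(np.log(drivers), np.log(costs), 1)
    return np.exp(log_a), b

def estimate(driver, a, b, overhead=0.25, fx=1.0, contingency=0.30):
    """Direct cost from the CER, adjusted for overheads, exchange rate and contingency."""
    base = a * driver ** b
    return base * (1 + overhead) * fx * (1 + contingency)

# Hypothetical historical facilities: the driver could be, e.g., contaminated
# building volume (m^3); costs in millions of a reference currency.
drivers = np.array([5e3, 2e4, 8e4, 3e5])
costs = np.array([12.0, 35.0, 110.0, 320.0])
a, b = fit_cer(drivers, costs)
print(estimate(5e4, a, b))   # estimate for a new facility of intermediate size
```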
Robust biological parametric mapping: an improved technique for multimodal brain image analysis
NASA Astrophysics Data System (ADS)
Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.
2011-03-01
Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
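A minimal sketch of the robust voxelwise regression idea, replacing ordinary least squares with an M-estimator at each voxel, is shown below using statsmodels; the data shapes, covariate and variable names are assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic placeholder data: one functional and one structural metric per
# voxel per subject, plus a scalar covariate (e.g. age).
rng = np.random.default_rng(0)
n_subjects, n_voxels = 40, 500
func = rng.normal(size=(n_subjects, n_voxels))      # e.g. functional imaging metric
struct = rng.normal(size=(n_subjects, n_voxels))    # e.g. structural imaging metric
age = rng.uniform(60, 85, size=n_subjects)

t_values = np.empty(n_voxels)
for v in range(n_voxels):
    X = sm.add_constant(np.column_stack([struct[:, v], age]))
    # Huber M-estimation downweights outlying subjects at this voxel.
    fit = sm.RLM(func[:, v], X, M=sm.robust.norms.HuberT()).fit()
    t_values[v] = fit.params[1] / fit.bse[1]         # robust t for the structural regressor

print(t_values[:5])
```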
Parametric Instability Rates in Periodically Driven Band Systems
NASA Astrophysics Data System (ADS)
Lellouch, S.; Bukov, M.; Demler, E.; Goldman, N.
2017-04-01
In this work, we analyze the dynamical properties of periodically driven band models. Focusing on the case of Bose-Einstein condensates, and using a mean-field approach to treat interparticle collisions, we identify the origin of dynamical instabilities arising from the interplay between the external drive and interactions. We present a widely applicable generic numerical method to extract instability rates and link parametric instabilities to uncontrolled energy absorption at short times. Based on the existence of parametric resonances, we then develop an analytical approach within Bogoliubov theory, which quantitatively captures the instability rates of the system and provides an intuitive picture of the relevant physical processes, including an understanding of how transverse modes affect the formation of parametric instabilities. Importantly, our calculations demonstrate an agreement between the instability rates determined from numerical simulations and those predicted by theory. To determine the validity regime of the mean-field analysis, we compare the latter to the weakly coupled conserving approximation. The tools developed and the results obtained in this work are directly relevant to present-day ultracold-atom experiments based on shaken optical lattices and are expected to provide insightful guidance in the quest for Floquet engineering.
Terahertz parametric sources and imaging applications
NASA Astrophysics Data System (ADS)
Kawase, Kodo; Ogawa, Yuichi; Minamide, Hiroaki; Ito, Hiromasa
2005-07-01
We have studied the generation of terahertz (THz) waves by optical parametric processes based on laser light scattering from the polariton mode of nonlinear crystals. Using parametric oscillation of LiNbO3 or MgO-doped LiNbO3 crystal pumped by a nano-second Q-switched Nd:YAG laser, we have realized a widely tunable coherent THz-wave source with a simple configuration. We report the detailed characteristics of the oscillation and the radiation including tunability, spatial and temporal coherency, uni-directivity, and efficiency. A Fourier transform limited THz-wave spectrum narrowing was achieved by introducing the injection seeding method. Further, we have developed a spectroscopic THz imaging system using a THz-wave parametric oscillator, which allows detection and identification of drugs concealed in envelopes, by introducing the component spatial pattern analysis. Several images of the envelope are recorded at different THz frequencies and then processed. The final result is an image that reveals what substances are present in the envelope, in what quantity, and how they are distributed across the envelope area. The example presented here shows the identification of three drugs, two of which are illegal, while one is an over-the-counter drug.
Variable selection for distribution-free models for longitudinal zero-inflated count responses.
Chen, Tian; Wu, Pan; Tang, Wan; Zhang, Hui; Feng, Changyong; Kowalski, Jeanne; Tu, Xin M
2016-07-20
Zero-inflated count outcomes arise quite often in research and practice. Parametric models such as the zero-inflated Poisson and zero-inflated negative binomial are widely used to model such responses. Like most parametric models, they are quite sensitive to departures from assumed distributions. Recently, new approaches have been proposed to provide distribution-free, or semi-parametric, alternatives. These methods extend the generalized estimating equations to provide robust inference for population mixtures defined by zero-inflated count outcomes. In this paper, we propose methods to extend smoothly clipped absolute deviation (SCAD)-based variable selection methods to these new models. Variable selection has been gaining popularity in modern clinical research studies, as determining differential treatment effects of interventions for different subgroups has become the norm, rather than the exception, in the era of patient-centered outcome research. Such moderation analysis in general creates many explanatory variables in regression analysis, and the advantages of SCAD-based methods over their traditional counterparts make them a great choice for addressing these important and timely issues in clinical research. We illustrate the proposed approach with both simulated and real study data. Copyright © 2016 John Wiley & Sons, Ltd.
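For orientation, a minimal sketch of the parametric baseline mentioned above, a zero-inflated Poisson fit on synthetic data with statsmodels, is given below; it does not implement the SCAD-penalized, distribution-free estimators proposed in the paper.

```python
# Illustrative sketch only: a parametric zero-inflated Poisson fit on synthetic data.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(1)
n = 1000
x = rng.normal(size=n)
lam = np.exp(0.3 + 0.5 * x)                 # Poisson mean for the "at risk" group
at_risk = rng.random(n) > 0.4               # ~40% structural zeros
y = np.where(at_risk, rng.poisson(lam), 0)

X = sm.add_constant(x)
# Intercept-only logit model for the zero-inflation part
model = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1)), inflation='logit')
res = model.fit(disp=0)
print(res.summary())
```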
DOT National Transportation Integrated Search
1975-07-01
The describing function method of analysis is applied to investigate the influence of parametric variations on wheelset critical velocity. In addition, the relationship between the amplitude of sustained lateral oscillations and critical speed is der...
Bansal, Ravi; Peterson, Bradley S
2018-06-01
Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control for false positive findings in these multiple hypotheses testing. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated distributions from those large clusters, and therefore, by construct rejected the large clusters as false positives at the nominal FWERs. Those rejected clusters were outlying values in the distribution of cluster size but cannot be distinguished from true positive findings without further analyses, including assessing whether fMRI signal in those regions correlates with other clinical, behavioral, or cognitive measures. Rejecting the large clusters, however, significantly reduced the statistical power of nonparametric methods in detecting true findings compared with parametric methods, which would have detected most true findings that are essential for making valid biological inferences in MRI data. Parametric analyses, in contrast, detected most true findings while generating relatively few false positives: on average, less than one of those very large clusters would be deemed a true finding in each brain-wide analysis. We therefore recommend the continued use of parametric methods that model nonstationary smoothness for cluster-level, familywise control of false positives, particularly when using a Cluster Defining Threshold of 2.5 or higher, and subsequently assessing rigorously the biological plausibility of the findings, even for large clusters. Finally, because nonparametric methods yielded a large reduction in statistical power to detect true positive findings, we conclude that the modest reduction in false positive findings that nonparametric analyses afford does not warrant a re-analysis of previously published fMRI studies using nonparametric techniques. Copyright © 2018 Elsevier Inc. All rights reserved.
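A hedged sketch of the kind of simulation the authors describe: generate smoothed 2D Gaussian noise fields, threshold them, and tabulate cluster sizes. The smoothing level, threshold, and field size are illustrative assumptions, not the study's settings.

```python
# Hedged sketch: smooth 2D Gaussian noise, threshold it, and look at the
# distribution of the largest suprathreshold cluster across realizations.
import numpy as np
from scipy.ndimage import gaussian_filter, label

rng = np.random.default_rng(2)
n_sims, size, fwhm, z_thresh = 200, 128, 6.0, 2.5
sigma = fwhm / 2.355                        # FWHM -> Gaussian sigma

max_cluster = []
for _ in range(n_sims):
    field = gaussian_filter(rng.normal(size=(size, size)), sigma)
    field /= field.std()                    # re-standardize after smoothing
    labels, n_clust = label(field > z_thresh)
    sizes = np.bincount(labels.ravel())[1:] if n_clust else np.array([0])
    max_cluster.append(sizes.max())
print("95th percentile of max cluster size:", np.percentile(max_cluster, 95))
```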
A multivariate approach for the study of the environmental drivers of wine production structure
NASA Astrophysics Data System (ADS)
Lorenzetti, Romina; Costantini, Edoardo A. C.; Malorgio, Giulio
2015-04-01
Vitivinicultural "terroir" is a concept referring to an area in which collective knowledge of the interactions between environment and vitivinicultural practices develops, providing distinctive characteristics to the products. The effect of environmental components on terroir has already been widely demonstrated. What has not yet been studied is their possible effect on the structure of wine production. Therefore, the aim of this work was to determine whether environmental drivers influence the structure of wine production. This kind of investigation necessarily involves a change of scale towards wide territories. We used the Italian Denomination of Origin territories, grouped into Macro-areas (reference scale 1:500,000) with respect to geographic proximity, environmental features, viticultural affinity and tradition. The characterization of the structure of the wine transformation industry was based on the official data reported in the wine production declarations for the year 2008. The statistics considered covered general quantitative variables of wine farms, the presence of associative forms, the degree of vertical integration of wineries, the quality orientation of wine producers, and vineyard acreage. The environmental variables climate, soil, and vegetation vigour were selected for their direct influence on vine growing. A second set of variables was chosen to express the effect of land morphology on viticultural management. The third set was intended to uncover possible relationships between viticultural structures and land quality, such as indexes of sensitivity to desertification, soil resistance to water erosion, and land vulnerability. A PCA was carried out separately for the environmental and economic data to reduce the database dimensions. The resulting synthetic economic and environmental descriptors were used in three multivariate analyses: i) correlation between economic and environmental descriptors through the non-parametric Spearman test; ii) a cluster analysis to group the Macro-areas into a few homogeneous economic structures; iii) a discriminant analysis of economic clusters and environmental factors, to highlight the environmental drivers of the different wine production structures. The cluster analysis identified six systems of production and organization. Mean climatic, pedoclimatic and morphological conditions and the morphological heterogeneity of Macro-areas had the greatest discriminant power over the clusters. The economic structures oriented toward large-scale production, and those with no clear orientation, were located in low hills and plains with Mediterranean climatic conditions. Lands at higher elevation and with rougher morphology correlated with high-quality products and structures, made up either of small independent farms or cooperatives in the highest, cold, wet areas, or of large independent farms on medium hills. In conclusion, for the first time it was shown that certain landscape characteristics have a significant influence on the typology of wine production structure. The results of these multivariate analyses suggest that pedo-climatic characteristics and landscape attributes can play a strategic role in the wine industry.
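The sketch below is a schematic reconstruction of that analysis chain (separate PCAs, Spearman correlation, clustering, discriminant analysis) on synthetic data with scikit-learn and SciPy; the data, the number of components, and the use of KMeans/LDA are assumptions for illustration only.

```python
# Schematic sketch of the pipeline: separate PCAs, Spearman correlation,
# clustering of economic structures, then discriminant analysis on environment.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
n_areas = 60
econ = pd.DataFrame(rng.normal(size=(n_areas, 8)))   # wine-structure variables (synthetic)
env = pd.DataFrame(rng.normal(size=(n_areas, 10)))   # climate/soil/morphology (synthetic)

econ_pc = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(econ))
env_pc = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(env))

rho, p = spearmanr(econ_pc[:, 0], env_pc[:, 0])                                    # i)
clusters = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(econ_pc)   # ii)
lda = LinearDiscriminantAnalysis().fit(env_pc, clusters)                           # iii)
print(f"Spearman rho={rho:.2f} (p={p:.2f}); LDA accuracy={lda.score(env_pc, clusters):.2f}")
```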
NASA Astrophysics Data System (ADS)
Wang, Zhen-yu; Yu, Jian-cheng; Zhang, Ai-qun; Wang, Ya-xing; Zhao, Wen-tao
2017-12-01
Combining high-precision numerical analysis methods with optimization algorithms to systematically explore a design space has become an important topic in modern design methods. During the design of an underwater glider's flying-wing structure, a surrogate model is introduced to decrease the computation time of the high-precision analysis, effectively resolving the trade-off between precision and efficiency. Based on parametric geometry modeling, mesh generation and computational fluid dynamics analysis, a surrogate model is constructed by adopting design of experiment (DOE) theory to solve the multi-objective design optimization problem of the underwater glider. The procedure for constructing the surrogate model is presented, and the Gaussian kernel function is specifically discussed. The Particle Swarm Optimization (PSO) algorithm is applied to the hydrodynamic design optimization. The hydrodynamic performance of the optimized flying-wing underwater glider increases by 9.1%.
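A minimal sketch of the general idea (not the authors' code): fit a Gaussian-kernel surrogate to a stand-in for the expensive CFD objective, then search it with a bare-bones particle swarm. The objective function, DOE size, and PSO settings are all assumed for illustration.

```python
# Illustrative sketch: Gaussian-kernel surrogate of a stand-in objective,
# searched with a minimal particle swarm optimizer.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def cfd_stand_in(x):                       # stands in for an expensive CFD evaluation
    return np.sum((x - 0.3) ** 2, axis=-1) + 0.05 * np.sin(10 * x[..., 0])

rng = np.random.default_rng(4)
X_doe = rng.uniform(0, 1, size=(40, 3))    # DOE samples of 3 shape parameters
y_doe = cfd_stand_in(X_doe)
surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X_doe, y_doe)

# Minimal PSO run on the cheap surrogate instead of the CFD model
n_part, iters, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5
pos = rng.uniform(0, 1, (n_part, 3))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), surrogate.predict(pos)
gbest = pbest[pbest_val.argmin()]
for _ in range(iters):
    r1, r2 = rng.random((2, n_part, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    val = surrogate.predict(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()]
print("surrogate optimum near:", gbest)
```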
[The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].
Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel
2017-01-01
The statistical analysis can be divided in two main components: descriptive analysis and inferential analysis. An inference is to elaborate conclusions from the tests performed with the data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test in general poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
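A toy illustration of that decision rule, assuming a simple two-group comparison: check normality first, then pick a parametric or non-parametric test accordingly.

```python
# Toy example: choose between a parametric and a non-parametric two-sample test
# based on a normality check of the data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
group_a = rng.normal(10, 2, 30)
group_b = rng.exponential(10, 30)          # clearly non-normal

normal = all(stats.shapiro(g).pvalue > 0.05 for g in (group_a, group_b))
if normal:
    stat, p = stats.ttest_ind(group_a, group_b)
    test = "independent-samples t-test"
else:
    stat, p = stats.mannwhitneyu(group_a, group_b)
    test = "Mann-Whitney U test"
print(f"{test}: statistic={stat:.2f}, p={p:.4f}")
```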
Gas engine heat pump cycle analysis. Volume 1: Model description and generic analysis
NASA Astrophysics Data System (ADS)
Fischer, R. D.
1986-10-01
The task prepared performance and cost information to assist in evaluating the selection of high voltage alternating current components, values for component design variables, and system configurations and operating strategy. A steady-state computer model for performance simulation of engine-driven and electrically driven heat pumps was prepared and effectively used for parametric and seasonal performance analyses. Parametric analysis showed the effect of variables associated with the design of recuperators, brine coils, the domestic hot water heat exchanger, compressor size, engine efficiency, and insulation on exhaust and brine piping. Seasonal performance data were prepared for residential and commercial units in six cities with system configurations closely related to existing or contemplated hardware of the five GRI engine contractors. Similar data were prepared for an advanced variable-speed electric unit for comparison purposes. The effect of domestic hot water production on operating costs was determined. Four fan-operating strategies and two brine loop configurations were explored.
NASA Technical Reports Server (NTRS)
Stagliano, T. R.; Witmer, E. A.; Rodal, J. J. A.
1979-01-01
Finite element modeling alternatives as well as the utility and limitations of the two dimensional structural response computer code CIVM-JET 4B for predicting the transient, large deflection, elastic plastic, structural responses of two dimensional beam and/or ring structures which are subjected to rigid fragment impact were investigated. The applicability of the CIVM-JET 4B analysis and code for the prediction of steel containment ring response to impact by complex deformable fragments from a trihub burst of a T58 turbine rotor was studied. Dimensional analysis considerations were used in a parametric examination of data from engine rotor burst containment experiments and data from sphere beam impact experiments. The use of the CIVM-JET 4B computer code for making parametric structural response studies on both fragment-containment structure and fragment-deflector structure was illustrated. Modifications to the analysis/computation procedure were developed to alleviate restrictions.
NASA Technical Reports Server (NTRS)
Jeffries, K. S.; Renz, D. D.
1984-01-01
A parametric analysis was performed of transmission cables for transmitting electrical power at high voltage (up to 1000 V) and high frequency (10 to 30 kHz) for high-power (100 kW or more) space missions. Large-diameter (5 to 30 mm) hollow conductors were considered in closely spaced coaxial configurations and in parallel lines. Formulas were derived to calculate inductance and resistance for these conductors. Curves of cable conductance, mass, inductance, capacitance, resistance, power loss, and temperature were plotted for various conductor diameters, conductor thicknesses, and alternating current frequencies. An example 5 mm diameter coaxial cable with 0.5 mm conductor thickness was calculated to transmit 100 kW at 1000 Vac over 50 m with a power loss of 1900 W, an inductance of 1.45 μH, and a capacitance of 0.07 μF. The computer programs written for this analysis are listed in the appendix.
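The report's own formulas are not reproduced here; as a hedged stand-in, the sketch below uses textbook coaxial-line expressions for per-unit-length inductance, capacitance, and skin-effect resistance. The geometry chosen is illustrative, so the numbers will not match the closely spaced design quoted above.

```python
# Hedged sketch with textbook coaxial-line formulas (not the report's exact
# hollow-conductor derivations) for per-unit-length L, C, and AC resistance.
import numpy as np

mu0, eps0 = 4e-7 * np.pi, 8.854e-12
a, b, length = 2.5e-3, 5.0e-3, 50.0         # inner/outer radii (m), run length (m)
L_per_m = mu0 / (2 * np.pi) * np.log(b / a) # H/m, external inductance of a coax
C_per_m = 2 * np.pi * eps0 / np.log(b / a)  # F/m, vacuum dielectric assumed

f, rho_cu = 20e3, 1.7e-8                    # 20 kHz, copper resistivity (ohm*m)
skin = np.sqrt(rho_cu / (np.pi * f * mu0))  # skin depth (m)
R_per_m = rho_cu / (2 * np.pi * skin) * (1 / a + 1 / b)   # approximate AC resistance

I = 100e3 / 1000.0                          # 100 kW at 1000 Vac -> ~100 A
print(f"L = {L_per_m * length * 1e6:.2f} uH, C = {C_per_m * length * 1e9:.2f} nF")
print(f"R = {R_per_m * length * 1e3:.1f} mOhm, loss ~ {I**2 * R_per_m * length:.0f} W")
```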
Parametric distribution approach for flow availability in small hydro potential analysis
NASA Astrophysics Data System (ADS)
Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel
2016-10-01
Small hydro systems are one of the important sources of renewable energy and have been recognized worldwide as a clean energy source. Small hydropower generation, which uses the potential energy in flowing water to produce electricity, is often questionable due to the inconsistent and intermittent power generated. Potential analysis of a small hydro system, which depends mainly on the availability of water, requires knowledge of the water flow or stream flow distribution. This paper presents the possibility of applying the Pearson system to approximate the stream flow availability distribution in a small hydro system. By considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed based on the significant characteristic of the Pearson system of relating the distribution directly to its first four statistical moments. The advantage of applying these statistical moments in small hydro potential analysis is the ability to analyze the varying shapes of the stream flow distribution.
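A simplified sketch of the idea, using a Pearson type III curve matched to the first three sample moments (the paper's full Pearson system uses four); the flow data and design flow are synthetic assumptions.

```python
# Minimal sketch: approximate a stream-flow distribution from its sample moments
# with a Pearson type III curve and read off flow availability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
flow = rng.gamma(shape=2.0, scale=3.0, size=3000)   # synthetic daily flows (m^3/s)

mean, std = flow.mean(), flow.std(ddof=1)
skew = stats.skew(flow, bias=False)
dist = stats.pearson3(skew, loc=mean, scale=std)    # moment-matched Pearson III

q_design = 4.0                                      # hypothetical design flow
availability = dist.sf(q_design)                    # P(flow exceeds design flow)
print(f"estimated availability above {q_design} m^3/s: {availability:.2%}")
```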
A Parametric Computational Analysis into Galvanic Coupling Intrabody Communication.
Callejon, M Amparo; Del Campo, P; Reina-Tosina, Javier; Roa, Laura M
2017-08-02
Intrabody Communication (IBC) uses the human body tissues as transmission media for electrical signals to interconnect personal health devices in wireless body area networks. The main goal of this work is to conduct a computational analysis covering some bioelectric issues that still have not been fully explained, such as the modeling of the skin-electrode impedance, the differences associated with the use of constant voltage or current excitation modes, or the influence on attenuation of the subject's anthropometrical and bioelectric properties. With this aim, a computational finite element model has been developed, allowing the IBC channel attenuation as well as the electric field and current density through arm tissues to be computed as a function of these parameters. As a conclusion, this parametric analysis has in turn permitted us to disclose some knowledge about the causes and effects of the above-mentioned issues, thus explaining and complementing previous results reported in the literature.
Can color-coded parametric maps improve dynamic enhancement pattern analysis in MR mammography?
Baltzer, P A; Dietzel, M; Vag, T; Beger, S; Freiberg, C; Herzog, A B; Gajda, M; Camara, O; Kaiser, W A
2010-03-01
Post-contrast enhancement characteristics (PEC) are a major criterion for differential diagnosis in MR mammography (MRM). Manual placement of regions of interest (ROIs) to obtain time/signal intensity curves (TSIC) is the standard approach to assess dynamic enhancement data. Computers can automatically calculate the TSIC in every lesion voxel and combine this data to form one color-coded parametric map (CCPM). Thus, the TSIC of the whole lesion can be assessed. This investigation was conducted to compare the diagnostic accuracy (DA) of CCPM with TSIC for the assessment of PEC. 329 consecutive patients with 469 histologically verified lesions were examined. MRM was performed according to a standard protocol (1.5 T, 0.1 mmol/kgbw Gd-DTPA). ROIs were drawn manually within any lesion to calculate the TSIC. CCPMs were created in all patients using dedicated software (CAD Sciences). Both methods were rated by 2 observers in consensus on an ordinal scale. Receiver operating characteristics (ROC) analysis was used to compare both methods. The area under the curve (AUC) was significantly (p=0.026) higher for CCPM (0.829) than TSIC (0.749). The sensitivity was 88.5% (CCPM) vs. 82.8% (TSIC), whereas equal specificity levels were found (CCPM: 63.7%, TSIC: 63.0%). The color-coded parametric maps (CCPMs) showed a significantly higher DA compared to TSIC, in particular the sensitivity could be increased. Therefore, the CCPM method is a feasible approach to assessing dynamic data in MRM and condenses several imaging series into one parametric map. © Georg Thieme Verlag KG Stuttgart · New York.
Comment on "Parametric Instability Induced by X-Mode Wave Heating at EISCAT" by Wang et al. (2016)
NASA Astrophysics Data System (ADS)
Blagoveshchenskaya, N. F.; Borisova, T. D.; Yeoman, T. K.
2017-12-01
In their recent article, Wang et al. (2016) analyzed observations from EISCAT (European Incoherent Scatter) Scientific Association Russian X-mode heating experiments and claimed to explain the potential mechanisms for the parametric decay instability (PDI) and the oscillating two-stream instability (OTSI). Wang et al. (2016) claim that they cannot separate the HF-enhanced plasma and ion lines excited by the O or X mode in the EISCAT UHF radar spectra. Because of this, they distinguished the parametric instability excited by O-/X-mode heating waves according to their different excitation heights. Their reflection heights were determined from ionosonde records, which provide only a rough measure of excitation altitudes and cannot be used to separate the O- and X-mode effects. A serious limitation of their analysis is the use of a 30 s integration time for the UHF radar data. There are also serious disagreements between their analysis and the observational facts: it is the radical difference in the behavior of the X- and O-mode plasma and ion line spectra derived with a 5 s resolution that provides the correct separation of the X- and O-mode effects. It is not discussed or explained how the parallel component of the electric field is generated under X-mode heating. Apart from the leakage to the O mode, the results of Wang et al. (2016) do not explain the potential mechanisms for PDI and OTSI and add nothing to the understanding of the physical factors accounting for the parametric instability generated by an X-mode HF pump wave.
Lin, Sheng-Hsuan; Young, Jessica; Logan, Roger; Tchetgen Tchetgen, Eric J.; VanderWeele, Tyler J.
2016-01-01
The assessment of direct and indirect effects with time-varying mediators and confounders is a common but challenging problem, and standard mediation analysis approaches are generally not applicable in this context. The mediational g-formula was recently proposed to address this problem, paired with a semi-parametric estimation approach to evaluate longitudinal mediation effects empirically. In this paper, we develop a parametric estimation approach to the mediational g-formula, including a feasible algorithm implemented in a freely available SAS macro. In the Framingham Heart Study data, we apply this method to estimate the interventional analogues of natural direct and indirect effects of smoking behaviors sustained over a 10-year period on blood pressure when considering weight change as a time-varying mediator. Compared with not smoking, smoking 20 cigarettes per day for 10 years was estimated to increase blood pressure by 1.2 (95% CI: −0.7, 2.7) mm Hg. The direct effect was estimated to increase blood pressure by 1.5 (95% CI: −0.3, 2.9) mm Hg, and the indirect effect was −0.3 (95% CI: −0.5, −0.1) mm Hg; the indirect effect is negative because smoking is associated with lower weight, which in turn is associated with lower blood pressure. These results provide evidence that weight change in fact partially conceals the detrimental effects of cigarette smoking on blood pressure. Our work represents, to our knowledge, the first application of the parametric mediational g-formula in an epidemiologic cohort study. PMID:27984420
Analysis of survival in breast cancer patients by using different parametric models
NASA Astrophysics Data System (ADS)
Enera Amran, Syahila; Asrul Afendi Abdullah, M.; Kek, Sie Long; Afiqah Muhamad Jamil, Siti
2017-09-01
In biomedical applications or clinical trials, right censoring often arises when studying time-to-event data: some individuals are still alive at the end of the study or are lost to follow-up at a certain time. Handling censored data properly is important in order to prevent biased information in the analysis. Therefore, this study analyzes right-censored data with three different parametric models: the exponential, Weibull and log-logistic models. Data on breast cancer patients from Hospital Sultan Ismail, Johor Bahru, from 30 December 2008 until 15 February 2017 were used to illustrate right-censored data. The variables included in this study are the survival time t of each breast cancer patient, the patient's age X1 and the treatment given X2. To determine the best parametric model for analysing the survival of breast cancer patients, the performance of each model was compared based on the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the log-likelihood value using the statistical software R. When analysing the breast cancer data, all three distributions showed consistency with the data, the line graph of the cumulative hazard function resembling a straight line through the origin. As a result, the log-logistic model was the best-fitting parametric model compared with the exponential and Weibull models, since it has the smallest AIC and BIC values and the largest log-likelihood.
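A hedged sketch of the model comparison described, using the lifelines package on synthetic right-censored data; the fitter classes and the AIC_/log_likelihood_ attributes are assumed from the current lifelines API, and the data are not the study's.

```python
# Hedged sketch: compare exponential, Weibull and log-logistic fits by AIC
# on synthetic right-censored survival times (lifelines package assumed).
import numpy as np
from lifelines import ExponentialFitter, WeibullFitter, LogLogisticFitter

rng = np.random.default_rng(7)
true_t = rng.weibull(1.5, 300) * 24            # true survival times (months)
censor = rng.uniform(0, 36, 300)               # administrative censoring times
T = np.minimum(true_t, censor)
E = (true_t <= censor).astype(int)             # 1 = event observed, 0 = censored

for fitter in (ExponentialFitter(), WeibullFitter(), LogLogisticFitter()):
    fitter.fit(T, event_observed=E)
    print(f"{fitter.__class__.__name__:>18}: "
          f"AIC={fitter.AIC_:.1f}, logL={fitter.log_likelihood_:.1f}")
```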
Multiple Imputation of a Randomly Censored Covariate Improves Logistic Regression Analysis.
Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A
2016-01-01
Randomly censored covariates arise frequently in epidemiologic studies. The most commonly used methods, including complete case and single imputation or substitution, suffer from inefficiency and bias. They make strong parametric assumptions or they consider limit of detection censoring only. We employ multiple imputation, in conjunction with semi-parametric modeling of the censored covariate, to overcome these shortcomings and to facilitate robust estimation. We develop a multiple imputation approach for randomly censored covariates within the framework of a logistic regression model. We use the non-parametric estimate of the covariate distribution or the semiparametric Cox model estimate in the presence of additional covariates in the model. We evaluate this procedure in simulations, and compare its operating characteristics to those from the complete case analysis and a survival regression approach. We apply the procedures to an Alzheimer's study of the association between amyloid positivity and maternal age of onset of dementia. Multiple imputation achieves lower standard errors and higher power than the complete case approach under heavy and moderate censoring and is comparable under light censoring. The survival regression approach achieves the highest power among all procedures, but does not produce interpretable estimates of association. Multiple imputation offers a favorable alternative to complete case analysis and ad hoc substitution methods in the presence of randomly censored covariates within the framework of logistic regression.
New approaches to the analysis of population trends in land birds: Comment
Link, W.A.; Sauer, J.R.
1997-01-01
James et al. (1996, Ecology 77:13-27) used data from the North American Breeding Bird Survey (BBS) to examine geographic variability in patterns of population change for 26 species of wood warblers. They emphasized the importance of evaluating nonlinear patterns of change in bird populations, proposed LOESS-based non-parametric and semi-parametric analyses of BBS data, and contrasted their results with other analyses, including those of Robbins et al. (1989, Proceedings of the National Academy of Sciences 86: 7658-7662) and Peterjohn et al. (1995, Pages 3-39 in T. E. Martin and D. M. Finch, eds. Ecology and management of Neotropical migratory birds: a synthesis and review of critical issues. Oxford University Press, New York.). In this note, we briefly comment on some of the issues that arose from their analysis of BBS data, suggest a few aspects of the survey that should inspire caution in analysts, and review the differences between the LOESS-based procedures and other procedures (e.g., Link and Sauer 1994). We strongly discourage the use of James et al.'s completely non-parametric procedure, which fails to account for observer effects. Our comparisons of estimators add to the evidence already present in the literature of the bias associated with omitting observer information in analyses of BBS data. Bias resulting from change in observer abilities should be a consideration in any analysis of BBS data.
Marmarelis, Vasilis Z.; Berger, Theodore W.
2009-01-01
Parametric and non-parametric modeling methods are combined to study the short-term plasticity (STP) of synapses in the central nervous system (CNS). The nonlinear dynamics of STP are modeled by means of: (1) previously proposed parametric models based on mechanistic hypotheses and/or specific dynamical processes, and (2) non-parametric models (in the form of Volterra kernels) that transform the presynaptic signals into postsynaptic signals. In order to use the two approaches synergistically, we estimate the Volterra kernels of the parametric models of STP for four types of synapses using synthetic broadband input-output data. Results show that the non-parametric models accurately and efficiently replicate the input-output transformations of the parametric models. Volterra kernels provide a general and quantitative representation of the STP. PMID:18506609
Bláha, M; Hoch, J; Ferko, A; Ryška, A; Hovorková, E
Improvement in any human activity is preconditioned by inspection of results and by feedback used to modify the processes applied. Comparison of experts' experience in the given field is another indispensable part, leading to optimisation and improvement of processes and, ideally, to the implementation of standards. For the purpose of objective comparison and assessment of processes, it is always necessary to describe the processes in a parametric way, to obtain representative data, to assess the achieved results, and to provide unquestionable, data-driven feedback based on such analysis. This may lead to a consensus on the definition of standards in the given area of health care. Total mesorectal excision (TME) is a standard procedure in the surgical treatment of rectal cancer (C20). However, the quality of the performed procedures varies among health care facilities, which is determined, among other factors, by internal processes and surgeons' experience. Assessment of surgical treatment results is therefore of key importance. A pathologist who assesses the resected tissue can provide valuable feedback in this respect. An information system for the parametric assessment of TME performance is described in this article, including the technical background in the form of a multicentre clinical registry and the structure of the observed parameters. We consider the proposed system of parametric TME assessment to be significant for improving TME performance, aimed at reducing local recurrences and improving the overall prognosis of patients. Keywords: rectal cancer, total mesorectal excision, parametric data, clinical registries, TME registry.
Ultra-Broad-Band Optical Parametric Amplifier or Oscillator
NASA Technical Reports Server (NTRS)
Strekalov, Dmitry; Matsko, Andrey; Savchenkov, Anatolly; Maleki, Lute
2009-01-01
A concept for an ultra-broad-band optical parametric amplifier or oscillator has emerged as a by-product of a theoretical study in fundamental quantum optics. The study was originally intended to address the question of whether the two-photon temporal correlation function of light [in particular, light produced by spontaneous parametric down conversion (SPDC)] can be considerably narrower than the inverse of the spectral width (bandwidth) of the light. The answer to the question was found to be negative. More specifically, on the basis of the universal integral relations between the quantum two-photon temporal correlation and the classical spectrum of light, it was found that the lower limit of two-photon correlation time is set approximately by the inverse of the bandwidth. The mathematical solution for the minimum two-photon correlation time also provides the minimum relative frequency dispersion of the down-converted light components; in turn, the minimum relative frequency dispersion translates to the maximum bandwidth, which is important for the design of an ultra-broad-band optical parametric oscillator or amplifier. In the study, results of an analysis of the general integral relations were applied in the case of an optically nonlinear, frequency-dispersive crystal in which SPDC produces collinear photons. Equations were found for the crystal orientation and pump wavelength, specific for each parametric-down-converting crystal, that eliminate the relative frequency dispersion of collinear degenerate (equal-frequency) signal and idler components up to the fourth order in the frequency-detuning parameter
Sarkar, Rajarshi
2013-07-01
The validity of the entire panel of renal function tests as a diagnostic tool depends substantially on the Biological Reference Interval (BRI) of urea. Establishment of the BRI of urea is difficult, partly because the exclusion criteria for selection of reference data are quite rigid and partly due to compartmentalization considerations regarding the age and sex of the reference individuals. Moreover, construction of the Biological Reference Curve (BRC) of urea is imperative to highlight the partitioning requirements. This a priori study examines data collected by measuring serum urea in 3202 age- and sex-matched individuals, aged between 1 and 80 years, by a kinetic UV Urease/GLDH method on a Roche Cobas 6000 auto-analyzer. A Mann-Whitney U test of the reference data confirmed the partitioning requirement by both age and sex. Further statistical analysis revealed the incompatibility of the data with a proposed parametric model, hence the data were analysed non-parametrically. The BRI was found to be identical for both sexes until the 2nd decade, and the BRI for males increased progressively from the 6th decade onwards. Four non-parametric models were postulated for construction of the BRC: Gaussian kernel, double kernel, local mean and local constant, of which the last generated the best-fitting curves. Clinical decision-making should become easier and the diagnostic implications of renal function tests more meaningful if this BRI is followed and the BRC is used as a desktop tool in conjunction with similar data for serum creatinine.
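For illustration only, the sketch below computes a non-parametric 95% reference interval and a Mann-Whitney partitioning check on synthetic urea values; it does not reproduce the kernel-based reference-curve models of the study.

```python
# Simple sketch: non-parametric reference interval plus an age/sex partitioning
# check, on synthetic urea values (not the study's data).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(8)
age = rng.integers(1, 81, 3200)
sex = rng.integers(0, 2, 3200)                     # 0 = female, 1 = male
urea = rng.gamma(9, 2.8, 3200) + 0.15 * np.maximum(age - 50, 0) * sex

lo, hi = np.percentile(urea, [2.5, 97.5])          # non-parametric 95% BRI
print(f"overall BRI: {lo:.1f}-{hi:.1f} mg/dL")

old = age >= 60
stat, p = mannwhitneyu(urea[old & (sex == 1)], urea[old & (sex == 0)])
print(f"males vs females (60+): Mann-Whitney p = {p:.3g}")   # partitioning check
```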
Zhai, Haibo; Rubin, Edward S
2013-03-19
This study investigates the feasibility of polymer membrane systems for postcombustion carbon dioxide (CO2) capture at coal-fired power plants. Using newly developed performance and cost models, our analysis shows that membrane systems configured with multiple stages or steps are capable of meeting capture targets of 90% CO2 removal efficiency and 95+% product purity. A combined driving force design using both compressors and vacuum pumps is most effective for reducing the cost of CO2 avoided. Further reductions in the overall system energy penalty and cost can be obtained by recycling a portion of CO2 via a two-stage, two-step membrane configuration with air sweep to increase the CO2 partial pressure of the feed flue gas. For a typical plant with carbon capture and storage, this yielded a 15% lower cost per metric ton of CO2 avoided compared to a plant using a current amine-based capture system. A series of parametric analyses also is undertaken to identify paths for enhancing the viability of membrane-based capture technology.
A Parametric Study on Using Active Debris Removal for LEO Environment Remediation
NASA Technical Reports Server (NTRS)
2010-01-01
Recent analyses on the instability of the orbital debris population in the low Earth orbit (LEO) region and the collision between Iridium 33 and Cosmos 2251 have reignited the interest in using active debris removal (ADR) to remediate the environment. There are, however, monumental technical, resource, operational, legal, and political challenges in making economically viable ADR a reality. Before a consensus on the need for ADR can be reached, a careful analysis of its effectiveness must be conducted. The goal is to demonstrate the need and feasibility of using ADR to better preserve the future environment and to guide its implementation to maximize the benefit-to-cost ratio. This paper describes a new sensitivity study on using ADR to stabilize the future LEO debris environment. The NASA long-term orbital debris evolutionary model, LEGEND, is used to quantify the effects of several key parameters, including target selection criteria/constraints and the starting epoch of ADR implementation. Additional analyses on potential ADR targets among the currently existing satellites and the benefits of collision avoidance maneuvers are also included.
An analysis of heat effects in different subpopulations of Bangladesh
NASA Astrophysics Data System (ADS)
Burkart, Katrin; Breitner, Susanne; Schneider, Alexandra; Khan, Md. Mobarak Hossain; Krämer, Alexander; Endlicher, Wilfried
2014-03-01
A substantial number of epidemiological studies have demonstrated an association between atmospheric conditions and human all-cause as well as cause-specific mortality. However, most research has been performed in industrialised countries, whereas little is known about the atmosphere-mortality relationship in developing countries. Especially with regard to modifications from non-atmospheric conditions and intra-population differences, there is a substantial research deficit. Within the scope of this study, we aimed to investigate the effects of heat in a multi-stratified manner, distinguishing by the cause of death, age, gender, location and socio-economic status. We examined 22,840 death counts using semi-parametric Poisson regression models, adjusting for a multitude of potential confounders. Although Bangladesh is dominated by an increase of mortality with decreasing (equivalent) temperatures over a wide range of values, the findings demonstrated the existence of partly strong heat effects at the upper end of the temperature distribution. Moreover, the study demonstrated that the strength of these heat effects varied considerably over the investigated subgroups. The adverse effects of heat were particularly pronounced for males and the elderly above 65 years. Moreover, we found increased adverse effects of heat for urban areas and for areas with a high socio-economic status. The increase in, and acceleration of, urbanisation in Bangladesh, as well as the rapid aging of the population and the increase in non-communicable diseases, suggest that the relevance of heat-related mortality might increase further. Considering rising global temperatures, the adverse effects of heat might be further aggravated.
A review of parametric approaches specific to aerodynamic design process
NASA Astrophysics Data System (ADS)
Zhang, Tian-tian; Wang, Zhen-guo; Huang, Wei; Yan, Li
2018-04-01
Parametric modeling of aircraft plays a crucial role in the aerodynamic design process. Effective parametric approaches cover a large design space with few variables. Commonly used parametric methods are summarized in this paper, and their principles are introduced briefly. Two-dimensional parametric methods include the B-Spline method, the Class/Shape function transformation method, the Parametric Section method, the Hicks-Henne method and the Singular Value Decomposition method, all of which are widely applied in airfoil design. This survey compares their capabilities in airfoil design, and the results show that the Singular Value Decomposition method has the best parametric accuracy. The development of three-dimensional parametric methods is limited; the most popular is the Free-form deformation method. Methods extended from two-dimensional parametric approaches have promising prospects in aircraft modeling. Since different parametric methods differ in their characteristics, a real design process needs a flexible choice among them to suit the subsequent optimization procedure.
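As a concrete example of one of the methods surveyed, the sketch below implements the Class/Shape function Transformation (CST) for a single airfoil surface; the Bernstein weights used are arbitrary illustrative values.

```python
# Compact sketch of the Class/Shape function Transformation (CST): a
# Bernstein-weighted shape function under the round-nose class function.
import numpy as np
from math import comb

def cst_surface(x, weights, n1=0.5, n2=1.0, dz_te=0.0):
    """Return y(x) for one airfoil surface given CST weights."""
    n = len(weights) - 1
    class_fn = x**n1 * (1 - x)**n2
    shape_fn = sum(w * comb(n, i) * x**i * (1 - x)**(n - i)
                   for i, w in enumerate(weights))
    return class_fn * shape_fn + x * dz_te

x = np.linspace(0.0, 1.0, 101)
upper = cst_surface(x, [0.17, 0.20, 0.20, 0.18])      # illustrative weights
lower = cst_surface(x, [-0.14, -0.12, -0.10, -0.05])
print("max thickness ~", np.max(upper - lower))
```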
NASA Astrophysics Data System (ADS)
Zhang, Chenglong; Guo, Ping
2017-10-01
Vague and fuzzy parametric information is a challenging issue in irrigation water management problems. In response, a generalized fuzzy credibility-constrained linear fractional programming (GFCCFP) model is developed for optimal irrigation water allocation under uncertainty. The model is derived by integrating generalized fuzzy credibility-constrained programming (GFCCP) into a linear fractional programming (LFP) optimization framework. It can therefore solve ratio optimization problems with fuzzy parameters and examine how the results vary under different credibility levels and weight coefficients of possibility and necessity. Its advantages are: (1) balancing the economic and resource objectives directly; (2) analyzing system efficiency; (3) generating more flexible decision solutions under different credibility levels and weight coefficients; and (4) supporting in-depth analysis of the interrelationships among system efficiency, credibility level and weight coefficient. The model is applied to a case study of irrigation water allocation in the middle reaches of the Heihe River Basin, northwest China, from which optimal irrigation water allocation solutions are obtained. Moreover, factorial analysis of the two parameters (i.e. λ and γ) indicates that the weight coefficient is the main factor for system efficiency compared with the credibility level. These results can effectively support reasonable irrigation water resources management and agricultural production.
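The GFCCFP formulation itself is not reproduced here; as a hedged illustration of its linear fractional programming building block, the sketch below maximizes a benefit-to-water ratio via the Charnes-Cooper transformation and scipy.optimize.linprog, with made-up coefficients.

```python
# Hedged illustration of the LFP building block: maximize (c'x + a)/(d'x + b)
# subject to Ax <= h, x >= 0, via the Charnes-Cooper transformation.
import numpy as np
from scipy.optimize import linprog

c, a = np.array([3.0, 2.0]), 0.0                 # numerator: e.g., economic benefit
d, b = np.array([1.0, 1.5]), 10.0                # denominator: e.g., water consumed
A, h = np.array([[1.0, 1.0]]), np.array([8.0])   # resource constraint (made up)

# Transformed LP variables: y (= t*x) and t > 0
c_lp = np.concatenate([-c, [-a]])                # maximize -> minimize
A_ub = np.hstack([A, -h[:, None]])               # A y - h t <= 0
b_ub = np.zeros(A.shape[0])
A_eq = np.concatenate([d, [b]])[None, :]         # d'y + b t = 1
res = linprog(c_lp, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * 3)
y, t = res.x[:-1], res.x[-1]
x_opt = y / t
print("optimal allocation x =", x_opt,
      "ratio =", (c @ x_opt + a) / (d @ x_opt + b))
```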
Krucien, Nicolas; Watson, Verity; Ryan, Mandy
2017-12-01
Health utility indices (HUIs) are widely used in economic evaluation. The best-worst scaling (BWS) method is being used to value dimensions of HUIs. However, little is known about the properties of this method. This paper investigates the validity of the BWS method to develop HUIs, comparing it to another ordinal valuation method, the discrete choice experiment (DCE). Using a parametric approach, we find a low level of concordance between the two methods, with evidence of preference reversals. BWS responses are subject to decision biases, with significant effects on individuals' preferences. Non-parametric tests indicate that BWS data have lower stability, monotonicity and continuity compared to DCE data, suggesting that BWS provides lower-quality data. As a consequence, for both theoretical and technical reasons, practitioners should be cautious both about using the BWS method to measure health-related preferences, and about using HUIs based on BWS data. Given existing evidence, the DCE method appears to be the better method, at least because its limitations (and measurement properties) have been extensively researched. Copyright © 2016 John Wiley & Sons, Ltd.
Simple heterogeneity parametrization for sea surface temperature and chlorophyll
NASA Astrophysics Data System (ADS)
Skákala, Jozef; Smyth, Timothy J.
2016-06-01
Using satellite maps, this paper offers a comprehensive analysis of chlorophyll and SST heterogeneity in the shelf seas around the southwest of the UK. The heterogeneity scaling follows a simple power law and is consequently parametrized by two parameters. It is shown that in most cases these two parameters vary relatively little with time. The paper offers a detailed comparison of field heterogeneity between different regions and determines how much of each region's heterogeneity is preserved in the annual median data. The paper explicitly demonstrates how these results can be used to calculate a representative measurement area for in situ networks.
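A minimal sketch of the two-parameter power-law parametrization described, fitted by ordinary least squares in log-log space on synthetic heterogeneity-versus-scale data.

```python
# Minimal sketch: fit a power law sigma(L) ~ c * L**h to heterogeneity
# (e.g., standard deviation of a field) versus spatial scale.
import numpy as np

rng = np.random.default_rng(9)
scales = np.array([4, 8, 16, 32, 64, 128], dtype=float)                # box sizes (km)
sigma = 0.3 * scales**0.45 * np.exp(rng.normal(0, 0.03, scales.size))  # synthetic data

slope, intercept = np.polyfit(np.log(scales), np.log(sigma), 1)
print(f"scaling exponent h = {slope:.2f}, prefactor c = {np.exp(intercept):.2f}")
```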
NASA Technical Reports Server (NTRS)
Masters, A. I.; Galler, D. E.; Denman, T. F.; Shied, R. A.; Black, J. R.; Fierstein, A. R.; Clark, G. L.; Branstrom, B. R.
1993-01-01
A design and analysis study was conducted to provide advanced engine descriptions and parametric data for space transfer vehicles. The study was based on an advanced oxygen/hydrogen engine in the 7,500 to 50,000 lbf thrust range. Emphasis was placed on defining requirements for high-performance engines capable of achieving reliable and versatile operation in a space environment. Four variations on the expander cycle were compared, and the advantages and disadvantages of each were assessed. Parametric weight, envelope, and performance data were generated over a range of 7,500 to 50,000 lb thrust and a wide range of chamber pressure and nozzle expansion ratio.
An analytical parametric study of the broadband noise from axial-flow fans
NASA Technical Reports Server (NTRS)
Chou, Shau-Tak; George, Albert R.
1987-01-01
The rotating dipole analysis of Ffowcs Williams and Hawkings (1969) is used to predict the far field noise radiation due to various rotor broadband noise mechanisms. Consideration is given to inflow turbulence noise, attached boundary layer/trailing-edge interaction noise, tip-vortex formation noise, and trailing-edge thickness noise. The parametric dependence of broadband noise from unducted axial-flow fans on several critical variables is studied theoretically. The angle of attack of the rotor blades, which is related to the rotor performance, is shown to be important to the trailing-edge noise and to the tip-vortex formation noise.
Morphometric analysis of cortical sulci using parametric ribbons: a study of the central sulcus.
Davatzikos, Christos; Bryan, R Nick
2002-01-01
Interhemispheric and gender differences of the central sulcus were examined via a parametric ribbon approach. The central sulcus was found to be deeper and larger in the right nondominant hemisphere than in the left dominant hemisphere, both in males and in females. Based on its pattern, that asymmetry could be attributed to increased connectivity between motor and somatosensory cortex, facilitating fine movement, which could constrain the in-depth growth of the central sulcus. Position asymmetries were also found, which might be explained by a relative larger parietal association cortex in men but not in women.
Transfer pricing in hospitals and efficiency of physicians: the case of anesthesia services.
Kuntz, Ludwig; Vera, Antonio
2005-01-01
The objective is to investigate theoretically and empirically how the efficiency of the physicians involved in anesthesia and surgery can be optimized by the introduction of transfer pricing for anesthesia services. The anesthesiology data of approximately 57,000 operations carried out at the University Hospital Hamburg-Eppendorf (UKE) in Germany in the period from 2000 to 2002 are analyzed using parametric and non-parametric methods. The principal finding of the empirical analysis is that the efficiency of the physicians involved in anesthesia and surgery at the UKE improved after the introduction of transfer pricing.
Arisholm, Gunnar
2007-05-14
Group velocity mismatch (GVM) is a major concern in the design of optical parametric amplifiers (OPAs) and generators (OPGs) for pulses shorter than a few picoseconds. By simplifying the coupled propagation equations and exploiting their scaling properties, the number of free parameters for a collinear OPA is reduced to a level where the parameter space can be studied systematically by simulations. The resulting set of figures show the combinations of material parameters and pulse lengths for which high performance can be achieved, and they can serve as a basis for a design.
Ji, Jiadong; He, Di; Feng, Yang; He, Yong; Xue, Fuzhong; Xie, Lei
2017-10-01
A complex disease is usually driven by a number of genes interwoven into networks, rather than a single gene product. Network comparison or differential network analysis has become an important means of revealing the underlying mechanism of pathogenesis and identifying clinical biomarkers for disease classification. Most studies, however, are limited to network correlations that mainly capture the linear relationship among genes, or rely on the assumption of a parametric probability distribution of gene measurements. They are restrictive in real application. We propose a new Joint density based non-parametric Differential Interaction Network Analysis and Classification (JDINAC) method to identify differential interaction patterns of network activation between two groups. At the same time, JDINAC uses the network biomarkers to build a classification model. The novelty of JDINAC lies in its potential to capture non-linear relations between molecular interactions using high-dimensional sparse data as well as to adjust confounding factors, without the need of the assumption of a parametric probability distribution of gene measurements. Simulation studies demonstrate that JDINAC provides more accurate differential network estimation and lower classification error than that achieved by other state-of-the-art methods. We apply JDINAC to a Breast Invasive Carcinoma dataset, which includes 114 patients who have both tumor and matched normal samples. The hub genes and differential interaction patterns identified were consistent with existing experimental studies. Furthermore, JDINAC discriminated the tumor and normal sample with high accuracy by virtue of the identified biomarkers. JDINAC provides a general framework for feature selection and classification using high-dimensional sparse omics data. R scripts available at https://github.com/jijiadong/JDINAC. lxie@iscb.org. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
NASA Technical Reports Server (NTRS)
Hen, Itay; Rieffel, Eleanor G.; Do, Minh; Venturelli, Davide
2014-01-01
There are two common ways to evaluate algorithms: performance on benchmark problems derived from real applications and analysis of performance on parametrized families of problems. The two approaches complement each other, each having its advantages and disadvantages. The planning community has concentrated on the first approach, with few ways of generating parametrized families of hard problems known prior to this work. Our group's main interest is in comparing approaches to solving planning problems using a novel type of computational device - a quantum annealer - to existing state-of-the-art planning algorithms. Because only small-scale quantum annealers are available, we must compare on small problem sizes. Small problems are primarily useful for comparison only if they are instances of parametrized families of problems for which scaling analysis can be done. In this technical report, we discuss our approach to the generation of hard planning problems from classes of well-studied NP-complete problems that map naturally to planning problems or to aspects of planning problems that many practical planning problems share. These problem classes exhibit a phase transition between easy-to-solve and easy-to-show-unsolvable planning problems. The parametrized families of hard planning problems lie at the phase transition. The exponential scaling of hardness with problem size is apparent in these families even at very small problem sizes, thus enabling us to characterize even very small problems as hard. The families we developed will prove generally useful to the planning community in analyzing the performance of planning algorithms, providing a complementary approach to existing evaluation methods. We illustrate the hardness of these problems and their scaling with results on four state-of-the-art planners, observing significant differences between these planners on these problem families. Finally, we describe two general, and quite different, mappings of planning problems to QUBOs, the form of input required for a quantum annealing machine such as the D-Wave II.
A Genomewide Linkage Scan of Cocaine Dependence and Major Depressive Episode in Two Populations
Yang, Bao-Zhu; Han, Shizhong; Kranzler, Henry R; Farrer, Lindsay A; Gelernter, Joel
2011-01-01
Cocaine dependence (CD) and major depressive episode (MDE) frequently co-occur with poorer treatment outcome and higher relapse risk. Shared genetic risk was affirmed; to date, there have been no reports of genomewide linkage scans (GWLSs) surveying the susceptibility regions for comorbid CD and MDE (CD–MDE). We aimed to identify chromosomal regions and candidate genes susceptible to CD, MDE, and CD–MDE in African Americans (AAs) and European Americans (EAs). A total of 1896 individuals were recruited from 384 AA and 355 EA families, each with at least a sibling-pair with CD and/or opioid dependence. Array-based genotyping of about 6000 single-nucleotide polymorphisms was completed for all individuals. Parametric and non-parametric genomewide linkage analyses were performed. We found a genomewide-significant linkage peak on chromosome 7 at 183.4 cM for non-parametric analysis of CD–MDE in AAs (lod=3.8, genomewide empirical p=0.016; point-wise p=0.00001). A nearly genomewide significant linkage was identified for CD–MDE in EAs on chromosome 5 at 14.3 cM (logarithm of odds (lod)=2.95, genomewide empirical p=0.055; point-wise p=0.00012). Parametric analysis corroborated the findings in these two regions and improved the support for the peak on chromosome 5 so that it reached genomewide significance (heterogeneity lod=3.28, genomewide empirical p=0.046; point-wise p=0.00053). This is the first GWLS for CD–MDE. The genomewide significant linkage regions on chromosomes 5 and 7 harbor four particularly promising candidate genes: SRD5A1, UBE3C, PTPRN2, and VIPR2. Replication of the linkage findings in other populations is warranted, as is a focused analysis of the genes located in the linkage regions implicated here. PMID:21849985
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models. PMID:21546994
Theory and Application of DNA Histogram Analysis.
ERIC Educational Resources Information Center
Bagwell, Charles Bruce
The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory is described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…
An Instructional Module on Mokken Scale Analysis
ERIC Educational Resources Information Center
Wind, Stefanie A.
2017-01-01
Mokken scale analysis (MSA) is a probabilistic-nonparametric approach to item response theory (IRT) that can be used to evaluate fundamental measurement properties with less strict assumptions than parametric IRT models. This instructional module provides an introduction to MSA as a probabilistic-nonparametric framework in which to explore…
HBCU Efficiency and Endowments: An Exploratory Analysis
ERIC Educational Resources Information Center
Coupet, Jason; Barnum, Darold
2010-01-01
Discussions of efficiency among Historically Black Colleges and Universities (HBCUs) are often missing in academic conversations. This article seeks to assess efficiency of individual HBCUs using Data Envelopment Analysis (DEA), a non-parametric technique that can synthesize multiple inputs and outputs to determine a single efficiency score for…
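For readers unfamiliar with DEA, the efficiency score of each decision-making unit can be posed as a small linear program. The hedged sketch below solves an input-oriented CCR model with SciPy's linprog on invented inputs and outputs; it illustrates the mechanics only and is not the authors' model or data.

```python
# Minimal sketch of an input-oriented CCR DEA model solved as a linear
# program with SciPy.  Inputs/outputs are invented for illustration only.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (e.g. institutions), columns = inputs / outputs
X = np.array([[40.0, 12.0], [55.0, 10.0], [30.0, 15.0], [50.0, 20.0]])  # inputs
Y = np.array([[100.0], [120.0], [90.0], [150.0]])                       # outputs
n, m = X.shape          # number of DMUs, number of inputs
s = Y.shape[1]          # number of outputs

def ccr_efficiency(o):
    """Efficiency of DMU o: minimize theta subject to envelopment constraints."""
    c = np.r_[1.0, np.zeros(n)]                 # decision vars: [theta, lambda_1..n]
    A_in = np.c_[-X[o], X.T]                    # sum_j lam_j x_ij <= theta * x_io
    A_out = np.c_[np.zeros(s), -Y.T]            # sum_j lam_j y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```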
Minati, Ludovico; Grisoli, Marina; Franceschetti, Silvana; Epifani, Francesca; Granvillano, Alice; Medford, Nick; Harrison, Neil A; Piacentini, Sylvie; Critchley, Hugo D
2012-01-01
Adaptive behaviour requires an ability to obtain rewards by choosing between different risky options. Financial gambles can be used to study effective decision-making experimentally, and to distinguish processes involved in choice option evaluation from outcome feedback and other contextual factors. Here, we used a paradigm where participants evaluated 'mixed' gambles, each presenting a potential gain and a potential loss and an associated variable outcome probability. We recorded neural responses using autonomic monitoring, electroencephalography (EEG) and functional neuroimaging (fMRI), and used a univariate, parametric design to test for correlations with the eleven economic parameters that varied across gambles, including expected value (EV) and amount magnitude. Consistent with behavioural economic theory, participants were risk-averse. Gamble evaluation generated detectable autonomic responses, but only weak correlations with outcome uncertainty were found, suggesting that peripheral autonomic feedback does not play a major role in this task. Long-latency stimulus-evoked EEG potentials were sensitive to expected gain and expected value, while alpha-band power reflected expected loss and amount magnitude, suggesting parallel representations of distinct economic qualities in cortical activation and central arousal. Neural correlates of expected value representation were localized using fMRI to ventromedial prefrontal cortex, while the processing of other economic parameters was associated with distinct patterns across lateral prefrontal, cingulate, insula and occipital cortices including default-mode network and early visual areas. These multimodal data provide complementary evidence for distributed substrates of choice evaluation across multiple, predominantly cortical, brain systems wherein distinct regions are preferentially attuned to specific economic features. Our findings extend biologically-plausible models of risky decision-making while providing potential biomarkers of economic representations that can be applied to the study of deficits in motivational behaviour in neurological and psychiatric patients.
Rephasing invariant parametrization of flavor mixing
NASA Astrophysics Data System (ADS)
Lee, Tae-Hun
A new rephasing invariant parametrization for the 3 x 3 CKM matrix, called the (x, y) parametrization, is introduced, and the properties and applications of the parametrization are discussed. The overall phase condition leads this parametrization to have only six rephasing invariant parameters and two constraints. Its simplicity and regularity become apparent when it is applied to the one-loop RGE (renormalization group equations) for the Yukawa couplings. The implications of this parametrization for unification of the Yukawa couplings are also explored.
Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D
2013-01-01
Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, the Significance Analysis of Microarrays fold change criteria are problematic and can critically alter the conclusion of a study, as a result of compositional changes of the control data set in the analysis. We propose a novel approach combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold change threshold since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control among the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next generation sequencing RNA-seq data analysis.
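As a rough illustration of the resampling ingredient (not the authors' Resampling-based empirical Bayes Methods), the sketch below builds a permutation null for per-gene t-statistics on synthetic, skewed data and applies a hand-rolled Benjamini-Hochberg adjustment.

```python
# Illustrative sketch only: a permutation (resampling) null for per-gene
# t-statistics with Benjamini-Hochberg FDR control.  This is NOT the
# authors' Resampling-based empirical Bayes method; it merely shows the
# resampling + multiple-testing ingredients on synthetic, skewed data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_genes, n_per_group = 500, 8
# Skewed (non-normal) expression data; first 50 genes truly shifted.
ctrl = rng.gamma(shape=2.0, scale=1.0, size=(n_genes, n_per_group))
case = rng.gamma(shape=2.0, scale=1.0, size=(n_genes, n_per_group))
case[:50] += 1.5

t_obs = stats.ttest_ind(case, ctrl, axis=1).statistic

# Resampling null: permute group labels B times and pool the statistics.
data = np.hstack([case, ctrl])
B, null = 200, []
for _ in range(B):
    perm = rng.permutation(2 * n_per_group)
    a, b = data[:, perm[:n_per_group]], data[:, perm[n_per_group:]]
    null.append(stats.ttest_ind(a, b, axis=1).statistic)
null = np.abs(np.concatenate(null))

# Permutation p-values and Benjamini-Hochberg adjustment.
p = (1 + np.sum(null[None, :] >= np.abs(t_obs)[:, None], axis=1)) / (1 + null.size)
order = np.argsort(p)
bh = np.minimum.accumulate((p[order] * n_genes / np.arange(1, n_genes + 1))[::-1])[::-1]
print("genes called at FDR 0.05:", int(np.sum(bh <= 0.05)))
```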
Borri, Marco; Schmidt, Maria A.; Powell, Ceri; Koh, Dow-Mu; Riddell, Angela M.; Partridge, Mike; Bhide, Shreerang A.; Nutting, Christopher M.; Harrington, Kevin J.; Newbold, Katie L.; Leach, Martin O.
2015-01-01
Purpose To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes. To evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. Material and Methods The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. Results The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4), determined with cluster validation, produced the best separation between reducing and non-reducing clusters. Conclusion The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes. PMID:26398888
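A minimal sketch of the partitioning step, assuming k-means with silhouette-based cluster validation as a stand-in for the authors' validated workflow; the two-feature synthetic "voxels" below stand in for the DCE-MRI/DWI parameter maps.

```python
# Minimal sketch of unsupervised partitioning of multi-parametric voxel data
# with k-means, using silhouette scores to pick the number of clusters.
# Synthetic two-feature "voxels" stand in for DCE-MRI / DWI parameters;
# this is an illustration, not the authors' validated workflow.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
voxels = np.vstack([rng.normal([1.0, 0.5], 0.15, size=(300, 2)),
                    rng.normal([0.3, 1.2], 0.15, size=(300, 2)),
                    rng.normal([0.8, 1.5], 0.15, size=(300, 2))])
X = StandardScaler().fit_transform(voxels)

for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k = {k}: mean silhouette = {silhouette_score(X, labels):.3f}")
```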
BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs
Eklund, Anders; Dufort, Paul; Villani, Mattias; LaConte, Stephen
2014-01-01
Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm3 brain template in 4–6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/). PMID:24672471
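The group-level permutation test that BROCCOLI accelerates can be illustrated in plain NumPy; the sketch below sign-flips synthetic subject contrast values and uses the maximum-statistic null to obtain a family-wise-error-corrected threshold. It is not BROCCOLI code.

```python
# Plain-NumPy sketch of a second-level (group) permutation test by sign
# flipping, the same idea BROCCOLI accelerates on GPUs (this is not
# BROCCOLI code; data are synthetic contrast values per subject and voxel).
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_voxels = 16, 2000
con = rng.normal(0.0, 1.0, size=(n_subjects, n_voxels))
con[:, :100] += 0.8                      # a small set of truly active voxels

def group_t(x):
    return x.mean(0) / (x.std(0, ddof=1) / np.sqrt(x.shape[0]))

t_obs = group_t(con)

# Null distribution of the maximum statistic (controls family-wise error).
n_perm = 1000
max_null = np.empty(n_perm)
for i in range(n_perm):
    signs = rng.choice([-1.0, 1.0], size=(n_subjects, 1))
    max_null[i] = group_t(signs * con).max()

threshold = np.quantile(max_null, 0.95)
print("FWE-corrected t threshold:", round(threshold, 2))
print("significant voxels:", int(np.sum(t_obs > threshold)))
```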
Explicit Trace Inequalities for Isogeometric Analysis and Parametric Hexahedral Finite Elements
2011-05-01
NASA Technical Reports Server (NTRS)
1972-01-01
Mission analysis is discussed, including the consolidation and expansion of mission equipment and experiment characteristics, and determination of simplified shuttle flight schedule. Parametric analysis of standard space hardware and preliminary shuttle/payload constraints analysis are evaluated, along with the cost impact of low cost standard hardware.
Hu, Leland S; Ning, Shuluo; Eschbacher, Jennifer M; Gaw, Nathan; Dueck, Amylou C; Smith, Kris A; Nakaji, Peter; Plasencia, Jonathan; Ranjbar, Sara; Price, Stephen J; Tran, Nhan; Loftus, Joseph; Jenkins, Robert; O'Neill, Brian P; Elmquist, William; Baxter, Leslie C; Gao, Fei; Frakes, David; Karis, John P; Zwart, Christine; Swanson, Kristin R; Sarkaria, Jann; Wu, Teresa; Mitchell, J Ross; Li, Jing
2015-01-01
Genetic profiling represents the future of neuro-oncology but suffers from inadequate biopsies in heterogeneous tumors like Glioblastoma (GBM). Contrast-enhanced MRI (CE-MRI) targets enhancing core (ENH) but yields adequate tumor in only ~60% of cases. Further, CE-MRI poorly localizes infiltrative tumor within surrounding non-enhancing parenchyma, or brain-around-tumor (BAT), despite the importance of characterizing this tumor segment, which universally recurs. In this study, we use multiple texture analysis and machine learning (ML) algorithms to analyze multi-parametric MRI, and produce new images indicating tumor-rich targets in GBM. We recruited primary GBM patients undergoing image-guided biopsies and acquired pre-operative MRI: CE-MRI, Dynamic-Susceptibility-weighted-Contrast-enhanced-MRI, and Diffusion Tensor Imaging. Following image coregistration and region of interest placement at biopsy locations, we compared MRI metrics and regional texture with histologic diagnoses of high- vs low-tumor content (≥80% vs <80% tumor nuclei) for corresponding samples. In a training set, we used three texture analysis algorithms and three ML methods to identify MRI-texture features that optimized model accuracy to distinguish tumor content. We confirmed model accuracy in a separate validation set. We collected 82 biopsies from 18 GBMs throughout ENH and BAT. The MRI-based model achieved 85% cross-validated accuracy to diagnose high- vs low-tumor in the training set (60 biopsies, 11 patients). The model achieved 81.8% accuracy in the validation set (22 biopsies, 7 patients). Multi-parametric MRI and texture analysis can help characterize and visualize GBM's spatial histologic heterogeneity to identify regional tumor-rich biopsy targets.
Design of plywood and paper flywheel rotors
NASA Astrophysics Data System (ADS)
Erdman, A. G.; Hagen, D. L.; Gaff, S. A.
1982-05-01
Technical and economic design factors of cellulosic rotors are compared with conventional materials for stationary flywheel energy storage systems. Wood species, operation in a vacuum, assembly and costs of rotors are evaluated. Wound kraft paper, twine and plywood rotors are examined. Two hub attachments are designed. Support stiffness is shown to be constrained by the material strength, rotor configuration and speed ratio. Preliminary duration-of-load tests were performed on vacuum-dried hexagonal birch plywood. Dynamic and static rotor hub fatigue equipment is designed. Moisture loss rates while vacuum drying plywood cylinders were measured, and the radial and axial diffusion coefficients were evaluated. Diffusion coefficients of epoxy-coated plywood cylinders were also obtained. Economics of cellulosic and conventional rotors were examined. Plywood rotor manufacturing costs were evaluated. The optimum economic shape for laminated rotors is shown to be cylindrical. Vacuum container costs are parametrically derived and based on material properties and costs. Containment costs are significant and are included in comparisons. The optimum design stress and wound rotor configuration are calculated for seventeen examples. Plywood rotors appear to be marginally competitive with the steel hose wire or E-glass rotors. High performance oriented kraft paper rotors potentially provide the lowest energy storage costs in stationary systems.
Advanced supersonic technology concept study: Hydrogen fueled configuration
NASA Technical Reports Server (NTRS)
Brewer, G. D.
1974-01-01
Conceptual designs of hydrogen fueled supersonic transport configurations for the 1990 time period were developed and compared with equivalent technology Jet A-1 fueled vehicles to determine the economic and performance potential of liquid hydrogen as an alternate fuel. Parametric evaluations of supersonic cruise vehicles with varying design and transport mission characteristics established the basis for selecting a preferred configuration which was then studied in greater detail. An assessment was made of the general viability of the selected concept including an evaluation of costs and environmental considerations, i.e., exhaust emissions and sonic boom characteristics. Technology development requirements and suggested implementation schedules are presented.
Evaluation of air quality indicators in Alberta, Canada - An international perspective.
Bari, Md Aynul; Kindzierski, Warren B
2016-01-01
There has been an increase in oil sands development in northern Alberta, Canada and an overall increase in economic activity in the province in recent years. An evaluation of the state of air quality was conducted in four Alberta locations - urban centers of Calgary and Edmonton, and smaller communities of Fort McKay and Fort McMurray in the Athabasca Oil Sands Region (AOSR). Concentration trends, diurnal hourly and monthly average concentration profiles, and exceedances of provincial, national and international air quality guidelines were assessed for several criteria air pollutants over the period 1998 to 2014. Two methods were used to evaluate trends. Parametric analysis of annual median 1h concentrations and non-parametric analysis of annual geometric mean 1h concentrations showed consistent decreasing trends for NO2 and SO2 (<1ppb per year), CO (<0.1ppm per year) at all stations, decreasing for THC (<0.1ppm per year) and increasing for O3 (≤0.52ppb per year) at most stations and unchanged for PM2.5 at all stations in Edmonton and Calgary over a 17-year period. Little consistency in trends was observed among the methods for the same air pollutants other than for THC (increasing in Fort McKay <0.1ppm per year and no trend in Fort McMurray), PM2.5 in Fort McKay and Fort McMurray (no trend) and CO (decreasing <0.1ppm per year in Fort McMurray) over the same period. Levels of air quality indicators at the four locations were compared with other Canadian and international urban areas to judge the current state of air quality. Median and annual average concentrations for Alberta locations tended to be the smallest in Fort McKay and Fort McMurray. Other than for PM2.5, Calgary and Edmonton tended to have median and annual average concentrations comparable to and/or below that of larger populated Canadian and U.S. cities, depending upon the air pollutant. Copyright © 2016 Elsevier Ltd. All rights reserved.
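The two trend-evaluation strategies can be reproduced in a few lines; the hedged sketch below contrasts a parametric least-squares slope with a non-parametric Theil-Sen slope and Kendall test on a synthetic annual series (not the Alberta monitoring data).

```python
# Hedged sketch of the two trend-evaluation approaches: a parametric
# least-squares slope on annual values versus a non-parametric Theil-Sen
# slope with a Kendall test.  The annual values below are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
years = np.arange(1998, 2015)
no2_annual = 14.0 - 0.3 * (years - 1998) + rng.normal(0, 0.8, years.size)  # ppb

# Parametric: ordinary least squares.
ols = stats.linregress(years, no2_annual)

# Non-parametric: Theil-Sen slope and Kendall's tau trend test.
sen_slope, sen_intercept, lo, hi = stats.theilslopes(no2_annual, years, 0.95)
tau, tau_p = stats.kendalltau(years, no2_annual)

print(f"OLS slope      : {ols.slope:+.2f} ppb/yr (p = {ols.pvalue:.3f})")
print(f"Theil-Sen slope: {sen_slope:+.2f} ppb/yr (Kendall p = {tau_p:.3f})")
```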
Castillo, Maria Isabel; Larsen, Emily; Cooke, Marie; Marsh, Nicole M; Wallis, Marianne C; Finucane, Julie; Brown, Peter; Mihala, Gabor; Carr, Peter J; Byrnes, Joshua; Walker, Rachel; Cable, Prudence; Zhang, Li; Sear, Candi; Jackson, Gavin; Rowsome, Anna; Ryan, Alison; Humphries, Julie C; Sivyer, Susan; Flanigan, Kathy; Rickard, Claire M
2018-05-14
Peripheral intravenous catheters (PIVCs) are frequently used in hospitals. However, PIVC complications are common, with failures leading to treatment delays, additional procedures, patient pain and discomfort, increased clinician workload and substantially increased healthcare costs. Recent evidence suggests integrated PIVC systems may be more effective than traditional non-integrated PIVC systems in reducing phlebitis, infiltration and costs and increasing functional dwell time. The study aim is to determine the efficacy, cost-utility and acceptability to patients and professionals of an integrated PIVC system compared with a non-integrated PIVC system. Two-arm, multicentre, randomised controlled superiority trial of integrated versus non-integrated PIVC systems to compare effectiveness on clinical and economic outcomes. Recruitment of 1560 patients over 2 years, with randomisation by a centralised service ensuring allocation concealment. Primary outcome: catheter failure (composite endpoint) for reasons of occlusion, infiltration/extravasation, phlebitis/thrombophlebitis, dislodgement, or localised or catheter-associated bloodstream infections. Secondary outcomes: first-time insertion success, types of PIVC failure, device colonisation, insertion pain, functional dwell time, adverse events, mortality, cost-utility and consumer acceptability. One PIVC per patient will be included, with intention-to-treat analysis. Baseline group comparisons will be made for potentially clinically important confounders. The proportional hazards assumption will be checked, and Cox regression will test the effect of group, patient, device and clinical variables on failure. An as-treated analysis will assess the effect of protocol violations. Kaplan-Meier survival curves with log-rank tests will compare failure by group over time. Secondary endpoints will be compared between groups using parametric/non-parametric techniques. Ethical approval from the Royal Brisbane and Women's Hospital Human Research Ethics Committee (HREC/16/QRBW/527), Griffith University Human Research Ethics Committee (Ref No. 2017/002) and the South Metropolitan Health Services Human Research Ethics Committee (Ref No. 2016-239). Results will be published in peer-reviewed journals. ACTRN12617000089336. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
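A hedged sketch of the planned time-to-event analyses, assuming the lifelines Python package and synthetic dwell-time data: Kaplan-Meier fits per arm, a log-rank comparison and a Cox model for the group effect. It is illustrative only, not the trial's analysis code.

```python
# Hedged sketch of Kaplan-Meier, log-rank and Cox analyses with lifelines
# on synthetic catheter dwell-time data (not the trial data or code).
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(5)
n = 200
group = rng.integers(0, 2, n)                                       # 0 = non-integrated, 1 = integrated
dwell = rng.exponential(scale=np.where(group == 1, 120.0, 90.0))    # hours
failed = (rng.random(n) < 0.6).astype(int)                          # 1 = failure, 0 = censored

df = pd.DataFrame({"dwell": dwell, "failed": failed, "group": group})

# Kaplan-Meier curve per arm and a log-rank comparison.
for g in (0, 1):
    km = KaplanMeierFitter().fit(df.dwell[df.group == g], df.failed[df.group == g])
    print(f"group {g}: median dwell = {km.median_survival_time_:.1f} h")
res = logrank_test(df.dwell[df.group == 0], df.dwell[df.group == 1],
                   event_observed_A=df.failed[df.group == 0],
                   event_observed_B=df.failed[df.group == 1])
print("log-rank p =", round(res.p_value, 3))

# Cox proportional hazards model for the group effect.
cph = CoxPHFitter().fit(df, duration_col="dwell", event_col="failed")
cph.print_summary()
```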
Back to the Future: Long-Term Seismic Archives Revisited
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2007-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades, helped by the deployment of seismic stations and their continued operation within the framework of monitoring seismic activity. These archives typically consist of waveforms of seismic events and associated parametric data such as phase arrival time picks and the location of hypocenters. Catalogs of earthquake locations are fundamental data in seismology, and even in the Earth sciences in general. Yet, these locations have notoriously low spatial resolution because of errors in both the picks and the models commonly used to locate events one at a time. This limits their potential to address fundamental questions concerning the physics of earthquakes, the structure and composition of the Earth's interior, and the seismic hazards associated with active faults. We report on the comprehensive use of modern waveform cross-correlation based methodologies for high-resolution earthquake location - as applied to regional and global long-term seismic databases. By simultaneous re-analysis of two decades of the digital seismic archive of Northern California, reducing pick errors via cross-correlation and model errors via double-differencing, we achieve up to three orders of magnitude resolution improvement over existing hypocenter locations. The relocated events image networks of discrete faults at seismogenic depths across various tectonic settings that until now have been hidden in location uncertainties. Similar location improvements are obtained for earthquakes recorded at global networks by re-processing 40 years of parametric data from the ISC and corresponding waveforms archived at IRIS. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of entire archives may thus become a routine procedure to continuously improve resolution in existing catalogs. We demonstrate the role of seismic archives in obtaining the precise location of new events in real-time. Such information has considerable social and economic impact in the evaluation and mitigation of seismic hazards, for example, and highlights the need for consistent long-term seismic monitoring and archiving of records.
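The core cross-correlation step can be illustrated compactly: the sketch below estimates the differential arrival time between two noisy copies of a synthetic waveform with SciPy. Real double-difference relocation additionally inverts many such differential times across station and event pairs.

```python
# Toy illustration of the cross-correlation idea used for precise relative
# arrival times: estimate the lag between two noisy copies of the same
# waveform.  Synthetic signals only; real double-difference relocation
# involves many station/event pairs and an inversion step.
import numpy as np
from scipy.signal import correlate, correlation_lags

fs = 100.0                                   # samples per second
t = np.arange(0, 10, 1 / fs)
wavelet = np.exp(-((t - 5.0) ** 2) / 0.02) * np.sin(2 * np.pi * 8 * (t - 5.0))

rng = np.random.default_rng(6)
true_shift = 0.07                            # seconds
trace_a = wavelet + 0.05 * rng.normal(size=t.size)
trace_b = np.interp(t - true_shift, t, wavelet) + 0.05 * rng.normal(size=t.size)

cc = correlate(trace_b, trace_a, mode="full")
lags = correlation_lags(trace_b.size, trace_a.size, mode="full")
dt = lags[np.argmax(cc)] / fs
print(f"estimated differential time: {dt:.3f} s (true {true_shift} s)")
```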
Testing the cosmic conservation of photon number with type Ia supernovae and ages of old objects
NASA Astrophysics Data System (ADS)
Jesus, J. F.; Holanda, R. F. L.; Dantas, M. A.
2017-12-01
In this paper, we obtain luminosity distances by using the ages of 32 old passive galaxies distributed over the redshift interval 0.11 < z < 1.84 and test the cosmic conservation of photon number by comparing them with 580 distance moduli of type Ia supernovae (SNe Ia) from the so-called Union 2.1 compilation. Our analyses are based on the fact that the method of obtaining ages of galaxies relies on the detailed shape of galaxy spectra but not on galaxy luminosity. Possible departures from cosmic conservation of photon number are parametrized by τ(z) = 2εz and τ(z) = εz/(1+z) (for ε = 0 the conservation of photon number is recovered). We find ε = 0.016^{+0.078}_{-0.075} from the first parametrization and ε = -0.18^{+0.25}_{-0.24} from the second parametrization, both limits at 95% c.l. In this way, no significant departure from cosmic conservation of photon number is verified. In addition, by considering the total age as inferred from the Planck (2015) analysis, we find the incubation time t_inc = 1.66 ± 0.29 Gyr and t_inc = 1.23 ± 0.27 Gyr at 68% c.l. for each parametrization, respectively.
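Assuming the relation commonly adopted in cosmic opacity tests between τ(z) and the supernova distance modulus, Δμ(z) = 2.5 τ(z)/ln 10 for flux attenuated by exp(-τ), the short sketch below shows the size of the offset implied by the two parametrizations at the best-fit ε values quoted above.

```python
# Hedged numerical sketch: size of the distance-modulus offset implied by
# the two opacity parametrizations, assuming the usual relation
# Delta_mu(z) = 2.5 * tau(z) / ln(10) for flux attenuated by exp(-tau).
# The epsilon values are the best fits quoted in the abstract.
import numpy as np

def tau_linear(z, eps):       # tau(z) = 2 * eps * z
    return 2.0 * eps * z

def tau_saturating(z, eps):   # tau(z) = eps * z / (1 + z)
    return eps * z / (1.0 + z)

z = np.array([0.1, 0.5, 1.0, 1.5, 1.84])
for name, tau, eps in [("2*eps*z", tau_linear, 0.016),
                       ("eps*z/(1+z)", tau_saturating, -0.18)]:
    dmu = 2.5 * tau(z, eps) / np.log(10)
    print(name, np.round(dmu, 4), "mag")
```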
NASA Astrophysics Data System (ADS)
Fernández-Llamazares, Álvaro; Belmonte, Jordina; Delgado, Rosario; De Linares, Concepción
2014-04-01
Airborne pollen records are a suitable indicator for the study of climate change. The present work focuses on the role of annual pollen indices for the detection of bioclimatic trends through the analysis of the aerobiological spectra of 11 taxa of great biogeographical relevance in Catalonia over an 18-year period (1994-2011), by means of different parametric and non-parametric statistical methods. Among others, two non-parametric rank-based statistical tests were performed for detecting monotonic trends in time series data of the selected airborne pollen types, and we have observed that they have similar power in detecting trends. Except for those cases in which the pollen data can be well modeled by a normal distribution, it is better to apply non-parametric statistical methods to aerobiological studies. Our results provide a reliable representation of the pollen trends in the region and suggest that greater pollen quantities are being liberated to the atmosphere in the last years, especially by Mediterranean taxa such as Pinus, Total Quercus and Evergreen Quercus, although the trends may differ geographically. Longer aerobiological monitoring periods are required to corroborate these results and survey the increasing levels of certain pollen types that could exert an impact in terms of public health.
Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S
2015-04-01
To investigate whether using a parametric statistic in comparing groups leads to different conclusions when using summative scores from rating scales compared with using their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in the change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined, using varying sample sizes, scale difficulties and person ability conditions. This simulation study revealed scaling artefacts that could arise from using summative scores rather than Rasch-based measures for determining the changes between groups. The group differences in the change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in the change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of the inference on group differences of summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which can allow valid parametric analyses of rating scale data.
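A simplified sketch of the simulation design, limited to the summative-score arm (the Rasch re-scoring step would require an IRT package and is omitted): ordinal item responses are simulated for two groups and a parametric one-way ANOVA is applied to the summed scores across Monte Carlo replicates.

```python
# Simplified Monte Carlo sketch of the comparison design: simulate rating-
# scale responses for two groups, form summative scores, and apply a
# parametric one-way ANOVA.  The Rasch-based re-scoring step is omitted;
# the point is the simulation structure, not a reproduction of the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_items, n_per_group, n_reps = 10, 50, 500
rejections = 0
for _ in range(n_reps):
    # Ordinal item responses (0-4); group B shifted slightly on the latent scale.
    latent_a = rng.normal(0.0, 1.0, n_per_group)
    latent_b = rng.normal(0.3, 1.0, n_per_group)
    items_a = np.clip(np.round(latent_a[:, None] + rng.normal(0, 1, (n_per_group, n_items))) + 2, 0, 4)
    items_b = np.clip(np.round(latent_b[:, None] + rng.normal(0, 1, (n_per_group, n_items))) + 2, 0, 4)
    f_stat, p = stats.f_oneway(items_a.sum(1), items_b.sum(1))
    rejections += (p < 0.05)

print("rejection rate with summative scores:", rejections / n_reps)
```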
The urban heat island in Rio de Janeiro, Brazil, in the last 30 years using remote sensing data
NASA Astrophysics Data System (ADS)
Peres, Leonardo de Faria; Lucena, Andrews José de; Rotunno Filho, Otto Corrêa; França, José Ricardo de Almeida
2018-02-01
The aim of this work is to study urban heat island (UHI) in Metropolitan Area of Rio de Janeiro (MARJ) based on the analysis of land-surface temperature (LST) and land-use patterns retrieved from Landsat-5/Thematic Mapper (TM), Landsat-7/Enhanced Thematic Mapper Plus (ETM+) and Landsat-8/Operational Land Imager (OLI) and Thermal Infrared Sensors (TIRS) data covering a 32-year period between 1984 and 2015. LST temporal evolution is assessed by comparing the average LST composites for 1984-1999 and 2000-2015 where the parametric Student t-test was conducted at 5% significance level to map the pixels where LST for the more recent period is statistically significantly greater than the previous one. The non-parametric Mann-Whitney-Wilcoxon rank sum test has also confirmed at the same 5% significance level that the more recent period (2000-2015) has higher LST values. UHI intensity between "urban" and "rural/urban low density" ("vegetation") areas for 1984-1999 and 2000-2015 was established and confirmed by both parametric and non-parametric tests at 1% significance level as 3.3 °C (5.1 °C) and 4.4 °C (7.1 °C), respectively. LST has statistically significantly (p-value < 0.01) increased over time in two of three land cover classes ("urban" and "urban low density"), respectively by 1.9 °C and 0.9 °C, except in "vegetation" class. A spatial analysis was also performed to identify the urban pixels within MARJ where UHI is more intense by subtracting the LST of these pixels from the LST mean value of "vegetation" land-use class.
NASA Astrophysics Data System (ADS)
Guoqing, Zhang; Junxin, Li; Jin, Li; Chengguang, Zhang; Zefeng, Xiao
2018-04-01
To fabricate porous implants with improved biocompatibility and mechanical properties that are matched to their application using selective laser melting (SLM), flow within the mold and compressive properties and performance of the porous structures must be comprehensively studied. Parametric modeling was used to build 3D models of octahedron and hexahedron structures. Finite element analysis was used to evaluate the mold flow and compressive properties of the parametric porous structures. A DiMetal-100 SLM molding apparatus was used to manufacture the porous structures and the results evaluated by light microscopy. The results showed that parametric modeling can produce robust models. Square structures caused higher blood cell adhesion than cylindrical structures. "Vortex" flow in square structures resulted in chaotic distribution of blood elements, whereas they were mostly distributed around the connecting parts in the cylindrical structures. No significant difference in elastic moduli or compressive strength was observed in square and cylindrical porous structures of identical characteristics. Hexahedron, square and cylindrical porous structures had the same stress-strain properties. For octahedron porous structures, cylindrical structures had higher stress-strain properties. Using these modeling and molding results, an important basis for designing and the direct manufacture of fixed biological implants is provided.
Parametric Weight Study of Cryogenic Metallic Tanks for the "Bimodal" NTR Mars Vehicle Concept
NASA Astrophysics Data System (ADS)
Kosareo, Daniel N.; Roche, Joseph M.
2006-01-01
A parametric weight assessment of large cryogenic metallic tanks was conducted using the design optimization capabilities in the ANSYS® finite element analysis code. This analysis was performed to support the sizing of a "bimodal" nuclear thermal rocket (NTR) Mars vehicle concept developed at the NASA Glenn Research Center. The tank design study was driven by two load conditions: an in-line, "Shuttle-derived" heavy-lift launch with the tanks filled and pressurized, and a burst-test pressure. The main tank structural arrangement is a state-of-the-art metallic construction which uses an aluminum-lithium alloy stiffened internally with a ring and stringer framework. The tanks must carry liquid hydrogen in separate launches to orbit, where all vehicle components will dock and mate. All tank designs stayed within the available mass and payload volume limits of both the in-line heavy-lift and Shuttle-derived launch vehicles. Weight trends were developed over a range of tank lengths with varying stiffener cross-sections and tank wall thicknesses. The object of this parametric study was to verify that the proper mass was allocated for the tanks in the overall vehicle sizing model. This paper summarizes the tank weights over a range of tank lengths.
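A back-of-the-envelope parametric sweep, not the ANSYS optimization reported in the paper, illustrates the kind of mass-versus-length trend involved: smeared shell mass of a cylindrical tank with hemispherical domes for an assumed radius, a range of barrel lengths and equivalent wall thicknesses.

```python
# Back-of-the-envelope parametric sweep (not the paper's ANSYS model):
# smeared shell mass of a cylindrical hydrogen tank with hemispherical
# domes versus barrel length and equivalent wall thickness.  The density
# is representative of an aluminum-lithium alloy; all numbers are assumed.
import numpy as np

rho = 2700.0                                    # kg/m^3, assumed Al-Li density
radius = 4.2                                    # m, assumed tank radius
lengths = np.arange(10.0, 26.0, 5.0)            # m, barrel length sweep
thicknesses = np.array([0.004, 0.006, 0.008])   # m, smeared (skin + stiffener) thickness

for t in thicknesses:
    area = 2 * np.pi * radius * lengths + 4 * np.pi * radius ** 2   # barrel + domes
    mass = rho * t * area
    print(f"thickness {t*1000:.0f} mm -> shell mass [tonnes]:",
          np.round(mass / 1000, 1), "for L =", lengths, "m")
```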
Evaluation of Second-Level Inference in fMRI Analysis
Roels, Sanne P.; Loeys, Tom; Moerkerke, Beatrijs
2016-01-01
We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) the data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of 3 phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subjects) variability and models that do not take into account this variability. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate 3 commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with minimal cluster size. Based on a simulation study and real data we find that the two-step procedure with minimal cluster size results in the most stable results, followed by the familywise error rate correction. The FDR results in the most variable results, for both permutation-based inference and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference. PMID:26819578
Parametric Structural Model for a Mars Entry Concept
NASA Technical Reports Server (NTRS)
Lane, Brittney M.; Ahmed, Samee W.
2017-01-01
This paper outlines the process of developing a parametric model for a vehicle that can withstand Earth launch and Mars entry conditions. This model allows the user to change a variety of parameters ranging from dimensions and meshing to materials and atmospheric entry angles to perform finite element analysis on the model for the specified load cases. While this work focuses on an aeroshell for Earth launch aboard the Space Launch System (SLS) and Mars entry, the model can be applied to different vehicles and destinations. This specific project derived from the need to deliver large payloads to Mars efficiently, safely, and cheaply. Doing so requires minimizing the structural mass of the body as much as possible. The code developed for this project allows for dozens of cases to be run with the single click of a button. The end result of the parametric model gives the user a sense of how the body reacts under different loading cases so that it can be optimized for its purpose. The data are reported in this paper and can provide engineers with a good understanding of the model and valuable information for improving the design of the vehicle. In addition, conclusions show that the frequency analysis drives the design and suggestions are made to reduce the significance of normal modes in the design.
Cryogenic storage tank thermal analysis
NASA Technical Reports Server (NTRS)
Wright, J. P.
1976-01-01
Parametric study discusses relationship between cryogenic boil-off and factors such as tank size, insulation thickness and performance, structural-support heat leaks and use of vapor-cooled shields. Data presented as series of nomographs and curves.
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Van, Luong
1992-01-01
The objectives of this paper are to develop a multidisciplinary computational methodology to predict the hot-gas-side and coolant-side heat transfer, and to use it in parametric studies to recommend an optimized design of the coolant channels for a regeneratively cooled liquid rocket engine combustor. An integrated numerical model, which incorporates CFD for the hot-gas thermal environment and thermal analysis for the liner and coolant channels, was developed. This integrated CFD/thermal model was validated by comparing predicted heat fluxes with those of hot-firing tests and industrial design methods for a 40K calorimeter thrust chamber and the Space Shuttle Main Engine Main Combustion Chamber. Parametric studies were performed for the Advanced Main Combustion Chamber to find a strategy for a proposed combustion chamber coolant channel design.
Bayesian hierarchical functional data analysis via contaminated informative priors.
Scarpa, Bruno; Dunson, David B
2009-09-01
A variety of flexible approaches have been proposed for functional data analysis, allowing both the mean curve and the distribution about the mean to be unknown. Such methods are most useful when there is limited prior information. Motivated by applications to modeling of temperature curves in the menstrual cycle, this article proposes a flexible approach for incorporating prior information in semiparametric Bayesian analyses of hierarchical functional data. The proposed approach is based on specifying the distribution of functions as a mixture of a parametric hierarchical model and a nonparametric contamination. The parametric component is chosen based on prior knowledge, while the contamination is characterized as a functional Dirichlet process. In the motivating application, the contamination component allows unanticipated curve shapes in unhealthy menstrual cycles. Methods are developed for posterior computation, and the approach is applied to data from a European fecundability study.
Sorting of Streptomyces Cell Pellets Using a Complex Object Parametric Analyzer and Sorter
Petrus, Marloes L. C.; van Veluw, G. Jerre; Wösten, Han A. B.; Claessen, Dennis
2014-01-01
Streptomycetes are filamentous soil bacteria that are used in industry for the production of enzymes and antibiotics. When grown in bioreactors, these organisms form networks of interconnected hyphae, known as pellets, which are heterogeneous in size. Here we describe a method to analyze and sort mycelial pellets using a Complex Object Parametric Analyzer and Sorter (COPAS). Detailed instructions are given for the use of the instrument and the basic statistical analysis of the data. We furthermore describe how pellets can be sorted according to user-defined settings, which enables downstream processing such as the analysis of the RNA or protein content. Using this methodology the mechanism underlying heterogeneous growth can be tackled. This will be instrumental for improving streptomycetes as a cell factory, considering the fact that productivity correlates with pellet size. PMID:24561666
Research on ponderomotive driven Vlasov–Poisson system in electron acoustic wave parametric region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiao, C. Z.; Huang, T. W.; Liu, Z. J.
2014-03-15
Theoretical analysis and corresponding 1D Particle-in-Cell (PIC) simulations of a ponderomotive-driven Vlasov–Poisson system in the electron acoustic wave (EAW) parametric region are demonstrated. Theoretical analysis identifies that, under the resonant condition, a monochromatic EAW can be excited when the wave number of the drive ponderomotive force satisfies 0.26 ≲ k_d λ_D ≲ 0.53. If k_d λ_D ≲ 0.26, nonlinear superpositions of harmonic waves can be resonantly excited, called kinetic electrostatic electron nonlinear waves. Numerical simulations have demonstrated these wave excitation and evolution dynamics, consistent with the theoretical predictions. The physical nature of these two waves is supposed to be the interaction of harmonic waves, and their similar phase space properties are also discussed.
Trend analysis of Arctic sea ice extent
NASA Astrophysics Data System (ADS)
Silva, M. E.; Barbosa, S. M.; Antunes, Luís; Rocha, Conceição
2009-04-01
The extent of Arctic sea ice is a fundamental parameter of Arctic climate variability. In the context of climate change, the area covered by ice in the Arctic is a particularly useful indicator of recent changes in the Arctic environment. Climate models are in near universal agreement that Arctic sea ice extent will decline through the 21st century as a consequence of global warming, and many studies predict an ice-free Arctic as soon as 2012. Time series of satellite passive microwave observations allow the temporal changes in the extent of Arctic sea ice to be assessed. Much of the analysis of the ice extent time series, as in most climate studies from observational data, has been focused on the computation of deterministic linear trends by ordinary least squares. However, many different processes, including deterministic, unit root and long-range dependent processes, can engender trend-like features in a time series. Several parametric tests have been developed, mainly in econometrics, to discriminate between stationarity (no trend), deterministic trends and stochastic trends. Here, these tests are applied in the trend analysis of the sea ice extent time series available at the National Snow and Ice Data Center. The parametric stationarity tests, Augmented Dickey-Fuller (ADF), Phillips-Perron (PP) and KPSS, do not support an overall deterministic trend in the time series of Arctic sea ice extent. Therefore, alternative parametrizations such as long-range dependence should be considered for characterising long-term Arctic sea ice variability.
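Two of the stationarity tests named above (ADF and KPSS) are available in statsmodels; the hedged sketch below applies them to a synthetic trending series (the Phillips-Perron test is not implemented in statsmodels and is omitted here).

```python
# Hedged sketch of unit-root / stationarity testing (ADF, KPSS) applied to
# a synthetic trending series via statsmodels; not the NSIDC data.
import numpy as np
from statsmodels.tsa.stattools import adfuller, kpss

rng = np.random.default_rng(8)
years = np.arange(1979, 2009)
extent = 7.5 - 0.05 * (years - 1979) + rng.normal(0, 0.2, years.size)   # synthetic, 10^6 km^2

adf_stat, adf_p, *_ = adfuller(extent, regression="ct")                 # H0: unit root
kpss_stat, kpss_p, *_ = kpss(extent, regression="ct", nlags="auto")     # H0: trend-stationary

print(f"ADF  p-value: {adf_p:.3f}  (small p rejects a unit root)")
print(f"KPSS p-value: {kpss_p:.3f}  (small p rejects trend stationarity)")
```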
Robustness against parametric noise of nonideal holonomic gates
NASA Astrophysics Data System (ADS)
Lupo, Cosmo; Aniello, Paolo; Napolitano, Mario; Florio, Giuseppe
2007-07-01
Holonomic gates for quantum computation are commonly considered to be robust against certain kinds of parametric noise, the cause of this robustness being the geometric character of the transformation achieved in the adiabatic limit. On the other hand, the effects of decoherence are expected to become more and more relevant when the adiabatic limit is approached. Starting from the system described by Florio [Phys. Rev. A 73, 022327 (2006)], here we discuss the behavior of nonideal holonomic gates at finite operational time, i.e., long before the adiabatic limit is reached. We have considered several models of parametric noise and studied the robustness of finite-time gates. The results obtained suggest that the finite-time gates present some effects of cancellation of the perturbations introduced by the noise which mimic the geometrical cancellation effect of standard holonomic gates. Nevertheless, a careful analysis of the results leads to the conclusion that these effects are related to a dynamical instead of a geometrical feature.
Sleep analysis for wearable devices applying autoregressive parametric models.
Mendez, M O; Villantieri, O; Bianchi, A; Cerutti, S
2005-01-01
We applied time-variant and time-invariant parametric models to recordings from both healthy subjects and patients with sleep disorders in order to assess the suitability of these approaches for sleep disorder diagnosis in wearable devices. The recordings exhibit the Obstructive Sleep Apnea (OSA) pathology, which is characterized by fluctuations in the heart rate: bradycardia during the apneic phase and tachycardia at the recovery of ventilation. Data come from the PhysioNet web database (www.physionet.org). During OSA, the spectral indexes obtained by time-variant lattice filters presented oscillations that correspond to the brady-tachycardia changes of the RR intervals, and greater values than in healthy subjects. Multivariate autoregressive models showed an increment in the very low frequency component (PVLF) at each apneic event. A rise in the high frequency component (PHF) also occurred at the restoration of breathing in the spectrum of both the quadratic coherence and the cross-spectrum in OSA. These autoregressive parametric approaches could help in the diagnosis of sleep disorders within wearable devices.
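A minimal NumPy sketch of the parametric (autoregressive) spectral-estimation idea, applied to a synthetic RR-interval series with a slow apnea-like oscillation; it is a time-invariant AR fit, not the authors' time-variant lattice implementation.

```python
# Minimal sketch of parametric (autoregressive) spectral estimation of an
# RR-interval series.  The series below is synthetic, with a slow
# oscillation mimicking apnea-related brady-tachycardia.
import numpy as np

rng = np.random.default_rng(9)
n = 512
t = np.arange(n)                                    # one sample per beat (simplified)
rr = 0.9 + 0.05 * np.sin(2 * np.pi * 0.02 * t) + 0.01 * rng.normal(size=n)
x = rr - rr.mean()

# Fit AR(p) coefficients by least squares: x[k] ~ sum_j a_j * x[k-j].
p = 8
X = np.column_stack([x[p - j - 1:n - j - 1] for j in range(p)])
a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
resid_var = np.var(x[p:] - X @ a)

# Parametric PSD from the fitted AR model.
freqs = np.linspace(0, 0.5, 256)
denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, p + 1))) @ a) ** 2
psd = resid_var / denom
print("dominant frequency (cycles/beat):", freqs[np.argmax(psd[1:]) + 1])
```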
Membrane reactor for water detritiation: a parametric study on operating parameters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mascarade, J.; Liger, K.; Troulay, M.
2015-03-15
This paper presents the results of a parametric study done on a single-stage finger-type packed-bed membrane reactor (PBMR) used for heavy water vapor de-deuteration. Parametric studies have been done on 3 operating parameters, which are: the membrane temperature, the total feed flow rate and the feed composition through D2O content variations. Thanks to mass spectrometer analysis of the streams leaving the PBMR, speciation of deuterated species was achieved. Measurement of the amounts of each molecular component allowed the calculation of the reaction quotient at the packed-bed outlet. While temperature variation mainly influences permeation efficiency, feed flow rate perturbation reveals the dependence of conversion and permeation properties on the contact time between catalyst and reacting mixture. The study shows that the isotopic exchange reactions occurring on the catalyst particle surfaces are not thermodynamically balanced. Moreover, the variation of the heavy water content in the feed exhibits competition between permeation and conversion kinetics.
Ionescu, Crina-Maria; Geidl, Stanislav; Svobodová Vařeková, Radka; Koča, Jaroslav
2013-10-28
We focused on the parametrization and evaluation of empirical models for fast and accurate calculation of conformationally dependent atomic charges in proteins. The models were based on the electronegativity equalization method (EEM), and the parametrization procedure was tailored to proteins. We used large protein fragments as reference structures and fitted the EEM model parameters using atomic charges computed by three population analyses (Mulliken, Natural, iterative Hirshfeld), at the Hartree-Fock level with two basis sets (6-31G*, 6-31G**) and in two environments (gas phase, implicit solvation). We parametrized and successfully validated 24 EEM models. When tested on insulin and ubiquitin, all models reproduced quantum mechanics level charges well and were consistent with respect to population analysis and basis set. Specifically, the models showed on average a correlation of 0.961, RMSD 0.097 e, and average absolute error per atom 0.072 e. The EEM models can be used with the freely available EEM implementation EEM_SOLVER.
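The structure of the EEM calculation itself is a small linear system; the sketch below solves it for a toy three-atom case with invented A, B and kappa placeholders (not the fitted parameters of the 24 models reported here) purely to show the linear-algebra form.

```python
# Hedged sketch of the EEM linear system (not the authors' fitted
# parameters): for each atom, A_i + B_i*q_i + kappa * sum_j q_j / R_ij
# equals the equalized electronegativity chi_bar, with the charges
# constrained to the total molecular charge.  A, B and kappa below are
# invented placeholders used only to show the structure.
import numpy as np

coords = np.array([[0.0, 0.0, 0.0], [0.96, 0.0, 0.0], [-0.24, 0.93, 0.0]])  # Angstrom
A = np.array([2.5, 2.0, 2.0])      # placeholder electronegativity-like terms
B = np.array([8.0, 6.0, 6.0])      # placeholder hardness-like terms
kappa, total_charge = 1.0, 0.0
n = len(A)

R = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)

# Unknowns: q_1..q_n and chi_bar.
M = np.zeros((n + 1, n + 1))
rhs = np.zeros(n + 1)
for i in range(n):
    M[i, i] = B[i]
    for j in range(n):
        if j != i:
            M[i, j] = kappa / R[i, j]
    M[i, n] = -1.0                 # coefficient of -chi_bar
    rhs[i] = -A[i]
M[n, :n] = 1.0                     # sum of charges = total charge
rhs[n] = total_charge

solution = np.linalg.solve(M, rhs)
charges, chi_bar = solution[:n], solution[n]
print("EEM charges:", np.round(charges, 3), " chi_bar:", round(chi_bar, 3))
```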
Non-Parametric Collision Probability for Low-Velocity Encounters
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
2007-01-01
An implicit, but not necessarily obvious, assumption in all of the current techniques for assessing satellite collision probability is that the relative position uncertainty is perfectly correlated in time. If there is any mis-modeling of the dynamics in the propagation of the relative position error covariance matrix, time-wise de-correlation of the uncertainty will increase the probability of collision over a given time interval. The paper gives some examples that illustrate this point. This paper argues that, for the present, Monte Carlo analysis is the best available tool for handling low-velocity encounters, and suggests some techniques for addressing the issues just described. One proposal is for the use of a non-parametric technique that is widely used in actuarial and medical studies. The other suggestion is that accurate process noise models be used in the Monte Carlo trials to which the non-parametric estimate is applied. A further contribution of this paper is a description of how the time-wise decorrelation of uncertainty increases the probability of collision.
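As a rough illustration of the Monte Carlo approach advocated for low-velocity encounters, the sketch below samples relative states from an assumed Gaussian uncertainty, propagates them linearly across the encounter window and counts miss distances below the combined hard-body radius; the paper's non-parametric estimator and process-noise modelling are not reproduced.

```python
# Illustrative Monte Carlo estimate of collision probability for a slow
# encounter.  All numbers are made up; the paper's non-parametric
# (actuarial-style) estimator and process-noise models are not reproduced.
import numpy as np

rng = np.random.default_rng(10)
n_trials = 20_000
hard_body_radius = 20.0                            # m, combined

rel_pos_mean = np.array([150.0, 40.0, -30.0])      # m
rel_vel_mean = np.array([-0.20, 0.05, 0.02])       # m/s (low-velocity encounter)
pos_sigma, vel_sigma = 60.0, 0.02

pos = rel_pos_mean + pos_sigma * rng.normal(size=(n_trials, 3))
vel = rel_vel_mean + vel_sigma * rng.normal(size=(n_trials, 3))

t = np.linspace(0.0, 3600.0, 61)                   # one-hour encounter window
# Minimum separation over the window for each trial (linear relative motion).
sep = np.linalg.norm(pos[:, None, :] + t[None, :, None] * vel[:, None, :], axis=-1)
hits = sep.min(axis=1) < hard_body_radius
p_hat = hits.mean()
stderr = np.sqrt(p_hat * (1 - p_hat) / n_trials)
print(f"Pc ~ {p_hat:.2e} +/- {1.96 * stderr:.1e} (95% CI)")
```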
NASA Astrophysics Data System (ADS)
Czerwiński, Andrzej; Łuczko, Jan
2018-01-01
The paper summarises the experimental investigations and numerical simulations of non-planar parametric vibrations of a statically deformed pipe. Underpinning the theoretical analysis is a 3D dynamic model of curved pipe. The pipe motion is governed by four non-linear partial differential equations with periodically varying coefficients. The Galerkin method was applied, the shape function being that governing the beam's natural vibrations. Experiments were conducted in the range of simple and combination parametric resonances, evidencing the possibility of in-plane and out-of-plane vibrations as well as fully non-planar vibrations in the combination resonance range. It is demonstrated that sub-harmonic and quasi-periodic vibrations are likely to be excited. The method suggested allows the spatial modes to be determined basing on results registered at selected points in the pipe. Results are summarised in the form of time histories, phase trajectory plots and spectral diagrams. Dedicated video materials give us a better insight into the investigated phenomena.
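Parametric resonance of the kind investigated here can be illustrated with a single-degree-of-freedom damped Mathieu-type oscillator; the sketch below (not the four-PDE Galerkin pipe model) integrates it near the principal resonance, where the excitation frequency is about twice the natural frequency.

```python
# Single-degree-of-freedom illustration of parametric resonance (a damped
# Mathieu-type oscillator with a stabilizing cubic term), not the paper's
# four-PDE Galerkin pipe model: near the principal resonance the response
# grows from a small seed until bounded by damping/nonlinearity.
import numpy as np
from scipy.integrate import solve_ivp

omega0, zeta, eps = 1.0, 0.01, 0.15     # natural freq, damping ratio, excitation amplitude
Omega = 2.0 * omega0                    # principal parametric resonance

def rhs(t, y):
    x, v = y
    stiffness = omega0 ** 2 * (1.0 + eps * np.cos(Omega * t))
    return [v, -2.0 * zeta * omega0 * v - stiffness * x - 0.05 * x ** 3]

sol = solve_ivp(rhs, (0.0, 400.0), [0.01, 0.0], max_step=0.05)
print("late-time response amplitude:", round(np.abs(sol.y[0][-2000:]).max(), 2),
      "(initial amplitude 0.01)")
```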
Automated, Parametric Geometry Modeling and Grid Generation for Turbomachinery Applications
NASA Technical Reports Server (NTRS)
Harrand, Vincent J.; Uchitel, Vadim G.; Whitmire, John B.
2000-01-01
The objective of this Phase I project is to develop a highly automated software system for rapid geometry modeling and grid generation for turbomachinery applications. The proposed system features a graphical user interface for interactive control, a direct interface to commercial CAD/PDM systems, support for IGES geometry output, and a scripting capability for obtaining a high level of automation and end-user customization of the tool. The developed system is fully parametric and highly automated, and, therefore, significantly reduces the turnaround time for 3D geometry modeling, grid generation and model setup. This facilitates design environments in which a large number of cases need to be generated, such as for parametric analysis and design optimization of turbomachinery equipment. In Phase I we have successfully demonstrated the feasibility of the approach. The system has been tested on a wide variety of turbomachinery geometries, including several impellers and a multi stage rotor-stator combination. In Phase II, we plan to integrate the developed system with turbomachinery design software and with commercial CAD/PDM software.
Bifurcation analysis of eight coupled degenerate optical parametric oscillators
NASA Astrophysics Data System (ADS)
Ito, Daisuke; Ueta, Tetsushi; Aihara, Kazuyuki
2018-06-01
A degenerate optical parametric oscillator (DOPO) network realized as a coherent Ising machine can be used to solve combinatorial optimization problems. Both theoretical and experimental investigations into the performance of DOPO networks have been presented previously. However a problem remains, namely that the dynamics of the DOPO network itself can lower the search success rates of globally optimal solutions for Ising problems. This paper shows that the problem is caused by pitchfork bifurcations due to the symmetry structure of coupled DOPOs. Some two-parameter bifurcation diagrams of equilibrium points express the performance deterioration. It is shown that the emergence of non-ground states regarding local minima hampers the system from reaching the ground states corresponding to the global minimum. We then describe a parametric strategy for leading a system to the ground state by actively utilizing the bifurcation phenomena. By adjusting the parameters to break particular symmetry, we find appropriate parameter sets that allow the coherent Ising machine to obtain the globally optimal solution alone.
Feasibility Study of a Satellite Solar Power Station
NASA Technical Reports Server (NTRS)
Glaser, P. E.; Maynard, O. E.; Mackovciak, J. J. R.; Ralph, E. I.
1974-01-01
A feasibility study of a satellite solar power station (SSPS) was conducted to: (1) explore how an SSPS could be flown and controlled in orbit; (2) determine the techniques needed to avoid radio frequency interference (RFI); and (3) determine the key environmental, technological, and economic issues involved. Structural and dynamic analyses of the SSPS structure were performed, and deflections and internal member loads were determined. Desirable material characteristics were assessed and technology developments identified. Flight control performance of the SSPS baseline design was evaluated and parametric sizing studies were performed. The study of RFI avoidance techniques covered (1) optimization of the microwave transmission system; (2) device design and expected RFI; and (3) SSPS RFI effects. The identification of key issues involved (1) microwave generation, transmissions, and rectification and solar energy conversion; (2) environmental-ecological impact and biological effects; and (3) economic issues, i.e., costs and benefits associated with the SSPS. The feasibility of the SSPS based on the parameters of the study was established.
Economic and social measures of biologic and climatic change. CIAP monograph 6. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1975-09-01
The Climatic Impact Assessment Program (CIAP) of the U.S. Department of Transportation is charged with the 'assessment' of the impact of future aircraft fleets and other vehicles operating in, or transiting through the stratosphere. CIAP monograph 6 addresses and conceptualizes socio-economic considerations for evaluating the terrestrial impact of, and potential necessary controls for, pollutant emissions in the stratosphere. It is best viewed as an exercise that utilizes methods developed in cost-benefit analyses and concentrates its attention on the conceptualization and measurement of the environmental costs associated with stratospheric travel. A set of parametric changes in particular climatic variables is considered. The results reported in this monograph are not considered definitive, because the monograph as a whole attacks a new problem, and one of a scale not confronted by economics or other social sciences in the past. However, one central fact, essentially independent of the investigative methods used, does emerge: seemingly small changes in mankind's climatic environment may give rise to subtle, diverse, but significant economic impacts and that these impacts, rather than being evenly spread across the economy, are likely to be specialized by activity, by location, or by both within nations and among nations. Discussed is the impact on fuel consumption, agriculture, forestry, materials degradation and wealth. (GRA)
NASA Astrophysics Data System (ADS)
Zhang, Lei; Yang, Si-Gang; Wang, Xiao-Jian; Gou, Dou-Dou; Chen, Hong-Wei; Chen, Ming-Hua; Xie, Shi-Zhong
2014-01-01
We report the experimental demonstration of the optical parametric gain generation in the 1 μm regime based on a photonic crystal fiber (PCF) with a zero group velocity dispersion (GVD) wavelength of 1062 nm pumped by a homemade tunable picosecond mode-locked ytterbium-doped fiber laser. A broad parametric gain band is obtained by pumping the PCF in the anomalous GVD regime with a relatively low power. Two separated narrow parametric gain bands are observed by pumping the PCF in the normal GVD regime. The peak of the parametric gain profile can be tuned from 927 to 1038 nm and from 1099 to 1228 nm. This widely tunable parametric gain band can be used for a broad band optical parametric amplifier, large span wavelength conversion or a tunable optical parametric oscillator.
Murphy, J R; Wasserman, S S; Baqar, S; Schlesinger, L; Ferreccio, C; Lindberg, A A; Levine, M M
1989-01-01
Experiments were performed in Baltimore, Maryland, and in Santiago, Chile, to determine the level of Salmonella typhi antigen-driven in vitro lymphocyte replication response which signifies specific acquired immunity to this bacterium, and to determine the best method of data analysis and form of data presentation. Lymphocyte replication was measured as incorporation of 3H-thymidine into deoxyribonucleic acid. Data (ct/min/culture) were analyzed in raw form and following log transformation, by non-parametric and parametric statistical procedures. A preference was developed for log-transformed data and discriminant analysis. Discriminant analysis of log-transformed data revealed that 3H-thymidine incorporation rates greater than 3,433 for particulate S. typhi Ty2 antigen-stimulated cultures signified acquired immunity at a sensitivity and specificity of 82.7%; for soluble S. typhi O polysaccharide antigen-stimulated cultures, ct/min/culture values greater than 1,237 signified immunity (sensitivity and specificity 70.5%). PMID:2702777
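A minimal sketch of the kind of analysis described above: log transformation of count data followed by linear discriminant analysis, with sensitivity and specificity read off a confusion matrix. The sample sizes, distributions, and implied cutoff below are synthetic stand-ins, not the study's values.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import confusion_matrix

# Hypothetical ct/min/culture values for non-immune (0) and immune (1) subjects
rng = np.random.default_rng(1)
cpm = np.concatenate([rng.lognormal(7.5, 0.8, 40), rng.lognormal(8.8, 0.8, 40)])
immune = np.concatenate([np.zeros(40, int), np.ones(40, int)])

X = np.log(cpm).reshape(-1, 1)          # log transformation, as preferred in the study
lda = LinearDiscriminantAnalysis().fit(X, immune)
pred = lda.predict(X)

tn, fp, fn, tp = confusion_matrix(immune, pred).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
# The implied cutoff on the original ct/min/culture scale is the exponential of
# the decision boundary in log space.
```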
A BEFORE AND AFTER TRIAL OF THE EFFECTIVENESS OF NETWORK ANALYSIS IN HEALTH OPERATIONS MANAGEMENT.
Bhalwar, R; Srivastava, M; Verma, S S; Vaze, M; Tilak, V W
1996-10-01
An intervention trial using a "before-and-after" approach was undertaken to address the question of whether network analysis, as a managerial control tool in health care, can favourably affect the delays that occur in planning and executing the antimalaria operations of a Station Health Organization in a large military station. The exposure variable of interest was intervention with a network diagram, by which the potential causes of delay along the various activities were assessed and remedial measures were introduced during the second year. Sample size was calculated using conventional alpha and beta error levels. The study indicated a definite beneficial outcome, in that the operations could be started as well as completed on time during the intervention year. There was a reduction in the time required for 5 out of the 9 activities, the exact p value being 0.08 by both parametric and non-parametric tests. The use of network analysis in health care management is recommended.
NASA Astrophysics Data System (ADS)
Avila, Edward R.
The Electric Insertion Transfer Experiment (ELITE) is an Air Force Advanced Technology Transition Demonstration which is being executed as a cooperative Research and Development Agreement between the Phillips Lab and TRW. The objective is to build, test, and fly a solar-electric orbit transfer and orbit maneuvering vehicle, as a precursor to an operational electric orbit transfer vehicle (EOTV). This paper surveys some of the analysis tools used to do parametric studies and discusses the study results. The primary analysis tool was the Electric Vehicle Analyzer (EVA) developed by the Phillips Lab and modified by The Aerospace Corporation. It uses a simple orbit averaging approach to model low-thrust transfer performance, and runs in a PC environment. The assumptions used in deriving the EVA math model are presented. This tool and others surveyed were used to size the solar array power required for the spacecraft, and develop a baseline mission profile that meets the requirements of the ELITE mission.
NASA Astrophysics Data System (ADS)
Hayata, K.; Yanagawa, K.; Koshiba, M.
1990-12-01
A mode field analysis is presented of the second-harmonic electromagnetic wave that radiates from a nonlinear core bounded by a dielectric cladding. With this analysis, the ultimate performance of the organic crystal-cored single-mode optical fiber waveguide as a guided-wave frequency doubler is evaluated through the solution of nonlinear parametric equations derived from Maxwell's equations under some assumptions. As a phase-matching scheme, a Cerenkov approach is considered because of its advantages in actual device applications, in which phase matching is achievable between the fundamental guided LP01 mode and the second-harmonic radiation (leaky) mode. Calculated results for organic cores made of benzil, 4-(N,N-dimethylamino)-3-acetamidonitrobenzene, 2-methyl-4-nitroaniline, and 4'-nitrobenzylidene-3-acetamino-4-methoxyaniline provide useful data for designing an efficient fiber-optic wavelength converter utilizing nonlinear parametric processes. A detailed comparison is made between results for infinite and finite cladding thicknesses.
3D DNS and LES of Breaking Inertia-Gravity Waves
NASA Astrophysics Data System (ADS)
Remmler, S.; Fruman, M. D.; Hickel, S.; Achatz, U.
2012-04-01
Inertia-gravity waves are gravity waves whose frequency is low enough, and whose horizontal wavelength correspondingly long enough, for them to be strongly influenced by the Coriolis force. Inertia-gravity waves are very active in the middle atmosphere, and their breaking is potentially an important influence on the circulation in this region. The parametrization of this process requires a good theoretical understanding, which we want to enhance with the present study. Analyses by Achatz [1] of the primary linear instabilities of an inertia-gravity wave, together with "2.5-dimensional" nonlinear simulations (in which the spatial dependence is two-dimensional but the velocity and vorticity fields are three-dimensional) of the wave perturbed by its leading primary instabilities, have shown that the breaking differs significantly from that of high-frequency gravity waves due to the strongly sheared component of velocity perpendicular to the plane of wave propagation. Fruman & Achatz [2] investigated the three-dimensionalization of the breaking by computing the secondary linear instabilities of the same waves using singular vector analysis. These secondary instabilities are variations perpendicular to the direction of the primary perturbation and to the wave itself, and their wavelengths are an order of magnitude shorter than both. In continuation of this work, we carried out fully three-dimensional nonlinear simulations of inertia-gravity waves perturbed by their leading primary and secondary instabilities. The direct numerical simulation (DNS) was made tractable by restricting the domain size to the dominant scales selected by the linear analyses. The study includes both convectively stable and unstable waves. To the best of our knowledge, this is the first fully three-dimensional nonlinear direct numerical simulation of inertia-gravity waves at realistic Reynolds numbers with complete resolution of the smallest turbulence scales. Previous simulations either were restricted to high-frequency gravity waves (e.g. Fritts et al. [3]) or artificially reduced the ratio N/f (e.g. Lelong & Dunkerton [4]). The present simulations give us insight into the three-dimensional breaking process as well as the emerging turbulence. We assess the possibility of reducing the computational cost of three-dimensional simulations by using an implicit turbulence subgrid-scale parametrization based on the Adaptive Local Deconvolution Method (ALDM) for stratified turbulence [5]. In addition, we have performed ensembles of nonlinear 2.5-dimensional DNS, like those in Achatz [1] but with a small amount of noise superposed on the initial state, and compared the results with coarse-resolution simulations using either ALDM or standard LES schemes. We found that the results of the models with parametrized turbulence, which are orders of magnitude more computationally economical than the DNS, compare favorably with the DNS in terms of the decay of the wave amplitude with time (the quantity most important for application to gravity-wave drag parametrization), suggesting that they may be trusted in future simulations of gravity-wave breaking.
Hall, Peter S; McCabe, Christopher; Stein, Robert C; Cameron, David
2012-01-04
Multi-parameter genomic tests identify patients with early-stage breast cancer who are likely to derive little benefit from adjuvant chemotherapy. These tests can potentially spare patients the morbidity from unnecessary chemotherapy and reduce costs. However, the costs of the test must be balanced against the health benefits and cost savings produced. This economic evaluation compared genomic test-directed chemotherapy using the Oncotype DX 21-gene assay with chemotherapy for all eligible patients with lymph node-positive, estrogen receptor-positive early-stage breast cancer. We performed a cost-utility analysis using a state transition model to calculate expected costs and benefits over the lifetime of a cohort of women with estrogen receptor-positive lymph node-positive breast cancer from a UK perspective. Recurrence rates for Oncotype DX-selected risk groups were derived from parametric survival models fitted to data from the Southwest Oncology Group 8814 trial. The primary outcome was the incremental cost-effectiveness ratio, expressed as the cost (in 2011 GBP) per quality-adjusted life-year (QALY). Confidence in the incremental cost-effectiveness ratio was expressed as a probability of cost-effectiveness and was calculated using Monte Carlo simulation. Model parameters were varied deterministically and probabilistically in sensitivity analysis. Value of information analysis was used to rank priorities for further research. The incremental cost-effectiveness ratio for Oncotype DX-directed chemotherapy using a recurrence score cutoff of 18 was £5529 (US $8852) per QALY. The probability that test-directed chemotherapy is cost-effective was 0.61 at a willingness-to-pay threshold of £30 000 per QALY. Results were sensitive to the recurrence rate, long-term anthracycline-related cardiac toxicity, quality of life, test cost, and the time horizon. The highest priority for further research identified by value of information analysis is the recurrence rate in test-selected subgroups. There is substantial uncertainty regarding the cost-effectiveness of Oncotype DX-directed chemotherapy. It is particularly important that future research studies to inform cost-effectiveness-based decisions collect long-term outcome data.
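A compact illustration of the two summary quantities reported above: the incremental cost-effectiveness ratio (ICER) and the probability of cost-effectiveness at a willingness-to-pay threshold, estimated from probabilistic sensitivity-analysis draws. The distributions below are placeholders, not outputs of the study's state transition model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # probabilistic sensitivity analysis draws

# Hypothetical distributions of incremental cost (GBP) and incremental QALYs for
# test-directed chemotherapy vs. chemotherapy for all; values are illustrative only.
d_cost = rng.normal(500, 800, n)
d_qaly = rng.normal(0.09, 0.05, n)

icer = d_cost.mean() / d_qaly.mean()   # incremental cost-effectiveness ratio

wtp = 30_000  # willingness-to-pay threshold (GBP per QALY)
# Probability of cost-effectiveness: fraction of draws with positive net monetary benefit
p_ce = np.mean(wtp * d_qaly - d_cost > 0)

print(f"ICER: £{icer:,.0f} per QALY; P(cost-effective at £{wtp:,}/QALY) = {p_ce:.2f}")
```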
NASA Technical Reports Server (NTRS)
1973-01-01
A comprehensive analysis and parametric design effort was conducted under the earth-storable phase of the program. Passive acquisition/expulsion system concepts were evaluated for a reusable Orbital Maneuvering System (OMS) application. The passive surface-tension technique for providing gas-free liquid on demand was superior to other propellant acquisition methods. Systems using fine mesh screens can provide the requisite stability and satisfy OMS mission requirements. Both fine mesh screen liner and trap systems were given detailed consideration in the parametric design, and trap systems were selected for this particular application. These systems are compatible with the 100- to 500-manned mission reuse requirements.
Individual heterogeneity and identifiability in capture-recapture models
Link, W.A.
2004-01-01
Individual heterogeneity in detection probabilities is a far more serious problem for capture-recapture modeling than has previously been recognized. In this note, I illustrate that population size is not an identifiable parameter under the general closed population mark-recapture model Mh. The problem of identifiability is obvious if the population includes individuals with pi = 0, but persists even when it is assumed that individual detection probabilities are bounded away from zero. Identifiability may be attained within parametric families of distributions for pi, but not among parametric families of distributions. Consequently, in the presence of individual heterogeneity in detection probability, capture-recapture analysis is strongly model dependent.
NASA Technical Reports Server (NTRS)
Hamabata, Hiromitsu
1993-01-01
A class of parametric instabilities of finite-amplitude, circularly polarized Alfven waves in a plasma with pressure anisotropy is studied by application of the CGL equations. A linear perturbation analysis is used to find the dispersion relation governing the instabilities, which is a fifth-order polynomial and is solved numerically. A large-amplitude, circularly polarized wave is unstable with respect to decay into three waves: one sound-like wave and two side-band Alfven-like waves. It is found that, in addition to the decay instability, two new instabilities that are absent in the framework of the MHD equations can occur, depending on the plasma parameters.
The effect of pumping noise on the characteristics of a single-stage parametric amplifier
NASA Astrophysics Data System (ADS)
Medvedev, S. Iu.; Muzychuk, O. V.
1983-10-01
An analysis is made of the operation of a single-stage parametric amplifier based on an abrupt-junction varactor. Analytical expressions are obtained for the statistical moments of the output signal, the signal-to-noise ratio, and other characteristics in the case when the signal and the pump are each a mixture of a harmonic oscillation and Gaussian noise. It is shown that, when a noise component is present in the pump, an increase of its harmonic component to values close to the threshold leads to a sharp decrease in the signal-to-noise ratio at the amplifier output.
Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry
2004-06-01
This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests.
Quantum noise and squeezing in optical parametric oscillator with arbitrary output coupling
NASA Technical Reports Server (NTRS)
Prasad, Sudhakar
1993-01-01
The redistribution of intrinsic quantum noise in the quadratures of the field generated in a sub-threshold degenerate optical parametric oscillator exhibits interesting dependences on the individual output mirror transmittances, when they are included exactly. We present a physical picture of this problem, based on mirror boundary conditions, which is valid for arbitrary transmittances. Hence, our picture applies uniformly to all values of the cavity Q factor representing, in the opposite extremes, both perfect oscillator and amplifier configurations. Beginning with a classical second-harmonic pump, we shall generalize our analysis to the finite amplitude and phase fluctuations of the pump.
Chacón, R; Martínez García-Hoz, A
1999-06-01
We study a parametrically damped two-well Duffing oscillator subjected to a periodic string of symmetric pulses. The order-chaos threshold obtained when only the width of the pulses is altered is investigated theoretically through Melnikov analysis. We show analytically and numerically that most of the results appear to be independent of the particular wave form of the pulses, provided that the transmitted impulse is the same. By using this property, the stability boundaries of the stationary solutions are determined to first approximation by means of an elliptic harmonic balance method. Finally, the bifurcation behavior at the stability boundaries is determined numerically.
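For readers who want to experiment numerically, the sketch below integrates one plausible form of a parametrically damped two-well Duffing oscillator driven by a train of symmetric rectangular pulses; the equation, parameter values, and pulse shape are assumptions chosen for illustration and may differ from the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed representative form:
#   x'' + delta*(1 + eta*cos(w_p*t))*x' - x + x**3 = gamma * pulse(t)
delta, eta, w_p = 0.15, 0.3, 1.0        # damping level and parametric modulation
gamma, T, width = 0.3, 2 * np.pi, 1.0   # pulse amplitude, period, and width

def pulse(t):
    # symmetric rectangular pulse of the given width, repeated with period T
    return 1.0 if (t % T) < width else 0.0

def rhs(t, y):
    x, v = y
    dv = -delta * (1 + eta * np.cos(w_p * t)) * v + x - x**3 + gamma * pulse(t)
    return [v, dv]

sol = solve_ivp(rhs, (0.0, 300.0), [0.5, 0.0], max_step=0.01)
x = sol.y[0][sol.t > 100]   # discard the transient; inspect for regular vs. chaotic motion
print("x range after transient:", x.min(), x.max())
```

Varying `width` while holding the transmitted impulse (amplitude times width) fixed is the kind of experiment the abstract's wave-form-independence result refers to.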
Multilevel Latent Class Analysis: Parametric and Nonparametric Models
ERIC Educational Resources Information Center
Finch, W. Holmes; French, Brian F.
2014-01-01
Latent class analysis is an analytic technique often used in educational and psychological research to identify meaningful groups of individuals within a larger heterogeneous population based on a set of variables. This technique is flexible, encompassing not only a static set of variables but also longitudinal data in the form of growth mixture…
Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares
NASA Technical Reports Server (NTRS)
Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.
2012-01-01
A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.
NASA Astrophysics Data System (ADS)
Leka, K. D.; Barnes, Graham; Wagner, Eric
2018-04-01
A classification infrastructure built upon Discriminant Analysis (DA) has been developed at NorthWest Research Associates for examining the statistical differences between samples from two known populations. The infrastructure originated in work examining the physical differences between flare-quiet and flare-imminent solar active regions; we describe herein some of its details, including: parametrization of large datasets, schemes for handling "null" and "bad" data in multi-parameter analysis, application of non-parametric multi-dimensional DA, an extension through Bayes' theorem to probabilistic classification, and methods invoked for evaluating classifier success. The classifier infrastructure is applicable to a wide range of scientific questions in solar physics. We demonstrate its application to the question of distinguishing flare-imminent from flare-quiet solar active regions, updating results from the original publications that were based on different data and much smaller sample sizes. Finally, as a demonstration of "Research to Operations" efforts in the space-weather forecasting context, we present the Discriminant Analysis Flare Forecasting System (DAFFS), a near-real-time, operationally running solar flare forecasting tool developed from the research-directed infrastructure.
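The probabilistic-classification step described above (non-parametric class-conditional densities combined through Bayes' theorem) can be sketched in a few lines. The kernel density estimator, two-parameter feature space, synthetic samples, and sample-proportion prior below are illustrative stand-ins, not the infrastructure's actual implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic two-parameter samples for the two populations
rng = np.random.default_rng(2)
quiet = rng.normal(0.0, 1.0, (2, 300))     # parameter values for flare-quiet regions
imminent = rng.normal(1.2, 1.0, (2, 120))  # parameter values for flare-imminent regions

kde_q, kde_i = gaussian_kde(quiet), gaussian_kde(imminent)
# Sample proportion used as a stand-in for a climatological prior probability
prior_i = imminent.shape[1] / (quiet.shape[1] + imminent.shape[1])

def flare_probability(x):
    """P(imminent | x) via Bayes' theorem from the two estimated densities."""
    li, lq = kde_i(x), kde_q(x)
    return (li * prior_i) / (li * prior_i + lq * (1 - prior_i))

print(flare_probability(np.array([[1.0], [1.5]])))   # probability at the point (1.0, 1.5)
```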
Edison, Paul; Brooks, David J; Turkheimer, Federico E; Archer, Hilary A; Hinz, Rainer
2009-11-01
Pittsburgh compound B, or [11C]PIB, is an amyloid imaging agent which shows a clear differentiation between subjects with Alzheimer's disease (AD) and controls. However, the observed signal difference in other forms of dementia, such as dementia with Lewy bodies (DLB), is smaller, and mildly cognitively impaired (MCI) subjects and some healthy elderly normals may show intermediate levels of [11C]PIB binding. The cerebellum, a commonly used reference region for non-specific tracer uptake in [11C]PIB studies in AD, may not be valid in prion disorders or monogenic forms of AD. The aims of this work were to: (1) compare methods for generating parametric maps of [11C]PIB retention in tissue using a plasma input function, in respect of their ability to discriminate between AD subjects and controls, and (2) estimate the test-retest reproducibility in AD subjects. Twelve AD subjects (5 of whom underwent a repeat scan within 6 weeks) and 10 control subjects had 90-minute [11C]PIB dynamic PET scans, and arterial plasma input functions were measured. Parametric maps were generated with graphical analysis of reversible binding (Logan plot), irreversible binding (Patlak plot), and spectral analysis. Between-group differentiation was calculated using Student's t-test, and comparisons between different methods were made using p values. Reproducibility was assessed by intraclass correlation coefficients (ICC). We found that the 75-min value of the impulse response function showed the best group differentiation and had a higher ICC than volume-of-distribution maps generated from Logan and spectral analysis. Patlak analysis of [11C]PIB binding was the least reproducible.
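As a reference for the graphical methods mentioned above, here is a minimal sketch of Logan analysis for a reversible tracer with a plasma input function. The linearity time t*, the shared time grid for tissue and plasma curves, the trapezoidal integration, and the toy curves are simplifying assumptions for illustration.

```python
import numpy as np

def logan_vt(t, ct, cp, t_star=35.0):
    """Minimal sketch of Logan graphical analysis for a reversible tracer.

    t      : frame mid-times (min)
    ct     : tissue time-activity curve
    cp     : metabolite-corrected plasma input function (same time grid assumed)
    t_star : time after which the Logan plot is treated as linear
    Returns the slope of the late linear fit, i.e. the total volume of distribution V_T.
    """
    int_ct = np.concatenate([[0], np.cumsum(np.diff(t) * 0.5 * (ct[1:] + ct[:-1]))])
    int_cp = np.concatenate([[0], np.cumsum(np.diff(t) * 0.5 * (cp[1:] + cp[:-1]))])
    x = int_cp / ct                      # normalized running integral of the plasma input
    y = int_ct / ct                      # normalized running integral of the tissue curve
    m = t >= t_star
    slope, _ = np.polyfit(x[m], y[m], 1)
    return slope

# Toy usage with synthetic curves (illustrative shapes only):
t = np.linspace(0.5, 90, 34)
cp = 30 * np.exp(-0.15 * t) + 2 * np.exp(-0.01 * t)
ct = 8 * (1 - np.exp(-0.2 * t)) * np.exp(-0.005 * t)
print("V_T estimate:", round(logan_vt(t, ct, cp), 2))
```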
Constellation Program Life-cycle Cost Analysis Model (LCAM)
NASA Technical Reports Server (NTRS)
Prince, Andy; Rose, Heidi; Wood, James
2008-01-01
The Constellation Program (CxP) is NASA's effort to replace the Space Shuttle, return humans to the moon, and prepare for a human mission to Mars. The major elements of the Constellation lunar sortie design reference mission architecture are shown. Unlike the Apollo Program of the 1960s, affordability is a major concern of United States policy makers and NASA management. To measure Constellation affordability, a parametric life-cycle cost estimating capability covering total ownership cost is required. This capability is being developed by the Constellation Systems Engineering and Integration (SE&I) Directorate and is called the Life-cycle Cost Analysis Model (LCAM). The requirements for LCAM are based on the need for a parametric estimating capability in order to do top-level program analysis, evaluate design alternatives, and explore options for future systems. By estimating the total cost of ownership within the context of the planned Constellation budget, LCAM can provide Program and NASA management with the cost data necessary to identify the most affordable alternatives. LCAM is also a key component of the Integrated Program Model (IPM), an SE&I-developed capability that combines parametric sizing tools with cost, schedule, and risk models to perform program analysis. LCAM is used in the generation of cost estimates for system-level trades and analyses. It draws upon the legacy of previous architecture-level cost models, such as the Exploration Systems Mission Directorate (ESMD) Architecture Cost Model (ARCOM) developed for Simulation Based Acquisition (SBA), and ATLAS. LCAM is used to support requirements and design trade studies by calculating changes in cost relative to a baseline option cost. Estimated costs are generally of low fidelity, to accommodate the available input data and available cost estimating relationships (CERs). LCAM is capable of interfacing with the Integrated Program Model to provide the cost estimating capability for that suite of tools.
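A toy illustration of how a parametric cost model of this kind rolls weight-based cost estimating relationships (CERs) up to an architecture total and evaluates a trade as a delta against the baseline option cost. The element names, coefficients, and masses are hypothetical and are not LCAM's actual equations or data.

```python
# Illustrative weight-based CERs of the form cost = a * dry_mass_kg ** b
CERS = {                       # element: (a, b) -- hypothetical coefficients
    "crew_vehicle": (120.0, 0.55),
    "upper_stage":  ( 60.0, 0.60),
    "lunar_lander": ( 90.0, 0.58),
}
masses_kg = {"crew_vehicle": 9000, "upper_stage": 17000, "lunar_lander": 20000}

def element_cost(name):
    a, b = CERS[name]
    return a * masses_kg[name] ** b    # cost in consistent (arbitrary) units

baseline = sum(element_cost(e) for e in CERS)

# A trade study is scored as the change in rolled-up cost relative to the baseline:
masses_kg["lunar_lander"] = 23000      # hypothetical design alternative
print("delta cost vs. baseline:", sum(element_cost(e) for e in CERS) - baseline)
```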
Oddo, Perry C; Lee, Ben S; Garner, Gregory G; Srikrishnan, Vivek; Reed, Patrick M; Forest, Chris E; Keller, Klaus
2017-09-05
Sea levels are rising in many areas around the world, posing risks to coastal communities and infrastructures. Strategies for managing these flood risks present decision challenges that require a combination of geophysical, economic, and infrastructure models. Previous studies have broken important new ground on the considerable tensions between the costs of upgrading infrastructure and the damages that could result from extreme flood events. However, many risk-based adaptation strategies remain silent on certain potentially important uncertainties, as well as the tradeoffs between competing objectives. Here, we implement and improve on a classic decision-analytical model (Van Dantzig 1956) to: (i) capture tradeoffs across conflicting stakeholder objectives, (ii) demonstrate the consequences of structural uncertainties in the sea-level rise and storm surge models, and (iii) identify the parametric uncertainties that most strongly influence each objective using global sensitivity analysis. We find that the flood adaptation model produces potentially myopic solutions when formulated using traditional mean-centric decision theory. Moving from a single-objective problem formulation to one with multiobjective tradeoffs dramatically expands the decision space, and highlights the need for compromise solutions to address stakeholder preferences. We find deep structural uncertainties that have large effects on the model outcome, with the storm surge parameters accounting for the greatest impacts. Global sensitivity analysis effectively identifies important parameter interactions that local methods overlook, and that could have critical implications for flood adaptation strategies. © 2017 Society for Risk Analysis.
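The decision-analytical starting point cited above can be summarized with the classic Van Dantzig expected-cost tradeoff. The sketch below uses illustrative parameter values (not the study's) and deliberately collapses everything into a single aggregate objective, which is exactly the mean-centric formulation the paper argues can be myopic.

```python
import numpy as np

H = np.linspace(0.0, 5.0, 501)           # candidate dike heightening (m)
I0, k = 10e6, 5e6                         # fixed and per-metre investment cost (illustrative)
p0, alpha = 0.01, 2.6                     # base flood probability and decay rate with height
V, delta = 2e9, 0.04                      # value at risk and discount rate

investment = I0 + k * H                   # up-front cost of upgrading the infrastructure
p_flood = p0 * np.exp(-alpha * H)         # annual flood (exceedance) probability
expected_damage = p_flood * V / delta     # discounted expected damages (perpetuity approximation)

total = investment + expected_damage
print("single-objective optimum heightening:", H[np.argmin(total)], "m")
# Treating investment cost, flood reliability, and expected damages as separate
# objectives, and sampling the uncertain parameters, is what expands this into the
# multiobjective, uncertainty-aware analysis described in the abstract.
```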
Hyperbolic and semi-parametric models in finance
NASA Astrophysics Data System (ADS)
Bingham, N. H.; Kiesel, Rüdiger
2001-02-01
The benchmark Black-Scholes-Merton model of mathematical finance is parametric, based on the normal/Gaussian distribution. Its principal parametric competitor, the hyperbolic model of Barndorff-Nielsen, Eberlein and others, is briefly discussed. Our main theme is the use of semi-parametric models, incorporating the mean vector and covariance matrix as in the Markowitz approach, plus a non-parametric part, a scalar function incorporating features such as tail-decay. Implementation is also briefly discussed.
Estimation of railroad capacity using parametric methods.
DOT National Transportation Integrated Search
2013-12-01
This paper reviews different methodologies used for railroad capacity estimation and presents a user-friendly method to measure capacity. The objective of this paper is to use multivariate regression analysis to develop a continuous relation of the d...
NASA Technical Reports Server (NTRS)
Toll, T. A.
1980-01-01
A parametric analysis was made to investigate the relationship between current cargo airplanes and possible future designs that may differ greatly in both size and configuration. The method makes use of empirical scaling laws developed from statistical studies of data from current and advanced airplanes and, in addition, accounts for payload density, effects of span-distributed load, and variations in tail area ratio. The method is believed to be particularly useful for exploratory studies of design and technology options for large airplanes. The analysis predicts somewhat more favorable variations of the ratios of payload to gross weight and block fuel to payload with increasing airplane size than has generally been understood from interpretations of the cube-square law. In terms of these same ratios, large all-wing (spanloader) designs show an advantage over wing-fuselage designs.
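For context, the classical cube-square argument referenced above can be stated in one line (an illustrative reading, not taken from the report): if every linear dimension of an airplane is scaled by a factor k, lifting surface area grows as S ∝ k^2 while volume-driven structural weight grows as W ∝ k^3, so wing loading W/S ∝ k, which is why simple interpretations predict increasingly unfavorable scaling for very large aircraft; the report's empirical scaling laws indicate a less severe penalty than this idealized argument suggests.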
Design of a High Thermal Gradient Bridgman Furnace
NASA Technical Reports Server (NTRS)
LeCroy, J. E.; Popok, D. P.
1994-01-01
The Advanced Automated Directional Solidification Furnace (AADSF) is a Bridgman-Stockbarger microgravity processing facility, designed and manifested to first fly aboard the second United States Microgravity Payload (USMP-2) Space Shuttle mission. The AADSF was principally designed to produce high axial thermal gradients, and is particularly suitable for metals solidification experiments, including non-dilute alloys. To accommodate a wider range of experimental conditions, the AADSF is equipped with a reconfigurable gradient zone. The overall design of the AADSF and the relationship between gradient zone design and furnace performance are described. Parametric thermal analysis was performed and used to select gradient zone design features that fulfill the high thermal gradient requirements of the USMP-2 experiment. The thermal model and analytical procedure, and parametric results leading to the first flight gradient zone configuration, are presented. Performance for the USMP-2 flight experiment is also predicted, and analysis results are compared to test data.
Barnes, A P
2006-09-01
Recent policy changes within the Common Agricultural Policy have led to a shift from a solely production-led agriculture towards the promotion of multi-functionality. Conversely, the removal of production-led supports suggests that an increased concentration on production efficiency would be a critical strategy for a country's future competitiveness. This paper explores the relationship between the 'multi-functional' farming attitude desired by policy makers and its effect on technical efficiency within Scottish dairy farming. Technical efficiency scores are calculated by applying the non-parametric data envelopment analysis technique and are then measured against causes of inefficiency. Amongst these explanatory factors is a constructed score of multi-functionality. This research finds that, amongst other factors, a multi-functional attitude has a significant positive effect on technical efficiency. Consequently, this seems to validate the promotion of a multi-functional approach to farming currently being championed by policy-makers.
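A minimal sketch of the input-oriented, constant-returns data envelopment analysis (DEA) model that underlies technical efficiency scores like those above, posed as a linear program per decision-making unit; the farm inputs and outputs below are toy numbers, not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, j0):
    """Input-oriented CCR DEA efficiency of unit j0 (minimal sketch).

    X : (m_inputs, n_units) input matrix, Y : (s_outputs, n_units) output matrix.
    Solves  min theta  s.t.  X @ lam <= theta * X[:, j0],  Y @ lam >= Y[:, j0],  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                      # decision vector [theta, lam_1..lam_n]
    A_ub = np.r_[np.c_[-X[:, [j0]], X],              # X lam - theta * x0 <= 0
                 np.c_[np.zeros((s, 1)), -Y]]        # -Y lam <= -y0
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]

# Toy data: 2 inputs (labour, feed) and 1 output (milk) for 4 hypothetical farms
X = np.array([[4.0, 6.0, 5.0, 8.0], [3.0, 2.0, 4.0, 5.0]])
Y = np.array([[10.0, 12.0, 11.0, 12.0]])
print([round(dea_efficiency(X, Y, j), 2) for j in range(4)])
```

Efficiency scores of 1 indicate frontier farms; scores below 1 measure the proportional input reduction needed to reach the frontier, and it is these scores that are then related to explanatory factors such as the multi-functionality attitude.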
Assessing noninferiority in a three-arm trial using the Bayesian approach.
Ghosh, Pulak; Nathoo, Farouk; Gönen, Mithat; Tiwari, Ram C
2011-07-10
Non-inferiority trials, which aim to demonstrate that a test product is not worse than a competitor by more than a pre-specified small amount, are of great importance to the pharmaceutical community. As a result, methodology for designing and analyzing such trials is required, and developing new methods for such analysis is an important area of statistical research. The three-arm trial consists of a placebo, a reference and an experimental treatment, and simultaneously tests the superiority of the reference over the placebo along with comparing this reference to an experimental treatment. In this paper, we consider the analysis of non-inferiority trials using Bayesian methods which incorporate both parametric as well as semi-parametric models. The resulting testing approach is both flexible and robust. The benefit of the proposed Bayesian methods is assessed via simulation, based on a study examining home-based blood pressure interventions. Copyright © 2011 John Wiley & Sons, Ltd.
NIRS-SPM: statistical parametric mapping for near infrared spectroscopy
NASA Astrophysics Data System (ADS)
Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul
2008-02-01
Even though there exists a powerful statistical parametric mapping (SPM) tool for fMRI, similar public-domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public-domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes the NIRS data using a general linear model (GLM) and makes inference based on the excursion probability of a random field interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers pre-coloring and pre-whitening methods for temporal correlation estimation. For simultaneous recording of NIRS signals with fMRI, the spatial mapping between the fMRI image and real coordinates from a 3-D digitizer is estimated using Horn's algorithm. These tools allow super-resolution localization of brain activation, which is not possible using conventional NIRS analysis tools.
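A stripped-down sketch of the GLM step that such a toolbox applies per NIRS channel: regress the measured series on a task regressor convolved with a hemodynamic response function (HRF) and test the task coefficient. The HRF shape, block design, and noise handling here are simplifications and are not NIRS-SPM's actual implementation (which adds pre-coloring/pre-whitening and random-field inference).

```python
import numpy as np

fs, n = 10.0, 3000                        # sampling rate (Hz) and number of samples
t = np.arange(n) / fs
box = ((t % 60) < 20).astype(float)       # hypothetical 20 s task / 40 s rest blocks

h = np.arange(0, 30, 1 / fs)
hrf = (h ** 5) * np.exp(-h)               # crude gamma-like HRF (assumption)
reg = np.convolve(box, hrf)[:n]
X = np.c_[reg / reg.max(), np.ones(n)]    # design matrix: task regressor + constant

rng = np.random.default_rng(3)
y = 0.8 * X[:, 0] + rng.normal(0, 1, n)   # synthetic channel data (e.g. HbO changes)

beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma2 = res[0] / (n - X.shape[1])
t_stat = beta[0] / np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[0, 0])
print("task beta:", round(beta[0], 3), "t-statistic:", round(t_stat, 1))
```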
Structure of the alexithymic brain: A parametric coordinate-based meta-analysis.
Xu, Pengfei; Opmeer, Esther M; van Tol, Marie-José; Goerlich, Katharina S; Aleman, André
2018-04-01
Alexithymia refers to deficiencies in identifying and expressing emotions. This might be related to changes in structural brain volumes, but its neuroanatomical basis remains uncertain as studies have shown heterogeneous findings. Therefore, we conducted a parametric coordinate-based meta-analysis. We identified seventeen structural neuroimaging studies (including a total of 2586 individuals with different levels of alexithymia) investigating the association between gray matter volume and alexithymia. Volumes of the left insula, left amygdala, orbital frontal cortex, and striatum were consistently smaller in people with high levels of alexithymia. These areas are important for emotion perception and emotional experience. Smaller volumes in these areas might lead to deficiencies in appropriately identifying and expressing emotions. These findings provide the first quantitative integration of results pertaining to the structural neuroanatomical basis of alexithymia. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Outer planet entry probe system study. Volume 2: Supporting technical studies
NASA Technical Reports Server (NTRS)
1972-01-01
The environment, science investigations, and general mission analysis considerations are given first. These data are followed by discussions of the studies pertaining to the planets Jupiter, Saturn, Uranus, and Neptune. Except for Neptune, each planet discussion is divided into two parts: (1) parametric activities and (2) probe definition for that planet, or the application of a given probe for that planet. The Neptune discussion is limited to parametrics in the area of science and mission analysis. Each of the probe system definitions consists of system and subsystem details including telecommunications, data handling, power, pyrotechnics, attitude control, structures, propulsion, thermal control, and probe-to-spacecraft integration. The first configuration is discussed in detail and the subsequent configuration discussions are limited to the differences. Finally, the hardware availability to support a probe system and the commonality of science, missions, and subsystems for use at the various planets are considered.
Parametric study using modal analysis of a bi-material plate with defects
NASA Astrophysics Data System (ADS)
Esola, S.; Bartoli, I.; Horner, S. E.; Zheng, J. Q.; Kontsos, A.
2015-03-01
The feasibility of a global vibrational method as a non-destructive inspection tool for multi-layered composites is evaluated using a simulation-based parametric study. A finite element model of a composite consisting of two isotropic layers of dissimilar materials and a third, thin isotropic adhesive layer is constructed as the representative test subject. Next, artificial damage is inserted according to systematic variations of the defect morphology parameters. A free-vibration modal analysis simulation is executed for the pristine and damaged plate conditions. Finally, the resulting mode shapes and natural frequencies are extracted, compared, and analyzed for trends. Though other defect types may be explored, the focus of this research is on interfacial delamination and its effects on the global free-vibration behavior of a composite plate. This study is part of a multi-year research effort conducted for the U.S. Army Program Executive Office - Soldier.
On-Line Robust Modal Stability Prediction using Wavelet Processing
NASA Technical Reports Server (NTRS)
Brenner, Martin J.; Lind, Rick
1998-01-01
Wavelet analysis for filtering and system identification has been used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins is reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data is used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability are also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. The F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrates improved robust stability prediction by extension of the stability boundary beyond the flight regime. Guidelines and computation times are presented to show the efficiency and practical aspects of these procedures for on-line implementation. Feasibility of the method is shown for processing flight data from time-varying nonstationary test points.
NASA Technical Reports Server (NTRS)
1975-01-01
The transportation mass requirements developed for each mission and transportation mode were based on vehicle systems sized to fit the exact needs of each mission (i.e., rubber vehicles). The parametric data used to derive the mass requirements for each mission and transportation mode are presented to enable accommodation of possible changes in mode options or payload definitions. The vehicle sizing and functional requirements used to derive the parametric data will form the basis for conceptual configurations of the transportation elements in a later phase of study. An investigation of the weight growth approach to future space transportation systems analysis is presented. Parameters which affect weight growth, past weight histories, and the current state of future space-mission design are discussed. Weight growth factors ranging from 10 percent to 41 percent were derived for various missions or vehicles.
An analysis of heat effects in different subpopulations of Bangladesh.
Burkart, Katrin; Breitner, Susanne; Schneider, Alexandra; Khan, Md Mobarak Hossain; Krämer, Alexander; Endlicher, Wilfried
2014-03-01
A substantial number of epidemiological studies have demonstrated an association between atmospheric conditions and human all-cause as well as cause-specific mortality. However, most research has been performed in industrialised countries, whereas little is known about the atmosphere-mortality relationship in developing countries. Especially with regard to modification by non-atmospheric conditions and intra-population differences, there is a substantial research deficit. Within the scope of this study, we aimed to investigate the effects of heat in a multi-stratified manner, distinguishing by cause of death, age, gender, location, and socio-economic status. We examined 22,840 death counts using semi-parametric Poisson regression models, adjusting for a multitude of potential confounders. Although Bangladesh is dominated by an increase of mortality with decreasing (equivalent) temperatures over a wide range of values, the findings demonstrated the existence of, in part, strong heat effects at the upper end of the temperature distribution. Moreover, the study demonstrated that the strength of these heat effects varied considerably across the investigated subgroups. The adverse effects of heat were particularly pronounced for males and for the elderly above 65 years. Moreover, we found increased adverse effects of heat in urban areas and in areas with a high socio-economic status. The increase in, and acceleration of, urbanisation in Bangladesh, as well as the rapid aging of the population and the increase in non-communicable diseases, suggest that the relevance of heat-related mortality might increase further. Considering rising global temperatures, the adverse effects of heat might be further aggravated.
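A brief sketch of the modelling approach named above: a Poisson regression of daily death counts with a spline in temperature to capture the non-linear exposure-response relationship, plus a simple seasonality term. The variable names, spline degrees of freedom, and synthetic data are illustrative only, not the study's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic daily series: deaths respond to heat above ~31 degC plus a seasonal cycle
rng = np.random.default_rng(4)
days = 1500
temp = rng.normal(26, 4, days)                        # daily mean temperature (deg C)
season = np.sin(2 * np.pi * np.arange(days) / 365)
lam = np.exp(2.5 + 0.04 * np.clip(temp - 31, 0, None) + 0.1 * season)
df = pd.DataFrame({"deaths": rng.poisson(lam), "temp": temp, "season": season})

# bs() is patsy's B-spline basis, giving the non-linear temperature-mortality curve
model = smf.glm("deaths ~ bs(temp, df=4) + season",
                data=df, family=sm.families.Poisson()).fit()
print(model.summary().tables[1])
```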
Gender differences of suicide in Japan, 1947-2010.
Liu, Y; Zhang, Y; Cho, Y T; Obayashi, Y; Arai, A; Tamashiro, H
2013-10-01
The effects of socio-economic factors on suicide were gender-dependent, and the Japanese suicide mortality gender ratio (male:female) gradually increased during the twentieth century. With data covering 1947-2010 collected from Japanese official websites, we conducted non-parametric rank tests, curve estimation, Spearman rank correlation, and quantile regression in succession with Stata version 12.0. The suicide mortality rate in males, which followed a "U" shape over time, was always higher than that in females, which followed a "J" shape. Male suicide mortality peaked around 1955 (38.5 per 100,000 population) and dropped quickly afterwards until the 1970s; it increased in the 1980s, with another peak in 2003 (33.2 per 100,000 population). For females, an overall decreasing trend was seen, with a peak during the 1950s (23.5 per 100,000 population in 1958); it dropped gradually afterwards, with small variations in the 1970s and 1980s, and stabilized after 1995 (9.3 per 100,000 population). The unemployment rate could be used as a single positive predictor of suicide mortality for men (p<0.01), while the total fertility rate (TFR) (p<0.01) and divorce rate (p<0.01) were positively and negatively associated, respectively, with women's suicide. The impact of mental disorders was not analyzed, and age-specific analysis was not conducted. These findings on gender differences in, and factors associated with, suicide in Japan warrant further studies, including delineation of the implications of differential economic pressure between genders, as well as child-rearing pressure and marriage satisfaction. © 2013 Elsevier B.V. All rights reserved.
Villanueva, Pia; Newbury, Dianne F; Jara, Lilian; De Barbieri, Zulema; Mirza, Ghazala; Palomino, Hernán M; Fernández, María Angélica; Cazier, Jean-Baptiste; Monaco, Anthony P; Palomino, Hernán
2011-01-01
Specific language impairment (SLI) is an unexpected deficit in the acquisition of language skills and affects between 5 and 8% of pre-school children. Despite its prevalence and high heritability, our understanding of the aetiology of this disorder is only emerging. In this paper, we apply genome-wide techniques to investigate an isolated Chilean population who exhibit an increased frequency of SLI. Loss of heterozygosity (LOH) mapping and parametric and non-parametric linkage analyses indicate that complex genetic factors are likely to underlie susceptibility to SLI in this population. Across all analyses performed, the most consistently implicated locus was on chromosome 7q. This locus achieved highly significant linkage under all three non-parametric models (max NPL=6.73, P=4.0 × 10^-11). In addition, it yielded a HLOD of 1.24 in the recessive parametric linkage analyses and contained a segment that was homozygous in two affected individuals. Further investigation of this region identified a two-SNP haplotype that occurs at an increased frequency in language-impaired individuals (P=0.008). We hypothesise that the linkage regions identified here, in particular that on chromosome 7, may contain variants that underlie the high prevalence of SLI observed in this isolated population and may be of relevance to other populations affected by language impairments. PMID:21248734
Vasilyev, M; Choi, S K; Kumar, P; D'Ariano, G M
1998-09-01
Photon-number distributions for parametric fluorescence from a nondegenerate optical parametric amplifier are measured with a novel self-homodyne technique. These distributions exhibit the thermal-state character predicted by theory. However, a difference between the fluorescence gain and the signal gain of the parametric amplifier is observed. We attribute this difference to a change in the signal-beam profile during the traveling-wave pulsed amplification process.
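For reference, the "thermal-state character" mentioned above corresponds to the Bose-Einstein photon-number distribution predicted for single-mode parametric fluorescence; the short snippet below simply evaluates that distribution for a chosen mean photon number.

```python
import numpy as np

def thermal_pn(nbar, nmax=20):
    """Thermal (Bose-Einstein) photon-number distribution:
    P(n) = nbar**n / (1 + nbar)**(n + 1), truncated at nmax."""
    n = np.arange(nmax + 1)
    return nbar**n / (1 + nbar) ** (n + 1)

p = thermal_pn(2.0)
print(p[:5], p.sum())   # first few probabilities and the (truncated) normalization check
```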
NASA Astrophysics Data System (ADS)
Arakelyan, E. K.; Andryushin, A. V.; Burtsev, S. Y.; Andryushin, K. A.
2017-11-01
Technical and parametric constraints on the load-adjustment range of high-power combined-cycle plants (CCP) are analysed, together with the technological solutions recommended in the technical literature for removing them. It is established that, under the tightening requirements for economy, reliability, and maneuverability imposed by the system operator when CCPs participate in frequency and power control in the power system, the existing methods do not ensure that these requirements are met. The current situation in the energy sector, namely the lack of highly maneuverable generating equipment, means that all types of power plants, including CCPs, must participate in covering the power-consumption (load) curve, although CCPs were originally intended primarily for base-load operation. Large-scale research conducted at the Department of Automated Control Systems of Technological Processes has shown that the adjustment range of a CCP can be significantly expanded both in the condensing mode and in the heating (cogeneration) mode. The report presents the main results of this research, using the CCP-450 and CCP-450T units as examples. Various technological solutions are considered: for CCP operation in the condensing mode, the use of bypass steam-distribution schemes and the transfer of part of the steam turbine to a low-steam mode; for operation in the heating mode, bypass steam distribution and the transfer of the CCP to a gas-turbine heat-and-power mode with the steam turbine switched to motoring operation. Estimates of the technical and economic feasibility of the proposed innovative solutions are presented and compared with the methods currently used in practice to address this problem, such as riding through troughs in the electric load curve by operating the CCP with part of its equipment shut down. The comparison considers the economics as well as the maneuverability and reliability of the equipment.