Sample records for parametric cost analysis

  1. Parametric Cost Deployment

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1995-01-01

    Parametric cost analysis is a mathematical approach to estimating cost. Parametric cost analysis uses non-cost parameters, such as quality characteristics, to estimate the cost to bring forth, sustain, and retire a product. This paper reviews parametric cost analysis and shows how it can be used within the cost deployment process.

  2. Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates

    NASA Technical Reports Server (NTRS)

    Peffley, Al F.

    1991-01-01

    The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs using parametric cost estimate data. This is accomplished using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provides the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included, along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated with the proprietary parametric cost model (PCM), with inputs organized by a project WBS. Preliminary life cycle schedules are also included.

  3. Parametric Cost Models for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtney

    2010-01-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescope cost models. An effort is underway to develop single-variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.
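    The headline CER behavior above can be illustrated with a toy single-variable model. Only the 50%-per-17-years technology factor comes from the abstract; the coefficients `A` and `b` below are hypothetical placeholders, so this is a sketch of the model's form, not its published fit:

```python
def telescope_cost(aperture_m, years_since_ref, A=100.0, b=1.7):
    """Toy CER: power law in aperture diameter times a technology
    factor that halves cost every 17 years (A and b are hypothetical)."""
    return A * aperture_m ** b * 0.5 ** (years_since_ref / 17.0)

# The technology factor alone halves cost after exactly 17 years:
ratio = telescope_cost(2.0, 17) / telescope_cost(2.0, 0)  # 0.5

# With an exponent b < 2, cost per unit collecting area (proportional
# to D**2) falls as diameter grows, matching the abstract's claim that
# a large telescope costs less per square meter than a small one:
per_area_small = telescope_cost(2.0, 0) / 2.0 ** 2
per_area_large = telescope_cost(4.0, 0) / 4.0 ** 2
```

    Any exponent below 2 reproduces the per-square-meter trend; the specific value 1.7 is illustrative only.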

  4. Ground-Based Telescope Parametric Cost Model

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.

  5. Parametric Analysis of Light Truck and Automobile Maintenance

    DOT National Transportation Integrated Search

    1979-05-01

    Utilizing the Automotive and Light Truck Service and Repair Data Base developed in the companion report, parametric analyses were made of the relationships between maintenance costs, scheduled and unscheduled, and vehicle parameters: body class, manufa...

  6. Multivariable Parametric Cost Model for Ground Optical Telescope Assembly

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Rowell, Ginger Holmes; Reese, Gayle; Byberg, Alicia

    2005-01-01

    A parametric cost model for ground-based telescopes is developed using multivariable statistical analysis of both engineering and performance parameters. While diameter continues to be the dominant cost driver, diffraction-limited wavelength is found to be a secondary driver. Other parameters such as radius of curvature are examined. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived.

  7. Constellation Program Life-cycle Cost Analysis Model (LCAM)

    NASA Technical Reports Server (NTRS)

    Prince, Andy; Rose, Heidi; Wood, James

    2008-01-01

    The Constellation Program (CxP) is NASA's effort to replace the Space Shuttle, return humans to the moon, and prepare for a human mission to Mars. The major elements of the Constellation Lunar sortie design reference mission architecture are shown. Unlike the Apollo Program of the 1960's, affordability is a major concern of United States policy makers and NASA management. To measure Constellation affordability, a total ownership cost life-cycle parametric cost estimating capability is required. This capability is being developed by the Constellation Systems Engineering and Integration (SE&I) Directorate, and is called the Lifecycle Cost Analysis Model (LCAM). The requirements for LCAM are based on the need to have a parametric estimating capability in order to do top-level program analysis, evaluate design alternatives, and explore options for future systems. By estimating the total cost of ownership within the context of the planned Constellation budget, LCAM can provide Program and NASA management with the cost data necessary to identify the most affordable alternatives. LCAM is also a key component of the Integrated Program Model (IPM), an SE&I developed capability that combines parametric sizing tools with cost, schedule, and risk models to perform program analysis. LCAM is used in the generation of cost estimates for system level trades and analyses. It draws upon the legacy of previous architecture level cost models, such as the Exploration Systems Mission Directorate (ESMD) Architecture Cost Model (ARCOM) developed for Simulation Based Acquisition (SBA), and ATLAS. LCAM is used to support requirements and design trade studies by calculating changes in cost relative to a baseline option cost. Estimated costs are generally low fidelity to accommodate available input data and available cost estimating relationships (CERs). LCAM is capable of interfacing with the Integrated Program Model to provide the cost estimating capability for that suite of tools.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, L.T.; Hickey, M.

    This paper summarizes the progress to date by CH2M HILL and the UKAEA in development of a parametric modelling capability for estimating the costs of large nuclear decommissioning projects in the United Kingdom (UK) and Europe. The ability to successfully apply parametric cost estimating techniques will be a key factor to commercial success in the UK and European multi-billion dollar waste management, decommissioning and environmental restoration markets. The most useful parametric models will be those that incorporate individual components representing major elements of work: reactor decommissioning, fuel cycle facility decommissioning, waste management facility decommissioning and environmental restoration. Models must be sufficiently robust to estimate indirect costs and overheads, permit pricing analysis and adjustment, and accommodate the intricacies of international monetary exchange, currency fluctuations and contingency. The development of a parametric cost estimating capability is also a key component in building a forward estimating strategy. The forward estimating strategy will enable the preparation of accurate and cost-effective out-year estimates, even when work scope is poorly defined or as yet indeterminate. Preparation of cost estimates for work outside the organization's current sites, for which detailed measurement is not possible and historical cost data does not exist, will also be facilitated. (authors)

  9. Multivariable Parametric Cost Model for Ground Optical Telescope Assembly

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Rowell, Ginger Holmes; Reese, Gayle; Byberg, Alicia

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis of both engineering and performance parameters. While diameter continues to be the dominant cost driver, diffraction-limited wavelength is found to be a secondary driver. Other parameters such as radius of curvature were examined. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter were derived.

  10. Parametric analysis of closed cycle magnetohydrodynamic (MHD) power plants

    NASA Technical Reports Server (NTRS)

    Owens, W.; Berg, R.; Murthy, R.; Patten, J.

    1981-01-01

    A parametric analysis of closed cycle MHD power plants was performed which studied the technical feasibility, associated capital cost, and cost of electricity for the direct combustion of coal or coal-derived fuel. Three reference plants, differing primarily in the method of coal conversion utilized, were defined. Reference Plant 1 used direct coal-fired combustion, while Reference Plants 2 and 3 employed on-site integrated gasifiers. Reference Plant 2 used a pressurized gasifier while Reference Plant 3 used a "state-of-the-art" atmospheric gasifier. Thirty plant configurations were considered by using parametric variations from the reference plants. Parametric variations include the type of coal (Montana Rosebud or Illinois No. 6), cleanup systems (hot or cold gas cleanup), one- or two-stage atmospheric or pressurized direct-fired coal combustors, and six different gasifier systems. Plant sizes ranged from 100 to 1000 MWe. Overall plant performance was calculated using two methodologies. In one task, the channel performance was assumed and the MHD topping cycle efficiencies were based on the assumed values. A second task involved rigorous calculations of channel performance (enthalpy extraction, isentropic efficiency, and generator output) that verified the original (task one) assumptions. Closed cycle MHD capital costs were estimated for the task one plants; task two cost estimates were made for the channel and magnet only.

  11. Parametric Cost Analysis: A Design Function

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1989-01-01

    Parametric cost analysis uses equations to map measurable system attributes into cost. The measures of the system attributes are called metrics. The equations are called cost estimating relationships (CERs), and are obtained by the analysis of cost and technical metric data of products analogous to those to be estimated. Examples of system metrics include mass, power, failure_rate, mean_time_to_repair, energy_consumed, payload_to_orbit, pointing_accuracy, manufacturing_complexity, number_of_fasteners, and percent_of_electronics_weight. The basic assumption is that a measurable relationship exists between system attributes and the cost of the system. If a function exists, the attributes are cost drivers. Candidates for metrics include system requirement metrics and engineering process metrics. Requirements are constraints on the engineering process. From optimization theory we know that any active constraint generates cost by not permitting full optimization of the objective. Thus, requirements are cost drivers. Engineering processes reflect a projection of the requirements onto the corporate culture, engineering technology, and system technology. Engineering processes are an indirect measure of the requirements and, hence, are cost drivers.
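    The analysis step that produces a CER can be sketched in a few lines: fit cost = a * metric^b by ordinary least squares in log-log space. The (mass, cost) pairs below are invented purely to illustrate the fitting step, not data from any program:

```python
import math

# Invented (mass_kg, cost) pairs for analogous products:
data = [(100.0, 52.0), (200.0, 85.0), (400.0, 140.0), (800.0, 230.0)]

# Least-squares slope/intercept in log-log space gives b and log(a):
xs = [math.log(m) for m, _ in data]
ys = [math.log(c) for _, c in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = math.exp(ybar - b * xbar)

def cer(mass_kg):
    """Fitted cost estimating relationship: cost = a * mass**b."""
    return a * mass_kg ** b
```

    With real data one would also report fit statistics (standard error, R-squared) before using the CER for estimation.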

  12. Impact of low cost refurbishable and standard spacecraft upon future NASA space programs. Payload effects follow-on study, appendix

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Mission analysis is discussed, including the consolidation and expansion of mission equipment and experiment characteristics, and determination of simplified shuttle flight schedule. Parametric analysis of standard space hardware and preliminary shuttle/payload constraints analysis are evaluated, along with the cost impact of low cost standard hardware.

  13. The costs of transit fare prepayment programs : a parametric cost analysis.

    DOT National Transportation Integrated Search

    Despite the renewed interest in transit fare prepayment plans over the past 10 years, few transit managers have a clear idea of how much it costs to operate and maintain a fare prepayment program. This report provides transit managers with the ...

  14. Parametric Cost Models for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2010-01-01

    A study is in process to develop a multivariable parametric cost model for space telescopes. Cost and engineering parametric data have been collected on 30 different space telescopes. Statistical correlations have been developed among 19 of the 59 variables sampled. Single-variable and multi-variable cost estimating relationships have been developed. Results are being published.

  15. Can the Direct Medical Cost of Chronic Disease Be Transferred across Different Countries? Using Cost-of-Illness Studies on Type 2 Diabetes, Epilepsy and Schizophrenia as Examples

    PubMed Central

    Gao, Lan; Hu, Hao; Zhao, Fei-Li; Li, Shu-Chuen

    2016-01-01

    Objectives To systematically review cost-of-illness studies for schizophrenia (SC), epilepsy (EP) and type 2 diabetes mellitus (T2DM) and explore the transferability of direct medical cost across countries. Methods A comprehensive literature search was performed to yield studies that estimated direct medical costs. A generalized linear model (GLM) with gamma distribution and log link was utilized to explore the variation in costs accounted for by the included factors. Both parametric (random-effects model) and non-parametric (bootstrapping) meta-analyses were performed to pool the converted raw cost data (expressed as percentage of GDP/capita of the country where the study was conducted). Results In total, 93 articles were included (40 studies for T2DM, 34 studies for EP and 19 studies for SC). Significant variances were detected inter- and intra-disease classes for the direct medical costs. Multivariate analysis identified that GDP/capita (p<0.05) was a significant factor contributing to the large variance in the cost results. Bootstrapping meta-analysis generated more conservative estimations with slightly wider 95% confidence intervals (CI) than the parametric meta-analysis, yielding a mean (95% CI) of 16.43% (11.32, 21.54) for T2DM, 36.17% (22.34, 50.00) for SC and 10.49% (7.86, 13.41) for EP. Conclusions Converting the raw cost data into percentage of GDP/capita of individual countries was demonstrated to be a feasible approach to transfer the direct medical cost across countries. The approach from our study to obtain an estimated direct cost value along with the size of the specific disease population from each jurisdiction could be used for a quick check on the economic burden of a particular disease for countries without such data. PMID:26814959
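    The non-parametric pooling described here can be sketched as a percentile bootstrap over study-level costs, each expressed as a percentage of GDP/capita. The eight values below are invented for illustration; they are not the paper's data:

```python
import random

# Invented study-level direct medical costs, as % of GDP/capita:
costs = [12.1, 9.8, 20.3, 15.5, 17.2, 14.0, 22.8, 11.4]

def bootstrap_mean_ci(values, n_boot=5000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for the pooled mean:
    resample with replacement, collect resample means, take percentiles."""
    rng = random.Random(seed)
    means = sorted(
        sum(rng.choice(values) for _ in values) / len(values)
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return sum(values) / len(values), (lo, hi)

pooled_mean, (ci_lo, ci_hi) = bootstrap_mean_ci(costs)
```

    As the abstract notes, percentile bootstrap intervals tend to be a little wider than parametric random-effects intervals on skewed cost data.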

  16. Cost Estimation and Control for Flight Systems

    NASA Technical Reports Server (NTRS)

    Hammond, Walter E.; Vanhook, Michael E. (Technical Monitor)

    2002-01-01

    Good program management practices, cost analysis, cost estimation, and cost control for aerospace flight systems are interrelated and depend upon each other. The best cost control process cannot overcome poor design or poor systems trades that lead to the wrong approach. The project needs robust Technical, Schedule, Cost, Risk, and Cost Risk practices before it can incorporate adequate Cost Control. Cost analysis both precedes and follows cost estimation -- the two are closely coupled with each other and with Risk analysis. Parametric cost estimating relationships and computerized models are most often used. NASA has learned some valuable lessons in controlling cost problems, and recommends use of a summary Project Manager's checklist as shown here.

  17. Creating A Data Base For Design Of An Impeller

    NASA Technical Reports Server (NTRS)

    Prueger, George H.; Chen, Wei-Chung

    1993-01-01

    Report describes use of Taguchi method of parametric design to create data base facilitating optimization of design of impeller in centrifugal pump. Data base enables systematic design analysis covering all significant design parameters. Reduces time and cost of parametric optimization of design: for particular impeller considered, one can cover 4,374 designs by computational simulations of performance for only 18 cases.
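    The arithmetic in the abstract is consistent with the standard Taguchi L18 orthogonal array (one 2-level factor and seven 3-level factors); the abstract does not name the array, so that factor structure is an inference:

```python
# Full factorial for one 2-level and seven 3-level design parameters,
# versus the 18 runs of an L18 orthogonal array (the factor structure
# is inferred, not stated in the abstract):
full_factorial = 2 * 3 ** 7   # 4374 candidate designs
orthogonal_runs = 18          # simulations actually required
```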

  18. Space biology initiative program definition review. Trade study 3: Hardware miniaturization versus cost

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Davidson, William L.; Herbert, Frank J.; Bilodeau, James W.; Stoval, J. Michael; Sutton, Terry

    1989-01-01

    The optimum hardware miniaturization level with the lowest cost impact for space biology hardware was determined. Space biology hardware and/or components/subassemblies/assemblies which are the most likely candidates for application of miniaturization are to be defined, and the relative cost impacts of such miniaturization are to be analyzed. A mathematical or statistical analysis method with the capability to support development of parametric cost analysis impacts for levels of production design miniaturization is provided.

  19. Preliminary Multi-Variable Parametric Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    This slide presentation reviews the creation of a preliminary multi-variable cost model for the contract costs of making a space telescope. There is discussion of the methodology for collecting the data, definition of the statistical analysis methodology, single-variable model results, testing of historical models, and an introduction of the multi-variable models.

  20. PARAMETRIC ANALYSIS OF THE INSTALLATION AND OPERATING COSTS OF ACTIVE SOIL DEPRESSURIZATION SYSTEMS FOR RESIDENTIAL RADON MITIGATION

    EPA Science Inventory

    The report gives results of a recent analysis showing that cost-effective indoor radon reduction technology is required for houses with initial radon concentrations < 4 pCi/L, because 78-86% of the national lung cancer risk due to radon is associated with those houses. Active soi...

  1. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. 
The engineering process parameters are elicited from the engineer/expert on the project and are based on that expert's technical knowledge. These are converted by a parametric cost model into a cost estimate. The method discussed makes no assumptions about the distribution underlying the distribution of possible costs, and is not tied to the analysis of previous projects, except through the expert calibrations performed by the parametric cost analyst.
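    The method described can be sketched as a Monte Carlo loop: elicited engineering-parameter ranges are sampled, pushed through a parametric cost model, and the sorted results yield a cost for each probability level without assuming any distributional form for cost. The parameter ranges and CER coefficients below are hypothetical, not LaRC's:

```python
import random

def cer(mass_kg, complexity):
    """Placeholder parametric cost model (hypothetical coefficients)."""
    return 0.8 * mass_kg ** 0.9 * complexity

def cost_risk_curve(n=10000, seed=1):
    """Sample elicited parameter ranges, run the CER, and read costs off
    the sorted samples at the requested probability levels."""
    rng = random.Random(seed)
    samples = sorted(
        cer(rng.triangular(900, 1500, 1100), rng.uniform(1.0, 1.4))
        for _ in range(n)
    )
    return {p: samples[int(n * p) - 1] for p in (0.2, 0.5, 0.8)}

curve = cost_risk_curve()  # e.g. curve[0.8] is the 80%-confidence cost
```

    Because the curve is built from the empirical sample, no shape is imposed on the underlying cost distribution, which is the point the paper makes against plus-or-minus risk curves.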

  2. Approximation Model Building for Reliability & Maintainability Characteristics of Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.; Brown, Richard W.

    2000-01-01

    This paper describes the development of parametric models for estimating operational reliability and maintainability (R&M) characteristics for reusable vehicle concepts, based on vehicle size and technology support level. An R&M analysis tool (RMAT) and response surface methods are utilized to build parametric approximation models for rapidly estimating operational R&M characteristics such as mission completion reliability. These models, which approximate RMAT, can then be utilized for fast analysis of operational requirements, for life-cycle cost estimating, and for multidisciplinary design optimization.

  3. Parametric versus Cox's model: an illustrative analysis of divorce in Canada.

    PubMed

    Balakrishnan, T R; Rao, K V; Krotki, K J; Lapierre-adamcyk, E

    1988-06-01

    Recent demographic literature clearly recognizes the importance of survival models in the analysis of cross-sectional event histories. Of the various survival models, Cox's (1972) partially parametric model has been very popular due to its simplicity and readily available computer software for estimation, sometimes at the cost of precision and parsimony of the model. This paper focuses on parametric failure time models for event history analysis such as the Weibull, lognormal, loglogistic, and exponential models. The authors also test the goodness of fit of these parametric models versus Cox's proportional hazards model, taking the Kaplan-Meier estimate as the base. As an illustration, the authors reanalyze the Canadian Fertility Survey data on 1st marriage dissolution with parametric models. Though these parametric model estimates were not very different from each other, there seemed to be a slightly better fit with the loglogistic. When 8 covariates were used in the analysis, it was found that the coefficients were similar across the models, and the overall conclusions about the relative risks would not have been different. The findings reveal that in marriage dissolution, the differences according to demographic and socioeconomic characteristics may be far more important than is generally found in many studies. Therefore, one should not treat the population as homogeneous in analyzing survival probabilities of marriages, other than for cursory analysis of overall trends.
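    The Kaplan-Meier estimate used as the goodness-of-fit baseline is simple to compute; the durations and censoring flags below are made up for illustration:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate; events[i] is True for an observed
    event (e.g. a dissolution) and False for a censored observation."""
    # At tied times, events are conventionally processed before censorings.
    data = sorted(zip(times, events), key=lambda p: (p[0], not p[1]))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    for t, observed in data:
        if observed:
            s *= (n_at_risk - 1) / n_at_risk
            curve.append((t, s))
        n_at_risk -= 1
    return curve

# Six illustrative durations, two of them censored:
km = kaplan_meier([2, 3, 3, 5, 8, 10], [True, True, False, True, False, True])
```

    A parametric fit (Weibull, loglogistic, etc.) would then be judged by how closely its survival function tracks this step curve.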

  4. Estimation Model of Spacecraft Parameters and Cost Based on a Statistical Analysis of COMPASS Designs

    NASA Technical Reports Server (NTRS)

    Gerberich, Matthew W.; Oleson, Steven R.

    2013-01-01

    The Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at Glenn Research Center has performed integrated system analysis of conceptual spacecraft mission designs since 2006 using a multidisciplinary concurrent engineering process. The set of completed designs was archived in a database, to allow for the study of relationships between design parameters. Although COMPASS uses a parametric spacecraft costing model, this research investigated the possibility of using a top-down approach to rapidly estimate the overall vehicle costs. This paper presents the relationships between significant design variables, including breakdowns of dry mass, wet mass, and cost. It also develops a model for a broad estimate of these parameters through basic mission characteristics, including the target location distance, the payload mass, the duration, the delta-v requirement, and the type of mission, propulsion, and electrical power. Finally, this paper examines the accuracy of this model in regards to past COMPASS designs, with an assessment of outlying spacecraft, and compares the results to historical data of completed NASA missions.

  5. Preliminary Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Prince, F. Andrew; Smart, Christian; Stephens, Kyle; Henrichs, Todd

    2009-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. However, great care is required. Some space telescope cost models, such as those based only on mass, lack sufficient detail to support such analysis and may lead to inaccurate conclusions. Similarly, using ground-based telescope models, which include the dome cost, will also lead to inaccurate conclusions. This paper reviews current and historical models. Then, based on data from 22 different NASA space telescopes, this paper tests those models and presents preliminary analysis of single- and multi-variable space telescope cost models.

  6. Energy Conversion Alternatives Study (ECAS), Westinghouse phase 1. Volume 11: Advanced steam systems. [energy conversion efficiency for electric power plants using steam

    NASA Technical Reports Server (NTRS)

    Wolfe, R. W.

    1976-01-01

    A parametric analysis was made of three types of advanced steam power plants that use coal in order to compare the cost of electricity produced by them over a wide range of primary performance variables. Increasing the temperature and pressure of the steam above current industry levels resulted in increased energy costs because the cost of capital increased more than the fuel cost decreased. While the three plant types produced comparable energy cost levels, the pressurized fluidized bed boiler plant produced the lowest energy cost by the small margin of 0.69 mills/MJ (2.5 mills/kWh). It is recommended that this plant be designed in greater detail to determine its cost and performance more accurately than was possible in a broad parametric study and to ascertain problem areas which will require development effort. Also considered are pollution control measures such as scrubbers and separators for particulate emissions from stack gases.
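    The two figures quoted for the cost margin are consistent once units are converted (1 kWh = 3.6 MJ):

```python
# 0.69 mills/MJ expressed in mills/kWh:
MJ_PER_KWH = 3.6
margin_mills_per_kwh = 0.69 * MJ_PER_KWH  # about 2.48, i.e. the quoted 2.5
```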

  7. Space biology initiative program definition review. Trade study 4: Design modularity and commonality

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Davidson, William L.; Herbert, Frank J.; Bilodeau, James W.; Stoval, J. Michael; Sutton, Terry

    1989-01-01

    The relative cost impacts (up or down) of developing Space Biology hardware using design modularity and commonality are studied. Recommendations for how the hardware development should be accomplished to meet optimum design modularity requirements for Life Science investigation hardware will be provided. In addition, the relative cost impacts of implementing commonality of hardware for all Space Biology hardware are defined. Cost analysis and supporting recommendations for levels of modularity and commonality are presented. A mathematical or statistical cost analysis method with the capability to support development of production design modularity and commonality impacts to parametric cost analysis is provided.

  8. Parametric study of prospective early commercial MHD power plants (PSPEC). General Electric Company, task 1: Parametric analysis

    NASA Technical Reports Server (NTRS)

    Marston, C. H.; Alyea, F. N.; Bender, D. J.; Davis, L. K.; Dellinger, T. C.; Hnat, J. G.; Komito, E. H.; Peterson, C. A.; Rogers, D. A.; Roman, A. J.

    1980-01-01

    The performance and cost were determined for moderate technology coal-fired open cycle MHD/steam power plant designs which can be expected to require a shorter development time and have a lower development cost than previously considered mature OCMHD/steam plants. Three base cases were considered: in base case 1, an indirectly fired high temperature air heater (HTAH) subsystem delivered air at 2700 F; in base case 2, the HTAH subsystem was fired by a state-of-the-art atmospheric pressure gasifier; and in base case 3, the HTAH subsystem was deleted and oxygen enrichment was used to obtain the requisite MHD combustion temperature. Coal pile to bus bar efficiencies in base case 1 ranged from 41.4% to 42.9%, and the cost of electricity (COE) was the highest of the three base cases. For base case 2 the efficiency range was 42.0% to 45.6%, and COE was the lowest. For base case 3 the efficiency range was 42.9% to 44.4%, and COE was intermediate. The best parametric cases in base cases 2 and 3 are recommended for conceptual design. The eventual choice between these approaches is dependent on further evaluation of the tradeoffs among HTAH development risk, O2 plant integration, and further refinement of comparative costs.

  9. Gas engine heat pump cycle analysis. Volume 1: Model description and generic analysis

    NASA Astrophysics Data System (ADS)

    Fischer, R. D.

    1986-10-01

    The task prepared performance and cost information to assist in evaluating the selection of heating, ventilating, and air-conditioning (HVAC) components, values for component design variables, and system configurations and operating strategy. A steady-state computer model for performance simulation of engine-driven and electrically driven heat pumps was prepared and effectively used for parametric and seasonal performance analyses. Parametric analysis showed the effect of variables associated with the design of recuperators, brine coils, the domestic hot water heat exchanger, compressor size, engine efficiency, and insulation on exhaust and brine piping. Seasonal performance data were prepared for residential and commercial units in six cities with system configurations closely related to existing or contemplated hardware of the five GRI engine contractors. Similar data were prepared for an advanced variable-speed electric unit for comparison purposes. The effect of domestic hot water production on operating costs was determined. Four fan-operating strategies and two brine loop configurations were explored.

  10. Single-stage-to-orbit versus two-stage-to-orbit: A cost perspective

    NASA Astrophysics Data System (ADS)

    Hamaker, Joseph W.

    1996-03-01

    This paper considers the possible life-cycle costs of single-stage-to-orbit (SSTO) and two-stage-to-orbit (TSTO) reusable launch vehicles (RLV's). The analysis parametrically addresses the issue such that the preferred economic choice comes down to the relative complexity of the TSTO compared to the SSTO. The analysis defines the boundary complexity conditions at which the two configurations have equal life-cycle costs, and finally, makes a case for the economic preference of SSTO over TSTO.

  11. NASA's X-Plane Database and Parametric Cost Model v 2.0

    NASA Technical Reports Server (NTRS)

    Sterk, Steve; Ogluin, Anthony; Greenberg, Marc

    2016-01-01

    The NASA Armstrong Cost Engineering Team, with technical assistance from NASA HQ (SID), has gone through the full process of developing new CERs, from Version #1 to Version #2. We took a step backward and reexamined all of the data collected, such as dependent and independent variables, cost, dry weight, length, wingspan, manned versus unmanned, altitude, Mach number, thrust, and skin. We used a well-known statistical analysis tool called CO$TAT instead of multiple linear regression in "R" or the "Regression" tool found in Microsoft Excel(TM). We set up an array of data by adding 21 "dummy variables"; we analyzed the standard error (SE) and then determined the best fit. We have parametrically priced out several future X-planes and compared our results to those of other resources. More work needs to be done in obtaining accurate and traceable cost data from historical X-plane records.

  12. Parametric modelling of cost data in medical studies.

    PubMed

    Nixon, R M; Thompson, S G

    2004-04-30

    The cost of medical resources used is often recorded for each patient in clinical studies in order to inform decision-making. Although cost data are generally skewed to the right, interest is in making inferences about the population mean cost. Common methods for non-normal data, such as data transformation, assuming asymptotic normality of the sample mean or non-parametric bootstrapping, are not ideal. This paper describes possible parametric models for analysing cost data. Four example data sets are considered, which have different sample sizes and degrees of skewness. Normal, gamma, log-normal, and log-logistic distributions are fitted, together with three-parameter versions of the latter three distributions. Maximum likelihood estimates of the population mean are found; confidence intervals are derived by a parametric BC(a) bootstrap and checked by MCMC methods. Differences between model fits and inferences are explored. Skewed parametric distributions fit cost data better than the normal distribution, and should in principle be preferred for estimating the population mean cost. However, for some data sets, we find that models that fit badly can give similar inferences to those that fit well. Conversely, particularly when sample sizes are not large, different parametric models that fit the data equally well can lead to substantially different inferences. We conclude that inferences are sensitive to the choice of statistical model, which itself can remain uncertain unless there is enough data to model the tail of the distribution accurately. Investigating the sensitivity of conclusions to the choice of model should thus be an essential component of analysing cost data in practice. Copyright 2004 John Wiley & Sons, Ltd.
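    The modelling strategy described above can be sketched in a few lines: fit a skewed distribution to cost data by maximum likelihood and derive a confidence interval for the population mean by parametric bootstrap. The data below are synthetic, and for brevity this sketch uses plain percentile intervals rather than the paper's BC(a) correction.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Hypothetical right-skewed cost data (synthetic; not the paper's data sets).
    costs = rng.gamma(shape=2.0, scale=500.0, size=200)

    # Fit a gamma distribution by maximum likelihood, location fixed at zero.
    shape, loc, scale = stats.gamma.fit(costs, floc=0)
    mean_hat = shape * scale  # ML estimate of the population mean cost

    # Parametric bootstrap: simulate from the fitted model, refit, re-estimate.
    boot_means = []
    for _ in range(500):
        sim = stats.gamma.rvs(shape, loc=0, scale=scale,
                              size=len(costs), random_state=rng)
        s, _, sc = stats.gamma.fit(sim, floc=0)
        boot_means.append(s * sc)

    ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
    print(f"mean estimate: {mean_hat:.1f}  95% CI: ({ci_low:.1f}, {ci_high:.1f})")
    ```

    Swapping `stats.gamma` for `stats.lognorm` or `stats.fisk` (log-logistic) reproduces the paper's model comparison.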

  13. Cost and efficiency of disaster waste disposal: A case study of the Great East Japan Earthquake.

    PubMed

    Sasao, Toshiaki

    2016-12-01

    This paper analyzes the cost and efficiency of waste disposal associated with the Great East Japan Earthquake. The following two analyses were performed: (1) a popular parametric approach, an ordinary least squares (OLS) regression, to estimate the factors that affect disposal costs; and (2) a non-parametric approach, a two-stage data envelopment analysis (DEA), to analyze the efficiency of each municipality and identify the best-performing disaster waste management. Our results indicate that a higher recycling rate of disaster waste and a larger amount of tsunami sediments decrease the average disposal costs. Our results also indicate that area-wide management increases the average cost. In addition, the efficiency scores were observed to vary widely by municipality, and more temporary incinerators and secondary waste stocks improve the efficiency scores. However, it is likely that the radioactive contamination from the Fukushima Daiichi nuclear power station influenced the results. Copyright © 2016 Elsevier Ltd. All rights reserved.
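    The first of the two analyses, regressing average disposal cost on candidate drivers by OLS, can be illustrated on synthetic data whose coefficient signs mirror the paper's findings (the variable names and magnitudes here are illustrative, not the study's data):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 60  # hypothetical number of municipalities

    recycling_rate = rng.uniform(0.2, 0.9, n)
    tsunami_sediment = rng.uniform(0.0, 1.0, n)   # share of tsunami sediments
    area_wide = rng.integers(0, 2, n)             # 1 = area-wide management

    # Average disposal cost: falls with recycling and sediments, rises with
    # area-wide management (signs chosen to mirror the reported findings).
    avg_cost = (100 - 30 * recycling_rate - 20 * tsunami_sediment
                + 15 * area_wide + rng.normal(0, 2, n))

    # OLS via least squares on the design matrix [1, x1, x2, x3].
    X = np.column_stack([np.ones(n), recycling_rate, tsunami_sediment, area_wide])
    beta, *_ = np.linalg.lstsq(X, avg_cost, rcond=None)
    print(dict(zip(["intercept", "recycling", "sediment", "area_wide"],
                   np.round(beta, 1))))
    ```

    The second-stage DEA efficiency analysis requires a linear-programming solver and is omitted here.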

  14. Update on Multi-Variable Parametric Cost Models for Ground and Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda

    2012-01-01

    Parametric cost models can be used by designers and project managers to perform relative cost comparisons between major architectural cost drivers and allow high-level design trades; enable cost-benefit analysis for technology development investment; and, provide a basis for estimating total project cost between related concepts. This paper reports on recent revisions and improvements to our ground telescope cost model and refinements of our understanding of space telescope cost models. One interesting observation is that while space telescopes are 50X to 100X more expensive than ground telescopes, their respective scaling relationships are similar. Another interesting speculation is that the role of technology development may be different between ground and space telescopes. For ground telescopes, the data indicates that technology development tends to reduce cost by approximately 50% every 20 years. But for space telescopes, there appears to be no such cost reduction because we do not tend to re-fly similar systems. Thus, instead of reducing cost, 20 years of technology development may be required to enable a doubling of space telescope capability. Other findings include: mass should not be used to estimate cost; spacecraft and science instrument costs account for approximately 50% of total mission cost; and, integration and testing accounts for only about 10% of total mission cost.

  15. Parametric Cost and Schedule Modeling for Early Technology Development

    DTIC Science & Technology

    2018-04-02

    Received the Best Paper in the Analysis Methods Category and 2017 Best Paper Overall awards; also presented at the 2017 NASA Cost and Schedule Symposium. ...information contribute to the lack of data, objective models, and methods that can be broadly applied in early planning stages.

  16. Development of Regional Supply Functions and a Least-Cost Model for Allocating Water Resources in Utah: A Parametric Linear Programming Approach.

    DTIC Science & Technology

    SYSTEMS ANALYSIS, * WATER SUPPLIES, MATHEMATICAL MODELS, OPTIMIZATION, ECONOMICS, LINEAR PROGRAMMING, HYDROLOGY, REGIONS, ALLOCATIONS, RESTRAINT, RIVERS, EVAPORATION, LAKES, UTAH, SALVAGE, MINES(EXCAVATIONS).

  17. The total assessment profile, volume 2. [including societal impact, cost effectiveness, and economic analysis]

    NASA Technical Reports Server (NTRS)

    Leininger, G.; Jutila, S.; King, J.; Muraco, W.; Hansell, J.; Lindeen, J.; Franckowiak, E.; Flaschner, A.

    1975-01-01

    Appendices are presented which include discussions of interest formulas, factors in regionalization, parametric modeling of discounted benefit-sacrifice streams, engineering economic calculations, and product innovation. For Volume 1, see .

  18. Non-parametric methods for cost-effectiveness analysis: the central limit theorem and the bootstrap compared.

    PubMed

    Nixon, Richard M; Wonderling, David; Grieve, Richard D

    2010-03-01

    Cost-effectiveness analyses (CEA) alongside randomised controlled trials commonly estimate incremental net benefits (INB), with 95% confidence intervals, and compute cost-effectiveness acceptability curves and confidence ellipses. Two alternative non-parametric methods for estimating INB are to apply the central limit theorem (CLT) or to use the non-parametric bootstrap method, although it is unclear which method is preferable. This paper describes the statistical rationale underlying each of these methods and illustrates their application with a trial-based CEA. It compares the sampling uncertainty from using either technique in a Monte Carlo simulation. The experiments are repeated varying the sample size and the skewness of costs in the population. The results showed that, even when data were highly skewed, both methods accurately estimated the true standard errors (SEs) when sample sizes were moderate to large (n>50), and also gave good estimates for small data sets with low skewness. However, when sample sizes were relatively small and the data highly skewed, using the CLT rather than the bootstrap led to slightly more accurate SEs. We conclude that while in general using either method is appropriate, the CLT is easier to implement, and provides SEs that are at least as accurate as the bootstrap. (c) 2009 John Wiley & Sons, Ltd.
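    The two standard-error estimators compared above are easy to state side by side. A minimal sketch on synthetic skewed net-benefit data (values illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical skewed incremental net benefits for n > 50 patients.
    inb = rng.lognormal(mean=7.0, sigma=1.2, size=200) - 800.0

    # CLT standard error of the mean INB: s / sqrt(n).
    se_clt = inb.std(ddof=1) / np.sqrt(len(inb))

    # Non-parametric bootstrap SE: standard deviation of resampled means.
    boot = [rng.choice(inb, size=len(inb), replace=True).mean()
            for _ in range(2000)]
    se_boot = np.std(boot, ddof=1)

    print(f"CLT SE: {se_clt:.1f}   bootstrap SE: {se_boot:.1f}")
    ```

    As the paper finds for moderate-to-large samples, the two estimates agree closely even for skewed data, while the CLT version is a one-liner.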

  19. Towards a Multi-Variable Parametric Cost Model for Ground and Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd

    2016-01-01

    Parametric cost models can be used by designers and project managers to perform relative cost comparisons between major architectural cost drivers and allow high-level design trades; enable cost-benefit analysis for technology development investment; and, provide a basis for estimating total project cost between related concepts. This paper hypothesizes a single model, based on published models and engineering intuition, for both ground and space telescopes: OTA Cost ~ X * D^(1.75 +/- 0.05) * lambda^(-0.5 +/- 0.25) * T^(-0.25) * e^(-0.04 Y). Specific findings include: space telescopes cost 50X to 100X more than ground telescopes; diameter is the most important CER variable; cost is reduced by approximately 50% every 20 years (presumably because of technology advances and process improvements); and, for space telescopes, cost associated with wavelength performance is balanced by cost associated with operating temperature. Finally, duplication only reduces cost for the manufacture of identical systems (i.e., multiple aperture sparse arrays or interferometers). And, while duplication does reduce the cost of manufacturing the mirrors of a segmented primary mirror, this cost savings does not appear to manifest itself in the final primary mirror assembly (presumably because the structure for a segmented mirror is more complicated than for a monolithic mirror).
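    As a quick illustration of how the hypothesized CER behaves, the sketch below computes relative cost ratios from its central exponents (ignoring the stated uncertainty bands; the function name is illustrative):

    ```python
    import math

    def ota_cost_ratio(d_ratio, wavelength_ratio=1.0, temp_ratio=1.0, years=0.0):
        """Relative OTA cost from the hypothesized CER (central exponents):
        cost ~ D^1.75 * lambda^-0.5 * T^-0.25 * e^(-0.04*Y)."""
        return (d_ratio ** 1.75) * (wavelength_ratio ** -0.5) \
            * (temp_ratio ** -0.25) * math.exp(-0.04 * years)

    # Doubling aperture diameter multiplies cost by 2^1.75 ~ 3.36.
    print(round(ota_cost_ratio(2.0), 2))            # 3.36
    # 20 years of technology advance: e^(-0.8) ~ 0.45, the ~50% reduction cited.
    print(round(ota_cost_ratio(1.0, years=20), 2))  # 0.45
    ```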

  20. Multivariable parametric cost model for space and ground telescopes

    NASA Astrophysics Data System (ADS)

    Stahl, H. Philip; Henrichs, Todd

    2016-09-01

    Parametric cost models can be used by designers and project managers to perform relative cost comparisons between major architectural cost drivers and allow high-level design trades; enable cost-benefit analysis for technology development investment; and, provide a basis for estimating total project cost between related concepts. This paper hypothesizes a single model, based on published models and engineering intuition, for both ground and space telescopes: OTA Cost ~ X * D^(1.75 +/- 0.05) * λ^(-0.5 +/- 0.25) * T^(-0.25) * e^(-0.04 Y). Specific findings include: space telescopes cost 50X to 100X more than ground telescopes; diameter is the most important CER variable; cost is reduced by approximately 50% every 20 years (presumably because of technology advances and process improvements); and, for space telescopes, cost associated with wavelength performance is balanced by cost associated with operating temperature. Finally, duplication only reduces cost for the manufacture of identical systems (i.e., multiple aperture sparse arrays or interferometers). And, while duplication does reduce the cost of manufacturing the mirrors of a segmented primary mirror, this cost savings does not appear to manifest itself in the final primary mirror assembly (presumably because the structure for a segmented mirror is more complicated than for a monolithic mirror).

  1. Update on Parametric Cost Models for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda

    2011-01-01

    Since the June 2010 Astronomy Conference, an independent review of our cost data base discovered some inaccuracies and inconsistencies which can modify our previously reported results. This paper will review the changes to the data base, our confidence in those changes, and their effect on various parametric cost models.

  2. Modeling personnel turnover in the parametric organization

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1991-01-01

    A model is developed for simulating the dynamics of a newly formed organization, credible during all phases of organizational development. The model development process is broken down into the activities of determining the tasks required for parametric cost analysis (PCA), determining the skills required for each PCA task, determining the skills available in the applicant marketplace, determining the structure of the model, implementing the model, and testing it. The model, parameterized by the likelihood of job function transition, has demonstrated the capability to represent the transition of personnel across functional boundaries within a parametric organization using a linear dynamical system, and the ability to predict required staffing profiles to meet functional needs at the desired time. The model can be extended by revisions of the state and transition structure to provide refinements in functional definition for the parametric and extended organization.

  3. Weight and the Future of Space Flight Hardware Cost Modeling

    NASA Technical Reports Server (NTRS)

    Prince, Frank A.

    2003-01-01

    Weight has been used as the primary input variable for cost estimating almost as long as there have been parametric cost models. While there are good reasons for using weight, serious limitations exist. These limitations have been addressed by multi-variable equations and trend analysis in models such as NAFCOM, PRICE, and SEER; however, these models have not been able to address the significant time lags that can occur between the development of similar space flight hardware systems. These time lags make the cost analyst's job difficult because insufficient data exist to perform trend analysis, and the current set of parametric models is not well suited to accommodating process improvements in space flight hardware design, development, build, and test. As a result, people of good faith can have serious disagreements over the cost of new systems. To address these shortcomings, new cost modeling approaches are needed. The most promising approach is process-based (sometimes called activity) costing. Developing process-based models will require a detailed understanding of the functions required to produce space flight hardware combined with innovative approaches to estimating the necessary resources. Particularly challenging will be the lack of data at the process level. One method for developing a model is to combine notional algorithms with a discrete event simulation and model changes to the total cost as perturbations to the program are introduced. Despite these challenges, the potential benefits are such that efforts should be focused on developing process-based cost models.

  4. Astronomy sortie mission definition study. Addendum: Follow-on analyses

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Results of design analyses, trade studies, and planning data of the Astronomy Sortie Mission Definition Study are presented. An in-depth analysis of UV instruments, nondeployed solar payload, and on-orbit access is presented. Planning data are considered, including the cost and schedules associated with the astronomy instruments and/or support hardware. Costs are presented in a parametric fashion.

  5. Uncertainty importance analysis using parametric moment ratio functions.

    PubMed

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2014-02-01

    This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of inputs are changed; the emphasis is put on the mean and variance ratio functions with respect to the variances of model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction in the model output mean and variance by operating on the variances of model inputs. The unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed to implement the proposed importance analysis with these estimators, so the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure to achieve a targeted 50% reduction of the model output variance. © 2013 Society for Risk Analysis.

  6. Multi-Response Optimization of Laser Micro Marking Process: A Grey-Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Shivakoti, I.; Das, P. P.; Kibria, G.; Pradhan, B. B.; Mustafa, Z.; Ghadai, R. K.

    2017-07-01

    The selection of an optimal parametric combination for efficient machining has always been a challenging issue for manufacturing researchers. The optimal parametric combination provides better machining, which improves productivity and product quality and subsequently reduces production cost and time. The paper presents a hybrid approach of Grey relational analysis and fuzzy logic to obtain the optimal parametric combination for better laser beam micro marking on Gallium Nitride (GaN) work material. Response surface methodology has been implemented for the design of experiments, considering three parameters, each at five levels. Current, frequency, and scanning speed were taken as the process parameters, with mark width, mark depth, and mark intensity as the process responses.

  7. Latest NASA Instrument Cost Model (NICM): Version VI

    NASA Technical Reports Server (NTRS)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

    The NASA Instrument Cost Model, NICM, is a suite of tools which allow for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost estimation by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, which is applicable to instruments flying on Explorer-like class missions; 2) a new cluster analysis capability which, alongside the results of the parametric cost estimation for the user's instrument, provides a visualization of the instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.

  8. Launcher Systems Development Cost: Behavior, Uncertainty, Influences, Barriers and Strategies for Reduction

    NASA Technical Reports Server (NTRS)

    Shaw, Eric J.

    2001-01-01

    This paper will report on the activities of the IAA Launcher Systems Economics Working Group in preparation for its Launcher Systems Development Cost Behavior Study. The Study goals include: improve launcher system and other space system parametric cost analysis accuracy; improve launcher system and other space system cost analysis credibility; and provide launcher system and technology development program managers and other decision makers with useful information on the development cost impacts of their decisions. The Working Group plans to explore at least the following five areas in the Study: define and explain development cost behavior terms and concepts for use in the Study; identify and quantify sources of development cost and cost estimating uncertainty; identify and quantify significant influences on development cost behavior; identify common barriers to development cost understanding and reduction; and recommend practical, realistic strategies to accomplish reductions in launcher system development cost.

  9. X-1 to X-Wings: Developing a Parametric Cost Model

    NASA Technical Reports Server (NTRS)

    Sterk, Steve; McAtee, Aaron

    2015-01-01

    In today's cost-constrained environment, NASA needs an X-Plane database and parametric cost model that can quickly provide rough order of magnitude predictions of cost from initial concept to first flight of potential X-Plane aircraft. This paper takes a look at the steps taken in developing such a model and reports the results. The challenges encountered in the collection of historical data and recommendations for future database management are discussed. A step-by-step discussion of the development of Cost Estimating Relationships (CERs) is then covered.

  10. Orbit transfer vehicle engine study. Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The orbit transfer vehicle (OTV) engine study provided parametric performance, engine programmatic, and cost data on the complete propulsive spectrum that is available for a variety of high energy, space maneuvering missions. Candidate OTV engines from the near term RL 10 (and its derivatives) to advanced high performance expander and staged combustion cycle engines were examined. The RL 10/RL 10 derivative performance, cost and schedule data were updated and provisions defined which would be necessary to accommodate extended low thrust operation. Parametric performance, weight, envelope, and cost data were generated for advanced expander and staged combustion OTV engine concepts. A prepoint design study was conducted to optimize thrust chamber geometry and cooling, engine cycle variations, and controls for an advanced expander engine. Operation at low thrust was defined for the advanced expander engine and the feasibility and design impact of kitting was investigated. An analysis of crew safety and mission reliability was conducted for both the staged combustion and advanced expander OTV engine candidates.

  11. Developing integrated parametric planning models for budgeting and managing complex projects

    NASA Technical Reports Server (NTRS)

    Etnyre, Vance A.; Black, Ken U.

    1988-01-01

    The applicability of integrated parametric models for the budgeting and management of complex projects is investigated. Methods for building a very flexible, interactive prototype for a project planning system, and software resources available for this purpose, are discussed and evaluated. The prototype is required to be sensitive to changing objectives, changing target dates, changing costs relationships, and changing budget constraints. To achieve the integration of costs and project and task durations, parametric cost functions are defined by a process of trapezoidal segmentation, where the total cost for the project is the sum of the various project cost segments, and each project cost segment is the integral of a linearly segmented cost loading function over a specific interval. The cost can thus be expressed algebraically. The prototype was designed using Lotus-123 as the primary software tool. This prototype implements a methodology for interactive project scheduling that provides a model of a system that meets most of the goals for the first phase of the study and some of the goals for the second phase.
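    The trapezoidal-segmentation idea described above reduces to elementary arithmetic: each segment's cost is the exact area under a linearly varying loading rate. A minimal sketch with hypothetical numbers:

    ```python
    def segment_cost(t0, t1, r0, r1):
        """Cost of one project segment: integral of a linearly varying
        cost-loading rate r(t) from t0 to t1 (exact trapezoid area)."""
        return 0.5 * (r0 + r1) * (t1 - t0)

    def total_cost(breakpoints, rates):
        """Sum segment integrals over a piecewise-linear loading function.
        breakpoints: times t_0..t_n; rates: loading rate at each breakpoint."""
        return sum(segment_cost(breakpoints[i], breakpoints[i + 1],
                                rates[i], rates[i + 1])
                   for i in range(len(breakpoints) - 1))

    # Hypothetical project: ramp-up, plateau, ramp-down ($k/month over months).
    print(total_cost([0, 3, 9, 12], [0, 40, 40, 0]))  # 60 + 240 + 60 = 360.0
    ```

    Because each segment is linear, the trapezoid formula is exact rather than an approximation, which is what makes the total cost expressible algebraically.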

  12. Experiment in multiple-criteria energy policy analysis

    NASA Astrophysics Data System (ADS)

    Ho, J. K.

    1980-07-01

    An international panel of energy analysts participated in an experiment to use HOPE (holistic preference evaluation): an interactive parametric linear programming method for multiple criteria optimization. The criteria of cost, environmental effect, crude oil, and nuclear fuel were considered, according to BESOM: an energy model for the US in the year 2000.

  13. Modeling Personnel Turnover in the Parametric Organization

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1991-01-01

    A primary issue in organizing a new parametric cost analysis function is to determine the skill mix and number of personnel required. The skill mix can be obtained by a functional decomposition of the tasks required within the organization and a matrixed correlation with educational or experience backgrounds. The number of personnel is a function of the skills required to cover all tasks, personnel skill background and cross training, the intensity of the workload for each task, migration through various tasks by personnel along a career path, personnel hiring limitations imposed by management and the applicant marketplace, personnel training limitations imposed by management and personnel capability, and the rate at which personnel leave the organization for whatever reason. Faced with the task of relating all of these organizational facets in order to grow a parametric cost analysis (PCA) organization from scratch, it was decided that a dynamic model was required in order to account for the obvious dynamics of the forming organization. The challenge was to create a simple model that would be credible during all phases of organizational development. The model development process was broken down into the activities of determining the tasks required for PCA, determining the skills required for each PCA task, determining the skills available in the applicant marketplace, determining the structure of the dynamic model, implementing the dynamic model, and testing the dynamic model.

  14. Parametric study of potential early commercial power plants Task 3-A MHD cost analysis

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The development of costs for an MHD power plant and the comparison of these costs to a conventional coal-fired power plant are reported. The program is divided into three activities: (1) code of accounts review; (2) MHD pulverized coal power plant cost comparison; (3) operating and maintenance cost estimates. The scope of each NASA code of account item was defined to assure that the recently completed Task 3 capital cost estimates are consistent with the code of account scope. Improving confidence in MHD plant capital cost estimates by identifying comparability with conventional pulverized coal fired (PCF) power plant systems is undertaken. The basis for estimating the MHD plant operating and maintenance costs of electricity is verified.

  15. Air Brayton Solar Receiver, phase 1

    NASA Technical Reports Server (NTRS)

    Zimmerman, D. K.

    1979-01-01

    A six-month analysis and conceptual design study of an open cycle Air Brayton Solar Receiver (ABSR) for use on a tracking, parabolic solar concentrator is discussed. The ABSR, which includes a buffer storage system, is designed to provide inlet air to a power conversion unit. Parametric analyses, conceptual design, interface requirements, and production cost estimates are described. The design features were optimized to yield a zero-maintenance, low cost, high efficiency concept that will provide a 30-year operational life.

  16. Parametric analysis of ATT configurations.

    NASA Technical Reports Server (NTRS)

    Lange, R. H.

    1972-01-01

    This paper describes the results of a Lockheed parametric analysis of the performance, environmental factors, and economics of an advanced commercial transport envisioned for operation in the post-1985 time period. The design parameters investigated include cruise speeds from Mach 0.85 to Mach 1.0, passenger capacities from 200 to 500, ranges of 2800 to 5500 nautical miles, and noise level criteria. NASA high performance configurations and alternate configurations are operated over domestic and international route structures. Indirect and direct costs and return on investment are determined for approximately 40 candidate aircraft configurations. The candidate configurations are input to an aircraft sizing and performance program which includes a subroutine for noise criteria. Comparisons are made between preferred configurations on the basis of maximum return on investment as a function of payload, range, and design cruise speed.

  17. Cost-effectiveness analysis of trastuzumab emtansine (T-DM1) in human epidermal growth factor receptor 2 (HER2)-positive advanced breast cancer.

    PubMed

    Le, Quang A; Bae, Yuna H; Kang, Jenny H

    2016-10-01

    The EMILIA trial demonstrated that trastuzumab emtansine (T-DM1) significantly increased the median progression-free and overall survival relative to combination therapy with lapatinib plus capecitabine (LC) in patients with HER2-positive advanced breast cancer (ABC) previously treated with trastuzumab and a taxane. We performed an economic analysis of T-DM1 as a second-line therapy compared to LC and monotherapy with capecitabine (C) from the perspectives of both the US payer and society. We developed four possible Markov models for ABC to compare the projected lifetime costs and outcomes of T-DM1, LC, and C. Model transition probabilities were estimated from the EMILIA and EGF100151 clinical trials. Direct costs of the therapies, major adverse events, laboratory tests, and disease progression, indirect costs (productivity losses due to morbidity and mortality), and health utilities were obtained from published sources. The models used a 3% discount rate and are reported in 2015 US dollars. Probabilistic sensitivity analysis and model averaging were used to account for model parametric and structural uncertainty. When incorporating both parametric and structural uncertainty, the resulting incremental cost-effectiveness ratios (ICERs) comparing T-DM1 to LC and T-DM1 to C were $183,828 per quality-adjusted life year (QALY) and $126,001/QALY from the societal perspective, respectively. From the payer's perspective, the ICERs were $220,385/QALY (T-DM1 vs. LC) and $168,355/QALY (T-DM1 vs. C). From both the US payer and societal perspectives, T-DM1 is not cost-effective compared to the LC combination therapy at a willingness-to-pay threshold of $150,000/QALY. T-DM1 might have a better chance of being cost-effective compared to capecitabine monotherapy from the US societal perspective.
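    The ICER arithmetic underlying these comparisons is simple. The sketch below uses hypothetical lifetime costs and QALY values (illustrative only, not the Markov model's outputs):

    ```python
    def icer(cost_new, cost_old, qaly_new, qaly_old):
        """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    # Hypothetical lifetime costs and QALYs for a new therapy vs. a comparator.
    ratio = icer(cost_new=250_000, cost_old=140_000, qaly_new=1.9, qaly_old=1.3)
    print(round(ratio))     # dollars per QALY gained
    print(ratio > 150_000)  # True: exceeds a $150,000/QALY threshold
    ```

    In a probabilistic sensitivity analysis, this calculation is repeated across thousands of sampled parameter sets to trace out a cost-effectiveness acceptability curve.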

  18. How to perform a cost-effectiveness analysis with surrogate endpoint: renal denervation in patients with resistant hypertension (DENERHTN) trial as an example.

    PubMed

    Bulsei, Julie; Darlington, Meryl; Durand-Zaleski, Isabelle; Azizi, Michel

    2018-04-01

    Whilst much uncertainty exists as to the efficacy of renal denervation (RDN), the positive results of the DENERHTN study in France confirmed the interest of an economic evaluation to assess the efficiency of RDN and inform local decision makers about the costs and benefits of this intervention. The uncertainty surrounding both the outcomes and the costs can be described using health economic methods such as the non-parametric bootstrap. Internationally, numerous health economic studies using cost-effectiveness models to assess the impact of RDN in terms of cost and effectiveness compared to antihypertensive medical treatment have been conducted. The DENERHTN cost-effectiveness study was the first health economic evaluation specifically designed to assess the cost-effectiveness of RDN using individual patient data. Using the DENERHTN results as an example, we provide here a summary of the principal methods used to perform a cost-effectiveness analysis.

  19. Advanced transportation system studies technical area 2 (TA-2): Heavy lift launch vehicle development. Volume 3: Program Cost Estimates

    NASA Technical Reports Server (NTRS)

    McCurry, J. B.

    1995-01-01

    The purpose of the TA-2 contract was to provide advanced launch vehicle concept definition and analysis to assist NASA in the identification of future launch vehicle requirements. Contracted analysis activities included vehicle sizing and performance analysis, subsystem concept definition, propulsion subsystem definition (foreign and domestic), ground operations and facilities analysis, and life cycle cost estimation. The basic period of performance of the TA-2 contract was from May 1992 through May 1993. No-cost extensions were exercised on the contract from June 1993 through July 1995. This document is part of the final report for the TA-2 contract. The final report consists of three volumes: Volume 1 is the Executive Summary, Volume 2 is Technical Results, and Volume 3 is Program Cost Estimates. The document-at-hand, Volume 3, provides a work breakdown structure dictionary, user's guide for the parametric life cycle cost estimation tool, and final report developed by ECON, Inc., under subcontract to Lockheed Martin on TA-2 for the analysis of heavy lift launch vehicle concepts.

  20. The cost of doing business: cost structure of electronic immunization registries.

    PubMed

    Fontanesi, John M; Flesher, Don S; De Guire, Michelle; Lieberthal, Allan; Holcomb, Kathy

    2002-10-01

    To predict the true cost of developing and maintaining an electronic immunization registry, and to set the framework for developing future cost-effectiveness and cost-benefit analyses. Primary data collected at three immunization registries located in California, accounting for 90 percent of all immunization records in registries in the state during the study period. A parametric cost analysis compared registry development and maintenance expenditures to registry performance requirements. Data were collected at each registry through interviews, reviews of expenditure records, technical accomplishments development schedules, and immunization coverage rates. The cost of building immunization registries is predictable and independent of the hardware/software combination employed. The effort requires four man-years of technical effort or approximately $250,000 in 1998 dollars. Costs for maintaining a registry were approximately $5,100 per end user per three-year period. There is a predictable cost structure for both developing and maintaining immunization registries. The cost structure can be used as a framework for examining the cost-effectiveness and cost-benefits of registries. The greatest factor affecting improvement in coverage rates was ongoing, user-based administrative investment.

  1. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  2. Prospects for reduced energy transports: A preliminary analysis

    NASA Technical Reports Server (NTRS)

    Ardema, M. D.; Harper, M.; Smith, C. L.; Waters, M. H.; Williams, L. J.

    1974-01-01

    The recent energy crisis and subsequent substantial increase in fuel prices have provided increased incentive to reduce the fuel consumption of civil transport aircraft. At the present time many changes in operational procedures have been introduced to decrease fuel consumption of the existing fleet. In the future, however, it may become desirable or even necessary to introduce new fuel-conservative aircraft designs. This paper reports the results of a preliminary study of new near-term fuel conservative aircraft. A parametric study was made to determine the effects of cruise Mach number and fuel cost on the optimum configuration characteristics and on economic performance. For each design, the wing geometry was optimized to give maximum return on investment at a particular fuel cost. Based on the results of the parametric study, a nominal reduced energy configuration was selected. Compared with existing transport designs, the reduced energy design has a higher aspect ratio wing with lower sweep, and cruises at a lower Mach number. It has about 30% less fuel consumption on a seat-mile basis.

  3. Stirling heat pump external heat systems - An appliance perspective

    NASA Astrophysics Data System (ADS)

    Vasilakis, Andrew D.; Thomas, John F.

    A major issue facing the Stirling Engine Heat Pump is system cost, and, in particular, the cost of the External Heat System (EHS). The need for high temperature at the heater head (600 C to 700 C) results in low combustion system efficiencies unless efficient heat recovery is employed. The balance between energy efficiency and use of costly high temperature materials is critical to design and cost optimization. Blower power consumption and NO(x) emissions are also important. A new approach to the design and cost optimization of the EHS was taken by viewing the system from a natural gas-fired appliance perspective. To develop a design acceptable to gas industry requirements, American National Standards Institute (ANSI) code considerations were incorporated into the design process and material selections. A parametric engineering design and cost model was developed to perform the analysis, including the impact of design on NO(x) emissions. Analysis results and recommended EHS design and material choices are given.

  4. Stirling heat pump external heat systems: An appliance perspective

    NASA Astrophysics Data System (ADS)

    Vasilakis, A. D.; Thomas, J. F.

    1992-08-01

    A major issue facing the Stirling Engine Heat Pump is system cost, and, in particular, the cost of the External Heat System (EHS). The need for high temperature at the heater head (600 C to 700 C) results in low combustion system efficiencies unless efficient heat recovery is employed. The balance between energy efficiency and use of costly high temperature materials is critical to design and cost optimization. Blower power consumption and NO(x) emissions are also important. A new approach to the design and cost optimization of the EHS system was taken by viewing the system from a natural gas-fired appliance perspective. To develop a design acceptable to gas industry requirements, American National Standards Institute (ANSI) code considerations were incorporated into the design process and material selections. A parametric engineering design and cost model was developed to perform the analysis, including the impact of design on NO(x) emissions. Analysis results and recommended EHS design and material choices are given.

  5. Bridge maintenance to enhance corrosion resistance and performance of steel girder bridges

    NASA Astrophysics Data System (ADS)

    Moran Yanez, Luis M.

    The integrity and efficiency of any national highway system relies on the condition of the various components. Bridges are fundamental elements of a highway system, representing an important investment and a strategic link that facilitates the transport of persons and goods. The cost to rehabilitate or replace a highway bridge represents an important expenditure to the owner, who needs to evaluate the correct time to assume that cost. Among the several factors that affect the condition of steel highway bridges, corrosion is identified as the main problem. In the USA corrosion is the primary cause of structurally deficient steel bridges. The benefit of regular high-pressure superstructure washing and spot painting were evaluated as effective maintenance activities to reduce the corrosion process. The effectiveness of steel girder washing was assessed by developing models of corrosion deterioration of composite steel girders and analyzing steel coupons at the laboratory under atmospheric corrosion for two alternatives: when high-pressure washing was performed and when washing was not considered. The effectiveness of spot painting was assessed by analyzing the corrosion on steel coupons, with small damages, unprotected and protected by spot painting. A parametric analysis of corroded steel girder bridges was considered. The emphasis was focused on the parametric analyses of corroded steel girder bridges under two alternatives: (a) when steel bridge girder washing is performed according to a particular frequency, and (b) when no bridge washing is performed to the girders. The reduction of structural capacity was observed for both alternatives along the structure service life, estimated at 100 years. An economic analysis, using the Life-Cycle Cost Analysis method, demonstrated that it is more cost-effective to perform steel girder washing as a scheduled maintenance activity in contrast to the no washing alternative.

  6. Advanced Technology Lifecycle Analysis System (ATLAS)

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Mankins, John C.

    2004-01-01

    Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts in system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprised of missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric cost model to determine the costs. The integrator also estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the experts' time is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions. When the analyst is satisfied with the system configurations, technology portfolios, and deployment strategies, he or she can present the concepts to a team, which will conduct a detailed, discipline-oriented analysis within a CEE. An analog to this approach is the music industry, where a songwriter creates the lyrics and music before entering a recording studio.

  7. Information transfer satellite concept study. Volume 4: computer manual

    NASA Technical Reports Server (NTRS)

    Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.

    1971-01-01

    The Satellite Telecommunications Analysis and Modeling Program (STAMP) provides the user with a flexible and comprehensive tool for the analysis of ITS system requirements. While obtaining minimum cost design points, the program enables the user to perform studies over a wide range of user requirements and parametric demands. The program utilizes a total system approach wherein the ground uplink and downlink, the spacecraft, and the launch vehicle are simultaneously synthesized. A steepest descent algorithm is employed to determine the minimum total system cost design subject to the fixed user requirements and imposed constraints. In the process of converging to the solution, the pertinent subsystem tradeoffs are resolved. This report documents STAMP through a technical analysis and a description of the principal techniques employed in the program.

  8. Some advanced parametric methods for assessing waveform distortion in a smart grid with renewable generation

    NASA Astrophysics Data System (ADS)

    Alfieri, Luisa

    2015-12-01

    Power quality (PQ) disturbances are becoming an important issue in smart grids (SGs) due to the significant economic consequences that they can generate on sensitive loads. However, SGs include several distributed energy resources (DERs) that can be interconnected to the grid with static converters, which lead to a reduction of the PQ levels. Among DERs, wind turbines and photovoltaic systems are expected to be used extensively due to the forecasted reduction in investment costs and other economic incentives. These systems can introduce significant time-varying voltage and current waveform distortions that require advanced spectral analysis methods. This paper provides an application of advanced parametric methods for assessing waveform distortions in SGs with dispersed generation. In particular, the standard International Electrotechnical Commission (IEC) method, some parametric methods (such as Prony and the Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT)), and some hybrid methods are critically compared on the basis of their accuracy and the computational effort required.

  9. The Cost of Doing Business: Cost Structure of Electronic Immunization Registries

    PubMed Central

    Fontanesi, John M; Flesher, Don S; De Guire, Michelle; Lieberthal, Allan; Holcomb, Kathy

    2002-01-01

    Objective To predict the true cost of developing and maintaining an electronic immunization registry, and to set the framework for developing future cost-effectiveness and cost-benefit analyses. Data Sources/Study Setting Primary data collected at three immunization registries located in California, accounting for 90 percent of all immunization records in registries in the state during the study period. Study Design A parametric cost analysis compared registry development and maintenance expenditures to registry performance requirements. Data Collection/Extraction Methods Data were collected at each registry through interviews, reviews of expenditure records, technical accomplishments development schedules, and immunization coverage rates. Principal Findings The cost of building immunization registries is predictable and independent of the hardware/software combination employed. The effort requires four man-years of technical effort or approximately $250,000 in 1998 dollars. Costs for maintaining a registry were approximately $5,100 per end user per three-year period. Conclusions There is a predictable cost structure for both developing and maintaining immunization registries. The cost structure can be used as a framework for examining the cost-effectiveness and cost-benefits of registries. The greatest factor affecting improvement in coverage rates was ongoing, user-based administrative investment. PMID:12479497

  10. Democratizing science with the aid of parametric design and additive manufacturing: Design and fabrication of a versatile and low-cost optical instrument for scattering measurement.

    PubMed

    Nadal-Serrano, Jose M; Nadal-Serrano, Adolfo; Lopez-Vallejo, Marisa

    2017-01-01

    This paper focuses on the application of rapid prototyping techniques using additive manufacturing in combination with parametric design to create low-cost, yet accurate and reliable instruments. The methodology followed makes it possible to make instruments with a degree of customization until now available only to a narrow audience, helping democratize science. The proposal discusses a holistic design-for-manufacturing approach that comprises advanced modeling techniques, open-source design strategies, and an optimization algorithm using free parametric software for both professional and educational purposes. The design and fabrication of an instrument for scattering measurement is used as a case study to present the previous concepts.

  11. Democratizing science with the aid of parametric design and additive manufacturing: Design and fabrication of a versatile and low-cost optical instrument for scattering measurement

    PubMed Central

    Lopez-Vallejo, Marisa

    2017-01-01

    This paper focuses on the application of rapid prototyping techniques using additive manufacturing in combination with parametric design to create low-cost, yet accurate and reliable instruments. The methodology followed makes it possible to make instruments with a degree of customization until now available only to a narrow audience, helping democratize science. The proposal discusses a holistic design-for-manufacturing approach that comprises advanced modeling techniques, open-source design strategies, and an optimization algorithm using free parametric software for both professional and educational purposes. The design and fabrication of an instrument for scattering measurement is used as a case study to present the previous concepts. PMID:29112987

  12. An economics systems analysis of land mobile radio telephone services

    NASA Technical Reports Server (NTRS)

    Leroy, B. E.; Stevenson, S. M.

    1980-01-01

    The economic interaction of the terrestrial and satellite systems is considered. Parametric equations are formulated to allow examination of necessary user thresholds and growth rates as a function of system costs. Conversely, first order allowable systems costs are found as a function of user thresholds and growth rates. Transitions between satellite and terrestrial service systems are examined. User growth rate density (user/year/sq km) is shown to be a key parameter in the analysis of systems compatibility. The concept of system design matching the price/demand curves is introduced and examples are given. The role of satellite systems is critically examined and the economic conditions necessary for the introduction of satellite service are identified.
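
    The parametric relations described, tying allowable system costs to user thresholds, can be illustrated with a first-order break-even calculation under a linear total-cost model. The function name and cost structure are assumptions for illustration, not the study's actual equations:

    ```python
    def breakeven_users(fixed_sat, var_sat, fixed_terr, var_terr):
        """User threshold above which a satellite system's total cost
        (fixed + variable * users) undercuts the terrestrial system's.
        Assumes higher fixed but lower per-user cost for the satellite."""
        if var_terr <= var_sat:
            raise ValueError("satellite per-user cost must be lower")
        return (fixed_sat - fixed_terr) / (var_terr - var_sat)

    # Hypothetical costs: a large fixed satellite investment pays off
    # once the user base grows past the threshold.
    threshold = breakeven_users(fixed_sat=100.0, var_sat=1.0,
                                fixed_terr=10.0, var_terr=2.0)
    ```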

  13. Topics in the two-dimensional sampling and reconstruction of images. [in remote sensing

    NASA Technical Reports Server (NTRS)

    Schowengerdt, R.; Gray, S.; Park, S. K.

    1984-01-01

    Mathematical analysis of image sampling and interpolative reconstruction is summarized and extended to two dimensions for application to data acquired from satellite sensors such as the Thematic mapper and SPOT. It is shown that sample-scene phase influences the reconstruction of sampled images, adds a considerable blur to the average system point spread function, and decreases the average system modulation transfer function. It is also determined that the parametric bicubic interpolator with alpha = -0.5 is more radiometrically accurate than the conventional bicubic interpolator with alpha = -1, and this at no additional cost. Finally, the parametric bicubic interpolator is found to be suitable for adaptive implementation by relating the alpha parameter to the local frequency content of an image.
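
    The parametric bicubic interpolator referred to here belongs to the cubic-convolution family, whose kernel depends on a single parameter alpha; alpha = -0.5 is the value the study finds more radiometrically accurate than the conventional alpha = -1. A 1-D sketch of the kernel and its use (the 2-D bicubic case applies the same kernel separably along each axis):

    ```python
    import math

    def cubic_kernel(x, alpha=-0.5):
        """Parametric cubic-convolution kernel. alpha = -0.5 is the more
        radiometrically accurate choice; alpha = -1 is the conventional
        cubic discussed in the abstract."""
        x = abs(x)
        if x < 1:
            return (alpha + 2) * x**3 - (alpha + 3) * x**2 + 1
        if x < 2:
            return alpha * (x**3 - 5 * x**2 + 8 * x - 4)
        return 0.0

    def interp1d(samples, t, alpha=-0.5):
        """Interpolate uniformly spaced samples at fractional position t
        (needs one extra sample on each side of the bracketing pair)."""
        i = math.floor(t)
        return sum(samples[i + k] * cubic_kernel(t - (i + k), alpha)
                   for k in (-1, 0, 1, 2))
    ```

    An adaptive implementation, as suggested in the abstract, would vary alpha with the local frequency content rather than holding it fixed.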

  14. Relation between cost of drug treatment and body mass index in people with type 2 diabetes in Latin America.

    PubMed

    Elgart, Jorge Federico; Prestes, Mariana; Gonzalez, Lorena; Rucci, Enzo; Gagliardino, Juan Jose

    2017-01-01

    Despite the frequent association of obesity with type 2 diabetes (T2D), the effect of the former on the cost of drug treatment of the latter has not been specifically addressed. We studied the association of overweight/obesity with the cost of drug treatment of hyperglycemia, hypertension, and dyslipidemia in a population with T2D. This observational study utilized data from the QUALIDIAB database on 3,099 T2D patients seen in Diabetes Centers in Argentina, Chile, Colombia, Peru, and Venezuela. Data were grouped according to body mass index (BMI) as Normal (18.5≤BMI<25), Overweight (25≤BMI<30), and Obese (BMI≥30). Thereafter, we assessed clinical and metabolic data and the cost of drug treatment in each category. Statistical analyses included group comparisons for continuous variables (parametric or non-parametric tests), chi-square tests for differences between proportions, and multivariable regression analysis to assess the association between BMI and the monthly cost of drug treatment. Although all groups showed a comparable degree of glycometabolic control (FBG, HbA1c), we found significant differences in other metabolic control indicators. The total cost of drug treatment of hyperglycemia and associated cardiovascular risk factors (CVRF) increased significantly (p<0.001) with increasing BMI. Hyperglycemia treatment cost increased significantly with BMI, whereas hypertension and dyslipidemia treatment costs did not. Despite different values and percentages of increase, this growing cost profile was reproduced in every participating country. BMI significantly and independently affected hyperglycemia treatment cost. Our study shows for the first time that BMI significantly increases total expenditure on drugs for T2D and its associated CVRF treatment in Latin America.

  15. Parametric cost estimation for space science missions

    NASA Astrophysics Data System (ADS)

    Lillie, Charles F.; Thompson, Bruce E.

    2008-07-01

    Cost estimation for space science missions is critically important in budgeting for successful missions. The process requires consideration of a number of parameters, where many of the values are known only to limited accuracy. The results of cost estimation are not perfect, but must be calculated and compared with the estimates that the government uses for budgeting purposes. Uncertainties in the input parameters result from evolving requirements for missions that are typically the "first of a kind" with "state-of-the-art" instruments and new spacecraft and payload technologies that make it difficult to base estimates on the cost histories of previous missions. Even the cost of heritage avionics is uncertain due to parts obsolescence and the resulting redesign work. Through experience and use of industry best practices developed in participation with the Aerospace Industries Association (AIA), Northrop Grumman has developed a parametric modeling approach that can provide a reasonably accurate cost range and most probable cost for future space missions. During the initial mission phases, the approach uses mass- and power-based cost estimating relationships (CERs) developed with historical data from previous missions. In later mission phases, when the mission requirements are better defined, these estimates are updated with vendors' bids and "bottoms-up", "grass-roots" material and labor cost estimates based on detailed schedules and assigned tasks. In this paper we describe how we develop our CERs for parametric cost estimation and how they can be applied to estimate the costs of future space science missions like those presented to the Astronomy & Astrophysics Decadal Survey Study Committees.
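
    A mass-based CER of the kind described is commonly a power law, cost = a * mass^b, fitted to historical data points in log space. A minimal sketch with made-up data; the specific CER forms used by the authors are not given in the abstract:

    ```python
    import math

    def fit_power_cer(masses, costs):
        """Fit cost = a * mass^b by least squares in log-log space, a
        common form for mass-based cost estimating relationships."""
        xs = [math.log(m) for m in masses]
        ys = [math.log(c) for c in costs]
        n = len(xs)
        xbar, ybar = sum(xs) / n, sum(ys) / n
        b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
        a = math.exp(ybar - b * xbar)
        return a, b

    # Illustrative (made-up) historical points: (mass in kg, cost in $M)
    a, b = fit_power_cer([100, 200, 400], [10, 16, 26])
    estimate = a * 300 ** b  # predicted cost of a hypothetical 300 kg system
    ```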

  16. ENERGY COSTS OF IAQ CONTROL THROUGH INCREASED VENTILATION IN A SMALL OFFICE IN A WARM, HUMID CLIMATE: PARAMETRIC ANALYSIS USING THE DOE-2 COMPUTER MODEL

    EPA Science Inventory

    The report gives results of a series of computer runs using the DOE-2.1E building energy model, simulating a small office in a hot, humid climate (Miami). These simulations assessed the energy and relative humidity (RH) penalties when the outdoor air (OA) ventilation rate is inc...

  17. A Parametric Regression of the Cost of Base Realignment Action (COBRA) Model

    DTIC Science & Technology

    1993-09-20

    Master's thesis, Air Force Institute of Technology (AFIT/GEE/ENS/93S-03), by Douglas D. Hardman, Captain, USAF, and Michael S. Nelson, Captain, USAF, toward the degree of Master of Science in Engineering and Environmental Management; defended 20 September 1993. Approved for public release; distribution unlimited.

  18. Economic analysis and assessment of syngas production using a modeling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Hakkwan; Parajuli, Prem B.; Yu, Fei

    Economic analysis and modeling are essential for the development of current feedstock and process technology for bio-gasification. The objective of this study was to develop an economic model and apply it to predict the unit cost of syngas production from a micro-scale bio-gasification facility. The economic model was programmed in the C++ programming language and developed using a parametric cost approach, which included processes to calculate the total capital costs and the total operating costs. The model used measured economic data from the bio-gasification facility at Mississippi State University. The modeling results showed that the unit cost of syngas production was $1.217 for a 60 Nm³ h⁻¹ capacity bio-gasifier. The operating cost was the major part of the total production cost. The equipment purchase cost and the labor cost were the largest parts of the total capital cost and the total operating cost, respectively. Sensitivity analysis indicated that labor cost ranked highest, followed by equipment cost, loan life, feedstock cost, interest rate, utility cost, and waste treatment cost. The unit cost of syngas production increased with increases in all parameters, with the exception of loan life. The annual costs of equipment, labor, feedstock, waste treatment, and utilities showed a linear relationship with percent changes, while loan life and annual interest rate showed a non-linear relationship. This study provides useful information for the economic analysis and assessment of syngas production using a modeling approach.
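
    A parametric cost approach of this kind typically annualizes the capital cost with a capital-recovery factor and divides the total annual cost by annual output. A simplified sketch with hypothetical inputs; the actual model's cost categories are more detailed:

    ```python
    def unit_cost(capital_cost, loan_life_yr, interest_rate,
                  annual_operating_cost, annual_output_nm3):
        """Unit production cost: annualized capital charge (via the
        capital-recovery factor) plus operating cost, per unit output."""
        crf = (interest_rate * (1 + interest_rate) ** loan_life_yr
               / ((1 + interest_rate) ** loan_life_yr - 1))
        return (capital_cost * crf + annual_operating_cost) / annual_output_nm3

    # Hypothetical inputs: $120k capital over 10 years at 6% interest,
    # $40k/yr operating cost, 60 Nm3/h for 5,000 operating hours per year.
    cost = unit_cost(120_000, 10, 0.06, 40_000, 60 * 5_000)
    ```

    This structure also explains the sensitivity results: unit cost is linear in each annual cost item but non-linear in loan life and interest rate, which enter through the capital-recovery factor.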

  19. Analysis and assessment of STES technologies

    NASA Astrophysics Data System (ADS)

    Brown, D. R.; Blahnik, D. E.; Huber, H. D.

    1982-12-01

    Technical and economic assessments completed in FY 1982 in support of the Seasonal Thermal Energy Storage (STES) segment of the Underground Energy Storage Program included: (1) a detailed economic investigation of the cost of heat storage in aquifers, (2) documentation for AQUASTOR, a computer model for analyzing aquifer thermal energy storage (ATES) coupled with district heating or cooling, and (3) a technical and economic evaluation of several ice storage concepts. This paper summarizes the research efforts and main results of each of these three activities. In addition, a detailed economic investigation of the cost of chill storage in aquifers is currently in progress. The work parallels that done for ATES heat storage with technical and economic assumptions being varied in a parametric analysis of the cost of ATES delivered chill. The computer model AQUASTOR is the principal analytical tool being employed.

  20. Solid rocket motor cost model

    NASA Technical Reports Server (NTRS)

    Harney, A. G.; Raphael, L.; Warren, S.; Yakura, J. K.

    1972-01-01

    A systematic and standardized procedure is presented for estimating the life cycle costs of solid rocket motor (SRM) booster configurations. The model consists of clearly defined cost categories and appropriate cost equations in which cost is related to program and hardware parameters. Cost estimating relationships are generally based on analogous experience. In this model the experience drawn on is from estimates prepared by the study contractors. Contractors' estimates are derived by means of engineering estimates for some predetermined level of detail of the SRM hardware and program functions of the system life cycle. This method is frequently referred to as bottom-up. A parametric cost analysis is a useful technique when rapid estimates are required. This is particularly true during the planning stages of a system, when hardware designs and program definition are conceptual and constantly changing as the selection process, which includes cost comparisons or trade-offs, is performed. The use of cost estimating relationships also facilitates cost sensitivity studies in which relative and comparable cost comparisons are significant.

  1. Cost component analysis.

    PubMed

    Lörincz, András; Póczos, Barnabás

    2003-06-01

    In optimization, the dimension of the problem may severely, sometimes exponentially, increase optimization time. Parametric function approximators (FAPPs) have been suggested to overcome this problem. Here, a novel FAPP, cost component analysis (CCA), is described. In CCA, the search space is resampled according to the Boltzmann distribution generated by the energy landscape. That is, CCA converts the optimization problem into a density estimation problem. The structure of the induced density is searched by independent component analysis (ICA). The advantage of CCA is that each independent ICA component can be optimized separately. In turn, (i) CCA partitions the original problem into subproblems, (ii) separating (partitioning) the original optimization problem into subproblems may aid interpretation, and, most importantly, (iii) CCA may give rise to large gains in optimization time. Numerical simulations illustrate the working of the algorithm.
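
    The Boltzmann resampling step can be sketched as follows: candidate solutions are reweighted by exp(-E(x)/T) and resampled, so low-energy regions dominate the induced density. This is a minimal sketch of that first step only; the subsequent ICA decomposition and per-component optimization are omitted:

    ```python
    import math
    import random

    def boltzmann_resample(points, energy, temperature=1.0, n=1000, seed=0):
        """Resample candidate solutions with weights exp(-E(x)/T), turning
        the optimization landscape into a density-estimation problem."""
        rng = random.Random(seed)
        weights = [math.exp(-energy(p) / temperature) for p in points]
        return rng.choices(points, weights=weights, k=n)
    ```

    Lowering the temperature concentrates the resampled density ever more tightly around the minima, at the cost of exploring less of the landscape.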

  2. Cost model validation: a technical and cultural approach

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  3. Optimization of space manufacturing systems

    NASA Technical Reports Server (NTRS)

    Akin, D. L.

    1979-01-01

    Four separate analyses are detailed: transportation to low earth orbit, orbit-to-orbit optimization, parametric analysis of SPS logistics based on earth and lunar source locations, and an overall program option optimization implemented with linear programming. It is found that smaller vehicles are favored for earth launch, with the current Space Shuttle being right at optimum payload size. Fully reusable launch vehicles represent a savings of 50% over the Space Shuttle; increased reliability with less maintenance could further double the savings. An optimization of orbit-to-orbit propulsion systems using lunar oxygen for propellants shows that ion propulsion is preferable by a 3:1 cost margin over a mass driver reaction engine at optimum values; however, ion engines cannot yet operate in the lower exhaust velocity range where the optimum lies, and total program costs between the two systems are ambiguous. Heavier payloads favor the use of a MDRE. A parametric model of a space manufacturing facility is proposed, and used to analyze recurring costs, total costs, and net present value discounted cash flows. Parameters studied include productivity, effects of discounting, materials source tradeoffs, economic viability of closed-cycle habitats, and effects of varying degrees of nonterrestrial SPS materials needed from earth. Finally, candidate optimal scenarios are chosen, and implemented in a linear program with external constraints in order to arrive at an optimum blend of SPS production strategies in order to maximize returns.

  4. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models.

    PubMed

    Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik

    2017-12-15

    Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point-estimates and distributions of time-to-event and health economic outcomes. To assess the impact of sample size on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point-estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
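
    A minimal sketch of the recommended bootstrap approach, using invented Weibull time-to-event data in place of real patient records (NumPy/SciPy assumed; the study's actual distributions and data are not reproduced here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical patient-level time-to-event data (months); a stand-in for
# the individual patient data the abstract describes.
times = stats.weibull_min.rvs(1.5, scale=20.0, size=200, random_state=rng)

# Bootstrap approach: refit the Weibull on resampled data so that
# parameter uncertainty propagates into the fitted distribution.
boot_params = []
for _ in range(100):
    resample = rng.choice(times, size=times.size, replace=True)
    c, _, scale = stats.weibull_min.fit(resample, floc=0)
    boot_params.append((c, scale))

# Each replicate implies a different mean time-to-event; the spread of
# these means reflects the parameter (second-order) uncertainty that a
# probabilistic sensitivity analysis should carry forward.
boot_means = np.array([stats.weibull_min.mean(c, scale=s) for c, s in boot_params])
```

    Drawing one `(c, scale)` pair per probabilistic-sensitivity-analysis iteration, rather than reusing the single point estimate, is what carries this uncertainty into the model outcomes.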

  5. A Scenario-Based Parametric Analysis of Stable Marriage Approaches to the Army Officer Assignment Problem

    DTIC Science & Technology

    2017-03-23

    solutions obtained through their proposed method to comparative instances of a generalized assignment problem with either ordinal cost components or... method flag: Designates the method by which the changed/ new assignment problem instance is solved. methodFlag = 0:SMAWarmstart Returns a matching...of randomized perturbations. We examine the contrasts between these methods in the context of assigning Army Officers among a set of identified

  6. Efficient scheme for parametric fitting of data in arbitrary dimensions.

    PubMed

    Pang, Ning-Ning; Tzeng, Wen-Jer; Kao, Hisen-Ching

    2008-07-01

    We propose an efficient scheme for parametric fitting expressed in terms of the Legendre polynomials. For continuous systems, our scheme is exact and the derived explicit expression is very helpful for further analytical studies. For discrete systems, our scheme is almost as accurate as the method of singular value decomposition. Through a few numerical examples, we show that our algorithm costs much less CPU time and memory space than the method of singular value decomposition. Thus, our algorithm is very suitable for a large amount of data fitting. In addition, the proposed scheme can also be used to extract the global structure of fluctuating systems. We then derive the exact relation between the correlation function and the detrended variance function of fluctuating systems in arbitrary dimensions and give a general scaling analysis.
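
    The scheme itself is not reproduced in this abstract, but its building block, a least-squares fit expressed in a Legendre-polynomial basis, can be sketched with NumPy:

```python
import numpy as np
from numpy.polynomial import legendre

# Noisy samples of a smooth trend on [-1, 1]; the underlying curve is
# 0.5 + 2x - 1.5x^2, whose exact Legendre coefficients are (0, 2, -1).
x = np.linspace(-1.0, 1.0, 201)
y = 0.5 + 2.0 * x - 1.5 * x**2 + 0.01 * np.random.default_rng(1).normal(size=x.size)

# Least-squares fit in the Legendre basis; for densely sampled data this
# closely matches an SVD-based solve at a fraction of the cost.
coeffs = legendre.legfit(x, y, deg=4)
y_fit = legendre.legval(x, coeffs)
rms_error = np.sqrt(np.mean((y - y_fit) ** 2))
```

    The recovered low-order coefficients match the known expansion to within the noise level, which is the sense in which a Legendre-basis fit can stand in for the SVD solution.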

  7. Near-term hybrid vehicle program, phase 1. Appendix D: Sensitivity analysis report

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Parametric analyses using a hybrid vehicle synthesis and economics program (HYVELD) are described, investigating the sensitivity of hybrid vehicle cost, fuel usage, utility, and marketability to changes in travel statistics, energy costs, vehicle lifetime and maintenance, owner use patterns, internal combustion engine (ICE) reference vehicle fuel economy, and drive-line component costs and type. The lowest initial cost of the hybrid vehicle would be $1200 to $1500 higher than that of the conventional vehicle. For nominal energy costs ($1.00/gal for gasoline and 4.2 cents/kWh for electricity), the ownership cost of the hybrid vehicle is projected to be 0.5 to 1.0 cents/mi less than the conventional ICE vehicle. To attain this ownership cost differential, the lifetime of the hybrid vehicle must be extended to 12 years and its maintenance cost reduced by 25 percent compared with the conventional vehicle. The ownership cost advantage of the hybrid vehicle increases rapidly as the price of fuel increases from $1 to $2/gal.
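
    The quoted figures can be sanity-checked with a back-of-the-envelope payback calculation. One simplifying assumption is made here that the abstract does not state: the per-mile advantage is treated as exclusive of the purchase premium.

```python
# How many miles does the hybrid's 0.5-1.0 cent/mi ownership-cost
# advantage take to offset its $1200-$1500 higher initial cost?
initial_premium_usd = (1200.0, 1500.0)
advantage_usd_per_mi = (0.005, 0.010)

best_case_miles = initial_premium_usd[0] / advantage_usd_per_mi[1]
worst_case_miles = initial_premium_usd[1] / advantage_usd_per_mi[0]
# 120,000-300,000 miles: consistent with the abstract's point that an
# extended 12-year vehicle lifetime is needed for the differential to pay off.
```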

  8. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    The development of parametric cost estimating methods for advanced space systems in the conceptual design phase is discussed. The process of identifying variables which drive cost and the relationship between weight and cost are discussed. A theoretical model of cost is developed and tested using a historical data base of research and development projects.
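
    A common form for such weight-driven cost estimating relationships is a power law fit in log-log space; the sketch below uses invented data points, not the historical R&D database the abstract mentions:

```python
import numpy as np

# Power-law cost-estimating relationship (CER) cost = a * weight^b, fit by
# ordinary least squares in log-log space. The data are illustrative only.
weights_kg = np.array([50.0, 120.0, 300.0, 800.0, 1500.0])
costs_musd = np.array([12.0, 22.0, 41.0, 80.0, 120.0])

b, log_a = np.polyfit(np.log(weights_kg), np.log(costs_musd), 1)
a = float(np.exp(log_a))

def estimate_cost(weight_kg: float) -> float:
    """Predicted development cost ($M) for a given dry mass, per the fitted CER."""
    return a * weight_kg ** b
```

    An exponent b below 1 reflects the economy-of-scale behavior typical of weight-based CERs: doubling the mass less than doubles the estimated cost.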

  9. Cost Modeling for low-cost planetary missions

    NASA Technical Reports Server (NTRS)

    Kwan, Eric; Habib-Agahi, Hamid; Rosenberg, Leigh

    2005-01-01

    This presentation will provide an overview of the JPL parametric cost models used to estimate flight science spacecraft and instruments. This material will emphasize the cost model approaches to estimate low-cost flight hardware, sensors, and instrumentation, and to perform cost-risk assessments. This presentation will also discuss JPL approaches to perform cost modeling and the methodologies and analyses used to capture low-cost vs. key cost drivers.

  10. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    NASA Astrophysics Data System (ADS)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can be similarly handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and the Conditional Value-at-Risk (CVaR).
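
    The evaluation step underlying such an approach, estimating expected cost and CVaR from Monte Carlo cost samples, can be sketched as follows (the lognormal cost distribution is a stand-in, not the paper's market model):

```python
import numpy as np

# Expected cost and CVaR (expected shortfall) of an execution strategy,
# estimated from simulated cost samples.
rng = np.random.default_rng(42)
costs = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)

alpha = 0.95
var_alpha = float(np.quantile(costs, alpha))          # Value-at-Risk at 95%
cvar_alpha = float(costs[costs >= var_alpha].mean())  # mean cost of the worst 5%
expected_cost = float(costs.mean())
```

    In the parametric approach, an outer optimizer would adjust the strategy coefficients and re-run this Monte Carlo evaluation, trading off `expected_cost` against `cvar_alpha`.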

  11. A convolution model for computing the far-field directivity of a parametric loudspeaker array.

    PubMed

    Shi, Chuang; Kajikawa, Yoshinobu

    2015-02-01

    This paper describes a method to compute the far-field directivity of a parametric loudspeaker array (PLA), whereby a steerable parametric loudspeaker can be implemented when phased array techniques are applied. The convolution of the product directivity and Westervelt's directivity is suggested, substituting for the past practice of using the product directivity only. Computed directivity of a PLA using the proposed convolution model achieves significant improvement in agreement with measured directivity at a negligible computational cost.
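
    The convolution model can be sketched on a discrete angular grid: the product of the two primary-beam array patterns is convolved with a Westervelt-type far-field directivity. The array geometry, primary frequencies, and absorption coefficient below are illustrative assumptions, not values from the paper:

```python
import numpy as np

theta = np.linspace(-np.pi / 2, np.pi / 2, 721)  # radians
dtheta = theta[1] - theta[0]

def array_factor(theta, n=8, d=0.01, k=2 * np.pi * 40e3 / 343):
    """|sin(n*psi/2) / (n*sin(psi/2))| pattern of a uniform line array."""
    psi = k * d * np.sin(theta)
    with np.errstate(divide="ignore", invalid="ignore"):
        af = np.abs(np.sin(n * psi / 2) / (n * np.sin(psi / 2)))
    return np.where(np.isclose(np.sin(psi / 2), 0.0), 1.0, af)

# Product directivity of two primary beams (40 kHz and 41 kHz assumed)
product_dir = array_factor(theta) * array_factor(theta, k=2 * np.pi * 41e3 / 343)

# Westervelt-type directivity 1/sqrt(1 + (k/alpha)^2 tan^4(theta/2)) at the
# 1 kHz difference frequency, with an assumed total absorption coefficient.
k_diff = 2 * np.pi * 1e3 / 343
alpha = 1.0
westervelt = 1.0 / np.sqrt(1.0 + (k_diff / alpha) ** 2 * np.tan(theta / 2) ** 4)

# Convolution model: combined directivity, normalized to the peak response
combined = np.convolve(product_dir, westervelt, mode="same") * dtheta
combined /= combined.max()
```

    Because the convolution is a single O(N log N or N^2) array operation per angle grid, the model's extra cost over the plain product directivity is negligible, as the abstract notes.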

  12. Acceleration of the direct reconstruction of linear parametric images using nested algorithms.

    PubMed

    Wang, Guobao; Qi, Jinyi

    2010-03-07

    Parametric imaging using dynamic positron emission tomography (PET) provides important information for biological research and clinical diagnosis. Indirect and direct methods have been developed for reconstructing linear parametric images from dynamic PET data. Indirect methods are relatively simple and easy to implement because the image reconstruction and kinetic modeling are performed in two separate steps. Direct methods estimate parametric images directly from raw PET data and are statistically more efficient. However, the convergence rate of direct algorithms can be slow due to the coupling between the reconstruction and kinetic modeling. Here we present two fast gradient-type algorithms for direct reconstruction of linear parametric images. The new algorithms decouple the reconstruction and linear parametric modeling at each iteration by employing the principle of optimization transfer. Convergence speed is accelerated by running more sub-iterations of linear parametric estimation because the computation cost of the linear parametric modeling is much less than that of the image reconstruction. Computer simulation studies demonstrated that the new algorithms converge much faster than the traditional expectation maximization (EM) and the preconditioned conjugate gradient algorithms for dynamic PET.

  13. Research on AutoCAD secondary development and function expansion based on VBA technology

    NASA Astrophysics Data System (ADS)

    Zhang, Runmei; Gu, Yehuan

    2017-06-01

    AutoCAD is the most widely used tool among comparable design drawing products. Producing different types of design drawings for the same product involves a large amount of repetitive, monotonous work, and drawing these graphics manually in AutoCAD suffers from low efficiency, a high error rate, and high labor cost. To address these problems, a parametric drawing system for hot-rolled I-beam (steel beam) cross-sections was designed using the VBA secondary development tool together with the Access database software for large-capacity data storage, and this paper analyzes the resulting functional extension of plane drawing and parametric drawing design. Through this secondary development of AutoCAD functions, the system's drawing work is simplified and work efficiency is greatly improved. Introducing parametric design into an AutoCAD drawing system in this way can promote mass production of standard hot-rolled I-beam products and economic growth in related industries.

  14. Reducing numerical costs for core wide nuclear reactor CFD simulations by the Coarse-Grid-CFD

    NASA Astrophysics Data System (ADS)

    Viellieber, Mathias; Class, Andreas G.

    2013-11-01

    Traditionally, complete nuclear reactor core simulations are performed with subchannel analysis codes that rely on experimental and empirical input. The Coarse-Grid-CFD (CGCFD) intends to replace the experimental or empirical input with CFD data. The reactor core consists of repetitive flow patterns, allowing the general approach of creating a parametrized model for one segment and composing many of those to obtain the entire reactor simulation. The method is based on a detailed and well-resolved CFD simulation of one representative segment. From this simulation we extract so-called parametrized volumetric forces which close an otherwise strongly under-resolved, coarsely meshed model of a complete reactor setup. While the formulation so far accounts for forces created internally in the fluid, others, e.g. obstruction and flow deviation through spacers and wire wraps, still need to be accounted for if the geometric details are not represented in the coarse mesh. These are modelled with an Anisotropic Porosity Formulation (APF). This work focuses on the application of the CGCFD to a complete reactor core setup and on accomplishing the parametrization of the volumetric forces.

  15. Feasibility study of modern airships, phase 1. Volume 3: Historical overview (task 1)

    NASA Technical Reports Server (NTRS)

    Faurote, G. L.

    1975-01-01

    The history of lighter-than-air vehicles is reviewed in terms of providing a background for the mission analysis and parametric analysis tasks. Data from past airships and airship operations are presented in the following areas: (1) parameterization of design characteristics; (2) markets, missions, costs, and operating procedures, (3) indices of efficiency for comparison; (4) identification of critical design and operational characteristics; and (5) definition of the 1930 state-of-the-art and the 1974 state-of-the-art from a technical and economic standpoint.

  16. New opportunities for future small civil turbine engines: Overviewing the GATE studies

    NASA Technical Reports Server (NTRS)

    Strack, W. C.

    1979-01-01

    An overview of four independent studies forecasts the potential impact of advanced technology turbine engines in the post-1988 market and identifies important aircraft and missions, desirable engine sizes, engine performance, and cost goals. Parametric evaluations of various engine cycles, configurations, design features, and advanced technology elements defined baseline conceptual engines for each of the important missions identified by the market analysis. Both fixed-wing and helicopter aircraft, and turboshaft, turboprop, and turbofan engines were considered. Sizable performance gains (e.g., a 20% SFC decrease) and engine cost reductions of sufficient magnitude to challenge the reciprocating engine in the 300-500 SHP class are predicted.

  17. Parametric CERs (Cost Estimate Relationships) for Replenishment Repair Parts (Selected U.S. Army Helicopters and Combat Vehicles)

    DTIC Science & Technology

    1989-07-31

    Information System (OSMIS). The long-range objective is to develop methods to determine total operating and support (O&S) costs within life-cycle cost...objective was to assess the feasibility of developing cost estimating relationships (CERs) based on data from the Army Operating and Support Management

  18. Parametric Cost Modeling of Space Missions Using the Develop New Projects (DMP) Implementation Process

    NASA Technical Reports Server (NTRS)

    Rosenberg, Leigh; Hihn, Jairus; Roust, Kevin; Warfield, Keith

    2000-01-01

    This paper presents an overview of a parametric cost model that has been built at JPL to estimate costs of future, deep space, robotic science missions. Due to the recent dramatic changes in JPL business practices brought about by an internal reengineering effort known as develop new products (DNP), high-level historic cost data is no longer considered analogous to future missions. Therefore, the historic data is of little value in forecasting costs for projects developed using the DNP process. This has led to the development of an approach for obtaining expert opinion and also for combining actual data with expert opinion to provide a cost database for future missions. In addition, the DNP cost model uses objective cost drivers to the maximum extent possible, which reduces the likelihood of model input error. Version 2 is now under development which expands the model capabilities, links it more tightly with key design technical parameters, and is grounded in more rigorous statistical techniques. The challenges faced in building this model will be discussed, as well as its background, development approach, status, validation, and future plans.

  19. Turboprop cargo aircraft systems study

    NASA Technical Reports Server (NTRS)

    Muehlbauer, J. C.; Hewell, J. G., Jr.; Lindenbaum, S. P.; Randall, C. C.; Searle, N.; Stone, R. G., Jr.

    1981-01-01

    The effects of using advanced turboprop propulsion systems to reduce the fuel consumption and direct operating costs of cargo aircraft were studied, and the impact of these systems on aircraft noise and noise prints around a terminal area was determined. Parametric variations of aircraft and propeller characteristics were investigated to determine their effects on noiseprint areas, fuel consumption, and direct operating costs. From these results, three aircraft designs were selected and subjected to design refinements and sensitivity analyses. Three competitive turbofan aircraft were also defined from parametric studies to provide a basis for comparing the two types of propulsion.

  20. An economic systems analysis of land mobile radio telephone services

    NASA Technical Reports Server (NTRS)

    Leroy, B. E.; Stevenson, S. M.

    1980-01-01

    This paper deals with the economic interaction of the terrestrial and satellite land-mobile radio service systems. The cellular, trunked and satellite land-mobile systems are described. Parametric equations are formulated to allow examination of necessary user thresholds and growth rates as functions of system costs. Conversely, first order allowable systems costs are found as a function of user thresholds and growth rates. Transitions between satellite and terrestrial service systems are examined. User growth rate density (user/year/km squared) is shown to be a key parameter in the analysis of systems compatibility. The concept of system design matching the price demand curves is introduced and examples are given. The role of satellite systems is critically examined and the economic conditions necessary for the introduction of satellite service are identified.
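
    The flavor of such first-order parametric cost equations can be illustrated with a breakeven between a high-fixed-cost satellite system and a lower-fixed-cost terrestrial system; all cost figures below are invented placeholders, not values from the study:

```python
# First-order breakeven between two service systems: amortized fixed cost
# plus a per-user cost, equated to find the user threshold.
def annual_cost_per_user(fixed_cost: float, var_cost: float, users: float) -> float:
    """Average annual cost per user: amortized fixed cost plus per-user cost."""
    return fixed_cost / users + var_cost

sat_fixed, sat_var = 500e6, 50.0   # satellite: large fixed, small per-user cost
ter_fixed, ter_var = 50e6, 200.0   # terrestrial: smaller fixed, larger per-user cost

# Costs are equal when sat_fixed/u + sat_var == ter_fixed/u + ter_var:
breakeven_users = (sat_fixed - ter_fixed) / (ter_var - sat_var)
```

    Above the breakeven user count the satellite system is cheaper per user, which is why user growth rate density acts as the key parameter in the compatibility analysis.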

  1. Study of solid rocket motor for space shuttle booster. Volume 4: Cost

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The cost data for solid propellant rocket engines for use with the space shuttle are presented. The data are based on the selected 156 inch parallel and series burn configurations. Summary cost data are provided for the production of the 120 inch and 260 inch configurations. Graphs depicting parametric cost estimating relationships are included.

  2. Prepositioning emergency supplies under uncertainty: a parametric optimization method

    NASA Astrophysics Data System (ADS)

    Bai, Xuejie; Gao, Jinwu; Liu, Yankui

    2018-07-01

    Prepositioning of emergency supplies is an effective method for increasing preparedness for disasters and has received much attention in recent years. In this article, the prepositioning problem is studied by a robust parametric optimization method. The transportation cost, supply, demand and capacity are unknown prior to the extraordinary event, which are represented as fuzzy parameters with variable possibility distributions. The variable possibility distributions are obtained through the credibility critical value reduction method for type-2 fuzzy variables. The prepositioning problem is formulated as a fuzzy value-at-risk model to achieve a minimum total cost incurred in the whole process. The key difficulty in solving the proposed optimization model is to evaluate the quantile of the fuzzy function in the objective and the credibility in the constraints. The objective function and constraints can be turned into their equivalent parametric forms through chance constrained programming under the different confidence levels. Taking advantage of the structural characteristics of the equivalent optimization model, a parameter-based domain decomposition method is developed to divide the original optimization problem into six mixed-integer parametric submodels, which can be solved by standard optimization solvers. Finally, to explore the viability of the developed model and the solution approach, some computational experiments are performed on realistic scale case problems. The computational results reported in the numerical example show the credibility and superiority of the proposed parametric optimization method.

  3. Tri-Center Analysis: Determining Measures of Trichotomous Central Tendency for the Parametric Analysis of Tri-Squared Test Results

    ERIC Educational Resources Information Center

    Osler, James Edward

    2014-01-01

    This monograph provides an epistemological rationale for the design of a novel post hoc statistical measure called "Tri-Center Analysis". This new statistic is designed to analyze the post hoc outcomes of the Tri-Squared Test. In Tri-Center Analysis, trichotomous parametric inferential statistical measures are calculated from…

  4. Numerical model of solar dynamic radiator for parametric analysis

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer L.

    1989-01-01

    Growth power requirements for Space Station Freedom will be met through addition of 25 kW solar dynamic (SD) power modules. The SD module rejects waste heat from the power conversion cycle to space through a pumped-loop, multi-panel, deployable radiator. The baseline radiator configuration was defined during the Space Station conceptual design phase and is a function of the state point and heat rejection requirements of the power conversion unit. Requirements determined by the overall station design such as mass, system redundancy, micrometeoroid and space debris impact survivability, launch packaging, costs, and thermal and structural interaction with other station components have also been design drivers for the radiator configuration. Extensive thermal and power cycle modeling capabilities have been developed which are powerful tools in Station design and analysis, but which prove cumbersome and costly for simple component preliminary design studies. In order to aid in refining the SD radiator to the mature design stage, a simple and flexible numerical model was developed. The model simulates heat transfer and fluid flow performance of the radiator and calculates area, mass, and impact survivability for many combinations of flow tube and panel configurations, fluid and material properties, and environmental and cycle variations. A brief description and discussion of the numerical model, its capabilities and limitations, and results of the parametric studies performed are presented.

  5. Preliminary Multivariable Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multi-variable space telescope cost model. The validity of the previously published models is tested, cost estimating relationships which are and are not significant cost drivers are identified, and interrelationships between variables are explored.

  6. The cost of colorectal cancer according to the TNM stage.

    PubMed

    Mar, Javier; Errasti, Jose; Soto-Gordoa, Myriam; Mar-Barrutia, Gilen; Martinez-Llorente, José Miguel; Domínguez, Severina; García-Albás, Juan José; Arrospide, Arantzazu

    2017-02-01

    The aim of this study was to measure the cost of treatment of colorectal cancer in the Basque public health system according to the clinical stage. We retrospectively collected demographic data, clinical data and resource use for a sample of 529 patients. For stages I to III the initial and follow-up costs were measured. The calculation of cost for stage IV combined generalized linear models, relating cost to the duration of follow-up, with parametric survival analysis. Unit costs were obtained from the analytical accounting system of the Basque Health Service. The sample included 110 patients with stage I, 171 with stage II, 158 with stage III and 90 with stage IV colorectal cancer. The initial total cost per patient was 8,644 € for stage I, 12,675 € for stage II and 13,034 € for stage III. The main component was hospitalization cost. Calculated by extrapolation, mean survival for stage IV was 1.27 years. Its average annual cost was 22,403 €, and 24,509 € to death. The total annual cost for colorectal cancer, extrapolated to the whole Spanish health system, was 623.9 million €. The economic burden of colorectal cancer is important and should be taken into account in decision-making. The combination of generalized linear models and survival analysis allows estimation of the cost of the metastatic stage. Copyright © 2017 AEC. Published by Elsevier España, S.L.U. All rights reserved.
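
    The study's coupling of a survival model with per-period costs can be caricatured with a much simpler parametric sketch: an exponential survival fit under right-censoring and a constant annual cost. Both the simulated data and the constant-cost assumption are ours; the study's GLMs are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)
follow_up_years = rng.exponential(scale=1.3, size=90)  # time at risk per patient
death_observed = rng.random(90) < 0.8                  # True if death, False if censored

# Exponential MLE under right-censoring: rate = events / total time at risk
rate = death_observed.sum() / follow_up_years.sum()
mean_survival_years = 1.0 / rate

annual_cost_eur = 22_403  # stage IV annual cost reported in the abstract
expected_cost_to_death = annual_cost_eur * mean_survival_years
```

    The point of the parametric fit is the extrapolation: mean survival, and hence lifetime cost, can be estimated even though many patients are still alive at the end of follow-up.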

  7. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  8. A strategy for improved computational efficiency of the method of anchored distributions

    NASA Astrophysics Data System (ADS)

    Over, Matthew William; Yang, Yarong; Chen, Xingyuan; Rubin, Yoram

    2013-06-01

    This paper proposes a strategy for improving the computational efficiency of model inversion using the method of anchored distributions (MAD) by "bundling" similar model parametrizations in the likelihood function. Inferring the likelihood function typically requires a large number of forward model (FM) simulations for each possible model parametrization; as a result, the process is quite expensive. To ease this prohibitive cost, we present an approximation for the likelihood function called bundling that relaxes the requirement for high quantities of FM simulations. This approximation redefines the conditional statement of the likelihood function as the probability that a set of similar model parametrizations (a "bundle") replicates field measurements, which we show is neither a model reduction nor a sampling approach to improving the computational efficiency of model inversion. To evaluate the effectiveness of these modifications, we compare the quality of predictions and computational cost of bundling relative to a baseline MAD inversion of 3-D flow and transport model parameters. Additionally, to aid understanding of the implementation, we provide a tutorial for bundling in the form of a sample data set and script for the R statistical computing language. For our synthetic experiment, bundling achieved a 35% reduction in overall computational cost and had a limited negative impact on predicted probability distributions of the model parameters. Strategies for minimizing error in the bundling approximation, for enforcing similarity among the sets of model parametrizations, and for identifying convergence of the likelihood function are also presented.

  9. SHIPS: Spectral Hierarchical Clustering for the Inference of Population Structure in Genetic Studies

    PubMed Central

    Bouaziz, Matthieu; Paccard, Caroline; Guedj, Mickael; Ambroise, Christophe

    2012-01-01

    Inferring the structure of populations has many applications for genetic research. In addition to providing information for evolutionary studies, it can be used to account for the bias induced by population stratification in association studies. To this end, many algorithms have been proposed to cluster individuals into genetically homogeneous sub-populations. The parametric algorithms, such as Structure, are very popular but their underlying complexity and their high computational cost led to the development of faster parametric alternatives such as Admixture. Alternatives to these methods are the non-parametric approaches. Among this category, AWclust has proven efficient but fails to properly identify population structure for complex datasets. We present in this article a new clustering algorithm called Spectral Hierarchical clustering for the Inference of Population Structure (SHIPS), based on a divisive hierarchical clustering strategy, allowing a progressive investigation of population structure. This method takes genetic data as input to cluster individuals into homogeneous sub-populations and with the use of the gap statistic estimates the optimal number of such sub-populations. SHIPS was applied to a set of simulated discrete and admixed datasets and to real SNP datasets, that are data from the HapMap and Pan-Asian SNP consortium. The programs Structure, Admixture, AWclust and PCAclust were also investigated in a comparison study. SHIPS and the parametric approach Structure were the most accurate when applied to simulated datasets both in terms of individual assignments and estimation of the correct number of clusters. The analysis of the results on the real datasets highlighted that the clusterings of SHIPS were the more consistent with the population labels or those produced by the Admixture program. 
The performance of SHIPS when applied to SNP data, along with its relatively low computational cost and ease of use, makes this method a promising solution for inferring fine-scale genetic patterns. PMID:23077494

  10. Environmental Cost Analysis System (ECAS) Status and Compliance Requirements for EM Consolidated Business Center Contracts - 13204

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanford, P.C.; Moe, M.A.; Hombach, W.G.

    2013-07-01

    The Department of Energy (DOE) Office of Environmental Management (EM) has developed a web-accessible database to collect actual cost data from completed EM projects to support cost estimating and analysis. This Environmental Cost Analysis System (ECAS) database was initially deployed in early 2009 containing the cost and parametric data from 77 decommissioning, restoration, and waste management projects completed under the Rocky Flats Closure Project. In subsequent years we have added many more projects to ECAS and now have a total of 280 projects from 8 major DOE sites. This data is now accessible to DOE users through a web-based reporting tool that allows users to tailor report outputs to meet their specific needs. We are using it as a principal resource supporting the EM Consolidated Business Center (EMCBC) and the EM Applied Cost Engineering (ACE) team cost estimating and analysis efforts across the country. The database has received Government Accountability Office review as supporting its recommended improvements in DOE's cost estimating process, as well as review from the DOE Office of Acquisition and Project Management (APM). Moving forward, the EMCBC has developed a Special Contract Requirement clause or 'H-Clause' to be included in all current and future EMCBC procurements identifying the process that contractors will follow to provide DOE their historical project data in a format compatible with ECAS. Changes to DOE O 413.3B implementation are also in progress to capture historical costs as part of the Critical Decision project closeout process. (authors)

  11. Estimating technical efficiency in the hospital sector with panel data: a comparison of parametric and non-parametric techniques.

    PubMed

    Siciliani, Luigi

    2006-01-01

    Policy makers are increasingly interested in developing performance indicators that measure hospital efficiency. These indicators may give the purchasers of health services an additional regulatory tool to contain health expenditure. Using panel data, this study compares different parametric (econometric) and non-parametric (linear programming) techniques for the measurement of a hospital's technical efficiency. This comparison was made using a sample of 17 Italian hospitals in the years 1996-9. Highest correlations are found in the efficiency scores between the non-parametric data envelopment analysis under the constant returns to scale assumption (DEA-CRS) and several parametric models. Correlation reduces markedly when using more flexible non-parametric specifications such as data envelopment analysis under the variable returns to scale assumption (DEA-VRS) and the free disposal hull (FDH) model. Correlation also generally reduces when moving from one output to two-output specifications. This analysis suggests that there is scope for developing performance indicators at hospital level using panel data, but it is important that extensive sensitivity analysis is carried out if purchasers wish to make use of these indicators in practice.
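
    The non-parametric side of such comparisons, DEA under constant returns to scale, reduces to one small linear program per hospital. A sketch of the input-oriented CCR envelopment model with invented data (the study's hospital data are not public here):

```python
import numpy as np
from scipy.optimize import linprog

# Invented hospital data: m x n inputs and s x n outputs for n = 4 units.
X = np.array([[5.0, 8.0, 6.0, 10.0]])     # inputs (e.g. staffed beds)
Y = np.array([[10.0, 12.0, 11.0, 13.0]])  # outputs (e.g. discharges)

def dea_crs_efficiency(j0: int) -> float:
    """Technical efficiency score of unit j0 (1.0 = on the frontier)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # variables [theta, lambda_1..n]; minimize theta
    A_in = np.hstack([-X[:, [j0]], X])          # sum_j lam_j x_j <= theta * x_j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # sum_j lam_j y_j >= y_j0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(0, None)] * (n + 1))
    return float(res.x[0])

scores = [dea_crs_efficiency(j) for j in range(X.shape[1])]
```

    With a single input and output, each score is simply that unit's output/input ratio divided by the best ratio in the sample; the LP formulation is what generalizes this to multiple inputs and outputs.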

  12. Research on the Applicable Method of Valuation of Pure Electric Used Vehicles

    NASA Astrophysics Data System (ADS)

    Cai, Yun; Tan, Zhengping; Wang, Yidong; Mao, Pan

    2018-03-01

    With the rapid growth in ownership of pure electric vehicles, research on the valuation of used electric vehicles has become key to the development of the pure electric used vehicle market. The paper analyses the application of three valuation methods (the current market price method, the capitalized earnings method and the replacement cost method) to pure electric used vehicles and concludes that the replacement cost method is the most suitable. The article also explores parametric corrections, aimed at the characteristics of pure electric vehicles and the constituent factors of replacement cost. Through analysis of the applicable parameters for physical devaluation, functional devaluation and economic devaluation, the revised replacement cost method can be used to value privately owned pure electric used vehicles.

  13. Applying Statistical Models and Parametric Distance Measures for Music Similarity Search

    NASA Astrophysics Data System (ADS)

    Lukashevich, Hanna; Dittmar, Christian; Bastuck, Christoph

    Automatic derivation of similarity relations between music pieces is a core topic of music information retrieval research. Because of the nearly unrestricted amount of musical data, real-world similarity search algorithms have to be highly efficient and scalable. One possible solution is to represent each music excerpt with a statistical model (e.g., a Gaussian mixture model) and thus reduce the computational cost by applying parametric distance measures between the models. In this paper we discuss combinations of different parametric modelling techniques and distance measures and weigh the benefits of each against the others.
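
    The building block of such parametric distances is the closed-form Kullback-Leibler divergence between two Gaussians; for full mixtures no closed form exists, so pairwise-Gaussian approximations are common. A sketch with invented model parameters:

```python
# Closed-form KL divergence between two multivariate Gaussians, symmetrised
# so it can serve as a distance between two fitted models.
import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    """KL( N(mu0, S0) || N(mu1, S1) ) in nats."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def symmetric_kl(mu0, S0, mu1, S1):
    # KL is asymmetric; summing both directions gives a usable distance.
    return kl_gauss(mu0, S0, mu1, S1) + kl_gauss(mu1, S1, mu0, S0)

# Two hypothetical single-Gaussian "song models" over 2-D features.
mu_a, S_a = np.zeros(2), np.eye(2)
mu_b, S_b = np.array([1.0, 0.0]), np.eye(2)
print(symmetric_kl(mu_a, S_a, mu_a, S_a))  # 0.0 for identical models
print(symmetric_kl(mu_a, S_a, mu_b, S_b))  # 1.0 for this pair
```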

  14. Parametric Study of a YAV-8B Harrier in Ground Effect Using Time-Dependent Navier-Stokes Computations

    NASA Technical Reports Server (NTRS)

    Pandya, Shishir; Chaderjian, Neal; Ahmad, Jasim; Kwak, Dochan (Technical Monitor)

    2001-01-01

    Flow simulations using the time-dependent Navier-Stokes equations remain a challenge for several reasons. Principal among them are the difficulty of accurately modeling complex flows and the time needed to perform the computations. A parametric study of such complex problems is not considered practical because of the large cost of computing many time-dependent solutions. The computation time for each solution must therefore be reduced to make a parametric study possible. With a successful reduction of computation time, the issues of accuracy and of the appropriateness of turbulence models become more tractable.

  15. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting the cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation that bridges the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be reassessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
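
    The AHP step mentioned above reduces pairwise comparisons of criteria to a priority vector, conventionally the principal eigenvector of the comparison matrix. A sketch with an invented 3x3 judgement matrix:

```python
# Analytic hierarchy process (AHP): derive priority weights from a pairwise
# comparison matrix. The matrix entries here are illustrative only.
import numpy as np

# a_ij = how strongly criterion i is preferred over criterion j (1-9 scale);
# the matrix is reciprocal: a_ji = 1 / a_ij.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                      # priority weights, summing to 1

# Consistency index: lambda_max is close to n when judgements are consistent.
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
print(np.round(w, 3), round(ci, 3))
```

    In practice the consistency index is compared against a random index to decide whether the pairwise judgements need revisiting.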

  16. Incremental harmonic balance method for predicting amplitudes of a multi-d.o.f. non-linear wheel shimmy system with combined Coulomb and quadratic damping

    NASA Astrophysics Data System (ADS)

    Zhou, J. X.; Zhang, L.

    2005-01-01

    Incremental harmonic balance (IHB) formulations are derived for general multiple-degree-of-freedom (d.o.f.) non-linear autonomous systems. These formulations are developed for a four-d.o.f. aircraft wheel shimmy system with combined Coulomb and velocity-squared damping. A multi-harmonic analysis is performed and amplitudes of limit cycles are predicted. Within a large range of parametric variations with respect to aircraft taxi velocity, the IHB method can, at much lower computational cost, give results of high accuracy compared with numerical results from a parametric continuation method. In particular, the IHB method avoids the stiff problems that arise in numerical treatment of the aircraft wheel shimmy system equations. The development is applicable to other vibration control systems that include commonly used dry friction devices or velocity-squared hydraulic dampers.

  17. Examining the Feasibility and Utility of Estimating Partial Expected Value of Perfect Information (via a Nonparametric Approach) as Part of the Reimbursement Decision-Making Process in Ireland: Application to Drugs for Cancer.

    PubMed

    McCullagh, Laura; Schmitz, Susanne; Barry, Michael; Walsh, Cathal

    2017-11-01

    In Ireland, all new drugs for which reimbursement by the healthcare payer is sought undergo a health technology assessment by the National Centre for Pharmacoeconomics. The National Centre for Pharmacoeconomics estimates expected value of perfect information but not partial expected value of perfect information (owing to the computational expense associated with typical methodologies). The objective of this study was to examine the feasibility and utility of estimating partial expected value of perfect information via a computationally efficient, non-parametric regression approach. This was a retrospective analysis of evaluations of drugs for cancer that had been submitted to the National Centre for Pharmacoeconomics (January 2010 to December 2014 inclusive). Drugs were excluded if cost effective at the submitted price. Drugs were also excluded if concerns existed regarding the validity of the applicants' submissions or if cost-effectiveness model functionality did not allow the required modifications to be made. For each included drug (n = 14), value of information was estimated at the final reimbursement price, at a threshold equivalent to the incremental cost-effectiveness ratio at that price. The expected value of perfect information was estimated from probabilistic analysis. Partial expected value of perfect information was estimated via a non-parametric approach. Input parameters with a population value of at least €1 million were identified as potential targets for research. All partial estimates were determined within minutes. Thirty parameters (across nine models) each had a value of at least €1 million. These were categorised. Collectively, survival analysis parameters were valued at €19.32 million, health state utility parameters at €15.81 million and parameters associated with the cost of treating adverse effects at €6.64 million. Those associated with drug acquisition costs and with the cost of care were valued at €6.51 million and €5.71 million, respectively.
This research demonstrates that the estimation of partial expected value of perfect information via this computationally inexpensive approach could be considered feasible as part of the health technology assessment process for reimbursement purposes within the Irish healthcare system. It might be a useful tool in prioritising future research to decrease decision uncertainty.
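
    The regression idea behind partial EVPI (EVPPI) can be sketched in a few lines: regress the incremental net benefit from probabilistic-analysis samples on the parameter of interest, then compare expected benefit with and without perfect information on that parameter. A polynomial fit stands in here for the non-parametric smoother, and all numbers are illustrative.

```python
# EVPPI by regression on probabilistic sensitivity analysis (PSA) samples.
# Synthetic example: one parameter of interest plus residual uncertainty.
import numpy as np

rng = np.random.default_rng(1)
n = 20000
theta = rng.normal(0.0, 1.0, n)    # parameter of interest
noise = rng.normal(0.0, 2.0, n)    # all other PSA uncertainty
inb = 0.5 + theta + noise          # incremental net benefit vs comparator

# Without further information: pick the option that is better on average.
evpi_baseline = max(0.0, inb.mean())

# With perfect information on theta: smooth INB as a function of theta,
# then take the expected best decision conditional on theta.
fitted = np.polyval(np.polyfit(theta, inb, 3), theta)
evppi = np.mean(np.maximum(fitted, 0.0)) - evpi_baseline
print(round(evppi, 3))  # value (in net-benefit units) of learning theta exactly
```

    Because each EVPPI estimate needs only one regression over existing PSA output, the whole calculation runs in seconds, which is the feasibility point the abstract makes.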

  18. Cost-effectiveness Analysis in R Using a Multi-state Modeling Survival Analysis Framework: A Tutorial.

    PubMed

    Williams, Claire; Lewsey, James D; Briggs, Andrew H; Mackay, Daniel F

    2017-05-01

    This tutorial provides a step-by-step guide to performing cost-effectiveness analysis using a multi-state modeling approach. Alongside the tutorial, we provide easy-to-use functions in the statistics package R. We argue that this multi-state modeling approach using a package such as R has advantages over approaches where models are built in a spreadsheet package. In particular, using a syntax-based approach means there is a written record of what was done and the calculations are transparent. Reproducing the analysis is straightforward, as the syntax just needs to be run again. The approach can be thought of as an alternative way to build a Markov decision-analytic model, which also has the option to use a state-arrival extended approach. In the state-arrival extended multi-state model, a covariate that represents patients' history is included, allowing the Markov property to be tested. We illustrate the building of multi-state survival models, making predictions from the models, and assessing fits. We then proceed to perform a cost-effectiveness analysis, including deterministic and probabilistic sensitivity analyses. Finally, we show how to create two common methods of visualizing the results, namely cost-effectiveness planes and cost-effectiveness acceptability curves. The analysis is implemented entirely within R. It is based on adaptations of functions in the existing R package mstate to accommodate parametric multi-state modeling that facilitates extrapolation of survival curves.
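
    The tutorial works in R with the mstate package, but the core of the Markov decision-analytic model it generalizes is small enough to sketch here in Python. States, transition probabilities, costs and utilities below are invented for illustration.

```python
# Minimal Markov cohort model: three states, annual cycles, discounting.
import numpy as np

# States: well, sick, dead; annual transition matrix for one strategy.
P = np.array([[0.85, 0.10, 0.05],
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])
state_cost = np.array([500.0, 4000.0, 0.0])   # annual cost per state
state_qaly = np.array([0.95, 0.60, 0.0])      # annual utility per state
discount = 0.035

cohort = np.array([1.0, 0.0, 0.0])            # everyone starts well
total_cost = total_qaly = 0.0
for year in range(40):
    df = 1.0 / (1.0 + discount) ** year
    total_cost += df * cohort @ state_cost
    total_qaly += df * cohort @ state_qaly
    cohort = cohort @ P                        # advance the cohort one cycle

print(round(total_cost, 0), round(total_qaly, 2))
```

    Running the same loop for a comparator strategy and differencing costs and QALYs yields the incremental quantities used in the tutorial's cost-effectiveness analysis.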

  19. Development of a solar-powered residential air conditioner: Screening analysis

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Screening analyses aimed at defining an optimum configuration of a Rankine-cycle solar-powered air conditioner for residential application were conducted. Initial studies revealed that system performance and cost were extremely sensitive to condensing temperature and to the type of condenser used in the system. Consequently, the screening analyses were concerned with the generation of parametric design data for different condenser approaches, i.e., (1) an ambient air condenser, (2) a humidified ambient air condenser, (3) an evaporative condenser, and (4) a water condenser (with a cooling tower). All systems feature a high performance turbocompressor and a single refrigerant (R-11) for the power and refrigeration loops. Data were obtained by computerized methods developed to permit system characterization over a broad range of operating and design conditions. The criteria used for comparison of the candidate system approaches were (1) overall system COP (refrigeration effect/solar heat input), (2) auxiliary electric power for fans and pumps, and (3) system installed cost or cost to the user.

  20. Current status of nuclear cardiology: a limited review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Botvinick, E.H.; Dae, M.; Hattner, R.S.

    1985-11-01

    To summarize the current status of nuclear cardiology, the authors focus on areas that emphasize the specific advantages of nuclear cardiology methods: (a) their benign, noninvasive nature; (b) their pathophysiologic nature; and (c) the ease of their computer manipulation and analysis, permitting quantitative evaluation. The areas covered include: (a) blood pool scintigraphy and parametric imaging, (b) pharmacologic intervention for the diagnosis of ischemic heart disease, (c) scintigraphic studies for the diagnosis and prognosis of coronary artery disease, and (d) considerations of cost effectiveness.

  1. Analysis of the Effect of Historical Cultural Changes Relative to the Development of Affordability Excursions to Existing Parametric Cost Models

    DTIC Science & Technology

    1988-09-30

    (Abstract not recoverable; the scanned excerpt contains only report documentation page fields naming WCW Associates, Inc. as the performing organization and Battelle as the monitoring organization, plus a figure list: Induced Change; The Culture-Performance Relationship; Culture-Productivity Bridge.)

  2. Solar heating and cooling technical data and systems analysis

    NASA Technical Reports Server (NTRS)

    Christensen, D. L.

    1976-01-01

    The acquisition and processing of selected parametric data for inclusion in a computerized Data Base using the Marshall Information Retrieval and Data System (MIRADS) developed by NASA-MSFC is discussed. This data base provides extensive technical and socioeconomic information related to solar energy heating and cooling on a national scale. A broadly based research approach was used to assist in the support of program management and the application of a cost-effective program for solar energy development and demonstration.

  3. Incorporating external evidence in trial-based cost-effectiveness analyses: the use of resampling methods

    PubMed Central

    2014-01-01

    Background Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in a RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. Methods We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. Results In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. Conclusions The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes. PMID:24888356
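
    The ordinary bootstrap that the authors extend works by resampling patient-level cost and effect pairs with replacement and reading off the distribution of incremental cost-effectiveness. A sketch with synthetic two-arm trial data, purely illustrative:

```python
# Bootstrap for an RCT-based cost-effectiveness analysis (no external evidence).
import numpy as np

rng = np.random.default_rng(7)
n = 200
cost_ctrl = rng.gamma(2.0, 1000.0, n);  eff_ctrl = rng.normal(0.60, 0.10, n)
cost_trt  = rng.gamma(2.0, 1500.0, n);  eff_trt  = rng.normal(0.65, 0.10, n)

wtp = 50000.0                 # willingness to pay per unit of effect
inmb = []                     # incremental net monetary benefit per replicate
for _ in range(2000):
    # Resample each arm with replacement, then recompute the arm means.
    i = rng.integers(0, n, n); j = rng.integers(0, n, n)
    d_cost = cost_trt[j].mean() - cost_ctrl[i].mean()
    d_eff = eff_trt[j].mean() - eff_ctrl[i].mean()
    inmb.append(wtp * d_eff - d_cost)

# Probability the treatment is cost-effective at this threshold (one CEAC point).
print(np.mean(np.array(inmb) > 0))
```

    The paper's Bayesian-bootstrap extension replaces the uniform resampling weights with weights informed by the external evidence.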

  4. Incorporating external evidence in trial-based cost-effectiveness analyses: the use of resampling methods.

    PubMed

    Sadatsafavi, Mohsen; Marra, Carlo; Aaron, Shawn; Bryan, Stirling

    2014-06-03

    Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in a RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes.

  5. Orthognathic cases: what are the surgical costs?

    PubMed

    Kumar, Sanjay; Williams, Alison C; Ireland, Anthony J; Sandy, Jonathan R

    2008-02-01

    This multicentre, retrospective study assessed the cost, and the factors influencing the cost, of combined orthodontic and surgical treatment for dentofacial deformity. The sample, from a single region in England, comprised 352 subjects treated in 11 hospital orthodontic units who underwent orthognathic surgery between 1 January 1995 and 31 March 2000. Statistical analysis of the data was undertaken using non-parametric tests (Spearman and Wilcoxon signed rank). The average total treatment cost for the tax year from 6 April 2000 to 5 April 2001 was €6360.19, with costs ranging from €3835.90 to €12,150.55. The average operating theatre cost was €2189.54 and the average inpatient care cost (including the cost of the intensive care unit and ward stay) was €1455.20. Joint clinic costs comprised, on average, 10 per cent of the total cost, whereas appointments in specialities other than orthodontics comprised 2 per cent of the total costs. Differences in the observed costings between the units were unexplained but may reflect surgical difficulties, differences in clinical practice, or efficiency of patient care. These indicators need to be considered in future outcome studies of orthognathic patients.
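
    The two non-parametric tests named above are one-liners in scipy; the paired cost figures below are invented stand-ins, not the study's data.

```python
# Spearman rank correlation and Wilcoxon signed-rank test on paired costs.
import numpy as np
from scipy.stats import spearmanr, wilcoxon

# Hypothetical per-patient theatre costs and total treatment costs.
theatre_cost = np.array([2100, 1950, 2400, 2250, 1800, 2600, 2050, 2300])
total_cost   = np.array([6200, 5900, 7100, 6800, 5400, 7600, 6100, 6900])

rho, p_rho = spearmanr(theatre_cost, total_cost)   # monotone association
stat, p_w = wilcoxon(theatre_cost, total_cost)     # paired difference in location
print(round(rho, 2), p_rho < 0.05, p_w < 0.05)
```

    Both tests use ranks rather than raw values, which is why they suit skewed cost data like these.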

  6. Phase noise suppression through parametric filtering

    NASA Astrophysics Data System (ADS)

    Cassella, Cristian; Strachan, Scott; Shaw, Steven W.; Piazza, Gianluca

    2017-02-01

    In this work, we introduce and experimentally demonstrate a parametric phase noise suppression technique, which we call "parametric phase noise filtering." This technique is based on the use of a solid-state parametric amplifier operating in its instability region and included in a non-autonomous feedback loop connected at the output of a noisy oscillator. We demonstrate that such a system behaves as a parametrically driven Duffing resonator and can operate at special points where it becomes largely immune to the phase fluctuations that affect the oscillator output signal. A prototype of a parametric phase noise filter (PFIL) was designed and fabricated to operate in the very-high-frequency range. The PFIL prototype allowed us to significantly reduce the phase noise at the output of a commercial signal generator operating around 220 MHz. Noise reduction of 16 dB (40×) and 13 dB (20×) were obtained, respectively, at 1 and 10 kHz offsets from the carrier frequency. The demonstration of this phase noise suppression technique opens up scenarios in the development of passive and low-cost phase noise cancellation circuits for any application demanding high quality frequency generation.

  7. Conceptual design of reduced energy transports

    NASA Technical Reports Server (NTRS)

    Ardema, M. D.; Harper, M.; Smith, C. L.; Waters, M. H.; Williams, L. J.

    1975-01-01

    This paper reports the results of a conceptual design study of new, near-term fuel-conservative aircraft. A parametric study was made to determine the effects of cruise Mach number and fuel cost on the 'optimum' configuration characteristics and on economic performance. Supercritical wing technology and advanced engine cycles were assumed. For each design, the wing geometry was optimized to give maximum return on investment at a particular fuel cost. Based on the results of the parametric study, a reduced energy configuration was selected. Compared with existing transport designs, the reduced energy design has a higher aspect ratio wing with lower sweep, and cruises at a lower Mach number. It yields about 30% more seat-miles/gal than current wide-body aircraft. At the higher fuel costs anticipated in the future, the reduced energy design has about the same economic performance as existing designs.

  8. Summary and evaluation of the parametric study of potential early commercial MHD power plants (PSPEC)

    NASA Technical Reports Server (NTRS)

    Staigner, P. J.; Abbott, J. M.

    1980-01-01

    Two parallel contracted studies were conducted. Each contractor investigated three base cases and parametric variations about these base cases. Each contractor concluded that two of the base cases (a plant using separate firing of an advanced high temperature regenerative air heater with fuel from an advanced coal gasifier, and a plant using an intermediate temperature metallic recuperative heat exchanger to heat oxygen-enriched combustion air) were comparable in both performance and cost of electricity. The contractors differed in the level of their cost estimates, with the capital cost estimates for the MHD topping cycle, and the magnet subsystem in particular, accounting for a significant part of the difference. The impact of the study on the decision to pursue a course leading to an oxygen-enriched plant as the first commercial MHD plant is described.

  9. A Cartesian parametrization for the numerical analysis of material instability

    DOE PAGES

    Mota, Alejandro; Chen, Qiushi; Foulk, III, James W.; ...

    2016-02-25

    We examine four parametrizations of the unit sphere in the context of material stability analysis by means of the singularity of the acoustic tensor. We then propose a Cartesian parametrization for vectors that lie in a cube of side length two and use these vectors in lieu of unit normals to test for the loss of the ellipticity condition. This parametrization is then used to construct a tensor akin to the acoustic tensor. It is shown that both of these tensors become singular at the same time and in the same planes in the presence of a material instability. Furthermore, the performance of the Cartesian parametrization is compared against the other parametrizations, with the results of these comparisons showing that, in general, the Cartesian parametrization is more robust and more numerically efficient than the others.

  10. A Cartesian parametrization for the numerical analysis of material instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mota, Alejandro; Chen, Qiushi; Foulk, III, James W.

    We examine four parametrizations of the unit sphere in the context of material stability analysis by means of the singularity of the acoustic tensor. We then propose a Cartesian parametrization for vectors that lie in a cube of side length two and use these vectors in lieu of unit normals to test for the loss of the ellipticity condition. This parametrization is then used to construct a tensor akin to the acoustic tensor. It is shown that both of these tensors become singular at the same time and in the same planes in the presence of a material instability. Furthermore, the performance of the Cartesian parametrization is compared against the other parametrizations, with the results of these comparisons showing that, in general, the Cartesian parametrization is more robust and more numerically efficient than the others.

  11. Cost Estimation of Naval Ship Acquisition.

    DTIC Science & Technology

    1983-12-01

    ...one a 9-subsystem model, the other a single total cost model. The models were developed using the linear least squares regression technique with... Cited references in the excerpt include An Introduction to Linear Statistical Models (McGraw-Hill, 1961) and Helmer, F. T., Bibliography on Pricing Methodology and Cost Estimating, Dept. of Economics... Key words: cost estimation; acquisition; parametric cost estimate; linear...

  12. Parametric Methods for Dynamic 11C-Phenytoin PET Studies.

    PubMed

    Mansor, Syahir; Yaqub, Maqsood; Boellaard, Ronald; Froklage, Femke E; de Vries, Anke; Bakker, Esther D M; Voskuyl, Rob A; Eriksson, Jonas; Schwarte, Lothar A; Verbeek, Joost; Windhorst, Albert D; Lammertsma, Adriaan A

    2017-03-01

    In this study, the performance of various methods for generating quantitative parametric images of dynamic 11C-phenytoin PET studies was evaluated. Methods: Double-baseline 60-min dynamic 11C-phenytoin PET studies, including online arterial sampling, were acquired for 6 healthy subjects. Parametric images were generated using Logan plot analysis, a basis function method, and spectral analysis. Parametric distribution volume (VT) and influx rate (K1) were compared with those obtained from nonlinear regression analysis of time-activity curves. In addition, global and regional test-retest (TRT) variability was determined for parametric K1 and VT values. Results: Biases in VT observed with all parametric methods were less than 5%. For K1, spectral analysis showed a negative bias of 16%. The mean TRT variabilities of VT and K1 were less than 10% for all methods. Shortening the scan duration to 45 min provided similar VT and K1 with comparable TRT performance compared with 60-min data. Conclusion: Among the various parametric methods tested, the basis function method provided parametric VT and K1 values with the least bias compared with nonlinear regression data and showed TRT variabilities lower than 5%, also for smaller volume-of-interest sizes (i.e., higher noise levels) and shorter scan durations. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
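
    Logan graphical analysis, the first parametric method compared above, estimates VT as the late-time slope of the integrated tissue curve against the integrated plasma curve, both normalised by tissue activity. A sketch on a simulated one-tissue-compartment model with arbitrary rate constants:

```python
# Logan plot on synthetic one-tissue-compartment data; for this model the
# relation is exactly linear with slope K1/k2 = VT.
import numpy as np

K1, k2 = 0.5, 0.25                 # per-min rate constants; true VT = 2.0
dt = 0.01
t = np.arange(0.0, 60.0, dt)
Cp = np.exp(-0.1 * t)              # simplified plasma input function

# Euler integration of dCt/dt = K1*Cp - k2*Ct
Ct = np.zeros_like(t)
for i in range(1, len(t)):
    Ct[i] = Ct[i - 1] + dt * (K1 * Cp[i - 1] - k2 * Ct[i - 1])

int_Cp = np.cumsum(Cp) * dt
int_Ct = np.cumsum(Ct) * dt
late = t > 30.0                    # use the linear (late) portion only
slope = np.polyfit(int_Cp[late] / Ct[late], int_Ct[late] / Ct[late], 1)[0]
print(round(slope, 2))  # approaches the true VT of 2.0
```

    Applied voxel by voxel, the same slope fit yields the parametric VT image the abstract refers to.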

  13. Fuel cell on-site integrated energy system parametric analysis of a residential complex

    NASA Technical Reports Server (NTRS)

    Simons, S. N.

    1977-01-01

    A parametric energy-use analysis was performed for a large apartment complex served by a fuel cell on-site integrated energy system (OS/IES). The variables parameterized include operating characteristics for four phosphoric acid fuel cells, eight OS/IES energy recovery systems, and four climatic locations. The annual fuel consumption for selected parametric combinations is presented, and a breakeven economic analysis is presented for one parametric combination. The results show that fuel cell electrical efficiency and system component choice have the greatest effect on annual fuel consumption; fuel cell thermal efficiency and geographic location have less of an effect.

  14. Cost Modeling for Space Telescope

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2011-01-01

    Parametric cost models are an important tool for planning missions, comparing concepts, and justifying technology investments. This paper presents ongoing efforts to develop single-variable and multi-variable cost models for the space telescope optical telescope assembly (OTA). These models are based on data collected from historical space telescope missions. Standard statistical methods are used to derive cost estimating relationships (CERs) for OTA cost versus aperture diameter and mass. The results are compared with previously published models.
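
    A single-variable CER of the kind described is typically a power law, cost = a * D^b, fitted in log-log space. The data below are synthetic, generated from a made-up ground-truth law, not from the paper's historical mission database.

```python
# Fit a power-law cost estimating relationship (CER) by log-log least squares.
import numpy as np

aperture_m = np.array([0.4, 0.85, 1.1, 2.4, 3.5])        # hypothetical OTAs
cost_musd = 30.0 * aperture_m ** 1.7                      # ground-truth law
cost_musd = cost_musd * np.exp(np.random.default_rng(3).normal(0, 0.05, 5))

# log(cost) = b * log(D) + log(a): an ordinary linear fit in log space.
b, log_a = np.polyfit(np.log(aperture_m), np.log(cost_musd), 1)
print(round(np.exp(log_a), 1), round(b, 2))  # recovers a near 30, b near 1.7
```

    The fitted exponent b is the headline number of such a CER: it says how steeply cost scales with aperture.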

  15. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    Parametric cost estimating methods for space systems in the conceptual design phase are developed. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance, and time. The relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical data base of major research and development programs. It is concluded that the technique presented is sound, but that it must be refined in order to produce acceptable cost estimates.

  16. Cost analysis of a coal-fired power plant using the NPV method

    NASA Astrophysics Data System (ADS)

    Kumar, Ravinder; Sharma, Avdhesh Kr.; Tewari, P. C.

    2015-12-01

    The present study investigates the impact of various factors affecting the economics of a 210 MW subcritical coal-fired power plant situated in north India for electricity generation. In this paper, the cost data of various units of the thermal power plant, in terms of power output capacity, have been fitted using a power law with the help of data collected from a literature search. To obtain a realistic estimate of primary components or equipment, it is necessary to include the latest cost of these components. The cost analysis of the plant was carried out on the basis of total capital investment, operating cost and revenue. The total capital investment includes the total direct plant cost and total indirect plant cost. Total direct plant cost involves the cost of equipment (i.e. boiler, steam turbine, condenser, generator and auxiliary equipment including condensate extraction pump, feed water pump, etc.) and other costs associated with piping, electrical, civil works, direct installation cost, auxiliary services, instrumentation and controls, and site preparation. The total indirect plant cost includes the cost of engineering and set-up. The net present value method was adopted for the present study. The work presented in this paper is an endeavour to study the influence of some of the important parameters on the lifetime costs of a coal-fired power plant. For this purpose, a parametric study, with and without escalation rates, was evaluated for a 35-year plant life. The results show that plant life, interest rate and escalation rate have a much stronger influence on plant economics than the other factors under study.
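
    The NPV calculation underlying the study can be sketched with made-up numbers: an initial capital outlay followed by escalating net revenues, all discounted to present value.

```python
# Net present value of a plant with escalating net cash flows.
def npv(capital, revenue0, opcost0, rate, escalation, years):
    """All money values in the same arbitrary units; rates as fractions."""
    total = -capital
    for t in range(1, years + 1):
        # Net cash flow in year t, escalated from the first-year value.
        cash = (revenue0 - opcost0) * (1.0 + escalation) ** (t - 1)
        total += cash / (1.0 + rate) ** t
    return total

# 35-year plant life, as in the study; the money values are illustrative.
print(round(npv(capital=1000.0, revenue0=180.0, opcost0=80.0,
                rate=0.10, escalation=0.04, years=35), 1))
```

    Rerunning the function across a grid of interest and escalation rates reproduces the kind of sensitivity sweep the abstract describes.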

  17. Application of Climate Impact Metrics to Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    Russell, Carl; Johnson, Wayne

    2013-01-01

    Multiple metrics are applied to the design of large civil rotorcraft, integrating minimum cost and minimum environmental impact. The design mission is passenger transport with similar range and capacity to a regional jet. Separate aircraft designs are generated for minimum empty weight, fuel burn, and environmental impact. A metric specifically developed for the design of aircraft is employed to evaluate emissions. The designs are generated using the NDARC rotorcraft sizing code, and rotor analysis is performed with the CAMRAD II aeromechanics code. Design and mission parameters such as wing loading, disk loading, and cruise altitude are varied to minimize both cost and environmental impact metrics. This paper presents the results of these parametric sweeps as well as the final aircraft designs.

  18. Application of Climate Impact Metrics to Civil Tiltrotor Design

    NASA Technical Reports Server (NTRS)

    Russell, Carl R.; Johnson, Wayne

    2013-01-01

    Multiple metrics are applied to the design of a large civil tiltrotor, integrating minimum cost and minimum environmental impact. The design mission is passenger transport with similar range and capacity to a regional jet. Separate aircraft designs are generated for minimum empty weight, fuel burn, and environmental impact. A metric specifically developed for the design of aircraft is employed to evaluate emissions. The designs are generated using the NDARC rotorcraft sizing code, and rotor analysis is performed with the CAMRAD II aeromechanics code. Design and mission parameters such as wing loading, disk loading, and cruise altitude are varied to minimize both cost and environmental impact metrics. This paper presents the results of these parametric sweeps as well as the final aircraft designs.

  19. The gate studies: Assessing the potential of future small general aviation turbine engines

    NASA Technical Reports Server (NTRS)

    Strack, W. C.

    1979-01-01

    Four studies were completed that explore the opportunities for future General Aviation turbine engines (GATE) in the 150-1000 SHP class. These studies forecast the potential impact of advanced-technology turbine engines in the post-1988 market and identified important aircraft and missions, desirable engine sizes, and engine performance and cost goals. Parametric evaluations of various engine cycles, configurations, design features, and advanced technology elements defined baseline conceptual engines for each of the important missions identified by the market analysis. Both fixed-wing and helicopter aircraft, and turboshaft, turboprop, and turbofan engines were considered. Sizable performance gains (e.g., a 20% decrease in SFC) were predicted, along with engine cost reductions large enough to challenge the reciprocating engine in the 300-500 SHP class.

  20. Cost-effectiveness of early intervention in first-episode psychosis: economic evaluation of a randomised controlled trial (the OPUS study).

    PubMed

    Hastrup, Lene Halling; Kronborg, Christian; Bertelsen, Mette; Jeppesen, Pia; Jorgensen, Per; Petersen, Lone; Thorup, Anne; Simonsen, Erik; Nordentoft, Merete

    2013-01-01

    Information about the cost-effectiveness of early intervention programmes for first-episode psychosis is limited. The aim was to evaluate the cost-effectiveness of an intensive early-intervention programme (called OPUS; trial registration NCT00157313) consisting of enriched assertive community treatment, psychoeducational family treatment and social skills training for individuals with first-episode psychosis, compared with standard treatment. An incremental cost-effectiveness analysis of a randomised controlled trial, adopting a public-sector perspective, was undertaken. The mean total costs of OPUS over 5 years (€123,683, s.e. = 8970) were not significantly different from those of standard treatment (€148,751, s.e. = 13073). At 2-year follow-up the mean Global Assessment of Functioning (GAF) score in the OPUS group (55.16, s.d. = 15.15) was significantly higher than in the standard treatment group (51.13, s.d. = 15.92). However, mean GAF did not differ significantly between the groups at 5-year follow-up (55.35 (s.d. = 18.28) and 54.16 (s.d. = 18.41), respectively). Cost-effectiveness planes based on non-parametric bootstrapping showed that OPUS was less costly and more effective in 70% of the replications. For a willingness-to-pay of up to €50,000 the probability that OPUS was cost-effective was more than 80%. The incremental cost-effectiveness analysis showed a high probability of OPUS being cost-effective compared with standard treatment.
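
    The bootstrapped cost-effectiveness plane used in analyses like this one can be sketched as follows. The patient-level (cost, effect) pairs are synthetic stand-ins, not OPUS data, and a full analysis would also trace out a cost-effectiveness acceptability curve over willingness-to-pay values.

```python
import random

def bootstrap_ce_plane(group_a, group_b, reps=2000, seed=1):
    """Non-parametric bootstrap of the incremental (cost, effect) pair.

    group_a / group_b are lists of (cost, effect) tuples, one per patient.
    Returns the fraction of replications in which A is both cheaper and
    more effective than B (the dominant quadrant of the CE plane)."""
    rng = random.Random(seed)

    def resampled_means(group):
        # Resample whole patients so each cost stays paired with its effect.
        pick = [rng.choice(group) for _ in group]
        n = len(pick)
        return sum(c for c, _ in pick) / n, sum(e for _, e in pick) / n

    dominant = 0
    for _ in range(reps):
        cost_a, eff_a = resampled_means(group_a)
        cost_b, eff_b = resampled_means(group_b)
        if cost_a < cost_b and eff_a > eff_b:
            dominant += 1
    return dominant / reps

# Synthetic groups in which A is clearly cheaper and more effective than B
group_a = [(100 + 2 * i, 58 + i % 5) for i in range(30)]
group_b = [(180 + 2 * i, 48 + i % 5) for i in range(30)]
frac_dominant = bootstrap_ce_plane(group_a, group_b)
```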

  1. NASA/Air Force Cost Model: NAFCOM

    NASA Technical Reports Server (NTRS)

    Winn, Sharon D.; Hamcher, John W. (Technical Monitor)

    2002-01-01

    The NASA/Air Force Cost Model (NAFCOM) is a parametric estimating tool for space hardware. It is based on historical NASA and Air Force space projects and is primarily used in the very early phases of a development project. NAFCOM can be used at the subsystem or component levels.
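
    NAFCOM's actual cost estimating relationships are derived from its historical database and are not reproduced here, but the general shape of a subsystem-level parametric estimate can be sketched with a generic weight-based CER; every coefficient below is an invented placeholder.

```python
def cer(weight_kg, a, b, complexity=1.0):
    """Generic weight-based cost estimating relationship:
    cost = a * weight**b, scaled by a complexity factor."""
    return a * weight_kg ** b * complexity

# Hypothetical subsystems: (dry weight in kg, complexity factor)
subsystems = {
    "structure": (120.0, 1.2),
    "avionics":  (35.0,  2.0),
    "power":     (60.0,  1.5),
}

# Roll subsystem estimates up to a system-level cost (arbitrary units)
total_cost = sum(cer(w, a=50.0, b=0.7, complexity=c)
                 for w, c in subsystems.values())
```

    An exponent b below 1 encodes the economies of scale typically seen in historical hardware data.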

  2. Exergy & economic analysis of biogas fueled solid oxide fuel cell systems

    NASA Astrophysics Data System (ADS)

    Siefert, Nicholas S.; Litster, Shawn

    2014-12-01

    We present an exergy and an economic analysis of a power plant that uses biogas produced from a thermophilic anaerobic digester (AD) to fuel a solid oxide fuel cell (SOFC). We performed a 4-variable parametric analysis of the AD-SOFC system in order to determine the optimal design operating conditions, depending on the objective function of interest. We present results on the exergy efficiency (%), power-normalized capital cost ($ kW-1), and the internal rate of return on investment, IRR (% yr-1), as a function of the current density, the stack pressure, the fuel utilization, and the total air stoichiometric ratio. To the authors' knowledge, this is the first AD-SOFC paper to include the cost of the AD when conducting economic optimization of the AD-SOFC plant. Our calculations show that adding a new AD-SOFC system to an existing waste water treatment (WWT) plant could yield positive values of IRR at today's average electricity prices and could significantly out-compete other options for using biogas to generate electricity. AD-SOFC systems could likely convert WWT plants into net generators of electricity rather than net consumers, while generating economically viable rates of return on investment, provided the costs of SOFC systems are within a factor of two of the DOE/SECA cost targets.
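
    The internal rate of return reported above is the discount rate at which a project's net present value reaches zero, and can be recovered by bisection. The cash flows below are invented round numbers, not the paper's AD-SOFC figures.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[t] falls at the end of year t (t=0 up front)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return by bisection.
    Assumes NPV is positive at `lo`, negative at `hi` (one sign change)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative plant: $1.0M capital cost, then $150k/yr net revenue for 20 years
flows = [-1_000_000] + [150_000] * 20
rate = irr(flows)  # ~14% per year for these numbers
```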

  3. Preliminary Multi-Variable Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. This paper reviews the methodology used to develop space telescope cost models; summarizes recently published single variable models; and presents preliminary results for two and three variable cost models. Some of the findings are that increasing mass reduces cost; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and technology development as a function of time reduces cost at the rate of 50% per 17 years.
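
    The two findings quoted above, cost growing more slowly than collecting area and a technology factor halving cost every 17 years, can be combined into a toy multi-variable model. The diameter exponent and base cost are invented for illustration; only the 50%-per-17-years factor comes from the paper.

```python
def telescope_cost(aperture_m, launch_year, base_cost=1.0,
                   diameter_exp=1.8, ref_year=2010, halving_period=17.0):
    """Toy cost model: power law in aperture diameter times a technology
    factor that halves cost every `halving_period` years."""
    tech_factor = 0.5 ** ((launch_year - ref_year) / halving_period)
    return base_cost * aperture_m ** diameter_exp * tech_factor

c2 = telescope_cost(2.0, 2010)
c4 = telescope_cost(4.0, 2010)
```

    Because the diameter exponent is below 2, cost per square meter of aperture falls as the telescope grows, reproducing the paper's qualitative finding.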

  4. Structural Mass Saving Potential of a 5-MW Direct-Drive Generator Designed for Additive Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sethuraman, Latha; Fingersh, Lee J; Dykes, Katherine L

    As wind turbine blade diameters and tower heights increase to capture more energy in the wind, higher structural loads result in more structural support material, increasing the cost of scaling. Weight reductions in the generator transfer to overall cost savings for the system. Additive manufacturing facilitates a design-for-functionality approach, thereby removing traditional manufacturing constraints and labor costs. The most feasible additive manufacturing technology identified for large, direct-drive generators in this study is powder-binder jetting of a sand cast mold. A parametric finite element analysis optimization study is performed, optimizing for mass and deformation. Also, topology optimization is employed for each parameter-optimized design. The optimized U-beam spoked web design results in a 24 percent reduction in structural mass of the rotor and a 60 percent reduction in radial deflection.

  5. Online Detection of Broken Rotor Bar Fault in Induction Motors by Combining Estimation of Signal Parameters via Min-norm Algorithm and Least Square Method

    NASA Astrophysics Data System (ADS)

    Wang, Pan-Pan; Yu, Qiang; Hu, Yong-Jun; Miao, Chang-Xin

    2017-11-01

    Current research in broken rotor bar (BRB) fault detection in induction motors is primarily focused on a high-frequency-resolution analysis of the stator current. Compared with a discrete Fourier transformation, the parametric spectrum estimation technique has a higher frequency accuracy and resolution. However, the existing detection methods based on parametric spectrum estimation cannot realize online detection, owing to their large computational cost. To improve the efficiency of BRB fault detection, a new detection method based on the min-norm algorithm and least square estimation is proposed in this paper. First, the stator current is filtered using a band-pass filter and divided into short overlapped data windows. The min-norm algorithm is then applied to determine the frequencies of the fundamental and fault characteristic components within each overlapped data window. Next, based on the frequency values obtained, a model of the fault current signal is constructed. Subsequently, a linear least squares problem solved through singular value decomposition is designed to estimate the amplitudes and phases of the related components. Finally, the proposed method is applied to a simulated current and to an actual motor; the results indicate that the method retains the frequency accuracy of parametric spectrum estimation at a computational cost low enough for online detection.
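
    The amplitude/phase estimation step can be illustrated for a single component of known frequency. The paper solves a multi-component model via singular value decomposition; this sketch solves the 2x2 normal equations directly, which is equivalent for one sinusoid.

```python
import math

def fit_sinusoid(samples, fs, freq):
    """Least-squares amplitude and phase of a known-frequency component.

    Model: y[n] = A*cos(w*n) + B*sin(w*n), with w = 2*pi*freq/fs.
    Returns (amplitude, phase) such that y[n] ~ amplitude*cos(w*n + phase)."""
    w = 2.0 * math.pi * freq / fs
    n_samples = len(samples)
    c = [math.cos(w * n) for n in range(n_samples)]
    s = [math.sin(w * n) for n in range(n_samples)]
    # Normal equations for the 2-parameter linear least squares problem
    scc = sum(ci * ci for ci in c)
    sss = sum(si * si for si in s)
    scs = sum(ci * si for ci, si in zip(c, s))
    syc = sum(y * ci for y, ci in zip(samples, c))
    sys_ = sum(y * si for y, si in zip(samples, s))
    det = scc * sss - scs * scs
    coef_a = (syc * sss - sys_ * scs) / det
    coef_b = (sys_ * scc - syc * scs) / det
    return math.hypot(coef_a, coef_b), math.atan2(-coef_b, coef_a)
```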

  6. Model-based approach for design verification and co-optimization of catastrophic and parametric-related defects due to systematic manufacturing variations

    NASA Astrophysics Data System (ADS)

    Perry, Dan; Nakamoto, Mark; Verghese, Nishath; Hurat, Philippe; Rouse, Rich

    2007-03-01

    Model-based hotspot detection and silicon-aware parametric analysis help designers optimize their chips for yield, area and performance without the high cost of applying foundries' recommended design rules. This set of DFM/recommended rules is primarily litho-driven, but cannot guarantee a manufacturable design without imposing overly restrictive design requirements. This rule-based methodology of making design decisions based on idealized polygons that no longer represent what is on silicon needs to be replaced. Using model-based simulation of the lithography, OPC, RET and etch effects, followed by electrical evaluation of the resulting shapes, leads to a more realistic and accurate analysis. This analysis can be used to evaluate intelligent design trade-offs and identify potential failures due to systematic manufacturing defects during the design phase. The successful DFM design methodology consists of three parts: 1. Achieve a more aggressive layout through limited usage of litho-related recommended design rules. A 10% to 15% area reduction is achieved by using more aggressive design rules. DFM/recommended design rules are used only if there is no impact on cell size. 2. Identify and fix hotspots using a model-based layout printability checker. Model-based litho and etch simulation are done at the cell level to identify hotspots. Violations of recommended rules may cause additional hotspots, which are then fixed. The resulting design is ready for step 3. 3. Improve timing accuracy with a process-aware parametric analysis tool for transistors and interconnect. Contours of diffusion, poly and metal layers are used for parametric analysis. In this paper, we show the results of this physical and electrical DFM methodology at Qualcomm. We describe how Qualcomm was able to develop more aggressive cell designs that yielded a 10% to 15% area reduction using this methodology.
Model-based shape simulation was employed during library development to validate architecture choices and to optimize cell layout. At the physical verification stage, the shape simulator was run at full-chip level to identify and fix residual hotspots on interconnect layers, on poly or metal 1 due to interaction between adjacent cells, or on metal 1 due to interaction between routing (via and via cover) and cell geometry. To determine an appropriate electrical DFM solution, Qualcomm developed an experiment to examine various electrical effects. After reporting the silicon results of this experiment, which showed sizeable delay variations due to lithography-related systematic effects, we also explain how contours of diffusion, poly and metal can be used for silicon-aware parametric analysis of transistors and interconnect at the cell-, block- and chip-level.

  7. A Review of Australian Investigations on Aeronautical Fatigue during the Period April 1979 to March 1981.

    DTIC Science & Technology

    1981-03-01

    An advanced iso-parametric element is also being developed specifically for the analysis of disbonds and internal flaws in composite laminates.

  8. Evaluation of automobiles with alternative fuels utilizing multicriteria techniques

    NASA Astrophysics Data System (ADS)

    Brey, J. J.; Contreras, I.; Carazo, A. F.; Brey, R.; Hernández-Díaz, A. G.; Castro, A.

    This work applies the non-parametric technique of Data Envelopment Analysis (DEA) to conduct a multicriteria comparison of some existing and under-development technologies in the automotive sector. The results indicate that some of the technologies under development, such as hydrogen fuel cell vehicles, can be classified as efficient when evaluated against environmental and economic criteria, with greater importance given to the environmental criteria. The article also demonstrates the need to improve hydrogen-based technology, in comparison with the others, in aspects such as vehicle sale cost and fuel price.

  9. Concept definition study of small Brayton cycle engines for dispersed solar electric power systems

    NASA Technical Reports Server (NTRS)

    Six, L. D.; Ashe, T. L.; Dobler, F. X.; Elkins, R. T.

    1980-01-01

    Three first-generation Brayton cycle engine types were studied for solar application: a near-term open cycle (configuration A), a near-term closed cycle (configuration B), and a longer-term open cycle (configuration C). A parametric performance analysis was carried out to select engine designs for the three configurations. The interface requirements for the Brayton cycle engine/generator and solar receivers were determined. A technology assessment was then carried out to define production costs, durability, and growth potential for the selected engine types.

  10. A cost-consequences analysis of an adherence focused pharmacist-led medication review service.

    PubMed

    Desborough, James A; Sach, Tracey; Bhattacharya, Debi; Holland, Richard C; Wright, David J

    2012-02-01

    The aim of this project was to conduct an economic evaluation of the Norfolk Medicines Support Service (NMSS), a pharmacist-led medication review service for patients identified in primary care as non-adherent. The cost-consequences analysis was based on a before-and-after evaluation of the NMSS. Participants completed a self-reported adherence and health-related quality of life questionnaire prior to the review, at 6 weeks and at 6 months. Service provision, prescribing and secondary care costs were considered, and the mean cost before and after the intervention was calculated. One hundred and seventeen patients were included in the evaluation. The mean cost per patient of prescribing and hospital admissions was £2190 in the 6 months prior to the intervention and £1883 in the 6 months after. This equates to a mean cost saving of £307 per patient (parametric 95% confidence interval: -£1269 to £655). The intervention reduced emergency hospital admissions and increased medication adherence, but no significant change in health-related quality of life was observed. The costs of providing this medication review service were offset by the reduction in emergency hospital admissions and savings in medication cost, assuming the findings of the evaluation were real and not an artefact of regression to the mean. This cost-consequences approach provides a transparent descriptive summary for decision-makers to use as the basis for resource allocation decisions. © 2011 The Authors. IJPP © 2011 Royal Pharmaceutical Society.

  11. Preliminary design study of advanced multistage axial flow core compressors

    NASA Technical Reports Server (NTRS)

    Wisler, D. C.; Koch, C. C.; Smith, L. H., Jr.

    1977-01-01

    A preliminary design study was conducted to identify an advanced core compressor for use in new high-bypass-ratio turbofan engines to be introduced into commercial service in the 1980's. An evaluation of anticipated compressor and related component 1985 state-of-the-art technology was conducted. A parametric screening study covering a large number of compressor designs was conducted to determine the influence of the major compressor design features on efficiency, weight, cost, blade life, aircraft direct operating cost, and fuel usage. The trends observed in the parametric screening study were used to develop three high-efficiency, high-economic-payoff compressor designs. These three compressors were studied in greater detail to better evaluate their aerodynamic and mechanical feasibility.

  12. Cost Modeling for Space Optical Telescope Assemblies

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda

    2011-01-01

    Parametric cost models are used to plan missions, compare concepts and justify technology investments. This paper reviews an on-going effort to develop cost modes for space telescopes. This paper summarizes the methodology used to develop cost models and documents how changes to the database have changed previously published preliminary cost models. While the cost models are evolving, the previously published findings remain valid: it costs less per square meter of collecting aperture to build a large telescope than a small telescope; technology development as a function of time reduces cost; and lower areal density telescopes cost more than more massive telescopes.

  13. Analysis of a fuel cell on-site integrated energy system for a residential complex

    NASA Technical Reports Server (NTRS)

    Simons, S. N.; Maag, W. L.

    1979-01-01

    The energy use and costs of an on-site integrated energy system (OS/IES), which provides electric power from an on-site power plant and recovers heat that would normally be rejected to the environment, are compared with those of a conventional system purchasing electricity from a utility; the on-site plant considered is a phosphoric acid fuel cell. The analysis showed that for a 500-unit apartment complex a fuel cell OS/IES would be about 10% more energy conservative in terms of total coal consumption than a diesel OS/IES or a conventional system. The fuel cell OS/IES capital costs could be 30 to 55% greater than the diesel OS/IES capital costs for the same life-cycle costs. The life-cycle cost of a fuel cell OS/IES would be lower than that of a conventional system as long as the cost of electricity is greater than $0.05 to $0.065/kWh. An analysis of several parametric combinations of fuel cell power plant and state-of-the-art energy recovery systems was made, along with annual fuel requirement calculations for four locations. It was shown that OS/IES component choices are a major factor in fuel consumption, with the least efficient system using 25% more fuel than the most efficient. Central air conditioning and heat pumps result in minimum fuel consumption while individual air conditioning units increase it, and in general the fuel cell with the highest electrical efficiency has the lowest fuel consumption.

  14. Problems of the design of low-noise input devices. [parametric amplifiers

    NASA Technical Reports Server (NTRS)

    Manokhin, V. M.; Nemlikher, Y. A.; Strukov, I. A.; Sharfov, Y. A.

    1974-01-01

    An analysis is given of the requirements placed on the elements of parametric centimeter waveband amplifiers for achievement of minimal noise temperatures. A low-noise semiconductor parametric amplifier using germanium parametric diodes for a receiver operating in the 4 GHz band was developed and tested confirming the possibility of satisfying all requirements.

  15. Thin-Film Photovoltaic Solar Array Parametric Assessment

    NASA Technical Reports Server (NTRS)

    Hoffman, David J.; Kerslake, Thomas W.; Hepp, Aloysius F.; Jacobs, Mark K.; Ponnusamy, Deva

    2000-01-01

    This paper summarizes a study whose objective was to develop a model and parametrically determine the circumstances under which lightweight thin-film photovoltaic solar arrays would be more beneficial, in terms of mass and cost, than arrays using high-efficiency crystalline solar cells. Previous studies considering arrays with near-term thin-film technology for Earth-orbiting applications are briefly reviewed. The present study uses a parametric approach that evaluated the performance of lightweight thin-film arrays with cell efficiencies ranging from 5 to 20 percent. The model developed for this study is described in some detail. Similar mass and cost trends for each array option were found across eight missions of various power levels in locations ranging from Venus to Jupiter. The results for one specific mission, a main belt asteroid tour, indicate that only moderate thin-film cell efficiency (approx. 12 percent) is necessary to match the mass of arrays using crystalline cells with much greater efficiency (35 percent multi-junction GaAs based and 20 percent thin-silicon). Regarding cost, a 12 percent efficient thin-film array is projected to cost about half as much as a 4-junction GaAs array. While efficiency improvements beyond 12 percent did not significantly further improve the mass and cost benefits for thin-film arrays, higher efficiency will be needed to mitigate the spacecraft-level impacts associated with large deployed array areas. A low-temperature approach to depositing thin-film cells on lightweight, flexible plastic substrates is briefly described. The paper concludes with the observation that with the characteristics assumed for this study, ultra-lightweight arrays using efficient, thin-film cells on flexible substrates may become a leading alternative for a wide variety of space missions.
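
    The mass/area trade described above follows from simple sizing arithmetic: array area scales inversely with cell efficiency, while mass is area times areal density. The efficiency and areal-density values below are invented placeholders, not the study's model inputs.

```python
SOLAR_FLUX = 1367.0  # W/m^2, solar constant near 1 AU

def array_mass_and_area(power_w, efficiency, areal_density_kg_m2):
    """Size a solar array at 1 AU: area needed for `power_w`, and its mass."""
    area_m2 = power_w / (SOLAR_FLUX * efficiency)
    return area_m2 * areal_density_kg_m2, area_m2

# Hypothetical 10 kW arrays: thin film (12%, 0.3 kg/m^2) vs.
# multi-junction crystalline (30%, 1.0 kg/m^2)
tf_mass, tf_area = array_mass_and_area(10_000.0, 0.12, 0.3)
xt_mass, xt_area = array_mass_and_area(10_000.0, 0.30, 1.0)
```

    With these placeholder numbers the thin-film array is lighter but needs about 2.5 times the deployed area, which is exactly the spacecraft-level penalty the paper flags.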

  16. Cost-effectiveness of cerebrospinal biomarkers for the diagnosis of Alzheimer's disease.

    PubMed

    Lee, Spencer A W; Sposato, Luciano A; Hachinski, Vladimir; Cipriano, Lauren E

    2017-03-16

    Accurate and timely diagnosis of Alzheimer's disease (AD) is important for prompt initiation of treatment in patients with AD and to avoid inappropriate treatment of patients with false-positive diagnoses. Using a Markov model, we estimated the lifetime costs and quality-adjusted life-years (QALYs) of cerebrospinal fluid biomarker analysis in a cohort of patients referred to a neurologist or memory clinic with suspected AD who remained without a definitive diagnosis of AD or another condition after neuroimaging. Parametric values were estimated from previous health economic models and the medical literature. Extensive deterministic and probabilistic sensitivity analyses were performed to evaluate the robustness of the results. At a 12.7% pretest probability of AD, biomarker analysis after normal neuroimaging findings has an incremental cost-effectiveness ratio (ICER) of $11,032 per QALY gained. Results were sensitive to the pretest prevalence of AD, and the ICER increased to over $50,000 per QALY when the prevalence of AD fell below 9%. Results were also sensitive to patient age (biomarkers are less cost-effective in older cohorts), treatment uptake and adherence, biomarker test characteristics, and the degree to which patients with suspected AD who do not have AD benefit from AD treatment when they are falsely diagnosed. The cost-effectiveness of biomarker analysis depends critically on the prevalence of AD in the tested population. In general practice, where the prevalence of AD after clinical assessment and normal neuroimaging findings may be low, biomarker analysis is unlikely to be cost-effective at a willingness-to-pay threshold of $50,000 per QALY gained. However, when at least 1 in 11 patients has AD after normal neuroimaging findings, biomarker analysis is likely cost-effective. 
Specifically, for patients referred to memory clinics with memory impairment who do not present neuroimaging evidence of medial temporal lobe atrophy, pretest prevalence of AD may exceed 15%. Biomarker analysis is a potentially cost-saving diagnostic method and should be considered for adoption in high-prevalence centers.
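
    A Markov cohort model of the kind used in this study tracks a patient cohort through health states, accruing discounted costs and QALYs each cycle. The three states, transition probabilities, costs and utilities below are invented for illustration, not the paper's calibrated values.

```python
def run_markov(trans, state_costs, state_qalys, start, cycles, disc=0.03):
    """Markov cohort model. trans[i][j] is the per-cycle probability of
    moving from state i to state j; returns (discounted cost, discounted QALYs)."""
    dist = list(start)
    total_cost = total_qaly = 0.0
    n = len(dist)
    for t in range(cycles):
        df = 1.0 / (1.0 + disc) ** t  # discount factor for cycle t
        total_cost += df * sum(p * c for p, c in zip(dist, state_costs))
        total_qaly += df * sum(p * q for p, q in zip(dist, state_qalys))
        dist = [sum(dist[i] * trans[i][j] for i in range(n)) for j in range(n)]
    return total_cost, total_qaly

# Hypothetical 3-state model: mild AD -> severe AD -> dead (absorbing)
trans = [[0.85, 0.10, 0.05],
         [0.00, 0.85, 0.15],
         [0.00, 0.00, 1.00]]
cost, qaly = run_markov(trans, state_costs=[10_000.0, 40_000.0, 0.0],
                        state_qalys=[0.75, 0.40, 0.0],
                        start=[1.0, 0.0, 0.0], cycles=20)
```

    An incremental cost-effectiveness ratio then compares two such runs: (cost_new - cost_old) / (qaly_new - qaly_old).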

  17. [Detection of quadratic phase coupling between EEG signal components by nonparamatric and parametric methods of bispectral analysis].

    PubMed

    Schmidt, K; Witte, H

    1999-11-01

    Recently the assumption of the independence of individual frequency components in a signal has been rejected, for example, for the EEG during defined physiological states such as sleep or sedation [9, 10]. Thus, the use of higher-order spectral analysis, capable of detecting interrelations between individual signal components, has proved useful. The aim of the present study was to investigate the quality of various non-parametric and parametric estimation algorithms using simulated as well as true physiological data. We employed standard algorithms available for MATLAB. The results clearly show that parametric bispectral estimation is superior to non-parametric estimation in terms of the quality of peak localisation and the discrimination from other peaks.

  18. Technology Estimating 2: A Process to Determine the Cost and Schedule of Space Technology Research and Development

    NASA Technical Reports Server (NTRS)

    Cole, Stuart K.; Wallace, Jon; Schaffer, Mark; May, M. Scott; Greenberg, Marc W.

    2014-01-01

    As a leader in space technology research and development, NASA is continuing the development of the Technology Estimating process, initiated in 2012, for estimating the cost and schedule of low-maturity technology research and development, where the Technology Readiness Level (TRL) is less than TRL 6. NASA's Technology Roadmap comprises 14 technology areas. The focus of this continuing Technology Estimating effort included four technology areas (TA): TA3 Space Power and Energy Storage, TA4 Robotics, TA8 Instruments, and TA12 Materials, to confine the research to the most abundant data pool. This research report continues the technology estimating efforts completed during 2013-2014 and addresses the refinement of the parameters selected and recommended for use in the estimating process, where the parameters developed are applicable to cost estimating relationships (CERs) used in parametric cost estimating analysis. This research also addresses the architecture for administration of the Technology Cost and Schedule Estimating tool, the parameters suggested for computer software adjunct to any technology area, and the identification of gaps in the Technology Estimating process.

  19. Antenna concepts for interstellar search systems

    NASA Technical Reports Server (NTRS)

    Basler, R. P.; Johnson, G. L.; Vondrak, R. R.

    1977-01-01

    An evaluation is made of microwave receiving systems designed to search for signals from extraterrestrial intelligence. Specific design concepts are analyzed parametrically to determine whether the optimum antenna system location is on earth, in space, or on the moon. Parameters considered include the hypothesized number of transmitting civilizations, the number of stars that must be searched to give any desired probability of receiving a signal, the antenna collecting area, the search time, the search range, and the cost. This analysis suggests that (1) search systems based on the moon are not cost-competitive, (2) if the search is extended only a few hundred light years from the earth, a Cyclops-type array on earth may be the most cost-effective system, (3) for a search extending to 500 light years or more, a substantial cost and search-time advantage can be achieved with a large spherical reflector in space with multiple feeds, (4) radio frequency interference shields can be provided for space systems, and (5) cost can range from a few hundred million to tens of billions of dollars, depending on the parameter values assumed.

  20. Using Parametric Cost Models to Estimate Engineering and Installation Costs of Selected Electronic Communications Systems

    DTIC Science & Technology

    1994-09-01


  1. Review of Statistical Methods for Analysing Healthcare Resources and Costs

    PubMed Central

    Mihaylova, Borislava; Briggs, Andrew; O'Hagan, Anthony; Thompson, Simon G

    2011-01-01

    We review statistical methods for analysing healthcare resource use and costs, their ability to address skewness, excess zeros, multimodality and heavy right tails, and their ease for general use. We aim to provide guidance on analysing resource use and costs focusing on randomised trials, although methods often have wider applicability. Twelve broad categories of methods were identified: (I) methods based on the normal distribution, (II) methods following transformation of data, (III) single-distribution generalized linear models (GLMs), (IV) parametric models based on skewed distributions outside the GLM family, (V) models based on mixtures of parametric distributions, (VI) two (or multi)-part and Tobit models, (VII) survival methods, (VIII) non-parametric methods, (IX) methods based on truncation or trimming of data, (X) data components models, (XI) methods based on averaging across models, and (XII) Markov chain methods. Based on this review, our recommendations are that, first, simple methods are preferred in large samples where the near-normality of sample means is assured. Second, in somewhat smaller samples, relatively simple methods, able to deal with one or two of above data characteristics, may be preferable but checking sensitivity to assumptions is necessary. Finally, some more complex methods hold promise, but are relatively untried; their implementation requires substantial expertise and they are not currently recommended for wider applied work. Copyright © 2010 John Wiley & Sons, Ltd. PMID:20799344

  2. First-Order Parametric Model of Reflectance Spectra for Dyed Fabrics

    DTIC Science & Technology

    2016-02-19

    This report describes a first-order parametric model of reflectance spectra for dyed fabrics, providing for both inverse and direct analysis. The dyes considered contain spectral features that are of interest to the U.S. Navy.

  3. Empirical validation of statistical parametric mapping for group imaging of fast neural activity using electrical impedance tomography.

    PubMed

    Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D

    2016-06-01

    Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10,000 voxels, which means statistical analysis of such images presents a substantial multiple-testing problem. One way to optimally correct for these issues while maintaining the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05), and the two analyses yielded similar results, although the non-parametric analysis was less conservative. These results represent the first statistical analysis of such an image set and indicate that random field theory is a viable approach for EIT images of neural activity.
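
    The non-parametric check used here can be illustrated with a sign-flip permutation test on subject-level effect values: under the null hypothesis of no activation, each value's sign is exchangeable. The test data in the assertions are synthetic.

```python
import random

def sign_flip_test(values, reps=5000, seed=7):
    """One-sample sign-flip permutation test (non-parametric).

    Estimates the p-value for the group mean differing from zero by
    randomly flipping the sign of each subject's value and counting how
    often the permuted mean is at least as extreme as the observed one."""
    rng = random.Random(seed)
    n = len(values)
    observed = abs(sum(values) / n)
    hits = 0
    for _ in range(reps):
        flipped = sum(v if rng.random() < 0.5 else -v for v in values) / n
        if abs(flipped) >= observed:
            hits += 1
    return hits / reps
```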

  4. Empirical validation of statistical parametric mapping for group imaging of fast neural activity using electrical impedance tomography

    PubMed Central

    Packham, B; Barnes, G; dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D

    2016-01-01

    Abstract Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is a valid approach for EIT images of neural activity. PMID:27203477

  5. Numerical model of solar dynamic radiator for parametric analysis

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer L.

    1989-01-01

    Growth power requirements for Space Station Freedom will be met through addition of 25 kW solar dynamic (SD) power modules. Extensive thermal and power cycle modeling capabilities have been developed which are powerful tools in Station design and analysis, but which prove cumbersome and costly for simple component preliminary design studies. In order to aid in refining the SD radiator to the mature design stage, a simple and flexible numerical model was developed. The model simulates heat transfer and fluid flow performance of the radiator and calculates area, mass, and impact survivability for many combinations of flow tube and panel configurations, fluid and material properties, and environmental and cycle variations.

  6. Advanced Transportation System Studies. Technical Area 3: Alternate Propulsion Subsystem Concepts. Volume 1; Executive Summary

    NASA Technical Reports Server (NTRS)

    Levack, Daniel J. H.

    2000-01-01

    The Alternate Propulsion Subsystem Concepts contract had seven tasks defined that are reported under this contract deliverable. The tasks were: F-1A Restart Study, J-2S Restart Study, Propulsion Database Development, SSME Upper Stage Use, CERs for Liquid Propellant Rocket Engines, Advanced Low Cost Engines, and Tripropellant Comparison Study. The two restart studies, F-1A and J-2S, generated program plans for restarting production of each engine. Special emphasis was placed on determining changes to individual parts due to obsolete materials, changes in OSHA and environmental concerns, new processes available, and any configuration changes to the engines. The Propulsion Database Development task developed a database structure and format which is easy to use and modify while also being comprehensive in the level of detail available. The database structure included extensive engine information and allows for parametric data generation for conceptual engine concepts. The SSME Upper Stage Use task examined the changes needed or desirable to use the SSME as an upper stage engine, both in a second stage and in a translunar injection stage. The CERs for Liquid Engines task developed quantitative parametric cost estimating relationships at the engine and major subassembly level for estimating development and production costs of chemical propulsion liquid rocket engines. The Advanced Low Cost Engines task examined propulsion systems for SSTO applications, including engine concept definition, mission analysis, trade studies, operating point selection, turbomachinery alternatives, life cycle cost, weight definition, and point design conceptual drawings and component design. The task concentrated on bipropellant engines, but also examined tripropellant engines. The Tripropellant Comparison Study task provided an unambiguous comparison among various tripropellant implementation approaches and cycle choices, and then compared them to similarly designed bipropellant engines in the SSTO mission. This volume overviews each of the tasks, giving its objectives, main results, and conclusions. More detailed Final Task Reports are available on each individual task.

  7. EEG Correlates of Fluctuation in Cognitive Performance in an Air Traffic Control Task

    DTIC Science & Technology

    2014-11-01

    using non-parametric statistical analysis to identify neurophysiological patterns due to the time-on-task effect. Significant changes in EEG power...EEG, Cognitive Performance, Power Spectral Analysis, Non-Parametric Analysis. Document is available to the public through the Internet...

  8. A population-based study of hospital care costs during five years after TIA and stroke

    PubMed Central

    Luengo-Fernandez, Ramon; Gray, Alastair M.; Rothwell, Peter M.

    2016-01-01

    Background and Purpose Few studies have evaluated long-term costs after stroke onset, with almost no cost data for TIA. We studied hospital costs during the 5 years after TIA or stroke in a population-based study. Methods Patients from a UK population-based cohort study (Oxford Vascular Study) were recruited from 2002 to 2007. Analysis was based on follow-up until 2010. Hospital resource usage was obtained from patients’ hospital records and valued using 2008/09 unit costs. As not all patients had full 5-year follow-up, we used non-parametric censoring techniques. Results Among 485 TIA and 729 stroke patients ascertained and included, mean censor-adjusted 5-year hospital costs after index stroke were $25,741 (95% CI: 23,659-27,914), with costs varying considerably by severity: $21,134 after minor stroke, $33,119 after moderate stroke, and $28,552 after severe stroke. For the 239 surviving stroke patients who had reached final follow-up, mean costs were $24,383 (20,156-28,595), with over half of costs ($12,972) being incurred in the first year after the event. After index TIA, the mean censor-adjusted 5-year costs were $18,091 (15,947-20,258). A multivariate analysis showed that event severity, recurrent stroke and coronary events after the index event were independent predictors of 5-year costs. Differences by stroke subtype were mostly explained by stroke severity and subsequent events. Conclusions Long-term hospital costs after TIA and stroke are considerable, but are mainly incurred over the first year after the index event. Event severity and suffering subsequent stroke and coronary events after the index event accounted for much of the increase in costs. PMID:23160884

  9. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.
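The parametric method mentioned above typically reduces to a cost estimating relationship (CER), often a power law cost = a * W^b fit to historical data in log space. The sketch below is a generic illustration of that technique, not NASA's or the FAA's actual model; the data points are invented:

```python
import numpy as np

# Hypothetical historical data points: (weight in kg, cost in $M).
weights = np.array([120.0, 250.0, 400.0, 800.0, 1500.0])
costs = np.array([14.0, 26.0, 38.0, 66.0, 110.0])

# Fit log(cost) = log(a) + b*log(weight) by ordinary least squares.
b, log_a = np.polyfit(np.log(weights), np.log(costs), 1)
a = np.exp(log_a)

def cer(weight_kg):
    """Parametric cost estimate from the fitted power-law CER."""
    return a * weight_kg ** b

print(f"cost = {a:.3f} * W^{b:.2f}; estimate for a 600 kg item: ${cer(600):.1f}M")
```

A slope b below 1 captures the usual economy of scale in such CERs; the analogy and expert-opinion methods in the paper would then adjust or anchor the fitted curve.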

  10. Parametric mapping using spectral analysis for 11C-PBR28 PET reveals neuroinflammation in mild cognitive impairment subjects.

    PubMed

    Fan, Zhen; Dani, Melanie; Femminella, Grazia D; Wood, Melanie; Calsolaro, Valeria; Veronese, Mattia; Turkheimer, Federico; Gentleman, Steve; Brooks, David J; Hinz, Rainer; Edison, Paul

    2018-07-01

    Neuroinflammation and microglial activation play an important role in amnestic mild cognitive impairment (MCI) and Alzheimer's disease. In this study, we investigated the spatial distribution of neuroinflammation in MCI subjects, using spectral analysis (SA) to generate parametric maps and quantify 11C-PBR28 PET, and compared these with compartmental and other kinetic models of quantification. Thirteen MCI and nine healthy controls (HC) were enrolled in this study. Subjects underwent 11C-PBR28 PET scans with arterial cannulation. Spectral analysis with an arterial plasma input function was used to generate 11C-PBR28 parametric maps. These maps were then compared with regional 11C-PBR28 VT (volume of distribution) using a two-tissue compartment model and Logan graphic analysis. Amyloid load was also assessed with 18F-Flutemetamol PET. With SA, three component peaks were identified in addition to blood volume. The 11C-PBR28 impulse response function (IRF) at 90 min produced the lowest coefficient of variation. Single-subject analysis using this IRF demonstrated microglial activation in five out of seven amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake revealed a group-wise significant increase in neuroinflammation in amyloid-positive MCI subjects versus HC in multiple cortical association areas, and particularly in the temporal lobe. Interestingly, compartmental analysis detected a group-wise increase in 11C-PBR28 binding in the thalamus of amyloid-positive MCI subjects, while Logan parametric maps did not perform well. This study demonstrates for the first time that spectral analysis can be used to generate parametric maps of 11C-PBR28 uptake, and is able to detect microglial activation in amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake allow voxel-wise single-subject analysis and could be used to evaluate microglial activation in individual subjects.

  11. Regression analysis on the variation in efficiency frontiers for prevention stage of HIV/AIDS.

    PubMed

    Kamae, Maki S; Kamae, Isao; Cohen, Joshua T; Neumann, Peter J

    2011-01-01

    To investigate how the cost effectiveness of preventing HIV/AIDS varies across possible efficiency frontiers (EFs) by taking into account potentially relevant external factors, such as prevention stage, and how the EFs can be characterized using regression analysis given uncertainty of the QALY-cost estimates. We reviewed cost-effectiveness estimates for the prevention and treatment of HIV/AIDS published from 2002-2007 and catalogued in the Tufts Medical Center Cost-Effectiveness Analysis (CEA) Registry. We constructed efficiency frontier (EF) curves by plotting QALYs against costs, using methods used by the Institute for Quality and Efficiency in Health Care (IQWiG) in Germany. We stratified the QALY-cost ratios by prevention stage, country of study, and payer perspective, and estimated EF equations using log and square-root models. A total of 53 QALY-cost ratios were identified for HIV/AIDS in the Tufts CEA Registry. Plotted ratios stratified by prevention stage were visually grouped into a cluster consisting of primary/secondary prevention measures and a cluster consisting of tertiary measures. Correlation coefficients for each cluster were statistically significant. For each cluster, we derived two EF equations - one based on the log model, and one based on the square-root model. Our findings indicate that stratification of HIV/AIDS interventions by prevention stage can yield distinct EFs, and that the correlation and regression analyses are useful for parametrically characterizing EF equations. Our study has certain limitations, such as the small number of included articles and the potential for study populations to be non-representative of countries of interest. Nonetheless, our approach could help develop a deeper appreciation of cost effectiveness beyond the deterministic approach developed by IQWiG.

  12. Cost effectiveness of a systematic guidelines-based approach to the prevention and management of vascular disease in a primary care setting.

    PubMed

    Kamboj, Laveena; Oh, Paul; Levine, Mitchell; Kammila, Srinu; Casey, William; Harterre, Don; Goeree, Ron

    2016-01-15

    In Ontario, Canada, the Comprehensive Vascular Disease Prevention and Management Initiative (CVDPMI) was undertaken to improve the vascular health in communities. The CVDPMI significantly improved cardiovascular (CV) risk factor profiles from baseline to follow-up visits including the 10 year Framingham Risk Score (FRS). Although the CVDPMI improved CV risk, the economic value of this program had not been evaluated. We examined the cost effectiveness of the CVDPMI program compared to no CVDPMI program in adult patients identified at risk for an initial or subsequent vascular event in a primary care setting. One-year and ten-year cost-effectiveness analyses were conducted. To determine the uncertainty around the cost per life year gained ratio, a non-parametric bootstrap analysis was conducted. The overall population base case analysis at one year resulted in a cost per CV event avoided of $70,423. FRS subgroup analyses showed the high risk cohort (FRS >20%) had an incremental cost effectiveness ratio (ICER) that was dominant. In the moderate risk subgroup (FRS 10%-20%) the ICER was $47,439 per CV event avoided and the low risk subgroup (FRS <10%) showed a highly cost ineffective result of greater than $5 million per CV event avoided. The ten year analysis resulted in a dominant ICER. At one year, the CVDPMI program is economically acceptable for patients at moderate to high risk for CV events. The CVDPMI results in increased life expectancy at an incremental cost saving to the healthcare system over a ten year period. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
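The non-parametric bootstrap used above to quantify uncertainty around a cost-effectiveness ratio can be sketched as follows. The patient-level costs and event indicators are simulated, not the CVDPMI dataset, and the distributions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300
# Simulated per-patient costs and "event avoided" indicators for two arms.
cost_prog = rng.gamma(2.0, 1500.0, n)   # programme arm (right-skewed costs)
cost_ctrl = rng.gamma(2.0, 1200.0, n)   # control arm
eff_prog = rng.binomial(1, 0.12, n)
eff_ctrl = rng.binomial(1, 0.08, n)

icers = []
for _ in range(2000):
    i = rng.integers(0, n, n)           # resample patients with replacement
    j = rng.integers(0, n, n)
    d_cost = cost_prog[i].mean() - cost_ctrl[j].mean()
    d_eff = eff_prog[i].mean() - eff_ctrl[j].mean()
    if d_eff > 0:                       # ratio is undefined without extra effect
        icers.append(d_cost / d_eff)

lo_ci, hi_ci = np.percentile(icers, [2.5, 97.5])
print(f"bootstrap 95% interval, cost per event avoided: {lo_ci:.0f} to {hi_ci:.0f}")
```

In practice the resampled (d_cost, d_eff) pairs are usually plotted on the cost-effectiveness plane rather than reduced to a ratio interval, since the ICER is unstable when the effect difference is near zero.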

  13. Parametric Modelling of As-Built Beam Framed Structure in Bim Environment

    NASA Astrophysics Data System (ADS)

    Yang, X.; Koehl, M.; Grussenmeyer, P.

    2017-02-01

    A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, attributional and dynamic information management and results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform, which makes it possible to integrate the traditional geometry modelling, parametric element management and structural analysis together. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, as the typical BIM software, provides the platform for parametric modelling and information management, an API plugin, able to automatically create the parametric beam elements and link them together with strict relationships, was developed. The plugin under development, which can obtain the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data, is introduced in the paper. The results show the potential of automating the parametric modelling by interactive API development in BIM environment. It also integrates the separate data processing and different platforms into the uniform Revit software.

  14. SOSPAC- SOLAR SPACE POWER ANALYSIS CODE

    NASA Technical Reports Server (NTRS)

    Selcuk, M. K.

    1994-01-01

    The Solar Space Power Analysis Code, SOSPAC, was developed to examine the solar thermal and photovoltaic power generation options available for a satellite or spacecraft in low earth orbit. SOSPAC is a preliminary systems analysis tool and enables the engineer to compare the areas, weights, and costs of several candidate electric and thermal power systems. The configurations studied include photovoltaic arrays and parabolic dish systems to produce electricity only, and in various combinations to provide both thermal and electric power. SOSPAC has been used for comparison and parametric studies of proposed power systems for the NASA Space Station. The initial requirements are projected to be about 40 kW of electrical power, and a similar amount of thermal power with temperatures above 1000 degrees Centigrade. For objects in low earth orbit, the aerodynamic drag caused by suitably large photovoltaic arrays is very substantial. Smaller parabolic dishes can provide thermal energy at a collection efficiency of about 80%, but at increased cost. SOSPAC allows an analysis of cost and performance factors of five hybrid power generating systems. Input includes electrical and thermal power requirements, sun and shade durations for the satellite, and unit weight and cost for subsystems and components. Performance equations of the five configurations are derived, and the output tabulates total weights of the power plant assemblies, area of the arrays, efficiencies, and costs. SOSPAC is written in FORTRAN IV for batch execution and has been implemented on an IBM PC computer operating under DOS with a central memory requirement of approximately 60K of 8 bit bytes. This program was developed in 1985.

  15. Development and validation of chemistry agnostic flow battery cost performance model and application to nonaqueous electrolyte systems: Chemistry agnostic flow battery cost performance model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, Alasdair; Thomsen, Edwin; Reed, David

    2016-04-20

    A chemistry agnostic cost performance model is described for a nonaqueous flow battery. The model predicts flow battery performance by estimating the active reaction zone thickness at each electrode as a function of current density, state of charge, and flow rate, using measured data for electrode kinetics, electrolyte conductivity, and electrode-specific surface area. Validation of the model is conducted using 4 kW stack data at various current densities and flow rates. This model is used to estimate the performance of a nonaqueous flow battery with electrode and electrolyte properties taken from the literature. The optimized cost for this system is estimated for various power and energy levels using component costs provided by vendors. The model allows optimization of design parameters such as electrode thickness, area, flow path design, and operating parameters such as power density, flow rate, and operating SOC range for various application duty cycles. A parametric analysis is done to identify components and electrode/electrolyte properties with the highest impact on system cost for various application durations. A pathway to $100/kWh for the storage system is identified.
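The kind of parametric analysis described above, ranking components by their impact on system cost, can be sketched as a one-at-a-time perturbation study. The cost function and all baseline numbers below are invented for illustration; they are not the paper's model:

```python
# Toy flow-battery cost model: stack and balance-of-plant costs scale with
# power, electrolyte cost scales with energy (power * duration).
def system_cost(power_kw, hours, stack_cost_per_kw, electrolyte_cost_per_kwh,
                bop_cost_per_kw):
    energy_kwh = power_kw * hours
    return (power_kw * (stack_cost_per_kw + bop_cost_per_kw)
            + energy_kwh * electrolyte_cost_per_kwh)

baseline = dict(power_kw=1000.0, hours=4.0, stack_cost_per_kw=600.0,
                electrolyte_cost_per_kwh=180.0, bop_cost_per_kw=300.0)
base_cost = system_cost(**baseline)

# Perturb each component cost by +20% and rank by impact on total cost.
impacts = {}
for name in ("stack_cost_per_kw", "electrolyte_cost_per_kwh", "bop_cost_per_kw"):
    params = dict(baseline)
    params[name] *= 1.2
    impacts[name] = system_cost(**params) - base_cost

for name, delta in sorted(impacts.items(), key=lambda kv: -kv[1]):
    print(f"{name}: +${delta:,.0f}")
```

For a 4-hour duty cycle the energy-linked electrolyte cost dominates this toy ranking; at shorter durations the power-linked stack cost would, which is the duration dependence the abstract alludes to.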

  16. Economic evaluation of genomic test-directed chemotherapy for early-stage lymph node-positive breast cancer.

    PubMed

    Hall, Peter S; McCabe, Christopher; Stein, Robert C; Cameron, David

    2012-01-04

    Multi-parameter genomic tests identify patients with early-stage breast cancer who are likely to derive little benefit from adjuvant chemotherapy. These tests can potentially spare patients the morbidity from unnecessary chemotherapy and reduce costs. However, the costs of the test must be balanced against the health benefits and cost savings produced. This economic evaluation compared genomic test-directed chemotherapy using the Oncotype DX 21-gene assay with chemotherapy for all eligible patients with lymph node-positive, estrogen receptor-positive early-stage breast cancer. We performed a cost-utility analysis using a state transition model to calculate expected costs and benefits over the lifetime of a cohort of women with estrogen receptor-positive lymph node-positive breast cancer from a UK perspective. Recurrence rates for Oncotype DX-selected risk groups were derived from parametric survival models fitted to data from the Southwest Oncology Group 8814 trial. The primary outcome was the incremental cost-effectiveness ratio, expressed as the cost (in 2011 GBP) per quality-adjusted life-year (QALY). Confidence in the incremental cost-effectiveness ratio was expressed as a probability of cost-effectiveness and was calculated using Monte Carlo simulation. Model parameters were varied deterministically and probabilistically in sensitivity analysis. Value of information analysis was used to rank priorities for further research. The incremental cost-effectiveness ratio for Oncotype DX-directed chemotherapy using a recurrence score cutoff of 18 was £5529 (US $8852) per QALY. The probability that test-directed chemotherapy is cost-effective was 0.61 at a willingness-to-pay threshold of £30 000 per QALY. Results were sensitive to the recurrence rate, long-term anthracycline-related cardiac toxicity, quality of life, test cost, and the time horizon. 
The highest priority for further research identified by value of information analysis is the recurrence rate in test-selected subgroups. There is substantial uncertainty regarding the cost-effectiveness of Oncotype DX-directed chemotherapy. It is particularly important that future research studies to inform cost-effectiveness-based decisions collect long-term outcome data.

  17. Predicting Production Costs for Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.; Weston, R. P.

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This paper outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this paper is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool.
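The roll-up described above, elemental costs driven by geometry and scaled by a complexity modulus, can be sketched in a few lines. The element names, rates, and moduli below are illustrative assumptions, not data from the paper:

```python
# Toy process-based cost roll-up: each structural element's cost is its
# geometric driver (surface area here) times a process rate, scaled by a
# "cost modulus" capturing material/shape/precision complexity.
elements = [
    # (name, area_m2, rate_usd_per_m2, modulus)
    ("wing skin panel", 42.0, 800.0, 1.4),   # curved composite: higher modulus
    ("fuselage frame", 18.0, 650.0, 1.1),
    ("bulkhead", 9.5, 700.0, 1.8),           # precision machining
]

def element_cost(area, rate, modulus):
    return area * rate * modulus

total = sum(element_cost(a, r, m) for _, a, r, m in elements)
for name, a, r, m in elements:
    print(f"{name}: ${element_cost(a, r, m):,.0f}")
print(f"total production cost estimate: ${total:,.0f}")
```

Because each line item is a simple product, the same roll-up maps directly onto a spreadsheet, which is the implementation route the paper emphasizes.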

  18. Cost analysis of one of the first outpatient wound clinics in the Netherlands.

    PubMed

    Rondas, A A L M; Schols, J M G; Halfens, R J G; Hull, H R; Stobberingh, E E; Evers, S M A A

    2015-09-01

    To perform, from an insurance perspective, a cost analysis of one of the outpatient community wound care clinics in the Netherlands, the Knowledge Centre in Wound Care (KCWC) at Venray. This study involved a cost analysis based on an observational cohort study with a one-year pre-admission and a one-year post-admission comparison of costs. Patients were included when they first consulted the outpatient wound care clinic. Participants were all insured by the same health insurance company, Coöperatie Volksgezondheidszorg (VGZ). A standard six-step procedure for performing cost studies was used to calculate the costs. Given the skewed cost data, non-parametric bootstrapping was used to test for statistical differences. There were 172 patients included in this study. The difference in costs related to wound care between the year before and the year after initial admission to the wound clinic amounted to an average reduction of €2621 (£1873) per patient in the base case analysis. The categories 'general practitioner', 'hospital care', 'mental health care' and 'transport' scored lower, indicating lower costs, in the year after admission to the wound clinic. In this study, only the reimbursement data of patients of one health insurance company, and specifically only those made under the 2006 Dutch Health Insurance Act, were available. Because of the observational design, definitive conclusions cannot be made regarding a demonstrated reduction of costs in the year post admission. Nevertheless, this study is a first attempt of a cost analysis of an equipped outpatient wound clinic as an innovative way of responding to the increasing number of chronic wounds in the Netherlands. The calculations show that savings in wound care are possible. A possible conflict of interest should be mentioned. First author AALM Rondas, PhD student at Maastricht University, is working at the KCWC wound clinic at Venray in the Netherlands as a physician. 
However, the research data were provided externally by Coöperatie Volksgezondheidszorg (VGZ) and checked by the academic co-authors, none of whom have a conflict of interest. The authors have no financial or commercial interest to declare.

  19. Integrated versus nOn-integrated Peripheral inTravenous catheter. Which Is the most effective systeM for peripheral intravenoUs catheter Management? (The OPTIMUM study): a randomised controlled trial protocol.

    PubMed

    Castillo, Maria Isabel; Larsen, Emily; Cooke, Marie; Marsh, Nicole M; Wallis, Marianne C; Finucane, Julie; Brown, Peter; Mihala, Gabor; Carr, Peter J; Byrnes, Joshua; Walker, Rachel; Cable, Prudence; Zhang, Li; Sear, Candi; Jackson, Gavin; Rowsome, Anna; Ryan, Alison; Humphries, Julie C; Sivyer, Susan; Flanigan, Kathy; Rickard, Claire M

    2018-05-14

    Peripheral intravenous catheters (PIVCs) are frequently used in hospitals. However, PIVC complications are common, with failures leading to treatment delays, additional procedures, patient pain and discomfort, increased clinician workload and substantially increased healthcare costs. Recent evidence suggests integrated PIVC systems may be more effective than traditional non-integrated PIVC systems in reducing phlebitis, infiltration and costs and increasing functional dwell time. The study aim is to determine the efficacy, cost-utility and acceptability to patients and professionals of an integrated PIVC system compared with a non-integrated PIVC system. Two-arm, multicentre, randomised controlled superiority trial of integrated versus non-integrated PIVC systems to compare effectiveness on clinical and economic outcomes. Recruitment of 1560 patients over 2 years, with randomisation by a centralised service ensuring allocation concealment. Primary outcome: catheter failure (composite endpoint) for reasons of occlusion, infiltration/extravasation, phlebitis/thrombophlebitis, dislodgement, or localised or catheter-associated bloodstream infections. Secondary outcomes: first-time insertion success, types of PIVC failure, device colonisation, insertion pain, functional dwell time, adverse events, mortality, cost-utility and consumer acceptability. One PIVC per patient will be included, with intention-to-treat analysis. Baseline group comparisons will be made for potentially clinically important confounders. The proportional hazards assumption will be checked, and Cox regression will test the effect of group, patient, device and clinical variables on failure. An as-treated analysis will assess the effect of protocol violations. Kaplan-Meier survival curves with log-rank tests will compare failure by group over time. Secondary endpoints will be compared between groups using parametric/non-parametric techniques. 
Ethical approval from the Royal Brisbane and Women's Hospital Human Research Ethics Committee (HREC/16/QRBW/527), Griffith University Human Research Ethics Committee (Ref No. 2017/002) and the South Metropolitan Health Services Human Research Ethics Committee (Ref No. 2016-239). Results will be published in peer-reviewed journals. ACTRN12617000089336. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. A new simple form of quark mixing matrix

    NASA Astrophysics Data System (ADS)

    Qin, Nan; Ma, Bo-Qiang

    2011-01-01

    Although different parametrizations of the quark mixing matrix are mathematically equivalent, the consequences for experimental analysis may be distinct. Based on the triminimal expansion of the Kobayashi-Maskawa matrix around the unit matrix, we propose a new simple parametrization. Compared with the Wolfenstein parametrization, we find that the new form is not only consistent with the original one in the hierarchical structure, but also more convenient for numerical analysis and measurement of the CP-violating phase. By discussing the relation between our new form and the unitarity boomerang, we point out that along with the unitarity boomerang, this new parametrization is useful in hunting for new physics.

  1. A study of performance and cost improvement potential of the 120 inch (3.05 m) diameter solid rocket motor. Volume 1: Summary report

    NASA Technical Reports Server (NTRS)

    Backlund, S. J.; Rossen, J. N.

    1971-01-01

    A parametric study of ballistic modifications to the 120 inch diameter solid propellant rocket engine, which forms part of the Air Force Titan 3 system, is presented. A total of 576 separate designs were defined, and 24 were selected for detailed analysis. Detailed design descriptions, ballistic performance, and mass property data were prepared for each design. It was determined that a relatively simple change in design parameters could provide a wide range of solid propellant rocket engine ballistic characteristics for future launch vehicle applications.

  2. Structural arrangement trade study. Volume 3: Reusable Hydrogen Composite Tank System (RHCTS) and Graphite Composite Primary Structures (GCPS). Addendum

    NASA Astrophysics Data System (ADS)

    1995-03-01

    This volume is the third of a 3 volume set that addresses the structural trade study plan that will identify the most suitable structural configuration for an SSTO winged vehicle capable of delivering 25,000 lbs to a 220 nm circular orbit at 51.6 deg inclination. The most suitable Reusable Hydrogen Composite Tank System (RHCTS) and Graphite Composite Primary Structures (GCPS) composite materials for intertank, wing and thrust structures are identified. Vehicle resizing charts, selection criteria and back-up charts, the parametric costing approach and the finite element method analysis are discussed.

  3. Accelerated stress testing of terrestrial solar cells

    NASA Technical Reports Server (NTRS)

    Prince, J. L.; Lathrop, J. W.

    1979-01-01

    A program to investigate the reliability characteristics of unencapsulated low-cost terrestrial solar cells using accelerated stress testing is described. Reliability (or parametric degradation) factors appropriate to the cell technologies and use conditions were studied, and a series of accelerated stress tests was synthesized. An electrical measurement procedure and a data analysis and management system were derived, and stress test fixturing and material flow procedures were set up after consideration was given to the number of cells to be stress tested and measured and the nature of the information to be obtained from the process. Selected results and conclusions are presented.

  4. Structural arrangement trade study. Volume 3: Reusable Hydrogen Composite Tank System (RHCTS) and Graphite Composite Primary Structures (GCPS). Addendum

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This volume is the third of a three-volume set that addresses the structural trade study plan that will identify the most suitable structural configuration for an SSTO winged vehicle capable of delivering 25,000 lbs to a 220 nm circular orbit at 51.6 deg inclination. The most suitable Reusable Hydrogen Composite Tank System (RHCTS) and Graphite Composite Primary Structures (GCPS) composite materials for intertank, wing and thrust structures are identified. Vehicle resizing charts, selection criteria and back-up charts, the parametric costing approach, and the finite element method analysis are discussed.

  5. NASA's Human Mission to a Near-Earth Asteroid: Landing on a Moving Target

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Lincoln, William P.; Weisbin, Charles R.

    2011-01-01

    This paper describes a Bayesian approach for comparing the productivity and cost-risk tradeoffs of sending versus not sending one or more robotic surveyor missions prior to a human mission to land on an asteroid. The expected value of sample information based on productivity combined with parametric variations in the prior probability an asteroid might be found suitable for landing were used to assess the optimal number of spacecraft and asteroids to survey. The analysis supports the value of surveyor missions to asteroids and indicates one launch with two spacecraft going simultaneously to two independent asteroids appears optimal.

  6. Robust Machine Learning Variable Importance Analyses of Medical Conditions for Health Care Spending.

    PubMed

    Rose, Sherri

    2018-03-11

    This study proposes nonparametric double robust machine learning for variable importance analyses of medical conditions for health spending, using the 2011-2012 Truven MarketScan database. I evaluate how much more, on average, commercially insured enrollees with each of 26 of the most prevalent medical conditions cost per year after controlling for demographics and other medical conditions. This is accomplished within the nonparametric targeted learning framework, which incorporates ensemble machine learning. Previous literature studying the impact of medical conditions on health care spending has almost exclusively focused on parametric risk adjustment; thus, I compare my approach to parametric regression. My results demonstrate that multiple sclerosis, congestive heart failure, severe cancers, major depression and bipolar disorders, and chronic hepatitis are the most costly medical conditions on average per individual. These findings differed from those obtained using parametric regression. The literature may be underestimating the spending contributions of several medical conditions, which is a potentially critical oversight. If current methods are not capturing the true incremental effect of medical conditions, undesirable incentives related to care may remain. Further work is needed to directly study these issues in the context of federal formulas. © Health Research and Educational Trust.

  7. Impact of public electric vehicle charging infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levinson, Rebecca S.; West, Todd H.

    Our work uses market analysis and simulation to explore the potential of public charging infrastructure to spur US battery electric vehicle (BEV) sales, increase national electrified mileage, and lower greenhouse gas (GHG) emissions. By employing both scenario and parametric analysis for policy-driven injection of public charging stations we find the following: (1) For large deployments of public chargers, DC fast chargers are more effective than level 2 chargers at increasing BEV sales, increasing electrified mileage, and lowering GHG emissions, even if only one DC fast charging station can be built for every ten level 2 charging stations. (2) A national initiative to build DC fast charging infrastructure will see diminishing returns on investment at approximately 30,000 stations. (3) Some infrastructure deployment costs can be defrayed by passing them back to electric vehicle consumers, but once those costs to the consumer reach the equivalent of approximately 12¢/kWh for all miles driven, almost all gains to BEV sales and GHG emissions reductions from infrastructure construction are lost.

  8. Impact of public electric vehicle charging infrastructure

    DOE PAGES

    Levinson, Rebecca S.; West, Todd H.

    2017-10-16

    Our work uses market analysis and simulation to explore the potential of public charging infrastructure to spur US battery electric vehicle (BEV) sales, increase national electrified mileage, and lower greenhouse gas (GHG) emissions. By employing both scenario and parametric analysis for policy-driven injection of public charging stations we find the following: (1) For large deployments of public chargers, DC fast chargers are more effective than level 2 chargers at increasing BEV sales, increasing electrified mileage, and lowering GHG emissions, even if only one DC fast charging station can be built for every ten level 2 charging stations. (2) A national initiative to build DC fast charging infrastructure will see diminishing returns on investment at approximately 30,000 stations. (3) Some infrastructure deployment costs can be defrayed by passing them back to electric vehicle consumers, but once those costs to the consumer reach the equivalent of approximately 12¢/kWh for all miles driven, almost all gains to BEV sales and GHG emissions reductions from infrastructure construction are lost.

  9. Technical and Economical Feasibility of SSTO and TSTO Launch Vehicles

    NASA Astrophysics Data System (ADS)

    Lerch, Jens

    This paper discusses whether it is more cost effective to launch to low Earth orbit in one or two stages, assuming current or near-future technologies. First the paper provides an overview of the current state of the launch market and the hurdles to introducing new launch vehicles capable of significantly lowering the cost of access to space, and discusses possible routes to solving those problems. It is assumed that reducing the complexity of launchers by reducing the number of stages and engines, and introducing reusability, will result in lower launch costs. A number of operational and historic launch vehicle stages capable of near single stage to orbit (SSTO) performance are presented, and the steps necessary to modify them into an expendable SSTO launcher and an optimized two stage to orbit (TSTO) launcher are shown through parametric analysis. A ballistic reentry and recovery system is then added to show that reusable SSTO and TSTO vehicles are also within the current state of the art. The development and recurring costs of the SSTO and TSTO systems are estimated and compared. This analysis shows whether it is more economical to develop and operate expendable or reusable SSTO or TSTO systems under different assumptions for launch rate and initial investment.

  10. Exercise-based cardiac rehabilitation after heart valve surgery: cost analysis of healthcare use and sick leave.

    PubMed

    Hansen, T B; Zwisler, A D; Berg, S K; Sibilitz, K L; Thygesen, L C; Doherty, P; Søgaard, R

    2015-01-01

    In Denmark, owing to a lack of evidence, patients undergoing heart valve surgery have since 2009 been offered exercise-based cardiac rehabilitation (CR) based on recommendations for patients with ischaemic heart disease. The aim of this study was to investigate the impact of CR on the costs of healthcare use and sick leave among heart valve surgery patients over the 12 months post surgery. We conducted a nationwide survey on the CR participation of all patients having undergone valve surgery between 1 January 2011 and 30 June 2011 (n=667). Among the responders (n=500, 75%), the resource use categories of primary and secondary healthcare, prescription medication and sick leave were analysed for CR participants (n=277) and non-participants (n=223) over 12 months. A difference-in-difference analysis was undertaken. All estimates were presented as the means per patient (95% CI) based on non-parametric bootstrapping of SEs. Total costs during the 12 months following surgery were €16 065 per patient (95% CI €13 730 to €18 399) in the CR group and €15 182 (€12 695 to €17 670) in the non-CR group. CR led to 5.6 (2.9 to 8.3, p<0.01) more outpatient visits per patient. No statistically significant differences in other cost categories or in total costs (€1330; -€4427 to €7086, p=0.65) were found between the groups. CR, as provided in Denmark, can be considered cost neutral. CR is associated with more outpatient visits, but CR participation potentially offsets more expensive outpatient visits. Further studies should investigate the benefits of CR to heart valve surgery patients as part of a formal cost-utility analysis.
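
    The difference-in-difference estimate used above can be sketched in a few lines: the treated group's change in mean cost minus the control group's change. All figures below are hypothetical illustrations, not the study's estimates:

```python
# Difference-in-difference sketch for a cost effect: the treated group's
# change in mean cost minus the control group's change. All figures are
# hypothetical, not the study's data.

def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """DiD estimate: (treated change) - (control change)."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical mean annual costs per patient (EUR)
effect = diff_in_diff(treat_pre=14000, treat_post=16065,
                      ctrl_pre=14500, ctrl_post=15182)
print(effect)  # 1383
```

    In practice the standard errors of such estimates are obtained by non-parametric bootstrapping over patients, as the abstract notes.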

  11. Guidance, navigation, and control trades for an Electric Orbit Transfer Vehicle

    NASA Astrophysics Data System (ADS)

    Zondervan, K. P.; Bauer, T. A.; Jenkin, A. B.; Metzler, R. A.; Shieh, R. A.

    The USAF Space Division initiated the Electric Insertion Transfer Experiment (ELITE) in the fall of 1988. The ELITE space mission is planned for the mid-1990s and will demonstrate technological readiness for the development of operational solar-powered electric orbit transfer vehicles (EOTVs). To minimize the cost of ground operations, autonomous flight is desirable; thus, the guidance, navigation, and control (GNC) functions of an EOTV should reside on board. In order to define GNC requirements for ELITE, parametric trades must be performed for an operational solar-powered EOTV so that a clearer understanding of the performance aspects is obtained. Parametric trades for the GNC subsystems have provided insight into the relationship between pointing accuracy, transfer time, and propellant utilization. Additional trades need to be performed, taking into account weight, cost, and degree of autonomy.

  12. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    PubMed

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
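
    The 1D bootstrap idea can be illustrated with a minimal sketch: resample whole trajectories (subjects) with replacement and build a simultaneous confidence band from the bootstrap distribution of the maximum deviation from the sample mean. The data here are synthetic, and this is one common construction of a 1D band, not necessarily the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: 20 subjects, each a 101-point trajectory
# (e.g. force as a function of % stance)
n_subj, n_time = 20, 101
t = np.linspace(0, 1, n_time)
data = np.sin(2 * np.pi * t) + rng.normal(0, 0.3, (n_subj, n_time))

def bootstrap_band_1d(data, n_boot=1000, alpha=0.05, rng=None):
    """Simultaneous (1D) non-parametric bootstrap confidence band for the
    mean trajectory: resample whole trajectories, record each replicate's
    maximum deviation from the sample mean, and widen the band by the
    (1 - alpha) quantile of those maxima."""
    rng = rng if rng is not None else np.random.default_rng()
    n = data.shape[0]
    mean = data.mean(axis=0)
    max_dev = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)  # resample subjects with replacement
        max_dev[b] = np.abs(data[idx].mean(axis=0) - mean).max()
    h = np.quantile(max_dev, 1 - alpha)
    return mean - h, mean + h

lo, hi = bootstrap_band_1d(data, rng=rng)
```

    A pointwise (0D) CI computed independently at each time node would be narrower but, as the paper argues, biased for inference about the whole trajectory.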

  13. Data processing workflows from low-cost digital survey to various applications: three case studies of Chinese historic architecture

    NASA Astrophysics Data System (ADS)

    Sun, Z.; Cao, Y. K.

    2015-08-01

    The paper focuses on the versatility of data processing workflows ranging from BIM-based survey to structural analysis and reverse modeling. In China today, a large number of historic buildings are in need of restoration, reinforcement and renovation, but architects are not prepared for the shift from the booming AEC industry to architectural preservation. As surveyors working with architects on such projects, we have to develop an efficient low-cost digital survey workflow that is robust to various types of architecture, and to process the captured data for architects. Although laser scanning yields high accuracy in architectural heritage documentation and its workflow is quite straightforward, cost and portability hinder it from being used in projects where budget and efficiency are of prime concern. We therefore integrate Structure from Motion techniques with UAV and total station in data acquisition. The captured data are processed for various purposes, illustrated with three case studies: the first is an as-built BIM for a historic building based on point clouds registered to Ground Control Points; the second concerns structural analysis of a damaged bridge using Finite Element Analysis software; the last relates to parametric automated feature extraction from captured point clouds for reverse modeling and fabrication.

  14. The cost-effectiveness of a patient centred pressure ulcer prevention care bundle: Findings from the INTACT cluster randomised trial.

    PubMed

    Whitty, Jennifer A; McInnes, Elizabeth; Bucknall, Tracey; Webster, Joan; Gillespie, Brigid M; Banks, Merrilyn; Thalib, Lukman; Wallis, Marianne; Cumsille, Jose; Roberts, Shelley; Chaboyer, Wendy

    2017-10-01

    Pressure ulcers are serious, avoidable, costly and common adverse outcomes of healthcare. The aim was to evaluate the cost-effectiveness of a patient-centred pressure ulcer prevention care bundle compared to standard care. Cost-effectiveness and cost-benefit analyses of pressure ulcer prevention were performed from the health system perspective using data collected alongside a cluster-randomised trial in eight tertiary hospitals in Australia. Adult patients received either a patient-centred pressure ulcer prevention care bundle (n=799) or standard care (n=799). Direct costs related to the intervention and preventative strategies were collected from trial data and supplemented by micro-costing data on patient turning and skin care from a 4-week substudy (n=317). The time horizon for the economic evaluation matched the trial duration, with the endpoint being diagnosis of a new pressure ulcer, hospital discharge/transfer or 28 days, whichever occurred first. For the cost-effectiveness analysis, the primary outcome was the incremental cost of prevention per additional hospital-acquired pressure ulcer case avoided, estimated using a two-stage cluster-adjusted non-parametric bootstrap method. The cost-benefit analysis estimated net monetary benefit, which considered both the costs of prevention and any difference in length of stay. All costs are reported in AU$ (2015). The care bundle cost AU$144.91 (95% CI: $74.96 to $246.08) more per patient than standard care. The largest contributors to cost were clinical nurse time for repositioning and skin inspection. In the cost-effectiveness analysis, the care bundle was estimated to cost an additional $3296 (95% CI: dominant to $144,525) per pressure ulcer avoided. This estimate is highly uncertain. Length of stay was unexpectedly higher in the care bundle group.
In a cost-benefit analysis which considered length of stay, the net monetary benefit for the care bundle was estimated to be -$2320 (95% CI: -$3900 to -$1175) per patient, suggesting the care bundle was not a cost-effective use of resources. A pressure ulcer prevention care bundle consisting of multicomponent nurse training and patient education may promote best practice nursing care but may not be cost-effective in preventing hospital-acquired pressure ulcers. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Climate change and vector-borne diseases: an economic impact analysis of malaria in Africa.

    PubMed

    Egbendewe-Mondzozo, Aklesso; Musumba, Mark; McCarl, Bruce A; Wu, Ximing

    2011-03-01

    A semi-parametric econometric model is used to study the relationship between malaria cases and climatic factors in 25 African countries. Results show that a marginal change in temperature and precipitation levels would lead to a significant change in the number of malaria cases for most countries by the end of the century. Consistent with existing biophysical malaria model results, the projected effects of climate change are mixed: our model projects that some countries will see an increase in malaria cases while others will see a decrease. We estimate projected malaria inpatient and outpatient treatment costs as a proportion of annual 2000 health expenditures per 1,000 people, and find that even under a minimal climate change scenario, some countries may see their inpatient treatment costs of malaria increase by more than 20%.

  16. Modeling integrated water user decisions in intermittent supply systems

    NASA Astrophysics Data System (ADS)

    Rosenberg, David E.; Tarawneh, Tarek; Abdel-Khaleq, Rania; Lund, Jay R.

    2007-07-01

    We apply systems analysis to estimate household water use in an intermittent supply system considering numerous interdependent water user behaviors. Some 39 household actions include conservation; improving local storage or water quality; and accessing sources having variable costs, availabilities, reliabilities, and qualities. A stochastic optimization program with recourse decisions identifies the infrastructure investments and short-term coping actions a customer can adopt to cost-effectively respond to a probability distribution of piped water availability. Monte Carlo simulations show effects for a population of customers. Model calibration reproduces the distribution of billed residential water use in Amman, Jordan. Parametric analyses suggest economic and demand responses to increased availability and alternative pricing. They also suggest potential market penetration for conservation actions, associated water savings, and subsidies to entice further adoption. We discuss new insights to size, target, and finance conservation.
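
    The two-stage "investment plus recourse" logic described above can be sketched for a single decision, here a hypothetical storage-tank sizing problem with three piped-supply scenarios. All prices, demand figures and probabilities are invented for illustration, not taken from the Amman model:

```python
# Two-stage decision sketch: pick a storage tank size (first-stage
# investment) before the week's piped supply is known; any shortfall is
# met with tanker water (the recourse action). All values hypothetical.

TANK_COST_PER_M3 = 2.0   # amortized weekly capital cost per m3 of storage
TANKER_PRICE = 4.0       # price per m3 of vendor (recourse) water
DEMAND = 10.0            # weekly household demand, m3

# (piped supply delivered to the roof tank in m3, probability)
SCENARIOS = [(10.0, 0.3), (6.0, 0.5), (2.0, 0.2)]

def expected_cost(tank_m3):
    """Capital cost plus expected recourse cost for a given tank size."""
    cost = TANK_COST_PER_M3 * tank_m3
    for supply, prob in SCENARIOS:
        usable = min(supply, tank_m3)          # can store at most the tank size
        shortfall = max(0.0, DEMAND - usable)  # covered by tanker purchases
        cost += prob * TANKER_PRICE * shortfall
    return cost

# Enumerate candidate tank sizes and keep the cheapest in expectation
best_size = min(range(0, 13), key=expected_cost)
print(best_size)  # 6
```

    The full model in the paper optimizes over 39 interdependent actions; the point here is only the structure of a first-stage investment evaluated against a probability distribution of supply with recourse costs.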

  17. A study on technical efficiency of a DMU (review of literature)

    NASA Astrophysics Data System (ADS)

    Venkateswarlu, B.; Mahaboob, B.; Subbarami Reddy, C.; Sankar, J. Ravi

    2017-11-01

    In this research paper the concept of the technical efficiency of a decision making unit (DMU), due to Farrell [1], is introduced, and measures of technical and cost efficiency are derived. Timmer's [2] deterministic approach to estimating the Cobb-Douglas production frontier is presented, along with an extension of Timmer's method to any production frontier that is linear in its parameters. The estimation of the parameters of the Cobb-Douglas production frontier by a linear programming approach is also discussed. Mark et al. [3] proposed a non-parametric method to assess efficiency. Nuti et al. [4] investigated the relationships among technical efficiency scores, weighted per capita cost and overall performance. Gahe Zing Samuel Yank et al. [5] used data envelopment analysis to assess technical efficiency in banking sectors.

  18. Techno-economic assessment of polymer membrane systems for postcombustion carbon capture at coal-fired power plants.

    PubMed

    Zhai, Haibo; Rubin, Edward S

    2013-03-19

    This study investigates the feasibility of polymer membrane systems for postcombustion carbon dioxide (CO2) capture at coal-fired power plants. Using newly developed performance and cost models, our analysis shows that membrane systems configured with multiple stages or steps are capable of meeting capture targets of 90% CO2 removal efficiency and 95+% product purity. A combined driving force design using both compressors and vacuum pumps is most effective for reducing the cost of CO2 avoided. Further reductions in the overall system energy penalty and cost can be obtained by recycling a portion of CO2 via a two-stage, two-step membrane configuration with air sweep to increase the CO2 partial pressure of the feed flue gas. For a typical plant with carbon capture and storage, this yielded a 15% lower cost per metric ton of CO2 avoided compared to a plant using a current amine-based capture system. A series of parametric analyses also is undertaken to identify paths for enhancing the viability of membrane-based capture technology.

  19. Parametric Mass Modeling for Mars Entry, Descent and Landing System Analysis Study

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Komar, D. R.

    2011-01-01

    This paper provides an overview of the parametric mass models used for the Entry, Descent, and Landing Systems Analysis study conducted by NASA in FY2009-2010. The study examined eight unique exploration class architectures that included elements such as a rigid mid-L/D aeroshell, a lifting hypersonic inflatable decelerator, a drag supersonic inflatable decelerator, a lifting supersonic inflatable decelerator implemented with a skirt, and subsonic/supersonic retro-propulsion. Parametric models used in this study relate the component mass to vehicle dimensions and mission key environmental parameters such as maximum deceleration and total heat load. The use of a parametric mass model allows the simultaneous optimization of trajectory and mass sizing parameters.

  20. Cost-effective conservation of an endangered frog under uncertainty.

    PubMed

    Rose, Lucy E; Heard, Geoffrey W; Chee, Yung En; Wintle, Brendan A

    2016-04-01

    How should managers choose among conservation options when resources are scarce and there is uncertainty regarding the effectiveness of actions? Well-developed tools exist for prioritizing areas for one-time and binary actions (e.g., protect vs. not protect), but methods for prioritizing incremental or ongoing actions (such as habitat creation and maintenance) remain uncommon. We devised an approach that combines metapopulation viability and cost-effectiveness analyses to select among alternative conservation actions while accounting for uncertainty. In our study, cost-effectiveness is the ratio between the benefit of an action and its economic cost, where benefit is the change in metapopulation viability. We applied the approach to the case of the endangered growling grass frog (Litoria raniformis), which is threatened by urban development. We extended a Bayesian model to predict metapopulation viability under 9 urbanization and management scenarios and incorporated the full probability distribution of possible outcomes for each scenario into the cost-effectiveness analysis. This allowed us to discern between cost-effective alternatives that were robust to uncertainty and those with a relatively high risk of failure. We found a relatively high risk of extinction following urbanization if the only action was reservation of core habitat; habitat creation actions performed better than enhancement actions; and cost-effectiveness rankings changed depending on the consideration of uncertainty. Our results suggest that creation and maintenance of wetlands dedicated to L. raniformis is the only cost-effective action likely to result in a sufficiently low risk of extinction. To our knowledge, this is the first study to use Bayesian metapopulation viability analysis to explicitly incorporate parametric and demographic uncertainty into a cost-effectiveness evaluation of conservation actions. 
The approach offers guidance to decision makers aiming to achieve cost-effective conservation under uncertainty. © 2015 Society for Conservation Biology.
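
    The core calculation, a cost-effectiveness ratio with uncertainty in the benefit propagated by Monte Carlo, can be sketched as follows. The action names, costs and benefit distributions below are hypothetical stand-ins, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy comparison of two conservation actions by cost-effectiveness
# (benefit per dollar), with uncertainty in the benefit -- here a change
# in metapopulation persistence probability -- propagated by Monte Carlo.
# Costs and benefit distributions are invented for illustration.

actions = {
    # name: (annual cost in $, benefit mean, benefit standard deviation)
    "create_wetlands":  (120_000, 0.25, 0.05),
    "enhance_existing": (60_000, 0.10, 0.06),
}

n_draws = 10_000
summary = {}
for name, (cost, mu, sd) in actions.items():
    # benefits are probabilities, so clip draws to [0, 1]
    benefit = np.clip(rng.normal(mu, sd, n_draws), 0.0, 1.0)
    ce = benefit / cost  # benefit per dollar
    summary[name] = (ce.mean(), np.quantile(ce, [0.05, 0.95]))

best_action = max(summary, key=lambda k: summary[k][0])
print(best_action)  # create_wetlands
```

    Keeping the full distribution of the ratio, rather than only its mean, is what lets the ranking distinguish robust options from those with a high risk of failure, as the abstract emphasizes.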

  1. Integrated propulsion for near-Earth space missions. Volume 2: Technical

    NASA Technical Reports Server (NTRS)

    Dailey, C. L.; Meissinger, H. F.; Lovberg, R. H.; Zafran, S.

    1981-01-01

    The calculation approach is described for parametric analysis of candidate electric propulsion systems employed in LEO to GEO missions. Occultation relations, atmospheric density effects, and natural radiation effects are presented. A solar cell cover glass tradeoff is performed to determine the optimum glass thickness. Solar array and spacecraft pointing strategies are described for low altitude flight and for optimum array illumination during ascent. Mass ratio tradeoffs versus transfer time provide direction for thruster technology improvements. Integrated electric propulsion analysis is performed for orbit boosting, inclination change, attitude control, stationkeeping, repositioning, and disposal functions as well as power sharing with the payload on orbit. Comparison with chemical auxiliary propulsion is made to quantify the advantages of integrated propulsion in terms of weight savings and concomitant launch cost savings.

  2. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performance of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations of equipment depreciation with the estimated cost of an analytical quality control (AQC) unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques vs. HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other, invasive, analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. NASA Air Force Cost Model (NAFCOM): Capabilities and Results

    NASA Technical Reports Server (NTRS)

    McAfee, Julie; Culver, George; Naderi, Mahmoud

    2011-01-01

    NAFCOM is a parametric cost estimating tool for space hardware. It uses cost estimating relationships (CERs), which correlate historical costs with mission characteristics, to predict new project costs, and it is based on historical NASA and Air Force space projects. It is intended for use in the very early phases of a development project. NAFCOM can be used at the subsystem or component level and estimates development and production costs. It is applicable to various types of missions (crewed spacecraft, uncrewed spacecraft, and launch vehicles). There are two versions of the model: a restricted government version and a contractor-releasable version.
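
    A typical CER of the kind NAFCOM embodies is a power law, cost = a·mass^b, fit to historical data in log-log space. The sketch below uses invented masses and costs, not NAFCOM's CERs or its historical database:

```python
import numpy as np

# Power-law cost estimating relationship (CER): cost = a * mass**b,
# fit by least squares in log-log space. The "historical" masses and
# costs below are invented for illustration.

mass = np.array([150.0, 400.0, 900.0, 2200.0, 5000.0])  # component mass, kg
cost = np.array([12.0, 25.0, 48.0, 95.0, 180.0])        # development cost, $M

b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
a = np.exp(log_a)

def cer(m):
    """Predicted development cost ($M) for a component of mass m (kg)."""
    return a * m ** b

estimate = cer(1500.0)
```

    An exponent b below 1, as in this toy fit, encodes the economy of scale often seen in hardware cost data: cost grows more slowly than mass.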

  4. Pretest uncertainty analysis for chemical rocket engine tests

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1987-01-01

    A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.

  5. A Semi-parametric Transformation Frailty Model for Semi-competing Risks Survival Data

    PubMed Central

    Jiang, Fei; Haneuse, Sebastien

    2016-01-01

    In the analysis of semi-competing risks data interest lies in estimation and inference with respect to a so-called non-terminal event, the observation of which is subject to a terminal event. Multi-state models are commonly used to analyse such data, with covariate effects on the transition/intensity functions typically specified via the Cox model and dependence between the non-terminal and terminal events specified, in part, by a unit-specific shared frailty term. To ensure identifiability, the frailties are typically assumed to arise from a parametric distribution, specifically a Gamma distribution with mean 1.0 and variance, say, σ2. When the frailty distribution is misspecified, however, the resulting estimator is not guaranteed to be consistent, with the extent of asymptotic bias depending on the discrepancy between the assumed and true frailty distributions. In this paper, we propose a novel class of transformation models for semi-competing risks analysis that permit the non-parametric specification of the frailty distribution. To ensure identifiability, the class restricts to parametric specifications of the transformation and the error distribution; the latter are flexible, however, and cover a broad range of possible specifications. We also derive the semi-parametric efficient score under the complete data setting and propose a non-parametric score imputation method to handle right censoring; consistency and asymptotic normality of the resulting estimators is derived and small-sample operating characteristics evaluated via simulation. Although the proposed semi-parametric transformation model and non-parametric score imputation method are motivated by the analysis of semi-competing risks data, they are broadly applicable to any analysis of multivariate time-to-event outcomes in which a unit-specific shared frailty is used to account for correlation. 
Finally, the proposed model and estimation procedures are applied to a study of hospital readmission among patients diagnosed with pancreatic cancer. PMID:28439147

  6. The 25 kW power module evolution study. Part 3: Conceptual designs for power module evolutions. Volume 3: Cost estimates

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Cost data generated for the selected evolutionary power module concepts are reported. The initial acquisition costs (design, development, and protoflight unit test costs) were defined and modeled for the baseline 25 kW power module configurations. A parametric model of this initial building block was then used to derive the costs of the 50 kW and 100 kW power modules by defining only their configuration and programmatic differences from the 25 kW baseline module. Variations in cost for the quantities needed to fulfill the mission scenarios were derived by applying appropriate learning curves.
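
    The learning-curve adjustment mentioned above has a standard closed form: on an s% curve, each doubling of cumulative quantity multiplies the unit cost by s/100, so unit n costs T1·n^b with b = log2(s/100). A sketch with hypothetical values (the study's actual curves are not given here):

```python
import math

# Learning-curve cost model: on an s% curve, each doubling of cumulative
# quantity multiplies the unit cost by s/100, so the n-th unit costs
# t1 * n**b with b = log2(s/100). Values below are hypothetical.

def unit_cost(t1, n, slope=0.85):
    """Cost of the n-th production unit, given first-unit cost t1."""
    b = math.log2(slope)
    return t1 * n ** b

def lot_cost(t1, quantity, slope=0.85):
    """Total cost of the first `quantity` units."""
    return sum(unit_cost(t1, n, slope) for n in range(1, quantity + 1))

print(round(unit_cost(100.0, 2), 6))  # 85.0 -- unit 2 costs 85% of unit 1
```

    Summing unit costs over a mission-scenario quantity, as in `lot_cost`, is how the quantity-dependent cost variations described above are obtained.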

  7. Sensitivity analysis for parametric generalized implicit quasi-variational-like inclusions involving P-[eta]-accretive mappings

    NASA Astrophysics Data System (ADS)

    Kazmi, K. R.; Khan, F. A.

    2008-01-01

    In this paper, using proximal-point mapping technique of P-[eta]-accretive mapping and the property of the fixed-point set of set-valued contractive mappings, we study the behavior and sensitivity analysis of the solution set of a parametric generalized implicit quasi-variational-like inclusion involving P-[eta]-accretive mapping in real uniformly smooth Banach space. Further, under suitable conditions, we discuss the Lipschitz continuity of the solution set with respect to the parameter. The technique and results presented in this paper can be viewed as an extension of the techniques and corresponding results given in [R.P. Agarwal, Y.-J. Cho, N.-J. Huang, Sensitivity analysis for strongly nonlinear quasi-variational inclusions, Appl. Math. Lett. 13 (2002) 19-24; S. Dafermos, Sensitivity analysis in variational inequalities, Math. Oper. Res. 13 (1988) 421-434; X.-P. Ding, Sensitivity analysis for generalized nonlinear implicit quasi-variational inclusions, Appl. Math. Lett. 17 (2) (2004) 225-235; X.-P. Ding, Parametric completely generalized mixed implicit quasi-variational inclusions involving h-maximal monotone mappings, J. Comput. Appl. Math. 182 (2) (2005) 252-269; X.-P. Ding, C.L. Luo, On parametric generalized quasi-variational inequalities, J. Optim. Theory Appl. 100 (1999) 195-205; Z. Liu, L. Debnath, S.M. Kang, J.S. Ume, Sensitivity analysis for parametric completely generalized nonlinear implicit quasi-variational inclusions, J. Math. Anal. Appl. 277 (1) (2003) 142-154; R.N. Mukherjee, H.L. Verma, Sensitivity analysis of generalized variational inequalities, J. Math. Anal. Appl. 167 (1992) 299-304; M.A. Noor, Sensitivity analysis framework for general quasi-variational inclusions, Comput. Math. Appl. 44 (2002) 1175-1181; M.A. Noor, Sensitivity analysis for quasivariational inclusions, J. Math. Anal. Appl. 236 (1999) 290-299; J.Y. Park, J.U. Jeong, Parametric generalized mixed variational inequalities, Appl. Math. Lett. 17 (2004) 43-48].

  8. An EM-based semi-parametric mixture model approach to the regression analysis of competing-risks data.

    PubMed

    Ng, S K; McLachlan, G J

    2003-04-15

    We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is by maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright 2003 John Wiley & Sons, Ltd.
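    The EM machinery underlying the authors' ECM algorithm can be illustrated on a much simpler toy problem: fitting a two-component fully parametric (exponential) mixture by alternating posterior component probabilities (E-step) with weighted maximum-likelihood updates (M-step). This is a hedged sketch of the general idea only; the paper's semi-parametric ECM additionally handles covariates, censoring, and unspecified baseline hazards.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Simulate failure times from a 60/40 mixture of Exp(rate=2) and Exp(rate=0.3)
    n = 4000
    comp = rng.random(n) < 0.6
    t = np.where(comp, rng.exponential(1 / 2.0, n), rng.exponential(1 / 0.3, n))

    pi, r1, r2 = 0.5, 1.0, 0.1          # initial guesses
    for _ in range(200):
        # E-step: posterior probability each observation came from component 1
        d1 = pi * r1 * np.exp(-r1 * t)
        d2 = (1 - pi) * r2 * np.exp(-r2 * t)
        w = d1 / (d1 + d2)
        # M-step: weighted MLEs for the exponential rates and mixing proportion
        pi = w.mean()
        r1 = w.sum() / (w * t).sum()
        r2 = (1 - w).sum() / ((1 - w) * t).sum()

    print(round(pi, 2), round(r1, 1), round(r2, 2))   # near the true 0.6, 2, 0.3
    ```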

  9. Numerical modeling and model updating for smart laminated structures with viscoelastic damping

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Zhan, Zhenfei; Liu, Xu; Wang, Pan

    2018-07-01

    This paper presents a numerical modeling method combined with model updating techniques for the analysis of smart laminated structures with viscoelastic damping. Starting with finite element formulation, the dynamics model with piezoelectric actuators is derived based on the constitutive law of the multilayer plate structure. The frequency-dependent characteristics of the viscoelastic core are represented utilizing the anelastic displacement fields (ADF) parametric model in the time domain. The analytical model is validated experimentally and used to analyze the influencing factors of kinetic parameters under parametric variations. Emphasis is placed upon model updating for smart laminated structures to improve the accuracy of the numerical model. Key design variables are selected through the smoothing spline ANOVA statistical technique to mitigate the computational cost. This updating strategy not only corrects the natural frequencies but also improves the accuracy of damping prediction. The effectiveness of the approach is examined through an application problem of a smart laminated plate. It is shown that a good consistency can be achieved between updated results and measurements. The proposed method is computationally efficient.

  10. Parametric analysis of parameters for electrical-load forecasting using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Gerber, William J.; Gonzalez, Avelino J.; Georgiopoulos, Michael

    1997-04-01

    Accurate total system electrical load forecasting is a necessary part of resource management for power generation companies. The better the hourly load forecast, the more closely the power generation assets of the company can be configured to minimize the cost. Automating this process is a profitable goal and neural networks should provide an excellent means of doing the automation. However, prior to developing such a system, the optimal set of input parameters must be determined. The approach of this research was to determine what those inputs should be through a parametric study of potentially good inputs. Input parameters tested were ambient temperature, total electrical load, the day of the week, humidity, dew point temperature, daylight savings time, length of daylight, season, forecast light index and forecast wind velocity. For testing, a limited number of temperatures and total electrical loads were used as a basic reference input parameter set. Most parameters showed some forecasting improvement when added individually to the basic parameter set. Significantly, major improvements were exhibited with the day of the week, dew point temperatures, additional temperatures and loads, forecast light index and forecast wind velocity.
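    The study's central point, that adding an informative input (e.g., the day of the week) improves the forecast, can be illustrated even outside a neural-network setting with a plain least-squares fit on synthetic hourly loads. All coefficients below are made up for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    hours = np.arange(24 * 7 * 8)                     # 8 weeks of hourly data
    temp = 20 + 8 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)
    dow = (hours // 24) % 7                           # day of week, 0..6
    # Synthetic load: temperature-driven plus a weekday/weekend effect
    load = 500 + 10 * temp + np.where(dow < 5, 80, 0) + rng.normal(0, 5, hours.size)

    def rmse_of_fit(X, y):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return float(np.sqrt(np.mean((y - X @ beta) ** 2)))

    ones = np.ones_like(temp)
    base = np.column_stack([ones, temp])              # temperature only
    full = np.column_stack([ones, temp, (dow < 5).astype(float)])  # + day of week

    print(rmse_of_fit(base, load) > rmse_of_fit(full, load))   # → True
    ```

    The same logic, comparing forecast error with and without a candidate input, is what the parametric study above carries out with a neural network instead of a linear model.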

  11. Parametric Method Performance for Dynamic 3'-Deoxy-3'-18F-Fluorothymidine PET/CT in Epidermal Growth Factor Receptor-Mutated Non-Small Cell Lung Carcinoma Patients Before and During Therapy.

    PubMed

    Kramer, Gerbrand Maria; Frings, Virginie; Heijtel, Dennis; Smit, E F; Hoekstra, Otto S; Boellaard, Ronald

    2017-06-01

    The objective of this study was to validate several parametric methods for quantification of 3'-deoxy-3'-18F-fluorothymidine (18F-FLT) PET in advanced-stage non-small cell lung carcinoma (NSCLC) patients with an activating epidermal growth factor receptor mutation who were treated with gefitinib or erlotinib. Furthermore, we evaluated the impact of noise on the accuracy and precision of the parametric analyses of dynamic 18F-FLT PET/CT to assess the robustness of these methods. Methods: Ten NSCLC patients underwent dynamic 18F-FLT PET/CT at baseline and 7 and 28 d after the start of treatment. Parametric images were generated using plasma input Logan graphic analysis and 2 basis function-based methods: a 2-tissue-compartment basis function model (BFM) and spectral analysis (SA). Whole-tumor-averaged parametric pharmacokinetic parameters were compared with those obtained by nonlinear regression of the tumor time-activity curve using a reversible 2-tissue-compartment model with blood volume fraction. In addition, 2 statistically equivalent datasets were generated by countwise splitting the original list-mode data, each containing 50% of the total counts. Both new datasets were reconstructed, and parametric pharmacokinetic parameters were compared between the 2 replicates and the original data. Results: After the settings of each parametric method were optimized, distribution volumes (VT) obtained with Logan graphic analysis, BFM, and SA all correlated well with those derived using nonlinear regression at baseline and during therapy (R² ≥ 0.94; intraclass correlation coefficient > 0.97). SA-based VT images were most robust to increased noise on a voxel level (repeatability coefficient, 16% vs. >26%). Yet BFM generated the most accurate K1 values (R² = 0.94; intraclass correlation coefficient, 0.96). 
Parametric K1 data showed a larger variability in general; however, no differences were found in robustness between methods (repeatability coefficient, 80%-84%). Conclusion: Both BFM and SA can generate quantitatively accurate parametric 18F-FLT VT images in NSCLC patients before and during therapy. SA was more robust to noise, yet BFM provided more accurate parametric K1 data. We therefore recommend BFM as the preferred parametric method for analysis of dynamic 18F-FLT PET/CT studies; however, SA can also be used. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
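    Plasma-input Logan graphic analysis estimates the distribution volume VT as the slope of ∫CT dt / CT(t) plotted against ∫Cp dt / CT(t). For a one-tissue-compartment tracer that relationship is exact, which makes a convenient numerical check (synthetic curves with assumed rate constants, not patient data):

    ```python
    import numpy as np

    # One-tissue-compartment tracer: dCt/dt = K1*Cp - k2*Ct, with VT = K1/k2.
    K1, k2 = 0.3, 0.15                    # assumed rate constants (1/min)
    dt = 0.01
    t = np.arange(0, 60, dt)
    cp = 10 * np.exp(-0.1 * t) + 1.0      # synthetic plasma input curve

    ct = np.zeros_like(t)                 # Euler integration of the tissue curve
    for i in range(1, t.size):
        ct[i] = ct[i-1] + dt * (K1 * cp[i-1] - k2 * ct[i-1])

    # Logan plot: cumulative integrals ratioed by Ct; the slope gives VT
    int_cp = np.cumsum(cp) * dt
    int_ct = np.cumsum(ct) * dt
    x = int_cp[1000:] / ct[1000:]         # skip early frames (Ct near zero)
    y = int_ct[1000:] / ct[1000:]
    slope = np.polyfit(x, y, 1)[0]

    print(round(slope, 2))                # theory: VT = K1/k2 = 2.0
    ```

    For two-tissue tracers such as 18F-FLT the Logan plot only becomes linear at late times, which is why frame selection and noise handling matter in the comparison above.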

  12. Parametric analysis of ATM solar array.

    NASA Technical Reports Server (NTRS)

    Singh, B. K.; Adkisson, W. B.

    1973-01-01

    The paper discusses the methods used for the calculation of ATM solar array performance characteristics and provides the parametric analysis of solar panels used in SKYLAB. To predict the solar array performance under conditions other than test conditions, a mathematical model has been developed. Four computer programs have been used to convert the solar simulator test data to the parametric curves. The first performs module summations, the second determines average solar cell characteristics which will cause a mathematical model to generate a curve matching the test data, the third is a polynomial fit program which determines the polynomial equations for the solar cell characteristics versus temperature, and the fourth program uses the polynomial coefficients generated by the polynomial curve fit program to generate the parametric data.
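    The third program's polynomial fit of cell characteristics versus temperature corresponds to an ordinary least-squares polynomial regression; a minimal sketch with hypothetical short-circuit-current data (not SKYLAB test values):

    ```python
    import numpy as np

    # Hypothetical short-circuit-current measurements vs cell temperature (deg C)
    temp = np.array([-50.0, -25.0, 0.0, 25.0, 50.0, 75.0])
    isc  = np.array([0.92, 0.95, 0.98, 1.00, 1.03, 1.05])   # amps, illustrative

    coeffs = np.polyfit(temp, isc, 2)        # quadratic characteristic vs temp
    isc_at_60 = np.polyval(coeffs, 60.0)     # evaluate at an untested temperature
    print(round(float(isc_at_60), 2))
    ```

    The fourth program then evaluates such coefficient sets across the operating envelope to generate the parametric curves.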

  13. The parametric resonance—from LEGO Mindstorms to cold atoms

    NASA Astrophysics Data System (ADS)

    Kawalec, Tomasz; Sierant, Aleksandra

    2017-07-01

    We show an experimental setup based on a popular LEGO Mindstorms set, allowing us to both observe and investigate the parametric resonance phenomenon. The presented method is simple but covers a variety of student activities like embedded software development, conducting measurements, data collection and analysis. It may be used during science shows, as part of student projects and to illustrate the parametric resonance in mechanics or even quantum physics, during lectures or classes. The parametrically driven LEGO pendulum gains energy in a spectacular way, increasing its amplitude from 10° to about 100° within a few tens of seconds. We also provide a short description of a wireless absolute orientation sensor that may be used in quantitative analysis of driven or free pendulum movement.
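    The amplitude growth described above can be reproduced numerically from the small-angle equation of a parametrically driven pendulum (the Mathieu equation), pumped at twice the natural frequency. The parameters below are illustrative, not those of the LEGO setup:

    ```python
    import math

    # Small-angle parametrically driven pendulum (Mathieu form):
    #   theta'' + w0**2 * (1 + eps*cos(2*w0*t)) * theta = 0
    # Modulating the restoring force at twice the natural frequency pumps
    # energy into the swing, so the amplitude grows exponentially.
    w0, eps, dt = 2 * math.pi, 0.2, 1e-4
    theta, omega, t = 0.01, 0.0, 0.0        # start with a tiny 0.01 rad swing

    peak_early, peak_late = 0.0, 0.0
    while t < 10.0:                          # ten seconds of driving
        acc = -w0**2 * (1 + eps * math.cos(2 * w0 * t)) * theta
        omega += acc * dt                    # semi-implicit Euler (stable)
        theta += omega * dt
        t += dt
        if t < 1.0:
            peak_early = max(peak_early, abs(theta))
        elif t > 9.0:
            peak_late = max(peak_late, abs(theta))

    print(peak_late / peak_early > 5)       # amplitude grows many-fold → True
    ```

    The linearized model grows without bound; in the real pendulum the nonlinearity of sin(θ) detunes the resonance and caps the swing near the observed 100°.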

  14. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    PubMed

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray studies due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.
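    One standard form of the generalized correlation coefficient (due to Daniels) scores all pairs of observations and subsumes the Pearson and Kendall coefficients as special cases of the score function. The sketch below is a generic illustration of that form and may differ in detail from the statistic used in the paper:

    ```python
    import numpy as np

    def generalized_corr(x, y, score):
        """Daniels' generalized correlation: normalised sum of a_ij*b_ij, where
        a_ij = score(x_j - x_i) and b_ij = score(y_j - y_i)."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        a = score(x[None, :] - x[:, None])
        b = score(y[None, :] - y[:, None])
        return float((a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum()))

    x = [1.0, 2.0, 3.0, 4.0, 5.0]
    y = [1.2, 1.9, 3.4, 3.9, 5.1]

    pearson_like = generalized_corr(x, y, lambda d: d)   # identity score -> Pearson
    kendall_like = generalized_corr(x, y, np.sign)       # sign score -> Kendall tau
    print(round(pearson_like, 3), round(kendall_like, 3))
    ```

    Choosing other score functions gives rank-based statistics suited to nonlinear but monotone time-course trends, which is the flexibility the abstract appeals to.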

  15. Frequency Analysis Using Bootstrap Method and SIR Algorithm for Prevention of Natural Disasters

    NASA Astrophysics Data System (ADS)

    Kim, T.; Kim, Y. S.

    2017-12-01

    The frequency analysis of hydrometeorological data is one of the most important factors in responding to natural disaster damage and in setting design standards for disaster prevention facilities. Frequency analysis of hydrometeorological data typically assumes that the observations are statistically stationary, and a parametric method based on the parameters of a probability distribution is applied. A parametric method requires a sufficiently large sample of reliable data; in Korea, however, snowfall records need to be supplemented because the number of snowfall observation days and the mean maximum daily snowfall depth are decreasing due to climate change. In this study, we conducted a frequency analysis for snowfall using the Bootstrap method and the SIR algorithm, resampling methods that can overcome the problem of insufficient data. For the 58 meteorological stations distributed evenly in Korea, the probability snowfall depth was estimated by non-parametric frequency analysis using the maximum daily snowfall depth data. The results show that the probabilistic daily snowfall depth obtained by frequency analysis decreases at most stations, and the rate of change at most stations was found to be consistent between the parametric and non-parametric frequency analyses. This study shows that resampling methods permit frequency analysis of snowfall depth from insufficient observed samples, and the approach can be applied to the interpretation of other natural disasters with seasonal characteristics, such as summer typhoons. Acknowledgment: This research was supported by a grant (MPSS-NH-2015-79) from the Disaster Prediction and Mitigation Technology Development Program funded by the Korean Ministry of Public Safety and Security (MPSS).
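    A minimal version of the bootstrap frequency analysis: fit an extreme-value distribution (here Gumbel, a common choice for annual maxima) to annual-maximum snowfall depths, estimate the 100-year return level, and bootstrap the short record to quantify its uncertainty. The record below is synthetic:

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(0)
    # Hypothetical annual-maximum daily snowfall depths (cm), a short 25-year record
    snow = rng.gumbel(loc=30, scale=8, size=25)

    def return_level(sample, period=100):
        loc, scale = gumbel_r.fit(sample)
        return gumbel_r.ppf(1 - 1 / period, loc=loc, scale=scale)

    # Bootstrap: refit on resamples to quantify uncertainty from the small record
    levels = [return_level(rng.choice(snow, size=snow.size, replace=True))
              for _ in range(500)]
    lo, hi = np.percentile(levels, [2.5, 97.5])
    print(lo < return_level(snow) < hi)      # point estimate inside the 95% CI
    ```

    The width of the bootstrap interval makes explicit how much a short record limits design-standard estimates, which is the motivation for the resampling approach above.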

  16. Cost-effectiveness analysis of treatments for vertebral compression fractures.

    PubMed

    Edidin, Avram A; Ong, Kevin L; Lau, Edmund; Schmier, Jordana K; Kemner, Jason E; Kurtz, Steven M

    2012-07-01

    Vertebral compression fractures (VCFs) can be treated by nonsurgical management or by minimally invasive surgical treatment including vertebroplasty and balloon kyphoplasty. The purpose of the present study was to characterize the cost to Medicare for treating VCF-diagnosed patients by nonsurgical management, vertebroplasty, or kyphoplasty. We hypothesized that surgical treatments for VCFs using vertebroplasty or kyphoplasty would be a cost-effective alternative to nonsurgical management for the Medicare patient population. Cost per life-year gained for VCF patients in the US Medicare population was compared between operated (kyphoplasty and vertebroplasty) and non-operated patients and between kyphoplasty and vertebroplasty patients, all as a function of patient age and gender. Life expectancy was estimated using a parametric Weibull survival model (adjusted for comorbidities) for 858 978 VCF patients in the 100% Medicare dataset (2005-2008). Median payer costs were identified for each treatment group for up to 3 years following VCF diagnosis, based on 67 018 VCF patients in the 5% Medicare dataset (2005-2008). A discount rate of 3% was used for the base case in the cost-effectiveness analysis, with 0% and 5% discount rates used in sensitivity analyses. After accounting for the differences in median costs and using a discount rate of 3%, the cost per life-year gained for kyphoplasty and vertebroplasty patients ranged from $US1863 to $US6687 and from $US2452 to $US13 543, respectively, compared with non-operated patients. The cost per life-year gained for kyphoplasty compared with vertebroplasty ranged from -$US4878 (cost saving) to $US2763. Among patients for whom surgical treatment was indicated, kyphoplasty was found to be cost effective, and perhaps even cost saving, compared with vertebroplasty. Even for the oldest patients (85 years of age and older), both interventions would be considered cost effective in terms of cost per life-year gained.
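    The cost per life-year gained reported above is an incremental cost divided by discounted incremental life-years; a sketch with hypothetical figures (not the Medicare values) and the study's 3% base-case discount rate:

    ```python
    # Incremental cost-effectiveness ratio (ICER) with discounted life-years.
    # All figures below are hypothetical, not values from the study.
    def discounted_life_years(years, rate=0.03):
        # Sum each (possibly fractional) year discounted to present value
        full, frac = int(years), years - int(years)
        total = sum(1 / (1 + rate) ** t for t in range(full))
        return total + frac / (1 + rate) ** full

    def cost_per_life_year(extra_cost, ly_treated, ly_untreated, rate=0.03):
        gain = discounted_life_years(ly_treated, rate) - discounted_life_years(ly_untreated, rate)
        return extra_cost / gain

    # e.g. $9,000 extra payer cost buying 3.0 vs 1.5 expected life-years
    print(round(cost_per_life_year(9000.0, 3.0, 1.5)))   # → 6302
    ```

    The study's sensitivity analyses at 0% and 5% amount to rerunning this calculation with a different `rate`.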

  17. Civil Service Workforce Market Supply and the Effect on Cost Estimating Relationships (CERs) that May Affect the Productivity Factors for Future NASA Missions

    NASA Technical Reports Server (NTRS)

    Sterk, Steve; Chesley, Stephan

    2008-01-01

    The upcoming retirement of the Baby Boomers will leave a workforce age gap between the younger generation (the future NASA decision makers) and the graybeards. This paper will reflect on the average age of the workforce across NASA Centers, the aerospace industry, and other government agencies such as DoD. This paper will dig into productivity and realization factors and how they get applied to bi-monthly (payroll) data for true full-time equivalent (FTE) calculations that could be used at each of the NASA Centers and in other business systems now at the forefront of implementation. This paper offers some comparative cost analyses/solutions, from simple FTE cost-estimating relationships (CERs) to CERs for monthly time-phasing activities for small research projects that start and are completed within a government fiscal year. This paper will present the results of a parametric study investigating the cost-effectiveness of alternative performance-based CERs and how they get applied to each Center's forward pricing rate proposals (FPRPs). True CERs based on the makeup of a younger workforce will have some effect on the labor rates used in both commercial cost models and internal home-grown cost models, which may impact the productivity factors for future NASA missions.

  18. Closed-loop wavelength stabilization of an optical parametric oscillator as a front end of a high-power iodine laser chain.

    PubMed

    Kral, L

    2007-05-01

    We present a complex stabilization and control system for a commercially available optical parametric oscillator. The system is able to stabilize the oscillator's output wavelength at a narrow spectral line of atomic iodine with subpicometer precision, allowing utilization of this solid-state parametric oscillator as the front end of a high-power photodissociation laser chain formed by iodine gas amplifiers. In such a setup, precise wavelength matching between the front end and the amplifier chain is necessary due to the extremely narrow spectral lines of the gaseous iodine (approximately 20 pm). The system is based on a personal computer, a heated iodine cell, and a few other low-cost components. It automatically identifies the proper peak within the iodine absorption spectrum, and then keeps the oscillator tuned to this peak with high precision and reliability. The use of the solid-state oscillator as the front end allows us to use the whole iodine laser system as a pump laser for optical parametric chirped pulse amplification, as it enables precise time synchronization with a signal Ti:sapphire laser.

  19. Parametric study of different contributors to tumor thermal profile

    NASA Astrophysics Data System (ADS)

    Tepper, Michal; Gannot, Israel

    2014-03-01

    Treating cancer is one of the major challenges of modern medicine. There is great interest in assessing tumor development in in vivo animal and human models, as well as in in vitro experiments. Existing methods are either limited by cost and availability or by their low accuracy and reproducibility. Thermography holds the potential of being a noninvasive, low-cost, nonirradiative, and easy-to-use method for tumor monitoring. Tumors can be detected in thermal images due to their relatively higher or lower temperature compared to the temperature of the healthy skin surrounding them. Extensive research has been performed to show the validity of thermography as an efficient method for tumor detection and the possibility of extracting tumor properties from thermal images, showing promising results. However, generalizing from one type of experiment to others is difficult due to the differences in tumor properties, especially between different types of tumors or different species. There is a need for research linking the different types of tumor experiments. In this research, a parametric analysis of possible contributors to tumor thermal profiles was performed. The effect of tumor geometric, physical, and thermal properties was studied, both independently and together, in phantom model experiments and computer simulations. Theoretical and experimental results were cross-correlated to validate the models used and increase the accuracy of simulated complex tumor models. The contribution of different parameters in various tumor scenarios was estimated and the implication of these differences on the observed thermal profiles was studied. The correlation between animal and human models is discussed.

  20. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    NASA Technical Reports Server (NTRS)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment is, however, prohibitively time-consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose for a target population. This new process, when incorporated with CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume and clearance between the suit and body surface at reduced time and cost.
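    Statistical body-shape models of this kind are commonly built by stacking registered scan vertices and extracting principal components, so that a few modes span the population's shape variation. The sketch below uses random stand-in data and PCA via SVD; it is an assumption about the general technique, not a description of the tool's internals:

    ```python
    import numpy as np

    # Statistical shape model sketch: each row is one subject's flattened
    # "scan" (vertex coordinates); PCA finds the dominant shape modes.
    rng = np.random.default_rng(0)
    n_subjects, n_coords = 40, 300           # e.g. 100 vertices * (x, y, z)
    modes_true = rng.normal(size=(2, n_coords))
    scans = rng.normal(size=(n_subjects, 2)) @ modes_true \
            + 0.01 * rng.normal(size=(n_subjects, n_coords))

    mean_shape = scans.mean(axis=0)
    u, s, vt = np.linalg.svd(scans - mean_shape, full_matrices=False)
    explained = (s**2) / (s**2).sum()

    # Two latent modes generated the data, so two components dominate
    print(round(float(explained[:2].sum()), 2))
    ```

    New shapes for virtual fit checks are then synthesized as the mean shape plus a weighted sum of the leading rows of `vt`.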

  1. Structural cost optimization of photovoltaic central power station modules and support structure

    NASA Technical Reports Server (NTRS)

    Sutton, P. D.; Stolte, W. J.; Marsh, R. O.

    1979-01-01

    The results of a comprehensive study of photovoltaic module structural support concepts for photovoltaic central power stations and their associated costs are presented. The objective of the study has been the identification of structural cost drivers. Parametric structural design and cost analyses of complete array systems consisting of modules, primary support structures, and foundations were performed. Area-related module cost was found to be constant with design, size, and loading. A curved glass module concept was evaluated and found to have the potential to significantly reduce panel structural costs. Conclusions of the study are: array costs do not vary greatly among the designs evaluated; panel and array costs are strongly dependent on design loading; and the best support configuration is load dependent.

  2. Space-based solar power conversion and delivery systems study. Volume 4: Energy conversion systems studies

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Solar cells and optical configurations for the SSPS were examined. In this task, three specific solar cell materials were examined: single crystal silicon, single crystal gallium arsenide, and polycrystalline cadmium sulfide. The comparison of the three different cells on the basis of a subsystem parametric cost per kW of SSPS-generated power at the terrestrial utility interface showed that gallium arsenide was the most promising solar cell material at high concentration ratios. The most promising solar cell material with no concentration was dependent upon the particular combination of parameters representing cost, mass, and performance that were chosen to represent each cell in this deterministic comparative analysis. The potential for mass production, based on projections of the present state of the art, would tend to favor cadmium sulfide in lieu of single crystal silicon or gallium arsenide solar cells.

  3. Power flow analysis of two coupled plates with arbitrary characteristics

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1990-01-01

    In the last progress report (Feb. 1988) some results were presented for a parametric analysis on the vibrational power flow between two coupled plate structures using the mobility power flow approach. The results reported then were for changes in the structural parameters of the two plates, but with the two plates identical in their structural characteristics. Herein, that limitation is removed. The vibrational power input and output are evaluated for different values of the structural damping loss factor for the source and receiver plates. In performing this parametric analysis, the source plate characteristics are kept constant. The purpose of this parametric analysis is to determine the most critical parameters that influence the flow of vibrational power from the source plate to the receiver plate. In the case of the structural damping parametric analysis, the influence of changes in the source plate damping is also investigated. The results obtained from the mobility power flow approach are compared to results obtained using a statistical energy analysis (SEA) approach. The significance of the power flow results is discussed together with a comparison between the SEA results and the mobility power flow results. Furthermore, the benefits derived from using the mobility power flow approach are examined.

  4. Integrated-circuit balanced parametric amplifier

    NASA Technical Reports Server (NTRS)

    Dickens, L. E.

    1975-01-01

    Amplifier, fabricated on single dielectric substrate, has pair of Schottky barrier varactor diodes mounted on single semiconductor chip. Circuit includes microstrip transmission line and slot line section to conduct signals. Main features of amplifier are reduced noise output and low production cost.

  5. Integral abutment bridges under thermal loading : numerical simulations and parametric study.

    DOT National Transportation Integrated Search

    2016-06-01

    Integral abutment bridges (IABs) have become of interest due to their decreased construction and maintenance costs in comparison to conventional jointed bridges. Most prior IAB research was related to substructure behavior, and, as a result, most ...

  6. Multi-parametric centrality method for graph network models

    NASA Astrophysics Data System (ADS)

    Ivanov, Sergei Evgenievich; Gorlushkina, Natalia Nikolaevna; Ivanova, Lubov Nikolaevna

    2018-04-01

    Graph network models are investigated to determine the centrality, weights, and significance of vertices. For centrality analysis, a typical method applies only one of the properties of graph vertices. In graph theory, centrality is analyzed in terms of degree, closeness, betweenness, radiality, eccentricity, page-rank, status, Katz, and eigenvector measures. We have proposed a new method of multi-parametric centrality, which includes a number of basic properties of the network member. The mathematical model of the multi-parametric centrality method is developed. The results of the presented method are compared with those of the centrality methods above. To evaluate the results of the multi-parametric centrality method, a graph model with hundreds of vertices is analyzed. The comparative analysis showed the accuracy of the presented method, which simultaneously includes a number of basic properties of the vertices.
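    A toy version of a multi-parametric centrality: compute several per-vertex measures (here normalized degree and closeness) and combine them with weights. The graph and the 50/50 weighting are illustrative, not the authors' formulation:

    ```python
    from collections import deque

    # Small undirected graph as an adjacency list
    graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}

    def closeness(g, s):
        # BFS shortest-path distances from s, then the standard closeness score
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in g[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return (len(g) - 1) / sum(dist[v] for v in g if v != s)

    def multi_parametric(g, w_deg=0.5, w_clo=0.5):
        # Weighted combination of normalized degree and closeness per vertex
        n = len(g)
        return {v: w_deg * len(g[v]) / (n - 1) + w_clo * closeness(g, v)
                for v in g}

    scores = multi_parametric(graph)
    print(max(scores, key=scores.get))    # vertex with the highest combined score
    ```

    Additional measures (betweenness, eigenvector, etc.) would enter the same weighted sum in the full method.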

  7. DCE-MRI, DW-MRI, and MRS in Cancer: Challenges and Advantages of Implementing Qualitative and Quantitative Multi-parametric Imaging in the Clinic

    PubMed Central

    Winfield, Jessica M.; Payne, Geoffrey S.; Weller, Alex; deSouza, Nandita M.

    2016-01-01

    Abstract Multi-parametric magnetic resonance imaging (mpMRI) offers a unique insight into tumor biology by combining functional MRI techniques that inform on cellularity (diffusion-weighted MRI), vascular properties (dynamic contrast-enhanced MRI), and metabolites (magnetic resonance spectroscopy) and has scope to provide valuable information for prognostication and response assessment. Challenges in the application of mpMRI in the clinic include the technical considerations in acquiring good quality functional MRI data, development of robust techniques for analysis, and clinical interpretation of the results. This article summarizes the technical challenges in acquisition and analysis of multi-parametric MRI data before reviewing the key applications of multi-parametric MRI in clinical research and practice. PMID:27748710

  8. CALIPSO: an interactive image analysis software package for desktop PACS workstations

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Huang, H. K.

    1990-07-01

    The purpose of this project is to develop a low cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include: geometric and densitometric volumes and ejection fraction calculation from radionuclide and cine-angiograms; Fourier analysis of cardiac wall motion; vascular stenosis measurement; color-coded parametric display of regional flow distribution from dynamic coronary angiograms; and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color-coded and parametric display methods to communicate quantitative data extracted from the images. 1. Rationale and objectives of the project: Developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations. This convenient access to the images is often limited by the number of workstations available, due in part to their high cost. There is also an increasing need for quantitative analysis of the images. During the past decade

  9. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.
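    A generic parametric effort model in the spirit described, where a size-driven baseline is adjusted by multipliers elicited from the questionnaire. The coefficients, questions, and multiplier values below are illustrative, not the JPL model's:

    ```python
    # Generic parametric software-effort model: a size-driven baseline
    # (Boehm-style power law) adjusted by environment/technology multipliers
    # keyed to prompted responses. All values here are hypothetical.
    ANSWERS = {
        "team_experience_high": 0.85,
        "requirements_volatile": 1.20,
        "tools_mature": 0.95,
    }

    def effort_person_months(ksloc, answers, a=2.8, b=1.1):
        effort = a * ksloc ** b       # size-driven baseline
        for multiplier in answers.values():
            effort *= multiplier      # calibration from questionnaire responses
        return effort

    print(round(effort_person_months(32.0, ANSWERS), 1))
    ```

    Fitting `a` and `b` to historical project data is the analogue of adjusting the model to JPL's life-cycle statistics.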

  10. Parametric study of helicopter aircraft systems costs and weights

    NASA Technical Reports Server (NTRS)

    Beltramo, M. N.

    1980-01-01

    Weight estimating relationships (WERs) and recurring production cost estimating relationships (CERs) were developed for helicopters at the system level. The WERs estimate system level weight based on performance or design characteristics which are available during concept formulation or the preliminary design phase. The CER (or CERs in some cases) for each system utilize weight (either actual or estimated using the appropriate WER) and production quantity as the key parameters.
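    Power-law WERs of the form W = a·x^b are typically fit by least squares in log-log space; a sketch with made-up system weights against a single performance parameter:

    ```python
    import numpy as np

    # Fit a power-law weight estimating relationship W = a * x**b by
    # ordinary least squares in log space (x = a performance parameter).
    x = np.array([1000.0, 2000.0, 4000.0, 8000.0])   # e.g. installed power (hp)
    w = np.array([210.0, 380.0, 700.0, 1300.0])      # system weight (lb), made up

    b, log_a = np.polyfit(np.log(x), np.log(w), 1)
    a = np.exp(log_a)

    predicted = a * 3000.0 ** b                       # WER applied to a new design
    print(520 < predicted < 560)                      # → True
    ```

    The CERs described above then take such a weight (actual or WER-predicted) together with production quantity as inputs.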

  11. Improving the Parametric Method of Cost Estimating Relationships of Naval Ships

    DTIC Science & Technology

    2014-06-01

    tool since the total cost of the ship is broken down into smaller parts as defined by the WBS. The Navy currently uses the Expanded Ship Work Breakdown...Includes boilers, reactors, turbines, gears, shafting, propellers, steam piping, lube oil piping, and radiation 300 Electric Plant Includes ship...spaces, ladders, storerooms, laundry, and workshops 700 Armament Includes guns, missile launchers, ammunition handling and stowage, torpedo tubes, depth

  12. Manned Mars mission cost estimate

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph; Smith, Keith

    1986-01-01

    The potential costs of several options of a manned Mars mission are examined. A cost estimating methodology based primarily on existing Marshall Space Flight Center (MSFC) parametric cost models is summarized. These models include the MSFC Space Station Cost Model and the MSFC Launch Vehicle Cost Model as well as other models and techniques. The ground rules and assumptions of the cost estimating methodology are discussed and cost estimates presented for six potential mission options which were studied. The estimated manned Mars mission costs are compared to the cost of the somewhat analogous Apollo Program cost after normalizing the Apollo cost to the environment and ground rules of the manned Mars missions. It is concluded that a manned Mars mission, as currently defined, could be accomplished for under $30 billion in 1985 dollars excluding launch vehicle development and mission operations.

  13. Cost-Benefit Analysis for the Advanced Near Net Shape Technology (ANNST) Method for Fabricating Stiffened Cylinders

    NASA Technical Reports Server (NTRS)

    Stoner, Mary Cecilia; Hehir, Austin R.; Ivanco, Marie L.; Domack, Marcia S.

    2016-01-01

    This cost-benefit analysis assesses the benefits of the Advanced Near Net Shape Technology (ANNST) manufacturing process for fabricating integrally stiffened cylinders. These preliminary, rough order-of-magnitude results report a 46 to 58 percent reduction in production costs and a 7-percent reduction in weight over the conventional metallic manufacturing technique used in this study for comparison. Production cost savings of 35 to 58 percent were reported over the composite manufacturing technique used in this study for comparison; however, the ANNST concept was heavier. In this study, the equipment investment required for the ANNST method was predicted to be recovered after production of ten cryogenic tank barrels when compared with conventional metallic manufacturing. The ANNST method was compared with the conventional multi-piece metallic construction and composite processes for fabricating integrally stiffened cylinders. A case study compared these three alternatives for manufacturing a cylinder of specified geometry, with particular focus placed on production costs and process complexity, with cost analyses performed by the analogy and parametric methods. Furthermore, a scalability study was conducted for three tank diameters to assess the highest potential payoff of the ANNST process for manufacture of large-diameter cryogenic tanks. The analytical hierarchy process (AHP) was subsequently used with a group of selected subject matter experts to assess the value of the various benefits achieved by the ANNST method for potential stakeholders. The AHP study results revealed that decreased final cylinder mass and quality assurance were the most valued benefits of cylinder manufacturing methods, therefore emphasizing the relevance of the benefits achieved with the ANNST process for future projects.

  14. A Parametric Analysis of HELSTAR

    DTIC Science & Technology

    1983-12-01

    AFIT/GSO/OS/83D-7. A Parametric Analysis of HELSTAR. Thesis, James Miklasevich, Captain, USAF. ... Statement of Problem ... Objectives of the Research ... Launch Scenarios ... Launch Sequence 1 ... Launch Sequence 2 ... Launch Sequence 3

  15. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    ERIC Educational Resources Information Center

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  16. Parametric Cost Study of AC-DC Wayside Power Systems

    DOT National Transportation Integrated Search

    1975-09-01

    The wayside power system provides all the power requirements of an electric vehicle operating on a fixed guideway. For a given set of specifications there are numerous wayside power supply configurations which will be satisfactory from a technical st...

  17. Parametrically excited non-linear multidegree-of-freedom systems with repeated natural frequencies

    NASA Astrophysics Data System (ADS)

    Tezak, E. G.; Nayfeh, A. H.; Mook, D. T.

    1982-12-01

    A method for analyzing multidegree-of-freedom systems having a repeated natural frequency subjected to a parametric excitation is presented. Attention is given to the ordering of the various terms (linear and non-linear) in the governing equations. The analysis is based on the method of multiple scales. As a numerical example involving a parametric resonance, panel flutter is discussed in detail in order to illustrate the type of results one can expect to obtain with this analysis. Some of the analytical results are verified by a numerical integration of the governing equations.

  18. Bidirectional reflectance distribution function measurements and analysis of retroreflective materials.

    PubMed

    Belcour, Laurent; Pacanowski, Romain; Delahaie, Marion; Laville-Geay, Aude; Eupherte, Laure

    2014-12-01

    We compare the performance of various analytical retroreflecting bidirectional reflectance distribution function (BRDF) models to assess how they reproduce accurately measured data of retroreflecting materials. We introduce a new parametrization, the back vector parametrization, to analyze retroreflecting data, and we show that this parametrization better preserves the isotropy of data. Furthermore, we update existing BRDF models to improve the representation of retroreflective data.

  19. Coupled parametric design of flow control and duct shape

    NASA Technical Reports Server (NTRS)

    Florea, Razvan (Inventor); Bertuccioli, Luca (Inventor)

    2009-01-01

    A method for designing gas turbine engine components using a coupled parametric analysis of part geometry and flow control is disclosed. Included are the steps of parametrically defining the geometry of the duct wall shape, parametrically defining one or more flow control actuators in the duct wall, measuring a plurality of performance parameters or metrics (e.g., flow characteristics) of the duct and comparing the results of the measurement with desired or target parameters, and selecting the optimal duct geometry and flow control for at least a portion of the duct, the selection process including evaluating the plurality of performance metrics in a Pareto analysis. The use of this method in the design of inter-turbine transition ducts, serpentine ducts, inlets, diffusers, and similar components provides a design which reduces pressure losses and flow profile distortions.

  20. Parametric and non-parametric approach for sensory RATA (Rate-All-That-Apply) method of ledre profile attributes

    NASA Astrophysics Data System (ADS)

    Hastuti, S.; Harijono; Murtini, E. S.; Fibrianto, K.

    2018-03-01

    This study aims to investigate the use of parametric and non-parametric approaches for the sensory RATA (Rate-All-That-Apply) method. Ledre, a unique local food product of Bojonegoro, was used as the point of interest, and 319 panelists were involved in the study. The results showed that ledre is characterized by an easily crushed texture, stickiness in the mouth, a stingy sensation, and ease of swallowing. It also has a strong banana flavour and a brown colour. Compared to eggroll and semprong, ledre shows more variation in taste as well as in roll length. As the RATA questionnaire is designed to collect categorical data, a non-parametric approach is the common statistical procedure. However, similar results were also obtained with a parametric approach, despite the fact that the data were not normally distributed. This suggests that a parametric approach can be applicable for consumer studies with large numbers of respondents, even though the data may not satisfy the assumptions of ANOVA (Analysis of Variance).
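
    The parametric versus non-parametric contrast can be illustrated on toy ratings. The data below are invented, and two-sided permutation p-values stand in for the ANOVA-style and rank-based procedures used in the study; when the difference between products is clear, both statistics agree, echoing the study's finding.

```python
import random
from statistics import mean

# Toy illustration only (not the study's data): parametric-style vs
# non-parametric statistics on 5-point RATA-style ratings of two
# hypothetical products, compared via permutation p-values.
random.seed(0)
a = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]   # product A ratings for one attribute
b = [3, 3, 2, 4, 3, 2, 3, 3, 4, 2]   # product B ratings

def mean_diff(x, y):
    """Parametric-style statistic: difference in means."""
    return mean(x) - mean(y)

def rank_sum_centred(x, y):
    """Non-parametric statistic: rank sum of x, centred at its null mean."""
    pooled = sorted(x + y)
    ranks = {v: sum(i + 1 for i, p in enumerate(pooled) if p == v)
                / pooled.count(v) for v in set(pooled)}   # mid-ranks for ties
    null_mean = len(x) * (len(x) + len(y) + 1) / 2
    return sum(ranks[v] for v in x) - null_mean

def perm_pvalue(x, y, stat, n=5000):
    """Two-sided permutation p-value of stat under label shuffling."""
    observed = abs(stat(x, y))
    pooled, k, hits = x + y, len(x), 0
    for _ in range(n):
        random.shuffle(pooled)
        if abs(stat(pooled[:k], pooled[k:])) >= observed:
            hits += 1
    return hits / n

p_param = perm_pvalue(a, b, mean_diff)
p_nonp = perm_pvalue(a, b, rank_sum_centred)
print(p_param, p_nonp)
```

    Both p-values come out small for this toy data, so the parametric and non-parametric routes lead to the same conclusion, which is the pattern the abstract reports for its much larger panel.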

  1. Conceptual spacecraft systems design and synthesis

    NASA Technical Reports Server (NTRS)

    Wright, R. L.; Deryder, D. D.; Ferebee, M. J., Jr.

    1984-01-01

    An interactive systems design and synthesis is performed on future spacecraft concepts using the Interactive Design and Evaluation of Advanced Systems (IDEAS) computer-aided design and analysis system. The capabilities and advantages of the systems-oriented interactive computer-aided design and analysis system are described. The synthesis of both large antenna and space station concepts, and space station evolutionary growth designs is demonstrated. The IDEAS program provides the user with both an interactive graphics and an interactive computing capability which consists of over 40 multidisciplinary synthesis and analysis modules. Thus, the user can create, analyze, and conduct parametric studies and modify earth-orbiting spacecraft designs (space stations, large antennas or platforms, and technologically advanced spacecraft) at an interactive terminal with relative ease. The IDEAS approach is useful during the conceptual design phase of advanced space missions when a multiplicity of parameters and concepts must be analyzed and evaluated in a cost-effective and timely manner.

  2. Interactive systems design and synthesis of future spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Wright, R. L.; Deryder, D. D.; Ferebee, M. J., Jr.

    1984-01-01

    An interactive systems design and synthesis is performed on future spacecraft concepts using the Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) computer-aided design and analysis system. The capabilities and advantages of the systems-oriented interactive computer-aided design and analysis system are described. The synthesis of both large antenna and space station concepts, and space station evolutionary growth is demonstrated. The IDEAS program provides the user with both an interactive graphics and an interactive computing capability which consists of over 40 multidisciplinary synthesis and analysis modules. Thus, the user can create, analyze and conduct parametric studies and modify Earth-orbiting spacecraft designs (space stations, large antennas or platforms, and technologically advanced spacecraft) at an interactive terminal with relative ease. The IDEAS approach is useful during the conceptual design phase of advanced space missions when a multiplicity of parameters and concepts must be analyzed and evaluated in a cost-effective and timely manner.

  3. Applications of Coherent Radiation from Electrons traversing Crystals

    NASA Astrophysics Data System (ADS)

    Überall, H.

    2000-04-01

    Historically, the first types of coherent radiation from electrons traversing crystals studied were coherent bremsstrahlung (CB: Dyson and Überall 1955; Überall 1956, 1962) and channeling radiation (CR: Kumakhov, 1976) which produce quasimonochromatic X-rays and γ-rays, as well as parametric X-rays (Baryshevsky and Feranchuk, 1983). Related non-crystal sources are transition radiation and synchrotron radiation. We here present a comparison of radiation types from these sources, and we discuss a series of their possible applications, namely (a) CR: X-ray lithography, angiography, structure analysis of macromolecules, and trace element analysis, and (b) for CB: Radiography, use as a neutron source, elemental analysis, radiation therapy, and radioisotope production for commercial or medical use. CR and CB are very intense sources, needing only low-energy, moderately-priced electron linacs for their generation, hence competing with (or surpassing) more conventional X-ray sources intensity-wise and from a cost standpoint.

  4. Cost-Aware Design of a Discrimination Strategy for Unexploded Ordnance Cleanup

    DTIC Science & Technology

    2011-02-25

    Acronyms: ANN: Artificial Neural Network; AUC: Area Under the Curve; BRAC: Base Realignment And Closure; DLRT: Distance Likelihood Ratio Test; EER... Discriminative Aggregate Nonparametric [25]; Artificial Neural Network (ANN), Discriminative Aggregate Parametric [33]... Results and Discussion, Task #1

  5. Preliminary design, analysis, and costing of a dynamic scale model of the NASA space station

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Pinson, E. D.; Voqui, H. L.; Crawley, E. F.; Everman, M. R.

    1987-01-01

    The difficulty of testing the next generation of large flexible space structures on the ground places an emphasis on other means for validating predicted on-orbit dynamic behavior. Scale model technology represents one way of verifying analytical predictions with ground test data. This study investigates the preliminary design, scaling and cost trades for a Space Station dynamic scale model. The scaling of nonlinear joint behavior is studied from theoretical and practical points of view. Suspension system interaction trades are conducted for the ISS Dual Keel Configuration and Build-Up Stages suspended in the proposed NASA/LaRC Large Spacecraft Laboratory. Key issues addressed are scaling laws, replication vs. simulation of components, manufacturing, suspension interactions, joint behavior, damping, articulation capability, and cost. These issues are the subject of parametric trades versus the scale model factor. The results of these detailed analyses are used to recommend scale factors for four different scale model options, each with varying degrees of replication. Potential problems in constructing and testing the scale model are identified, and recommendations for further study are outlined.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Kyoung Mo; Jee, Kye Kwang; Pyo, Chang Ryul

    The basis of the leak before break (LBB) concept is to demonstrate that piping will leak significantly before a double ended guillotine break (DEGB) occurs. This is demonstrated by quantifying and evaluating the leak process and prescribing safe shutdown of the plant on the basis of the monitored leak rate. The application of LBB for power plant design has reduced plant cost while improving plant integrity. Several evaluations employing LBB analysis on system piping based on DEGB design have been completed. However, the application of LBB on main steam (MS) piping, which is LBB applicable piping, has not been performed due to several uncertainties associated with occurrence of steam hammer and dynamic strain aging (DSA). The objective of this paper is to demonstrate the applicability of the LBB design concept to main steam lines manufactured with SA106 Gr.C carbon steel. Based on the material properties, including fracture toughness and tensile properties obtained from the comprehensive material tests for base and weld metals, a parametric study was performed as described in this paper. The PICEP code was used to determine leak size crack (LSC) and the FLET code was used to perform the stability assessment of MS piping. The effects of material properties obtained from tests were evaluated to determine the LBB applicability for the MS piping. It can be shown from this parametric study that the MS piping has a high possibility of design using LBB analysis.

  7. A mixture model for bovine abortion and foetal survival.

    PubMed

    Hanson, Timothy; Bedrick, Edward J; Johnson, Wesley O; Thurmond, Mark C

    2003-05-30

    The effect of spontaneous abortion on the dairy industry is substantial, costing the industry on the order of US dollars 200 million per year in California alone. We analyse data from a cohort study of nine dairy herds in Central California. A key feature of the analysis is the observation that only a relatively small proportion of cows will abort (around 10-15 per cent), so that it is inappropriate to analyse the time-to-abortion (TTA) data as if it were standard censored survival data, with cows that fail to abort by the end of the study treated as censored observations. We thus broaden the scope to consider the analysis of foetal lifetime distribution (FLD) data for the cows, with the dual goals of characterizing the effects of various risk factors on (i) the likelihood of abortion and, conditional on abortion status, on (ii) the risk of early versus late abortion. A single model is developed to accomplish both goals with two sets of specific herd effects modelled as random effects. Because multimodal foetal hazard functions are expected for the TTA data, both a parametric mixture model and a non-parametric model are developed. Furthermore, the two sets of analyses are linked because of anticipated dependence between the random herd effects. All modelling and inferences are accomplished using modern Bayesian methods. Copyright 2003 John Wiley & Sons, Ltd.

  8. Parametric resonance in the early Universe—a fitting analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Figueroa, Daniel G.; Torrentí, Francisco, E-mail: daniel.figueroa@cern.ch, E-mail: f.torrenti@csic.es

    Particle production via parametric resonance in the early Universe is a non-perturbative, non-linear and out-of-equilibrium phenomenon. Although it is a well studied topic, whenever a new scenario exhibits parametric resonance, a full re-analysis is normally required. To avoid this tedious task, many works often present only a simplified linear treatment of the problem. In order to surpass this circumstance in the future, we provide a fitting analysis of parametric resonance through all its relevant stages: initial linear growth, non-linear evolution, and relaxation towards equilibrium. Using lattice simulations in an expanding grid in 3+1 dimensions, we parametrize the dynamics' outcome scanning over the relevant ingredients: role of the oscillatory field, particle coupling strength, initial conditions, and background expansion rate. We emphasize the inaccuracy of the linear calculation of the decay time of the oscillatory field, and propose a more appropriate definition of this scale based on the subsequent non-linear dynamics. We provide simple fits to the relevant time scales and particle energy fractions at each stage. Our fits can be applied to post-inflationary preheating scenarios, where the oscillatory field is the inflaton, or to spectator-field scenarios, where the oscillatory field can be e.g. a curvaton, or the Standard Model Higgs.

  9. Hamilton's rule and the causes of social evolution

    PubMed Central

    Bourke, Andrew F. G.

    2014-01-01

    Hamilton's rule is a central theorem of inclusive fitness (kin selection) theory and predicts that social behaviour evolves under specific combinations of relatedness, benefit and cost. This review provides evidence for Hamilton's rule by presenting novel syntheses of results from two kinds of study in diverse taxa, including cooperatively breeding birds and mammals and eusocial insects. These are, first, studies that empirically parametrize Hamilton's rule in natural populations and, second, comparative phylogenetic analyses of the genetic, life-history and ecological correlates of sociality. Studies parametrizing Hamilton's rule are not rare and demonstrate quantitatively that (i) altruism (net loss of direct fitness) occurs even when sociality is facultative, (ii) in most cases, altruism is under positive selection via indirect fitness benefits that exceed direct fitness costs and (iii) social behaviour commonly generates indirect benefits by enhancing the productivity or survivorship of kin. Comparative phylogenetic analyses show that cooperative breeding and eusociality are promoted by (i) high relatedness and monogamy and, potentially, by (ii) life-history factors facilitating family structure and high benefits of helping and (iii) ecological factors generating low costs of social behaviour. Overall, the focal studies strongly confirm the predictions of Hamilton's rule regarding conditions for social evolution and their causes. PMID:24686934

  10. Hamilton's rule and the causes of social evolution.

    PubMed

    Bourke, Andrew F G

    2014-05-19

    Hamilton's rule is a central theorem of inclusive fitness (kin selection) theory and predicts that social behaviour evolves under specific combinations of relatedness, benefit and cost. This review provides evidence for Hamilton's rule by presenting novel syntheses of results from two kinds of study in diverse taxa, including cooperatively breeding birds and mammals and eusocial insects. These are, first, studies that empirically parametrize Hamilton's rule in natural populations and, second, comparative phylogenetic analyses of the genetic, life-history and ecological correlates of sociality. Studies parametrizing Hamilton's rule are not rare and demonstrate quantitatively that (i) altruism (net loss of direct fitness) occurs even when sociality is facultative, (ii) in most cases, altruism is under positive selection via indirect fitness benefits that exceed direct fitness costs and (iii) social behaviour commonly generates indirect benefits by enhancing the productivity or survivorship of kin. Comparative phylogenetic analyses show that cooperative breeding and eusociality are promoted by (i) high relatedness and monogamy and, potentially, by (ii) life-history factors facilitating family structure and high benefits of helping and (iii) ecological factors generating low costs of social behaviour. Overall, the focal studies strongly confirm the predictions of Hamilton's rule regarding conditions for social evolution and their causes.

  11. Computer aided system for parametric design of combination die

    NASA Astrophysics Data System (ADS)

    Naranje, Vishal G.; Hussein, H. M. A.; Kumar, S.

    2017-09-01

    In this paper, a computer-aided system for parametric design of combination dies is presented. The system is developed using the knowledge-based system technique of artificial intelligence. It is capable of designing combination dies for the production of sheet metal parts involving punching and cupping operations. The system is coded in Visual Basic and interfaced with AutoCAD software. The low cost of the proposed system will help die designers in small- and medium-scale sheet metal industries design combination dies for similar types of products. The proposed system can reduce the design time and effort of die designers working on combination dies.

  12. Combined non-parametric and parametric approach for identification of time-variant systems

    NASA Astrophysics Data System (ADS)

    Dziedziech, Kajetan; Czop, Piotr; Staszewski, Wieslaw J.; Uhl, Tadeusz

    2018-03-01

    Identification of systems, structures and machines with variable physical parameters is a challenging task, especially when time-varying vibration modes are involved. The paper proposes a new combined, two-step - i.e. non-parametric and parametric - modelling approach in order to determine time-varying vibration modes based on input-output measurements. Single-degree-of-freedom (SDOF) vibration modes from multi-degree-of-freedom (MDOF) non-parametric system representation are extracted in the first step with the use of time-frequency wavelet-based filters. The second step involves time-varying parametric representation of extracted modes with the use of recursive linear autoregressive-moving-average with exogenous inputs (ARMAX) models. The combined approach is demonstrated using system identification analysis based on the experimental mass-varying MDOF frame-like structure subjected to random excitation. The results show that the proposed combined method correctly captures the dynamics of the analysed structure, using minimal a priori information on the model.
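
    The recursive parametric step can be sketched with a scalar estimator. The sketch below is not the authors' ARMAX implementation: it assumes a first-order ARX-style system with a known unit input gain and purely synthetic signals, and uses forgetting-factor recursive least squares (RLS) to track a slowly drifting AR coefficient, which is the basic mechanism behind recursive identification of time-varying dynamics.

```python
import random

# Synthetic sketch: scalar RLS with a forgetting factor tracking the
# time-varying AR coefficient of y[t] = a(t)*y[t-1] + u[t] + e[t].
# All signals and parameter values are invented for illustration.
random.seed(42)
T, lam = 2000, 0.98                            # horizon, forgetting factor
theta, P, y_prev = 0.0, 100.0, 0.0             # estimate, covariance, state
for t in range(1, T + 1):
    a_true = 0.3 + 0.5 * t / T                 # slowly drifting coefficient
    u = random.choice((-1.0, 1.0))             # known exogenous input
    e = random.gauss(0.0, 0.05)                # measurement noise
    y = a_true * y_prev + u + e
    phi = y_prev                               # regressor
    z = y - u                                  # remove the known input term
    k = P * phi / (lam + phi * P * phi)        # RLS gain
    theta += k * (z - phi * theta)             # parameter update
    P = (P - k * phi * P) / lam                # covariance update
    y_prev = y
print(round(theta, 3))                         # tracks a_true, ending near 0.8
```

    The forgetting factor discounts old data so the estimate follows the drift; with lam = 1 the recursion reduces to ordinary RLS for a constant parameter.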

  13. Application of an enhanced discrete element method to oil and gas drilling processes

    NASA Astrophysics Data System (ADS)

    Ubach, Pere Andreu; Arrufat, Ferran; Ring, Lev; Gandikota, Raju; Zárate, Francisco; Oñate, Eugenio

    2016-03-01

    The authors present results on the use of the discrete element method (DEM) for the simulation of drilling processes typical in the oil and gas exploration industry. The numerical method employs advanced DEM techniques with a local definition of the DEM parameters and combined FEM-DEM procedures. This paper presents a step-by-step procedure to build a DEM model for analysis of the soil region coupled to a FEM model for discretizing the drilling tool that reproduces the drilling mechanics of a particular drill bit. A parametric study has been performed to determine the model parameters in order to maintain accurate solutions with reduced computational cost.

  14. Decomposing Cost Efficiency in Regional Long-term Care Provision in Japan.

    PubMed

    Yamauchi, Yasuhiro

    2015-07-12

    Many developed countries face a growing need for long-term care provision because of population ageing. Japan is one such example, given its population's longevity and low birth rate. In this study, we examine the efficiency of Japan's regional long-term care system in FY2010 by performing a data envelopment analysis, a non-parametric frontier approach, on prefectural data and separating cost efficiency into technical, allocative, and price efficiencies under different average unit costs across regions. In doing so, we elucidate the structure of cost inefficiency by incorporating a method for restricting weight flexibility to avoid unrealistic concerns arising from zero optimal weight. The results indicate that technical inefficiency accounts for the highest share of losses, followed by price inefficiency and allocation inefficiency. Moreover, the majority of technical inefficiency losses stem from labor costs, particularly those for professional caregivers providing institutional services. We show that the largest share of allocative inefficiency losses can also be traced to labor costs for professional caregivers providing institutional services, while the labor provision of in-home care services shows an efficiency gain. However, although none of the prefectures gains efficiency by increasing the number of professional caregivers for institutional services, quite a few prefectures would gain allocative efficiency by increasing capital inputs for institutional services. These results indicate that preferred policies for promoting efficiency might vary from region to region, and thus, policy implications should be drawn with care.

  15. Optimality of cycle time and inventory decisions in a two echelon inventory system with exponential price dependent demand under credit period

    NASA Astrophysics Data System (ADS)

    Krugon, Seelam; Nagaraju, Dega

    2017-05-01

    This work proposes a two-echelon supply chain inventory system in which the manufacturer offers a credit period to the retailer under exponential price-dependent demand. Demand is expressed as an exponential function of the retailer's unit selling price. A mathematical model is formulated to determine the optimal cycle time, retailer replenishment quantity, number of shipments, and total relevant cost of the supply chain. The major objective of the paper is to incorporate the trade credit concept, whereby the retailer may delay payments to the manufacturer, under exponential price-dependent demand. In the first stage, the retailer's and manufacturer's cost expressions are formulated as functions of ordering cost, carrying cost, and transportation cost; in the second stage, these expressions are combined. A MATLAB program is written to derive the optimal cycle time, retailer replenishment quantity, number of shipments, and total relevant cost of the supply chain. Managerial insights can be drawn from the derived optimality criteria. The research findings show that the total cost of the supply chain decreases as the credit period increases under exponential price-dependent demand. To analyse the influence of the model parameters, a parametric analysis is also carried out with the help of a numerical example.
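
    Since the abstract does not give the exact cost expressions, the sketch below assumes simple illustrative forms: exponential price-dependent demand, linear holding cost, and an interest-earned term for the credit period, with a grid search in place of MATLAB's optimizer. All functional forms and parameter values are hypothetical.

```python
import math

# Illustrative sketch only: the model's actual cost expressions are not
# given in the abstract, so these forms and parameters are assumed.

def demand(p, a=5000.0, b=0.05):
    """Exponential price-dependent annual demand D(p) = a * exp(-b * p)."""
    return a * math.exp(-b * p)

def total_cost(T, p, A=100.0, h=2.0, s=50.0, i_e=0.08, M=0.1):
    """Relevant cost per year at cycle time T (years) and selling price p:
    ordering + carrying + transport, less interest earned during the
    credit period M offered by the manufacturer."""
    D = demand(p)
    Q = D * T                                   # replenishment quantity
    ordering = A / T
    carrying = h * Q / 2
    transport = s / T
    interest_earned = i_e * p * D * min(M, T)   # credit-period benefit
    return ordering + carrying + transport - interest_earned

# Grid search for the cost-minimizing cycle time (in lieu of a closed form):
p = 20.0
best_T = min((k / 100 for k in range(1, 101)), key=lambda T: total_cost(T, p))
print(best_T, round(total_cost(best_T, p), 2))
```

    Under these assumed forms, a longer credit period M lowers the total cost through the interest-earned term, which is consistent with the abstract's finding that total cost decreases as the credit period increases.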

  16. A quasi-Monte-Carlo comparison of parametric and semiparametric regression methods for heavy-tailed and non-normal data: an application to healthcare costs.

    PubMed

    Jones, Andrew M; Lomas, James; Moore, Peter T; Rice, Nigel

    2016-10-01

    We conduct a quasi-Monte-Carlo comparison of the recent developments in parametric and semiparametric regression methods for healthcare costs, both against each other and against standard practice. The population of English National Health Service hospital in-patient episodes for the financial year 2007-2008 (summed for each patient) is randomly divided into two equally sized subpopulations to form an estimation set and a validation set. Evaluating out-of-sample using the validation set, a conditional density approximation estimator shows considerable promise in forecasting conditional means, performing best for accuracy of forecasting and among the best four for bias and goodness of fit. The best performing model for bias is linear regression with square-root-transformed dependent variables, whereas a generalized linear model with square-root link function and Poisson distribution performs best in terms of goodness of fit. Commonly used models utilizing a log-link are shown to perform badly relative to other models considered in our comparison.
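
    The square-root-transform approach mentioned can be sketched on synthetic heavy-tailed data. The data-generating process, single covariate, and smearing-style retransformation below are illustrative assumptions, not the paper's estimators; the point shown is that naively squaring the sqrt-scale fit is biased for the mean of costs, while adding the mean squared residual corrects it in-sample.

```python
import random
from statistics import mean

# Synthetic sketch of square-root-transformed regression for skewed
# "costs". Data and parameters are invented for illustration only.
random.seed(1)
n = 500
x = [random.uniform(0.0, 1.0) for _ in range(n)]                # one covariate
cost = [(2.0 + 3.0 * xi) ** 2 * random.expovariate(1.0) for xi in x]

y = [c ** 0.5 for c in cost]                  # sqrt-transformed response

# Closed-form simple OLS of y on x
xbar, ybar = mean(x), mean(y)
b1 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
      / sum((xi - xbar) ** 2 for xi in x))
b0 = ybar - b1 * xbar
fit = [b0 + b1 * xi for xi in x]
res = [yi - fi for yi, fi in zip(y, fit)]

naive = [fi ** 2 for fi in fit]               # biased back-transform
s2 = mean(r * r for r in res)                 # smearing-style correction term
corrected = [fi ** 2 + s2 for fi in fit]

print(round(mean(cost), 2), round(mean(naive), 2), round(mean(corrected), 2))
```

    Because OLS residuals are orthogonal to the fitted values, the corrected predictions match the sample mean of costs exactly in-sample, while the naive squared fit falls short by the residual variance; out-of-sample performance comparisons of this kind are what the paper evaluates.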

  17. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.

  18. Four photon parametric amplification [in unbiased Josephson junction]

    NASA Technical Reports Server (NTRS)

    Parrish, P. T.; Feldman, M. J.; Ohta, H.; Chiao, R. Y.

    1974-01-01

    An analysis is presented describing four-photon parametric amplification in an unbiased Josephson junction. Central to the theory is the model of the Josephson effect as a nonlinear inductance. Linear, small signal analysis is applied to the two-fluid model of the Josephson junction. The gain, gain-bandwidth product, high frequency limit, and effective noise temperature are calculated for a cavity reflection amplifier. The analysis is extended to multiple (series-connected) junctions and subharmonic pumping.

  19. SPM analysis of parametric (R)-[11C]PK11195 binding images: plasma input versus reference tissue parametric methods.

    PubMed

    Schuitemaker, Alie; van Berckel, Bart N M; Kropholler, Marc A; Veltman, Dick J; Scheltens, Philip; Jonker, Cees; Lammertsma, Adriaan A; Boellaard, Ronald

    2007-05-01

    (R)-[11C]PK11195 has been used for quantifying cerebral microglial activation in vivo. In previous studies, both plasma input and reference tissue methods have been used, usually in combination with a region of interest (ROI) approach. Definition of ROIs, however, can be laborious and prone to interobserver variation. In addition, results are only obtained for predefined areas and (unexpected) signals in undefined areas may be missed. On the other hand, standard pharmacokinetic models are too sensitive to noise to calculate (R)-[11C]PK11195 binding on a voxel-by-voxel basis. Linearised versions of both plasma input and reference tissue models have been described, and these are more suitable for parametric imaging. The purpose of this study was to compare the performance of these plasma input and reference tissue parametric methods on the outcome of statistical parametric mapping (SPM) analysis of (R)-[11C]PK11195 binding. Dynamic (R)-[11C]PK11195 PET scans with arterial blood sampling were performed in 7 younger and 11 elderly healthy subjects. Parametric images of volume of distribution (Vd) and binding potential (BP) were generated using linearised versions of plasma input (Logan) and reference tissue (Reference Parametric Mapping) models. Images were compared at the group level using SPM with a two-sample t-test per voxel, both with and without proportional scaling. Parametric BP images without scaling provided the most sensitive framework for determining differences in (R)-[11C]PK11195 binding between younger and elderly subjects. Vd images could only demonstrate differences in (R)-[11C]PK11195 binding when analysed with proportional scaling, due to intersubject variation in K1/k2 (blood-brain barrier transport and non-specific binding).

  20. A haemophilia disease management programme targeting cost and utilization of specialty pharmaceuticals.

    PubMed

    Duncan, N; Roberson, C; Lail, A; Donfield, S; Shapiro, A

    2014-07-01

    The high cost of clotting factor concentrate (CFC) used to treat haemophilia and von Willebrand disease (VWD) attracts health plans' attention for cost management strategies such as disease management programmes (DMPs). In 2004, Indiana's high risk insurance health plan, the Indiana Comprehensive Health Insurance Association, in partnership with the Indiana Hemophilia and Thrombosis Center developed and implemented a DMP for beneficiaries with bleeding disorders. This report evaluates the effectiveness of the DMP 5 years post implementation, with specific emphasis on the cost of CFC and other medical expenditures by severity of disease. A pre/post analysis was used. The main evaluation measures were total cost, total outpatient CFC IU dispensed and adjusted total outpatient CFC cost. Summary statistics and mean and median plots were calculated. Overall, 1000 non-parametric bootstrap replicates were created and percentile confidence limits for 95% confidence intervals (CI) are reported. Mean emergency department (ED) visits and mean and median duration of hospitalizations are also reported. The DMP was associated with a significant decrease in mean annualized total cost including decreased CFC utilization and cost in most years in the overall group, and specifically in patients with severe haemophilia. Patients with mild and moderate haemophilia contributed little to overall programme expenditures. This specialty health care provider-administered DMP exemplifies the success of targeted interventions developed and implemented through a health care facility expert in the disease state to curb the cost of specialty pharmaceuticals in conditions when their expenditures represent a significant portion of total annual costs of care. © 2014 John Wiley & Sons Ltd.

  1. Towards the generation of a parametric foot model using principal component analysis: A pilot study.

    PubMed

    Scarton, Alessandra; Sawacha, Zimi; Cobelli, Claudio; Li, Xinshan

    2016-06-01

    There have been many recent developments in patient-specific models, given their potential to provide more information on human pathophysiology and the increase in computational power. However, they are not yet successfully applied in a clinical setting. One of the main challenges is the time required for mesh creation, which is difficult to automate. The development of parametric models by means of Principal Component Analysis (PCA) represents an appealing solution. In this study PCA has been applied to the feet of a small cohort of diabetic and healthy subjects, in order to evaluate the possibility of developing parametric foot models, and to use them to identify variations and similarities between the two populations. Both the skin and the first metatarsal bones have been examined. Despite the reduced sample of subjects considered in the analysis, results demonstrated that the method adopted herein constitutes a first step towards the realization of a parametric foot model for biomechanical analysis. Furthermore, the study showed that the methodology can successfully describe features in the foot, and evaluate differences in the shape of healthy and diabetic subjects. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
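As an illustration of the PCA approach described above, the sketch below builds a toy statistical shape model from simulated, already-registered shape vectors. It is hypothetical throughout: real foot surfaces would first need landmark correspondence and alignment, which this skips.

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical, already-registered shape data: 20 subjects, each foot
# represented as 300 corresponding 3-D points flattened to 900 numbers
n_subjects, n_dims = 20, 900
mean_shape = rng.normal(0.0, 1.0, n_dims)
modes_true = rng.normal(0.0, 1.0, (2, n_dims))    # two underlying modes
weights = rng.normal(0.0, 1.0, (n_subjects, 2)) * np.array([3.0, 1.0])
shapes = mean_shape + weights @ modes_true

# PCA via SVD of the centred data matrix
centred = shapes - shapes.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
explained = S**2 / np.sum(S**2)                   # variance per mode

def synthesise(b):
    """Generate a new parametric shape from mode weights b (hypothetical units)."""
    return shapes.mean(axis=0) + b @ Vt[: len(b)]

new_foot = synthesise(np.array([2.0, -1.0]))
```

Since the toy data live exactly in a two-mode subspace, the first two components explain essentially all the variance; on real cohorts one would keep however many modes reach a chosen variance threshold.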

  2. Cost-effectiveness of a community-delivered multicomponent intervention compared with enhanced standard care of obese adolescents: cost-utility analysis alongside a randomised controlled trial (the HELP trial).

    PubMed

    Panca, Monica; Christie, Deborah; Cole, Tim J; Costa, Silvia; Gregson, John; Holt, Rebecca; Hudson, Lee D; Kessel, Anthony S; Kinra, Sanjay; Mathiot, Anne; Nazareth, Irwin; Wataranan, Jay; Wong, Ian Chi Kei; Viner, Russell M; Morris, Stephen

    2018-02-15

    To undertake a cost-utility analysis of a motivational multicomponent lifestyle-modification intervention in a community setting (the Healthy Eating Lifestyle Programme (HELP)) compared with enhanced standard care. Cost-utility analysis alongside a randomised controlled trial. Community settings in Greater London, England. 174 young people with obesity aged 12-19 years. Intervention participants received 12 one-to-one sessions across 6 months, addressing lifestyle behaviours and focusing on motivation to change and self-esteem rather than weight change, delivered by trained graduate health workers in community settings. Control participants received a single 1-hour one-to-one nurse-delivered session providing didactic weight-management advice. Mean costs and quality-adjusted life years (QALYs) per participant over a 1-year period using resource use data and utility values collected during the trial. Incremental cost-effectiveness ratio (ICER) was calculated and non-parametric bootstrapping was conducted to generate a cost-effectiveness acceptability curve (CEAC). Mean intervention costs per participant were £918 for HELP and £68 for enhanced standard care. There were no significant differences between the two groups in mean resource use per participant for any type of healthcare contact. Adjusted costs were significantly higher in the intervention group (mean incremental costs for HELP vs enhanced standard care £1003 (95% CI £837 to £1168)). There were no differences in adjusted QALYs between groups (mean QALYs gained 0.008 (95% CI -0.031 to 0.046)). The ICER of HELP versus enhanced standard care was £120 630 per QALY gained. The CEAC shows that the probability that HELP was cost-effective relative to enhanced standard care was 0.002 at a threshold of £20 000 per QALY gained and 0.046 at £30 000 per QALY gained. We did not find evidence that HELP was more effective than a single educational session in improving quality of life in a sample of adolescents with obesity. HELP was associated with higher costs, mainly due to the extra costs of delivering the intervention, and therefore is not cost-effective. ISRCTN9984011. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
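The ICER and bootstrap CEAC calculations described above can be sketched as follows. The per-patient costs and QALYs are simulated stand-ins that only loosely echo the reported incremental values; they are not trial data, and the sample size and distributions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 87  # hypothetical per-arm sample size
# simulated per-patient costs (GBP) and QALYs for each trial arm
cost_ctrl = rng.normal(2000.0, 400.0, n)
cost_int  = rng.normal(3003.0, 400.0, n)   # ~GBP 1003 incremental cost
qaly_ctrl = rng.normal(0.800, 0.10, n)
qaly_int  = rng.normal(0.808, 0.10, n)     # ~0.008 incremental QALYs

d_cost = cost_int.mean() - cost_ctrl.mean()
d_qaly = qaly_int.mean() - qaly_ctrl.mean()
icer = d_cost / d_qaly                     # incremental cost per QALY gained

# non-parametric bootstrap of the incremental (cost, QALY) pair:
# resample patients within each arm, keeping each patient's cost
# and QALY together to preserve their correlation
B = 2000
boot = np.empty((B, 2))
for b in range(B):
    i = rng.integers(0, n, n)              # resample intervention arm
    j = rng.integers(0, n, n)              # resample control arm
    boot[b] = (cost_int[i].mean() - cost_ctrl[j].mean(),
               qaly_int[i].mean() - qaly_ctrl[j].mean())

def ceac(wtp):
    """Probability of positive net benefit at willingness-to-pay `wtp`."""
    return float((wtp * boot[:, 1] - boot[:, 0] > 0).mean())

p20k, p30k = ceac(20_000), ceac(30_000)
```

Evaluating `ceac` over a grid of willingness-to-pay values and plotting the probabilities gives the acceptability curve reported in such analyses.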

  3. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics

    PubMed Central

    Yin, Jian; Fenley, Andrew T.; Henriksen, Niel M.; Gilson, Michael K.

    2015-01-01

    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by non-optimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of aqueous host-guest systems for increasingly accurate calculations of binding enthalpy. These results highlight the promise of a comprehensive use of calorimetric host-guest binding data, along with existing validation data sets, to improve force field parameters for the simulation of noncovalent binding, with the ultimate goal of making protein-ligand modeling more accurate and hence speeding drug discovery. PMID:26181208

  4. A Conceptual Design Study of a High Temperature Solar Thermal Receiver

    NASA Technical Reports Server (NTRS)

    Robertson, C. S.; Ehde, C. L.; Stacy, L. E.; Abujawdeh, S. S.; Narayanan, R.; Mccreight, L. R.; Gatti, A.; Rauch, H. W., Sr.

    1980-01-01

    A conceptual design was made for a solar thermal receiver capable of operation in the 1095 to 1650 C (2000 to 3000 F) temperature range. This receiver is designed for use with a two-axis paraboloidal concentrator in the 25 to 150 kWt power range, and is intended for industrial process heat, Brayton engines, or chemical/fuels reactions. Three concepts were analyzed parametrically, and one was selected for conceptual design. Its key feature is a helically coiled tube of sintered silicon nitride which serves as the heat exchanger between the incident solar radiation and the working fluid. A mechanical design of this concept was prepared, and both thermal and stress analyses were performed. The analysis showed good performance, low potential cost in mass production, and adaptability to both Brayton cycle engines and chemical/fuels production.

  5. A study of an orbital radar mapping mission to Venus. Volume 3: Parametric studies and subsystem comparisons

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Parametric studies and subsystem comparisons for the orbital radar mapping mission to planet Venus are presented. Launch vehicle requirements and primary orbiter propulsion system requirements are evaluated. The systems parametric analysis indicated that orbit size and orientation interrelated with almost all of the principal spacecraft systems and influenced significantly the definition of orbit insertion propulsion requirements, weight in orbit capability, radar system design, and mapping strategy.

  6. Cost-effectiveness of a nurse practitioner-family physician model of care in a nursing home: controlled before and after study.

    PubMed

    Lacny, Sarah; Zarrabi, Mahmood; Martin-Misener, Ruth; Donald, Faith; Sketris, Ingrid; Murphy, Andrea L; DiCenso, Alba; Marshall, Deborah A

    2016-09-01

    To examine the cost-effectiveness of a nurse practitioner-family physician model of care compared with family physician-only care in a Canadian nursing home. As demand for long-term care increases, alternative care models including nurse practitioners are being explored. Cost-effectiveness analysis using a controlled before-after design. The study included an 18-month 'before' period (2005-2006) and a 21-month 'after' time period (2007-2009). Data were abstracted from charts from 2008-2010. We calculated incremental cost-effectiveness ratios comparing the intervention (nurse practitioner-family physician model; n = 45) to internal (n = 65), external (n = 70) and combined internal/external family physician-only control groups, measured as the change in healthcare costs divided by the change in emergency department transfers/person-month. We assessed joint uncertainty around costs and effects using non-parametric bootstrapping and cost-effectiveness acceptability curves. Point estimates of the incremental cost-effectiveness ratio demonstrated the nurse practitioner-family physician model dominated the internal and combined control groups (i.e. was associated with smaller increases in costs and emergency department transfers/person-month). Compared with the external control, the intervention resulted in a smaller increase in costs and larger increase in emergency department transfers. Using a willingness-to-pay threshold of $1000 CAD/emergency department transfer, the probability the intervention was cost-effective compared with the internal, external and combined control groups was 26%, 21% and 25%. Due to uncertainty around the distribution of costs and effects, we were unable to make a definitive conclusion regarding the cost-effectiveness of the nurse practitioner-family physician model; however, these results suggest benefits that could be confirmed in a larger study. © 2016 John Wiley & Sons Ltd.

  7. Parametric Analysis and Safety Concepts of CWR Track Buckling.

    DOT National Transportation Integrated Search

    1993-12-01

    The report presents a comprehensive study of continuous welded rail (CWR) track buckling strength as influenced by the range of all key parameters such as the lateral, torsional and longitudinal resistance, vehicle loads, etc. The parametric study pr...

  8. Cost-effectiveness of osimertinib in the UK for advanced EGFR-T790M non-small cell lung cancer.

    PubMed

    Bertranou, Evelina; Bodnar, Carolyn; Dansk, Viktor; Greystoke, Alastair; Large, Samuel; Dyer, Matthew

    2018-02-01

    This study presents the cost-utility analysis that was developed to inform the NICE health technology assessment of osimertinib vs platinum-based doublet chemotherapy (PDC) in patients with EGFR-T790M mutation-positive non-small cell lung cancer (NSCLC) who have progressed on epidermal growth factor receptor-tyrosine kinase inhibitor (EGFR-TKI) therapy. A partitioned survival model with three health states (progression-free, progressed disease, and death) from a UK payer perspective and over lifetime (15 years) was developed. Direct costs included disease management, treatment-related (acquisition, administration, monitoring, adverse events), and T790M testing costs. Efficacy and safety data were taken from clinical trials AURA extension and AURA2 for osimertinib and IMPRESS for PDC. An adjusted indirect treatment comparison was applied to reduce the potential bias in the non-randomized comparison. Parametric functions were utilized to extrapolate survival beyond the observed period. Health state utility values were calculated from EQ-5D data collected in the trials and valued using UK tariffs. Resource use and costs were based on published sources. Osimertinib was associated with a gain of 1.541 quality-adjusted life-years (QALYs) at an incremental cost of £64,283 vs PDC (incremental cost-effectiveness ratio [ICER]: £41,705/QALY gained). Scenario analyses showed that none of the plausible scenarios produced an ICER above £44,000 per QALY gained, and probabilistic sensitivity analyses demonstrated a 63.4% probability that osimertinib will be cost-effective at a willingness-to-pay threshold of £50,000. The analysis is subject to some level of uncertainty inherent to phase 2 single-arm data and the immaturity of the currently available survival data for osimertinib. Osimertinib may be considered a cost-effective treatment option compared with PDC in the second-line setting in patients with EGFR-T790M mutation-positive NSCLC from a UK payer perspective. 
Further data from the ongoing AURA clinical trial program will reduce the inherent uncertainty in the analysis.

  9. Economic evaluation of a weight control program with e-mail and telephone counseling among overweight employees: a randomized controlled trial

    PubMed Central

    2012-01-01

    Background Distance lifestyle counseling for weight control is a promising public health intervention in the work setting. Information about the cost-effectiveness of such interventions is lacking, but necessary to make informed implementation decisions. The purpose of this study was to perform an economic evaluation of a six-month program with lifestyle counseling aimed at weight reduction in an overweight working population with a two-year time horizon from a societal perspective. Methods A randomized controlled trial comparing a program with two modes of intervention delivery against self-help. In total, 1386 employees from seven companies participated (67% male, mean age 43 (SD 8.6) years, mean BMI 29.6 (SD 3.5) kg/m2). All groups received self-directed lifestyle brochures. The two intervention groups additionally received a workbook-based program with phone counseling (phone; n=462) or a web-based program with e-mail counseling (internet; n=464). Body weight was measured at baseline and 24 months after baseline. Quality of life (EuroQol-5D) was assessed at baseline, 6, 12, 18 and 24 months after baseline. Resource use was measured with six-monthly diaries and valued with Dutch standard costs. Missing data were multiply imputed. Uncertainty around differences in costs and incremental cost-effectiveness ratios was estimated by applying non-parametric bootstrapping techniques and graphically plotting the results in cost-effectiveness planes and cost-effectiveness acceptability curves. Results At two years the incremental cost-effectiveness ratio was €1009/kg weight loss in the phone group and €16/kg weight loss in the internet group. The cost-utility analysis resulted in €245,243/quality adjusted life year (QALY) and €1337/QALY, respectively. The results from a complete-case analysis were slightly more favorable. However, there was considerable uncertainty around all outcomes. Conclusions Neither intervention mode was proven to be cost-effective compared to self-help.
Trial registration ISRCTN04265725 PMID:22967224

  10. Economic outcomes of maintenance gefitinib for locally advanced/metastatic non-small-cell lung cancer with unknown EGFR mutations: a semi-Markov model analysis.

    PubMed

    Zeng, Xiaohui; Li, Jianhe; Peng, Liubao; Wang, Yunhua; Tan, Chongqing; Chen, Gannong; Wan, Xiaomin; Lu, Qiong; Yi, Lidan

    2014-01-01

    Maintenance gefitinib significantly prolonged progression-free survival (PFS) compared with placebo in patients from eastern Asia with locally advanced/metastatic non-small-cell lung cancer (NSCLC) after four chemotherapeutic cycles (21 days per cycle) of first-line platinum-based combination chemotherapy without disease progression. The objective of the current study was to evaluate the cost-effectiveness of maintenance gefitinib therapy after four cycles of standard first-line platinum-based chemotherapy for patients with locally advanced or metastatic NSCLC with unknown EGFR mutations, from a Chinese health care system perspective. A semi-Markov model was designed to evaluate the cost-effectiveness of maintenance gefitinib treatment. Two-parameter Weibull and log-logistic distributions were fitted independently to the PFS and overall survival curves. One-way and probabilistic sensitivity analyses were conducted to assess the stability of the model. The base-case analysis suggested that maintenance gefitinib would increase benefits over a 1, 3, 6 or 10-year time horizon, at incremental costs of $184,829, $19,214, $19,328, and $21,308 per quality-adjusted life-year (QALY) gained, respectively. The most sensitive variable in the cost-effectiveness analysis was the utility of PFS plus rash, followed by the utility of PFS plus diarrhoea, utility of progressed disease, price of gefitinib, cost of follow-up treatment in the progressed-survival state, and utility of PFS on oral therapy. The price of gefitinib is the parameter that could most reduce the incremental cost per QALY. Probabilistic sensitivity analysis indicated that the probability of maintenance gefitinib being cost-effective was zero under the willingness-to-pay (WTP) threshold of $16,349 (3 × per-capita gross domestic product of China). The sensitivity analyses all suggested that the model was robust. Maintenance gefitinib following first-line platinum-based chemotherapy for patients with locally advanced/metastatic NSCLC with unknown EGFR mutations is not cost-effective. Decreasing the price of gefitinib may be a preferential choice for meeting wider treatment demand in China.
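Fitting a two-parameter Weibull to survival data, as in the model above, can be illustrated with a numpy-only sketch. The PFS times below are simulated and uncensored, and the fit uses the classic Weibull-plot linearisation rather than the censoring-aware maximum likelihood a real trial analysis would need; the shape and scale values are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical PFS times (months) drawn from a Weibull(shape=1.5, scale=6)
shape_true, scale_true = 1.5, 6.0
t = scale_true * rng.weibull(shape_true, 400)

# empirical survival at the ordered event times (midpoint plotting positions)
ts = np.sort(t)
surv = 1.0 - (np.arange(1, ts.size + 1) - 0.5) / ts.size

# Weibull plot: ln(-ln S(t)) = shape*ln(t) - shape*ln(scale), i.e. linear in ln t
y = np.log(-np.log(surv))
x = np.log(ts)
slope, intercept = np.polyfit(x, y, 1)
shape_hat = slope
scale_hat = np.exp(-intercept / slope)

# extrapolate survival beyond the observed window, e.g. at 24 months
s24 = np.exp(-((24.0 / scale_hat) ** shape_hat))
```

The final line is the point of parametric survival modelling in cost-effectiveness work: once shape and scale are estimated, survival (and hence time in each model state) can be projected past the trial follow-up.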

  11. The Absolute Stability Analysis in Fuzzy Control Systems with Parametric Uncertainties and Reference Inputs

    NASA Astrophysics Data System (ADS)

    Wu, Bing-Fei; Ma, Li-Shan; Perng, Jau-Woei

    This study analyzes the absolute stability in P and PD type fuzzy logic control systems with both certain and uncertain linear plants. Stability analysis includes the reference input, actuator gain and interval plant parameters. For certain linear plants, the stability (i.e. the stable equilibriums of error) in P and PD types is analyzed with the Popov or linearization methods under various reference inputs and actuator gains. The steady state errors of fuzzy control systems are also addressed in the parameter plane. The parametric robust Popov criterion for parametric absolute stability based on Lur'e systems is also applied to the stability analysis of P type fuzzy control systems with uncertain plants. The PD type fuzzy logic controller in our approach is a single-input fuzzy logic controller and is transformed into the P type for analysis. Unlike previous works, our absolute stability analysis of fuzzy control systems is given with respect to a non-zero reference input and an uncertain linear plant using the parametric robust Popov criterion. Moreover, a fuzzy current controlled RC circuit is designed with PSPICE models. Both numerical and PSPICE simulations are provided to verify the analytical results. Furthermore, the oscillation mechanism in fuzzy control systems is specified with various equilibrium points of view in the simulation example. Finally, comparisons are also given to show the effectiveness of the analysis method.

  12. Theoretical Analysis of Penalized Maximum-Likelihood Patlak Parametric Image Reconstruction in Dynamic PET for Lesion Detection.

    PubMed

    Yang, Li; Wang, Guobao; Qi, Jinyi

    2016-04-01

    Detecting cancerous lesions is a major clinical application of emission tomography. In a previous work, we studied penalized maximum-likelihood (PML) image reconstruction for lesion detection in static PET. Here we extend our theoretical analysis of static PET reconstruction to dynamic PET. We study both the conventional indirect reconstruction and direct reconstruction for Patlak parametric image estimation. In indirect reconstruction, Patlak parametric images are generated by first reconstructing a sequence of dynamic PET images, and then performing Patlak analysis on the time activity curves (TACs) pixel-by-pixel. In direct reconstruction, Patlak parametric images are estimated directly from raw sinogram data by incorporating the Patlak model into the image reconstruction procedure. PML reconstruction is used in both the indirect and direct reconstruction methods. We use a channelized Hotelling observer (CHO) to assess lesion detectability in Patlak parametric images. Simplified expressions for evaluating the lesion detectability have been derived and applied to the selection of the regularization parameter value to maximize detection performance. The proposed method is validated using computer-based Monte Carlo simulations. Good agreements between the theoretical predictions and the Monte Carlo results are observed. Both theoretical predictions and Monte Carlo simulation results show the benefit of the indirect and direct methods under optimized regularization parameters in dynamic PET reconstruction for lesion detection, when compared with the conventional static PET reconstruction.
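The Patlak analysis step described above (the linear late-time relation underlying both the indirect and direct methods) can be sketched on noise-free synthetic curves. The input function, frame times, and rate constants below are all hypothetical; with noiseless data constructed from the model itself, the fitted slope recovers Ki exactly.

```python
import numpy as np

# hypothetical noise-free illustration: tissue activity follows the Patlak
# model C_T(t) = Ki * integral_0^t Cp + V0 * Cp(t) after equilibration
t = np.linspace(0.0, 60.0, 121)                    # frame mid-times (min)
cp = 100.0 * np.exp(-0.1 * t) + 5.0                # assumed plasma input Cp(t)
dt = np.diff(t)
int_cp = np.concatenate([[0.0],
                         np.cumsum(0.5 * (cp[1:] + cp[:-1]) * dt)])  # trapezoid

ki_true, v0_true = 0.05, 0.30                      # net influx rate, intercept
ct = ki_true * int_cp + v0_true * cp               # synthetic tissue curve

# Patlak plot: C_T/Cp versus integral(Cp)/Cp is linear with slope Ki
x = int_cp[1:] / cp[1:]
y = ct[1:] / cp[1:]
late = t[1:] > 20.0                                # fit only late frames
ki_hat, v0_hat = np.polyfit(x[late], y[late], 1)
```

In the indirect method this regression is run pixel-by-pixel on reconstructed time-activity curves; the direct method instead folds the same linear model into the reconstruction itself.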

  13. Studies on the Parametric Effects of Plasma Arc Welding of 2205 Duplex Stainless Steel

    NASA Astrophysics Data System (ADS)

    Selva Bharathi, R.; Siva Shanmugam, N.; Murali Kannan, R.; Arungalai Vendan, S.

    2018-03-01

    This research study attempts to create an optimized parametric window by employing the Taguchi algorithm for Plasma Arc Welding (PAW) of 2 mm thick 2205 duplex stainless steel. The parameters considered for experimentation and optimization are the welding current, welding speed and pilot arc length. The experiments vary these parameters over welding current of 60-70 A, welding speed of 250-300 mm/min and pilot arc length of 1-2 mm, recording the resulting depth of penetration and bead width. Design of experiments is used for the experimental trials. A back-propagation neural network, a genetic algorithm and Taguchi techniques are used to predict the bead width and depth of penetration, and the predictions are validated against the experimentally achieved results, with which they are in good agreement. Additionally, micro-structural characterizations are carried out to examine the weld quality. Extrapolation of these optimized parametric values yields enhanced weld strength with cost and time reduction.

  14. Large-scale subject-specific cerebral arterial tree modeling using automated parametric mesh generation for blood flow simulation.

    PubMed

    Ghaffari, Mahsa; Tangen, Kevin; Alaraj, Ali; Du, Xinjian; Charbel, Fady T; Linninger, Andreas A

    2017-12-01

    In this paper, we present a novel technique for automatic parametric mesh generation of subject-specific cerebral arterial trees. This technique generates high-quality and anatomically accurate computational meshes for fast blood flow simulations, extending the scope of 3D vascular modeling to a large portion of the cerebral arterial tree. For this purpose, a parametric meshing procedure was developed to automatically decompose the vascular skeleton, extract geometric features and generate hexahedral meshes using a body-fitted coordinate system that optimally follows the vascular network topology. To validate the anatomical accuracy of the reconstructed vasculature, we performed statistical analysis to quantify the alignment between parametric meshes and raw vascular images using a receiver operating characteristic curve. Geometric accuracy evaluation showed agreement between the constructed mesh and raw MRA data sets, with an area-under-the-curve value of 0.87. Parametric meshing yielded, on average, 36.6% and 21.7% improvements in orthogonal and equiangular skew quality over unstructured tetrahedral meshes. The parametric meshing and processing pipeline constitutes an automated technique to reconstruct and simulate blood flow throughout a large portion of the cerebral arterial tree down to the level of pial vessels. This study is the first step towards fast large-scale subject-specific hemodynamic analysis for clinical applications. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    NASA Astrophysics Data System (ADS)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  16. Organizing Space Shuttle parametric data for maintainability

    NASA Technical Reports Server (NTRS)

    Angier, R. C.

    1983-01-01

    A model of organization and management of Space Shuttle data is proposed. Shuttle avionics software is parametrically altered by a reconfiguration process for each flight. As the flight rate approaches an operational level, current methods of data management would become increasingly complex. An alternative method is introduced, using modularized standard data, and its implications for data collection, integration, validation, and reconfiguration processes are explored. Information modules are cataloged for later use, and may be combined in several levels for maintenance. For each flight, information modules can then be selected from the catalog at a high level. These concepts take advantage of the reusability of Space Shuttle information to reduce the cost of reconfiguration as flight experience increases.

  17. A generalized concept for cost-effective structural design. [Statistical Decision Theory applied to aerospace systems

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hawk, J. D.

    1975-01-01

    A generalized concept for cost-effective structural design is introduced. It is assumed that decisions affecting the cost effectiveness of aerospace structures fall into three basic categories: design, verification, and operation. Within these basic categories, certain decisions concerning items such as design configuration, safety factors, testing methods, and operational constraints are to be made. All or some of the variables affecting these decisions may be treated probabilistically. Bayesian statistical decision theory is used as the tool for determining the cost optimum decisions. A special case of the general problem is derived herein, and some very useful parametric curves are developed and applied to several sample structures.

  18. Coal-Fired Boilers at Navy Bases, Navy Energy Guidance Study, Phase II and III.

    DTIC Science & Technology

    1979-05-01

    several sizes were performed. Central plants containing four equal-sized boilers and central flue gas desulfurization facilities were shown to be less...Conceptual design and parametric cost studies of steam and power generation systems using coal-fired stoker boilers and stack gas scrubbers in

  19. Theoretical Comparison of Fixed Route Bus and Flexible Route Subscription Bus Feeder Service in Low Density Areas

    DOT National Transportation Integrated Search

    1975-03-01

    Parametric variation of demand density was used to compare the service level and cost of two alternative systems for providing low density feeder service. Supply models for fixed route and flexible route service were developed and applied to determine ra...

  20. Integrated modeling environment for systems-level performance analysis of the Next-Generation Space Telescope

    NASA Astrophysics Data System (ADS)

    Mosier, Gary E.; Femiano, Michael; Ha, Kong; Bely, Pierre Y.; Burg, Richard; Redding, David C.; Kissil, Andrew; Rakoczy, John; Craig, Larry

    1998-08-01

    All current concepts for the NGST are innovative designs which present unique systems-level challenges. The goals are to outperform existing observatories at a fraction of the current price/performance ratio. Standard practices for developing systems error budgets, such as the 'root-sum-of-squares' error tree, are insufficient for designs of this complexity. Simulation and optimization are the tools needed for this project; in particular, tools that integrate controls, optics, thermal and structural analysis, and design optimization. This paper describes such an environment, which allows subsystem performance specifications to be analyzed parametrically and includes optimizing metrics that capture the science requirements. The resulting systems-level design trades are greatly facilitated, and significant cost savings can be realized. This modeling environment, built around a tightly integrated combination of commercial off-the-shelf and in-house-developed codes, provides the foundation for linear and non-linear analysis in both the time and frequency domains, statistical analysis, and design optimization. It features an interactive user interface and integrated graphics that allow highly effective, real-time work to be done by multidisciplinary design teams. For the NGST, it has been applied to issues such as pointing control, dynamic isolation of spacecraft disturbances, wavefront sensing and control, on-orbit thermal stability of the optics, and development of systems-level error budgets. In this paper, results are presented from parametric trade studies that assess requirements for pointing control, structural dynamics, reaction wheel dynamic disturbances, and vibration isolation. These studies attempt to define requirements bounds such that the resulting design is optimized at the systems level, without attempting to optimize each subsystem individually. The performance metrics are defined in terms of image quality, specifically centroiding error and RMS wavefront error, which links directly to the science requirements.

  1. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    PubMed

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
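    To illustrate the completely randomised design discussed above, here is a minimal sketch of the randomisation step that protects such an experiment against allocation bias; the treatment names and group sizes are illustrative, not taken from the guidelines:

```python
# A minimal sketch of the randomisation step for a completely randomised
# design: each experimental unit is assigned to a treatment by shuffling,
# so allocation is unbiased. Treatment names are illustrative.
import random

def randomise(units, treatments, seed=None):
    """Assign len(units) units evenly across treatments by shuffling."""
    if len(units) % len(treatments):
        raise ValueError("units must divide evenly among treatments")
    rng = random.Random(seed)
    shuffled = units[:]
    rng.shuffle(shuffled)
    k = len(units) // len(treatments)
    return {t: shuffled[i * k:(i + 1) * k] for i, t in enumerate(treatments)}

# Example: 12 culture dishes allocated to three treatment groups.
plan = randomise(list(range(12)), ["control", "low dose", "high dose"], seed=1)
for treatment, group in plan.items():
    print(treatment, sorted(group))
```

A randomised block design would apply the same shuffle separately within each block (e.g. within each replication in time).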

  2. Design study of wind turbines 50 kW to 3000 kW for electric utility applications: Analysis and design

    NASA Technical Reports Server (NTRS)

    1976-01-01

    In the conceptual design task, several feasible wind generator systems (WGS) configurations were evaluated, and the concept offering the lowest energy cost potential and minimum technical risk for utility applications was selected. In the optimization task, the selected concept was optimized utilizing a parametric computer program prepared for this purpose. In the preliminary design task, the optimized selected concept was designed and analyzed in detail. The utility requirements evaluation task examined the economic, operational, and institutional factors affecting the WGS in a utility environment, and provided additional guidance for the preliminary design effort. Results of the conceptual design task indicated that a rotor operating at constant speed, driving an AC generator through a gear transmission is the most cost effective WGS configuration. The optimization task results led to the selection of a 500 kW rating for the low power WGS and a 1500 kW rating for the high power WGS.

  3. Failure at Frame-Stringer Intersections in PRSEUS Panels

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C.

    2012-01-01

    NASA, the Air Force Research Laboratory and The Boeing Company have worked to develop new low-cost, light-weight composite structures for aircraft. A Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) concept has been developed which offers advantages over traditional metallic structures. In this concept a stitched carbon-epoxy material system has been developed with the potential for reducing the weight and cost of transport aircraft structure by eliminating fasteners, thereby reducing part count and labor. By adding unidirectional carbon rods to the top of stiffeners, the panel becomes more structurally efficient. This combination produces a more damage tolerant design. This study focuses on the intersection between the rod-stiffener and the foam-filled frame in a PRSEUS specimen. Compression loading is considered, which induces stress concentrations at the intersection point that can lead to failures. An experiment with accompanying analysis for a single-frame specimen is described, followed by a parametric study of simple reinforcements to reduce strains in the intersection region.

  4. Parallel runway requirement analysis study. Volume 1: The analysis

    NASA Technical Reports Server (NTRS)

    Ebrahimi, Yaghoob S.

    1993-01-01

    The correlation of increased flight delays with the level of aviation activity is well recognized. A main contributor to these flight delays has been the capacity of airports. Though new airport and runway construction would significantly increase airport capacity, few programs of this type are currently planned, let alone underway, because of the high cost associated with such endeavors. Therefore, it is necessary to achieve the most efficient and cost-effective use of existing fixed airport resources through better planning and control of traffic flows. In fact, during the past few years the FAA has initiated such an airport capacity program, designed to provide additional capacity at existing airports. Some of the improvements that the program has generated thus far have been based on new Air Traffic Control procedures, terminal automation, additional Instrument Landing Systems, improved controller display aids, and improved utilization of multiple runways/Instrument Meteorological Conditions (IMC) approach procedures. A useful element in understanding potential operational capacity enhancements at high-demand airports has been the development and use of an analysis tool called the PLAND_BLUNDER (PLB) Simulation Model. The objective in building this simulation was to develop a parametric model that could be used to determine the minimum safety level of parallel runway operations for various parameters representing airplane, navigation, surveillance, and ATC system performance.
    This simulation is useful for: quick and economical evaluation of existing environments that are experiencing IMC delays; studying and validating proposed procedure modifications; evaluating requirements for new airports or new runways at existing airports; simple, parametric investigation of a wide range of issues and approaches; trading off the contributions of air and ground technology and procedures; and considering probable blunder mechanisms and a range of blunder scenarios. This study describes the steps of building the simulation and considers the input parameters, assumptions and limitations, and available outputs. Validation results and sensitivity analysis are addressed, and some IMC and Visual Meteorological Conditions (VMC) approaches to parallel runways are outlined. Present and future applicable technologies (e.g., Digital Autoland Systems, Traffic Collision and Avoidance System II, Enhanced Situational Awareness System, Global Positioning Systems for Landing, etc.) are also assessed and recommendations made.

  5. Costs explained by function rather than diagnosis--results from the SNAC Nordanstig elderly cohort in Sweden.

    PubMed

    Lindholm, C; Gustavsson, A; Jönsson, L; Wimo, A

    2013-05-01

    Because the prevalence of many brain disorders rises with age, and brain disorders are costly, the economic burden of brain disorders will increase markedly during the next decades. The purpose of this study is to analyze how the costs to society vary with different levels of functioning and with the presence of a brain disorder. Resource utilization and costs from a societal viewpoint were analyzed versus cognition, activities of daily living (ADL), instrumental activities of daily living (IADL), brain disorder diagnosis and age in a population-based cohort of people aged 65 years and older in Nordanstig in Northern Sweden. Descriptive statistics, non-parametric bootstrapping and a generalized linear model (GLM) were used for the statistical analyses. Most people were zero users of care. Societal costs of dementia were by far the highest, ranging from SEK 262,000 (mild) to SEK 519,000 per year (severe dementia). In univariate analysis, all measures of functioning were significantly related to costs. When controlling for ADL and IADL in the multivariate GLM, cognition did not have a statistically significant effect on total cost. The presence of a brain disorder did not impact total cost when controlling for function. The greatest shift in costs was seen when comparing no dependency in ADL and dependency in one basic ADL function. It is the level of functioning, rather than the presence of a brain disorder diagnosis, that predicts costs. ADLs are better explanatory variables of costs than the Mini-Mental State Examination. Most people in a population-based cohort are zero users of care. Copyright © 2012 John Wiley & Sons, Ltd.

  6. Cost effectiveness of brace, physiotherapy, or both for treatment of tennis elbow

    PubMed Central

    Struijs, P A A; Bos, I B C Korthals‐de; van Tulder, M W; van Dijk, C N; Bouter, L M

    2006-01-01

    Background The annual incidence of tennis elbow in the general population is high (1–3%). Tennis elbow often leads to limitation of activities of daily living and work absenteeism. Physiotherapy and braces are the most common treatments. Objectives The hypothesis of the trial was that no difference exists in the cost effectiveness of physiotherapy, braces, and a combination of the two for treatment of tennis elbow. Methods The trial was designed as a randomised controlled trial with intention to treat analysis. A total of 180 patients with tennis elbow were randomised to brace only (n = 68), physiotherapy (n = 56), or a combination of the two (n = 56). Outcome measures were success rate, severity of complaints, pain, functional disability, and quality of life. Follow up was at six, 26, and 52 weeks. Direct healthcare and non‐healthcare costs and indirect costs were measured. Mean cost differences over 12 months were evaluated by applying non‐parametric bootstrap techniques. Results No clinically relevant or statistically significant differences were found between the groups. Success rate at 12 months was 89% in the physiotherapy group, 86% in the brace group, and 87% in the combination group. Mean total costs per patient were €2069 in the brace only group, €978 in the physiotherapy group, and €1256 in the combination group. The mean difference in total costs between the physiotherapy and brace group was substantial (€1005), although not significant. Cost effectiveness ratios and cost utility ratios showed physiotherapy to be the most cost effective, although this also was not statistically significant. Conclusion No clinically relevant or statistically significant differences in costs were identified between the three strategies. PMID:16687482
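    The non-parametric bootstrap used for the mean cost differences can be sketched as follows; the cost data here are synthetic, right-skewed stand-ins, not the trial's data:

```python
# Hedged sketch of a non-parametric bootstrap for a mean cost difference:
# resample each group with replacement many times and read a 95% interval
# off the resulting distribution of mean differences. Costs are synthetic.
import random

def bootstrap_mean_diff(a, b, n_boot=5000, seed=0):
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        ra = [rng.choice(a) for _ in a]          # resample group A
        rb = [rng.choice(b) for _ in b]          # resample group B
        diffs.append(sum(ra) / len(ra) - sum(rb) / len(rb))
    diffs.sort()
    lo = diffs[int(0.025 * n_boot)]              # 2.5th percentile
    hi = diffs[int(0.975 * n_boot)]              # 97.5th percentile
    return lo, hi

# Synthetic, right-skewed per-patient costs standing in for two groups.
brace = [0, 0, 150, 400, 900, 2500, 6000, 300, 80, 0]
physio = [200, 350, 500, 700, 900, 1100, 400, 600, 800, 250]
low, high = bootstrap_mean_diff(brace, physio, seed=42)
print(f"95% bootstrap CI for mean difference: ({low:.0f}, {high:.0f})")
```

The bootstrap is attractive for cost data precisely because of skewness like this: a percentile interval makes no normality assumption about the mean difference.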

  7. Comparing of Cox model and parametric models in analysis of effective factors on event time of neuropathy in patients with type 2 diabetes.

    PubMed

    Kargarian-Marvasti, Sadegh; Rimaz, Shahnaz; Abolghasemi, Jamileh; Heydari, Iraj

    2017-01-01

    The Cox proportional hazards model is the most common method for analyzing the effects of several variables on survival time. However, under certain circumstances, parametric models give more precise estimates than Cox regression. The purpose of this study was to investigate the comparative performance of Cox and parametric models in a survival analysis of factors affecting the event time of neuropathy in patients with type 2 diabetes. This study included 371 patients with type 2 diabetes without neuropathy who were registered at the Fereydunshahr diabetes clinic. Subjects were followed up for the development of neuropathy between 2006 and March 2016. To investigate the factors influencing the event time of neuropathy, significant variables in the univariate model (P < 0.20) were entered into the multivariate Cox and parametric models (P < 0.05). In addition, the Akaike information criterion (AIC) and the area under ROC curves were used to evaluate the relative goodness of fit of the models and the efficiency of each procedure, respectively. Statistical computing was performed using R software version 3.2.3 (UNIX platforms, Windows and MacOS). Using Kaplan-Meier estimation, survival time to neuropathy was computed as 76.6 ± 5 months after the initial diagnosis of diabetes. After multivariate analysis of the Cox and parametric models, ethnicity, high-density lipoprotein and family history of diabetes were identified as predictors of the event time of neuropathy (P < 0.05). According to AIC, the log-normal model, with the lowest value, was the best-fitted model among the Cox and parametric models. Comparison of survival receiver operating characteristic curves likewise identified the log-normal model as the most efficient and best-fitted model.
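    The AIC-based comparison of parametric survival fits can be illustrated with a small sketch: fit exponential and Weibull models to right-censored data by maximum likelihood (profiling the Weibull shape over a grid, since its scale has a closed form given the shape) and prefer the lower AIC. The follow-up times below are synthetic, and the log-normal model the study actually selected is omitted for brevity:

```python
# Hedged sketch of comparing parametric survival fits by AIC on
# right-censored data: fit exponential and Weibull models by maximum
# likelihood and prefer the lower AIC. Synthetic follow-up times (months).
import math

times  = [3, 8, 12, 15, 20, 24, 30, 36, 48, 60, 72, 80, 90, 96, 100]
events = [1, 1,  1,  0,  1,  1,  0,  1,  1,  0,  1,  1,  0,  1,   0]  # 1=event, 0=censored

d = sum(events)                       # number of observed events
total_time = sum(times)               # total time at risk

# Exponential: closed-form MLE rate = events / total time at risk.
rate = d / total_time
loglik_exp = d * math.log(rate) - rate * total_time
aic_exp = 2 * 1 - 2 * loglik_exp      # one free parameter

# Weibull: for a fixed shape k, the scale MLE satisfies scale^k = sum(t^k)/d,
# giving a profile log-likelihood in k alone; maximise it over a grid.
def weibull_profile_loglik(k):
    s_k = sum(t ** k for t in times)
    ll = d * math.log(k) - d * math.log(s_k / d) - d
    ll += (k - 1) * sum(math.log(t) for t, e in zip(times, events) if e)
    return ll

best_k = max((0.1 + 0.01 * i for i in range(500)), key=weibull_profile_loglik)
aic_weibull = 2 * 2 - 2 * weibull_profile_loglik(best_k)  # two free parameters

print(f"AIC exponential = {aic_exp:.1f}, AIC Weibull = {aic_weibull:.1f} (shape ~ {best_k:.2f})")
```

A useful sanity check: at shape k = 1 the Weibull profile log-likelihood reduces exactly to the exponential log-likelihood, so the Weibull AIC can exceed the exponential AIC by at most 2 (the penalty for its extra parameter).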

  8. Integrated versus nOn-integrated Peripheral inTravenous catheter. Which Is the most effective systeM for peripheral intravenoUs catheter Management? (The OPTIMUM study): a randomised controlled trial protocol

    PubMed Central

    Castillo, Maria Isabel; Larsen, Emily; Cooke, Marie; Marsh, Nicole M; Wallis, Marianne C; Finucane, Julie; Brown, Peter; Mihala, Gabor; Byrnes, Joshua; Walker, Rachel; Cable, Prudence; Zhang, Li; Sear, Candi; Jackson, Gavin; Rowsome, Anna; Ryan, Alison; Humphries, Julie C; Sivyer, Susan; Flanigan, Kathy; Rickard, Claire M

    2018-01-01

    Introduction Peripheral intravenous catheters (PIVCs) are frequently used in hospitals. However, PIVC complications are common, with failures leading to treatment delays, additional procedures, patient pain and discomfort, increased clinician workload and substantially increased healthcare costs. Recent evidence suggests integrated PIVC systems may be more effective than traditional non-integrated PIVC systems in reducing phlebitis, infiltration and costs and increasing functional dwell time. The study aim is to determine the efficacy, cost–utility and acceptability to patients and professionals of an integrated PIVC system compared with a non-integrated PIVC system. Methods and analysis Two-arm, multicentre, randomised controlled superiority trial of integrated versus non-integrated PIVC systems to compare effectiveness on clinical and economic outcomes. Recruitment of 1560 patients over 2 years, with randomisation by a centralised service ensuring allocation concealment. Primary outcomes: catheter failure (composite endpoint) for reasons of: occlusion, infiltration/extravasation, phlebitis/thrombophlebitis, dislodgement, localised or catheter-associated bloodstream infections. Secondary outcomes: first time insertion success, types of PIVC failure, device colonisation, insertion pain, functional dwell time, adverse events, mortality, cost–utility and consumer acceptability. One PIVC per patient will be included, with intention-to-treat analysis. Baseline group comparisons will be made for potentially clinically important confounders. The proportional hazards assumption will be checked, and Cox regression will test the effect of group, patient, device and clinical variables on failure. An as-treated analysis will assess the effect of protocol violations. Kaplan-Meier survival curves with log-rank tests will compare failure by group over time. Secondary endpoints will be compared between groups using parametric/non-parametric techniques. 
Ethics and dissemination Ethical approval from the Royal Brisbane and Women’s Hospital Human Research Ethics Committee (HREC/16/QRBW/527), Griffith University Human Research Ethics Committee (Ref No. 2017/002) and the South Metropolitan Health Services Human Research Ethics Committee (Ref No. 2016–239). Results will be published in peer-reviewed journals. Trial registration number ACTRN12617000089336. PMID:29764876

  9. Parametric number covariance in quantum chaotic spectra.

    PubMed

    Vinayak; Kumar, Sandeep; Pandey, Akhilesh

    2016-03-01

    We study spectral parametric correlations in quantum chaotic systems and introduce the number covariance as a measure of such correlations. We derive analytic results for the classical random matrix ensembles using the binary correlation method and obtain compact expressions for the covariance. We illustrate the universality of this measure by presenting the spectral analysis of the quantum kicked rotors for the time-reversal invariant and time-reversal noninvariant cases. A local version of the parametric number variance introduced earlier is also investigated.

  10. Parametric models of reflectance spectra for dyed fabrics

    NASA Astrophysics Data System (ADS)

    Aiken, Daniel C.; Ramsey, Scott; Mayo, Troy; Lambrakos, Samuel G.; Peak, Joseph

    2016-05-01

    This study examines parametric modeling of NIR reflectivity spectra for dyed fabrics, which provides for both their inverse and direct modeling. The dye considered for prototype analysis is triarylamine dye. The fabrics considered are camouflage textiles characterized by color variations. The results of this study provide validation of the constructed parametric models, within reasonable error tolerances for practical applications, including NIR spectral characteristics in camouflage textiles, for purposes of simulating NIR spectra corresponding to various dye concentrations in host fabrics, and potentially to mixtures of dyes.

  11. Computation of the intensities of parametric holographic scattering patterns in photorefractive crystals.

    PubMed

    Schwalenberg, Simon

    2005-06-01

    The present work represents a first attempt to perform computations of output intensity distributions for different parametric holographic scattering patterns. Based on the model for parametric four-wave mixing processes in photorefractive crystals and taking into account realistic material properties, we present computed images of selected scattering patterns. We compare these calculated light distributions to the corresponding experimental observations. Our analysis is especially devoted to dark scattering patterns as they make high demands on the underlying model.

  12. Numerical prediction of 3-D ejector flows

    NASA Technical Reports Server (NTRS)

    Roberts, D. W.; Paynter, G. C.

    1979-01-01

    The use of parametric flow analysis, rather than parametric scale testing, to support the design of an ejector system offers a number of potential advantages. The application of available 3-D flow analyses to the design of ejectors can be subdivided into several key elements: numerics, turbulence modeling, data handling and display, and testing in support of analysis development. Experimental and predicted jet exhaust flows for the Boeing 727 aircraft are examined.

  13. Multilevel mixed effects parametric survival models using adaptive Gauss-Hermite quadrature with application to recurrent events and individual participant data meta-analysis.

    PubMed

    Crowther, Michael J; Look, Maxime P; Riley, Richard D

    2014-09-28

    Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
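    The core numerical device here, Gauss-Hermite quadrature for integrating a function against a normal density (used to integrate the random effects out of the likelihood), can be sketched and checked against a closed form. This is the plain nonadaptive rule, not the authors' adaptive variant:

```python
# A minimal sketch of (nonadaptive) Gauss-Hermite quadrature for computing
# E[g(b)] with b ~ N(0, sigma^2), checked against a closed form:
# E[exp(b)] = exp(sigma^2 / 2) (the log-normal mean).
import math
import numpy as np

def gauss_hermite_expectation(g, sigma, n_nodes=15):
    """Approximate E[g(b)] for b ~ N(0, sigma^2) with n_nodes GH points."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    # Change of variables b = sqrt(2) * sigma * x maps the N(0, sigma^2)
    # expectation onto the Gauss-Hermite weight function exp(-x^2).
    return sum(w * g(math.sqrt(2.0) * sigma * x)
               for x, w in zip(nodes, weights)) / math.sqrt(math.pi)

sigma = 0.5
approx = gauss_hermite_expectation(math.exp, sigma)
exact = math.exp(sigma ** 2 / 2.0)
print(f"quadrature = {approx:.8f}, exact = {exact:.8f}")
```

In a mixed effects survival likelihood, g would be the conditional likelihood of a cluster given its random effect; the adaptive variant recenters and rescales the nodes around each cluster's posterior mode to keep the node count small.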

  14. Fuzzy interval Finite Element/Statistical Energy Analysis for mid-frequency analysis of built-up systems with mixed fuzzy and interval parameters

    NASA Astrophysics Data System (ADS)

    Yin, Hui; Yu, Dejie; Yin, Shengwen; Xia, Baizhan

    2016-10-01

    This paper introduces mixed fuzzy and interval parametric uncertainties into the FE components of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model for mid-frequency analysis of built-up systems; an uncertain ensemble combining non-parametric uncertainty with mixed fuzzy and interval parametric uncertainties thus arises. A fuzzy interval Finite Element/Statistical Energy Analysis (FIFE/SEA) framework is proposed to obtain the uncertain responses of built-up systems, which are described as intervals with fuzzy bounds, termed fuzzy-bounded intervals (FBIs) in this paper. Based on the level-cut technique, a first-order fuzzy interval perturbation FE/SEA (FFIPFE/SEA) and a second-order fuzzy interval perturbation FE/SEA method (SFIPFE/SEA) are developed to handle the mixed parametric uncertainties efficiently. FFIPFE/SEA approximates the response functions by a first-order Taylor series, while SFIPFE/SEA improves the accuracy by considering the second-order terms of the Taylor series, in which all the mixed second-order terms are neglected. To further improve the accuracy, a Chebyshev fuzzy interval method (CFIM) is proposed, in which Chebyshev polynomials are used to approximate the response functions. The FBIs are eventually reconstructed by assembling the extrema solutions at all cut levels. Numerical results on two built-up systems verify the effectiveness of the proposed methods.
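    The Chebyshev idea behind the CFIM can be sketched as follows: approximate a response function of an interval parameter by a Chebyshev surrogate fitted at Chebyshev nodes, then bound the response by scanning the inexpensive surrogate. The response function here is an illustrative stand-in, not the FE/SEA model:

```python
# Hedged sketch of a Chebyshev surrogate for an interval parameter:
# sample an (expensive) response at Chebyshev nodes, fit a polynomial,
# then bound the response over the interval using the cheap surrogate.
import numpy as np

def response(p):
    """Stand-in response of the built-up system to parameter p."""
    return np.sin(3.0 * p) + 0.5 * p ** 2

a, b = 0.8, 1.2                       # interval bounds of the parameter
deg = 6                               # surrogate polynomial degree

# Chebyshev nodes on [-1, 1], mapped onto the parameter interval [a, b].
k = np.arange(deg + 1)
x_cheb = np.cos((2 * k + 1) * np.pi / (2 * (deg + 1)))
p_nodes = 0.5 * (a + b) + 0.5 * (b - a) * x_cheb
coeffs = np.polynomial.chebyshev.chebfit(x_cheb, response(p_nodes), deg)

# Bound the response over the interval by scanning the surrogate densely.
grid = np.linspace(-1.0, 1.0, 2001)
surrogate = np.polynomial.chebyshev.chebval(grid, coeffs)
print(f"response interval ~ [{surrogate.min():.4f}, {surrogate.max():.4f}]")
```

In the full method this interval scan is repeated at each fuzzy cut level, and the resulting intervals are assembled into the fuzzy-bounded interval of the response.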

  15. Commercial launch systems: A risky investment?

    NASA Astrophysics Data System (ADS)

    Dupnick, Edwin; Skratt, John

    1996-03-01

    A myriad of evolutionary paths connect the current state of government-dominated space launch operations to true commercial access to space. Every potential path requires the investment of private capital sufficient to fund the commercial venture with a perceived risk/return ratio acceptable to the investors. What is the private sector willing to invest? Does government participation reduce financial risk? How viable is a commercial launch system without government participation and support? We examine the interplay between various forms of government participation in commercial launch system development, alternative launch system designs, life cycle cost estimates, and typical industry risk aversion levels. The boundaries of this n-dimensional envelope are examined with an ECON-developed business financial model which provides for the parametric assessment and interaction of SSTO design variables (including various operational scenarios) with financial variables (including debt/equity assumptions and commercial enterprise burden rates on various functions). We overlay this structure with observations from previous ECON research which characterize financial risk aversion levels for selected industrial sectors in terms of acceptable initial lump-sum investments, cumulative investments, probability of failure, payback periods, and ROI. The financial model allows the construction of parametric tradeoffs based on ranges of variables which can be said to actually encompass the "true" cost of operations, and determines what level of "true" costs can be tolerated by private capitalization.

  16. Simulation of parametric model towards the fixed covariate of right censored lung cancer data

    NASA Astrophysics Data System (ADS)

    Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila

    2017-09-01

    In this study, a simulation procedure was applied to assess the effect of a fixed covariate on right-censored data using a parametric survival model. The scale and shape parameters were varied to differentiate the analyses of the parametric regression survival model. Bias, mean bias, and coverage probability were used as performance measures. Different sample sizes (50, 100, 150 and 200) were employed to distinguish the impact of the parametric regression model on right-censored data. The R statistical software was used to develop the simulation code for right-censored data. The final simulated model was then compared with right-censored lung cancer data from Malaysia. It was found that varying the shape and scale parameters across different sample sizes helps to improve the simulation strategy for right-censored data, and that the Weibull regression survival model is a suitable fit for the survival data of lung cancer patients in Malaysia.
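    The simulation procedure can be sketched as follows. The study uses R; this sketch, with illustrative parameter values, generates right-censored Weibull data and summarises the bias of the scale estimate across the same sample sizes, taking the shape as known so that the estimator stays closed-form:

```python
# Hedged sketch of a right-censoring simulation study: generate Weibull
# survival times with administrative censoring, re-estimate the scale in
# each replicate, and summarise bias by sample size. Values illustrative.
import math
import random

SHAPE, SCALE, CENSOR_TIME = 1.5, 10.0, 12.0   # true parameters (illustrative)

def simulate(n, rng):
    """One replicate: (observed time, event indicator) pairs."""
    data = []
    for _ in range(n):
        t = SCALE * (-math.log(rng.random())) ** (1.0 / SHAPE)  # Weibull draw
        data.append((min(t, CENSOR_TIME), t <= CENSOR_TIME))
    return data

def scale_mle(data):
    """Closed-form scale MLE for known shape with right censoring."""
    d = sum(event for _, event in data)          # number of events
    s = sum(t ** SHAPE for t, _ in data)         # censored times contribute too
    return (s / d) ** (1.0 / SHAPE)

rng = random.Random(2017)
biases = {}
for n in (50, 100, 150, 200):
    estimates = [scale_mle(simulate(n, rng)) for _ in range(500)]
    biases[n] = sum(estimates) / len(estimates) - SCALE
    print(f"n={n:3d}  mean bias of scale estimate = {biases[n]:+.3f}")
```

The same loop structure extends to coverage probability: compute a confidence interval in each replicate and count how often it contains the true scale.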

  17. Evaluation of advanced airship concepts

    NASA Technical Reports Server (NTRS)

    Joner, B. A.; Schneider, J. J.

    1975-01-01

    A historical overview of the airship, technical and operational characteristics of conventional and hybrid concepts, and the results of a parametric design analysis and evaluation are presented. The lift capabilities of certain buoyant fluids for a hypothetical 16 million cu.ft. volume airship are compared. The potential advanced airship concepts are surveyed, followed by a discussion of the six configurations: conventional nonrigid, conventional rigid, Deltoid (Dynairship), Guppoid (Megalifter), Helipsoid, and Heli-Stat. It is suggested that a partially buoyant Helipsoid concept of the optimum buoyancy ratio has the potential to solve the problems facing future airship development, such as Ballast and Ballast Recovery System, Full Low-Speed Controllability, Susceptibility to Wind/Gusting, Weather/Icing Constraints, Ground Handling/Hangaring, and Direct/Indirect Operating Costs.

  18. Feasibility study of modern airships, phase 1. Volume 2: Parametric analysis (task 3). [lift, weight (mass)

    NASA Technical Reports Server (NTRS)

    Lancaster, J. W.

    1975-01-01

    Various types of lighter-than-air vehicles from fully buoyant to semibuoyant hybrids were examined. Geometries were optimized for gross lifting capabilities for ellipsoidal airships, modified delta planform lifting bodies, and a short-haul, heavy-lift vehicle concept. It is indicated that: (1) neutrally buoyant airships employing a conservative update of materials and propulsion technology provide significant improvements in productivity; (2) propulsive lift for VTOL and aerodynamic lift for cruise significantly improve the productivity of low to medium gross weight ellipsoidal airships; and (3) the short-haul, heavy-lift vehicle, consisting of a simple combination of an ellipsoidal airship hull and existing helicopter componentry, provides significant potential for low-cost, near-term applications for ultra-heavy lift missions.

  19. Cost-minimization analysis of panitumumab compared with cetuximab for first-line treatment of patients with wild-type RAS metastatic colorectal cancer.

    PubMed

    Graham, Christopher N; Hechmati, Guy; Fakih, Marwan G; Knox, Hediyyih N; Maglinte, Gregory A; Hjelmgren, Jonas; Barber, Beth; Schwartzberg, Lee S

    2015-01-01

    To compare the costs of first-line treatment with panitumumab + FOLFOX in comparison to cetuximab + FOLFIRI among patients with wild-type (WT) RAS metastatic colorectal cancer (mCRC) in the US. A cost-minimization model was developed assuming similar treatment efficacy between both regimens. The model estimated the costs associated with drug acquisition, treatment administration frequency (every 2 weeks for panitumumab, weekly for cetuximab), and incidence of infusion reactions. Average anti-EGFR doses were calculated from the ASPECCT clinical trial, and average doses of chemotherapy regimens were based on product labels. Using the medical component of the consumer price index, adverse event costs were inflated to 2014 US dollars, and all other costs were reported in 2014 US dollars. The time horizon for the model was based on average first-line progression-free survival of a WT RAS patient, estimated from parametric survival analyses of PRIME clinical trial data. Relative to cetuximab + FOLFIRI in the first-line treatment of WT RAS mCRC, the cost-minimization model demonstrated lower projected drug acquisition, administration, and adverse event costs for patients who received panitumumab + FOLFOX. The overall cost per patient for first-line treatment was $179,219 for panitumumab + FOLFOX vs $202,344 for cetuximab + FOLFIRI, resulting in a per-patient saving of $23,125 (11.4%) in favor of panitumumab + FOLFOX. From a value perspective, the cost-minimization model supports panitumumab + FOLFOX instead of cetuximab + FOLFIRI as the preferred first-line treatment of WT RAS mCRC patients requiring systemic therapy.

  20. Operation analysis of a Chebyshev-Pantograph leg mechanism for a single DOF biped robot

    NASA Astrophysics Data System (ADS)

    Liang, Conghui; Ceccarelli, Marco; Takeda, Yukio

    2012-12-01

    In this paper, operation analysis of a Chebyshev-Pantograph leg mechanism is presented for a single degree of freedom (DOF) biped robot. The proposed leg mechanism is composed of a Chebyshev four-bar linkage and a pantograph mechanism. In contrast to general fully actuated anthropomorphic leg mechanisms, the proposed leg mechanism has peculiar features like compactness, low-cost, and easy-operation. Kinematic equations of the proposed leg mechanism are formulated for a computer oriented simulation. Simulation results show the operation performance of the proposed leg mechanism with suitable characteristics. A parametric study has been carried out to evaluate the operation performance as function of design parameters. A prototype of a single DOF biped robot equipped with two proposed leg mechanisms has been built at LARM (Laboratory of Robotics and Mechatronics). Experimental test shows practical feasible walking ability of the prototype, as well as drawbacks are discussed for the mechanical design.

  1. A computer program for multiple decrement life table analyses.

    PubMed

    Poole, W K; Cooley, P C

    1977-06-01

Life table analysis has traditionally been the tool of choice in analyzing the distribution of "survival" times when a parametric form for the survival curve could not be reasonably assumed. Chiang, in two papers [1,2], formalized the theory of life table analyses in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters for the analyses. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for performing different types of life table analysis [3], to date there has not been a generally available, well-documented computer program to carry out multiple decrement analyses, either by Chiang's or any other method. This paper describes such a program developed by Research Triangle Institute. A user's manual is available at printing cost; it supplements the contents of this paper with a discussion of the formulas used and a program listing.
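A minimal sketch of the multiple-decrement idea, assuming a simple actuarial adjustment for withdrawals rather than Chiang's exact maximum-likelihood estimator; the cohort size and event counts are made up:

```python
# Toy multiple-decrement life table: in each interval, deaths from two
# competing causes and censored withdrawals reduce the cohort; withdrawals
# are assumed exposed for half the interval (actuarial adjustment).

def life_table(l0, intervals):
    """intervals: list of (deaths_cause1, deaths_cause2, withdrawals)."""
    rows, alive = [], float(l0)
    for d1, d2, w in intervals:
        exposed = alive - w / 2.0            # effective number exposed to risk
        q1, q2 = d1 / exposed, d2 / exposed  # crude cause-specific probabilities
        rows.append((alive, q1, q2, 1.0 - q1 - q2))
        alive -= d1 + d2 + w
    return rows

rows = life_table(1000, [(10, 5, 20), (12, 8, 30)])
```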

  2. Chaotic map clustering algorithm for EEG analysis

    NASA Astrophysics Data System (ADS)

    Bellotti, R.; De Carlo, F.; Stramaglia, S.

    2004-03-01

The non-parametric chaotic map clustering algorithm has been applied to the analysis of electroencephalographic signals in order to recognize Huntington's disease, one of the most dangerous pathologies of the central nervous system. The performance of the method has been compared with those obtained through parametric algorithms, such as K-means and deterministic annealing, and a supervised multi-layer perceptron. While supervised neural networks need a training phase, performed by means of data tagged by the genetic test, and the parametric methods require a prior choice of the number of classes to find, chaotic map clustering gives natural evidence of the pathological class, without any training or supervision, thus providing a new, efficient methodology for the recognition of patterns affected by Huntington's disease.

  3. Recent advances in parametric neuroreceptor mapping with dynamic PET: basic concepts and graphical analyses.

    PubMed

    Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung

    2014-10-01

Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping that produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent of any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly used compartment models and major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.

  4. An appraisal of statistical procedures used in derivation of reference intervals.

    PubMed

    Ichihara, Kiyoshi; Boyd, James C

    2010-11-01

When conducting studies to derive reference intervals (RIs), various statistical procedures are commonly applied at each step, from the planning stages to final computation of RIs. Determination of the necessary sample size is an important consideration, and evaluation of at least 400 individuals in each subgroup has been recommended to establish reliable common RIs in multicenter studies. Multiple regression analysis allows identification of the most important factors contributing to variation in test results, while accounting for possible confounding relationships among these factors. Of the various approaches proposed for judging the necessity of partitioning reference values, nested analysis of variance (ANOVA) is the likely method of choice owing to its ability to handle multiple groups and to adjust for multiple factors. Box-Cox power transformation often has been used to transform data to a Gaussian distribution for parametric computation of RIs. However, this transformation occasionally fails. Therefore, the non-parametric method, based on determination of the 2.5th and 97.5th percentiles following sorting of the data, has been recommended for general use. The performance of the Box-Cox transformation can be improved by introducing an additional parameter representing the origin of transformation. In simulations, the confidence intervals (CIs) of reference limits (RLs) calculated by the parametric method were narrower than those calculated by the non-parametric approach. However, the margin of difference was rather small owing to additional variability in parametrically determined RLs introduced by estimation of parameters for the Box-Cox transformation. The parametric calculation method may have an advantage over the non-parametric method in allowing identification and exclusion of extreme values during RI computation.
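The two computation routes discussed above can be sketched as follows; the log transform stands in for the Box-Cox family (lambda = 0), and the data are synthetic:

```python
# Non-parametric RI: 2.5th/97.5th percentiles of the sorted data.
# Parametric RI: mean +/- 1.96 SD after a Gaussianizing transform,
# back-transformed; log is used here as the lambda = 0 Box-Cox case.
import numpy as np

def nonparametric_ri(x):
    return np.percentile(x, [2.5, 97.5])

def parametric_ri_log(x):
    z = np.log(x)                       # Box-Cox with lambda = 0
    m, s = z.mean(), z.std(ddof=1)
    return np.exp(m - 1.96 * s), np.exp(m + 1.96 * s)

rng = np.random.default_rng(0)
x = rng.lognormal(mean=1.0, sigma=0.25, size=4000)  # synthetic analyte values
lo_np, hi_np = nonparametric_ri(x)
lo_p, hi_p = parametric_ri_log(x)
```

On data that really are log-Gaussian the two intervals agree closely; the parametric limits are smoother estimates, as the abstract notes, at the cost of estimating the transform.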

  5. Decomposing Cost Efficiency in Regional Long-term Care Provision in Japan

    PubMed Central

    Yamauchi, Yasuhiro

    2016-01-01

    Many developed countries face a growing need for long-term care provision because of population ageing. Japan is one such example, given its population's longevity and low birth rate. In this study, we examine the efficiency of Japan's regional long-term care system in FY2010 by performing a data envelopment analysis, a non-parametric frontier approach, on prefectural data and separating cost efficiency into technical, allocative, and price efficiencies under different average unit costs across regions. In doing so, we elucidate the structure of cost inefficiency by incorporating a method for restricting weight flexibility to avoid unrealistic concerns arising from zero optimal weight. The results indicate that technical inefficiency accounts for the highest share of losses, followed by price inefficiency and allocation inefficiency. Moreover, the majority of technical inefficiency losses stem from labor costs, particularly those for professional caregivers providing institutional services. We show that the largest share of allocative inefficiency losses can also be traced to labor costs for professional caregivers providing institutional services, while the labor provision of in-home care services shows an efficiency gain. However, although none of the prefectures gains efficiency by increasing the number of professional caregivers for institutional services, quite a few prefectures would gain allocative efficiency by increasing capital inputs for institutional services. These results indicate that preferred policies for promoting efficiency might vary from region to region, and thus, policy implications should be drawn with care. PMID:26493427

  6. The Problem of Multiple Criteria Selection of the Surface Mining Haul Trucks

    NASA Astrophysics Data System (ADS)

    Bodziony, Przemysław; Kasztelewicz, Zbigniew; Sawicki, Piotr

    2016-06-01

Vehicle transport is a dominant type of technological process in rock mines, and its profitability is strictly dependent on the overall cost of its exploitation, especially on diesel oil consumption. Thus, a rational design of a transportation system based on haul trucks should result from a thorough analysis of technical and economic issues, including both the cost of purchase and that of further exploitation, which have a crucial impact on the cost of mineral extraction. Moreover, off-highway trucks should be selected with respect to all specific exploitation conditions and even the user's preferences and experience. In this paper, a universal family of evaluation criteria is developed and an evaluation method is applied to the haul truck selection process for specific exploitation conditions in surface mining. The methodology presented in the paper is based on the principles of multiple criteria decision aiding (MCDA), using one of the ranking methods, ELECTRE III. The applied methodology allows for ranking of alternative solutions (variants) on the considered set of haul trucks. The result of the research is a universal methodology that may consequently be applied in other surface mines with similar exploitation parameters.

  7. Tremor Detection Using Parametric and Non-Parametric Spectral Estimation Methods: A Comparison with Clinical Assessment

    PubMed Central

    Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H.; Maurits, Natasha M.

    2016-01-01

    In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a better insight into tremor. Typically, routine clinical assessment of accelerometry and electromyography data involves visual inspection by clinicians and occasionally computational analysis to obtain objective characteristics of tremor. However, for some tremor disorders these characteristics may be different during daily activity. This variability in presentation between the clinic and daily life makes a differential diagnosis more difficult. A long-term recording of tremor by accelerometry and/or electromyography in the home environment could help to give a better insight into the tremor disorder. However, an evaluation of such recordings using routine clinical standards would take too much time. We evaluated a range of techniques that automatically detect tremor segments in accelerometer data, as accelerometer data is more easily obtained in the home environment than electromyography data. Time can be saved if clinicians only have to evaluate the tremor characteristics of segments that have been automatically detected in longer daily activity recordings. We tested four non-parametric methods and five parametric methods on clinical accelerometer data from 14 patients with different tremor disorders. The consensus between two clinicians regarding the presence or absence of tremor on 3943 segments of accelerometer data was employed as reference. The nine methods were tested against this reference to identify their optimal parameters. Non-parametric methods generally performed better than parametric methods on our dataset when optimal parameters were used. 
However, one parametric method, employing the high frequency content of the tremor bandwidth under consideration (High Freq) performed similarly to non-parametric methods, but had the highest recall values, suggesting that this method could be employed for automatic tremor detection. PMID:27258018
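A toy version of a non-parametric detector of the kind compared above: the periodogram of a segment is computed with an FFT, and the segment is flagged when a typical tremor band carries a large share of the power. The band limits and power-share threshold are illustrative assumptions, not values from the study:

```python
import numpy as np

def is_tremor(segment, fs, band=(4.0, 12.0), share=0.3):
    """Flag a segment when the tremor band holds >= `share` of signal power."""
    seg = segment - segment.mean()
    psd = np.abs(np.fft.rfft(seg)) ** 2            # raw periodogram
    freqs = np.fft.rfftfreq(seg.size, d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return float(psd[in_band].sum() / psd.sum()) >= share

fs = 100.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
tremor = np.sin(2 * np.pi * 6.0 * t) + 0.3 * rng.standard_normal(t.size)  # 6 Hz "tremor"
rest = 0.3 * rng.standard_normal(t.size)                                  # noise only
```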

  8. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver L.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and offer increased transparency and fidelity by offering two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. 
By incorporating well-established CERs with preliminary approaches to addressing these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth scientists seeking to leverage the capabilities of multiple spacecraft working in support of a common goal.
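As a hedged illustration of a parametric cost estimating relationship (CER) of the kind aggregated by the CR module, the sketch below uses a made-up power-law CER plus a Crawford-style learning curve across recurring units; none of the coefficients are TAT-C values:

```python
import math

def cer_cost(mass_kg, a=1.2, b=0.8):
    """First-unit cost (arbitrary $M) from a hypothetical mass-based power-law CER."""
    return a * mass_kg ** b

def constellation_cost(mass_kg, n_units, learning=0.9):
    """Total recurring cost of n_units identical spacecraft, 90% learning curve."""
    slope = math.log2(learning)        # Crawford unit-learning-curve exponent
    t1 = cer_cost(mass_kg)
    return sum(t1 * i ** slope for i in range(1, n_units + 1))

single = cer_cost(500)                 # one 500 kg spacecraft
constellation = constellation_cost(500, 8)
```

The learning curve is one reason constellation cost grows sublinearly with the number of spacecraft, which is exactly the regime single-satellite cost models handle poorly.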

  9. The Design-To-Cost Manifold

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1990-01-01

Design-to-cost is a popular technique for controlling costs. Although qualitative techniques exist for implementing design to cost, quantitative methods are sparse. In the launch vehicle and spacecraft engineering process, the question of whether to minimize mass usually arises. The lack of quantification of this issue leads to arguments on both sides. This paper presents a mathematical technique that quantifies both the design-to-cost process and the mass/complexity issue. Parametric cost analysis generates and applies mathematical formulas called cost estimating relationships. In their most common forms, they are continuous and differentiable. This property permits the application of the mathematics of differentiable manifolds. Although the terminology sounds formidable, the application of the techniques requires only a knowledge of linear algebra and ordinary differential equations, common subjects in undergraduate scientific and engineering curricula. When the cost c is expressed as a differentiable function of n system metrics, setting the cost c to a constant generates an (n-1)-dimensional subspace of the space of system metrics such that any set of metric values in that space satisfies the constant design-to-cost criterion. This space is a differentiable manifold to which all mathematical properties of a differentiable manifold apply. One important property is that an easily implemented system of ordinary differential equations exists which permits optimization of any function of the system metrics, mass for example, over the design-to-cost manifold. A dual set of equations defines the directions of maximum and minimum cost change. A simplified approximation of the PRICE H(TM) production cost model is used to generate this set of differential equations over [mass, complexity] space. The equations are solved in closed form to obtain the one-dimensional design-to-cost trade and design-for-cost spaces.
Preliminary results indicate that cost is relatively insensitive to changes in mass and that the reduction of complexity, both in the manufacturing process and of the spacecraft, is dominant in reducing cost.
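A numeric sketch of the constant-cost manifold for a toy two-metric CER c = a·m^b·x^d over [mass, complexity] space (illustrative coefficients, not the PRICE H model): fixing c generates a one-dimensional trade curve m(x), along which complexity can be exchanged for mass at constant cost.

```python
import numpy as np

a, b, d = 2.0, 0.7, 1.5               # illustrative CER coefficients

def cost(m, x):
    """Toy cost estimating relationship over mass m and complexity x."""
    return a * m**b * x**d

c0 = cost(100.0, 1.0)                 # fix the design-to-cost level
xs = np.linspace(0.5, 2.0, 50)        # complexity range
ms = (c0 / (a * xs**d)) ** (1.0 / b)  # closed-form trade curve m(x) on the manifold

# every point on the curve costs the same: reducing complexity buys mass margin
assert np.allclose(cost(ms, xs), c0)
```

With d > b, the curve is steep in x: small complexity reductions free up large mass allowances, consistent with the abstract's conclusion that complexity, not mass, dominates cost.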

  10. Task three: Report: STDN Antenna and preamplifier cost tradeoff study. [combinations of antennas and preamplifiers for several communication environments

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The general goal of this task, STDN Antenna and Preamplifier G/T Study, was to determine cost-effective combinations of antennas and preamplifiers for several sets of conditions for frequency, antenna elevation angle, and rain. The output of the study includes design curves and tables which indicate the best choice of antenna size and preamplifier type to provide a given G/T performance. The report indicates how to evaluate the cost effectiveness of proposed improvements to a given station. Certain parametric variations are presented to emphasize the improvement available by reducing RF losses and improving the antenna feed.

  11. Adaptive parametric model order reduction technique for optimization of vibro-acoustic models: Application to hearing aid design

    NASA Astrophysics Data System (ADS)

    Creixell-Mediante, Ester; Jensen, Jakob S.; Naets, Frank; Brunskog, Jonas; Larsen, Martin

    2018-06-01

    Finite Element (FE) models of complex structural-acoustic coupled systems can require a large number of degrees of freedom in order to capture their physical behaviour. This is the case in the hearing aid field, where acoustic-mechanical feedback paths are a key factor in the overall system performance and modelling them accurately requires a precise description of the strong interaction between the light-weight parts and the internal and surrounding air over a wide frequency range. Parametric optimization of the FE model can be used to reduce the vibroacoustic feedback in a device during the design phase; however, it requires solving the model iteratively for multiple frequencies at different parameter values, which becomes highly time consuming when the system is large. Parametric Model Order Reduction (pMOR) techniques aim at reducing the computational cost associated with each analysis by projecting the full system into a reduced space. A drawback of most of the existing techniques is that the vector basis of the reduced space is built at an offline phase where the full system must be solved for a large sample of parameter values, which can also become highly time consuming. In this work, we present an adaptive pMOR technique where the construction of the projection basis is embedded in the optimization process and requires fewer full system analyses, while the accuracy of the reduced system is monitored by a cheap error indicator. The performance of the proposed method is evaluated for a 4-parameter optimization of a frequency response for a hearing aid model, evaluated at 300 frequencies, where the objective function evaluations become more than one order of magnitude faster than for the full system.
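The projection idea can be sketched with a plain Galerkin reduction: solve the full system at a few training frequencies, orthonormalize the snapshots into a basis, and then solve only a tiny reduced system at new frequencies. The system below is a random symmetric stand-in for a real FE vibro-acoustic model, not the hearing-aid model itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
K = np.diag(np.linspace(1.0, 5.0, n)) + 0.01 * rng.standard_normal((n, n))
K = 0.5 * (K + K.T)                        # symmetric "stiffness"
M = np.eye(n)                              # "mass"
f = rng.standard_normal(n)                 # load vector

# offline: full solves at a few training frequencies give the reduction basis
train_w = [0.1, 0.4, 0.7]
snaps = np.array([np.linalg.solve(K - w**2 * M, f) for w in train_w]).T
V, _ = np.linalg.qr(snaps)                 # orthonormal basis, n x 3

# online: solve only a 3x3 system at a new frequency and lift back
w = 0.5
Kr, Mr, fr = V.T @ K @ V, V.T @ M @ V, V.T @ f
x_red = V @ np.linalg.solve(Kr - w**2 * Mr, fr)
x_full = np.linalg.solve(K - w**2 * M, f)
rel_err = np.linalg.norm(x_full - x_red) / np.linalg.norm(x_full)
```

The adaptive scheme in the paper goes further by growing this basis inside the optimization loop and monitoring an error indicator, rather than fixing the snapshots offline.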

  12. Yadage and Packtivity - analysis preservation using parametrized workflows

    NASA Astrophysics Data System (ADS)

    Cranmer, Kyle; Heinrich, Lukas

    2017-10-01

    Preserving data analyses produced by the collaborations at LHC in a parametrized fashion is crucial in order to maintain reproducibility and re-usability. We argue for a declarative description in terms of individual processing steps - “packtivities” - linked through a dynamic directed acyclic graph (DAG) and present an initial set of JSON schemas for such a description and an implementation - “yadage” - capable of executing workflows of analysis preserved via Linux containers.
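A generic stand-in for the packtivity/DAG idea (not the yadage JSON schema): processing steps declared as named callables with dependencies, executed in a topological order using the Python standard library:

```python
# Each "packtivity" is a (callable, dependencies) pair; the workflow engine
# orders them with graphlib and threads results through a shared dict.
from graphlib import TopologicalSorter

steps = {
    "select": (lambda data: [x for x in data["raw"] if x > 0], ["raw"]),
    "scale":  (lambda data: [2 * x for x in data["select"]], ["select"]),
    "sum":    (lambda data: sum(data["scale"]), ["scale"]),
}

def run(dag, inputs):
    results = dict(inputs)
    order = TopologicalSorter({k: set(v[1]) - set(inputs) for k, v in dag.items()})
    for name in order.static_order():      # dependencies always come first
        results[name] = dag[name][0](results)
    return results

out = run(steps, {"raw": [3, -1, 2]})
```

In yadage the step bodies would be container invocations rather than lambdas, but the declarative DAG-plus-scheduler structure is the same.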

  13. Component isolation for multi-component signal analysis using a non-parametric gaussian latent feature model

    NASA Astrophysics Data System (ADS)

    Yang, Yang; Peng, Zhike; Dong, Xingjian; Zhang, Wenming; Clifton, David A.

    2018-03-01

A challenge in analysing non-stationary multi-component signals is to isolate nonlinearly time-varying signals, especially when they overlap in the time-frequency plane. In this paper, a framework integrating time-frequency analysis-based demodulation and a non-parametric Gaussian latent feature model is proposed to isolate and recover components of such signals. The former aims to remove high-order frequency modulation (FM) such that the latter is able to infer demodulated components while simultaneously discovering the number of target components. The proposed method is effective in isolating multiple components that have the same FM behavior. In addition, the results show that the proposed method is superior to a generalised-demodulation method based on singular value decomposition, a parametric time-frequency analysis method based on filtering, and an empirical mode decomposition-based method in recovering the amplitude and phase of superimposed components.

  14. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    PubMed

    Park, Saerom; Lee, Jaewook; Son, Youngdoo

    2016-01-01

Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed four state-of-the-art nonparametric machine learning models: neural networks, a Bayesian neural network, a Gaussian process, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data of the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.
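One of the four models named above, Gaussian-process regression, can be sketched in a few lines; the RBF kernel, hyperparameters, and data here are synthetic stand-ins, not the Bloomberg transaction features:

```python
import numpy as np

def rbf(A, B, length=1.0):
    """Squared-exponential kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_predict(X, y, Xs, noise=1e-3):
    """GP posterior mean at test points Xs given noisy observations (X, y)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    return rbf(Xs, X) @ np.linalg.solve(K, y)

X = np.linspace(0, 3, 30)[:, None]     # stand-in feature (e.g. normalized order size)
y = np.sin(X).ravel()                  # stand-in "impact cost" response
pred = gp_predict(X, y, X)
```

Being kernel-based, the model is "versatile in the number of variables" in the sense the abstract uses: adding features only changes the kernel's input dimension, not the model structure.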

  15. Predicting Market Impact Costs Using Nonparametric Machine Learning Models

    PubMed Central

    Park, Saerom; Lee, Jaewook; Son, Youngdoo

    2016-01-01

Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed four state-of-the-art nonparametric machine learning models: neural networks, a Bayesian neural network, a Gaussian process, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data of the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance. PMID:26926235

  16. Team X Report #1401: Exoplanet Coronagraph STDT Study 2013-06

    NASA Technical Reports Server (NTRS)

    Warfield, Keith

    2013-01-01

    This document is intended to stimulate discussion of the topic described. All technical and cost analyses are preliminary. This document is not a commitment to work, but is a precursor to a formal proposal if it generates sufficient mutual interest. The data contained in this document may not be modified in any way. Cost estimates described or summarized in this document were generated as part of a preliminary, first-order cost class identification as part of an early trade space study, are based on JPL-internal parametric cost modeling, assume a JPL in-house build, and do not constitute a commitment on the part of JPL or Caltech. JPL and Team X add cost reserves for development and operations. Unadjusted estimate totals and cost reserve allocations would be revised as needed in future more-detailed studies as appropriate for the specific cost-risks for a given mission concept.

  17. When Unified Teacher Pay Scales Meet Differential Alternative Returns

    ERIC Educational Resources Information Center

    Walsh, Patrick

    2014-01-01

    This paper quantifies the extent to which unified teacher pay scales and differential alternatives produce opportunity costs that are asymmetric in math and verbal skills. Data from the Baccalaureate and Beyond 1997 and 2003 follow-ups are used to estimate a fully parametric, selection-corrected wage equation for nonteachers, which is then used to…

  18. Manufacturing information system

    NASA Astrophysics Data System (ADS)

    Allen, D. K.; Smith, P. R.; Smart, M. J.

    1983-12-01

The size and cost of manufacturing equipment have made it extremely difficult to perform realistic modeling and simulation of the manufacturing process in university research laboratories. Likewise, these size and cost factors, coupled with the many uncontrolled variables of the production situation, have made it difficult to perform adequate manufacturing research even in the industrial setting. Only the largest companies can afford manufacturing research laboratories; research results are often held proprietary and seldom find their way into the university classroom to aid in the education and training of new manufacturing engineers. It is the purpose of this research to continue the development of miniature prototype equipment suitable for use in an integrated CAD/CAM laboratory. The equipment being developed is capable of actually performing production operations (e.g. drilling, milling, turning, punching, etc.) on metallic and non-metallic workpieces. The integrated CAD/CAM Mini-Lab integrates high-resolution computer graphics, parametric design, parametric N/C parts programming, CNC machine control, and automated storage and retrieval with robotic materials handling. The availability of miniature CAD/CAM laboratory equipment will provide the basis for intensive laboratory research on manufacturing information systems.

  19. BLIND EXTRACTION OF AN EXOPLANETARY SPECTRUM THROUGH INDEPENDENT COMPONENT ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waldmann, I. P.; Tinetti, G.; Hollis, M. D. J.

    2013-03-20

Blind-source separation techniques are used to extract the transmission spectrum of the hot-Jupiter HD189733b recorded by the Hubble/NICMOS instrument. Such a 'blind' analysis of the data is based on the concept of independent component analysis. The detrending of Hubble/NICMOS data using the sole assumption that non-Gaussian systematic noise is statistically independent from the desired light-curve signals is presented. By not assuming any prior or auxiliary information but the data themselves, it is shown that spectroscopic errors only about 10%-30% larger than those of parametric methods can be obtained for 11 spectral bins with bin sizes of ≈0.09 μm. This represents a reasonable trade-off between a higher degree of objectivity for the non-parametric methods and smaller standard errors for the parametric detrending. Results are discussed in light of previous analyses published in the literature. The fact that three very different analysis techniques yield comparable spectra is a strong indication of the stability of these results.
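A self-contained sketch of the underlying technique, FastICA-style independent component analysis (whitening plus deflationary tanh updates), on a toy two-source mixture standing in for the light-curve/systematics separation; this is a generic ICA implementation, not the authors' pipeline:

```python
import numpy as np

def fastica(X, n_iter=200):
    """Deflationary FastICA with a tanh contrast on mixed signals X (rows)."""
    X = X - X.mean(1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))               # whiten so that E[zz^T] = I
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    n = Z.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        w = np.random.default_rng(i).standard_normal(n)
        for _ in range(n_iter):
            wx = w @ Z                             # fixed-point negentropy update
            w_new = (Z * np.tanh(wx)).mean(1) - (1 - np.tanh(wx) ** 2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)     # deflate against found rows
            w = w_new / np.linalg.norm(w_new)
        W[i] = w
    return W @ Z                                   # estimated sources

t = np.linspace(0, 8 * np.pi, 2000)
s1 = np.sin(t)                                     # toy "light curve"
s2 = np.sign(np.sin(3.1 * t))                      # toy non-Gaussian systematics
X = np.array([[0.7, 0.3], [0.4, 0.6]]) @ np.vstack([s1, s2])
S_hat = fastica(X)                                 # recovered up to sign/order
```

ICA recovers the sources only up to sign, scale, and ordering, which is why the astrophysical application still needs a step to identify which component is the light curve.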

  20. Streamlining the Design Tradespace for Earth Imaging Constellations

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; Hughes, Steven P.; Le Moigne, Jacqueline J.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and offer increased transparency and fidelity by offering two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. 
By incorporating well-established CERs with preliminary approaches to addressing these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth scientists seeking to leverage the capabilities of multiple spacecraft working in support of a common goal.

  1. Assessment of Dimensionality in Social Science Subtest

    ERIC Educational Resources Information Center

    Ozbek Bastug, Ozlem Yesim

    2012-01-01

Most of the literature on dimensionality has focused either on comparing parametric and nonparametric dimensionality detection procedures or on showing the effectiveness of one type of procedure. There is no known study showing how to perform a combined parametric and nonparametric dimensionality analysis on real data. The current study is aimed to fill…

  2. Control of dispatch dynamics for lowering the cost of distributed generation in the built environment

    NASA Astrophysics Data System (ADS)

    Flores, Robert Joseph

Distributed generation can provide many benefits over traditional central generation, such as increased reliability and efficiency while reducing emissions. Despite these potential benefits, distributed generation is generally not purchased unless it reduces energy costs. Economic dispatch strategies can be designed such that distributed generation technologies reduce overall facility energy costs. In this thesis, a microturbine generator is dispatched using different economic control strategies, reducing the cost of energy to the facility. Several industrial and commercial facilities are simulated using acquired electrical, heating, and cooling load data. Industrial and commercial utility rate structures are modeled after Southern California Edison and Southern California Gas Company tariffs and used to find energy costs for the simulated buildings and corresponding microturbine dispatch. Using these control strategies, building models, and utility rate models, a parametric study examining various generator characteristics is performed. An economic assessment of the distributed generation is then performed for both the microturbine generator and the parametric study. Without the ability to export electricity to the grid, the economic value of distributed generation is limited to reducing the individual costs that make up the cost of energy for a building. Any economic dispatch strategy must be built to reduce these individual costs. While the ability of distributed generation to reduce cost depends on factors such as electrical efficiency and operations and maintenance cost, the building energy demand being serviced has a strong effect on cost reduction. Buildings with low load factors can accept distributed generation with higher operating costs (low electrical efficiency and/or high operations and maintenance cost) due to the value of demand reduction.
As load factor increases, lower operating cost generators are desired due to a larger portion of the building load being met in an effort to reduce demand. In addition, buildings with large thermal demand have access to the least expensive natural gas, lowering the cost of operating distributed generation. Recovery of exhaust heat from distributed generation reduces cost only if the building's thermal demand coincides with the electrical demand. Capacity limits exist where annual savings from operation of distributed generation decrease if further generation is installed. For low operating cost generators, the approximate limit is the average building load. This limit decreases as operating costs increase. In addition, a high capital cost of distributed generation can be accepted if generator operating costs are low. As generator operating costs increase, capital cost must decrease if a positive economic performance is desired.
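
The threshold logic described above can be sketched as a toy dispatch rule: run the generator whenever its marginal cost of electricity (fuel burned per kWh generated plus operations and maintenance) undercuts the utility import price. This is a minimal illustration, not the thesis's controller; all prices, efficiencies, and capacities below are hypothetical.

```python
# Illustrative threshold dispatch for one hour. Run the generator only when
# its marginal cost of electricity is below the utility import price.
# All numbers are hypothetical.

def marginal_cost(gas_price_per_kwh, electrical_efficiency, om_per_kwh):
    """$/kWh(e) to generate: fuel per kWh(e) delivered plus O&M."""
    return gas_price_per_kwh / electrical_efficiency + om_per_kwh

def dispatch(load_kw, utility_price, gen_capacity_kw, gen_marginal_cost):
    """Return (generated_kw, imported_kw) for one hour."""
    if gen_marginal_cost < utility_price:
        generated = min(load_kw, gen_capacity_kw)
    else:
        generated = 0.0
    return generated, load_kw - generated

mc = marginal_cost(gas_price_per_kwh=0.03, electrical_efficiency=0.28,
                   om_per_kwh=0.015)
gen, imp = dispatch(load_kw=180.0, utility_price=0.15,
                    gen_capacity_kw=65.0, gen_marginal_cost=mc)
```

With these hypothetical numbers the microturbine runs at full capacity and the building imports the remainder, which is the demand-reduction behavior the abstract credits for the value of distributed generation.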

  3. Summarizing techniques that combine three non-parametric scores to detect disease-associated 2-way SNP-SNP interactions.

    PubMed

    Sengupta Chattopadhyay, Amrita; Hsiao, Ching-Lin; Chang, Chien Ching; Lian, Ie-Bin; Fann, Cathy S J

    2014-01-01

Identifying susceptibility genes that influence complex diseases is extremely difficult because loci often influence the disease state through genetic interactions. Numerous approaches to detect disease-associated SNP-SNP interactions have been developed, but none consistently generates high-quality results under different disease scenarios. Using summarizing techniques to combine a number of existing methods may provide a solution to this problem. Here we used three popular non-parametric methods (Gini, absolute probability difference (APD), and entropy) to develop two novel summary scores, namely the principal component score (PCS) and the Z-sum score (ZSS), with which to predict disease-associated genetic interactions. We used a simulation study to compare performance of the non-parametric scores, the summary scores, the scaled-sum score (SSS; used in polymorphism interaction analysis (PIA)), and multifactor dimensionality reduction (MDR). The non-parametric methods achieved high power, but no non-parametric method outperformed all others under a variety of epistatic scenarios. PCS and ZSS, however, outperformed MDR. PCS, ZSS, and SSS displayed controlled type-I errors (<0.05) compared to the individual Gini, APD, and entropy scores (GS, APDS, ES; >0.05). A real data study using the Genetic Analysis Workshop 16 (GAW16) rheumatoid arthritis dataset identified a number of interesting SNP-SNP interactions. © 2013 Elsevier B.V. All rights reserved.
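
The Z-sum idea can be illustrated in a few lines: standardize each non-parametric score across all candidate SNP pairs and sum the z-scores, so that pairs ranked highly by all three methods stand out. This is a hedged sketch of the general construction, not the paper's exact scaling, and the scores below are synthetic.

```python
import numpy as np

# Z-sum-style combination of three per-pair interaction scores.
# The raw Gini, APD, and entropy scores here are synthetic stand-ins.
rng = np.random.default_rng(0)
n_pairs = 100
gini = rng.random(n_pairs)
apd = rng.random(n_pairs)
entropy = rng.random(n_pairs)

def zsum(*scores):
    """Standardize each score across pairs, then sum the z-scores."""
    z = [(s - s.mean()) / s.std() for s in scores]
    return np.sum(z, axis=0)

zss = zsum(gini, apd, entropy)
top_pair = int(np.argmax(zss))  # pair with strongest combined evidence
```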

  4. Vehicle Sketch Pad: a Parametric Geometry Modeler for Conceptual Aircraft Design

    NASA Technical Reports Server (NTRS)

    Hahn, Andrew S.

    2010-01-01

The conceptual aircraft designer is faced with a dilemma: how to strike the best balance between productivity and fidelity. Historically, handbook methods have required only the coarsest of geometric parameterizations in order to perform analysis. Increasingly, there has been a drive to upgrade analysis methods, but these require considerably more precise and detailed geometry. Attempts have been made to use computer-aided design packages to fill this void, but their cost and steep learning curve have made them unwieldy at best. Vehicle Sketch Pad (VSP) has been developed over several years to better fill this need. While no substitute for the full feature set of computer-aided design packages, VSP allows even novices to quickly become proficient in defining three-dimensional, watertight aircraft geometries that are adequate for producing multi-disciplinary meta-models for higher order analysis methods, wind tunnel and display models, as well as a starting point for animation models. This paper will give an overview of the development and future course of VSP.

  5. On equivalent parameter learning in simplified feature space based on Bayesian asymptotic analysis.

    PubMed

    Yamazaki, Keisuke

    2012-07-01

Parametric models for sequential data, such as hidden Markov models, stochastic context-free grammars, and linear dynamical systems, are widely used in time-series analysis and structural data analysis. Computation of the likelihood function is one of the primary considerations in many learning methods. Iterative calculation of the likelihood, as required in model selection, remains time-consuming even though effective algorithms based on dynamic programming exist. The present paper studies parameter learning in a simplified feature space to reduce the computational cost. Simplifying data is a common technique in feature selection and dimension reduction, though an oversimplified space causes adverse learning results. Therefore, we mathematically investigate a condition on the feature map under which the estimated parameters have an asymptotically equivalent convergence point; such a map is referred to as the vicarious map. As a demonstration of finding vicarious maps, we consider a feature space that limits the length of the data, and derive a necessary length for parameter learning in hidden Markov models. Copyright © 2012 Elsevier Ltd. All rights reserved.
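
For hidden Markov models, the dynamic-programming likelihood computation referred to above is the forward algorithm. A minimal sketch with a hypothetical two-state, two-symbol model (all parameters invented for illustration):

```python
import numpy as np

# Forward algorithm for a discrete HMM: computes P(obs | model) in
# O(T * n_states^2) by dynamic programming. Toy 2-state, 2-symbol model.
pi = np.array([0.6, 0.4])               # initial state distribution
A = np.array([[0.7, 0.3], [0.2, 0.8]])  # transition matrix
B = np.array([[0.9, 0.1], [0.3, 0.7]])  # emission matrix B[state, symbol]

def likelihood(obs):
    alpha = pi * B[:, obs[0]]           # initialize with first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate and absorb next symbol
    return float(alpha.sum())

p = likelihood([0, 1, 0])
```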

  6. Parametrically excited helicopter ground resonance dynamics with high blade asymmetries

    NASA Astrophysics Data System (ADS)

    Sanches, L.; Michon, G.; Berlioz, A.; Alazard, D.

    2012-07-01

The present work is aimed at verifying the influence of high asymmetries in the variation of in-plane lead-lag stiffness of one blade on the ground resonance phenomenon in helicopters. The periodic equations of motion are analyzed using Floquet's theory, and the boundaries of instability are predicted. The stability chart obtained as a function of the asymmetry parameters and rotor speed reveals a complex evolution of critical zones and the existence of bifurcation points at low rotor speed values. Additionally, it is known that, when treated as parametric excitations, periodic terms may cause parametric resonances in dynamic systems, some of which can become unstable. Therefore, the helicopter is later considered as a parametrically excited system and the equations are treated analytically by applying the Method of Multiple Scales (MMS). A stability analysis is used to verify the existence of unstable parametric resonances with first- and second-order sets of equations. The results are compared and validated with those obtained by Floquet's theory. Moreover, an explanation is given for the presence of unstable motion at low rotor speeds due to parametric instabilities of the second order.

  7. A Bayesian goodness of fit test and semiparametric generalization of logistic regression with measurement data.

    PubMed

    Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E

    2013-06-01

    Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. 
© 2013, The International Biometric Society.

  8. Model selection criterion in survival analysis

    NASA Astrophysics Data System (ADS)

    Karabey, Uǧur; Tutkun, Nihal Ata

    2017-07-01

Survival analysis deals with the time until occurrence of an event of interest, such as death, recurrence of an illness, equipment failure, or divorce. There are various survival models with semi-parametric or parametric approaches used in the medical, natural, or social sciences. The decision on the most appropriate model for the data is an important part of the analysis. In the literature, the Akaike information criterion or the Bayesian information criterion is used to select among nested models. In this study, the behavior of these information criteria is discussed for a real data set.
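
The two criteria mentioned above are simple functions of the maximized log-likelihood: AIC = 2k - 2 ln L and BIC = k ln n - 2 ln L, with the lower value preferred. A small sketch with hypothetical log-likelihoods shows how the criteria can disagree, since BIC penalizes extra parameters more heavily for large n:

```python
import math

# AIC and BIC model selection. The log-likelihoods below are hypothetical
# fits of two survival models to n = 120 event times, chosen to show the
# criteria disagreeing; they are not from the paper's data.

def aic(log_lik, k):
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    return k * math.log(n) - 2 * log_lik

n = 120
models = {"exponential": (-407.0, 1),   # (log-likelihood, n_params)
          "weibull": (-405.0, 2)}
best_aic = min(models, key=lambda m: aic(*models[m]))
best_bic = min(models, key=lambda m: bic(models[m][0], models[m][1], n))
# AIC prefers the better-fitting Weibull; BIC's stiffer penalty for the
# extra parameter tips the choice to the exponential model.
```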

  9. Likert scales, levels of measurement and the "laws" of statistics.

    PubMed

    Norman, Geoff

    2010-12-01

Reviewers of research reports frequently criticize the choice of statistical methods. While some of these criticisms are well-founded, the use of parametric methods such as analysis of variance, regression, and correlation is frequently faulted because: (a) the sample size is too small, (b) the data may not be normally distributed, or (c) the data are from Likert scales, which are ordinal, so parametric statistics cannot be used. In this paper, I dissect these arguments and show that many studies, dating back to the 1930s, consistently show that parametric statistics are robust with respect to violations of these assumptions. Hence, challenges like those above are unfounded, and parametric methods can be utilized without concern for "getting the wrong answer".
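
The robustness claim can be checked directly by simulation: draw two groups from the same five-point Likert distribution and count how often Student's t-test rejects at the 5% level. If the parametric test is robust to ordinal data, the false-positive rate stays near nominal. A minimal sketch (the distribution weights are arbitrary):

```python
import random
import statistics

# Type-I-error check for the t-test on ordinal Likert data: both groups
# come from the SAME 5-point distribution, so every rejection is a false
# positive. Distribution weights are arbitrary; seed fixed for repeatability.
random.seed(42)
levels, weights = [1, 2, 3, 4, 5], [0.1, 0.2, 0.3, 0.25, 0.15]

def t_stat(a, b):
    """Pooled-variance two-sample t statistic."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * statistics.variance(a)
           + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / (sp2 * (1/na + 1/nb)) ** 0.5

trials, rejections = 2000, 0
for _ in range(trials):
    a = random.choices(levels, weights, k=30)
    b = random.choices(levels, weights, k=30)
    if abs(t_stat(a, b)) > 2.0017:   # two-sided 5% critical value, 58 df
        rejections += 1
false_positive_rate = rejections / trials   # should sit near 0.05
```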

  10. Overview of NASA's Integrated Design and Engineering Analysis (IDEA) Environment

    NASA Technical Reports Server (NTRS)

    Robinson, Jeffrey S.; Martin, John G.

    2008-01-01

    Historically, the design of subsonic and supersonic aircraft has been divided into separate technical disciplines (such as propulsion, aerodynamics, and structures), each of which performs its design and analysis in relative isolation from the others. This is possible in most cases either because the amount of interdisciplinary coupling is minimal or because the interactions can be treated as linear. The design of hypersonic airbreathing vehicles, like NASA's X-43, is quite the opposite. Such systems are dominated by strong non-linear interactions between disciplines. The design of these systems demands that a multi-disciplinary approach be taken. Furthermore, increased analytical fidelity at the conceptual design phase is highly desirable as many of the non-linearities are not captured by lower fidelity tools. Only when these systems are designed from a true multi-disciplinary perspective can the real performance benefits be achieved and complete vehicle systems be fielded. Toward this end, the Vehicle Analysis Branch at NASA Langley Research Center has been developing the Integrated Design & Engineering Analysis (IDEA) Environment. IDEA is a collaborative environment for parametrically modeling conceptual and preliminary launch vehicle configurations using the Adaptive Modeling Language (AML) as the underlying framework. The environment integrates geometry, configuration, propulsion, aerodynamics, aerothermodynamics, trajectory, closure and structural analysis into a generative, parametric, unified computational model where data is shared seamlessly between the different disciplines. Plans are also in place to incorporate life cycle analysis tools into the environment, which will estimate vehicle operability, reliability and cost. IDEA is currently being funded by NASA's Hypersonics Project, a part of the Fundamental Aeronautics Program within the Aeronautics Research Mission Directorate.
The environment is currently focused around a two-stage-to-orbit configuration with a turbine based combined cycle (TBCC) first stage and reusable rocket second stage. This paper provides an overview of the development of the IDEA environment, a description of the current status and detail of future plans.

  11. Estimating the Life Cycle Cost of Space Systems

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    A space system's Life Cycle Cost (LCC) includes design and development, launch and emplacement, and operations and maintenance. Each of these cost factors is usually estimated separately. NASA uses three different parametric models for the design and development cost of crewed space systems: the commercial PRICE-H space hardware cost model, the NASA-Air Force Cost Model (NAFCOM), and the Advanced Missions Cost Model (AMCM). System mass is an important parameter in all three models. System mass also determines the launch and emplacement cost, which directly depends on the cost per kilogram to launch mass to Low Earth Orbit (LEO). The launch and emplacement cost is the cost to launch to LEO the system itself and also the rockets, propellant, and lander needed to emplace it. The ratio of the total launch mass to payload mass depends on the mission scenario and destination. The operations and maintenance costs include any material and spares provided, the ground control crew, and sustaining engineering. The Mission Operations Cost Model (MOCM) estimates these costs as a percentage of the system development cost per year.
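
The cost structure described above can be sketched as a simple sum of the three factors, with launch and emplacement driven by system mass, the launch-mass-to-payload "gear" ratio, and the cost per kilogram to LEO. Every number below is hypothetical, not an output of PRICE-H, NAFCOM, AMCM, or MOCM.

```python
# Hedged sketch of the LCC structure: development + launch/emplacement
# + operations & maintenance. All inputs are hypothetical illustrations.

def life_cycle_cost(dev_cost, system_mass_kg, cost_per_kg_leo,
                    gear_ratio, ops_fraction_per_year, years):
    """LCC in dollars; launch cost scales with total launch mass."""
    launch = system_mass_kg * gear_ratio * cost_per_kg_leo
    ops = dev_cost * ops_fraction_per_year * years
    return dev_cost + launch + ops

lcc = life_cycle_cost(dev_cost=500e6,          # design & development, $
                      system_mass_kg=2000.0,
                      cost_per_kg_leo=10000.0,  # $/kg to LEO
                      gear_ratio=3.0,           # total launch mass / payload mass
                      ops_fraction_per_year=0.05,
                      years=10)
```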

  12. Coronary artery bypass grafting with minimal versus conventional extracorporeal circulation; an economic analysis.

    PubMed

    Anastasiadis, K; Fragoulakis, V; Antonitsis, P; Maniadakis, N

    2013-10-15

    This study aims to develop a methodological framework for the comparative economic evaluation of Minimal Extracorporeal Circulation (MECC) versus conventional Extracorporeal Circulation (CECC) in patients undergoing coronary artery bypass grafting (CABG) in different healthcare systems. Moreover, we evaluate the cost-effectiveness ratio of alternative comparators in the healthcare settings of Greece, Germany, the Netherlands and Switzerland. The effectiveness data utilized were derived from a recent meta-analysis which incorporated 24 randomized clinical trials. Total therapy cost per patient reflects all resources expensed in delivery of therapy and the management of any adverse events, including drugs, diagnostic tests, materials, devices, blood units, the utilization of operating theaters, intensive care units, and wards. Perioperative mortality was used as the primary health outcome to estimate life years gained in treatment arms. Bias-corrected uncertainty intervals were calculated using the percentile method of non-parametric Monte Carlo simulation. The MECC circuit was more expensive than CECC, with a difference ranging from €180 to €600 depending on the country. However, in terms of total therapy cost per patient, the comparison favored MECC in all countries. Specifically, it was associated with a reduction of €635 in Greece, €297 in Germany, €1590 in the Netherlands and €375 in Switzerland. In terms of effectiveness, the total life-years gained were slightly higher in favor of MECC. Surgery with MECC may be dominant (lower cost and higher effectiveness) compared to CECC in coronary revascularization procedures and therefore it represents an attractive new option relative to conventional extracorporeal circulation for CABG. © 2013.
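
The percentile method named above can be sketched as follows: resample the per-patient cost differences with replacement, recompute the mean each time, and read the uncertainty interval off the empirical percentiles. The cost data below are synthetic, not the trial's.

```python
import random

# Non-parametric bootstrap percentile interval for a mean cost difference.
# The per-patient MECC-minus-CECC differences here are synthetic.
random.seed(1)
cost_diff = [random.gauss(-400.0, 900.0) for _ in range(200)]

def percentile_ci(data, n_boot=5000, alpha=0.05):
    """Percentile bootstrap CI for the mean of `data`."""
    means = []
    for _ in range(n_boot):
        sample = random.choices(data, k=len(data))  # resample with replacement
        means.append(sum(sample) / len(sample))
    means.sort()
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

lo, hi = percentile_ci(cost_diff)
```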

  13. Cost-effectiveness of pembrolizumab versus docetaxel for the treatment of previously treated PD-L1 positive advanced NSCLC patients in the United States.

    PubMed

    Huang, Min; Lou, Yanyan; Pellissier, James; Burke, Thomas; Liu, Frank Xiaoqing; Xu, Ruifeng; Velcheti, Vamsidhar

    2017-02-01

    This analysis aimed to evaluate the cost-effectiveness of pembrolizumab compared with docetaxel in patients with previously treated advanced non-squamous cell lung cancer (NSCLC) with PD-L1 positive tumors (total proportion score [TPS] ≥ 50%). The analysis was conducted from a US third-party payer perspective. A partitioned-survival model was developed using data from patients from the KEYNOTE 010 clinical trial. The model used Kaplan-Meier (KM) estimates of progression-free survival (PFS) and overall survival (OS) from the trial for patients treated with either pembrolizumab 2 mg/kg or docetaxel 75 mg/m2, with extrapolation based on fitted parametric functions and long-term registry data. Quality-adjusted life years (QALYs) were derived based on EQ-5D data from KEYNOTE 010 using a time-to-death approach. Costs of drug acquisition/administration, adverse event management, and clinical management of advanced NSCLC were included in the model. The base-case analysis used a time horizon of 20 years. Costs and health outcomes were discounted at a rate of 3% per year. A series of one-way and probabilistic sensitivity analyses were performed to test the robustness of the results. Base-case results project a mean survival of 2.25 years for PD-L1 positive (TPS ≥50%) patients treated with pembrolizumab. For docetaxel, a mean survival time of 1.07 years was estimated. Expected QALYs were 1.71 and 0.76 for pembrolizumab and docetaxel, respectively. The incremental cost per QALY gained with pembrolizumab vs docetaxel is $168,619/QALY, which is cost-effective in the US using a threshold of 3-times GDP per capita. Sensitivity analyses showed the results to be robust over plausible values of the majority of inputs. Results were most sensitive to extrapolation of overall survival. Pembrolizumab improves survival, increases QALYs, and can be considered as a cost-effective option compared to docetaxel in PD-L1 positive (TPS ≥50%) pre-treated advanced NSCLC patients in the US.
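
The headline figure is an incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental QALYs. Using the QALYs reported above (1.71 vs 0.76), the implied incremental cost can be back-calculated purely for illustration; it is not a number reported by the study.

```python
# ICER = (cost_new - cost_old) / (QALY_new - QALY_old).
# QALYs are the base-case values quoted in the abstract; the incremental
# cost is back-calculated from the reported $168,619/QALY for illustration.

def icer(delta_cost, delta_qaly):
    return delta_cost / delta_qaly

delta_qaly = 1.71 - 0.76            # pembrolizumab vs docetaxel, QALYs
delta_cost = 168619 * delta_qaly    # implied incremental cost, ~$160,188
result = icer(delta_cost, delta_qaly)
```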

  14. Cost-Effectiveness of the Third-Agent Class in Treatment-Naive Human Immunodeficiency Virus-Infected Patients in Portugal

    PubMed Central

    Aragão, Filipa; Vera, José; Vaz Pinto, Inês

    2012-01-01

    Introduction: Current Portuguese HIV treatment guidelines recommend initiating antiretroviral therapy with a regimen composed of two Nucleoside Reverse Transcriptase Inhibitors plus one Non-nucleoside Reverse Transcriptase Inhibitor (2NRTI+NNRTI) or two Nucleoside Reverse Transcriptase Inhibitors plus one boosted protease inhibitor (2NRTI+PI/r). Given the lower daily cost of NNRTI as the third agent when compared to the average daily costs of PI/r, it is relevant to estimate the long term impact of each treatment option in the Portuguese context. Methods: We developed a microsimulation discrete events model for cost-effectiveness analysis of HIV treatment, simulating individual paths from ART initiation to death. Four driving forces determine the course of events: CD4+ cell count, viral load, resistance and adherence. Distributions of time to event are conditional to individuals’ characteristics and past history. Time to event was modeled using parametric survival analysis using Stata 11®. Disease progression was structured according to therapy lines and the model was parameterized with cohort Portuguese observational data. All resources were valued at 2009 prices. The National Health Service’s perspective was assumed considering a lifetime horizon and a 5% annual discount rate. Results: In this analysis, initiating therapy with two Nucleoside Reverse Transcriptase Inhibitors plus one Non-nucleoside Reverse Transcriptase Inhibitor reduces the average number of switches by 17%, saves €19,573 per individual, and increases life expectancy by 1.7 months, making it a dominant strategy in 57% of the simulations when compared to two Nucleoside Reverse Transcriptase Inhibitors plus one boosted protease inhibitor.
Conclusion: This study suggests that, when clinically valid, initiating therapy with two Nucleoside Reverse Transcriptase Inhibitors plus one Non-nucleoside Reverse Transcriptase Inhibitor is a cost-saving strategy and equally effective when compared to two Nucleoside Reverse Transcriptase Inhibitors plus one boosted protease inhibitor as the first regimen. PMID:23028618
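
The parametric survival modeling mentioned in the Methods can be illustrated by inverse-transform sampling from a Weibull survivor function S(t) = exp(-(t/scale)^shape), which is one common way a microsimulation draws individual times to events such as a regimen switch. The shape and scale below are hypothetical, not the study's fitted values.

```python
import math
import random

# Inverse-transform sampling from a Weibull time-to-event distribution:
# solving S(t) = u for t gives t = scale * (-ln u)^(1/shape).
# Shape/scale are hypothetical illustration values (months).
random.seed(7)

def draw_time_to_event(shape, scale):
    u = random.random()                       # uniform draw in [0, 1)
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)

# Simulate time to first regimen switch for 1000 hypothetical patients.
times = [draw_time_to_event(shape=1.3, scale=60.0) for _ in range(1000)]
median_months = sorted(times)[500]            # empirical median, ~45 months
```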

  15. Application of artificial neural network to fMRI regression analysis.

    PubMed

    Misaki, Masaya; Miyauchi, Satoru

    2006-01-15

    We used an artificial neural network (ANN) to detect correlations between event sequences and fMRI (functional magnetic resonance imaging) signals. The layered feed-forward neural network, given a series of events as inputs and the fMRI signal as a supervised signal, performed a non-linear regression analysis. This type of ANN is capable of approximating any continuous function, and thus this analysis method can detect any fMRI signals that correlated with corresponding events. Because of the flexible nature of ANNs, fitting to autocorrelation noise is a problem in fMRI analyses. We avoided this problem by using cross-validation and an early stopping procedure. The results showed that the ANN could detect various responses with different time courses. The simulation analysis also indicated an additional advantage of ANN over non-parametric methods in detecting parametrically modulated responses, i.e., it can detect various types of parametric modulations without a priori assumptions. The ANN regression analysis is therefore beneficial for exploratory fMRI analyses in detecting continuous changes in responses modulated by changes in input values.

  16. Evaluation of grid generation technologies from an applied perspective

    NASA Technical Reports Server (NTRS)

    Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.

    1995-01-01

    An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation, and the use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turnaround time for specific grid generation and CFD projects. It was concluded that no single grid generation methodology is universally suited to all CFD applications, owing to limitations in both grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process with the various grid generation methodologies, including structured, unstructured, and hybrid procedures. The full integration of geometric modeling and grid generation allows implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.

  17. Power flow analysis of two coupled plates with arbitrary characteristics

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1988-01-01

    The limitation of keeping two plates identical is removed and the vibrational power input and output are evaluated for different area ratios, plate thickness ratios, and for different values of the structural damping loss factor for the source plate (the plate with excitation) and the receiver plate. In performing this parametric analysis, the source plate characteristics are kept constant. The purpose of this parametric analysis is to determine the most critical parameters that influence the flow of vibrational power from the source plate to the receiver plate. In the case of the structural damping parametric analysis, the influence of changes in the source plate damping is also investigated. As was done previously, results obtained from the mobility power flow approach are compared to results obtained using a statistical energy analysis (SEA) approach. The significance of the power flow results is discussed, together with a comparison between the SEA results and the mobility power flow results. Furthermore, the benefits that can be derived from using the mobility power flow approach are also examined.

  18. Parametric Net Influx Rate Images of 68Ga-DOTATOC and 68Ga-DOTATATE: Quantitative Accuracy and Improved Image Contrast.

    PubMed

    Ilan, Ezgi; Sandström, Mattias; Velikyan, Irina; Sundin, Anders; Eriksson, Barbro; Lubberink, Mark

    2017-05-01

    68Ga-DOTATOC and 68Ga-DOTATATE are radiolabeled somatostatin analogs used for the diagnosis of somatostatin receptor-expressing neuroendocrine tumors (NETs), and SUV measurements are suggested for treatment monitoring. However, changes in net influx rate (Ki) may better reflect treatment effects than those of the SUV, and accordingly there is a need to compute parametric images showing Ki at the voxel level. The aim of this study was to evaluate parametric methods for computation of parametric Ki images by comparison to volume of interest (VOI)-based methods and to assess image contrast in terms of tumor-to-liver ratio. Methods: Ten patients with metastatic NETs underwent a 45-min dynamic PET examination followed by whole-body PET/CT at 1 h after injection of 68Ga-DOTATOC and 68Ga-DOTATATE on consecutive days. Parametric Ki images were computed using a basis function method (BFM) implementation of the 2-tissue-irreversible-compartment model and the Patlak method using a descending aorta image-derived input function, and mean tumor Ki values were determined for 50% isocontour VOIs and compared with Ki values based on nonlinear regression (NLR) of the whole-VOI time-activity curve. A subsample of healthy liver was delineated in the whole-body and Ki images, and tumor-to-liver ratios were calculated to evaluate image contrast. Correlation (R2) and agreement between VOI-based and parametric Ki values were assessed using regression and Bland-Altman analysis. Results: The R2 between NLR-based and parametric image-based (BFM) tumor Ki values was 0.98 (slope, 0.81) and 0.97 (slope, 0.88) for 68Ga-DOTATOC and 68Ga-DOTATATE, respectively. For Patlak analysis, the R2 between NLR-based and parametric-based (Patlak) tumor Ki was 0.95 (slope, 0.71) and 0.92 (slope, 0.74) for 68Ga-DOTATOC and 68Ga-DOTATATE, respectively. There was no bias between NLR-based and parametric-based Ki values. Tumor-to-liver contrast was 1.6 and 2.0 times higher in the parametric BFM Ki images and 2.3 and 3.0 times higher in the Patlak images than in the whole-body images for 68Ga-DOTATOC and 68Ga-DOTATATE, respectively. Conclusion: A high R2 and agreement between NLR-based and parametric-based Ki values was found, showing that the Ki images are quantitatively accurate. In addition, tumor-to-liver contrast was superior in the parametric Ki images compared with whole-body images for both 68Ga-DOTATOC and 68Ga-DOTATATE. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
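
The Patlak method used above reduces Ki estimation to a linear fit: for an irreversibly trapped tracer, Ct(t)/Cp(t) plotted against the integral of Cp divided by Cp(t) approaches a line whose slope estimates Ki. A sketch on synthetic curves constructed to satisfy the irreversible-uptake model exactly:

```python
import numpy as np

# Patlak graphical analysis on synthetic curves. The tissue curve is built
# as Ct = Ki * integral(Cp) + Vb * Cp, so the Patlak plot is exactly linear
# and the fit should recover Ki and Vb. All curve parameters are synthetic.
t = np.linspace(0.5, 45.0, 90)                 # minutes
Cp = 100.0 * np.exp(-0.15 * t) + 5.0           # synthetic plasma input
int_Cp = np.cumsum(Cp) * (t[1] - t[0])         # crude running integral
Ki_true, Vb = 0.05, 0.10
Ct = Ki_true * int_Cp + Vb * Cp                # irreversible-uptake tissue curve

x = int_Cp / Cp                                # "Patlak time"
y = Ct / Cp
start = 30                                     # fit only the late, linear part
slope, intercept = np.polyfit(x[start:], y[start:], 1)  # slope estimates Ki
```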

  19. [Linkage analysis of susceptibility loci in 2 target chromosomes in pedigrees with paranoid schizophrenia and undifferentiated schizophrenia].

    PubMed

    Zeng, Li-ping; Hu, Zheng-mao; Mu, Li-li; Mei, Gui-sen; Lu, Xiu-ling; Zheng, Yong-jun; Li, Pei-jian; Zhang, Ying-xue; Pan, Qian; Long, Zhi-gao; Dai, He-ping; Zhang, Zhuo-hua; Xia, Jia-hui; Zhao, Jing-ping; Xia, Kun

    2011-06-01

    To investigate the relationship between susceptibility loci in chromosomes 1q21-25 and 6p21-25 and schizophrenia subtypes in a Chinese population. A genomic scan and parametric and non-parametric analyses were performed on 242 individuals from 36 schizophrenia pedigrees, including 19 paranoid schizophrenia and 17 undifferentiated schizophrenia pedigrees, from Henan province of China using 5 microsatellite markers in the chromosome region 1q21-25 and 8 microsatellite markers in the chromosome region 6p21-25, which were the candidates of previous studies. All affected subjects were diagnosed and typed according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revised (DSM-IV-TR; American Psychiatric Association, 2000). All subjects signed informed consent. In chromosome 1, parametric analysis of all 36 pedigrees under the dominant inheritance mode showed that the maximum multi-point heterogeneity LOD (HLOD) score was 1.33 (α = 0.38). The non-parametric analysis and the single-point and multi-point nonparametric linkage (NPL) scores suggested linkage at D1S484, D1S2878, and D1S196. In the 19 paranoid schizophrenia pedigrees, linkage was not observed for any of the 5 markers. In the 17 undifferentiated schizophrenia pedigrees, the multi-point NPL score was 1.60 (P = 0.0367) at D1S484. The single-point NPL score was 1.95 (P = 0.0145) and the multi-point NPL score was 2.39 (P = 0.0041) at D1S2878. Additionally, the multi-point NPL score was 1.74 (P = 0.0255) at D1S196. These same three loci showed suggestive linkage during the integrative analysis of all 36 pedigrees. In chromosome 6, in parametric linkage analysis under the dominant and recessive inheritance modes and in non-parametric linkage analysis of all 36 pedigrees and the 17 undifferentiated schizophrenia pedigrees, linkage was not observed for any of the 8 markers. In the 19 paranoid schizophrenia pedigrees, parametric analysis showed that under the recessive inheritance mode the maximum single-point HLOD score was 1.26 (α = 0.40) and the multi-point HLOD was 1.12 (α = 0.38) at D6S289 in chromosome 6p23. In non-parametric analysis, the single-point NPL score was 1.52 (P = 0.0402) and the multi-point NPL score was 1.92 (P = 0.0206) at D6S289. Susceptibility genes correlated with the undifferentiated schizophrenia pedigrees at the D1S484, D1S2878, and D1S196 loci, and with the paranoid schizophrenia pedigrees at the D6S289 locus, are likely present in chromosome regions 1q23.3 and 1q24.2 and in chromosome region 6p23, respectively.

  20. Small-window parametric imaging based on information entropy for ultrasound tissue characterization

    PubMed Central

    Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean

    2017-01-01

    Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging. PMID:28106118
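
The core operation, mapping the Shannon entropy of backscattered amplitudes inside a small sliding window, can be sketched in one dimension; the window length and bin count below are illustrative, not the study's settings.

```python
import numpy as np

# Sliding-window entropy along a 1-D envelope signal: each output sample is
# the Shannon entropy (bits) of the amplitude histogram in a short window.
# The envelope is synthetic; window length and bin count are illustrative.
rng = np.random.default_rng(3)
envelope = np.abs(rng.normal(size=500))   # stand-in for one RF envelope line

def window_entropy(w, bins=16):
    hist, _ = np.histogram(w, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                          # drop empty bins (0 * log 0 := 0)
    return float(-(p * np.log2(p)).sum())

win = 32                                  # "small window", roughly 1 pulse length
entropy_line = np.array([window_entropy(envelope[i:i + win])
                         for i in range(len(envelope) - win + 1)])
```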

  1. Small-window parametric imaging based on information entropy for ultrasound tissue characterization

    NASA Astrophysics Data System (ADS)

    Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean

    2017-01-01

    Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiencies in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome these problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare the performance of conventional statistical parametric imaging (based on the Nakagami distribution) with that of entropy imaging using receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC curve for using small-window entropy imaging to classify tumors was 0.89, which was higher than the 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging.
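
    The small-window entropy computation described in this record can be sketched as follows. The use of Shannon entropy over a windowed amplitude histogram, the window size, and the bin count are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def entropy_map(envelope, win=5, bins=32):
    """Small-window entropy parametric imaging sketch: slide a win x win
    window over the envelope image and assign the Shannon entropy of the
    windowed amplitude histogram to the center pixel."""
    h, w = envelope.shape
    half = win // 2
    out = np.zeros((h, w))
    lo, hi = envelope.min(), envelope.max()
    for i in range(half, h - half):
        for j in range(half, w - half):
            patch = envelope[i - half:i + half + 1, j - half:j + half + 1]
            counts, _ = np.histogram(patch, bins=bins, range=(lo, hi))
            p = counts / counts.sum()
            p = p[p > 0]                      # drop empty bins before log
            out[i, j] = -np.sum(p * np.log2(p))
    return out
```

    A homogeneous region yields near-zero entropy while a speckle region yields a higher value, which is the contrast the parametric image exploits.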

  2. Parametric evaluation of the cost effectiveness of Shuttle payload vibroacoustic test plans

    NASA Technical Reports Server (NTRS)

    Stahle, C. V.; Gongloff, H. R.; Keegan, W. B.; Young, J. P.

    1978-01-01

    Consideration is given to alternate vibroacoustic test plans for sortie and free flyer Shuttle payloads. Statistical decision models for nine test plans provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology is a major step toward the development of a useful tool for the quantitative tailoring of vibroacoustic test programs to sortie and free flyer payloads. A broader application of the methodology is now possible by the use of the OCTAVE computer code.

  3. A parametric determination of transport aircraft price

    NASA Technical Reports Server (NTRS)

    Anderson, J. L.

    1975-01-01

    Cost per unit weight and other airframe and engine cost relations are given. Power equations representing these relations are presented for six airplane groups: general aircraft, turboprop transports, small jet transports, conventional jet transports, wide-body transports, supersonic transports, and for reciprocating, turboshaft, and turbothrust engines. Market prices calculated for a number of aircraft by use of the equations together with the aircraft characteristics are in reasonably good agreement with actual prices. Such price analyses are of value in the assessment of new aircraft devices and designs and potential research and development programs.
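
    Power equations of the kind described are typically fitted by linear regression in log-log space. The sketch below is generic, with hypothetical data; it does not reproduce the paper's actual coefficients.

```python
import numpy as np

def fit_power_cer(weights, prices):
    """Fit a power-equation cost estimating relationship price = a * W**b
    by least squares in log-log space (log price = log a + b * log W)."""
    b, log_a = np.polyfit(np.log(weights), np.log(prices), 1)
    return np.exp(log_a), b
```

    Fitting on known power-law data recovers the coefficients; in practice the residual scatter around the fit indicates how well a single group-level equation represents the aircraft in that group.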

  4. A generalized parametric response mapping method for analysis of multi-parametric imaging: A feasibility study with application to glioblastoma.

    PubMed

    Lausch, Anthony; Yeung, Timothy Pok-Chi; Chen, Jeff; Law, Elton; Wang, Yong; Urbini, Benedetta; Donelli, Filippo; Manco, Luigi; Fainardi, Enrico; Lee, Ting-Yim; Wong, Eugene

    2017-11-01

    Parametric response map (PRM) analysis of functional imaging has been shown to be an effective tool for early prediction of cancer treatment outcomes and may also be well-suited toward guiding personalized adaptive radiotherapy (RT) strategies such as sub-volume boosting. However, the PRM method was primarily designed for analysis of longitudinally acquired pairs of single-parameter image data. The purpose of this study was to demonstrate the feasibility of a generalized parametric response map analysis framework, which enables analysis of multi-parametric data while maintaining the key advantages of the original PRM method. MRI-derived apparent diffusion coefficient (ADC) and relative cerebral blood volume (rCBV) maps acquired at 1 and 3 months post-RT for 19 patients with high-grade glioma were used to demonstrate the algorithm. Images were first co-registered and then standardized using normal tissue image intensity values. Tumor voxels were then plotted in a four-dimensional Cartesian space with coordinate values equal to a voxel's image intensity in each of the image volumes and an origin defined as the multi-parametric mean of normal tissue image intensity values. Voxel positions were orthogonally projected onto a line defined by the origin and a pre-determined response vector. Voxels were subsequently classified as positive, negative, or nil, according to whether their projected positions along the response vector exceeded a threshold distance from the origin. The response vector was selected by identifying the direction in which the standard deviation of tumor image intensity values was maximally different between responding and non-responding patients within a training dataset. Voxel classifications were visualized via familiar three-class response maps, and the fraction of tumor voxels associated with each of the classes was then investigated for predictive utility, analogous to the original PRM method. 
Independent PRM and MPRM analyses of the contrast-enhancing lesion (CEL) and a 1 cm shell of surrounding peri-tumoral tissue were performed. Prediction using tumor volume metrics was also investigated. Leave-one-out cross validation (LOOCV) was used in combination with permutation testing to assess preliminary predictive efficacy and estimate statistically robust P-values. The predictive endpoint was overall survival (OS) greater than or equal to the median OS of 18.2 months. Single-parameter PRM and multi-parametric response maps (MPRMs) were generated for each patient and used to predict OS via the LOOCV. Tumor volume metrics (P ≥ 0.071 ± 0.01) and single-parameter PRM analyses (P ≥ 0.170 ± 0.01) were not found to be predictive of OS within this study. MPRM analysis of the peri-tumoral region but not the CEL was found to be predictive of OS with a classification sensitivity, specificity and accuracy of 80%, 100%, and 89%, respectively (P = 0.001 ± 0.01). The feasibility of a generalized MPRM analysis framework was demonstrated with improved prediction of overall survival compared to the original single-parameter method when applied to a glioblastoma dataset. The proposed algorithm takes the spatial heterogeneity in multi-parametric response into consideration and enables visualization. MPRM analysis of peri-tumoral regions was shown to have predictive potential supporting further investigation of a larger glioblastoma dataset. © 2017 American Association of Physicists in Medicine.
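
    The orthogonal-projection classification step described in this record can be sketched as follows. Function names and the fixed threshold are illustrative; the study's response vector and threshold were derived from training data.

```python
import numpy as np

def mprm_classify(voxels, normal_mean, response_vec, threshold):
    """Generalized multi-parametric response map (MPRM) sketch.

    voxels: (N, d) standardized image-intensity vectors per tumor voxel.
    normal_mean: (d,) multi-parametric mean of normal tissue (the origin).
    response_vec: (d,) pre-determined response direction.
    Returns labels per voxel: +1 (positive), -1 (negative), 0 (nil).
    """
    u = np.asarray(response_vec, float)
    u = u / np.linalg.norm(u)            # unit response direction
    proj = (voxels - normal_mean) @ u    # signed distance along the vector
    labels = np.zeros(len(proj), dtype=int)
    labels[proj > threshold] = 1
    labels[proj < -threshold] = -1
    return labels
```

    The fractions of +1/-1/0 voxels then serve as the scalar summary features investigated for predictive utility.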

  5. A note on the correlation between circular and linear variables with an application to wind direction and air temperature data in a Mediterranean climate

    NASA Astrophysics Data System (ADS)

    Lototzis, M.; Papadopoulos, G. K.; Droulia, F.; Tseliou, A.; Tsiros, I. X.

    2018-04-01

    There are several cases where a circular variable is associated with a linear one. A typical example is wind direction, which is often associated with linear quantities such as air temperature and air humidity. A statistical relationship of this kind can be tested using parametric and non-parametric methods, each of which has its own advantages and drawbacks. This work deals with correlation analysis using both the parametric and the non-parametric procedure on a small set of meteorological data of air temperature and wind direction during a summer period in a Mediterranean climate. Correlations were examined between hourly, daily and maximum-prevailing values, under typical and non-typical meteorological conditions. Both tests indicated a strong correlation between mean hourly wind direction and mean hourly air temperature, whereas mean daily wind direction and mean daily air temperature do not seem to be correlated. In some cases, however, the two procedures were found to give quite dissimilar significance levels for the rejection or not of the null hypothesis of no correlation. The simple statistical analysis presented in this study, appropriately extended to large sets of meteorological data, may be a useful tool for estimating the effects of wind in local climate studies.
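
    The parametric circular-linear correlation is commonly computed as Mardia's coefficient, sketched below; the abstract does not name the exact statistic used, so taking this form is an assumption.

```python
import numpy as np

def circ_lin_corr(theta, x):
    """Mardia's circular-linear correlation coefficient R in [0, 1].

    theta: angles in radians (e.g. wind direction);
    x: linear variable (e.g. air temperature).
    """
    c, s = np.cos(theta), np.sin(theta)
    rxc = np.corrcoef(x, c)[0, 1]   # corr(x, cos theta)
    rxs = np.corrcoef(x, s)[0, 1]   # corr(x, sin theta)
    rcs = np.corrcoef(c, s)[0, 1]   # corr(cos theta, sin theta)
    r2 = (rxc**2 + rxs**2 - 2 * rxc * rxs * rcs) / (1 - rcs**2)
    return np.sqrt(r2)
```

    When the linear variable is an exact function of cos(theta), the coefficient equals one; for independent series it stays near zero.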

  6. A parametric ribcage geometry model accounting for variations among the adult population.

    PubMed

    Wang, Yulong; Cao, Libo; Bai, Zhonghao; Reed, Matthew P; Rupp, Jonathan D; Hoff, Carrie N; Hu, Jingwen

    2016-09-06

    The objective of this study is to develop a parametric ribcage model that can account for morphological variations among the adult population. Ribcage geometries, including 12 pairs of ribs, the sternum, and the thoracic spine, were collected from CT scans of 101 adult subjects through image segmentation, landmark identification (1016 landmarks for each subject), symmetry adjustment, and template mesh mapping (26,180 elements for each subject). Generalized Procrustes analysis (GPA), principal component analysis (PCA), and regression analysis were used to develop a parametric ribcage model, which can predict nodal locations of the template mesh according to age, sex, height, and body mass index (BMI). Two regression models, a quadratic model for estimating the ribcage size and a linear model for estimating the ribcage shape, were developed. The results showed that the ribcage size was dominated by height (p=0.000) and the age-sex interaction (p=0.007), and the ribcage shape was significantly affected by age (p=0.0005), sex (p=0.0002), height (p=0.0064) and BMI (p=0.0000). Along with proper assignment of cortical bone thickness, material properties, and failure properties, this parametric ribcage model can directly serve as the mesh of finite element ribcage models for quantifying the effects of human characteristics on thoracic injury risks. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Comparative Analysis of a Principal Component Analysis-Based and an Artificial Neural Network-Based Method for Baseline Removal.

    PubMed

    Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G

    2016-04-01

    This work presents a non-parametric method based on principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline based on a set of sampled basis vectors obtained from PCA applied over a previously composed continuous-spectra learning matrix. The parametric method, however, uses an ANN to filter out the baseline. Previous studies have demonstrated that this method is one of the most effective for baseline removal. The evaluation of both methods was carried out by using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratios (SBR), signal-to-noise ratios (SNR), and baseline slopes. In addition to demonstrating the utility of the proposed methods and comparing them in a real application, a spectral data set measured from a flame radiation process was used. Several performance metrics such as the correlation coefficient, chi-square value, and goodness-of-fit coefficient were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. © The Author(s) 2016.
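
    A minimal sketch of the PCA-based baseline estimation, assuming a library of continuous baseline spectra and least-squares projection onto its mean plus leading principal components; the number of components and the library are illustrative choices, not the paper's configuration.

```python
import numpy as np

def pca_baseline_remove(spectrum, baseline_library, k=3):
    """Non-parametric PCA baseline removal sketch.

    baseline_library: (m, n) matrix of continuous baseline spectra.
    The baseline of `spectrum` is estimated as its projection onto the
    library mean plus the first k principal components, then subtracted.
    """
    mean = baseline_library.mean(axis=0)
    _, _, Vt = np.linalg.svd(baseline_library - mean, full_matrices=False)
    basis = Vt[:k]                         # (k, n) orthonormal directions
    coeff = basis @ (spectrum - mean)      # least-squares coefficients
    baseline = mean + coeff @ basis
    return spectrum - baseline, baseline
```

    If the true baseline lies in the span of the library, subtraction is exact; narrow emission peaks mostly survive because they project weakly onto smooth baseline components.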

  8. Breast-Lesion Characterization using Textural Features of Quantitative Ultrasound Parametric Maps.

    PubMed

    Sadeghi-Naini, Ali; Suraweera, Harini; Tran, William Tyler; Hadizad, Farnoosh; Bruni, Giancarlo; Rastegar, Rashin Fallah; Curpen, Belinda; Czarnota, Gregory J

    2017-10-20

    This study evaluated, for the first time, the efficacy of quantitative ultrasound (QUS) spectral parametric maps in conjunction with texture-analysis techniques to non-invasively differentiate benign from malignant breast lesions. Ultrasound B-mode images and radiofrequency data were acquired from 78 patients with suspicious breast lesions. QUS spectral-analysis techniques were performed on the radiofrequency data to generate parametric maps of mid-band fit, spectral slope, spectral intercept, spacing among scatterers, average scatterer diameter, and average acoustic concentration. Texture-analysis techniques were applied to determine imaging biomarkers consisting of mean, contrast, correlation, energy and homogeneity features of the parametric maps. These biomarkers were utilized to classify benign versus malignant lesions with leave-one-patient-out cross-validation. Results were compared to histopathology findings from biopsy specimens and radiology reports on MR images to evaluate the accuracy of the technique. Among the biomarkers investigated, one mean-value parameter and 14 textural features demonstrated statistically significant differences (p < 0.05) between the two lesion types. A hybrid biomarker developed using a stepwise feature selection method could classify the lesions with a sensitivity of 96%, a specificity of 84%, and an AUC of 0.97. Findings from this study pave the way towards adapting novel QUS-based frameworks for breast cancer screening and rapid diagnosis in the clinic.
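
    The textural features named here (contrast, correlation, energy, homogeneity, alongside the map mean) are classically derived from a gray-level co-occurrence matrix (GLCM). The sketch below assumes a single one-pixel horizontal offset and a coarse quantization, which may differ from the study's settings.

```python
import numpy as np

def glcm_features(img, levels=8):
    """GLCM texture features of a quantized parametric map (sketch)."""
    # Quantize intensities to `levels` gray levels.
    q = np.floor(img / (img.max() + 1e-12) * levels).astype(int)
    q = np.clip(q, 0, levels - 1)
    # Co-occurrence counts for the (0, +1) pixel offset.
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    si = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sj = np.sqrt(((j - mu_j) ** 2 * p).sum())
    return {
        "mean": img.mean(),
        "contrast": ((i - j) ** 2 * p).sum(),
        "energy": (p ** 2).sum(),
        "homogeneity": (p / (1 + (i - j) ** 2)).sum(),
        "correlation": ((i - mu_i) * (j - mu_j) * p).sum() / (si * sj + 1e-12),
    }
```

    A perfectly uniform map gives zero contrast and unit energy; heterogeneous lesion textures shift all four GLCM statistics.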

  9. Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis

    PubMed Central

    Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.

    2006-01-01

    In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user-friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software, relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709

  10. Parametric and non-parametric species delimitation methods result in the recognition of two new Neotropical woody bamboo species.

    PubMed

    Ruiz-Sanchez, Eduardo

    2015-12-01

    The Neotropical woody bamboo genus Otatea is one of five genera in the subtribe Guaduinae. Of the eight described Otatea species, seven are endemic to Mexico and one is also distributed in Central and South America. Otatea acuminata has the widest geographical distribution of the eight species, and two of its recently collected populations do not match the known species morphologically. Parametric and non-parametric methods were used to delimit the species in Otatea using five chloroplast markers, one nuclear marker, and morphological characters. The parametric coalescent method and the non-parametric analysis supported the recognition of two distinct evolutionary lineages. Molecular clock analyses placed the origin of the speciation events between the Late Miocene and the Late Pleistocene. The species delimitation analyses (parametric and non-parametric) identified the two populations of O. acuminata from Chiapas and Hidalgo as two separate evolutionary lineages, and these new species have morphological characters that separate them from O. acuminata s.s. The geological activity of the Trans-Mexican Volcanic Belt and the Isthmus of Tehuantepec may have isolated populations and limited gene flow between Otatea species, driving speciation. Based on the results found here, I describe Otatea rzedowskiorum and Otatea victoriae as two new species, morphologically different from O. acuminata. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. A framework for multivariate data-based at-site flood frequency analysis: Essentiality of the conjugal application of parametric and nonparametric approaches

    NASA Astrophysics Data System (ADS)

    Vittal, H.; Singh, Jitendra; Kumar, Pankaj; Karmakar, Subhankar

    2015-06-01

    In watershed management, flood frequency analysis (FFA) is performed to quantify the risk of flooding at different spatial locations and also to provide guidelines for determining the design periods of flood control structures. Traditional FFA was extensively performed by considering the univariate scenario for both at-site and regional estimation of return periods. However, due to the inherent mutual dependence of the flood variables or characteristics [i.e., peak flow (P), flood volume (V) and flood duration (D), which are random in nature], the analysis has been further extended to the multivariate scenario, with some restrictive assumptions. To overcome the assumption of the same family of marginal density function for all flood variables, the concept of the copula has been introduced. Although the advancement from univariate to multivariate analyses drew formidable attention from the FFA research community, the basic limitation was that the analyses were performed with only parametric families of distributions. The aim of the current study is to emphasize the importance of nonparametric approaches in the field of multivariate FFA; however, a nonparametric distribution may not always be a good fit or capable of replacing well-implemented multivariate parametric and multivariate copula-based applications. Nevertheless, the potential of obtaining the best fit using nonparametric distributions might be improved because such distributions reproduce the sample's characteristics, resulting in more accurate estimations of the multivariate return period. Hence, the current study shows the importance of conjugating the multivariate nonparametric approach with multivariate parametric and copula-based approaches, thereby resulting in a comprehensive framework for complete at-site FFA. Although the proposed framework is designed for at-site FFA, this approach can also be applied to regional FFA because regional estimations ideally include at-site estimations. 
The framework is based on the following steps: (i) comprehensive trend analysis to assess nonstationarity in the observed data; (ii) selection of the best-fit univariate marginal distribution from a comprehensive set of parametric and nonparametric distributions for the flood variables; (iii) multivariate frequency analyses with parametric, copula-based and nonparametric approaches; and (iv) estimation of joint and various conditional return periods. The proposed framework for frequency analysis is demonstrated using 110 years of observed data from the Allegheny River at Salamanca, New York, USA. The results show that for both the univariate and multivariate cases, the nonparametric Gaussian kernel provides the best estimate. Further, we perform FFA for twenty major rivers over the continental USA, which shows that for seven rivers all the flood variables followed the nonparametric Gaussian kernel, whereas for the other rivers parametric distributions provide the best fit for one or two flood variables. Thus the summary of results shows that the nonparametric method cannot substitute for the parametric and copula-based approaches, but should be considered during any at-site FFA to provide the broadest choices for the best estimation of flood return periods.
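
    The nonparametric Gaussian-kernel step can be sketched for a single flood variable as follows; Silverman's bandwidth rule and the trapezoidal CDF integration are illustrative choices, not the study's exact procedure.

```python
import numpy as np

def gaussian_kde_pdf(x, data, bw=None):
    """Nonparametric Gaussian-kernel density estimate at points x."""
    data = np.asarray(data, float)
    if bw is None:
        # Silverman's rule-of-thumb bandwidth
        bw = 1.06 * data.std(ddof=1) * len(data) ** (-1 / 5)
    u = (x[:, None] - data[None, :]) / bw
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(data) * bw * np.sqrt(2 * np.pi))

def return_period(threshold, data, bw=None):
    """At-site return period T = 1 / P(X > threshold) from the kernel CDF,
    obtained by trapezoidal integration of the kernel density."""
    grid = np.linspace(data.min() - 3 * data.std(), threshold, 2000)
    pdf = gaussian_kde_pdf(grid, data, bw)
    cdf = np.sum((pdf[1:] + pdf[:-1]) * np.diff(grid)) / 2.0
    return 1.0 / max(1.0 - cdf, 1e-12)
```

    For a threshold at the sample median the estimated return period is about 2 years, as expected; joint and conditional return periods then build on such marginals via a copula or a multivariate kernel.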

  12. A Lunar Surface Operations Simulator

    NASA Technical Reports Server (NTRS)

    Nayar, H.; Balaram, J.; Cameron, J.; Jain, A.; Lim, C.; Mukherjee, R.; Peters, S.; Pomerantz, M.; Reder, L.; Shakkottai, P.; hide

    2008-01-01

    The Lunar Surface Operations Simulator (LSOS) is being developed to support planning and design of space missions to return astronauts to the moon. Vehicles, habitats, dynamic and physical processes and related environment systems are modeled and simulated in LSOS to assist in the visualization and design optimization of systems for lunar surface operations. A parametric analysis tool and a data browser were also implemented to provide an intuitive interface to run multiple simulations and review their results. The simulator and parametric analysis capability are described in this paper.

  13. Cost-effectiveness of drug-eluting stents in patients at high or low risk of major cardiac events in the Basel Stent KostenEffektivitäts Trial (BASKET): an 18-month analysis.

    PubMed

    Brunner-La Rocca, Hans Peter; Kaiser, Christoph; Bernheim, Alain; Zellweger, Michael J; Jeger, Raban; Buser, Peter T; Osswald, Stefan; Pfisterer, Matthias

    2007-11-03

    Our aim was to determine whether drug-eluting stents are good value for money in long-term, everyday practice. We did an 18-month cost-effectiveness analysis of the Basel Stent KostenEffektivitäts Trial (BASKET), which randomised 826 patients 2:1 to drug-eluting stents (n=545) or to bare-metal stents (n=281). We used non-parametric bootstrap techniques to determine incremental cost-effectiveness ratios (ICERs) of drug-eluting versus bare-metal stents, to compare low-risk (> or =3.0 mm stents in native vessels; n=558, 68%) and high-risk patients (<3.0 mm stents/bypass graft stenting; n=268, 32%), and to do sensitivity analyses by altering costs and event rates in the whole study sample and in predefined subgroups. Quality-adjusted life-years (QALYs) were assessed by EQ-5D questionnaire (available in 703/826 patients). Overall costs were higher for patients with drug-eluting stents than for those with bare-metal stents (11,808 euros [SD 400] per patient with drug-eluting stents and 10,450 euros [592] per patient with bare-metal stents, mean difference 1358 euros [717], p<0.0001), due to higher stent costs. We calculated an ICER of 64,732 euros to prevent one major adverse cardiac event, and of 40,467 euros per QALY gained. Stent costs, number of events, and QALYs affected ICERs most, but unrealistic alterations would have been required to achieve acceptable cost-effectiveness. In low-risk patients, the probability of drug-eluting stents achieving an arbitrary ICER of 10,000 euros or less to prevent one major adverse cardiac event was 0.016; by contrast, it was 0.874 in high-risk patients. If used in all patients, drug-eluting stents are not good value for money, even if prices were substantially reduced. Drug-eluting stents are cost effective in patients needing small vessel or bypass graft stenting, but not in those who require large native vessel stenting.
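
    The non-parametric bootstrap of ICERs can be sketched as below. Resampling patients within each arm and summarizing with a percentile interval are standard choices; the data and replicate count here are hypothetical, not the BASKET data.

```python
import numpy as np

def bootstrap_icer(cost_a, eff_a, cost_b, eff_b, n_boot=2000, seed=0):
    """Non-parametric bootstrap of the incremental cost-effectiveness ratio
    (ICER) of arm A vs arm B: resample patients within each arm, recompute
    (mean cost difference) / (mean effect difference) for each replicate."""
    rng = np.random.default_rng(seed)
    icers = np.empty(n_boot)
    for k in range(n_boot):
        ia = rng.integers(0, len(cost_a), len(cost_a))
        ib = rng.integers(0, len(cost_b), len(cost_b))
        d_cost = cost_a[ia].mean() - cost_b[ib].mean()
        d_eff = eff_a[ia].mean() - eff_b[ib].mean()
        icers[k] = d_cost / d_eff
    return np.median(icers), np.percentile(icers, [2.5, 97.5])
```

    In real data the effect difference can bootstrap close to zero, making the raw ICER unstable, which is why cost-effectiveness acceptability curves are often reported alongside.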

  14. Potassium topping cycles for stationary power. [conceptual analysis

    NASA Technical Reports Server (NTRS)

    Rossbach, R. J.

    1975-01-01

    A design study was made of the potassium topping cycle powerplant for central station use. Initially, powerplant performance and economics were studied parametrically by using an existing steam plant as the bottom part of the cycle. Two distinct powerplants were identified which had good thermodynamic and economic performance. Conceptual designs were made of these two powerplants in the 1200 MWe size, and capital and operating costs were estimated for these powerplants. A technical evaluation of these plants was made including conservation of fuel resources, environmental impact, technology status, and degree of development risk. It is concluded that the potassium topping cycle could have a significant impact on national goals such as air and water pollution control and conservation of natural resources because of its higher energy conversion efficiency.

  15. Engine System Loads Development for the Fastrac 60K Flight Engine

    NASA Technical Reports Server (NTRS)

    Frady, Greg; Christensen, Eric R.; Mims, Katherine; Harris, Don; Parks, Russell; Brunty, Joseph

    2000-01-01

    Early implementation of structural dynamics finite element analyses for calculating design loads is common practice in high-volume manufacturing industries such as the automotive and aeronautical industries. However, because rocket engine development programs start so rarely, these tools are relatively new to the design of rocket engines. In the new Fastrac engine program the focus has been on reducing the cost-to-weight ratio, and current structural dynamics analysis practices were tailored in order to meet both production and structural design goals. Perturbation of rocket engine design parameters resulted in a number of Fastrac load cycles necessary to characterize the impact of mass and stiffness changes. The evolution of loads and load extraction methodologies, parametric considerations, and load path sensitivities are discussed.

  16. Secondary outcome analysis for data from an outcome-dependent sampling design.

    PubMed

    Pan, Yinghao; Cai, Jianwen; Longnecker, Matthew P; Zhou, Haibo

    2018-04-22

    An outcome-dependent sampling (ODS) scheme is a cost-effective way to conduct a study. For a study with a continuous primary outcome, an ODS scheme can be implemented where the expensive exposure is only measured on a simple random sample and on supplemental samples selected from the two tails of the primary outcome variable. Given the tremendous cost invested in collecting the primary exposure information, investigators often would like to use the available data to study the relationship between a secondary outcome and the obtained exposure variable. This is referred to as secondary analysis. Secondary analysis in ODS designs can be tricky, as the ODS sample is not a random sample from the general population. In this article, we use inverse probability weighted and augmented inverse probability weighted estimating equations to analyze the secondary outcome for data obtained from the ODS design. We do not make any parametric assumptions on the primary and secondary outcomes and only specify the form of the regression mean models, thus allowing an arbitrary error distribution. Our approach is robust to second- and higher-order moment misspecification. It also leads to more precise estimates of the parameters by effectively using all the available participants. Through simulation studies, we show that the proposed estimator is consistent and asymptotically normal. Data from the Collaborative Perinatal Project are analyzed to illustrate our method. Copyright © 2018 John Wiley & Sons, Ltd.
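
    The inverse probability weighted estimating-equation idea can be sketched for a linear mean model as weighted least squares with weights equal to inverse inclusion probabilities. This is a simplification: the augmented estimator and the derivation of ODS-specific inclusion probabilities are omitted.

```python
import numpy as np

def ipw_regression(X, y, p_select):
    """Inverse-probability-weighted least squares sketch for secondary-outcome
    analysis: each sampled subject is weighted by 1 / p_select (its inclusion
    probability), solving the weighted normal equations (X'WX) beta = X'Wy
    without distributional assumptions on the errors."""
    w = 1.0 / p_select
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)
```

    When the mean model is correct, the weights undo the biased sampling, so the coefficients are recovered regardless of how unevenly subjects were selected.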

  17. Selecting a Separable Parametric Spatiotemporal Covariance Structure for Longitudinal Imaging Data

    PubMed Central

    George, Brandon; Aban, Inmaculada

    2014-01-01

    Longitudinal imaging studies allow great insight into how the structure and function of a subject's internal anatomy change over time. Unfortunately, the analysis of longitudinal imaging data is complicated by inherent spatial and temporal correlation: the temporal from the repeated measures, and the spatial from the outcomes of interest being observed at multiple points in a patient's body. We propose the use of a linear model with a separable parametric spatiotemporal error structure for the analysis of repeated imaging data. The model makes use of spatial (exponential, spherical, and Matérn) and temporal (compound symmetric, autoregressive-1, Toeplitz, and unstructured) parametric correlation functions. A simulation study, inspired by a longitudinal cardiac imaging study on mitral regurgitation patients, compared different information criteria for selecting a particular separable parametric spatiotemporal correlation structure, as well as the effects on Type I and II error rates for inference on fixed effects when the specified model is incorrect. Information criteria were found to be highly accurate at choosing between separable parametric spatiotemporal correlation structures. Misspecification of the covariance structure was found to have the ability to inflate the Type I error or produce an overly conservative test size, which corresponded to decreased power. An example with clinical data is given illustrating how the covariance structure procedure can be done in practice, as well as how covariance structure choice can change inferences about fixed effects. PMID:25293361
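
    A separable spatiotemporal covariance is the Kronecker product of a temporal and a spatial correlation matrix. The sketch below pairs AR(1) in time with an exponential spatial model (two of the structures named above); the parameter values are illustrative.

```python
import numpy as np

def separable_cov(times, dists, rho=0.6, phi=2.0, sigma2=1.0):
    """Separable parametric spatiotemporal covariance sketch: Kronecker
    product of AR(1) temporal and exponential spatial correlation, i.e.
    cov((s, t), (s', t')) = sigma2 * rho**|t - t'| * exp(-d(s, s') / phi).

    times: (m,) measurement times; dists: (n, n) spatial distance matrix.
    Returns the (m*n, m*n) covariance, ordered time-major.
    """
    T = rho ** np.abs(times[:, None] - times[None, :])  # AR(1) in time
    S = np.exp(-dists / phi)                            # exponential in space
    return sigma2 * np.kron(T, S)
```

    Separability keeps the parameter count small and lets likelihood evaluations exploit the Kronecker structure instead of factorizing the full (m*n) x (m*n) matrix.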

  18. Cost-effectiveness of preventive case management for parents with a mental illness: a randomized controlled trial from three economic perspectives.

    PubMed

    Wansink, Henny J; Drost, Ruben M W A; Paulus, Aggie T G; Ruwaard, Dirk; Hosman, Clemens M H; Janssens, Jan M A M; Evers, Silvia M A A

    2016-07-07

    The children of parents with a mental illness (COPMI) are at increased risk of developing costly psychiatric disorders because of multiple risk factors that threaten parenting quality and thereby child development. Preventive basic care management (PBCM) is an intervention aimed at reducing risk factors and addressing the needs of COPMI families in different domains. The intervention may lead to financial consequences in the healthcare sector and in other sectors, also known as inter-sectoral costs and benefits (ICBs). The objective of this study was to assess the cost-effectiveness of PBCM from three perspectives: a narrow healthcare perspective, a social care perspective (including childcare costs) and a broad societal perspective (including all ICBs). Effects on parenting quality (as measured by the HOME) and costs during an 18-month period were studied in a randomized controlled trial. Families received PBCM (n = 49) or care as usual (CAU) (n = 50). For all three perspectives, incremental cost-effectiveness ratios (ICERs) were calculated. Stochastic uncertainty in the data was dealt with using non-parametric bootstraps. Sensitivity analyses included calculating ICERs excluding cost outliers and making an adjustment for baseline cost differences. Parenting quality improved in the PBCM group and declined in the CAU group, and PBCM was shown to be more costly than CAU. ICERs ranged from 461 Euros (healthcare perspective) to 215 Euros (social care perspective) to 175 Euros (societal perspective) per one-point improvement on the HOME T-score. The results of the sensitivity analyses, based on complete cases and excluding cost outliers, support the finding that the ICER is lower when adopting a broader perspective. The subgroup analysis and the analysis with baseline adjustments resulted in higher ICERs. This study is the first economic evaluation of family-focused preventive basic care management for COPMI in psychiatric and family services. 
The effects of the chosen perspective on determining the cost-effectiveness of PBCM underscore the importance of economic studies of interdepartmental policies. Future studies focusing on the cost-effectiveness of programs like PBCM in other sites and studies with more power are encouraged as this may improve the quality of information used in supporting decision making. NTR2569 , date of registration 2010-10-12.

  19. Incorporating parametric uncertainty into population viability analysis models

    USGS Publications Warehouse

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
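The two-step simulation structure described above (one parameter draw per replicate, temporal variance per time step) can be illustrated with a minimal count-based projection. All rates and thresholds below are invented placeholders, not piping plover estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

def pva(n_reps=1000, n_years=50, n0=100,
        mean_r=0.01, se_r=0.05,      # estimated log growth rate and its standard error
        sigma_env=0.15,              # year-to-year (temporal) standard deviation
        parametric=True, quasi_ext=10):
    """Two-step simulation: parametric uncertainty in the replication loop,
    temporal variance in the time-step loop."""
    extinct = 0
    for _ in range(n_reps):
        # Replication loop: one draw of the uncertain parameter per trajectory.
        r = rng.normal(mean_r, se_r) if parametric else mean_r
        n = float(n0)
        for _ in range(n_years):
            # Time loop: environmental stochasticity applied each year.
            n *= np.exp(rng.normal(r, sigma_env))
            if n < quasi_ext:
                extinct += 1
                break
    return extinct / n_reps

p_with = pva(parametric=True)       # parametric uncertainty included
p_without = pva(parametric=False)   # parametric uncertainty ignored
```

Comparing `p_with` and `p_without` reproduces the kind of contrast the abstract reports: including parameter uncertainty typically widens outcomes and raises estimated extinction risk.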

  20. Discrete-time switching periodic adaptive control for time-varying parameters with unknown periodicity

    NASA Astrophysics Data System (ADS)

    Yu, Miao; Huang, Deqing; Yang, Wanqiu

    2018-06-01

    In this paper, we address the problem of unknown periodicity for a class of discrete-time nonlinear parametric systems without assuming any growth conditions on the nonlinearities. The unknown periodicity hides in the parametric uncertainties, which are difficult to estimate with existing techniques. By incorporating a logic-based switching mechanism, we identify the period and the bound of the unknown parameter simultaneously. Lyapunov-based analysis demonstrates that a finite number of switchings can guarantee asymptotic tracking for the nonlinear parametric systems. The simulation result also shows the efficacy of the proposed switching periodic adaptive control approach.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Looney, J.H.; Im, C.J.

    Under the sponsorship of DOE/METC, UCC Research completed a program in 1984 concerned with the development, testing, and manufacture of an ultra-clean coal-water mixture fuel using the UCC two-stage physical beneficiation and coal-water mixture preparation process. Several gallons of ultra-clean coal-water slurry produced at the UCC Research pilot facility were supplied to DOE/METC for combustion testing. The finalization of this project resulted in the presentation of a conceptual design and economic analysis of an ultra-clean coal-water mixture processing facility sufficient in size to continuously supply fuel to a 100 MW turbine power generation system. Upon completion of the above program, it became evident that substantial technological and economic improvement could be realized through further laboratory and engineering investigation of the UCC two-stage physical beneficiation process. Therefore, as an extension to the previous work, the purpose of the present program was to define the relationship between the controlling technical parameters as related to coal-water slurry quality and product price, and to determine the areas of improvement in the existing flow-scheme, associated cost savings, and the overall effect of these savings on final coal-water slurry price. Contents of this report include: (1) introduction; (2) process refinement (improvement of coal beneficiation process, different source coals and related cleanability, dispersants and other additives); (3) coal beneficiation and cost parametrics summary; (4) revised conceptual design and economic analysis; (5) operating and capital cost reduction; (6) conclusion; and (7) appendices. 24 figs., 12 tabs.

  2. Parametric Sensitivity Analysis of Oscillatory Delay Systems with an Application to Gene Regulation.

    PubMed

    Ingalls, Brian; Mincheva, Maya; Roussel, Marc R

    2017-07-01

    A parametric sensitivity analysis for periodic solutions of delay-differential equations is developed. Because phase shifts cause the sensitivity coefficients of a periodic orbit to diverge, we focus on sensitivities of the extrema, from which amplitude sensitivities are computed, and of the period. Delay-differential equations are often used to model gene expression networks. In these models, the parametric sensitivities of a particular genotype define the local geometry of the evolutionary landscape. Thus, sensitivities can be used to investigate directions of gradual evolutionary change. An oscillatory protein synthesis model whose properties are modulated by RNA interference is used as an example. This model consists of a set of coupled delay-differential equations involving three delays. Sensitivity analyses are carried out at several operating points. Comments on the evolutionary implications of the results are offered.
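Since pointwise sensitivities of a periodic orbit drift with phase, the abstract's strategy is to differentiate well-defined scalar features such as extrema. A generic stand-in for a delay-differential oscillator is the delayed logistic (Hutchinson) equation; the sketch below (all parameter values assumed, simple Euler method of steps) computes a finite-difference sensitivity of the oscillation amplitude with respect to the growth rate.

```python
import numpy as np

def hutchinson_amplitude(r=1.0, tau=2.0, y0=0.5, dt=0.01, t_end=200.0):
    """Euler method-of-steps for y'(t) = r*y(t)*(1 - y(t - tau)) with constant
    history y0 on [-tau, 0]. Returns the amplitude (max - min) over the last
    quarter of the run, a phase-insensitive feature of the periodic solution."""
    lag = int(round(tau / dt))
    n = int(round(t_end / dt))
    y = np.empty(n + 1)
    y[0] = y0
    for k in range(n):
        y_lag = y0 if k < lag else y[k - lag]   # delayed state
        y[k + 1] = y[k] + dt * r * y[k] * (1.0 - y_lag)
    tail = y[3 * n // 4:]
    return tail.max() - tail.min()

# Central finite-difference sensitivity of the amplitude with respect to r.
h = 1e-3
amp = hutchinson_amplitude()
d_amp_dr = (hutchinson_amplitude(r=1.0 + h) - hutchinson_amplitude(r=1.0 - h)) / (2 * h)
```

With r*tau = 2 > pi/2 the equation has a sustained limit cycle, so the amplitude is a stable quantity to differentiate even though orbit sensitivities diverge.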

  3. Elastic full-waveform inversion and parametrization analysis applied to walk-away vertical seismic profile data for unconventional (heavy oil) reservoir characterization

    NASA Astrophysics Data System (ADS)

    Pan, Wenyong; Innanen, Kristopher A.; Geng, Yu

    2018-06-01

    Seismic full-waveform inversion (FWI) methods hold strong potential to recover multiple subsurface elastic properties for hydrocarbon reservoir characterization. Simultaneously updating multiple physical parameters introduces the problem of interparameter trade-off, arising from the simultaneous variations of different physical parameters, which increase the nonlinearity and uncertainty of multiparameter FWI. The coupling effects of different physical parameters are significantly influenced by model parametrization and acquisition arrangement. An appropriate choice of model parametrization is important to successful field data applications of multiparameter FWI. The objective of this paper is to examine the performance of various model parametrizations in isotropic-elastic FWI with walk-away vertical seismic profile (W-VSP) data for unconventional heavy oil reservoir characterization. Six model parametrizations are considered: velocity-density (α, β and ρ′), modulus-density (κ, μ and ρ), Lamé-density (λ, μ′ and ρ‴), impedance-density (IP, IS and ρ″), velocity-impedance-I (α′, β′ and IP′) and velocity-impedance-II (α″, β″ and IS′). We begin analysing the interparameter trade-off by making use of scattering radiation patterns, which is a common strategy for qualitative parameter resolution analysis. We discuss the advantages and limitations of the scattering radiation patterns and recommend that interparameter trade-offs be evaluated using interparameter contamination kernels, which provide quantitative, second-order measurements of the interparameter contaminations and can be constructed efficiently with an adjoint-state approach. Synthetic W-VSP isotropic-elastic FWI experiments in the time domain verify our conclusions about interparameter trade-offs for various model parametrizations.
Density profiles are most strongly influenced by the interparameter contaminations; depending on model parametrization, the inverted density profile can be overestimated, underestimated or spatially distorted. Among the six cases, only the velocity-density parametrization provides stable and informative density features not included in the starting model. Field data applications of multicomponent W-VSP isotropic-elastic FWI in the time domain were also carried out. The heavy oil reservoir target zone, characterized by low α-to-β ratios and low Poisson's ratios, can be identified clearly with the inverted isotropic-elastic parameters.

  4. Development and application of computer assisted optimal method for treatment of femoral neck fracture.

    PubMed

    Wang, Monan; Zhang, Kai; Yang, Ning

    2018-04-09

    To help doctors decide their treatment from the aspect of mechanical analysis, the work built a computer assisted optimal system for treatment of femoral neck fracture oriented to clinical application. The whole system encompassed the following three parts: Preprocessing module, finite element mechanical analysis module, post processing module. Preprocessing module included parametric modeling of bone, parametric modeling of fracture face, parametric modeling of fixed screw and fixed position and input and transmission of model parameters. Finite element mechanical analysis module included grid division, element type setting, material property setting, contact setting, constraint and load setting, analysis method setting and batch processing operation. Post processing module included extraction and display of batch processing operation results, image generation of batch processing operation, optimal program operation and optimal result display. The system implemented the whole operations from input of fracture parameters to output of the optimal fixed plan according to specific patient real fracture parameter and optimal rules, which demonstrated the effectiveness of the system. Meanwhile, the system had a friendly interface, simple operation and could improve the system function quickly through modifying single module.

  5. Analysis of Parametric Adaptive Signal Detection with Applications to Radars and Hyperspectral Imaging

    DTIC Science & Technology

    2010-02-01

    Estimation of the parameters associated with the proposed parametric (NS-AR) model is discussed, including several important issues: model order selection, training screening, and time-series based whitening.

  6. Glycerolysis with crude glycerine as an alternative pretreatment for biodiesel production from grease trap waste: Parametric study and energy analysis

    USDA-ARS?s Scientific Manuscript database

    This study reports the use of crude glycerine from biodiesel production in the glycerolysis process and presents the associated parametric and energy analyses. The potential of glycerolysis as an alternative pretreatment method for high free fatty acid (FFA) containing fats, oils and greases (FOGs) ...

  7. A Parametric Analysis of the Techniques Used for the Recovery and Evacuation of Battle Damaged Tracked Vehicles.

    DTIC Science & Technology

    1980-06-01

    problems, a parametric model was built which uses the TI-59 programmable calculator as its vehicle. Although the calculator has many disadvantages for...previous experience using the TI-59 programmable calculator. For example, explicit instructions for reading cards into the memory set will not be given

  8. Development of a Multivariable Parametric Cost Analysis for Space-Based Telescopes

    NASA Technical Reports Server (NTRS)

    Dollinger, Courtnay

    2011-01-01

    Over the past 400 years, the telescope has proven to be a valuable tool in helping humankind understand the Universe around us. The images and data produced by telescopes have revolutionized planetary, solar, stellar, and galactic astronomy and have inspired a wide range of people, from the child who dreams about the images seen on NASA websites to the most highly trained scientist. Like all scientific endeavors, astronomical research must operate within the constraints imposed by budget limitations. Hence the importance of understanding cost: to find the balance between the dreams of scientists and the restrictions of the available budget. By logically analyzing the data we have collected on over thirty different telescopes from more than 200 different sources, statistical methods, such as plotting regressions and residuals, can be used to determine what drives the cost of telescopes and to build a cost model for space-based telescopes. Previous cost models have focused their attention on ground-based telescopes due to limited data for space telescopes and the larger number and longer history of ground-based astronomy. Due to the increased availability of cost data from recent space-telescope construction, we have been able to produce and begin testing a comprehensive cost model for space telescopes, with guidance from the cost models for ground-based telescopes. By separating the variables that affect cost, such as diameter, mass, wavelength, density, data rate, and number of instruments, we advance the goal to better understand the cost drivers of space telescopes. The use of sophisticated mathematical techniques to improve the accuracy of cost models has the potential to help society make informed decisions about proposed scientific projects. An improved knowledge of cost will allow scientists to get the maximum value returned for the money given and create a harmony between the visions of scientists and the reality of a budget.
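The regression-and-residuals approach described above amounts to fitting a power law in log space. The sketch below uses entirely synthetic diameters, costs, and exponent (the real model uses many more variables); it only shows the mechanics of estimating a cost-driver exponent and inspecting residuals.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: cost modeled as a power law in aperture diameter,
# cost = a * D**b, so log(cost) = log(a) + b*log(D) is an ordinary linear fit.
n_scopes = 30
diameter_m = rng.uniform(0.3, 6.5, n_scopes)          # synthetic apertures
true_b = 1.6                                          # assumed exponent, illustration only
cost_musd = 40.0 * diameter_m**true_b * rng.lognormal(0.0, 0.2, n_scopes)

X = np.column_stack([np.ones(n_scopes), np.log(diameter_m)])
y = np.log(cost_musd)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)          # least-squares fit
log_a, b_hat = coef

residuals = y - X @ coef                              # plot these to spot structure/outliers
```

In a multivariable model, additional columns (log mass, wavelength, data rate, ...) are appended to `X` and the same least-squares machinery applies.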

  9. The reduced basis method for the electric field integral equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fares, M., E-mail: fares@cerfacs.f; Hesthaven, J.S., E-mail: Jan_Hesthaven@Brown.ed; Maday, Y., E-mail: maday@ann.jussieu.f

    We introduce the reduced basis method (RBM) as an efficient tool for parametrized scattering problems in computational electromagnetics for problems where field solutions are computed using a standard Boundary Element Method (BEM) for the parametrized electric field integral equation (EFIE). This combination enables an algorithmic cooperation which results in a two step procedure. The first step consists of a computationally intense assembling of the reduced basis, that needs to be effected only once. In the second step, we compute output functionals of the solution, such as the Radar Cross Section (RCS), independently of the dimension of the discretization space, for many different parameter values in a many-query context at very little cost. Parameters include the wavenumber, the angle of the incident plane wave and its polarization.
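The offline/online split described above can be sketched for a generic affinely parametrized system A(mu) = A0 + mu*A1. This is a stand-in for the discretized EFIE, not the actual BEM operator: the expensive step (snapshot solves and SVD) happens once, after which each new parameter value costs only a small reduced solve.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200                                   # full (BEM-like) discretization size

# Affinely parametrized system A(mu) = A0 + mu*A1 (generic SPD stand-in).
M0 = rng.standard_normal((n, n))
A0 = M0 @ M0.T + n * np.eye(n)            # symmetric positive definite
A1 = np.diag(rng.uniform(0.5, 1.5, n))
b = rng.standard_normal(n)

def solve_full(mu):
    return np.linalg.solve(A0 + mu * A1, b)

# Offline stage (done once): snapshot solves + SVD give the reduced basis V.
snapshots = np.column_stack([solve_full(mu) for mu in np.linspace(0.0, 10.0, 15)])
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, s > 1e-10 * s[0]]                # keep numerically significant modes

def solve_reduced(mu):
    """Online stage: Galerkin projection; the solve no longer scales with n."""
    Ar = V.T @ (A0 + mu * A1) @ V
    br = V.T @ b
    return V @ np.linalg.solve(Ar, br)

mu_test = 3.7                             # a parameter value not in the snapshot set
err = (np.linalg.norm(solve_reduced(mu_test) - solve_full(mu_test))
       / np.linalg.norm(solve_full(mu_test)))
```

In the many-query setting (RCS over many wavenumbers and incidence angles), only `solve_reduced` is called repeatedly.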

  10. Direct reconstruction of cardiac PET kinetic parametric images using a preconditioned conjugate gradient approach

    PubMed Central

    Rakvongthai, Yothin; Ouyang, Jinsong; Guerin, Bastien; Li, Quanzheng; Alpert, Nathaniel M.; El Fakhri, Georges

    2013-01-01

    Purpose: Our research goal is to develop an algorithm to reconstruct cardiac positron emission tomography (PET) kinetic parametric images directly from sinograms and compare its performance with the conventional indirect approach. Methods: Time activity curves of a NCAT phantom were computed according to a one-tissue compartmental kinetic model with realistic kinetic parameters. The sinograms at each time frame were simulated using the activity distribution for the time frame. The authors reconstructed the parametric images directly from the sinograms by optimizing a cost function, which included the Poisson log-likelihood and a spatial regularization term, using the preconditioned conjugate gradient (PCG) algorithm with the proposed preconditioner. The proposed preconditioner is a diagonal matrix whose diagonal entries are the ratio of the parameter and the sensitivity of the radioactivity associated with that parameter. The authors compared the reconstructed parametric images using the direct approach with those reconstructed using the conventional indirect approach. Results: At the same bias, the direct approach yielded significant relative reduction in standard deviation by 12%–29% and 32%–70% for 50 × 106 and 10 × 106 detected coincidences counts, respectively. Also, the PCG method effectively reached a constant value after only 10 iterations (with numerical convergence achieved after 40–50 iterations), while more than 500 iterations were needed for CG. Conclusions: The authors have developed a novel approach based on the PCG algorithm to directly reconstruct cardiac PET parametric images from sinograms, which yields better estimation of kinetic parameters than the conventional indirect approach, i.e., curve fitting of reconstructed images. The PCG method increases the convergence rate of reconstruction significantly as compared to the conventional CG method. PMID:24089922

  11. Direct reconstruction of cardiac PET kinetic parametric images using a preconditioned conjugate gradient approach.

    PubMed

    Rakvongthai, Yothin; Ouyang, Jinsong; Guerin, Bastien; Li, Quanzheng; Alpert, Nathaniel M; El Fakhri, Georges

    2013-10-01

    Our research goal is to develop an algorithm to reconstruct cardiac positron emission tomography (PET) kinetic parametric images directly from sinograms and compare its performance with the conventional indirect approach. Time activity curves of a NCAT phantom were computed according to a one-tissue compartmental kinetic model with realistic kinetic parameters. The sinograms at each time frame were simulated using the activity distribution for the time frame. The authors reconstructed the parametric images directly from the sinograms by optimizing a cost function, which included the Poisson log-likelihood and a spatial regularization term, using the preconditioned conjugate gradient (PCG) algorithm with the proposed preconditioner. The proposed preconditioner is a diagonal matrix whose diagonal entries are the ratio of the parameter and the sensitivity of the radioactivity associated with that parameter. The authors compared the reconstructed parametric images using the direct approach with those reconstructed using the conventional indirect approach. At the same bias, the direct approach yielded significant relative reduction in standard deviation by 12%-29% and 32%-70% for 50 × 10(6) and 10 × 10(6) detected coincidences counts, respectively. Also, the PCG method effectively reached a constant value after only 10 iterations (with numerical convergence achieved after 40-50 iterations), while more than 500 iterations were needed for CG. The authors have developed a novel approach based on the PCG algorithm to directly reconstruct cardiac PET parametric images from sinograms, which yields better estimation of kinetic parameters than the conventional indirect approach, i.e., curve fitting of reconstructed images. The PCG method increases the convergence rate of reconstruction significantly as compared to the conventional CG method.
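The speed-up the abstract reports comes from diagonal preconditioning inside conjugate gradient. A minimal generic sketch follows: the synthetic badly scaled SPD system and the Jacobi (inverse-diagonal) preconditioner stand in for the tomographic problem and for the paper's parameter-to-sensitivity ratio, respectively.

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-8, max_iter=500):
    """Preconditioned conjugate gradient for an SPD matrix A.
    M_inv_diag holds the inverse of a diagonal preconditioner, applied as
    elementwise scaling of the residual."""
    x = np.zeros_like(b)
    r = b.copy()                     # residual b - A @ x with x = 0
    z = M_inv_diag * r               # preconditioned residual
    p = z.copy()
    rz = r @ z
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# Synthetic SPD system whose difficulty is mostly bad diagonal scaling,
# so a diagonal preconditioner pays off.
rng = np.random.default_rng(4)
n = 100
d = np.logspace(0, 4, n)                       # scales spanning four decades
S = rng.standard_normal((n, n))
S = (S + S.T) / (2 * np.sqrt(n))               # symmetric, spectral norm O(1)
core = np.eye(n) + 0.02 * S                    # well-conditioned core
A = np.sqrt(d)[:, None] * core * np.sqrt(d)[None, :]
b = rng.standard_normal(n)

x_pcg, it_pcg = pcg(A, b, 1.0 / np.diag(A))    # Jacobi preconditioner
x_cg, it_cg = pcg(A, b, np.ones(n))            # identity preconditioner = plain CG
```

Comparing `it_pcg` with `it_cg` reproduces the qualitative finding: a well-chosen diagonal preconditioner cuts the iteration count dramatically.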

  12. [Radiology in managed care environment: opportunities for cost savings in an HMO].

    PubMed

    Schmidt, C; Mohr, A; Möller, J; Levin-Scherz, J; Heller, M

    2003-09-01

    A large regional health plan in the Northeastern United States noted that its radiology costs were increasing more than it anticipated in its pricing, and noted further that other similar health plans in markets with high managed care penetration had significantly lower expenses for radiology services. This study describes the potential areas of improvement and managed care techniques that were implemented to reduce costs and reform processes. We performed an in-depth analysis of financial data, claims logic, and contracting with provider units, and conducted interviews with employees to identify potential areas of improvement and cost reduction. A detailed market analysis of the environment, competitors and vendors was accompanied by extensive literature, Internet and Medline search for comparable projects. All data were documented in Microsoft Excel(R) and analyzed by non-parametric tests using SPSS(R) 8.0 (Statistical Package for the Social Sciences) for Windows(R). The main factors driving the cost increases in radiology were divided into those internal or external to the HMO. Among the internal factors, the claims logic was allowing overpayment due to limitations of the IT system. Risk arrangements between insurer and provider units (PU) as well as the extent of provider unit management and administration showed a significant correlation with financial performance in terms of variance from budget. Among the external factors, shared risk arrangements between HMO and provider unit were associated with more efficient radiology utilization and overall improvement in financial performance. PU with full-time management had significantly less variance from their budget than those without. Finally, physicians with imaging equipment in their offices ordered up to 4 to 5 times more imaging procedures than physicians who did not perform imaging studies themselves. We identified initiatives with estimated potential savings of approximately $ 5.5 million.
Some of these initiatives are similar to the reforms to reduce cost and improve quality that are already implemented or proposed within the German healthcare system.
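The non-parametric comparisons the study ran in SPSS can be reproduced with a few lines of stdlib Python. Below is a self-contained Mann-Whitney U test with the normal approximation; the two groups of counts are invented purely to illustrate the equipment-ownership comparison, not study data.

```python
import math

def avg_ranks(values):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test using the normal approximation
    (no tie correction in the variance, for brevity)."""
    n1, n2 = len(x), len(y)
    ranks = avg_ranks(list(x) + list(y))
    r1 = sum(ranks[:n1])
    u1 = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
    return u1, p

# Illustrative monthly imaging counts: providers with vs without in-office equipment.
with_equipment = [42, 55, 61, 48, 70, 66, 59, 73]
without_equipment = [12, 15, 9, 14, 11, 16, 10, 13]
u, p = mann_whitney_u(with_equipment, without_equipment)
```

For real analyses, `scipy.stats.mannwhitneyu` adds exact p-values and tie corrections; the hand-rolled version just exposes the mechanics.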

  13. A New and General Formulation of the Parametric HFGMC Micromechanical Method for Three-Dimensional Multi-Phase Composites

    NASA Technical Reports Server (NTRS)

    Haj-Ali, Rami; Aboudi, Jacob

    2012-01-01

    The recent two-dimensional (2-D) parametric formulation of the high fidelity generalized method of cells (HFGMC) reported by the authors is generalized for the micromechanical analysis of three-dimensional (3-D) multiphase composites with periodic microstructure. Arbitrary hexahedral subcell geometry is developed to discretize a triply periodic repeating unit-cell (RUC). Linear parametric-geometric mapping is employed to transform the arbitrary hexahedral subcell shapes from the physical space to an auxiliary orthogonal shape, where a complete quadratic displacement expansion is performed. Previously, in the 2-D case, three additional equations were needed in the form of average moments of equilibrium as a result of the inclusion of the bilinear terms. However, the present 3-D parametric HFGMC formulation eliminates the need for such additional equations. This is achieved by expressing the coefficients of the full quadratic polynomial expansion of the subcell in terms of the side or face average-displacement vectors. The 2-D parametric and orthogonal HFGMC are special cases of the present 3-D formulation. The continuity of displacements and tractions, as well as the equilibrium equations, are imposed in the average (integral) sense as in the original HFGMC formulation. Each of the six sides (faces) of a subcell has an independent average displacement micro-variable vector which forms an energy-conjugate pair with the transformed average-traction vector. This allows generating symmetric stiffness matrices along with internal resisting vectors for the subcells which enhances the computational efficiency. The established new parametric 3-D HFGMC equations are formulated and solution implementations are addressed.
Several applications for triply periodic 3-D composites are presented to demonstrate the general capability and versatility of the present parametric HFGMC method for refined micromechanical analysis by generating the spatial distributions of local stress fields. These applications include triply periodic composites with inclusions in the form of a cavity, a spherical inclusion, an ellipsoidal inclusion, and a discontinuous aligned short fiber. A 3-D repeating unit-cell for foam material composite is simulated.

  14. Thick electrodes for Li-ion batteries: A model based analysis

    NASA Astrophysics Data System (ADS)

    Danner, Timo; Singh, Madhav; Hein, Simon; Kaiser, Jörg; Hahn, Horst; Latz, Arnulf

    2016-12-01

    Li-ion batteries are commonly used in portable electronic devices due to their outstanding energy and power density. A remaining issue which hinders the breakthrough e.g. in the automotive sector is the high production cost. For low power applications, such as stationary storage, batteries with electrodes thicker than 300 μm were suggested. High energy densities can be attained with only a few electrode layers which reduces production time and cost. However, mass and charge transport limitations can be severe at already small C-rates due to long transport pathways. In this article we use a detailed 3D micro-structure resolved model to investigate limiting factors for battery performance. The model is parametrized with data from the literature and dedicated experiments and shows good qualitative agreement with experimental discharge curves of thick NMC-graphite Li-ion batteries. The model is used to assess the effect of inhomogeneities in carbon black distribution and gives answers to the possible occurrence of lithium plating during battery charge. Based on our simulations we can predict optimal operation strategies and improved design concepts for future Li-ion batteries employing thick electrodes.

  15. Slice-to-Volume Nonrigid Registration of Histological Sections to MR Images of the Human Brain

    PubMed Central

    Osechinskiy, Sergey; Kruggel, Frithjof

    2011-01-01

    Registration of histological images to three-dimensional imaging modalities is an important step in quantitative analysis of brain structure, in architectonic mapping of the brain, and in investigation of the pathology of a brain disease. Reconstruction of histology volume from serial sections is a well-established procedure, but it does not address registration of individual slices from sparse sections, which is the aim of the slice-to-volume approach. This study presents a flexible framework for intensity-based slice-to-volume nonrigid registration algorithms with a geometric transformation deformation field parametrized by various classes of spline functions: thin-plate splines (TPS), Gaussian elastic body splines (GEBS), or cubic B-splines. Algorithms are applied to cross-modality registration of histological and magnetic resonance images of the human brain. Registration performance is evaluated across a range of optimization algorithms and intensity-based cost functions. For a particular case of histological data, best results are obtained with a TPS three-dimensional (3D) warp, a new unconstrained optimization algorithm (NEWUOA), and a correlation-coefficient-based cost function. PMID:22567290

  16. Quantitative representations of an exaggerated anxiety response in the brain of female spider phobics-a parametric fMRI study.

    PubMed

    Zilverstand, Anna; Sorger, Bettina; Kaemingk, Anita; Goebel, Rainer

    2017-06-01

    We employed a novel parametric spider picture set in the context of a parametric fMRI anxiety provocation study, designed to tease apart brain regions involved in threat monitoring from regions representing an exaggerated anxiety response in spider phobics. For the stimulus set, we systematically manipulated perceived proximity of threat by varying a depicted spider's context, size, and posture. All stimuli were validated in a behavioral rating study (phobics n = 20; controls n = 20; all female). An independent group participated in a subsequent fMRI anxiety provocation study (phobics n = 7; controls n = 7; all female), in which we compared a whole-brain categorical to a whole-brain parametric analysis. Results demonstrated that the parametric analysis provided a richer characterization of the functional role of the involved brain networks. In three brain regions-the mid insula, the dorsal anterior cingulate, and the ventrolateral prefrontal cortex-activation was linearly modulated by perceived proximity specifically in the spider phobia group, indicating a quantitative representation of an exaggerated anxiety response. In other regions (e.g., the amygdala), activation was linearly modulated in both groups, suggesting a functional role in threat monitoring. Prefrontal regions, such as dorsolateral prefrontal cortex, were activated during anxiety provocation but did not show a stimulus-dependent linear modulation in either group. The results confirm that brain regions involved in anxiety processing hold a quantitative representation of a pathological anxiety response and more generally suggest that parametric fMRI designs may be a very powerful tool for clinical research in the future, particularly when developing novel brain-based interventions (e.g., neurofeedback training). Hum Brain Mapp 38:3025-3038, 2017. © 2017 Wiley Periodicals, Inc.

  17. Parametric fMRI of paced motor responses uncovers novel whole-brain imaging biomarkers in spinocerebellar ataxia type 3.

    PubMed

    Duarte, João Valente; Faustino, Ricardo; Lobo, Mercês; Cunha, Gil; Nunes, César; Ferreira, Carlos; Januário, Cristina; Castelo-Branco, Miguel

    2016-10-01

    Machado-Joseph Disease, inherited type 3 spinocerebellar ataxia (SCA3), is the most common form worldwide. Neuroimaging and neuropathology have consistently demonstrated cerebellar alterations. Here we aimed to discover whole-brain functional biomarkers, based on parametric performance-level-dependent signals. We assessed 13 patients with early SCA3 and 14 healthy participants. We used a combined parametric behavioral/functional neuroimaging design to investigate disease fingerprints, as a function of performance levels, coupled with structural MRI and voxel-based morphometry. Functional magnetic resonance imaging (fMRI) was designed to parametrically analyze behavior and neural responses to audio-paced bilateral thumb movements at temporal frequencies of 1, 3, and 5 Hz. Our performance-level-based design probing neuronal correlates of motor coordination enabled the discovery that neural activation and behavior show critical loss of parametric modulation specifically in SCA3, associated with frequency-dependent cortico/subcortical activation/deactivation patterns. Cerebellar/cortical rate-dependent dissociation patterns could clearly differentiate between groups irrespective of grey matter loss. Our findings suggest functional reorganization of the motor network and indicate a possible role of fMRI as a tool to monitor disease progression in SCA3. Accordingly, fMRI patterns proved to be potential biomarkers in early SCA3, as tested by receiver operating characteristic analysis of both behavior and neural activation at different frequencies. Discrimination analysis based on BOLD signal in response to the applied parametric finger-tapping task often reached >80% sensitivity and specificity in single regions-of-interest. Functional fingerprints based on cerebellar and cortical BOLD performance dependent signal modulation can thus be combined as diagnostic and/or therapeutic targets in hereditary ataxia. Hum Brain Mapp 37:3656-3668, 2016.
© 2016 Wiley Periodicals, Inc.

  18. Waveform inversion for orthorhombic anisotropy with P waves: feasibility and resolution

    NASA Astrophysics Data System (ADS)

    Kazei, Vladimir; Alkhalifah, Tariq

    2018-05-01

    Various parametrizations have been suggested to simplify inversions of first arrivals, or P waves, in orthorhombic anisotropic media, but the number and type of retrievable parameters have not been decisively determined. We show that only six parameters can be retrieved from the dynamic linearized inversion of P waves. These parameters are different from the six parameters needed to describe the kinematics of P waves. Reflection-based radiation patterns from the P-P scattered waves are remapped into the spectral domain to allow for our resolution analysis based on the effective angle of illumination concept. Singular value decomposition of the spectral sensitivities from various azimuths, offset coverage scenarios and data bandwidths allows us to quantify the resolution of different parametrizations, taking into account the signal-to-noise ratio in a given experiment. According to our singular value analysis, when the primary goal of inversion is determining the velocity of the P waves, gradually adding anisotropy of lower orders (isotropic, vertically transversally isotropic and orthorhombic) in hierarchical parametrization is the best choice. Hierarchical parametrization reduces the trade-off between the parameters and makes gradual introduction of lower anisotropy orders straightforward. When all the anisotropic parameters affecting P-wave propagation need to be retrieved simultaneously, the classic parametrization of orthorhombic medium with elastic stiffness matrix coefficients and density is a better choice for inversion. We provide estimates of the number and set of parameters that can be retrieved from surface seismic data in different acquisition scenarios. To set up an inversion process, the singular values determine the number of parameters that can be inverted and the resolution matrices from the parametrizations can be used to ascertain the set of parameters that can be resolved.
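The singular-value counting argument can be sketched numerically: build a sensitivity matrix with a known effective rank and count singular values above a relative cutoff standing in for the noise floor. The matrix below is entirely synthetic, constructed with rank 6 to mirror the paper's conclusion that only six parameter combinations are resolvable from P waves.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical sensitivity matrix: rows are spectral-domain observations
# gathered over azimuths/offsets, columns are 10 candidate orthorhombic
# parameters. Built as a rank-6 product to mimic limited resolvability.
n_obs, n_par, eff_rank = 400, 10, 6
G = rng.standard_normal((n_obs, eff_rank)) @ rng.standard_normal((eff_rank, n_par))

s = np.linalg.svd(G, compute_uv=False)      # singular values, descending
cutoff = 1e-8 * s[0]                        # relative cutoff: stand-in for noise floor
n_resolvable = int(np.sum(s > cutoff))      # parameters (combinations) resolvable
```

In the paper's setting, the right singular vectors additionally identify *which* parameter combinations are resolved, which is what motivates hierarchical parametrization.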

  19. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    NASA Technical Reports Server (NTRS)

    1996-01-01

    We first report on our current progress in the area of explicit methods for tangent curve computation. The basic idea of this method is to decompose the domain into a collection of triangles (or tetrahedra) and assume linear variation of the vector field over each cell. With this assumption, the equations which define a tangent curve become a system of linear, constant-coefficient ODEs which can be solved explicitly. There are five different representations of the solution, depending on the eigenvalues of the Jacobian. The analysis of these five cases is somewhat similar to the phase-plane analysis often associated with critical point classification within the context of topological methods, but it is not exactly the same; there are some critical differences. Moving from one cell to the next as a tangent curve is tracked requires the computation of the exit point, which is the intersection of the solution of the constant-coefficient ODE and the edge of a triangle. There are two possible approaches to this root-computation problem: we can express the tangent curve in parametric form and substitute it into an implicit form of the edge, or we can express the edge in parametric form and substitute it into an implicit form of the tangent curve. Normally the solution of a system of ODEs is given in parametric form, so the first approach is the most accessible and straightforward. The second approach requires the 'implicitization' of these parametric curves. The implicitization of parametric curves can often be rather difficult, but in this case we have been successful and have been able to develop algorithms and subsequent computer programs for both approaches. We will give these details, along with some comparisons, in a forthcoming research paper on this topic.
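
    Under the cell-wise linearity assumption above, the tangent curve inside one cell has a closed form: for field v(x) = A x + b with invertible A, the curve through x0 is x(t) = x_eq + exp(tA)(x0 - x_eq), where x_eq = -A^{-1} b. A minimal sketch with an illustrative spiral-type Jacobian (not data from the report):

```python
import numpy as np
from scipy.linalg import expm

# Illustrative linear field over one cell: v(x) = A @ x + b.
A = np.array([[0.0, -1.0],
              [1.0, -0.2]])       # complex eigenvalues: a spiral-type cell
b = np.array([1.0, 0.5])
x_eq = -np.linalg.solve(A, b)     # equilibrium of the linear field

def tangent_curve(x0, t):
    """Closed-form tangent curve through x0 (A assumed invertible)."""
    return x_eq + expm(t * A) @ (x0 - x_eq)

x0 = np.array([2.0, 0.0])
# Consistency check: the closed form satisfies x'(t) = A @ x(t) + b.
h, t = 1e-6, 0.7
dxdt = (tangent_curve(x0, t + h) - tangent_curve(x0, t - h)) / (2 * h)
assert np.allclose(dxdt, A @ tangent_curve(x0, t) + b, atol=1e-5)
```

    Exit-point computation then amounts to finding t where this parametric curve meets a triangle edge, which is the root-finding problem the abstract discusses.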

  20. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2004-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
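
    The CEM's actual polynomial coefficients are not given in the abstract, so the sketch below only illustrates the stated functional form: labor hours modeled as a third- or fourth-order polynomial in thrust level, fit to hypothetical data.

```python
import numpy as np

# Hypothetical test-labor data (thrust in klbf, labor in hours) -- illustrative
# values only, not Stennis data.
thrust_klbf = np.array([5, 10, 25, 50, 100, 200, 350, 500], float)
labor_hours = np.array([120, 160, 240, 350, 520, 800, 1150, 1500], float)

coeffs3 = np.polyfit(thrust_klbf, labor_hours, deg=3)   # third-order form
coeffs4 = np.polyfit(thrust_klbf, labor_hours, deg=4)   # fourth-order form

def labor_estimate(thrust, coeffs):
    """Evaluate the polynomial labor-time model at a thrust level."""
    return np.polyval(coeffs, thrust)

# The higher-order fit can only reduce the residual sum of squares.
rss3 = np.sum((labor_estimate(thrust_klbf, coeffs3) - labor_hours) ** 2)
rss4 = np.sum((labor_estimate(thrust_klbf, coeffs4) - labor_hours) ** 2)
assert rss4 <= rss3 * 1.001 + 1e-6
```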

  1. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merion M.

    2002-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  2. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2003-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  3. An Affordability Comparison Tool (ACT) for Space Transportation

    NASA Technical Reports Server (NTRS)

    McCleskey, C. M.; Bollo, T. R.; Garcia, J. L.

    2012-01-01

    NASA has recently emphasized the importance of affordability for the Commercial Crew Development Program (CCDP), the Space Launch System (SLS), and the Multi-Purpose Crew Vehicle (MPCV). System architects and designers are challenged to come up with architectures and designs that do not exceed the budget. This paper describes the Affordability Comparison Tool (ACT), which analyzes different systems or architecture configurations for affordability and allows for a comparison of total life cycle cost; annual recurring costs; affordability figures-of-merit, such as cost per pound, cost per seat, and cost per flight; and productivity measures, such as payload throughput. Although ACT is not a deterministic model, the paper develops algorithms and parametric factors that use characteristics of the architectures or systems being compared to produce important system outcomes (figures-of-merit). Example applications of outcome figures-of-merit are also documented to provide the designer with information on the relative affordability and productivity of different space transportation applications.
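
    The figures-of-merit named above reduce to simple ratios of life-cycle cost to flight, payload, and seat counts. A minimal sketch (the field names and sample numbers are assumptions, not values from the paper):

```python
# Illustrative ACT-style figures of merit for comparing architectures.
def figures_of_merit(total_life_cycle_cost, flights_per_year, years,
                     payload_lb_per_flight, seats_per_flight=0):
    flights = flights_per_year * years
    fom = {
        "cost_per_flight": total_life_cycle_cost / flights,
        "cost_per_lb": total_life_cycle_cost / (flights * payload_lb_per_flight),
        "payload_throughput_lb_per_year": flights_per_year * payload_lb_per_flight,
    }
    if seats_per_flight:
        fom["cost_per_seat"] = total_life_cycle_cost / (flights * seats_per_flight)
    return fom

# Hypothetical architecture: $12B life cycle cost, 4 flights/year for 10 years.
fom = figures_of_merit(12e9, flights_per_year=4, years=10,
                       payload_lb_per_flight=50_000, seats_per_flight=4)
```

    Ranking two candidate architectures then reduces to comparing these dictionaries side by side.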

  4. Type Specialization in Aldor

    NASA Astrophysics Data System (ADS)

    Dragan, Laurentiu; Watt, Stephen M.

    Computer algebra in scientific computation squarely faces the dilemma of natural mathematical expression versus efficiency. While higher-order programming constructs and parametric polymorphism provide a natural and expressive language for mathematical abstractions, they can come at a considerable cost. We investigate how deeply nested type constructions may be optimized to achieve performance similar to that of hand-tuned code written in lower-level languages.

  5. Comparison of thawing and freezing dark energy parametrizations

    NASA Astrophysics Data System (ADS)

    Pantazis, G.; Nesseris, S.; Perivolaropoulos, L.

    2016-05-01

    Dark energy equation of state w(z) parametrizations with two parameters and given monotonicity are generically either convex or concave functions. This makes them suitable for fitting either freezing or thawing quintessence models but not both simultaneously. Fitting a data set based on a freezing model with an unsuitable (concave when increasing) w(z) parametrization [like Chevallier-Polarski-Linder (CPL)] can lead to significant misleading features like crossing of the phantom divide line, incorrect w(z=0), incorrect slope, etc., that are not present in the underlying cosmological model. To demonstrate this fact we generate scattered cosmological data at both the level of w(z) and the luminosity distance DL(z) based on either thawing or freezing quintessence models and fit them using parametrizations of convex and of concave type. We then compare statistically significant features of the best-fit w(z) with actual features of the underlying model. We thus verify that the use of unsuitable parametrizations can lead to misleading conclusions. In order to avoid these problems it is important to either use both convex and concave parametrizations and select the one with the best χ2 or use principal component analysis, thus splitting the redshift range into independent bins. In the latter case, however, significant information about the slope of w(z) at high redshifts is lost. Finally, we propose a new family of parametrizations w(z) = w0 + wa (z/(1+z))^n which generalizes the CPL parametrization and interpolates between thawing and freezing parametrizations as the parameter n increases to values larger than 1.
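
    The proposed family w(z) = w0 + wa (z/(1+z))^n, which reduces to CPL at n = 1, is easy to evaluate directly (the w0, wa values below are illustrative, not fitted):

```python
import numpy as np

# Generalized CPL family from the abstract: w(z) = w0 + wa * (z/(1+z))**n.
def w(z, w0, wa, n=1.0):
    return w0 + wa * (z / (1.0 + z)) ** n

z = np.linspace(0.0, 2.0, 201)
cpl = w(z, w0=-1.0, wa=0.3, n=1)   # standard CPL (n = 1)
gen = w(z, w0=-1.0, wa=0.3, n=3)   # a member with n > 1

# Every member satisfies w(0) = w0 and approaches w0 + wa as z -> infinity;
# increasing n flattens w(z) at low redshift, changing convexity behavior.
assert np.isclose(w(0.0, -1.0, 0.3, n=5), -1.0)
assert abs(w(1e6, -1.0, 0.3, n=2) - (-0.7)) < 1e-5
```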

  6. Gastroenterologist and nurse management of symptoms after pelvic radiotherapy for cancer: an economic evaluation of a clinical randomized controlled trial (the ORBIT study).

    PubMed

    Jordan, Jake; Gage, Heather; Benton, Barbara; Lalji, Amyn; Norton, Christine; Andreyev, H Jervoise N

    2017-01-01

    Over 20 distressing gastrointestinal symptoms affect many patients after pelvic radiotherapy, but in the United Kingdom few are referred for assessment. Algorithmic-based treatment delivered by either a consultant gastroenterologist or a clinical nurse specialist has been shown in a randomized trial to be statistically and clinically more effective than provision of a self-help booklet. In this study, we assessed cost-effectiveness. Outcomes were measured at baseline (pre-randomization) and 6 months. Change in quality-adjusted life years (QALYs) was the primary outcome for the economic evaluation; a secondary analysis used change in the bowel subset score of the modified Inflammatory Bowel Disease Questionnaire (IBDQ-B). Intervention costs, in 2013 British pounds, covered visits with the gastroenterologist or nurse, investigations, medications and treatments. Incremental outcomes and incremental costs were estimated simultaneously using multivariate linear regression. Uncertainty was handled non-parametrically using bootstrap with replacement. The mean (SD) cost of treatment was £895 (499) for the nurse and £1101 (567) for the consultant. The nurse was dominated by usual care, which was cheaper and achieved better outcomes. The mean cost per QALY gained from the consultant, compared to usual care, was £250,455; comparing the consultant to the nurse, it was £25,875. Algorithmic care produced better outcomes compared to the booklet only, as reflected in the IBDQ-B results, at a cost of ~£1,000. Algorithmic treatment of radiation bowel injury by a consultant or a nurse results in significant symptom relief for patients but was not found to be cost-effective according to the National Institute for Health and Care Excellence (NICE) criteria.
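
    The evaluation's core quantity, the incremental cost-effectiveness ratio (ICER, incremental cost divided by incremental QALYs), with bootstrap-with-replacement uncertainty, can be sketched as follows. The per-patient data are simulated (with a larger QALY gain than the trial's, to keep the illustration numerically stable), not the ORBIT trial data.

```python
import numpy as np

# Simulated per-patient costs and QALYs for two arms (illustrative only).
rng = np.random.default_rng(42)
n = 200
cost_usual = rng.normal(300.0, 80.0, n)     # hypothetical usual-care costs
qaly_usual = rng.normal(0.60, 0.10, n)
cost_cons = rng.normal(1101.0, 200.0, n)    # consultant-arm mean cost from the study
qaly_cons = rng.normal(0.70, 0.10, n)       # hypothetical, larger QALY gain

def icer(c1, e1, c0, e0):
    """Incremental cost per QALY gained."""
    return (c1.mean() - c0.mean()) / (e1.mean() - e0.mean())

point = icer(cost_cons, qaly_cons, cost_usual, qaly_usual)

# Non-parametric uncertainty: resample patients with replacement.
boot = []
for _ in range(500):
    i = rng.integers(0, n, n)
    j = rng.integers(0, n, n)
    boot.append(icer(cost_cons[i], qaly_cons[i], cost_usual[j], qaly_usual[j]))
lo95, hi95 = np.percentile(boot, [2.5, 97.5])
```

    Comparing `point` (and its bootstrap interval) against a willingness-to-pay threshold is then the NICE-style cost-effectiveness decision the abstract describes.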

  7. Estimation of Survival Probabilities for Use in Cost-effectiveness Analyses: A Comparison of a Multi-state Modeling Survival Analysis Approach with Partitioned Survival and Markov Decision-Analytic Modeling

    PubMed Central

    Williams, Claire; Lewsey, James D.; Mackay, Daniel F.; Briggs, Andrew H.

    2016-01-01

    Modeling of clinical-effectiveness in a cost-effectiveness analysis typically involves some form of partitioned survival or Markov decision-analytic modeling. The health states progression-free, progression and death and the transitions between them are frequently of interest. With partitioned survival, progression is not modeled directly as a state; instead, time in that state is derived from the difference in area between the overall survival and the progression-free survival curves. With Markov decision-analytic modeling, a priori assumptions are often made with regard to the transitions rather than using the individual patient data directly to model them. This article compares a multi-state modeling survival regression approach to these two common methods. As a case study, we use a trial comparing rituximab in combination with fludarabine and cyclophosphamide v. fludarabine and cyclophosphamide alone for the first-line treatment of chronic lymphocytic leukemia. We calculated mean Life Years and QALYs that involved extrapolation of survival outcomes in the trial. We adapted an existing multi-state modeling approach to incorporate parametric distributions for transition hazards, to allow extrapolation. The comparison showed that, due to the different assumptions used in the different approaches, a discrepancy in results was evident. The partitioned survival and Markov decision-analytic modeling deemed the treatment cost-effective with ICERs of just over £16,000 and £13,000, respectively. However, the results with the multi-state modeling were less conclusive, with an ICER of just over £29,000. This work has illustrated that it is imperative to check whether assumptions are realistic, as different model choices can influence clinical and cost-effectiveness results. PMID:27698003
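
    A minimal numeric sketch of the partitioned-survival idea described above: mean time in the progression state is the area between the overall-survival (OS) and progression-free-survival (PFS) curves. The exponential hazards here are illustrative values, not trial estimates.

```python
import numpy as np

t = np.linspace(0.0, 100.0, 10001)        # years, dense grid
pfs = np.exp(-0.30 * t)                   # PFS with hazard 0.30 per year
os_ = np.exp(-0.10 * t)                   # OS with hazard 0.10 per year

def area(y, t):
    """Trapezoidal integral of y over t."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)

time_progression_free = area(pfs, t)       # mean PFS time, ~1/0.30 years
time_in_progression = area(os_ - pfs, t)   # area between curves, ~1/0.10 - 1/0.30
```

    Note that progression itself is never modeled as a state here, which is exactly the structural assumption the multi-state approach in the article relaxes.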

  8. Estimation of Survival Probabilities for Use in Cost-effectiveness Analyses: A Comparison of a Multi-state Modeling Survival Analysis Approach with Partitioned Survival and Markov Decision-Analytic Modeling.

    PubMed

    Williams, Claire; Lewsey, James D; Mackay, Daniel F; Briggs, Andrew H

    2017-05-01

    Modeling of clinical-effectiveness in a cost-effectiveness analysis typically involves some form of partitioned survival or Markov decision-analytic modeling. The health states progression-free, progression and death and the transitions between them are frequently of interest. With partitioned survival, progression is not modeled directly as a state; instead, time in that state is derived from the difference in area between the overall survival and the progression-free survival curves. With Markov decision-analytic modeling, a priori assumptions are often made with regard to the transitions rather than using the individual patient data directly to model them. This article compares a multi-state modeling survival regression approach to these two common methods. As a case study, we use a trial comparing rituximab in combination with fludarabine and cyclophosphamide v. fludarabine and cyclophosphamide alone for the first-line treatment of chronic lymphocytic leukemia. We calculated mean Life Years and QALYs that involved extrapolation of survival outcomes in the trial. We adapted an existing multi-state modeling approach to incorporate parametric distributions for transition hazards, to allow extrapolation. The comparison showed that, due to the different assumptions used in the different approaches, a discrepancy in results was evident. The partitioned survival and Markov decision-analytic modeling deemed the treatment cost-effective with ICERs of just over £16,000 and £13,000, respectively. However, the results with the multi-state modeling were less conclusive, with an ICER of just over £29,000. This work has illustrated that it is imperative to check whether assumptions are realistic, as different model choices can influence clinical and cost-effectiveness results.

  9. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.

  10. The 20 kW battery study program

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Six battery configurations were selected for detailed study and these are described. A computer program was modified for use in estimation of the weights, costs, and reliabilities of each of the configurations, as a function of several important independent variables, such as system voltage, battery voltage ratio (battery voltage/bus voltage), and the number of parallel units into which each of the components of the power subsystem was divided. The computer program was used to develop the relationship between the independent variables alone and in combination, and the dependent variables: weight, cost, and availability. Parametric data, including power loss curves, are given.

  11. Pluripotency gene network dynamics: System views from parametric analysis.

    PubMed

    Akberdin, Ilya R; Omelyanchuk, Nadezda A; Fadeev, Stanislav I; Leskova, Natalya E; Oschepkova, Evgeniya A; Kazantsev, Fedor V; Matushkin, Yury G; Afonnikov, Dmitry A; Kolchanov, Nikolay A

    2018-01-01

    Multiple experimental data demonstrated that the core gene network orchestrating self-renewal and differentiation of mouse embryonic stem cells involves activity of the Oct4, Sox2 and Nanog genes by means of a number of positive feedback loops among them. However, recent studies indicated that the architecture of the core gene network should also incorporate negative Nanog autoregulation and might not include positive feedbacks from Nanog to Oct4 and Sox2. Thorough parametric analysis of the mathematical model based on this revisited core regulatory circuit identified substantial changes in model dynamics depending on the strength of Oct4 and Sox2 activation and the molecular complexity of Nanog autorepression. The analysis showed the existence of four dynamical domains with different numbers of stable and unstable steady states. We hypothesize that these domains can constitute the checkpoints in a developmental progression from naïve to primed pluripotency and vice versa. During this transition, parametric conditions exist which generate an oscillatory behavior of the system, explaining heterogeneity in the expression of pluripotency and differentiation factors in serum ESC cultures. Eventually, simulations showed that the addition of positive feedbacks from Nanog to Oct4 and Sox2 mainly increases the parametric space for the naïve ESC state, in which pluripotency factors are strongly expressed while differentiation factors are repressed.

  12. Trial-dependent psychometric functions accounting for perceptual learning in 2-AFC discrimination tasks.

    PubMed

    Kattner, Florian; Cochrane, Aaron; Green, C Shawn

    2017-09-01

    The majority of theoretical models of learning consider learning to be a continuous function of experience. However, most perceptual learning studies use thresholds estimated by fitting psychometric functions to independent blocks, sometimes then fitting a parametric function to these block-wise estimated thresholds. Critically, such approaches tend to violate the basic principle that learning is continuous through time (e.g., by aggregating trials into large "blocks" for analysis that each assume stationarity, then fitting learning functions to these aggregated blocks). To address this discrepancy between base theory and analysis practice, here we instead propose fitting a parametric function to thresholds from each individual trial. In particular, we implemented a dynamic psychometric function whose parameters were allowed to change continuously with each trial, thus parameterizing nonstationarity. We fit the resulting continuous time parametric model to data from two different perceptual learning tasks. In nearly every case, the quality of the fits derived from the continuous time parametric model outperformed the fits derived from a nonparametric approach wherein separate psychometric functions were fit to blocks of trials. Because such a continuous trial-dependent model of perceptual learning also offers a number of additional advantages (e.g., the ability to extrapolate beyond the observed data; the ability to estimate performance on individual critical trials), we suggest that this technique would be a useful addition to each psychophysicist's analysis toolkit.
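
    The idea of a trial-dependent psychometric function can be sketched directly: the threshold of a 2-AFC logistic function is itself a continuous (here, exponential) function of trial number, so learning is modeled without aggregating trials into blocks. All parameter values below are illustrative, not fitted estimates from the study.

```python
import numpy as np

def p_correct(stimulus, trial, a0=2.0, a_inf=0.5, tau=150.0, slope=3.0):
    """2-AFC accuracy: runs from 0.5 (guessing) up to 1.0.

    The threshold decays exponentially with trial number, parameterizing
    nonstationarity (perceptual learning) within the psychometric function.
    """
    threshold = a_inf + (a0 - a_inf) * np.exp(-trial / tau)   # falls with practice
    return 0.5 + 0.5 / (1.0 + np.exp(-slope * (stimulus - threshold)))

stim = 1.0
early = p_correct(stim, trial=0)       # threshold still high: near chance
late = p_correct(stim, trial=1000)     # threshold has decayed: well above chance
assert late > early
```

    Fitting `a0`, `a_inf`, `tau`, and `slope` to per-trial responses by maximum likelihood would give the continuous-time model the article proposes; this sketch only shows the functional form.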

  13. Nonlinear Analysis of Mechanical Systems Under Combined Harmonic and Stochastic Excitation

    DTIC Science & Technology

    1993-05-27

    N. Sri Namachchivaya and Naresh Malhotra, Department of Aeronautical and Astronautical Engineering, University of Illinois at Urbana-Champaign, Urbana, Illinois. Recoverable references include: N. Sri Namachchivaya and N. Malhotra, "Parametrically Excited Hopf Bifurcation with Non-semisimple 1:1 Resonance," Nonlinear Vibrations, ASME-AMD, Vol. 114, 1992.

  14. Efficient Characterization of Parametric Uncertainty of Complex (Bio)chemical Networks.

    PubMed

    Schillings, Claudia; Sunnåker, Mikael; Stelling, Jörg; Schwab, Christoph

    2015-08-01

    Parametric uncertainty is a particularly challenging and relevant aspect of systems analysis in domains such as systems biology where, both for inference and for assessing prediction uncertainties, it is essential to characterize the system behavior globally in the parameter space. However, current methods based on local approximations or on Monte-Carlo sampling cope only insufficiently with high-dimensional parameter spaces associated with complex network models. Here, we propose an alternative deterministic methodology that relies on sparse polynomial approximations. We propose a deterministic computational interpolation scheme which identifies the most significant expansion coefficients adaptively. We present its performance in kinetic model equations from computational systems biology with several hundred parameters and state variables, leading to numerical approximations of the parametric solution on the entire parameter space. The scheme is based on adaptive Smolyak interpolation of the parametric solution at judiciously and adaptively chosen points in parameter space. Like Monte-Carlo sampling, it is "non-intrusive" and well-suited for massively parallel implementation, but affords higher convergence rates. This opens up new avenues for large-scale dynamic network analysis by enabling scaling for many applications, including parameter estimation, uncertainty quantification, and systems design.

  15. Efficient Characterization of Parametric Uncertainty of Complex (Bio)chemical Networks

    PubMed Central

    Schillings, Claudia; Sunnåker, Mikael; Stelling, Jörg; Schwab, Christoph

    2015-01-01

    Parametric uncertainty is a particularly challenging and relevant aspect of systems analysis in domains such as systems biology where, both for inference and for assessing prediction uncertainties, it is essential to characterize the system behavior globally in the parameter space. However, current methods based on local approximations or on Monte-Carlo sampling cope only insufficiently with high-dimensional parameter spaces associated with complex network models. Here, we propose an alternative deterministic methodology that relies on sparse polynomial approximations. We propose a deterministic computational interpolation scheme which identifies the most significant expansion coefficients adaptively. We present its performance in kinetic model equations from computational systems biology with several hundred parameters and state variables, leading to numerical approximations of the parametric solution on the entire parameter space. The scheme is based on adaptive Smolyak interpolation of the parametric solution at judiciously and adaptively chosen points in parameter space. Like Monte-Carlo sampling, it is “non-intrusive” and well-suited for massively parallel implementation, but affords higher convergence rates. This opens up new avenues for large-scale dynamic network analysis by enabling scaling for many applications, including parameter estimation, uncertainty quantification, and systems design. PMID:26317784

  16. Optical parametric amplifiers using chirped quasi-phase-matching gratings I: practical design formulas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charbonneau-Lefort, Mathieu; Afeyan, Bedros; Fejer, M. M.

    Optical parametric amplifiers using chirped quasi-phase-matching (QPM) gratings offer the possibility of engineering the gain and group delay spectra. We give practical formulas for the design of such amplifiers. We consider linearly chirped QPM gratings providing constant gain over a broad bandwidth, sinusoidally modulated profiles for selective frequency amplification, and a pair of QPM gratings working in tandem to ensure constant gain and constant group delay at the same time across the spectrum. The analysis is carried out in the frequency domain using the Wentzel–Kramers–Brillouin (WKB) approximation.

  17. Solid state SPS microwave generation and transmission study. Volume 1: Phase 2

    NASA Technical Reports Server (NTRS)

    Maynard, O. E.

    1980-01-01

    The solid state sandwich concept for the Solar Power Station (SPS) was investigated. The design effort concentrated on the spacetenna, but did include some system analysis for parametric comparison reasons. The study specifically included definition and math modeling of basic solid state microwave devices, an initial conceptual subsystems and system design, sidelobe control and system selection, an assessment of the selected system concept, and parametric solid state microwave power transmission system data relevant to the SPS concept. Although device efficiency was not a goal, the sensitivities of this efficiency to design were treated parametrically. Sidelobe control consisted of various single-step tapers, multistep tapers, and Gaussian tapers. A preliminary assessment of a hybrid concept using tubes and solid state is also included. A considerable amount of thermal analysis is provided, with emphasis on sensitivities to waste heat radiator form factor, emissivity, absorptivity, amplifier efficiency, material, and junction temperature.

  18. The use of analysis of variance procedures in biological studies

    USGS Publications Warehouse

    Williams, B.K.

    1987-01-01

    The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
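
    A minimal sketch (hypothetical numbers) of two of the parametrizations such a paper compares for a two-factor fixed-effects design: the cell-means model and the sum-to-zero effects model. Both span the same column space, so their fitted values, and hence the testable hypotheses, coincide.

```python
import numpy as np

y = np.array([4.0, 5.0, 6.0, 9.0])        # one observation per cell of a 2x2
X_cell = np.eye(4)                        # cell-means parametrization
a = np.array([1.0, 1.0, -1.0, -1.0])      # factor A contrast (sums to zero)
b = np.array([1.0, -1.0, 1.0, -1.0])      # factor B contrast (sums to zero)
X_eff = np.column_stack([np.ones(4), a, b, a * b])   # effects parametrization

def fitted(X):
    """Least-squares fitted values under design matrix X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta

assert np.allclose(fitted(X_cell), fitted(X_eff))
# In the effects parametrization, the interaction coefficient is the
# cross-difference of cell means divided by 4: (4 - 5 - 6 + 9) / 4.
beta_eff, *_ = np.linalg.lstsq(X_eff, y, rcond=None)
assert np.isclose(beta_eff[3], (4 - 5 - 6 + 9) / 4)
```

    With unbalanced or missing cells, the contrast columns are no longer orthogonal and the correspondence between coefficients and hypotheses breaks down, which is the source of the ambiguities the paper addresses.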

  19. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1994-01-01

    NASA is responsible for developing much of the nation's future space technology. Cost estimates for new programs are required early in the planning process so that decisions can be made accurately. Because of the long lead times required to develop space hardware, the cost estimates are frequently required 10 to 15 years before the program delivers hardware. The system design in the conceptual phases of a program is usually only vaguely defined, and the technology used is often state-of-the-art or beyond. These factors combine to make cost estimating for conceptual programs very challenging. This paper describes an effort to develop parametric cost estimating methods for space systems in the conceptual design phase. The approach is to identify variables that drive cost, such as weight, quantity, development culture, design inheritance, and time. The nature of the relationships between the driver variables and cost will be discussed. In particular, the relationship between weight and cost will be examined in detail. A theoretical model of cost will be developed and tested statistically against a historical database of major research and development projects.
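
    A common form for the weight-cost relationship the paper examines is a power-law cost-estimating relationship (CER), cost = a * weight^b, fit by linear regression in log-log space. The data points below are hypothetical, not from the paper's historical database.

```python
import numpy as np

# Hypothetical subsystem weights (kg) and development costs (millions USD).
weight_kg = np.array([150, 400, 900, 2200, 5000, 12000], float)
cost_musd = np.array([22, 48, 95, 190, 370, 760], float)

# Fit log(cost) = log(a) + b * log(weight): a straight line in log-log space.
b, log_a = np.polyfit(np.log(weight_kg), np.log(cost_musd), 1)
a = np.exp(log_a)

def cer(weight):
    """Power-law cost-estimating relationship: cost = a * weight**b."""
    return a * weight ** b

# A sub-linear exponent (b < 1) implies economies of scale with size.
assert 0.0 < b < 1.0
```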

  20. Laser-induced fluorescence microscopic system using an optical parametric oscillator for tunable detection in microchip analysis.

    PubMed

    Kumemura, Momoko; Odake, Tamao; Korenaga, Takashi

    2005-06-01

    A laser-induced fluorescence microscopic system based on optical parametric oscillation has been constructed as a tunable detector for microchip analysis. The detection limit of sulforhodamine B (Ex. 520 nm, Em. 570 nm) was 0.2 μmol, which was approximately eight orders of magnitude better than with a conventional fluorophotometer. The system was applied to the determination of fluorescence-labeled DNA (Ex. 494 nm, Em. 519 nm) in a microchannel, and the detection limit reached a single molecule. These results showed the feasibility of this system as a highly sensitive and tunable fluorescence detector for microchip analysis.

  1. Evolution of spherical cavitation bubbles: Parametric and closed-form solutions

    NASA Astrophysics Data System (ADS)

    Mancas, Stefan C.; Rosu, Haret C.

    2016-02-01

    We present an analysis of the Rayleigh-Plesset equation for a three-dimensional vacuous bubble in water. In the simplest case, when the effects of surface tension are neglected, the known parametric solutions for the radius and time evolution of the bubble in terms of a hypergeometric function are briefly reviewed. By including the surface tension, we show the connection between the Rayleigh-Plesset equation and Abel's equation, and obtain the parametric rational Weierstrass periodic solutions following the Abel route. In the same Abel approach, we also provide a discussion of the nonintegrable case of nonzero viscosity, for which we perform a numerical integration.
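
    The vacuous-bubble case reviewed above (surface tension and viscosity neglected) reduces to R R'' + (3/2) R'^2 = -p_inf/rho, which also admits the exact energy balance V^2 = (2 p_inf / 3 rho)(R0^3/R^3 - 1) due to Rayleigh. A numerical sketch with assumed water-like values:

```python
# Assumed illustrative values: ambient pressure (Pa), density (kg/m^3),
# initial radius (m); the bubble starts at rest and collapses.
p_inf, rho, R0 = 101325.0, 1000.0, 1e-3

def rhs(R, V):
    """R' = V and V' = (-p_inf/rho - 1.5*V**2) / R (vacuous bubble)."""
    return V, (-p_inf / rho - 1.5 * V * V) / R

R, V, dt = R0, 0.0, 1e-9
for _ in range(50000):                     # RK4 to t = 5e-5 s, mid-collapse
    k1R, k1V = rhs(R, V)
    k2R, k2V = rhs(R + 0.5 * dt * k1R, V + 0.5 * dt * k1V)
    k3R, k3V = rhs(R + 0.5 * dt * k2R, V + 0.5 * dt * k2V)
    k4R, k4V = rhs(R + dt * k3R, V + dt * k3V)
    R += dt / 6 * (k1R + 2 * k2R + 2 * k3R + k4R)
    V += dt / 6 * (k1V + 2 * k2V + 2 * k3V + k4V)

# Rayleigh's energy balance is an exact invariant of this equation and
# serves as a consistency check on the integration.
assert R < R0 and V < 0
assert abs(V * V - 2 * p_inf / (3 * rho) * (R0**3 / R**3 - 1)) < 1e-3 * V * V
```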

  2. Analysis of a Rocket Based Combined Cycle Engine during Rocket Only Operation

    NASA Technical Reports Server (NTRS)

    Smith, T. D.; Steffen, C. J., Jr.; Yungster, S.; Keller, D. J.

    1998-01-01

    The all-rocket mode of operation is a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. However, outside of performing experiments or a full three-dimensional analysis, there are no first-order parametric models to estimate performance. As a result, an axisymmetric RBCC engine was used to analytically determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix, and statistical regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, percent of injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inject diameter ratio. A perfect gas computational fluid dynamics analysis was performed to obtain values of vacuum specific impulse. Statistical regression analysis was performed based on both full flow and gas generator engine cycles. Results were also found to be dependent upon the entire cycle assumptions. The statistical regression analysis determined that there were five significant linear effects, six interactions, and one second-order effect. Two parametric models were created to provide performance assessments of an RBCC engine in the all-rocket mode of operation.

  3. Direct 4D reconstruction of parametric images incorporating anato-functional joint entropy.

    PubMed

    Tang, Jing; Kuwabara, Hiroto; Wong, Dean F; Rahmim, Arman

    2010-08-07

    We developed an anatomy-guided 4D closed-form algorithm to directly reconstruct parametric images from projection data for (nearly) irreversible tracers. Conventional methods consist of individually reconstructing 2D/3D PET data, followed by graphical analysis on the sequence of reconstructed image frames. The proposed direct reconstruction approach maintains the simplicity and accuracy of the expectation-maximization (EM) algorithm by extending the system matrix to include the relation between the parametric images and the measured data. A closed-form solution was achieved using a different hidden complete-data formulation within the EM framework. Furthermore, the proposed method was extended to maximum a posteriori reconstruction via incorporation of MR image information, taking the joint entropy between MR and parametric PET features as the prior. Using realistic simulated noisy [(11)C]-naltrindole PET and MR brain images/data, the quantitative performance of the proposed methods was investigated. Significant improvements in terms of noise versus bias performance were demonstrated when performing direct parametric reconstruction, and additionally upon extending the algorithm to its Bayesian counterpart using the MR-PET joint entropy measure.
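
    For context, the conventional graphical analysis that direct reconstruction replaces is, for (nearly) irreversible tracers, the Patlak plot: at late times the tissue-to-plasma ratio is linear in "Patlak time", with slope equal to the net uptake rate. A toy sketch with a synthetic input function:

```python
import numpy as np

t = np.linspace(1, 60, 60)                 # minutes after injection
Cp = np.exp(-0.1 * t) + 0.2                # toy plasma input function
intCp = np.cumsum(Cp) * (t[1] - t[0])      # running integral of Cp
Ki_true, V0_true = 0.05, 0.3
Ct = Ki_true * intCp + V0_true * Cp        # irreversible-tracer tissue curve

x = intCp / Cp                             # "Patlak time"
y = Ct / Cp
Ki, V0 = np.polyfit(x[20:], y[20:], 1)     # fit the late, linear portion
print(Ki, V0)   # recovers Ki_true and V0_true
```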

  4. Direct Parametric Reconstruction With Joint Motion Estimation/Correction for Dynamic Brain PET Data.

    PubMed

    Jiao, Jieqing; Bousse, Alexandre; Thielemans, Kris; Burgos, Ninon; Weston, Philip S J; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Markiewicz, Pawel; Ourselin, Sebastien

    2017-01-01

    Direct reconstruction of parametric images from raw photon counts has been shown to improve the quantitative analysis of dynamic positron emission tomography (PET) data. However, it suffers from subject motion, which is inevitable during the typical acquisition time of 1-2 hours. In this work, we propose a framework to jointly estimate subject head motion and reconstruct the motion-corrected parametric images directly from raw PET data, so that the effects of distorted tissue-to-voxel mapping due to subject motion can be reduced in reconstructing the parametric images with motion-compensated attenuation correction and spatially aligned temporal PET data. The proposed approach is formulated within the maximum likelihood framework, and efficient solutions are derived for estimating subject motion and kinetic parameters from raw PET photon count data. Results from evaluations on simulated [11C]raclopride data using the Zubal brain phantom and real clinical [18F]florbetapir data of a patient with Alzheimer's disease show that the proposed joint direct parametric reconstruction and motion correction approach can improve the accuracy of quantifying dynamic PET data with large subject motion.

  5. Selecting a separable parametric spatiotemporal covariance structure for longitudinal imaging data.

    PubMed

    George, Brandon; Aban, Inmaculada

    2015-01-15

    Longitudinal imaging studies allow great insight into how the structure and function of a subject's internal anatomy changes over time. Unfortunately, the analysis of longitudinal imaging data is complicated by inherent spatial and temporal correlation: the temporal from the repeated measures and the spatial from the outcomes of interest being observed at multiple points in a patient's body. We propose the use of a linear model with a separable parametric spatiotemporal error structure for the analysis of repeated imaging data. The model makes use of spatial (exponential, spherical, and Matérn) and temporal (compound symmetric, autoregressive-1, Toeplitz, and unstructured) parametric correlation functions. A simulation study, inspired by a longitudinal cardiac imaging study on mitral regurgitation patients, compared different information criteria for selecting a particular separable parametric spatiotemporal correlation structure as well as the effects on types I and II error rates for inference on fixed effects when the specified model is incorrect. Information criteria were found to be highly accurate at choosing between separable parametric spatiotemporal correlation structures. Misspecification of the covariance structure was found to have the ability to inflate the type I error or have an overly conservative test size, which corresponded to decreased power. An example with clinical data is given illustrating how the covariance structure procedure can be performed in practice, as well as how covariance structure choice can change inferences about fixed effects. Copyright © 2014 John Wiley & Sons, Ltd.
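
    The separable structure described above can be written as a Kronecker product of a spatial and a temporal correlation matrix. A minimal sketch with illustrative parameter values (exponential in space, AR(1) in time):

```python
import numpy as np

# spatial part: exponential correlation over pairwise distances
coords = np.array([0.0, 1.0, 2.5, 4.0])       # 1-D locations for simplicity
d = np.abs(coords[:, None] - coords[None, :])
phi = 2.0                                      # spatial range parameter
R_space = np.exp(-d / phi)

# temporal part: AR(1) correlation over visit lags
rho, n_time = 0.6, 3
lags = np.abs(np.arange(n_time)[:, None] - np.arange(n_time)[None, :])
R_time = rho ** lags

sigma2 = 4.0                                   # overall variance
V = sigma2 * np.kron(R_time, R_space)          # full spatiotemporal covariance
print(V.shape)
```

    The Kronecker form keeps the full covariance positive definite whenever both factors are, which is what makes the separable family convenient to fit and select among.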

  6. Can you trust the parametric standard errors in nonlinear least squares? Yes, with provisos.

    PubMed

    Tellinghuisen, Joel

    2018-04-01

    Questions about the reliability of parametric standard errors (SEs) from nonlinear least squares (LS) algorithms have led to a general mistrust of these precision estimators that is often unwarranted. The importance of non-Gaussian parameter distributions is illustrated by converting linear models to nonlinear by substituting e^A, ln A, and 1/A for a linear parameter a. Monte Carlo (MC) simulations characterize parameter distributions in more complex cases, including when data have varying uncertainty and should be weighted, but weights are neglected. This situation leads to loss of precision and erroneous parametric SEs, as is illustrated for the Lineweaver-Burk analysis of enzyme kinetics data and the analysis of isothermal titration calorimetry data. Non-Gaussian parameter distributions are generally asymmetric and biased. However, when the parametric SE is <10% of the magnitude of the parameter, both the bias and the asymmetry can usually be ignored. Sometimes nonlinear estimators can be redefined to give more normal distributions and better convergence properties. Variable data uncertainty, or heteroscedasticity, can sometimes be handled by data transforms but more generally requires weighted LS, which in turn requires knowledge of the data variance. Parametric SEs are rigorously correct in linear LS under the usual assumptions, and are a trustworthy approximation in nonlinear LS provided they are sufficiently small - a condition favored by the abundant, precise data routinely collected in many modern instrumental methods. Copyright © 2018 Elsevier B.V. All rights reserved.
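
    The central claim, that parametric SEs are trustworthy when sufficiently small, can be checked by Monte Carlo. The sketch below fits y = e^A x (a linear model recast with the nonlinear parameter A = ln a) and compares the parametric SE against the spread of the MC estimates; the data and noise level are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
x = np.linspace(1, 10, 20)
a_true, noise = 2.0, 0.1
model = lambda xx, A: np.exp(A) * xx

A_hats, ses = [], []
for _ in range(500):
    y = a_true * x + rng.normal(0, noise, x.size)
    popt, pcov = curve_fit(model, x, y, p0=[0.5])
    A_hats.append(popt[0])
    ses.append(np.sqrt(pcov[0, 0]))    # parametric SE of A from each fit

print(np.std(A_hats), np.mean(ses))    # MC spread vs. mean parametric SE
```

    Here the relative SE is well under 10%, so the A distribution is close to Gaussian and the two numbers agree.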

  7. Reliability-based management of buried pipelines considering external corrosion defects

    NASA Astrophysics Data System (ADS)

    Miran, Seyedeh Azadeh

    Corrosion is one of the main deterioration mechanisms that degrade energy pipeline integrity, driven by the corrosive fluid or gas transported and by interaction with a corrosive environment. Corrosion defects are usually detected by periodic inspections using in-line inspection (ILI) methods. In order to ensure pipeline safety, this study develops a cost-effective maintenance strategy that consists of three aspects: corrosion growth model development using ILI data, time-dependent performance evaluation, and optimal inspection interval determination. In particular, the proposed study is applied to a cathodically protected buried steel pipeline located in Mexico. First, a time-dependent power-law formulation is adopted to probabilistically characterize the growth of the maximum depth and length of external corrosion defects. Dependency between defect depth and length is considered in the model development, and the generation of corrosion defects over time is characterized by a homogeneous Poisson process. The growth models' unknown parameters are evaluated from the ILI data through Bayesian updating with the Markov chain Monte Carlo (MCMC) simulation technique. The proposed corrosion growth models can be used when either matched or non-matched defects are available, and are able to account for defects newly generated since the last inspection. Results of this part of the study show that both the depth and length growth models predict damage quantities reasonably well, and a strong correlation between defect depth and length is found. Next, time-dependent system failure probabilities are evaluated using the developed corrosion growth models, considering the prevailing uncertainties, for three failure modes: small leak, large leak, and rupture. Performance of the pipeline is evaluated through the failure probability per km (a sub-system), where each sub-system is treated as a series system of the detected and newly generated defects within it. A sensitivity analysis is also performed to determine the growth-model parameters to which the reliability of the studied pipeline is most sensitive. The reliability analysis results suggest that newly generated defects should be considered in calculating the failure probability, especially for predicting the long-term performance of the pipeline, and that the impact of statistical uncertainty in the model parameters is significant and should be included in the reliability analysis. Finally, with the evaluated time-dependent failure probabilities, a life-cycle cost analysis is conducted to determine the optimal inspection interval of the studied pipeline. The expected total life-cycle cost consists of the construction cost and the expected costs of inspections, repair, and failure. A repair is conducted when the failure probability from any of the described failure modes exceeds a pre-defined probability threshold after an inspection. Moreover, this study investigates the impact of repair threshold values and unit costs of inspection and failure on the expected total life-cycle cost and optimal inspection interval through a parametric study. The analysis suggests that a smaller inspection interval leads to higher inspection costs but lower failure cost; the repair cost is less significant compared with the inspection and failure costs.
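
    A simplified Monte Carlo illustration of the probabilistic power-law growth idea (the distributions below are assumptions for illustration, not the thesis's Bayesian posteriors):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
k = rng.lognormal(mean=np.log(0.4), sigma=0.3, size=N)   # rate, mm / yr^n
n = rng.normal(0.9, 0.1, size=N)                         # growth exponent
t0, wt = 5.0, 10.0    # defect initiation year, wall thickness in mm

def p_fail(t, frac=0.8):
    """Probability that defect depth d(t) = k*(t - t0)^n exceeds frac*wt."""
    depth = k * np.maximum(t - t0, 0.0) ** n
    return np.mean(depth > frac * wt)

for t in (10, 20, 30):
    print(t, p_fail(t))    # failure probability grows with elapsed time
```

    Curves like these, per failure mode and per sub-system, are what feed the life-cycle cost trade against inspection interval.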

  8. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data

    PubMed Central

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2012-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, there is often prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic bias of the estimates can be reduced significantly while keeping the asymptotic variance the same as that of the unguided estimator. We examine the performance of our method via a simulation study and demonstrate it on a real data set on mergers and acquisitions. PMID:23645976
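
    The guide-then-smooth procedure can be sketched for a single covariate: fit the parametric guide, smooth the residuals nonparametrically, and add the trend back. The guide family and kernel smoother below are toy choices, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 200))
f = lambda u: np.sin(2 * np.pi * u) + 0.3 * u    # true regression function
y = f(x) + rng.normal(0, 0.1, x.size)

# step 1: fit a parametric guide (here a cubic polynomial family)
guide = np.polynomial.Polynomial.fit(x, y, deg=3)

# step 2: kernel-smooth the residuals (Nadaraya-Watson, Gaussian kernel)
def ksmooth(x0, xs, rs, h=0.05):
    w = np.exp(-0.5 * ((x0[:, None] - xs[None, :]) / h) ** 2)
    return (w @ rs) / w.sum(axis=1)

resid = y - guide(x)
fhat = guide(x) + ksmooth(x, x, resid)    # step 3: add the trend back
rmse = np.sqrt(np.mean((fhat - f(x)) ** 2))
print(rmse)
```

    A guide that captures most of the trend leaves the smoother an easier, flatter target, which is the intuition behind the bias reduction.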

  9. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data.

    PubMed

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2013-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, there is often prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic bias of the estimates can be reduced significantly while keeping the asymptotic variance the same as that of the unguided estimator. We examine the performance of our method via a simulation study and demonstrate it on a real data set on mergers and acquisitions.

  10. A computational biomechanical analysis to assess the trade-off between chest deflection and spine translation in side impact.

    PubMed

    Pipkorn, Bengt; Subit, Damien; Donlon, John Paul; Sunnevång, Cecilia

    2014-01-01

    The objective of this study is to evaluate how the impact energy is apportioned between chest deflection and translation of the vehicle occupant for various side impact conditions. The Autoliv Total Human Model for Safety (modified THUMS v1.4) was subjected to localized lateral constant velocity impacts to the upper body. First, the impact tests performed on postmortem human subjects (PMHS) were replicated to evaluate THUMS biofidelity. In these tests, a 75-mm-tall flat probe impacted the thorax at 3 m/s at 3 levels (shoulder, upper chest, and mid-chest) and 3 angles (lateral, +15° posterolateral, and -15° anterolateral), for a stroke of 72 mm. Second, a parametric analysis was performed: the Autoliv THUMS response to a 250-mm impact was evaluated for varying impact levels (shoulder to mid-thorax by 50-mm increments), obliquity (0° [pure lateral] to +20° [posterior impacts] and to -20° [anterior impacts], by 5° steps), and impactor pitch (from 0 to 25° by 5° steps). A total of 139 simulations were run. The impactor force, chest deflection, spine displacement, and spine velocity were calculated for each simulation. The Autoliv THUMS biofidelity was found acceptable. Overall, the predictions from the model were in good agreement with the PMHS results. The worst ratings were observed for the anterolateral impacts. For the parametric analysis, maximum chest deflection (MCD) and maximum spine displacement (MSD) were found to consistently follow opposite trends with increasing obliquity. This trend was level dependent, with greater MCD (lower MSD) for the higher impact levels. However, the spine velocity for the 250-mm impactor stroke followed an independent trend that could not be linked to MCD or MSD. This suggests that the spine velocity, which can be used as a proxy for the thorax kinetic energy, needs to be included in the design parameters of countermeasures for side impact protection. 
The parametric analysis reveals a trade-off between the deformation of the chest (and therefore the risk of rib fracture) and the lateral translation of the spine: reducing the maximum chest deflection comes at the cost of increasing the occupant lateral displacement. The trade-off between MCD and MSD is location dependent, which suggests that an optimum point of loading on the chest for the action of a safety system can be found.

  11. Phase mismatched optical parametric generation in semiconductor magnetoplasma

    NASA Astrophysics Data System (ADS)

    Dubey, Swati; Ghosh, S.; Jain, Kamal

    2017-05-01

    Optical parametric generation involves the interaction of pump, signal, and idler waves satisfying the law of conservation of energy. The phase-mismatch parameter plays an important role in the spatial distribution of the field along the medium. In this paper, instead of an exactly matched wave vector, a small mismatch is admitted, with a degree of phase-velocity mismatch between these waves; hence the medium must possess a certain finite coherence length. This wave-mixing process is well explained by coupled-mode theory and a one-dimensional hydrodynamic model. Based on this scheme, expressions for the threshold pump field and the transmitted intensity have been derived. It is observed that the threshold pump intensity and transmitted intensity can be manipulated by varying the doping concentration and magnetic field under the phase-mismatched condition. A compound semiconductor crystal of n-InSb is assumed to be illuminated at 77 K by a 10.6 μm CO2 laser with photon energy well below the band-gap energy of the crystal, so that only free charge carriers influence the optical properties of the medium for IR parametric generation in a semiconductor plasma medium. Favorable parameters were explored for exciting the said process, keeping in mind the cost effectiveness and conversion efficiency of the process.

  12. The detection of pleural effusion using a parametric EIT technique.

    PubMed

    Arad, M; Zlochiver, S; Davidson, T; Shoenfeld, Y; Adunsky, A; Abboud, S

    2009-04-01

    The bioimpedance technique provides a safe, low-cost and non-invasive alternative for routine monitoring of lung fluid levels in patients. In this study we have investigated the feasibility of bioimpedance measurements to monitor pleural effusion (PE) patients. The measurement system (eight-electrode thoracic belt, opposite sequential current injections, 3 mA, 20 kHz) employed a parametric reconstruction algorithm to assess the left and right lung resistivity values. Bioimpedance measurements were taken before and after the removal of pleural fluids, while the patient was sitting at rest during tidal respiration in order to minimize movements of the thoracic cavity. The mean resistivity difference between the lung on the side with PE and the lung on the other side was -48 Ω cm. A high correlation was found between the mean lung resistivity value before the removal of the fluids and the volume of pleural fluids removed, with a sensitivity of -0.17 Ω cm ml⁻¹ (linear regression, R=0.53). The present study further supports the feasibility and applicability of the bioimpedance technique, and specifically the approach of parametric left and right lung resistivity reconstruction, in monitoring lung patients.

  13. Maximum Marginal Likelihood Estimation of a Monotonic Polynomial Generalized Partial Credit Model with Applications to Multiple Group Analysis.

    PubMed

    Falk, Carl F; Cai, Li

    2016-06-01

    We present a semi-parametric approach to estimating item response functions (IRF) useful when the true IRF does not strictly follow commonly used functions. Our approach replaces the linear predictor of the generalized partial credit model with a monotonic polynomial. The model includes the regular generalized partial credit model at the lowest order polynomial. Our approach extends Liang's (A semi-parametric approach to estimate IRFs, Unpublished doctoral dissertation, 2007) method for dichotomous item responses to the case of polytomous data. Furthermore, item parameter estimation is implemented with maximum marginal likelihood using the Bock-Aitkin EM algorithm, thereby facilitating multiple group analyses useful in operational settings. Our approach is demonstrated on both educational and psychological data. We present simulation results comparing our approach to more standard IRF estimation approaches and other non-parametric and semi-parametric alternatives.
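
    The generalized partial credit model that the approach builds on (and recovers at the lowest-order polynomial) computes category probabilities from cumulative sums of the linear predictor. A minimal sketch for one item:

```python
import numpy as np

def gpcm_probs(theta, a, b):
    """GPCM category probabilities P(X=k | theta), k = 0..len(b);
    a is the slope, b the step difficulties (linear-predictor case)."""
    steps = np.concatenate(([0.0], np.cumsum(a * (theta - np.asarray(b)))))
    e = np.exp(steps - steps.max())    # stabilized softmax
    return e / e.sum()

p = gpcm_probs(theta=0.5, a=1.2, b=[-1.0, 0.0, 1.0])
print(np.round(p, 3))   # four category probabilities summing to 1
```

    The semi-parametric extension replaces the linear term a*(theta - b_j) with a monotonic polynomial in theta, which this sketch does not attempt.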

  14. THz-wave parametric sources and imaging applications

    NASA Astrophysics Data System (ADS)

    Kawase, Kodo

    2004-12-01

    We have studied the generation of terahertz (THz) waves by optical parametric processes based on laser light scattering from the polariton mode of nonlinear crystals. Using parametric oscillation of a MgO-doped LiNbO3 crystal pumped by a nanosecond Q-switched Nd:YAG laser, we have realized a widely tunable coherent THz-wave source with a simple configuration. We have also developed a novel basic technology for THz imaging, which allows detection and identification of chemicals by introducing component spatial pattern analysis. The spatial distributions of the chemicals were obtained from terahertz multispectral transillumination images, using absorption spectra previously measured with a widely tunable THz-wave parametric oscillator. Further, we have applied this technique to the detection and identification of illicit drugs concealed in envelopes. The samples we used were methamphetamine and MDMA, two of the most widely consumed illegal drugs in Japan, and aspirin as a reference.
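
    Component spatial pattern analysis amounts to unmixing each pixel's multispectral absorbance into known component spectra; a common implementation choice (assumed here, not taken from the paper) is non-negative least squares:

```python
import numpy as np
from scipy.optimize import nnls

# reference absorption spectra of two chemicals over 8 bands (synthetic)
S = np.array([[1.0, 0.1], [0.8, 0.2], [0.5, 0.4], [0.3, 0.7],
              [0.2, 0.9], [0.1, 1.0], [0.4, 0.6], [0.9, 0.15]])
true_maps = np.array([[0.8, 0.0], [0.2, 0.5], [0.0, 1.0]])   # 3 pixels
rng = np.random.default_rng(5)
A = true_maps @ S.T + rng.normal(0, 0.01, size=(3, 8))   # measured absorbance

est = np.array([nnls(S, a)[0] for a in A])   # per-pixel component abundances
print(np.round(est, 2))   # close to true_maps
```

    Repeating this per pixel yields one spatial map per chemical, which is how the concealed samples are both located and identified.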

  15. Parametrization study of the land multiparameter VTI elastic waveform inversion

    NASA Astrophysics Data System (ADS)

    He, W.; Plessix, R.-É.; Singh, S.

    2018-06-01

    Multiparameter inversion of seismic data remains challenging due to the trade-off between the different elastic parameters and the non-uniqueness of the solution. The sensitivity of the seismic data to a given subsurface elastic parameter depends on the source and receiver ray/wave path orientations at the subsurface point. In a high-frequency approximation, this is commonly analysed through the study of the radiation patterns that indicate the sensitivity of each parameter versus the incoming (from the source) and outgoing (to the receiver) angles. In practice, this means that the inversion result becomes sensitive to the choice of parametrization, notably because the null-space of the inversion depends on this choice. We can use a least-overlapping parametrization that minimizes the overlaps between the radiation patterns, in which case each parameter is sensitive only in a restricted angle domain, or an overlapping parametrization that contains a parameter sensitive to all angles, in which case overlaps between the radiation patterns occur. Considering a multiparameter inversion in an elastic vertically transverse isotropic medium and a complex land geological setting, we show that the inversion with the least-overlapping parametrization gives less satisfactory results than with the overlapping parametrization. The difficulties come from the complex wave paths, which make it difficult to predict the areas of sensitivity of each parameter. This shows that the parametrization choice should not only be based on the radiation pattern analysis but also on the angular coverage at each subsurface point, which depends on the geology and the acquisition layout.

  16. Non-parametric directionality analysis - Extension for removal of a single common predictor and application to time series.

    PubMed

    Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob

    2016-08-01

    The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low-order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimate directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time-domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions, and to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel non-parametric alternative for estimating directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data. Copyright © 2016 Elsevier B.V. All rights reserved.
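
    The core lag-sign idea behind non-parametric directionality can be sketched in a toy bivariate setting: decompose the squared correlation summatively into reverse, zero-lag, and forward parts from the cross-correlation at negative, zero, and positive lags (the signals below are already white, so the pre-whitening step the method requires is omitted):

```python
import numpy as np

rng = np.random.default_rng(4)
n, delay = 10_000, 3
x = rng.normal(size=n + delay)
y = 0.7 * x[:-delay] + rng.normal(size=n)   # y follows x after `delay` samples
x = x[delay:]                               # align records; x now leads y

def xcorr(a, b, maxlag=10):
    """Normalized cross-correlation r_ab(k) = corr(a[t], b[t+k])."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return np.array([np.mean(a[max(0, -k):n - max(0, k)]
                             * b[max(0, k):n - max(0, -k)])
                     for k in range(-maxlag, maxlag + 1)])

r = xcorr(x, y)
lags = np.arange(-10, 11)
R2_rev = np.sum(r[lags < 0] ** 2)    # y -> x (reverse) component
R2_zero = r[lags == 0][0] ** 2       # instantaneous component
R2_fwd = np.sum(r[lags > 0] ** 2)    # x -> y (forward) component
print(R2_fwd, R2_zero, R2_rev)       # forward direction dominates
```

    The conditional (partial) versions in the paper apply the same decomposition after removing the common predictor from both signals.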

  17. Comparison of Salmonella enteritidis phage types isolated from layers and humans in Belgium in 2005.

    PubMed

    Welby, Sarah; Imberechts, Hein; Riocreux, Flavien; Bertrand, Sophie; Dierick, Katelijne; Wildemauwe, Christa; Hooyberghs, Jozef; Van der Stede, Yves

    2011-08-01

    The aim of this study was to investigate the available results for Belgium of the European Union coordinated monitoring program (2004/665 EC) on Salmonella in layers in 2005, as well as the results of the monthly outbreak reports of Salmonella Enteritidis in humans in 2005, to identify possible statistically significant trends in both populations. Separate descriptive statistics and univariate analyses were carried out, and parametric and/or non-parametric hypothesis tests were conducted. A time cluster analysis was performed for all Salmonella Enteritidis phage types (PTs) isolated. The proportions of each Salmonella Enteritidis PT in layers and in humans were compared, and the monthly distribution of the most common PT isolated in both populations was evaluated. The time cluster analysis revealed significant clusters during the months May and June for layers and May, July, August, and September for humans. PT21, the most frequently isolated PT in both populations in 2005, seemed to be responsible for these significant clusters. PT4 was the second most frequently isolated PT. No significant difference was found in the monthly trend of either PT in the two populations based on parametric and non-parametric methods. A similar monthly trend of PT distribution in humans and layers during the year 2005 was observed. The time cluster analysis and the statistical significance testing confirmed these results. Moreover, the time cluster analysis showed significant clusters during the summer, slightly shifted in time (humans after layers). These results suggest a common link between the prevalence of Salmonella Enteritidis in layers and the occurrence of the pathogen in humans. Phage typing was confirmed to be a useful tool for identifying temporal trends.
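
    The kind of parametric/non-parametric comparison described can be sketched with invented monthly counts (not the 2005 Belgian data):

```python
import numpy as np
from scipy import stats

layers = np.array([3, 5, 9, 14, 18, 16, 10, 8, 6, 4, 3, 2])   # monthly isolates
humans = np.array([4, 6, 8, 12, 20, 22, 15, 14, 9, 5, 4, 3])

t_p = stats.ttest_ind(layers, humans).pvalue        # parametric test
u_p = stats.mannwhitneyu(layers, humans).pvalue     # non-parametric counterpart
print(t_p, u_p)   # neither test rejects a common monthly distribution
```

    Running both, as the study did, guards against the normality assumption of the parametric test being violated by count data.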

  18. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and to process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT provides a means of investigating combinations of such subsystem technologies and thereby assists in determining the most cost-effective technology combination available. ALSSAT can perform sizing analyses of ALS subsystems that operate either dynamically or at steady state. Built with the Microsoft Excel spreadsheet software and the Visual Basic programming language, ALSSAT performs multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies obtained by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
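
    An Equivalent System Mass comparison of the kind ALSSAT automates can be sketched as follows; the equivalency factors and subsystem numbers are invented for illustration:

```python
def esm(mass_kg, vol_m3, power_kW, cooling_kW, crew_time_hr=0.0,
        Veq=66.7, Peq=237.0, Ceq=60.0, CTeq=1.0):
    """ESM in kg; Veq [kg/m^3], Peq and Ceq [kg/kW], CTeq [kg/crew-hr]."""
    return (mass_kg + vol_m3 * Veq + power_kW * Peq
            + cooling_kW * Ceq + crew_time_hr * CTeq)

# compare two hypothetical water-recovery options
option_a = esm(mass_kg=500, vol_m3=2.0, power_kW=1.5, cooling_kW=1.5)
option_b = esm(mass_kg=350, vol_m3=1.5, power_kW=2.5, cooling_kW=2.5)
print(option_a, option_b)   # the lower-ESM option wins the trade study
```

    Converting volume, power, and cooling demands into mass equivalents is what lets dissimilar subsystem technologies be ranked on a single scale.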

  19. Inverse Thermal Analysis of Titanium GTA Welds Using Multiple Constraints

    NASA Astrophysics Data System (ADS)

    Lambrakos, S. G.; Shabaev, A.; Huang, L.

    2015-06-01

    Inverse thermal analysis of titanium gas-tungsten-arc welds using multiple constraint conditions is presented. This analysis employs a methodology that is in terms of numerical-analytical basis functions for inverse thermal analysis of steady-state energy deposition in plate structures. The results of this type of analysis provide parametric representations of weld temperature histories that can be adopted as input data to various types of computational procedures, such as those for prediction of solid-state phase transformations. In addition, these temperature histories can be used to construct parametric function representations for inverse thermal analysis of welds corresponding to other process parameters or welding processes whose process conditions are within similar regimes. The present study applies an inverse thermal analysis procedure that provides for the inclusion of constraint conditions associated with both solidification and phase transformation boundaries.

  20. Space Transportation System (STS) propellant scavenging system study. Volume 3: Cost and work breakdown structure-dictionary

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Fundamentally, the volumes of the oxidizer and fuel propellant scavenged from the orbiter and external tank determine the size and weight of the scavenging system. The optimization of system dimensions and weights is driven by the requirement to minimize the use of partial length of the orbiter payload bay. Thus, the cost estimates begin with weights established for the optimum design. Both the design, development, test, and evaluation and the theoretical first unit hardware production costs are estimated from parametric cost-weight scaling relations for four subsystems. For cryogenic propellants, the widely differing characteristics of the oxidizer and the fuel lead to two separate tank subsystems, in addition to the electrical and instrumentation subsystems. Hardware costs also involve quantity as an independent variable, since the number of production scavenging systems is not firm. For storable propellants, since the tankage volumes of the oxidizer and fuel are equal, the hardware production costs for developing these systems are lower than for cryogenic propellants.
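
    A parametric cost-weight scaling relation with a quantity effect can be sketched as a power-law CER plus a unit learning curve; all coefficients below are invented, not the study's proprietary values:

```python
import math

def cer(weight_kg, a, x):
    """Weight-based cost estimating relation: cost = a * W^x (cost in $M)."""
    return a * weight_kg ** x

def production_cost(tfu, n_units, learning=0.9):
    """Total cost of n_units under a unit (Crawford) learning curve:
    unit N costs tfu * N^b with b = log2(learning slope)."""
    b = math.log2(learning)
    return sum(tfu * n ** b for n in range(1, n_units + 1))

tfu = cer(1200.0, a=0.05, x=0.7)    # e.g. a tank-subsystem first unit
print(tfu, production_cost(tfu, 5))
```

    Leaving quantity as an independent variable, as the study does, means reporting production cost as a function of n_units rather than a single number.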

  1. Source Lines Counter (SLiC) Version 4.0

    NASA Technical Reports Server (NTRS)

    Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.

    2011-01-01

    Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OS X. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
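
    A minimal sketch of logical source statement counting for a C-like language (real counters, including SLiC, additionally handle string literals, preprocessor lines, and per-language statement rules):

```python
import re

def logical_sloc(source: str) -> int:
    """Crude logical-statement count: strip comments, then count
    ';' statement terminators plus '{' block openers."""
    src = re.sub(r"/\*.*?\*/", "", source, flags=re.S)   # /* ... */ comments
    src = re.sub(r"//[^\n]*", "", src)                   # // line comments
    return src.count(";") + src.count("{")

sample = """
/* add two numbers */
int add(int a, int b) {
    int c = a + b;  // sum
    return c;
}
"""
print(logical_sloc(sample))  # 3: one '{' and two ';'
```

    Counting logical statements rather than physical lines is what makes the measure stable against formatting style, which matters when feeding it to cost models.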

  2. The value of improved (ERS) information based on domestic distribution effects of U.S. agriculture crops

    NASA Technical Reports Server (NTRS)

    Bradford, D. F.; Kelejian, H. H.; Brusch, R.; Gross, J.; Fishman, H.; Feenberg, D.

    1974-01-01

    The value of improving information for forecasting future crop harvests was investigated. Emphasis was placed upon establishing practical evaluation procedures firmly based in economic theory. The analysis was applied to the case of U.S. domestic wheat consumption. Estimates for a cost of storage function and a demand function for wheat were calculated. A model of market determination of wheat inventories was developed for inventory adjustment. The carry-over horizon is computed by the solution of a nonlinear programming problem, and related variables such as spot and future price at each stage are determined. The model is adaptable to other markets. Results are shown to depend critically on the accuracy of current and proposed measurement techniques. The quantitative results are presented parametrically, in terms of various possible values of current and future accuracies.

  3. Hyperpolarizability and Operational Magic Wavelength in an Optical Lattice Clock

    NASA Astrophysics Data System (ADS)

    Brown, R. C.; Phillips, N. B.; Beloy, K.; McGrew, W. F.; Schioppo, M.; Fasano, R. J.; Milani, G.; Zhang, X.; Hinkley, N.; Leopardi, H.; Yoon, T. H.; Nicolodi, D.; Fortier, T. M.; Ludlow, A. D.

    2017-12-01

    Optical clocks benefit from tight atomic confinement enabling extended interrogation times as well as Doppler- and recoil-free operation. However, these benefits come at the cost of frequency shifts that, if not properly controlled, may degrade clock accuracy. Numerous theoretical studies have predicted optical lattice clock frequency shifts that scale nonlinearly with trap depth. To experimentally observe and constrain these shifts in an 171Yb optical lattice clock, we construct a lattice enhancement cavity that exaggerates the light shifts. We observe an atomic temperature that is proportional to the optical trap depth, fundamentally altering the scaling of trap-induced light shifts and simplifying their parametrization. We identify an "operational" magic wavelength where frequency shifts are insensitive to changes in trap depth. These measurements and scaling analysis constitute an essential systematic characterization for clock operation at the 10-18 level and beyond.

  4. Potency control of modified live viral vaccines for veterinary use.

    PubMed

    Terpstra, C; Kroese, A H

    1996-04-01

    This paper reviews various aspects of efficacy, and methods for assaying the potency of modified live viral vaccines. The pros and cons of parametric versus non-parametric methods for analysis of potency assays are discussed and critical levels of protection, as determined by the target(s) of vaccination, are exemplified. Recommendations are presented for designing potency assays on master virus seeds and vaccine batches.

  6. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and bear little resemblance to the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high-payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. 
    Next, a quick review of cost estimation techniques is made to highlight their inappropriateness for what is really needed at the conceptual phase of the design process. The First-Order Process Velocity Cost Model (FOPV) is discussed at length in the next section. This is followed by an application of the FOPV cost model to a generic wing. For designs that have no precedence as far as acquisition costs are concerned, cost data derived from the FOPV cost model may not be accurate enough because of new requirements for shape complexity, material, equipment, and precision/tolerance. The concept of cost modulus is introduced at this point to compensate for these new burdens on the basic processes. This is treated in section 5. The cost of a design must be conveniently linked to its CAD representation. The interfacing of CAD models and spreadsheets containing the cost equations is the subject of section 6. The last section of the report is a summary of the progress made so far and the anticipated research work to be achieved in the future.
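    A minimal sketch of the first-order-velocity idea with a complexity modulus, under assumed (not the report's) parameter values: if the process speed approaches a maximum with first-order dynamics, v(t) = v_max(1 - exp(-t/tau)), then the time to cover a process length s solves s = v_max(t - tau(1 - exp(-t/tau))), and elemental cost is labor rate times that time, scaled by a cost modulus.

    ```python
    # Hedged sketch of an FOPV-style elemental cost. All parameter values
    # (v_max, tau, rate, modulus) are illustrative assumptions.
    import math

    def process_time(s, v_max, tau, tol=1e-9):
        """Invert s(t) = v_max*(t - tau*(1 - exp(-t/tau))) by bisection."""
        lo, hi = 0.0, s / v_max + 10.0 * tau  # hi certainly covers length s
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            covered = v_max * (mid - tau * (1.0 - math.exp(-mid / tau)))
            lo, hi = (mid, hi) if covered < s else (lo, mid)
        return 0.5 * (lo + hi)

    def elemental_cost(s, v_max, tau, rate_per_hr, cost_modulus=1.0):
        """Labor cost of one geometric element, scaled by a complexity modulus."""
        return rate_per_hr * process_time(s, v_max, tau) * cost_modulus

    # A hypothetical wing element: 12 m of process length, assumed modulus 1.8.
    print(round(elemental_cost(12.0, v_max=3.0, tau=0.5,
                               rate_per_hr=150.0, cost_modulus=1.8), 2))
    ```

    Summing such elemental costs over the geometry of a configuration gives the spreadsheet-ready total the report describes.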

  7. Efficient solution of a multi objective fuzzy transportation problem

    NASA Astrophysics Data System (ADS)

    Vidhya, V.; Ganesan, K.

    2018-04-01

    In this paper we present a methodology for the solution of multi-objective fuzzy transportation problem when all the cost and time coefficients are trapezoidal fuzzy numbers and the supply and demand are crisp numbers. Using a new fuzzy arithmetic on parametric form of trapezoidal fuzzy numbers and a new ranking method all efficient solutions are obtained. The proposed method is illustrated with an example.
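    A generic illustration of the objects involved: a trapezoidal fuzzy number (a, b, c, d) in parametric form, with componentwise addition and a simple average-of-vertices ranking score. The paper's specific fuzzy arithmetic and ranking method are not reproduced here.

    ```python
    # Hedged sketch: trapezoidal fuzzy numbers in parametric form.
    # The ranking used here is a simple centroid-style score, assumed
    # for illustration; it is not the paper's ranking method.
    class TrapFN:
        def __init__(self, a, b, c, d):
            self.q = (a, b, c, d)
        def __add__(self, other):
            return TrapFN(*(x + y for x, y in zip(self.q, other.q)))
        def lower(self, r):   # L(r) = a + (b - a)r, r in [0, 1]
            a, b, _, _ = self.q
            return a + (b - a) * r
        def upper(self, r):   # U(r) = d - (d - c)r, r in [0, 1]
            _, _, c, d = self.q
            return d - (d - c) * r
        def rank(self):       # average of the four vertices
            return sum(self.q) / 4.0

    cost1 = TrapFN(1, 2, 3, 4)   # fuzzy unit transport cost on route 1
    cost2 = TrapFN(2, 3, 5, 6)   # fuzzy unit transport cost on route 2
    total = cost1 + cost2
    print(total.q, cost1.rank() < cost2.rank())
    ```

    With crisp supply and demand, a classical transportation algorithm can then compare routes via such ranking scores.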

  8. Cost-effectiveness analysis in the Spanish setting of the PEAK trial of panitumumab plus mFOLFOX6 compared with bevacizumab plus mFOLFOX6 for first-line treatment of patients with wild-type RAS metastatic colorectal cancer.

    PubMed

    Rivera, Fernando; Valladares, Manuel; Gea, Salvador; López-Martínez, Noemí

    2017-06-01

    To assess the cost-effectiveness of panitumumab in combination with mFOLFOX6 (oxaliplatin, 5-fluorouracil, and leucovorin) vs bevacizumab in combination with mFOLFOX6 as first-line treatment of patients with wild-type RAS metastatic colorectal cancer (mCRC) in Spain. A semi-Markov model was developed including the following health states: Progression free; Progressive disease: Treat with best supportive care; Progressive disease: Treat with subsequent active therapy; Attempted resection of metastases; Disease free after metastases resection; Progressive disease: after resection and relapse; and Death. Parametric survival analyses of patient-level progression free survival and overall survival data from the PEAK Phase II clinical trial were used to estimate health state transitions. Additional data from the PEAK trial were considered for the dose and duration of therapy, the use of subsequent therapy, the occurrence of adverse events, and the incidence and probability of time to metastasis resection. Utility weightings were calculated from patient-level data from panitumumab trials evaluating first-, second-, and third-line treatments. The study was performed from the Spanish National Health System (NHS) perspective including only direct costs. A life-time horizon was applied. Probabilistic sensitivity analyses and scenario sensitivity analyses were performed to assess the robustness of the model. Based on the PEAK trial, which demonstrated greater efficacy of panitumumab vs bevacizumab, both in combination with mFOLFOX6 first-line in wild-type RAS mCRC patients, the estimated incremental cost per life-year gained was €16,567 and the estimated incremental cost per quality-adjusted life year gained was €22,794. The sensitivity analyses showed the model was robust to alternative parameters and assumptions. The analysis was based on a simulation model and, therefore, the results should be interpreted cautiously. 
Based on the PEAK Phase II clinical trial and taking into account Spanish costs, the results of the analysis showed that first-line treatment of mCRC with panitumumab + mFOLFOX6 could be considered a cost-effective option compared with bevacizumab + mFOLFOX6 for the Spanish NHS.
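    The incremental cost-effectiveness ratios quoted above follow the standard definition: incremental cost divided by incremental effect. A one-line sketch with hypothetical numbers (not the trial's inputs):

    ```python
    # Standard ICER definition; the input values below are invented.
    def icer(cost_new, cost_ref, effect_new, effect_ref):
        """Incremental cost per unit of incremental effect (e.g. per QALY)."""
        return (cost_new - cost_ref) / (effect_new - effect_ref)

    # Hypothetical values: costs in euros, effects in QALYs per patient.
    print(round(icer(60_000, 48_000, 2.3, 1.9)))  # -> 30000
    ```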

  9. Costs associated with failure to respond to treatment among patients with rheumatoid arthritis initiating TNFi therapy: a retrospective claims analysis.

    PubMed

    Grabner, Michael; Boytsov, Natalie N; Huang, Qing; Zhang, Xiang; Yan, Tingjian; Curtis, Jeffrey R

    2017-05-15

    Tumor necrosis factor inhibitors (TNFi) are common second-line treatments for rheumatoid arthritis (RA). This study was designed to compare the real-world clinical and economic outcomes between patients with RA who responded to TNFi therapy and those who did not. For this retrospective cohort analysis we used medical and pharmacy claims from members of 14 large U.S. commercial health plans represented in the HealthCore Integrated Research Database. Adult patients (aged ≥18 years) diagnosed with RA and initiating TNFi therapy (index date) between 1 January 2007 and 30 April 2014 were included in the study. Treatment response was assessed using a previously developed and validated claims-based algorithm. Patients classified as treatment responders in the 12 months postindex were matched 1:1 to nonresponders on important baseline characteristics, including sex, age, index TNFi agent, and comorbidities. The matched cohorts were then compared on their all-cause and RA-related healthcare resource use, and costs were assessed from a payer perspective during the first, second, and third years postindex using parametric tests, regressions, and a nonparametric bootstrap. A total of 7797 patients met the study inclusion criteria, among whom 2337 (30%) were classified as treatment responders. The responders had significantly fewer all-cause hospitalizations, emergency department visits, and physical/occupational therapy visits than matched nonresponders during the first year postindex. Mean total all-cause medical costs were $5737 higher for matched nonresponders, largely driven by outpatient visits and hospitalizations. Mean all-cause pharmacy costs (excluding costs of biologics) were $354 higher for matched nonresponders. Mean RA-related pharmacy costs (conventional synthetic and biologic drugs), however, were $8579 higher in the responder cohort, driven by higher adherence to their index TNFi agent (p < 0.01 for all comparisons). 
A similar pattern of cost differentiation was observed over years 2 and 3 of follow-up. In this real-world study we found that, compared with matched nonresponders, patients who responded to TNFi treatments had lower all-cause medical, pharmacy, and total costs (excluding biologics) up to 3 years from initiation of TNFi therapy. These cost differences between the two cohorts provide a considerable offset to the cost of RA medications and should encourage close monitoring of treatment response to minimize disease progression with appropriate therapy choices.

  10. The Dangers of Parametrics

    NASA Technical Reports Server (NTRS)

    Prince, Frank A.

    2017-01-01

    Building a parametric cost model is hard work. The data is noisy and often does not behave like we want it to. We need statistics to give us an indication of the goodness of our models, but statistics can be manipulated and can mislead. On top of all of that, our own very human biases can lead us astray, causing us to see patterns in the noise and draw false conclusions from the data. Yet, it is the data itself that is the foundation for making better cost estimates and cost models. I believe the mistake we often make is believing that our models are representative of the data; that our models summarize the experiences, the knowledge, and the stories contained in the data. However, the opposite is true. Our models are but imitations of reality. They give us trends, but not truth. The experiences, the knowledge, and the stories that we need in order to make good cost estimates are bound up in the data. You cannot separate good cost estimating from a knowledge of the historical data. One final thought: it is our attempts to make sense out of the randomness that lead us astray. In order to make progress as cost modelers and cost estimators, we must accept that there are real limitations on our ability to model the past and predict the future. I do not believe we should throw up our hands and say this is the best we can do. Rather, to see real improvement we must first recognize these limitations, avoid the easy but misleading solutions, and seek to find ways to better model the world we live in. I don't have any simple solutions. Perhaps the answers lie in better data or in a totally different approach to simulating how the world works. All I know is that we must do our best to speak truth to ourselves and our customers. Misleading ourselves and our customers will, in the end, result in an inability to have a positive impact on those we serve.

  11. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistical data analysis application that can be accessed through mobile devices, making it easier for users to access statistical tools. The application covers various topics in basic statistics along with parametric statistical data analysis. The output of this application is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language. The server side uses PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and to make it easier for students to understand statistical analysis on mobile devices.

  12. An Economic Evaluation of Stopping versus Continuing TNF-Inhibitor Treatment in Rheumatoid Arthritis Patients in Remission or Low Disease Activity: results from the POET randomized trial.

    PubMed

    Tran-Duy, An; Ghiti Moghadam, Marjan; Oude Voshaar, Martijn A H; Vonkeman, Harald E; Boonen, Annelies; Clarke, Philip; McColl, Geoff; Ten Klooster, Peter M; Zijlstra, T R; Lems, Willem F; Riyazi, N; Griep, E N; Hazes, J M W; Landewé, Robert; Bernelot Moens, Hein J; van Riel, Piet L C M; van de Laar, Mart A F J; Jansen, T L

    2018-05-09

    To evaluate, from a societal perspective, the incremental cost-effectiveness of withdrawing tumor necrosis factor inhibitors (TNFis) compared to continuation of these drugs within a one-year randomized trial among patients with rheumatoid arthritis (RA) having longstanding stable disease activity or remission. Data were collected from a pragmatic, open-label trial. Cost-utility analysis was performed using the non-parametric bootstrapping method, and a cost-effectiveness acceptability curve was constructed using the net monetary benefit (NMB) framework, where a willingness-to-accept threshold (WTA) was defined as the minimal cost saved that a patient accepted for each quality-adjusted life year (QALY) lost. 531 patients were randomized to the Stop Group and 186 patients to the Continue Group. Withdrawal of TNFis resulted in more than 60% reduction of the total drug cost, but led to an increase of about 30% in the other healthcare expenditure. Compared to continuation, stopping TNFis resulted in a mean yearly cost saving of €7,133 (95% CI, [€6,071, €8,234]) and was associated with a mean loss of QALYs of 0.02 (95% CI, [0.002, 0.040]). Mean saved cost [95% CI] per QALY lost and per extra flare incurred in the Stop Group compared to the Continue Group was €368,269 [€155,132, €1,675,909] and €17,670 [€13,650, €22,721], respectively. At a WTA of €98,438 per QALY lost, the probability that stopping TNFis is cost-effective was 100%. Although an official WTA is not defined, the mean saved cost of €368,269 per QALY lost seems acceptable in The Netherlands, given existing data on the willingness-to-pay. This article is protected by copyright. All rights reserved.
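    A minimal sketch of the non-parametric bootstrap used in such cost-utility analyses: resample patient-level (cost, QALY) pairs per arm, compute incremental cost and QALY differences each draw, and evaluate net monetary benefit at a chosen threshold. The data below are simulated, not the trial's.

    ```python
    # Hedged sketch: bootstrap cost-utility comparison on simulated data.
    # Patient-level values, arm sizes, and distributions are invented.
    import random

    random.seed(1)
    stop = [(random.gauss(8_000, 2_000), random.gauss(0.78, 0.05)) for _ in range(200)]
    cont = [(random.gauss(15_000, 2_000), random.gauss(0.80, 0.05)) for _ in range(200)]

    def mean(xs):
        return sum(xs) / len(xs)

    def boot_nmb(a, b, threshold, reps=1000):
        """P(stopping is cost-effective) at a willingness threshold per QALY."""
        wins = 0
        for _ in range(reps):
            ra = [random.choice(a) for _ in a]
            rb = [random.choice(b) for _ in b]
            d_cost = mean([c for c, _ in ra]) - mean([c for c, _ in rb])
            d_qaly = mean([q for _, q in ra]) - mean([q for _, q in rb])
            # NMB of stopping vs continuing: QALY change valued at the
            # threshold, minus the incremental cost (a saving if negative).
            if threshold * d_qaly - d_cost > 0:
                wins += 1
        return wins / reps

    print(boot_nmb(stop, cont, threshold=98_438))
    ```

    The proportion of bootstrap draws with positive NMB, traced over a range of thresholds, is exactly what a cost-effectiveness acceptability curve plots.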

  13. Enhanced multi-protocol analysis via intelligent supervised embedding (EMPrAvISE): detecting prostate cancer on multi-parametric MRI

    NASA Astrophysics Data System (ADS)

    Viswanath, Satish; Bloch, B. Nicholas; Chappelow, Jonathan; Patel, Pratik; Rofsky, Neil; Lenkinski, Robert; Genega, Elizabeth; Madabhushi, Anant

    2011-03-01

    Currently, there is significant interest in developing methods for quantitative integration of multi-parametric (structural, functional) imaging data with the objective of building automated meta-classifiers to improve disease detection, diagnosis, and prognosis. Such techniques are required to address the differences in dimensionalities and scales of individual protocols, while deriving an integrated multi-parametric data representation which best captures all disease-pertinent information available. In this paper, we present a scheme called Enhanced Multi-Protocol Analysis via Intelligent Supervised Embedding (EMPrAvISE), a powerful, generalizable framework applicable to a variety of domains for multi-parametric data representation and fusion. Our scheme utilizes an ensemble of embeddings (via dimensionality reduction, DR), thereby exploiting the variance amongst multiple uncorrelated embeddings in a manner similar to ensemble classifier schemes (e.g. Bagging, Boosting). We apply this framework to the problem of prostate cancer (CaP) detection on twelve 3 Tesla pre-operative in vivo multi-parametric (T2-weighted, Dynamic Contrast Enhanced, and Diffusion-weighted) magnetic resonance imaging (MRI) studies, in turn comprising a total of 39 2D planar MR images. We first align the different imaging protocols via automated image registration, followed by quantification of image attributes from individual protocols. Multiple embeddings are generated from the resultant high-dimensional feature space, which are then combined intelligently to yield a single stable solution. Our scheme is employed in conjunction with graph embedding (for DR) and probabilistic boosting trees (PBTs) to detect CaP on multi-parametric MRI. Finally, a probabilistic pairwise Markov Random Field algorithm is used to apply spatial constraints to the result of the PBT classifier, yielding a per-voxel classification of CaP presence. 
Per-voxel evaluation of detection results against ground truth for CaP extent on MRI (obtained by spatially registering pre-operative MRI with available whole-mount histological specimens) reveals that EMPrAvISE yields a statistically significant improvement (AUC=0.77) over classifiers constructed from individual protocols (AUC=0.62, 0.62, 0.65, for T2w, DCE, DWI respectively) as well as one trained using multi-parametric feature concatenation (AUC=0.67).

  14. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

    Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  15. Dual frequency parametric excitation of a nonlinear, multi degree of freedom mechanical amplifier with electronically modified topology

    NASA Astrophysics Data System (ADS)

    Dolev, A.; Bucher, I.

    2018-04-01

    Mechanical or electromechanical amplifiers can exploit the high-Q and low noise features of mechanical resonance, in particular when parametric excitation is employed. Multi-frequency parametric excitation introduces tunability and is able to project weak input signals on a selected resonance. The present paper addresses multi degree of freedom mechanical amplifiers or resonators whose analysis and features require treatment of the spatial as well as temporal behavior. In some cases, virtual electronic coupling can alter the given topology of the resonator to better amplify specific inputs. An analytical development is followed by a numerical and experimental sensitivity and performance verifications, illustrating the advantages and disadvantages of such topologies.

  16. Rayleigh-type parametric chemical oscillation.

    PubMed

    Ghosh, Shyamolina; Ray, Deb Shankar

    2015-09-28

    We consider a nonlinear chemical dynamical system of two phase space variables in a stable steady state. When the system is driven by a time-dependent sinusoidal forcing of a suitable scaling parameter at a frequency twice the output frequency and the strength of perturbation exceeds a threshold, the system undergoes sustained Rayleigh-type periodic oscillation, well known for parametric oscillation in pipe organs and distinct from the usual forced quasiperiodic oscillation of a damped nonlinear system, where the system is oscillatory even in the absence of any external forcing. Our theoretical analysis of the parametric chemical oscillation is corroborated by full numerical simulation of two well-known models of chemical dynamics, the chlorite-iodide-malonic acid and iodine-clock reactions.
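    The threshold behavior described above can be illustrated with a generic Mathieu-type model (not the paper's chemical kinetics): a damped oscillator pumped at twice its natural frequency, x'' + 2g x' + w0^2 (1 + h cos(2 w0 t)) x = 0, grows only when the pump strength h exceeds a threshold of roughly 4g/w0 for weak damping.

    ```python
    # Hedged illustration of parametric resonance with a damped Mathieu-type
    # oscillator, integrated by RK4. Parameters are illustrative.
    import math

    def amplitude_after(h, w0=1.0, g=0.05, t_end=100.0, dt=0.005):
        """Integrate from x=1e-3, v=0; return a final amplitude proxy."""
        def deriv(t, x, v):
            return v, -2.0 * g * v - w0**2 * (1.0 + h * math.cos(2.0 * w0 * t)) * x
        x, v, t = 1e-3, 0.0, 0.0
        while t < t_end:
            k1x, k1v = deriv(t, x, v)
            k2x, k2v = deriv(t + dt/2, x + dt/2*k1x, v + dt/2*k1v)
            k3x, k3v = deriv(t + dt/2, x + dt/2*k2x, v + dt/2*k2v)
            k4x, k4v = deriv(t + dt, x + dt*k3x, v + dt*k3v)
            x += dt/6 * (k1x + 2*k2x + 2*k3x + k4x)
            v += dt/6 * (k1v + 2*k2v + 2*k3v + k4v)
            t += dt
        return math.hypot(x, v / w0)

    # Threshold is ~4*g/w0 = 0.2 here: below it the motion decays,
    # above it the oscillation grows despite the damping.
    print(amplitude_after(0.05) < 1e-3)   # below threshold
    print(amplitude_after(0.40) > 1e-3)   # above threshold
    ```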

  17. Bayesian non-parametric inference for stochastic epidemic models using Gaussian Processes.

    PubMed

    Xu, Xiaoguang; Kypraios, Theodore; O'Neill, Philip D

    2016-10-01

    This paper considers novel Bayesian non-parametric methods for stochastic epidemic models. Many standard modeling and data analysis methods use underlying assumptions (e.g. concerning the rate at which new cases of disease will occur) which are rarely challenged or tested in practice. To relax these assumptions, we develop a Bayesian non-parametric approach using Gaussian Processes, specifically to estimate the infection process. The methods are illustrated with both simulated and real data sets, the former illustrating that the methods can recover the true infection process quite well in practice, and the latter illustrating that the methods can be successfully applied in different settings. © The Author 2016. Published by Oxford University Press.
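    A minimal sketch of plain Gaussian Process regression with an RBF kernel, the building block behind such non-parametric estimates of a time-varying rate; the paper's actual inference scheme for epidemic data is more involved. The "observations" below are invented.

    ```python
    # Hedged sketch: GP posterior mean with an RBF kernel, solved with a
    # small hand-rolled linear solver so the example is self-contained.
    import math

    def rbf(x1, x2, ell=1.0, sf=1.0):
        return sf**2 * math.exp(-0.5 * ((x1 - x2) / ell) ** 2)

    def gp_posterior_mean(xs, ys, x_star, noise=0.1):
        """Posterior mean at x_star via solving (K + noise^2 I) a = y."""
        n = len(xs)
        M = [[rbf(xs[i], xs[j]) + (noise**2 if i == j else 0.0) for j in range(n)]
             for i in range(n)]
        a = list(ys)
        for i in range(n):                       # forward elimination
            for j in range(i + 1, n):
                f = M[j][i] / M[i][i]
                for k in range(i, n):
                    M[j][k] -= f * M[i][k]
                a[j] -= f * a[i]
        for i in reversed(range(n)):             # back substitution
            a[i] = (a[i] - sum(M[i][k] * a[k] for k in range(i + 1, n))) / M[i][i]
        return sum(a[i] * rbf(xs[i], x_star) for i in range(n))

    xs = [0.0, 1.0, 2.0, 3.0, 4.0]
    ys = [0.1, 0.8, 1.0, 0.7, 0.2]   # noisy "infection rate" observations
    print(round(gp_posterior_mean(xs, ys, 2.0), 3))
    ```

    The GP prior replaces a fixed parametric form for the rate, which is exactly the assumption the paper seeks to relax.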

  18. Synthesis and analysis of separation networks for the recovery of intracellular chemicals generated from microbial-based conversions

    DOE PAGES

    Yenkie, Kirti M.; Wu, Wenzhao; Maravelias, Christos T.

    2017-05-08

    Background. Bioseparations can contribute to more than 70% in the total production cost of a bio-based chemical, and if the desired chemical is localized intracellularly, there can be additional challenges associated with its recovery. Based on the properties of the desired chemical and other components in the stream, there can be multiple feasible options for product recovery. These options are composed of several alternative technologies, performing similar tasks. The suitability of a technology for a particular chemical depends on (1) its performance parameters, such as separation efficiency; (2) cost or amount of added separating agent; (3) properties of the bioreactor effluent (e.g., biomass titer, product content); and (4) final product specifications. Our goal is to first synthesize alternative separation options and then analyze how technology selection affects the overall process economics. To achieve this, we propose an optimization-based framework that helps in identifying the critical technologies and parameters. Results. We study the separation networks for two representative classes of chemicals based on their properties. The separation network is divided into three stages: cell and product isolation (stage I), product concentration (II), and product purification and refining (III). Each stage exploits differences in specific product properties for achieving the desired product quality. The cost contribution analysis for the two cases (intracellular insoluble and intracellular soluble) reveals that stage I is the key cost contributor (>70% of the overall cost). Further analysis suggests that changes in input conditions and technology performance parameters lead to new designs primarily in stage I. Conclusions. 
    The proposed framework provides significant insights for technology selection and assists in making informed decisions regarding technologies that should be used in combination for a given set of stream/product properties and final output specifications. Additionally, the parametric sensitivity provides an opportunity to make crucial design and selection decisions in a comprehensive and rational manner. This will prove valuable in the selection of chemicals to be produced using bioconversions (bioproducts) as well as in creating better bioseparation flow sheets for detailed economic assessment and process implementation on the commercial scale.

  20. Synthesis and analysis of separation networks for the recovery of intracellular chemicals generated from microbial-based conversions.

    PubMed

    Yenkie, Kirti M; Wu, Wenzhao; Maravelias, Christos T

    2017-01-01

    Bioseparations can contribute to more than 70% in the total production cost of a bio-based chemical, and if the desired chemical is localized intracellularly, there can be additional challenges associated with its recovery. Based on the properties of the desired chemical and other components in the stream, there can be multiple feasible options for product recovery. These options are composed of several alternative technologies, performing similar tasks. The suitability of a technology for a particular chemical depends on (1) its performance parameters, such as separation efficiency; (2) cost or amount of added separating agent; (3) properties of the bioreactor effluent (e.g., biomass titer, product content); and (4) final product specifications. Our goal is to first synthesize alternative separation options and then analyze how technology selection affects the overall process economics. To achieve this, we propose an optimization-based framework that helps in identifying the critical technologies and parameters. We study the separation networks for two representative classes of chemicals based on their properties. The separation network is divided into three stages: cell and product isolation (stage I), product concentration (II), and product purification and refining (III). Each stage exploits differences in specific product properties for achieving the desired product quality. The cost contribution analysis for the two cases (intracellular insoluble and intracellular soluble) reveals that stage I is the key cost contributor (>70% of the overall cost). Further analysis suggests that changes in input conditions and technology performance parameters lead to new designs primarily in stage I. The proposed framework provides significant insights for technology selection and assists in making informed decisions regarding technologies that should be used in combination for a given set of stream/product properties and final output specifications. 
Additionally, the parametric sensitivity provides an opportunity to make crucial design and selection decisions in a comprehensive and rational manner. This will prove valuable in the selection of chemicals to be produced using bioconversions (bioproducts) as well as in creating better bioseparation flow sheets for detailed economic assessment and process implementation on the commercial scale.

  1. Using Survival Analysis to Improve Estimates of Life Year Gains in Policy Evaluations.

    PubMed

    Meacock, Rachel; Sutton, Matt; Kristensen, Søren Rud; Harrison, Mark

    2017-05-01

Policy evaluations taking a lifetime horizon have converted estimated changes in short-term mortality to expected life year gains using general population life expectancy. However, the life expectancy of the affected patients may differ from that of the general population. In trials, survival models are commonly used to extrapolate life year gains. The objective was to demonstrate the feasibility and materiality of using parametric survival models to extrapolate future survival in health care policy evaluations. We used our previous cost-effectiveness analysis of a pay-for-performance program as a motivating example. We first used the cohort of patients admitted prior to the program to compare 3 methods for estimating remaining life expectancy. We then used a difference-in-differences framework to estimate the life year gains associated with the program using general population life expectancy and survival models. Patient-level data from Hospital Episode Statistics were used for patients admitted to hospitals in England for pneumonia between 1 April 2007 and 31 March 2008 and between 1 April 2009 and 31 March 2010, linked to death records for the period from 1 April 2007 to 31 March 2011. In our cohort of patients, using parametric survival models rather than general population life expectancy figures reduced the estimated mean life years remaining by 30% (9.19 vs. 13.15 years). However, the estimated mean life year gains associated with the program are larger using survival models (0.380 years) than using general population life expectancy (0.154 years). Using general population life expectancy to estimate the impact of health care policies can overestimate life expectancy but underestimate the impact of policies on life year gains. Using a longer follow-up period improved the accuracy of estimated survival and program impact considerably.
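The extrapolation step described above can be sketched with a parametric Weibull survival model. The cohort, censoring time, and parameter values below are invented for illustration; the closed-form mean of the Weibull, scale·Γ(1 + 1/shape), stands in for the "mean life years remaining" estimate.

```python
import numpy as np
from math import gamma
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic cohort: true Weibull(shape=1.3, scale=8) survival times in years,
# administratively censored at 4 years of follow-up (all values illustrative).
t_true = rng.weibull(1.3, 2000) * 8.0
censor = 4.0
time = np.minimum(t_true, censor)
event = (t_true <= censor).astype(float)   # 1 = death observed, 0 = censored

def neg_log_lik(params):
    """Negative Weibull log-likelihood with right censoring."""
    shape, scale = np.exp(params)          # optimize on log scale for positivity
    log_h = np.log(shape / scale) + (shape - 1) * np.log(time / scale)
    log_S = -(time / scale) ** shape
    # events contribute log hazard + log survival; censored only log survival
    return -np.sum(event * log_h + log_S)

fit = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
shape_hat, scale_hat = np.exp(fit.x)

# Mean survival extrapolated beyond the follow-up window:
mean_life_years = scale_hat * gamma(1 + 1 / shape_hat)
print(f"shape={shape_hat:.2f} scale={scale_hat:.2f} mean={mean_life_years:.2f}")
```

In the paper's setting this fitted mean would replace the general-population life-table figure when valuing short-term mortality changes.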

  2. The association between automatic generic substitution and treatment persistence with oral bisphosphonates.

    PubMed

    Ström, O; Landfeldt, E

    2012-08-01

Automatic generic substitution of alendronate products, used to reduce drug costs, and its association with medication persistence were studied retrospectively between 2006 and 2009. During this period the number of, and the rate of substitution between, alendronate products increased while persistence decreased. Patient preferences should be considered when designing and evaluating generic policies. Automatic generic substitution (AGS) was implemented in Sweden in 2002. The objective of this study was to investigate the association between AGS and persistence with alendronate treatment of primary osteoporosis in Sweden. An open historical cohort of women and men (n = 36,433) was identified in the Swedish Prescribed Drug Register through filled prescriptions for alendronate or risedronate between 2005 and 2009. Co-morbidity data were extracted from the National Patient Register. The association between AGS and medication persistence was investigated using non-parametric and parametric survival analysis. Between 2006 and 2009, the number of alendronate products increased from 15 to 25, the proportion of prescriptions constituting a substitution increased from 10.8% to 45.2%, and the proportion of patients persisting with alendronate treatment for 12 months fell from 66.9% to 51.7%. Patients starting alendronate treatment in 2006 had a lower risk of stopping treatment compared with those starting in 2007 (HR 1.34, 95% CI 1.29-1.39), 2008 (HR 1.49, 95% CI 1.43-1.55), and 2009 (HR 1.50, 95% CI 1.40-1.60). No difference was observed in persistence with proprietary risedronate during the same period. Individuals who had their alendronate product substituted at the first prescription refill had a significantly higher probability of discontinuation (HR 1.25, 95% CI 1.20-1.30). AGS causes increased product substitution, which appears to be associated with reduced treatment persistence.
Poor health outcomes and associated costs due to forgone drug exposure should be taken into account in the design and evaluation of policies implemented to encourage utilisation of generic medicines.

  3. The analysis of incontinence episodes and other count data in patients with overactive bladder by Poisson and negative binomial regression.

    PubMed

    Martina, R; Kay, R; van Maanen, R; Ridder, A

    2015-01-01

    Clinical studies in overactive bladder have traditionally used analysis of covariance or nonparametric methods to analyse the number of incontinence episodes and other count data. It is known that if the underlying distributional assumptions of a particular parametric method do not hold, an alternative parametric method may be more efficient than a nonparametric one, which makes no assumptions regarding the underlying distribution of the data. Therefore, there are advantages in using methods based on the Poisson distribution or extensions of that method, which incorporate specific features that provide a modelling framework for count data. One challenge with count data is overdispersion, but methods are available that can account for this through the introduction of random effect terms in the modelling, and it is this modelling framework that leads to the negative binomial distribution. These models can also provide clinicians with a clearer and more appropriate interpretation of treatment effects in terms of rate ratios. In this paper, the previously used parametric and non-parametric approaches are contrasted with those based on Poisson regression and various extensions in trials evaluating solifenacin and mirabegron in patients with overactive bladder. In these applications, negative binomial models are seen to fit the data well. Copyright © 2014 John Wiley & Sons, Ltd.
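The modelling framework contrasted above can be sketched minimally: Poisson regression fitted by Newton-Raphson on simulated episode counts, with the treatment effect reported as a rate ratio. The data and true effect size are invented; a negative binomial model would add a dispersion (random-effect) parameter to handle overdispersion.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative trial: treatment indicator and weekly incontinence-episode
# counts, with a true rate ratio of exp(-0.5) ≈ 0.61 (values invented).
n = 1000
treat = rng.integers(0, 2, n)
X = np.column_stack([np.ones(n), treat])     # intercept + treatment
beta_true = np.array([1.5, -0.5])
y = rng.poisson(np.exp(X @ beta_true))

# Poisson regression by Newton-Raphson (equivalently IRLS for this GLM).
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu)                    # score vector
    hess = X.T @ (X * mu[:, None])           # Fisher information
    beta += np.linalg.solve(hess, grad)

rate_ratio = np.exp(beta[1])                 # treated vs control episode rate
print(f"estimated rate ratio: {rate_ratio:.3f}")
```

The exponentiated coefficient is the rate-ratio interpretation the paper recommends presenting to clinicians.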

  4. A review of recent developments in parametric based acoustic emission techniques applied to concrete structures

    NASA Astrophysics Data System (ADS)

    Vidya Sagar, R.; Raghu Prasad, B. K.

    2012-03-01

    This article presents a review of recent developments in parametric based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers including various methods and models developed in AE testing of concrete structures. The aim is to provide an overview of the specific features of parametric based AE techniques of concrete structures carried out over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published and considerable attention has been given to those publications. Some recent studies such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams have also been discussed. The formation of fracture process zone and the AE energy released during the fracture process in concrete beam specimens have been summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parametric based AE techniques applied to concrete structures may be helpful to the concerned researchers and engineers to better understand the failure mechanism of concrete and evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.

  5. Improved estimation of parametric images of cerebral glucose metabolic rate from dynamic FDG-PET using volume-wise principal component analysis

    NASA Astrophysics Data System (ADS)

    Dai, Xiaoqian; Tian, Jie; Chen, Zhe

    2010-03-01

Parametric images can represent both the spatial distribution and quantification of the biological and physiological parameters of tracer kinetics. The linear least squares (LLS) method is a well-established linear regression method for generating parametric images by fitting compartment models with good computational efficiency. However, bias exists in LLS-based parameter estimates, owing to the noise present in tissue time activity curves (TTACs) that propagates as correlated error in the LLS linearized equations. To address this problem, a volume-wise principal component analysis (PCA) based method is proposed. In this method, dynamic PET data are first pre-transformed to standardize noise variance, as PCA is a data-driven technique and cannot itself separate signals from noise. Secondly, volume-wise PCA is applied to the PET data. The signals are mostly represented by the first few principal components (PCs), and the noise is left in the subsequent PCs. The noise-reduced data are then obtained from the first few PCs by applying 'inverse PCA', and transformed back according to the pre-transformation used in the first step to maintain the scale of the original data set. Finally, the resulting data set is used to generate parametric images using the LLS estimation method. Compared with other noise-removal methods, the proposed method achieves high statistical reliability in the generated parametric images. The effectiveness of the method is demonstrated both with computer simulation and with a clinical dynamic FDG PET study.
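The volume-wise PCA denoising idea can be sketched on toy data. The dimensions, kinetic components, and noise level below are invented; a real implementation would also apply the noise-normalizing pre-transformation and feed the denoised TACs into the LLS compartment-model fit.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for dynamic PET: 500 voxel TACs x 30 time frames built from
# 2 underlying kinetic components plus white noise (all values illustrative).
t = np.linspace(0.5, 60, 30)
comp = np.vstack([np.exp(-0.05 * t), 1 - np.exp(-0.1 * t)])   # 2 x 30
weights = rng.uniform(0.2, 1.0, (500, 2))
clean = weights @ comp
noisy = clean + rng.normal(0, 0.15, clean.shape)

# Volume-wise PCA via SVD on mean-centred data: keep the first few
# principal components (signal), discard the rest (noise), then invert.
mean_tac = noisy.mean(axis=0)
U, s, Vt = np.linalg.svd(noisy - mean_tac, full_matrices=False)
k = 2
denoised = mean_tac + U[:, :k] * s[:k] @ Vt[:k]   # rank-k "inverse PCA"

err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
print(f"MSE before: {err_noisy:.4f}  after: {err_denoised:.4f}")
```

Because the signal lives in the first two components while the noise spreads over all thirty, the rank-k reconstruction removes most of the noise power before any parameter estimation.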

  6. Outcome of temporal lobe epilepsy surgery predicted by statistical parametric PET imaging.

    PubMed

    Wong, C Y; Geller, E B; Chen, E Q; MacIntyre, W J; Morris, H H; Raja, S; Saha, G B; Lüders, H O; Cook, S A; Go, R T

    1996-07-01

PET is useful in the presurgical evaluation of temporal lobe epilepsy. The purpose of this retrospective study was to assess the clinical use of statistical parametric imaging in predicting surgical outcome. Interictal 18FDG-PET scans in 17 patients with surgically-treated temporal lobe epilepsy (group A: 13 seizure-free; group B: 4 not seizure-free at 6 mo) were transformed into statistical parametric images, with each pixel representing a z-score value computed from the mean and s.d. of the count distribution in each individual patient, for both visual and quantitative analysis. Mean z-scores were significantly more negative in the anterolateral (AL) and mesial (M) regions on the operated side than on the nonoperated side in group A (AL: p < 0.00005, M: p = 0.0097), but not in group B (AL: p = 0.46, M: p = 0.08). Statistical parametric imaging correctly lateralized 16 out of 17 patients. Only the AL region, however, was significant in predicting surgical outcome (F = 29.03, p < 0.00005). Using a cut-off z-score value of -1.5, statistical parametric imaging correctly classified 92% of temporal lobes from group A and 88% of those from group B. These preliminary results indicate that statistical parametric imaging provides both clinically useful information for lateralization in temporal lobe epilepsy and a reliable predictive indicator of clinical outcome following surgical treatment.
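The z-score transformation behind statistical parametric imaging is simple to sketch. The image and the hypometabolic patch below are simulated stand-ins, not patient data, and the -1.5 cut-off is the one quoted above.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated interictal count image: normal background with one
# hypometabolic patch standing in for an epileptogenic focus.
image = rng.normal(100.0, 10.0, (64, 64))
image[20:30, 20:30] -= 25.0

# Each pixel becomes a z-score of the within-subject count distribution;
# pixels below the cut-off z = -1.5 are flagged as hypometabolic.
z = (image - image.mean()) / image.std()
flagged = z < -1.5

focus = np.zeros(image.shape, bool)
focus[20:30, 20:30] = True
inside = flagged[focus].mean()     # fraction flagged inside the patch
outside = flagged[~focus].mean()   # false-positive fraction elsewhere
print(f"flagged inside focus: {inside:.2f}, outside: {outside:.2f}")
```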

  7. Minimum noise impact aircraft trajectories

    NASA Technical Reports Server (NTRS)

    Jacobson, I. D.; Melton, R. G.

    1981-01-01

Numerical optimization is used to compute the optimum flight paths, based upon a parametric form that implicitly includes some of the problem restrictions. The other constraints are formulated as penalties in the cost function. Various aircraft on multiple trajectories (landing and takeoff) can be considered. The modular design employed allows for the substitution of alternate models of the population distribution, aircraft noise, flight paths, and annoyance, or for the addition of other features (e.g., fuel consumption) in the cost function. Exploiting the statistical lateral dispersion present in the flight paths reduced the amount of searching over local minima required.

  8. The linear transformation model with frailties for the analysis of item response times.

    PubMed

    Wang, Chun; Chang, Hua-Hua; Douglas, Jeffrey A

    2013-02-01

    The item response times (RTs) collected from computerized testing represent an underutilized source of information about items and examinees. In addition to knowing the examinees' responses to each item, we can investigate the amount of time examinees spend on each item. In this paper, we propose a semi-parametric model for RTs, the linear transformation model with a latent speed covariate, which combines the flexibility of non-parametric modelling and the brevity as well as interpretability of parametric modelling. In this new model, the RTs, after some non-parametric monotone transformation, become a linear model with latent speed as covariate plus an error term. The distribution of the error term implicitly defines the relationship between the RT and examinees' latent speeds; whereas the non-parametric transformation is able to describe various shapes of RT distributions. The linear transformation model represents a rich family of models that includes the Cox proportional hazards model, the Box-Cox normal model, and many other models as special cases. This new model is embedded in a hierarchical framework so that both RTs and responses are modelled simultaneously. A two-stage estimation method is proposed. In the first stage, the Markov chain Monte Carlo method is employed to estimate the parametric part of the model. In the second stage, an estimating equation method with a recursive algorithm is adopted to estimate the non-parametric transformation. Applicability of the new model is demonstrated with a simulation study and a real data application. Finally, methods to evaluate the model fit are suggested. © 2012 The British Psychological Society.

  9. Formation of parametric images using mixed-effects models: a feasibility study.

    PubMed

    Huang, Husan-Ming; Shih, Yi-Yu; Lin, Chieh

    2016-03-01

Mixed-effects models have been widely used in the analysis of longitudinal data. By representing the parameters as a combination of fixed effects and random effects, mixed-effects models incorporating both within- and between-subject variations are capable of improving parameter estimation. In this work, we demonstrate the feasibility of using a non-linear mixed-effects (NLME) approach for generating parametric images from medical imaging data of a single study. By assuming that all voxels in the image are independent, we used simulation and animal data to evaluate whether NLME can improve the voxel-wise parameter estimation. For testing purposes, intravoxel incoherent motion (IVIM) diffusion parameters including perfusion fraction, pseudo-diffusion coefficient and true diffusion coefficient were estimated using diffusion-weighted MR images and NLME through fitting the IVIM model. The conventional method of non-linear least squares (NLLS) was used as the standard approach for comparison of the resulting parametric images. In the simulated data, NLME provides more accurate and precise estimates of diffusion parameters compared with NLLS. Similarly, we found that NLME has the ability to improve the signal-to-noise ratio of parametric images obtained from rat brain data. These data have shown that it is feasible to apply NLME in parametric image generation, and that parametric image quality can be accordingly improved with the use of NLME. With the flexibility to be adapted to other models or modalities, NLME may become a useful tool to improve parametric image quality in the future. Copyright © 2015 John Wiley & Sons, Ltd.
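The conventional NLLS baseline the authors compare against can be sketched for the IVIM model. The b-values, parameter values, noise level, and bounds below are illustrative; the NLME alternative would pool voxels through fixed and random effects rather than fitting each voxel independently, which this sketch does not attempt.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)

def ivim(b, s0, f, d_star, d):
    """Bi-exponential IVIM signal model: perfusion + true diffusion."""
    return s0 * (f * np.exp(-b * d_star) + (1 - f) * np.exp(-b * d))

# Illustrative diffusion-weighted signal at typical b-values (s/mm^2).
b = np.array([0, 10, 20, 40, 80, 150, 300, 500, 800], float)
truth = dict(s0=1.0, f=0.12, d_star=0.02, d=0.0009)
signal = ivim(b, **truth) + rng.normal(0, 0.01, b.size)

# Voxel-wise NLLS fit; bounds keep the pseudo-diffusion coefficient
# d_star above the true diffusion coefficient d.
p0 = [1.0, 0.1, 0.01, 0.001]
bounds = ([0, 0, 0.003, 0], [2, 0.5, 0.1, 0.003])
popt, _ = curve_fit(ivim, b, signal, p0=p0, bounds=bounds)
print(dict(zip(["s0", "f", "d_star", "d"], popt.round(4))))
```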

  10. Win some, lose some: parental hypertension and heart rate change in an incentive versus response cost paradigm.

    PubMed

    Hastrup, J L; Johnson, C A; Hotchkiss, A P; Kraemer, D L

    1986-11-01

    Fowles (1983), citing evidence from separate studies, suggests that both incentive and response cost paradigms increase heart rate and should be subsumed under Gray's (1975) 'appetitive motivational system'. Shock avoidance and loss of reward (response cost) contingencies, while aversive, appear to evoke this motivational system; consequently both should elicit heart rate increases independent of anxiety. The present investigation compared magnitude of heart rate changes observed under conditions of winning and losing money. Results showed: no differences between incentive and response cost conditions; no effect of state anxiety on heart rate in these conditions, despite an elevation of state anxiety on the task day relative to a subsequent relaxation day assessment; and some evidence for the presence under both such appetitive conditions of cardiovascular hyperresponsivity among offspring of hypertensive parents. The results suggest a need for systematic parametric studies of experimental conditions.

  11. Accurate analysis and visualization of cardiac (11)C-PIB uptake in amyloidosis with semiautomatic software.

    PubMed

    Kero, Tanja; Lindsjö, Lars; Sörensen, Jens; Lubberink, Mark

    2016-08-01

(11)C-PIB PET is a promising non-invasive diagnostic tool for cardiac amyloidosis. Semiautomatic methods for analysis of PET data are now available, but it is not known how accurate these methods are for amyloid imaging. The aim of this study was to evaluate the feasibility of one semiautomatic software tool for analysis and visualization of the (11)C-PIB left ventricular retention index (RI) in cardiac amyloidosis. Patients with systemic amyloidosis and cardiac involvement (n = 10) and healthy controls (n = 5) were investigated with dynamic (11)C-PIB PET. Two observers analyzed the PET studies with semiautomatic software to calculate the left ventricular RI of (11)C-PIB and to create parametric images. The mean RI at 15-25 min from the semiautomatic analysis was comparable with the RI based on manual analysis (0.056 vs 0.054 min(-1) for amyloidosis patients and 0.024 vs 0.025 min(-1) in healthy controls; P = .78), and the correlation was excellent (r = 0.98). Inter-reader reproducibility was also excellent (intraclass correlation coefficient, ICC > 0.98). Parametric polarmaps and histograms made visual separation of amyloidosis patients and healthy controls fast and simple. Accurate semiautomatic analysis of cardiac (11)C-PIB RI in amyloidosis patients is feasible. Parametric polarmaps and histograms make visual interpretation fast and simple.

  12. The peats of Costa Rica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thayer, G.R.; Williamson, K.D. Jr.; Ramirez, O.

The authors compare the competitive position of peat for energy with coal, oil, and cogenerative systems in gasifiers and solid-fuel boilers. They also explore the possibility for peat use in industry. To identify the major factors, they analyze costs using a Los Alamos levelized cost code, and they study parametric costs, comparing peat production in constant dollars with interest rates and return on investment. They consider costs of processing plant construction, sizes and kinds of boilers, retrofitting, peat drying, and mining methods. They examine mining requirements for Moin, Changuinola, and El Cairo and review wet mining and dewatering methods. Peat can, indeed, be competitive with other energy sources, but this depends on the ratio of fuel costs to boiler costs. This ratio is nearly constant in comparison with cogeneration in a steam-only production system. For grate boilers using Costa Rican high-ash peat, and for small nonautomatic boilers now used in Costa Rica, the authors recommend combustion tests. An appendix contains a preliminary mining plan and cost estimate for the El Cairo peat deposit. 8 refs., 43 figs., 19 tabs.

  13. A general framework for parametric survival analysis.

    PubMed

    Crowther, Michael J; Lambert, Paul C

    2014-12-30

    Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.

  14. Parametric-Studies and Data-Plotting Modules for the SOAP

    NASA Technical Reports Server (NTRS)

    2008-01-01

"Parametric Studies" and "Data Table Plot View" are the names of software modules in the Satellite Orbit Analysis Program (SOAP). Parametric Studies enables parameterization of as many as three satellite or ground-station attributes across a range of values and computes the average, minimum, and maximum of a specified metric, the revisit time, or 21 other functions at each point in the parameter space. This computation produces a one-, two-, or three-dimensional table of data representing statistical results across the parameter space. Inasmuch as the output of a parametric study in three dimensions can be a very large data set, visualization is a paramount means of discovering trends in the data (see figure). Data Table Plot View enables visualization of the data table created by Parametric Studies or by another data source: this module quickly generates a display of the data in the form of a rotatable three-dimensional-appearing plot, making it unnecessary to load the SOAP output data into a separate plotting program. The rotatable three-dimensional-appearing plot makes it easy to determine which points in the parameter space are most desirable. Both modules provide intuitive user interfaces for ease of use.
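The parameter-space tabulation described above can be sketched generically. The orbit parameters and the metric below are invented stand-ins, not SOAP's actual attributes or revisit-time computation; the point is the grid of min/mean/max statistics per parameter combination.

```python
import numpy as np

# Sweep two parameters over ranges and tabulate min/mean/max of a
# metric at each grid point, as a parametric-study module would.
altitudes_km = np.linspace(400, 1200, 9)
inclinations_deg = np.linspace(30, 90, 7)

def metric_samples(alt, inc):
    # Hypothetical per-pass coverage metric sampled over 50 orbits.
    rng = np.random.default_rng(int(alt * 100 + inc))
    return rng.normal(alt / 100 + np.cos(np.radians(inc)), 0.3, 50)

table = np.empty((9, 7, 3))                  # last axis: min, mean, max
for i, alt in enumerate(altitudes_km):
    for j, inc in enumerate(inclinations_deg):
        m = metric_samples(alt, inc)
        table[i, j] = [m.min(), m.mean(), m.max()]

print("grid shape:", table.shape)
```

A two-parameter sweep like this yields exactly the kind of data table the Data Table Plot View module would render as a surface.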

  15. Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information

    NASA Technical Reports Server (NTRS)

    Butts, Glenn

    2007-01-01

Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step-by-step how-to guide. Such a guide does not seem to currently exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic for Applications (VBA) code were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
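The normalization-and-fit idea can be sketched outside Excel as well. The project data, escalation rate, and weight-based cost-estimating relationship (CER) below are invented for illustration and are not the paper's dataset or its five-step method.

```python
import numpy as np

# Hypothetical historical projects: year, dry mass (kg), actual cost ($M).
years = np.array([1998, 2001, 2004, 2006, 2009, 2012])
mass_kg = np.array([150.0, 420.0, 90.0, 800.0, 260.0, 550.0])
cost_m = np.array([48.0, 110.0, 31.0, 205.0, 75.0, 140.0])

# Step 1: normalize to constant dollars with an assumed 3% annual escalation.
base_year, rate = 2015, 0.03
cost_norm = cost_m * (1 + rate) ** (base_year - years)

# Step 2: fit the classic parametric CER form, cost = a * mass^b,
# as a linear regression in log-log space.
b, log_a = np.polyfit(np.log(mass_kg), np.log(cost_norm), 1)
a = np.exp(log_a)

# Step 3: estimate a new project of 350 kg from the fitted CER.
estimate = a * 350.0 ** b
print(f"CER: cost = {a:.2f} * mass^{b:.2f};  350 kg -> ${estimate:.0f}M")
```

Normalizing first (step 1) is what makes projects from different years comparable before the regression is fitted.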

  16. Software cost/resource modeling: Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. J.

    1980-01-01

    A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
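A question-driven parametric estimator of this general shape can be sketched as follows. The base coefficients, questions, and multipliers are invented for illustration; the actual JPL model's roughly 50 prompted questions and its calibration to JPL lifecycle statistics are not reproduced here.

```python
# Skeleton of a question-driven parametric software cost estimator
# (all coefficients hypothetical; effort = A * KSLOC^B, person-months).
BASE_A, BASE_B = 2.8, 1.05

# Each prompted question maps an answer to an effort multiplier.
QUESTIONS = {
    "team_experience":         {"high": 0.85, "nominal": 1.0, "low": 1.25},
    "requirements_volatility": {"low": 0.9, "nominal": 1.0, "high": 1.3},
    "tool_support":            {"strong": 0.9, "nominal": 1.0, "weak": 1.15},
}

def estimate_effort(ksloc, answers):
    """Scale the size-based baseline by the answer-derived multipliers."""
    effort = BASE_A * ksloc ** BASE_B
    for question, answer in answers.items():
        effort *= QUESTIONS[question][answer]
    return effort

pm = estimate_effort(32, {"team_experience": "high",
                          "requirements_volatility": "nominal",
                          "tool_support": "weak"})
print(f"estimated effort: {pm:.0f} person-months")
```

An effort figure of this kind is what would then be spread across a work breakdown structure and scheduled, as the abstract describes for the DSN model.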

  17. Technology needs for lunar and Mars space transfer systems

    NASA Technical Reports Server (NTRS)

    Woodcock, Gordon R.; Cothran, Bradley C.; Donahue, Benjamin; Mcghee, Jerry

    1991-01-01

    The determination of appropriate space transportation technologies and operating modes is discussed with respect to both lunar and Mars missions. Three levels of activity are set forth to examine the sensitivity of transportation preferences including 'minimum,' 'full science,' and 'industrialization and settlement' categories. High-thrust-profile missions for lunar and Mars transportation are considered in terms of their relative advantages, and transportation options are defined in terms of propulsion and braking technologies. Costs and life-cycle cost estimates are prepared for the transportation preferences by using a parametric cost model, and a return-on-investment summary is given. Major technological needs for the programs are listed and include storable propulsion systems; cryogenic engines and fluids management; aerobraking; and nuclear thermal, nuclear electric, electric, and solar electric propulsion technologies.

  18. Fast Multiscale Algorithms for Wave Propagation in Heterogeneous Environments

    DTIC Science & Technology

    2016-01-07

Fragments recovered from the report's front matter: "...methods for waves"; nonlinear solvers for high-intensity focused ultrasound with application to cancer treatment (AIMS, Palo Alto, 2012); Hermite formulation with different parametrizations; density µ(t) at mode 0 for scattering of a plane Gaussian pulse from a sphere. Two crucial components of the highly-efficient, general-purpose wave simulator we envision are reliable, low-cost methods for truncating ...

  19. Adaptive multibeam phased array design for a Spacelab experiment

    NASA Technical Reports Server (NTRS)

    Noji, T. T.; Fass, S.; Fuoco, A. M.; Wang, C. D.

    1977-01-01

The parametric tradeoff analyses and design for an Adaptive Multibeam Phased Array (AMPA) for a Spacelab experiment are described. This AMPA Experiment System was designed with particular emphasis on maximizing channel capacity and minimizing implementation and cost impacts for future austere maritime and aeronautical users, operating with a low-gain hemispherical coverage antenna element, low effective radiated power, and a low antenna gain-to-system noise temperature ratio.

  20. Propulsion Study for Small Transport Aircraft Technology (STAT)

    NASA Technical Reports Server (NTRS)

    Gill, J. C.; Earle, R. V.; Staton, D. V.; Stolp, P. C.; Huelster, D. S.; Zolezzi, B. A.

    1980-01-01

Propulsion requirements were determined for 0.5 and 0.7 Mach aircraft. Sensitivity studies were conducted on both these aircraft to determine parametrically the influence of propulsion characteristics on aircraft size and direct operating cost (DOC). Candidate technology elements and design features were identified and parametric studies conducted to select the STAT advanced engine cycle. Trade-off studies were conducted to determine those advanced technologies and design features that would offer a reduction in DOC for operation of the STAT engines. These features were incorporated in the two STAT engines. A benefit assessment was conducted comparing the STAT engines to current technology engines of the same power and to 1985 derivatives of the current technology engines. Research and development programs were recommended as part of an overall technology development plan to ensure that full commercial development of the STAT engines could be initiated in 1988.

  1. Parametric study on laminar flow for finite wings at supersonic speeds

    NASA Technical Reports Server (NTRS)

    Garcia, Joseph Avila

    1994-01-01

    Laminar flow control has been identified as a key element in the development of the next generation of High Speed Transports. Extending the amount of laminar flow over an aircraft will increase range, payload, and altitude capabilities as well as lower fuel requirements, skin temperature, and therefore the overall cost. A parametric study to predict the extent of laminar flow for finite wings at supersonic speeds was conducted using a computational fluid dynamics (CFD) code coupled with a boundary layer stability code. The parameters investigated in this study were Reynolds number, angle of attack, and sweep. The results showed that an increase in angle of attack for specific Reynolds numbers can actually delay transition. Therefore, higher lift capability, caused by the increased angle of attack, as well as a reduction in viscous drag, due to the delay in transition, can be expected simultaneously. This results in larger payload and range.

  2. Rapid prototyping and parametric optimization of plastic acoustofluidic devices for blood-bacteria separation.

    PubMed

    Silva, R; Dow, P; Dubay, R; Lissandrello, C; Holder, J; Densmore, D; Fiering, J

    2017-09-01

Acoustic manipulation has emerged as a versatile method for microfluidic separation and concentration of particles and cells. Most recent demonstrations of the technology use piezoelectric actuators to excite resonant modes in silicon or glass microchannels. Here, we focus on acoustic manipulation in disposable, plastic microchannels in order to enable a low-cost processing tool for point-of-care diagnostics. Unfortunately, the performance of resonant acoustofluidic devices in plastic is hampered by the lack of a predictive model. In this paper, we build and test a plastic blood-bacteria separation device informed by a design-of-experiments approach, parametric rapid prototyping, and screening by image processing. We demonstrate that the new device geometry can separate bacteria from blood while operating at a 275% greater flow rate and an 82% lower power requirement, while maintaining separation performance and resolution equivalent to the previously published plastic acoustofluidic separation device.

  3. User data dissemination concepts for earth resources

    NASA Technical Reports Server (NTRS)

    Davies, R.; Scott, M.; Mitchell, C.; Torbett, A.

    1976-01-01

    Domestic data dissemination networks for earth-resources data in the 1985-1995 time frame were evaluated. The following topics were addressed: (1) earth-resources data sources and expected data volumes, (2) future user demand in terms of data volume and timeliness, (3) space-to-space and earth point-to-point transmission link requirements and implementation, (4) preprocessing requirements and implementation, (5) network costs, and (6) technological development to support this implementation. This study was parametric in that the data input (supply) was varied by a factor of about fifteen while the user request (demand) was varied by a factor of about nineteen. Correspondingly, the time from observation to delivery to the user was varied. This parametric evaluation was performed by a computer simulation that was based on network alternatives and resulted in preliminary transmission and preprocessing requirements. The earth-resource data sources considered were: shuttle sorties, synchronous satellites (e.g., SEOS), aircraft, and satellites in polar orbits.

  4. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1990-01-01

    A structural power flow approach for the analysis of structure-borne transmission of vibrations is used to analyze the influence of structural parameters on transmitted power. The parametric analysis is also performed using the Statistical Energy Analysis approach, and the results are compared with those obtained using the power flow approach. The advantages of structural power flow analysis are demonstrated by comparing the types of results that are obtained by the two analytical methods. Also, to demonstrate that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental study of structural power flow is presented. This experimental study presents results for an L-shaped beam for which an analytical solution was already available. Various methods to measure vibrational power flow are compared to study their advantages and disadvantages.

  5. A financial ratio analysis of for-profit and non-profit rural referral centers.

    PubMed

    McCue, Michael J; Nayar, Preethy

    2009-01-01

    National financial data show that rural referral center (RRC) hospitals have performed well financially. RRC hospitals' median cash flow margin ratio was 10.04% in 2002 and grew to 11.04% in 2004. The aim of this study is to compare the ratio analysis of key operational and financial performance measures of for-profit RRCs to those of private, non-profit RRCs. To control for accounting aberrations within a given year, we selected RRCs that reported 3 consecutive fiscal years of Centers for Medicare and Medicaid Services (CMS) cost report data, starting with fiscal year 2004 and ending with fiscal year 2006. Given a limited sample size of 28 for-profit RRCs and 127 non-profits, we used the non-parametric median test to assess median differences in operational and key financial measures between the 2 groups. For-profit RRCs treated less complex cases and reported fewer discharges per bed and fewer occupied beds than did non-profits. However, for-profit RRCs staffed their beds with fewer full-time-equivalent (FTE) personnel and served a higher proportion of Medicaid patients. For-profit RRCs generated operating cash flow margins in excess of 19%, compared to only 8.1% for non-profits, and maintained newer plant and equipment. For-profit RRCs generated a substantially higher cash flow margin by controlling their operating costs.
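
    The comparison above relies on the non-parametric median test. As a minimal sketch of that test (with entirely hypothetical margin data, not the study's CMS figures), Mood's median test reduces to a chi-square statistic computed from a 2x2 table of counts above and below the grand median:

```python
def median_test(a, b):
    """Mood's median test: chi-square statistic on counts above/below the grand median."""
    combined = sorted(a + b)
    n = len(combined)
    grand_median = (combined[(n - 1) // 2] + combined[n // 2]) / 2
    # 2x2 contingency table: rows = groups, columns = above / not above grand median
    table = [[sum(x > grand_median for x in g),
              sum(x <= grand_median for x in g)] for g in (a, b)]
    row = [sum(r) for r in table]
    col = [table[0][j] + table[1][j] for j in range(2)]
    total = sum(row)
    # Pearson chi-square: sum of (observed - expected)^2 / expected over the 4 cells
    return sum((table[i][j] - row[i] * col[j] / total) ** 2
               / (row[i] * col[j] / total)
               for i in range(2) for j in range(2))

# Hypothetical operating cash flow margins (%) for the two ownership groups
for_profit = [19.5, 21.0, 18.2, 22.4, 20.1]
non_profit = [8.1, 7.5, 9.0, 8.8, 7.9, 8.3]
print(round(median_test(for_profit, non_profit), 2))
```

    The statistic is referred to the chi-square distribution with 1 degree of freedom (critical value 3.84 at the 5% level), so a large value indicates that the group medians differ.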

  6. Epoxy matrix composites filled with micro-sized LD sludge: wear characterization and analysis

    NASA Astrophysics Data System (ADS)

    Purohit, Abhilash; Satapathy, Alok

    2016-02-01

    Owing to the very high cost of conventional filler materials in polymer composites, exploring the possibility of using low-cost minerals and industrial wastes for this purpose has become the need of the hour. In view of this, the present work includes the development and wear performance evaluation of a new class of composites consisting of epoxy and micro-sized LD sludge. LD sludge, or Linz-Donawitz sludge (LDS), consists of the fine solid particles recovered after wet cleaning of the gas emerging from LD convertors during steel making. Epoxy composites filled with different proportions (0, 5, 10, 15 and 20 wt %) of LDS are fabricated by the conventional hand lay-up technique. Dry sliding wear trials are performed on the composite specimens under different test conditions as per ASTM G 99, following a design-of-experiment approach based on Taguchi's orthogonal arrays. The Taguchi approach identifies the variables that most strongly control the wear rate. This parametric analysis reveals that LDS content and sliding velocity affect the specific wear rate more significantly than normal load and sliding distance. Furthermore, for a constant sliding velocity, the specific wear rate of the composite decreases with increasing LDS content. The sliding wear behavior of these composites under an extended range of test conditions is predicted by a model based on an artificial neural network (ANN).
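
    In a Taguchi orthogonal-array analysis of this kind, factor levels are typically ranked by the signal-to-noise (S/N) ratio; for a wear rate, the smaller-the-better form applies. A minimal sketch follows, with hypothetical wear values rather than the paper's measurements:

```python
import math

def sn_smaller_the_better(values):
    """Taguchi smaller-the-better S/N ratio: SN = -10 * log10(mean of y^2)."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical specific wear rates (arbitrary units) at two levels of one factor,
# pooled over the orthogonal-array runs in which each level appears
level_1 = [4.2e-3, 4.5e-3, 4.1e-3]
level_2 = [2.8e-3, 3.0e-3, 2.9e-3]

# The level with the higher S/N ratio is preferred: lower wear and less scatter
print(sn_smaller_the_better(level_1), sn_smaller_the_better(level_2))
```

    Repeating this per factor and level, and comparing the spread of S/N ratios across levels, yields the factor-influence ranking the abstract refers to.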

  7. Linear and nonlinear analysis of fluid slosh dampers

    NASA Astrophysics Data System (ADS)

    Sayar, B. A.; Baumgarten, J. R.

    1982-11-01

    A vibrating structure and a container partially filled with fluid are considered coupled in a free vibration mode. To simplify the mathematical analysis, a pendulum model to duplicate the fluid motion and a mass-spring dashpot representing the vibrating structure are used. The equations of motion are derived by Lagrange's energy approach and expressed in parametric form. For a wide range of parametric values the logarithmic decrements of the main system are calculated from theoretical and experimental response curves in the linear analysis. However, for the nonlinear analysis the theoretical and experimental response curves of the main system are compared. Theoretical predictions are justified by experimental observations with excellent agreement. It is concluded finally that for a proper selection of design parameters, containers partially filled with viscous fluids serve as good vibration dampers.

  8. Supercritical nonlinear parametric dynamics of Timoshenko microbeams

    NASA Astrophysics Data System (ADS)

    Farokhi, Hamed; Ghayesh, Mergen H.

    2018-06-01

    The nonlinear supercritical parametric dynamics of a Timoshenko microbeam subject to an axial harmonic excitation force is examined theoretically, by means of different numerical techniques, and employing a high-dimensional analysis. The time-variant axial load is assumed to consist of a mean value along with harmonic fluctuations. In terms of modelling, a continuous expression for the elastic potential energy of the system is developed based on the modified couple stress theory, taking into account small-size effects; the kinetic energy of the system is also modelled as a continuous function of the displacement field. Hamilton's principle is employed to balance the energies and to obtain the continuous model of the system. Employing the Galerkin scheme along with an assumed-mode technique, the energy terms are reduced, yielding a second-order reduced-order model with a finite number of degrees of freedom. A transformation is carried out to convert the second-order reduced-order model into a first-order model of twice the dimension. A bifurcation analysis is performed for the system in the absence of the axial load fluctuations. Moreover, a mean value for the axial load is selected in the supercritical range, and the principal parametric resonant response, due to the time-variant component of the axial load, is obtained; as opposed to transversely excited systems, for parametrically excited systems (such as the one considered here), the nonlinear resonance occurs in the vicinity of twice any natural frequency of the linear system. This is accomplished via the pseudo-arclength continuation technique, direct time integration, eigenvalue analysis, and Floquet theory for stability. The natural frequencies of the system prior to and beyond buckling are also determined. Moreover, the effect of different system parameters on the nonlinear supercritical parametric dynamics of the system is analysed, with special consideration given to the effect of the length-scale parameter.

  9. Dynamic PET simulator via tomographic emission projection for kinetic modeling and parametric image studies.

    PubMed

    Häggström, Ida; Beattie, Bradley J; Schmidtlein, C Ross

    2016-06-01

    To develop and evaluate a fast and simple tool called dpetstep (Dynamic PET Simulator of Tracers via Emission Projection) for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and for evaluating the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. The tool was developed in MATLAB using both new and previously reported modules of petstep (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby the effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user-specified method, settings, and corrections. Reconstructed images were compared to MC data and to simple Gaussian-noised time activity curves (GAUSS). dpetstep was 8000 times faster than MC. Dynamic images from dpetstep had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dpetstep and MC images was the same, while GAUSS differed by 3 percentage points. Noise profiles in dpetstep images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region of interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dpetstep images and noise properties agreed better with MC. The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with noise properties very similar to those of MC images, in a fraction of the time. They believe dpetstep to be very useful for generating fast, simple, and realistic results; however, since it uses simple scatter and random models, it may not be suitable for studies investigating these phenomena. dpetstep can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.

  10. Comparison of DIGE and post-stained gel electrophoresis with both traditional and SameSpots analysis for quantitative proteomics.

    PubMed

    Karp, Natasha A; Feret, Renata; Rubtsov, Denis V; Lilley, Kathryn S

    2008-03-01

    2-DE is an important tool in quantitative proteomics. Here, we compare the Deep Purple (DP) system with DIGE using both a traditional and the SameSpots approach to gel analysis. Missing values in the traditional approach were found to be a significant issue for both systems. SameSpots attempts to address the missing value problem. SameSpots was found to increase the proportion of low-volume data for DP but not for DIGE. For all the analysis methods applied in this study, the assumptions of parametric tests were met. Analysis of the same images gave significantly lower noise with SameSpots (over the traditional approach) for DP, but no difference for DIGE. We propose that SameSpots gave lower noise with DP due to the stabilisation of the spot area by the common spot outline; this was not seen with DIGE because the co-detection process already stabilises the area selected. For studies where measurement of small abundance changes is required, a cost-benefit analysis highlights that DIGE was significantly cheaper regardless of the analysis method. For studies analysing large changes, DP with SameSpots could be an effective alternative to DIGE, but this will depend on the biological noise of the system under investigation.

  11. Potentialities of TEC topping: A simplified view of parametric effects

    NASA Technical Reports Server (NTRS)

    Morris, J. F.

    1980-01-01

    An examination of the benefits of thermionic-energy-conversion (TEC)-topped power plants and methods of increasing conversion efficiency are discussed. Reductions in the cost of TEC modules yield direct decreases in the cost of electricity (COE) from TEC-topped central station power plants. Simplified COE and overall-efficiency charts are presented to illustrate this trend. Additional capital-cost reduction results from designing more compact furnaces with the considerably increased heat transfer rates allowable and desirable for high-temperature TEC and heat pipes. Such improvements can follow from the protection against hot corrosion and slag, as well as the thermal-expansion compatibility, offered by silicon-carbide clads on TEC heating surfaces. Greater efficiencies and far fewer modules are possible with high-temperature, high-power-density TEC: this decreases capital and fuel costs much more and substantially increases electric power output for a fixed fuel input. In addition to more electricity, less pollution, and lower costs, TEC topping used directly in coal-combustion products contributes balance-of-payments gains.

  12. Application of Risk within Net Present Value Calculations for Government Projects

    NASA Technical Reports Server (NTRS)

    Grandl, Paul R.; Youngblood, Alisha D.; Componation, Paul; Gholston, Sampson

    2007-01-01

    In January 2004, President Bush announced a new vision for space exploration. This included retirement of the current Space Shuttle fleet by 2010 and the development of a new set of launch vehicles. The President's vision did not include significant increases in the NASA budget, so these development programs need to be cost-conscious. Current trade study procedures address factors such as performance, reliability, safety, manufacturing, maintainability, operations, and costs. It would be desirable, however, to have increased insight into the cost factors behind each of the proposed system architectures. This paper reports on a set of component trade studies completed on the upper stage engine for the new launch vehicles. Increased insight into architecture costs was developed by including a Net Present Value (NPV) method and applying a set of associated risks to the base parametric cost data. The use of the NPV method along with the risks was found to add fidelity to the trade studies and provide additional information to support the selection of a more robust design architecture.

  13. Analysis of Lateral Rail Restraint.

    DOT National Transportation Integrated Search

    1983-09-01

    This report deals with the analysis of lateral rail strength using the results of experimental investigations and a nonlinear rail response model. Part of the analysis involves the parametric study of the influence of track parameters on lateral rail...

  14. Parametrically excited oscillation of stay cable and its control in cable-stayed bridges.

    PubMed

    Sun, Bing-nan; Wang, Zhi-gang; Ko, J M; Ni, Y Q

    2003-01-01

    This paper presents a nonlinear dynamic model for simulation and analysis of a kind of parametrically excited vibration of stay cables caused by support motion in cable-stayed bridges. The sag and inclination angle of the stay cable are considered in the model, based on which the oscillation mechanism and dynamic response characteristics of this kind of vibration are analyzed through numerical calculation. It is noted that parametrically excited oscillation of a stay cable with a certain sag, inclination angle, and initial static tension force may occur in cable-stayed bridges due to deck vibration when the natural frequency of the cable approaches about half of the first modal frequency of the bridge deck system. A new vibration control system installed on the cable anchorage is proposed as a possible damping system to suppress the cable parametric oscillation. The numerical calculation results show that with this damping system, cable oscillation due to the vibration of the deck and/or towers can be considerably reduced.
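
    The resonance condition in this record (cable natural frequency near half the deck frequency, i.e. support excitation at roughly twice the cable frequency) is the classic principal parametric resonance. It can be illustrated by numerically integrating a linearized Mathieu-type equation; this is a toy sketch, not the paper's nonlinear cable model, and all parameter values are illustrative:

```python
import math

def simulate(omega, Omega, eps, t_end=120.0, dt=2e-3):
    """Integrate x'' + omega^2 * (1 + eps*cos(Omega*t)) * x = 0 with RK4; return peak |x|."""
    x, v, t = 1e-3, 0.0, 0.0   # small initial disturbance
    peak = abs(x)

    def acc(tt, xx):
        # Restoring force with a parametrically modulated stiffness
        return -omega ** 2 * (1.0 + eps * math.cos(Omega * tt)) * xx

    while t < t_end:
        k1x, k1v = v, acc(t, x)
        k2x, k2v = v + 0.5 * dt * k1v, acc(t + 0.5 * dt, x + 0.5 * dt * k1x)
        k3x, k3v = v + 0.5 * dt * k2v, acc(t + 0.5 * dt, x + 0.5 * dt * k2x)
        k4x, k4v = v + dt * k3v, acc(t + dt, x + dt * k3x)
        x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
        t += dt
        peak = max(peak, abs(x))
    return peak

# Principal parametric resonance: excitation at twice the natural frequency
resonant = simulate(omega=1.0, Omega=2.0, eps=0.3)
detuned  = simulate(omega=1.0, Omega=3.0, eps=0.3)
print(resonant, detuned)
```

    With excitation at twice the natural frequency the response grows by orders of magnitude, while the detuned case stays bounded near the initial disturbance, which is the mechanism the abstract attributes to deck-induced cable oscillation.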

  15. Definition of NASTRAN sets by use of parametric geometry

    NASA Technical Reports Server (NTRS)

    Baughn, Terry V.; Tiv, Mehran

    1989-01-01

    Many finite element preprocessors describe finite element model geometry with points, lines, surfaces, and volumes. One method for describing these basic geometric entities is by use of parametric cubics, which are useful for representing complex shapes. The lines, surfaces, and volumes may be discretized for follow-on finite element analysis. The ability to limit or selectively recover results from the finite element model is extremely important to the analyst. Equally important is the ability to easily apply boundary conditions. Although graphical preprocessors have made these tasks easier, model complexity may make it difficult to identify a group of grid points for data recovery or for the application of constraints. A methodology is presented that makes use of the assignment of grid point locations in parametric coordinates. The parametric coordinates provide a convenient ordering of the grid point locations and a method for retrieving the grid point IDs from the parent geometry. The selected grid points may then be used for the generation of the appropriate set and constraint cards.

  16. Future space transportation systems analysis study. Phase 1 extension: Transportation systems reference data, volume 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Transportation mass requirements are developed for various mission and transportation modes based on vehicle systems sized to fit the exact needs of each mission. The parametric data used to derive the mass requirements for each mission and transportation mode are presented to enable accommodation of possible changes in mode options or payload definitions. The vehicle sizing and functional requirements used to derive the parametric data are described.

  17. Parametrically excited multidegree-of-freedom systems with repeated frequencies

    NASA Astrophysics Data System (ADS)

    Nayfeh, A. H.

    1983-05-01

    An analysis is presented of the linear response of multidegree-of-freedom systems with a repeated frequency of order three to a harmonic parametric excitation. The method of multiple scales is used to determine the modulation of the amplitudes and phases for two cases: fundamental resonance of the modes with the repeated frequency and combination resonance involving these modes and another mode. Conditions are then derived for determining the stability of the motion.

  18. Advanced oxygen-hydrocarbon rocket engine study

    NASA Technical Reports Server (NTRS)

    Obrien, C. J.; Salkeld, R.

    1980-01-01

    The advantages and disadvantages, system performance and operating limits, engine parametric data, and technology requirements for candidate high-pressure LO2/hydrocarbon engine systems are summarized. These summaries of parametric analysis and design provide a consistent engine system database. Power balance data were generated for the eleven engine cycles. Engine cycle rating parameters were established, and the desired condition and the effect of each parameter on the engine and/or vehicle are described.

  19. SEC sensor parametric test and evaluation system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    This system provides the necessary automated hardware required to carry out, in conjunction with the existing 70 mm SEC television camera, the sensor evaluation tests which are described in detail. The Parametric Test Set (PTS) was completed and is used in a semiautomatic data acquisition and control mode to test the development of the 70 mm SEC sensor, WX 32193. Data analysis of raw data is performed on the Princeton IBM 360-91 computer.

  20. CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions

    EPA Pesticide Factsheets

    Script for computing nonparametric regression analysis. Overview of using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, statistical scripts.
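
    The scripts this record describes are written in R; as a rough Python analogue of the underlying idea, a Nadaraya-Watson kernel smoother estimates a species-environment relationship without assuming a parametric form. The data below are synthetic, purely for illustration:

```python
import math

def nw_regression(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    preds = []
    for x0 in x_eval:
        # Weight each training point by its kernel distance to the evaluation point
        w = [math.exp(-0.5 * ((x0 - xi) / bandwidth) ** 2) for xi in x_train]
        preds.append(sum(wi * yi for wi, yi in zip(w, y_train)) / sum(w))
    return preds

# Synthetic "species abundance" along an environmental gradient
xs = [i / 10 for i in range(100)]
ys = [math.sin(x) for x in xs]          # noiseless here, for clarity
fit = nw_regression(xs, ys, [2.0, 5.0], bandwidth=0.3)
print([round(v, 2) for v in fit])
```

    The bandwidth controls the bias-variance trade-off: a small bandwidth tracks the data closely, a large one smooths aggressively, which is the central tuning decision in this style of inference.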

  1. EXPLORING PATENT ACTIVITY AND ITS POTENTIAL ASSOCIATION WITH HEALTHCARE OUTCOMES: A CASE STUDY OF OSTOMY PRODUCTS IN SWEDEN.

    PubMed

    Calara, Paul Samuel; Althin, Rikard; Inglese, Gary; Nichols, Thomas

    2017-01-01

    The aim of this study was to evaluate whether ostomy industry patent activity (PA) is associated with patient outcomes and healthcare costs. Two groups of ostomy pouch users based on manufacturer PA (low or high) were compared in terms of ostomy-related wear patterns, adverse events, and healthcare expenditure. Using Swedish registry data, all patients with newly formed stomas were divided between the two groups and were followed during a 2-year period (2011-12). Propensity score matching and parametric duration analysis were used to compare outcomes between patients of similar characteristics such as sex, age, and ostomy surgery type. In both one- and two-piece systems, the high PA group had significantly lower monthly ostomy-related expenditure than the low PA group (one-piece: 197.47 EUR versus 233.34 EUR; two-piece: 164.00 EUR versus 278.98 EUR). Fewer pouch and skin wafer purchases per month were an important driver of the cost differences. Both groups had a similar likelihood of purchasing dermatological products for skin complications over time. PA in the ostomy care industry was associated with reduced healthcare costs, but not necessarily with fewer skin complications. This suggests that there is a health economic benefit from products made by patent-intensive companies, which may differentiate them from generic comparators, but more research is needed to understand the impact of activities conducive to medical innovation on health outcomes.

  2. Texture-based characterization of subskin features by specified laser speckle effects at λ = 650 nm region for more accurate parametric 'skin age' modelling.

    PubMed

    Orun, A B; Seker, H; Uslan, V; Goodyer, E; Smith, G

    2017-06-01

    The textural structure of 'skin age'-related subskin components enables us to identify and analyse their unique characteristics, thus making substantial progress towards establishing an accurate skin age model. This is achieved by a two-stage process. First by the application of textural analysis using laser speckle imaging, which is sensitive to textural effects within the λ = 650 nm spectral band region. In the second stage, a Bayesian inference method is used to select attributes from which a predictive model is built. This technique enables us to contrast different skin age models, such as the laser speckle effect against the more widely used normal light (LED) imaging method, whereby it is shown that our laser speckle-based technique yields better results. The method introduced here is non-invasive, low cost and capable of operating in real time; having the potential to compete against high-cost instrumentation such as confocal microscopy or similar imaging devices used for skin age identification purposes. © 2016 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  3. Seismic analysis of parallel structures coupled by lead extrusion dampers

    NASA Astrophysics Data System (ADS)

    Patel, C. C.

    2017-06-01

    In this paper, the response behaviors of two parallel structures coupled by Lead Extrusion Dampers (LEDs) under various earthquake ground motion excitations are investigated. The equation of motion for two parallel, multi-degree-of-freedom (MDOF) structures connected by LEDs is formulated. To explore the viability of LEDs for controlling the responses, namely displacement, acceleration, and shear force, of parallel coupled structures, the numerical study is done in two parts: (1) two parallel MDOF structures connected with LEDs having the same damping in all the dampers and (2) two parallel MDOF structures connected with LEDs having different damping. A parametric study is conducted to investigate the optimum damping of the dampers. Moreover, to limit the cost of the dampers, the study is conducted with only 50% of the total dampers placed at optimal locations, instead of placing dampers at every floor level. Results show that when LEDs connect parallel structures of different fundamental frequencies, the earthquake-induced responses of either structure can be effectively reduced. Further, it is not necessary to connect the two structures at all floors; fewer dampers at appropriate locations can significantly reduce the earthquake response of the coupled system, thus substantially reducing the cost of the dampers.

  4. Bayesian spatial analysis of childhood diseases in Zimbabwe.

    PubMed

    Tsiko, Rodney Godfrey

    2015-09-02

    Many sub-Saharan countries are confronted with persistently high levels of childhood morbidity and mortality because of the impact of a range of demographic, biological and social factors or situational events that directly precipitate ill health. In particular, under-five morbidity and mortality have increased in recent decades due to childhood diarrhoea, cough and fever. Understanding the geographic distribution of such diseases and their relationships to potential risk factors can be invaluable for cost effective intervention. Bayesian semi-parametric regression models were used to quantify the spatial risk of childhood diarrhoea, fever and cough, as well as associations between childhood diseases and a range of factors, after accounting for spatial correlation between neighbouring areas. Such semi-parametric regression models allow joint analysis of non-linear effects of continuous covariates, spatially structured variation, unstructured heterogeneity, and other fixed effects on childhood diseases. Modelling and inference made use of the fully Bayesian approach via Markov Chain Monte Carlo (MCMC) simulation techniques. The analysis was based on data derived from the 1999, 2005/6 and 2010/11 Zimbabwe Demographic and Health Surveys (ZDHS). The results suggest that until recently, sex of child had little or no significant association with childhood diseases. However, a higher proportion of male than female children within a given province had a significant association with childhood cough, fever and diarrhoea. Compared to their counterparts in rural areas, children raised in an urban setting had less exposure to cough, fever and diarrhoea across all the survey years with the exception of diarrhoea in 2010. In addition, the link between sanitation, parental education, antenatal care, vaccination and childhood diseases was found to be both intuitive and counterintuitive. 
    Results also showed marked geographical differences in the prevalence of childhood diarrhoea, fever and cough. Across all the survey years, Manicaland province reported the highest number of childhood disease cases. There is also clear evidence of a significantly higher prevalence of childhood diseases in the Mashonaland provinces than in the Matabeleland provinces.

  5. Parametric study on the behavior of an innovative subsurface tension leg platform in ultra-deep water

    NASA Astrophysics Data System (ADS)

    Zhen, Xing-wei; Huang, Yi

    2017-10-01

    This study focuses on a new technology, the Subsurface Tension Leg Platform (STLP), which utilizes shallow-water-rated well completion equipment and technology for the development of large oil and gas fields in ultra-deep water (UDW). Thus, the STLP concept offers attractive advantages over conventional field development concepts. The STLP is basically a pre-installed Subsurface Sea-star Platform (SSP), which supports rigid risers and shallow-water-rated well completion equipment. The paper details the results of a parametric study on the behavior of the STLP at a water depth of 3000 m. First, a general description of the STLP configuration and working principle is given. Then, the numerical models for the global analysis of the STLP in waves and current are presented. After that, extensive parametric studies are carried out with regard to SSP/tether system analysis, global dynamic analysis, and riser interference analysis. Critical issues concerning the mooring pattern and riser arrangement under the influence of ocean current are addressed, to ensure that the requirements on SSP stability and riser interference are well satisfied. Finally, conclusions and discussion are presented. The results indicate that the STLP is a competitive well and riser solution in up to 3000 m of water depth for offshore petroleum production.

  6. Nonparametric functional data estimation applied to ozone data: prediction and extreme value analysis.

    PubMed

    Quintela-del-Río, Alejandro; Francisco-Fernández, Mario

    2011-02-01

    The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists of fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
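
    Once the three GEV parameters have been fitted, the return levels mentioned above have a closed form: z_T = mu + (sigma/xi) * ((-log(1 - 1/T))^(-xi) - 1), with a Gumbel limit as xi approaches 0. A small sketch follows; the parameter values are hypothetical, not fitted to the AURN data:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """Level exceeded on average once every T periods for a fitted GEV(mu, sigma, xi)."""
    y = -math.log(1.0 - 1.0 / T)     # -log of the non-exceedance probability
    if abs(xi) < 1e-9:               # Gumbel limit as xi -> 0
        return mu - sigma * math.log(y)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

# Hypothetical GEV parameters for annual ozone maxima (units: ppb)
print(round(gev_return_level(mu=85.0, sigma=10.0, xi=0.1, T=10), 1))
```

    Return levels grow with the return period T, and the sign of the shape parameter xi controls whether they grow without bound (xi > 0) or approach a finite upper limit (xi < 0).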

  7. An Interactive Software for Conceptual Wing Flutter Analysis and Parametric Study

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek

    1996-01-01

    An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate the flutter instability boundary of a flexible cantilever wing, when well-defined structural and aerodynamic data are not available, and then study the effect of change in Mach number, dynamic pressure, torsional frequency, sweep, mass ratio, aspect ratio, taper ratio, center of gravity, and pitch inertia, to guide the development of the concept. The software was developed for Macintosh or IBM compatible personal computers, on MathCad application software with integrated documentation, graphics, data base and symbolic mathematics. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely Regier number and Flutter number, with normalization factors based on torsional stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch inertia radius of gyration. The parametric plots were compiled in a Vought Corporation report from a vast data base of past experiments and wind-tunnel tests. The computer program was utilized for flutter analysis of the outer wing of a Blended-Wing-Body concept, proposed by McDonnell Douglas Corp. Using a set of assumed data, preliminary flutter boundary and flutter dynamic pressure variation with altitude, Mach number and torsional stiffness were determined.

  8. Benchmark dose analysis via nonparametric regression modeling

    PubMed Central

    Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen

    2013-01-01

    Estimation of benchmark doses (BMDs) in quantitative risk assessment traditionally is based upon parametric dose-response modeling. It is a well-known concern, however, that if the chosen parametric model is uncertain and/or misspecified, inaccurate and possibly unsafe low-dose inferences can result. We describe a nonparametric approach for estimating BMDs with quantal-response data based on an isotonic regression method, and also study use of corresponding, nonparametric, bootstrap-based confidence limits for the BMD. We explore the confidence limits’ small-sample properties via a simulation study, and illustrate the calculations with an example from cancer risk assessment. It is seen that this nonparametric approach can provide a useful alternative for BMD estimation when faced with the problem of parametric model uncertainty. PMID:23683057
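
    The isotonic-regression step this abstract describes is typically carried out with the pool-adjacent-violators algorithm (PAVA). A minimal equal-weight sketch with made-up quantal-response data follows; the BMD itself would then be read off where the monotone fit crosses the chosen benchmark response:

```python
def pava(y):
    """Pool-adjacent-violators: least-squares nondecreasing fit to y (equal weights)."""
    level, weight = [], []
    for v in y:
        level.append(v)
        weight.append(1)
        # Merge adjacent blocks while the monotonicity constraint is violated
        while len(level) > 1 and level[-2] > level[-1]:
            w = weight[-2] + weight[-1]
            m = (level[-2] * weight[-2] + level[-1] * weight[-1]) / w
            level[-2:] = [m]
            weight[-2:] = [w]
    # Expand pooled blocks back to one fitted value per observation
    fit = []
    for m, w in zip(level, weight):
        fit.extend([m] * w)
    return fit

# Hypothetical observed response proportions at increasing doses
doses    = [0.0, 1.0, 2.0, 4.0, 8.0]
observed = [0.02, 0.10, 0.08, 0.35, 0.60]   # non-monotone due to sampling noise
fitted = pava(observed)
print([round(v, 3) for v in fitted])
```

    The fit pools the two violating observations into a single level, giving the nondecreasing dose-response curve on which bootstrap confidence limits for the BMD can then be based.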

  9. Parametric Covariance Model for Horizon-Based Optical Navigation

    NASA Technical Reports Server (NTRS)

    Hikes, Jacob; Liounis, Andrew J.; Christian, John A.

    2016-01-01

    This Note presents an entirely parametric version of the covariance for horizon-based optical navigation measurements. The covariance can be written as a function of only the spacecraft position, two sensor design parameters, the illumination direction, the size of the observed planet, the size of the lit arc to be used, and the total number of observed horizon points. As a result, one may now more clearly understand the sensitivity of horizon-based optical navigation performance as a function of these key design parameters, which is insight that was obscured in previous (and nonparametric) versions of the covariance. Finally, the new parametric covariance is shown to agree with both the nonparametric analytic covariance and results from a Monte Carlo analysis.

  10. Parametric interactions in presence of different size colloids in semiconductor quantum plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vanshpal, R., E-mail: ravivanshpal@gmail.com; Sharma, Uttam; Dubey, Swati

    2015-07-31

    Present work is an attempt to investigate the effect of different size colloids on parametric interaction in semiconductor quantum plasma. Inclusion of quantum effect is being done in this analysis through quantum correction term in classical hydrodynamic model of homogeneous semiconductor plasma. The effect is associated with purely quantum origin using quantum Bohm potential and quantum statistics. Colloidal size and quantum correction term modify the parametric dispersion characteristics of ion implanted semiconductor plasma medium. It is found that quantum effect on colloids is inversely proportional to their size. Moreover critical size of implanted colloids for the effective quantum correction ismore » determined which is found to be equal to the lattice spacing of the crystal.« less

  11. Managing EEE part standardisation and procurement

    NASA Astrophysics Data System (ADS)

    Serieys, C.; Bensoussan, A.; Petitmangin, A.; Rigaud, M.; Barbaresco, P.; Lyan, C.

    2002-12-01

    This paper presents development activities in space component selection and procurement, dealing with a new database tool implemented at Alcatel Space using the TransForm software configurator developed by Techform S.A. Based on TransForm, Access Ingenierie has developed a software product named OLG@DOS, which facilitates part-nomenclature analyses for new equipment design and manufacturing in terms of an ACCESS database implementation. Hi-Rel EEE part-type technical, production, and quality information is collected and compiled using a production database issued from production tools implemented for equipment definition, description, and production, based on Manufacturing Resource Planning (MRP II Control Open) and Parametric Design Manager (PDM Work Manager). The analysis of any new equipment nomenclature may be conducted through this means for standardisation purposes, cost-containment programs, and procurement-management activities, as well as for the preparation of component reviews such as Part Approval Document and Declared Part List validation.

  12. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in northern Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved useful for understanding the HCO risk structure in terms of frequency, severity, and expected and unexpected loss related to adverse events.
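
    The frequency/severity Monte Carlo recipe can be sketched as follows, with synthetic claim amounts standing in for the non-public Lodi data: a lognormal severity model is fitted by log-moments, claim counts follow a Poisson model, and the simulated aggregate annual loss distribution yields the expected loss and high percentiles used as risk indexes. The Bayesian hierarchical stratification is omitted from this sketch.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical compensation claims (EUR) over an assumed 8-year window.
claims = rng.lognormal(mean=9.5, sigma=1.1, size=160)
years = 8.0

# Parametric severity model: lognormal fitted by log-moments.
log_c = np.log(claims)
mu, sigma = log_c.mean(), log_c.std(ddof=1)
lam = claims.size / years                     # Poisson claim frequency per year

# Monte Carlo aggregate annual loss distribution.
n_sim = 20000
counts = rng.poisson(lam, size=n_sim)
agg = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])

expected_loss = agg.mean()
var95, var99 = np.percentile(agg, [95, 99])   # risk indexes: 95th/99th percentiles
```

    The gap between the expected loss and the upper percentiles is what the abstract calls unexpected loss, and it is the quantity an insurance policy is priced against.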

  13. Quantitative evaluation of a thrust vector controlled transport at the conceptual design phase

    NASA Astrophysics Data System (ADS)

    Ricketts, Vincent Patrick

    The impetus to innovate, to push the bounds and break the molds of evolutionary design trends, often comes from competition but sometimes requires catalytic political legislature. For this research endeavor, the 'catalyzing legislation' comes in response to the rise in cost of fossil fuels and the request put forth by NASA to aircraft manufacturers to demonstrate a 60% reduction in aircraft fuel consumption within 30 years. This necessitates that novel technologies be considered to achieve these values of improved performance. One such technology is thrust vector control (TVC). The beneficial characteristic of thrust vector control technology applied to the traditional tail-aft configuration (TAC) commercial transport is its ability to retain the operational advantages of this highly evolved aircraft type, such as cabin evacuation, ground operation, safety, and certification. This study explores whether the TVC transport concept offers improved flight performance by synergistically reducing the traditional empennage size, resulting overall in reduced weight and drag and therefore reduced aircraft fuel consumption. In particular, this study explores whether the TVC technology, in combination with the reduced-empennage methodology, enables the TAC aircraft to evolve synergistically while complying with current safety and certification regulation. This research utilizes the multi-disciplinary parametric sizing software AVD Sizing, developed by the Aerospace Vehicle Design (AVD) Laboratory. The sizing software visualizes the total system solution space via parametric trades and is capable of determining whether the TVC technology can enable the TAC aircraft to evolve synergistically. This study indicates that TVC combined with the reduced-empennage methodology shows marked improvements in performance and cost.

  14. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    PubMed

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

    There is a growing interest in cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Internal and external validity of the findings are assessed by estimating Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries with higher health care expenditure per capita tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
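
    For readers unfamiliar with DEA, the envelopment linear program behind an input-oriented, constant-returns (CCR) efficiency score can be sketched with a generic LP solver. This is the textbook formulation, not the authors' exact two-step specification, and the toy hospital data are invented.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.

    X: (m, n) inputs, Y: (s, n) outputs for n decision-making units.
    Solves: min theta  s.t.  X @ lam <= theta * X[:, j0],  Y @ lam >= Y[:, j0],  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]               # decision vars [theta, lam]; minimize theta
    A_in = np.c_[-X[:, [j0]], X]              # X lam - theta * x0 <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]       # -Y lam <= -y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Toy data: one input (e.g. beds) and one output (e.g. discharges) for three hospitals.
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[1.0, 2.0, 2.0]])

effs = [dea_ccr_input(X, Y, j) for j in range(3)]   # ~ [1.0, 1.0, 0.5]
```

    The third unit produces the same output as the second from twice the input, so DEA scores it 0.5: it could radially shrink its input by half and still be dominated by the frontier.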

  15. Centrifugal compressor surge detecting method based on wavelet analysis of unsteady pressure fluctuations in typical stages

    NASA Astrophysics Data System (ADS)

    Izmaylov, R.; Lebedev, A.

    2015-08-01

    Centrifugal compressors are complex energy equipment, and their automatic control and protection systems must meet requirements of operational reliability and durability. In turbocompressors there are at least two dangerous operating regimes: surge and rotating stall. Anti-surge protection systems usually use parametric or feature-based methods; as a rule, industrial systems are parametric. The main disadvantage of parametric anti-surge systems is the difficulty of mass-flow measurement in natural gas pipeline compressors. The principal idea of the feature-based method rests on an experimental fact: as a rule, just before the onset of surge, rotating stall or a precursor stall is established in the compressor. The problem then consists in detecting the characteristic signals of unsteady pressure or velocity fluctuations. Wavelet analysis is well suited to detecting the onset of rotating stall despite a high level of spurious signals (rotating wakes, turbulence, etc.), and the method is compatible with state-of-the-art DSP systems for industrial control. Examples of applying wavelet analysis to detect the onset of rotating stall in typical centrifugal compressor stages are presented. The experimental investigations include unsteady pressure measurements and a sophisticated data acquisition system. The wavelet transforms used biorthogonal wavelets in Matlab.
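
    A minimal numpy-only sketch of the feature idea: filter the unsteady pressure signal with a single-scale complex Morlet-like kernel at an assumed precursor frequency and flag the onset when the envelope rises. The signal, the 120 Hz precursor frequency, and the threshold are all invented for illustration; the authors' implementation uses biorthogonal wavelets in Matlab.

```python
import numpy as np

fs = 2048.0
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)

# Synthetic unsteady pressure: broadband noise plus a rotating-stall-like
# precursor tone (assumed 120 Hz) that appears at t = 1 s.
p = 0.2 * rng.standard_normal(t.size)
p += np.where(t >= 1.0, np.sin(2 * np.pi * 120.0 * t), 0.0)

def morlet_envelope(x, f0, fs, n_cycles=6.0):
    """Magnitude of a single-scale CWT, via a complex Morlet-like kernel."""
    sigma_t = n_cycles / (2.0 * np.pi * f0)
    tt = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
    kernel = np.exp(2j * np.pi * f0 * tt) * np.exp(-tt**2 / (2 * sigma_t**2))
    kernel /= np.abs(kernel).sum()            # unit gain at the analysis frequency
    return np.abs(np.convolve(x, kernel, mode="same"))

env = morlet_envelope(p, 120.0, fs)
onset = t[np.argmax(env > 0.5 * env.max())]   # first crossing of half the peak energy
```

    The narrowband envelope stays near the noise floor before the precursor appears and jumps at onset, which is the signature an anti-surge protection system would trip on.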

  16. The composition of M-type asteroids: Synthesis of spectroscopic and radar observations

    NASA Astrophysics Data System (ADS)

    Neeley, J. R.; Ockert-Bell, M. E.; Clark, B. E.; Shepard, M. K.; Cloutis, E. A.; Fornasier, S.; Bus, S. J.

    2011-10-01

    This work updates and expands our long-term radar-driven observational campaign of 27 main-belt asteroids (MBAs) focused on Bus-DeMeo Xc- and Xk-type objects (Tholen X and M class asteroids) using the Arecibo radar and the NASA Infrared Telescope Facility (IRTF). Seventeen of our targets were near-simultaneously observed with radar, and those observations are described in a companion paper (Shepard et al., 2010). We utilized visible-wavelength data for a more complete compositional analysis of our targets. Compositional evidence is derived from our target asteroid spectra using three different methods: (1) a χ² search for spectral matches in the RELAB database, (2) parametric comparisons with meteorites, and (3) linear discriminant analysis. This paper synthesizes the results of the RELAB search, parametric comparisons, and linear discriminant analysis with compositional suggestions based on radar observations. We find that for six of seventeen targets with radar data, our spectral results are consistent with their radar analog (16 Psyche, 21 Lutetia, 69 Hesperia, 135 Hertha, 216 Kleopatra, and 497 Iva). For twenty of twenty-seven objects our statistical comparisons with RELAB meteorites result in consistent analog identification, providing a degree of confidence in our parametric methods.
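
    The χ² spectral-matching step can be sketched generically as follows, with invented "library" spectra standing in for RELAB data: spectra are scale-normalized so that shape rather than albedo is compared, and the candidate minimizing the χ² misfit is reported.

```python
import numpy as np

def chi2_match(target, library, err):
    """Chi-square spectral matching: index of the library spectrum closest to the target.

    target: (n,) reflectance spectrum; library: (m, n) candidate spectra;
    err: (n,) per-channel uncertainties.
    """
    t = target / target.mean()                          # scale-normalize (shape only)
    L = library / library.mean(axis=1, keepdims=True)
    chi2 = ((L - t) ** 2 / err ** 2).sum(axis=1)
    return int(np.argmin(chi2)), chi2

wavelengths = np.linspace(0.45, 2.45, 50)
# Toy "meteorite" spectra: flat, red-sloped, and red-sloped with an absorption band.
flat = np.ones_like(wavelengths)
red = 1.0 + 0.2 * (wavelengths - 0.45)
banded = red - 0.1 * np.exp(-0.5 * ((wavelengths - 0.9) / 0.1) ** 2)
library = np.vstack([flat, red, banded])

# Noisy "asteroid" spectrum generated from the banded candidate.
rng = np.random.default_rng(3)
target = banded + 0.005 * rng.standard_normal(wavelengths.size)
best, chi2 = chi2_match(target, library, err=np.full(wavelengths.size, 0.005))
```

    With per-channel errors matching the noise level, the banded candidate is recovered as the best match, and the χ² values rank the remaining candidates by misfit.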

  17. Comparing Costs of Telephone versus Face-to-Face Extended Care Programs for the Management of Obesity in Rural Settings

    PubMed Central

    Radcliff, Tiffany A.; Bobroff, Linda B.; Lutes, Lesley D.; Durning, Patricia E.; Daniels, Michael J.; Limacher, Marian C.; Janicke, David M.; Martin, A. Daniel; Perri, Michael G.

    2012-01-01

    Background A major challenge following successful weight loss is continuing the behaviors required for long-term weight maintenance. This challenge may be exacerbated in rural areas with limited local support resources. Objective This study describes and compares program costs and cost-effectiveness for 12-month extended care lifestyle maintenance programs following an initial 6-month weight loss program. Design A 1-year prospective randomized controlled clinical trial. Participants/Setting The study included 215 female participants aged 50 or older from rural areas who completed an initial 6-month lifestyle program for weight loss. The study was conducted from June 1, 2003, to May 31, 2007. Intervention The intervention was delivered through local Cooperative Extension Service offices in rural Florida. Participants were randomly assigned to a 12-month extended care program using either individual telephone counseling (n=67), group face-to-face counseling (n=74), or a mail/control group (n=74). Main Outcome Measures Program delivery costs, weight loss, and self-reported health status were directly assessed through questionnaires and program activity logs. Costs were estimated across a range of enrollment sizes to allow inferences beyond the study sample. Statistical Analyses Performed Non-parametric and parametric tests of differences across groups for program outcomes were combined with direct program cost estimates and expected value calculations to determine which scales of operation favored alternative formats for lifestyle maintenance. Results Median weight regain during the intervention year was 1.7 kg for participants in the face-to-face format, 2.1 kg for the telephone format, and 3.1 kg for the mail/control format. 
For a typical group size of 13 participants, the face-to-face format had higher fixed costs, which translated into higher overall program costs ($420 per participant) when compared to individual telephone counseling ($268 per participant) and control ($226 per participant) programs. While the net weight lost after the 12-month maintenance program was higher for the face-to-face and telephone programs compared to the control group, the average cost per expected kilogram of weight lost was higher for the face-to-face program ($47/kg) compared to the other two programs (approximately $33/kg for telephone and control). Conclusions Both the scale of operations and local demand for programs are important considerations in selecting a delivery format for lifestyle maintenance. In this study, the telephone format had a lower cost, but similar outcomes compared to the face-to-face format. PMID:22818246
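
    The scale-of-operations point can be illustrated with a simple fixed-plus-variable cost model. The fixed/variable split below is an assumed decomposition chosen only so that a 13-person group reproduces the reported per-participant costs ($420 face-to-face, $268 telephone); the study's actual cost structure may differ.

```python
def cost_per_participant(fixed, variable, n):
    """Per-participant cost when a group of size n shares the fixed delivery costs."""
    return fixed / n + variable

# Assumed (hypothetical) fixed/variable decompositions, in dollars:
f2f = dict(fixed=3900.0, variable=120.0)    # face-to-face: high fixed cost per group
phone = dict(fixed=650.0, variable=218.0)   # telephone: low fixed, higher variable

c_f2f_13 = cost_per_participant(**f2f, n=13)     # reproduces the reported $420
c_phone_13 = cost_per_participant(**phone, n=13) # reproduces the reported $268
c_f2f_40 = cost_per_participant(**f2f, n=40)
c_phone_40 = cost_per_participant(**phone, n=40)
```

    Under this assumed split, the high-fixed-cost face-to-face format becomes the cheaper option once groups grow past roughly 33 participants, which is exactly the kind of scale/demand trade-off the conclusions describe.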

  18. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-04-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), the angular positions of the measurements, object properties, and the estimation method. In this paper, we use the Cramér-Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is inversely proportional to the source intensity and that only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform sampling) are required to get the best parametric images. Further angular measurements beyond these only spread the total dose across the measurements without improving or worsening the CRLB, but the added measurements may improve parametric images by reducing estimation bias. 
Next, using the CRLB we evaluate the multiple-image radiography, diffraction-enhanced imaging, and scatter diffraction-enhanced imaging estimation techniques, though the proposed methodology can be used to evaluate any other ABI parametric image estimation technique.
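
    The CRLB calculation for Poisson-noise measurements can be sketched for a hypothetical Gaussian angular intensity profile with parameters (amplitude, center, width): the Fisher information is I = Σ_k (1/λ_k) g_k g_kᵀ with sensitivities g_k = ∂λ_k/∂p, and the CRLB is the diagonal of its inverse. The profile model and design values below are illustrative assumptions, not the paper's.

```python
import numpy as np

def fisher_poisson(theta_meas, params, model, eps=1e-6):
    """Fisher information for independent Poisson counts with mean model(theta, params)."""
    params = np.asarray(params, dtype=float)
    lam = model(theta_meas, params)
    J = np.empty((theta_meas.size, params.size))
    for i in range(params.size):               # central finite-difference sensitivities
        dp = np.zeros_like(params)
        dp[i] = eps
        J[:, i] = (model(theta_meas, params + dp) - model(theta_meas, params - dp)) / (2 * eps)
    return (J.T / lam) @ J                     # sum_k (1/lam_k) g_k g_k^T

def aip(theta, p):
    """Assumed Gaussian angular intensity profile: amplitude, center, width."""
    a, t0, s = p
    return a * np.exp(-0.5 * ((theta - t0) / s) ** 2) + 1.0   # +1: background counts

# Eleven uniform analyzer positions over +/- 2 profile widths (illustrative design).
params = np.array([1000.0, 0.0, 1.0])
theta_k = np.linspace(-2.0, 2.0, 11)
I = fisher_poisson(theta_k, params, aip)
crlb = np.diag(np.linalg.inv(I))   # minimum variances for (amplitude, center, width)
```

    Repeating the calculation for different designs (while rescaling the amplitude to hold the total dose fixed) is how one would compare uniform and non-uniform sampling schemes of the kind the paper studies.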

  19. Noise and Analyzer-Crystal Angular Position Analysis for Analyzer-Based Phase-Contrast Imaging

    PubMed Central

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-01-01

    The analyzer-based phase-contrast X-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile (AIP) of the X-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), the angular positions of the measurements, object properties, and the estimation method. In this paper, we use the Cramér-Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this manuscript is that the variance (hence the noise) in parametric images is inversely proportional to the source intensity and that only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform sampling) are required to get the best parametric images. Further angular measurements beyond these only spread the total dose across the measurements without improving or worsening the CRLB, but the added measurements may improve parametric images by reducing estimation bias. 
Next, using the CRLB we evaluate the Multiple-Image Radiography (MIR), Diffraction Enhanced Imaging (DEI) and Scatter Diffraction Enhanced Imaging (S-DEI) estimation techniques, though the proposed methodology can be used to evaluate any other ABI parametric image estimation technique. PMID:24651402

  20. Stability analysis of a time-periodic 2-dof MEMS structure

    NASA Astrophysics Data System (ADS)

    Kniffka, Till Jochen; Welte, Johannes; Ecker, Horst

    2012-11-01

    Microelectromechanical systems (MEMS) are becoming important for all kinds of industrial applications. Among them are filters in communication devices, due to the growing demand for efficient and accurate filtering of signals. In recent developments, single-degree-of-freedom (1-dof) oscillators that are operated at a parametric resonance are employed for such tasks. Typically, vibration damping is low in such MEM systems. While parametric excitation (PE) has so far been used to take advantage of a parametric resonance, this contribution suggests also exploiting parametric anti-resonances in order to improve the damping behavior of such systems. Modeling aspects of a 2-dof MEM system and first results of the analysis of the non-linear and the linearized system are the focus of this paper. In principle, the investigated system is an oscillating mechanical system with two degrees of freedom x = [x1, x2]^T that can be described by Mẍ + Cẋ + K1x + K3(x²)x + Fes(x, V(t)) = 0. The system is inherently non-linear because of the cubic mechanical stiffness K3 of the structure, but also because of electrostatic forces (1 + cos(ωt))Fes(x) that act on the system. Electrostatic forces are generated by comb drives and are proportional to the applied time-periodic voltage V(t). These drives also provide the means to introduce time-periodic coefficients, i.e. parametric excitation (1 + cos(ωt)) with frequency ω. For a realistic MEM system, the coefficients of the non-linear set of differential equations need to be scaled for efficient numerical treatment. The final mathematical model is a set of four non-linear, time-periodic, homogeneous differential equations of first order. Numerical results are obtained from two different methods. The linearized time-periodic (LTP) system is studied by calculating the monodromy matrix of the system; the eigenvalues of this matrix decide the stability of the LTP system. To study the unabridged non-linear system, the bifurcation software ManLab is employed. 
    Continuation analyses including stability evaluations are executed and show the frequency ranges in which the 2-dof system becomes unstable due to parametric resonances. Moreover, the existence of frequency intervals is shown where enhanced damping is observed for this MEMS. The results from the stability studies are confirmed by simulation results.
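
    The monodromy-matrix stability test described above can be sketched on the simplest parametrically excited system, a damped Mathieu equation (a 1-dof stand-in for the linearized 2-dof model): integrate the unit initial states over one excitation period and check the spectral radius of the resulting matrix. Parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

def monodromy(delta, eps, zeta, T=2 * np.pi):
    """Monodromy matrix of the damped Mathieu equation
    x'' + 2*zeta*x' + (delta + eps*cos(t)) * x = 0 over one excitation period T."""
    def rhs(t, y):
        x, v = y
        return [v, -2 * zeta * v - (delta + eps * np.cos(t)) * x]
    cols = []
    for y0 in ([1.0, 0.0], [0.0, 1.0]):        # propagate the unit initial states
        sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-10, atol=1e-12)
        cols.append(sol.y[:, -1])
    return np.array(cols).T                    # columns = images of the basis vectors

def spectral_radius(M):
    """Largest Floquet multiplier magnitude; > 1 means the LTP system is unstable."""
    return float(np.max(np.abs(np.linalg.eigvals(M))))

# Principal parametric resonance (delta near 1/4 for cos(t) excitation) is unstable;
# away from the resonance tongues the lightly damped system is stable.
rho_unstable = spectral_radius(monodromy(delta=0.25, eps=0.2, zeta=0.02))
rho_stable = spectral_radius(monodromy(delta=0.60, eps=0.2, zeta=0.02))
```

    The same procedure applied to the four-state 2-dof model yields a 4x4 monodromy matrix, and sweeping the excitation frequency traces out the stability chart with its resonance and anti-resonance intervals.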
