Sample records for validation parameters including

  1. Reaction time as an indicator of insufficient effort: Development and validation of an embedded performance validity parameter.

    PubMed

    Stevens, Andreas; Bahlo, Simone; Licha, Christina; Liske, Benjamin; Vossler-Thies, Elisabeth

    2016-11-30

    Subnormal performance in attention tasks may result from various sources, including lack of effort. In this report, the derivation and validation of a performance validity parameter for reaction time is described, using a set of malingering indices ("Slick criteria") and 3 independent samples of participants (total n = 893). The Slick criteria yield an estimate of the probability of malingering based on the presence of an external incentive and on evidence from neuropsychological testing, self-report, and clinical data. In study (1), a validity parameter is derived using reaction time data of a sample composed of inpatients with recent severe brain lesions not involved in litigation and of litigants with and without brain lesions. In study (2), the validity parameter is tested in an independent sample of litigants. In study (3), the parameter is applied to an independent sample comprising cooperative and non-cooperative testees. Logistic regression analysis led to a derived validity parameter based on median reaction time and standard deviation. It performed satisfactorily in studies (2) and (3) (study 2: sensitivity = 0.94, specificity = 1.00; study 3: sensitivity = 0.79, specificity = 0.87). The findings suggest that median reaction time and standard deviation may be used as indicators of negative response bias. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
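
    The classification step described here is a standard logistic regression on two reaction-time summaries. A rough illustration of the idea follows (not the authors' fitted model; the simulated distributions and resulting cut-offs are invented for the sketch):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import recall_score

    rng = np.random.default_rng(0)
    n = 400
    # Simulated per-subject features: median RT (ms) and RT standard deviation.
    cooperative = np.column_stack([rng.normal(450, 60, n), rng.normal(90, 25, n)])
    noncooperative = np.column_stack([rng.normal(700, 150, n), rng.normal(220, 70, n)])
    X = np.vstack([cooperative, noncooperative])
    y = np.r_[np.zeros(n, dtype=int), np.ones(n, dtype=int)]  # 1 = suspected negative response bias

    model = LogisticRegression(max_iter=1000).fit(X, y)
    pred = model.predict(X)
    sensitivity = recall_score(y, pred)          # true-positive rate
    specificity = recall_score(1 - y, 1 - pred)  # true-negative rate
    print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
    ```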

  2. Center of pressure based segment inertial parameters validation

    PubMed Central

    Rezzoug, Nasser; Gorce, Philippe; Isableu, Brice; Venture, Gentiane

    2017-01-01

    By proposing efficient methods for estimating Body Segment Inertial Parameters (BSIPs) and validating them with a force plate, it is possible to improve the inverse dynamic computations that are necessary in multiple research areas. To date, a variety of studies have been conducted to improve BSIP estimation, but to our knowledge a real validation has never been completely successful. In this paper, we propose a validation method using both kinematic and kinetic parameters (contact forces) gathered from an optical motion capture system and a force plate, respectively. To compare BSIPs, we used the measured contact forces (force plate) as the ground truth and reconstructed the displacements of the center of pressure (COP) using inverse dynamics from two different estimation techniques. Only minor differences were seen when comparing the estimated segment masses. Their influence on the COP computation, however, is large, and the results show very distinguishable patterns of COP movement. Improving BSIP techniques is crucial, as deviations in the estimates can result in large errors. This method could be used as a tool to validate BSIP estimation techniques. An advantage of this approach is that it facilitates the comparison between BSIP estimation methods and, more specifically, shows the accuracy of those parameters. PMID:28662090

  3. Post-processing of seismic parameter data based on valid seismic event determination

    DOEpatents

    McEvilly, Thomas V.

    1985-01-01

    An automated seismic processing system and method are disclosed, including an array of CMOS microprocessors for unattended battery-powered processing of a multi-station network. According to a characterizing feature of the invention, each channel of the network is independently operable to automatically detect, measure times and amplitudes, and compute and fit Fast Fourier transforms (FFTs) for both P- and S-waves on analog seismic data after it has been sampled at a given rate. The measured parameter data from each channel are then reviewed for event validity by a central controlling microprocessor and, if determined by preset criteria to constitute a valid event, the parameter data are passed to an analysis computer for calculation of hypocenter location, running b-values, source parameters, event count, P-wave polarities, moment-tensor inversion, and Vp/Vs ratios. The in-field real-time analysis of data maximizes the efficiency of microearthquake surveys, allowing flexibility in experimental procedures with a minimum of traditional labor-intensive postprocessing. A unique consequence of the system is that none of the original data (i.e., the sensor analog output signals) are necessarily saved after computation; rather, the numerical parameters generated by the automatic analysis are the sole output of the automated seismic processor.

  4. Random sampling and validation of covariance matrices of resonance parameters

    NASA Astrophysics Data System (ADS)

    Plevnik, Lucijan; Zerovnik, Gašper

    2017-09-01

    Analytically exact methods for random sampling of arbitrarily correlated parameters are presented. Emphasis is given, on the one hand, to possible inconsistencies in the covariance data, concentrating on positive semi-definiteness and on consistent sampling of correlated, inherently positive parameters, and, on the other hand, to optimization of the implementation of the methods themselves. The methods have been applied in the program ENDSAM, written in Fortran, which takes a file of a chosen isotope in ENDF-6 format from a nuclear data library and produces an arbitrary number of new ENDF-6 files in which the original resonance-parameter values are replaced by random samples (in accordance with the corresponding covariance matrices). The source code for ENDSAM is available from the OECD/NEA Data Bank. The program works in the following steps: it reads resonance parameters and their covariance data from the nuclear data library, checks whether the covariance data are consistent, and produces random samples of the resonance parameters. The code has been validated with both realistic and artificial data to show that the produced samples are statistically consistent. Additionally, the code was used to validate covariance data in existing nuclear data libraries; a list of inconsistencies observed in the covariance data of resonance parameters in ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0 is presented. For now, the work has been limited to resonance parameters; however, the methods presented are general and can in principle be extended to sampling and validation of any nuclear data.
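
    The core sampling mathematics the abstract describes can be sketched in a few lines: verify positive semi-definiteness, then draw correlated samples via a Cholesky factor. ENDSAM itself is Fortran and operates on ENDF-6 files; the toy three-parameter covariance below is hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    mean = np.array([2.5e3, 1.2e-1, 3.4e-2])   # toy "resonance parameters"
    cov = np.array([[1.0e+2, 3.0e-2, 0.0],
                    [3.0e-2, 1.0e-4, 1.0e-6],
                    [0.0,    1.0e-6, 1.0e-6]])

    # Consistency check: a valid covariance matrix has no negative eigenvalues.
    eig = np.linalg.eigvalsh(cov)
    if eig.min() < -1e-12 * abs(eig).max():
        raise ValueError("covariance matrix is not positive semi-definite")

    # Correlated sampling via the Cholesky factor (the jitter guards the exactly
    # singular PSD case). For inherently positive parameters, one common device
    # is to sample the logarithm instead, so every sample stays positive.
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(3))
    samples = mean + rng.standard_normal((10000, 3)) @ L.T
    print(np.corrcoef(samples, rowvar=False).round(2))  # recovers ~0.3 and ~0.1
    ```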

  5. An Innovative Software Tool Suite for Power Plant Model Validation and Parameter Calibration using PMU Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yuanyuan; Diao, Ruisheng; Huang, Renke

    Maintaining good quality of power plant stability models is of critical importance to ensure the secure and economic operation and planning of today's power grid, with its increasingly stochastic and dynamic behavior. According to North American Electric Reliability Corporation (NERC) standards, all generators in North America with capacities larger than 10 MVA are required to validate their models every five years. Validation is quite costly and can significantly affect the revenue of generator owners, because the traditional staged testing requires generators to be taken offline. Over the past few years, validating and calibrating parameters using online measurements, including phasor measurement units (PMUs) and digital fault recorders (DFRs), has been proven to be a cost-effective approach. In this paper, an innovative open-source tool suite is presented for validating power plant models using the PPMV tool, identifying bad parameters with trajectory sensitivity analysis, and finally calibrating parameters using an ensemble Kalman filter (EnKF) based algorithm. The architectural design and the detailed procedures to run the tool suite are presented, with results of a test on a realistic hydro power plant using PMU measurements for 12 different events. The calibrated parameters of the machine, exciter, governor and PSS models demonstrate much better performance than the original models for all the events and show the robustness of the proposed calibration algorithm.
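
    For readers unfamiliar with the calibration step, a generic stochastic EnKF parameter update has roughly the following shape. This is a sketch of the general technique, not the tool suite's implementation; the forward model h() and all numbers are stand-ins:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def h(theta):
        """Hypothetical forward model: maps parameters to a predicted response."""
        return np.array([theta[0] + 0.5 * theta[1], theta[0] * theta[1]])

    N = 50                                       # ensemble size
    ens = rng.normal([1.0, 2.0], 0.3, (N, 2))    # prior parameter ensemble
    y_obs = np.array([2.1, 2.3])                 # e.g., PMU-derived measurements
    R = np.diag([0.01, 0.01])                    # observation-noise covariance

    Y = np.array([h(t) for t in ens])            # predicted measurements
    A = ens - ens.mean(axis=0)
    B = Y - Y.mean(axis=0)
    P_ty = A.T @ B / (N - 1)                     # parameter-measurement cross-covariance
    P_yy = B.T @ B / (N - 1) + R                 # innovation covariance
    K = P_ty @ np.linalg.inv(P_yy)               # Kalman gain

    # Stochastic EnKF: perturb the observations per member, then update.
    y_pert = y_obs + rng.multivariate_normal(np.zeros(2), R, N)
    ens = ens + (y_pert - Y) @ K.T
    print("calibrated parameter means:", ens.mean(axis=0))
    ```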

  6. Identification of modal parameters including unmeasured forces and transient effects

    NASA Astrophysics Data System (ADS)

    Cauberghe, B.; Guillaume, P.; Verboven, P.; Parloo, E.

    2003-08-01

    In this paper, a frequency-domain method to estimate modal parameters from short data records with known (measured) input forces and unknown input forces is presented. The method can be used for an experimental modal analysis, an operational modal analysis (output-only data), or a combination of both. Traditional experimental and operational modal analyses in the frequency domain start from frequency response functions and spectral density functions, respectively. To estimate these functions accurately, sufficient data have to be available. The technique developed in this paper estimates the modal parameters directly from the Fourier spectra of the outputs and the known inputs. Instead of applying Hanning windows to these short data records, the transient effects are estimated simultaneously with the modal parameters. The method is illustrated, tested and validated by Monte Carlo simulations and experiments. The presented method for processing short data sequences leads to unbiased estimates with a small variance in comparison to the more traditional approaches.

  7. Concurrent validity of the Microsoft Kinect for Windows v2 for measuring spatiotemporal gait parameters.

    PubMed

    Dolatabadi, Elham; Taati, Babak; Mihailidis, Alex

    2016-09-01

    This paper presents a study to evaluate the concurrent validity of the Microsoft Kinect for Windows v2 for measuring the spatiotemporal parameters of gait. Twenty healthy adults performed several sequences of walks across a GAITRite mat under three different conditions: usual pace, fast pace, and dual task. Each walking sequence was simultaneously captured with two Kinect for Windows v2 sensors and the GAITRite system. An automated algorithm was employed to extract various spatiotemporal features, including stance time, step length, step time and gait velocity, from the recorded Kinect v2 sequences. Accuracy in terms of reliability, concurrent validity and limits of agreement was examined for each gait feature under the different walking conditions. The 95% Bland-Altman limits of agreement were narrow enough for the Kinect v2 to be a valid tool for measuring all reported spatiotemporal parameters of gait in all three conditions. An excellent intraclass correlation coefficient (ICC 2,1) ranging from 0.90 to 0.98 was observed for all gait measures across the different walking conditions. The inter-trial reliability of all gait parameters was shown to be strong for all walking types (ICC 3,1 > 0.73). The results of this study suggest that the Kinect for Windows v2 has the capacity to measure selected spatiotemporal gait parameters for healthy adults. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
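
    The agreement statistics used here are easy to reproduce. A minimal sketch of 95% Bland-Altman limits of agreement for one gait feature, with invented step-time values:

    ```python
    import numpy as np

    kinect = np.array([0.52, 0.55, 0.49, 0.61, 0.58, 0.50])    # step time (s)
    gaitrite = np.array([0.53, 0.54, 0.50, 0.60, 0.60, 0.49])  # reference (s)

    diff = kinect - gaitrite
    bias = diff.mean()                 # mean of differences
    loa = 1.96 * diff.std(ddof=1)      # half-width of the 95% limits
    print(f"bias = {bias:.3f} s, 95% limits of agreement = "
          f"[{bias - loa:.3f}, {bias + loa:.3f}] s")
    ```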

  8. Estimation of nonlinear pilot model parameters including time delay.

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Roland, V. R.; Wells, W. R.

    1972-01-01

    Investigation of the feasibility of using a Kalman filter estimator for the identification of unknown parameters in nonlinear dynamic systems with a time delay. The problem considered is the application of estimation theory to determine the parameters of a family of pilot models containing delayed states. In particular, the pilot-plant dynamics are described by differential-difference equations of the retarded type. The pilot delay, included as one of the unknown parameters to be determined, is kept in pure form as opposed to the Pade approximations generally used for these systems. Problem areas associated with processing real pilot response data are included in the discussion.

  9. Validity of a smartphone protractor to measure sagittal parameters in adult spinal deformity.

    PubMed

    Kunkle, William Aaron; Madden, Michael; Potts, Shannon; Fogelson, Jeremy; Hershman, Stuart

    2017-10-01

    Smartphones have become an integral tool in the daily life of health-care professionals (Franko 2011). Their ease of use and wide availability often make smartphones the first tool surgeons use to perform measurements. This technique has been validated for certain orthopedic pathologies (Shaw 2012; Quek 2014; Milanese 2014; Milani 2014), but never for assessing sagittal parameters in adult spinal deformity (ASD). This study was designed to assess the validity, reproducibility, precision, and efficiency of using a smartphone protractor application to measure sagittal parameters commonly measured in ASD assessment and surgical planning. This study aimed to (1) determine the validity of smartphone protractor applications, (2) determine the intra- and interobserver reliability of smartphone protractor applications when used to measure sagittal parameters in ASD, (3) determine the efficiency of using a smartphone protractor application to measure sagittal parameters, and (4) elucidate whether a physician's level of experience impacts the reliability or validity of using a smartphone protractor application to measure sagittal parameters in ASD. An experimental validation study was carried out. Thirty standard 36″ standing lateral radiographs were examined. Three separate measurements were performed using a marker and protractor; then, at a separate time point, three separate measurements were performed using a smartphone protractor application for all 30 radiographs. The first 10 radiographs were then re-measured two more times, for a total of three measurements from both the smartphone protractor and the marker and protractor. The parameters included lumbar lordosis, pelvic incidence, and pelvic tilt. Three raters performed all measurements: a junior-level orthopedic resident, a senior-level orthopedic resident, and a fellowship-trained spinal deformity surgeon. All data, including the time to perform the measurements, were recorded, and statistical analysis was performed to

  10. Validation of Cloud Parameters Derived from Geostationary Satellites, AVHRR, MODIS, and VIIRS Using SatCORPS Algorithms

    NASA Technical Reports Server (NTRS)

    Minnis, P.; Sun-Mack, S.; Bedka, K. M.; Yost, C. R.; Trepte, Q. Z.; Smith, W. L., Jr.; Painemal, D.; Chen, Y.; Palikonda, R.; Dong, X.; et al.

    2016-01-01

    Validation is a key component of remote sensing that can take many different forms. The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) is applied to many different imager datasets, including those from the geostationary satellites Meteosat, Himawari-8, INSAT-3D, GOES, and MTSAT, as well as from the low-Earth-orbiting satellite imagers MODIS, AVHRR, and VIIRS. While each of these imagers has a similar set of channels with wavelengths near 0.65, 3.7, 11, and 12 micrometers, many differences among them can lead to discrepancies in the retrievals. These differences include spatial resolution, spectral response functions, viewing conditions, and calibrations, among others. Even when the imagers are analyzed with nearly identical algorithms, it is necessary, because of those discrepancies, to validate the results from each imager separately in order to assess the uncertainties in the individual parameters. This paper presents comparisons of various SatCORPS-retrieved cloud parameters with independent measurements and retrievals from a variety of instruments. These include surface- and space-based lidar and radar data from CALIPSO and CloudSat, respectively, to assess cloud fraction, height, base, optical depth, and ice water path; satellite and surface microwave radiometers to evaluate cloud liquid water path; surface-based radiometers to evaluate optical depth and effective particle size; and airborne in-situ data to evaluate ice water content, effective particle size, and other parameters. The results of these comparisons are contrasted, and the factors influencing the differences are discussed.

  11. Experience of the JPL Exploratory Data Analysis Team at validating HIRS2/MSU cloud parameters

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph; Haskins, Robert D.; Granger-Gallegos, Stephanie; Pursch, Andrew; Delgenio, Anthony

    1992-01-01

    Validation of the HIRS2/MSU cloud parameters began with the cloud/climate feedback problem. The derived effective cloud amount is less sensitive to surface temperature for higher clouds. This occurs because, as the cloud elevation increases, the difference between surface temperature and cloud temperature increases, so only a small change in cloud amount is needed to effect a large change in radiance at the detector. Validating the cloud parameters here means 'developing a quantitative sense for the physical meaning of the measured parameters', by: (1) identifying the assumptions involved in deriving parameters from the measured radiances, (2) testing the input data and derived parameters for statistical error, sensitivity, and internal consistency, and (3) comparing with similar parameters obtained from other sources using other techniques.

  12. The development and validation of different decision-making tools to predict urine culture growth out of urine flow cytometry parameter.

    PubMed

    Müller, Martin; Seidenberg, Ruth; Schuh, Sabine K; Exadaktylos, Aristomenis K; Schechter, Clyde B; Leichtle, Alexander B; Hautz, Wolf E

    2018-01-01

    Patients presenting with suspected urinary tract infection are common in everyday emergency practice. Urine flow cytometry has replaced microscopic urine evaluation in many emergency departments, but interpretation of the results remains challenging. The aim of this study was to develop and validate tools that predict urine culture growth from urine flow cytometry parameters. This retrospective study included all adult patients who presented to a large emergency department between January and July 2017 with a suspected urinary tract infection and had both a urine flow cytometry and a urine culture obtained. The objective was to identify urine flow cytometry parameters that reliably predict urine culture growth and mixed flora growth. The data set was split into a training (70%) and a validation (30%) set, and different decision-making approaches were developed and validated. Relevant urine culture growth (respectively, mixed flora growth) was found in 40.2% (respectively, 7.2%) of the 613 patients included. The numbers of leukocytes and bacteria in flow cytometry were highly associated with urine culture growth, but mixed flora growth could not be sufficiently predicted from the urine flow cytometry parameters. A decision tree, predictive value figures, a nomogram, and a cut-off table to predict urine culture growth from bacteria and leukocyte counts were developed, validated and compared. Urine flow cytometry parameters are insufficient to predict mixed flora growth. However, the prediction of urine culture growth based on bacteria and leukocyte counts is highly accurate, and the developed tools should be used as part of the decision-making process of ordering a urine culture or starting antibiotic therapy if a urogenital infection is suspected.
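
    The modeling pipeline outlined above (70/30 split, a tree on bacteria and leukocyte counts) has the following general shape; the synthetic counts and the cut-offs learned below are illustrative only, not the published tool:

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(5)
    n = 613
    bacteria = rng.lognormal(4, 2, n)        # flow-cytometry bacteria per µL
    leukocytes = rng.lognormal(3, 2, n)      # flow-cytometry leukocytes per µL
    growth = (np.log(bacteria) + 0.5 * np.log(leukocytes)
              + rng.normal(0, 2, n)) > 6.5   # simulated culture result

    X = np.column_stack([bacteria, leukocytes])
    X_tr, X_va, y_tr, y_va = train_test_split(X, growth, test_size=0.3,
                                              random_state=0)
    tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
    print(f"validation accuracy: {tree.score(X_va, y_va):.2f}")
    ```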

  13. The development and validation of different decision-making tools to predict urine culture growth out of urine flow cytometry parameter

    PubMed Central

    Seidenberg, Ruth; Schuh, Sabine K.; Exadaktylos, Aristomenis K.; Schechter, Clyde B.; Leichtle, Alexander B.; Hautz, Wolf E.

    2018-01-01

    Objective: Patients presenting with suspected urinary tract infection are common in everyday emergency practice. Urine flow cytometry has replaced microscopic urine evaluation in many emergency departments, but interpretation of the results remains challenging. The aim of this study was to develop and validate tools that predict urine culture growth from urine flow cytometry parameters. Methods: This retrospective study included all adult patients who presented to a large emergency department between January and July 2017 with a suspected urinary tract infection and had both a urine flow cytometry and a urine culture obtained. The objective was to identify urine flow cytometry parameters that reliably predict urine culture growth and mixed flora growth. The data set was split into a training (70%) and a validation (30%) set, and different decision-making approaches were developed and validated. Results: Relevant urine culture growth (respectively, mixed flora growth) was found in 40.2% (respectively, 7.2%) of the 613 patients included. The numbers of leukocytes and bacteria in flow cytometry were highly associated with urine culture growth, but mixed flora growth could not be sufficiently predicted from the urine flow cytometry parameters. A decision tree, predictive value figures, a nomogram, and a cut-off table to predict urine culture growth from bacteria and leukocyte counts were developed, validated and compared. Conclusions: Urine flow cytometry parameters are insufficient to predict mixed flora growth. However, the prediction of urine culture growth based on bacteria and leukocyte counts is highly accurate, and the developed tools should be used as part of the decision-making process of ordering a urine culture or starting antibiotic therapy if a urogenital infection is suspected. PMID:29474463

  14. Method of validating measurement data of a process parameter from a plurality of individual sensor inputs

    DOEpatents

    Scarola, Kenneth; Jamison, David S.; Manazir, Richard M.; Rescorl, Robert L.; Harmon, Daryl L.

    1998-01-01

    A method for generating a validated measurement of a process parameter at a point in time by using a plurality of individual sensor inputs from a scan of said sensors at said point in time. The sensor inputs from said scan are stored, and a first validation pass is initiated by computing an initial average of all stored sensor inputs. Each sensor input is deviation-checked by comparing it, including a preset tolerance, against the initial average. If the first deviation check is unsatisfactory, the sensor which produced the unsatisfactory input is flagged as suspect. It is then determined whether at least two of the inputs have not been flagged as suspect and are therefore considered good inputs. If two or more inputs are good, a second validation pass is initiated by computing a second average of all the good sensor inputs and deviation-checking the good inputs by comparing each, including a preset tolerance, against the second average. If the second deviation check is satisfactory, the second average is displayed as the validated measurement and the suspect sensor is flagged as bad. A validation fault occurs if at least two inputs are not considered good, or if the second deviation check is not satisfactory. In the latter situation, the inputs from all the sensors are compared against the last validated measurement, and the value from the sensor input that deviates the least from the last valid measurement is displayed.
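
    The claimed two-pass logic translates almost directly into code. A minimal sketch, with the tolerance handling simplified to one absolute band:

    ```python
    def validate(inputs, tolerance, last_valid):
        """Return (validated_value, status) from one scan of redundant sensors."""
        # First pass: average everything, flag inputs outside the tolerance band.
        first_avg = sum(inputs) / len(inputs)
        good = [x for x in inputs if abs(x - first_avg) <= tolerance]
        if len(good) >= 2:
            # Second pass: re-average the good inputs and re-check them.
            second_avg = sum(good) / len(good)
            if all(abs(x - second_avg) <= tolerance for x in good):
                return second_avg, "validated"
        # Validation fault: fall back to the input closest to the last
        # validated measurement.
        return min(inputs, key=lambda x: abs(x - last_valid)), "fault"

    # One bad sensor (120.0) is flagged on the first pass and excluded.
    print(validate([101.0, 99.5, 100.2, 120.0], tolerance=6.0, last_valid=100.0))
    ```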

  15. Revisiting Hansen Solubility Parameters by Including Thermodynamics.

    PubMed

    Louwerse, Manuel J; Maldonado, Ana; Rousseau, Simon; Moreau-Masselon, Chloe; Roux, Bernard; Rothenberg, Gadi

    2017-11-03

    The Hansen solubility parameter approach is revisited by implementing the thermodynamics of dissolution and mixing. Hansen's pragmatic approach has earned its spurs in predicting solvents for polymer solutions, but for molecular solutes improvements are needed. By going into the details of entropy and enthalpy, several corrections are suggested that make the methodology thermodynamically sound without losing its ease of use. The most important corrections include accounting for the size of the solvent molecules, the destruction of the solid's crystal structure, and the specificity of hydrogen-bonding interactions, as well as opportunities to predict the solubility at extrapolated temperatures. When the original and the improved methods were tested on a large industrial dataset including solvent blends, fit quality improved from 0.89 to 0.97 and the percentage of correct predictions rose from 54% to 78%. Full Matlab scripts are included in the Supporting Information, allowing readers to implement these improvements on their own datasets. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
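
    For context, the classical Hansen framework that the paper amends scores solvent-solute pairs by the distance Ra, where Ra² = 4(δD₁-δD₂)² + (δP₁-δP₂)² + (δH₁-δH₂)². A sketch follows (ethanol's parameters are textbook figures; the solute values and interaction radius are hypothetical):

    ```python
    import math

    def hansen_distance(d1, p1, h1, d2, p2, h2):
        """Ra between two (dispersion, polar, hydrogen-bonding) parameter sets."""
        return math.sqrt(4 * (d1 - d2) ** 2 + (p1 - p2) ** 2 + (h1 - h2) ** 2)

    ra = hansen_distance(15.8, 8.8, 19.4,   # ethanol (MPa^0.5)
                         18.6, 10.5, 7.5)   # a hypothetical solute
    r0 = 12.7                               # assumed interaction radius of the solute
    print(f"Ra = {ra:.2f}, RED = {ra / r0:.2f}")  # RED < 1 suggests solubility
    ```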

  16. Automated extraction and validation of children's gait parameters with the Kinect.

    PubMed

    Motiian, Saeid; Pergami, Paola; Guffey, Keegan; Mancinelli, Corrie A; Doretto, Gianfranco

    2015-12-02

    Gait analysis for therapy regimen prescription and monitoring requires patients to physically access clinics with specialized equipment. The timely availability of such infrastructure at the right frequency is especially important for small children. Besides being very costly, this is a challenge for many children living in rural areas. This is why this work develops a low-cost, portable, and automated approach for in-home gait analysis, based on the Microsoft Kinect. A robust and efficient method for extracting gait parameters is introduced, which copes with the high variability of noisy Kinect skeleton tracking data experienced across the population of young children. This is achieved by temporally segmenting the data with an approach based on coupling a probabilistic matching of stride template models, learned offline, with the estimation of their global and local temporal scaling. A preliminary study conducted on healthy children between 2 and 4 years of age analyzes the accuracy, precision, repeatability, and concurrent validity of the proposed method against the GAITRite when measuring several spatial and temporal children's gait parameters. The method has excellent accuracy and good precision in segmenting temporal sequences of body joint locations into stride and step cycles. Also, the spatial and temporal gait parameters, estimated automatically, exhibit good concurrent validity with those provided by the GAITRite, as well as very good repeatability. In particular, on a range of nine gait parameters, the relative and absolute agreements were found to be good and excellent, and the overall agreements were found to be good and moderate. This work enables and validates the automated use of the Kinect for children's gait analysis in healthy subjects. In particular, the approach makes a step forward towards developing a low-cost, portable, parent-operated in-home tool for clinicians assisting young children.

  17. Concurrent validity and reliability of wireless instrumented insoles measuring postural balance and temporal gait parameters.

    PubMed

    Oerbekke, Michiel S; Stukstette, Mirelle J; Schütte, Kurt; de Bie, Rob A; Pisters, Martijn F; Vanwanseele, Benedicte

    2017-01-01

    The OpenGo seems promising for taking gait analysis out of laboratory settings due to its capability for long-term measurement and its mobility. However, the OpenGo's concurrent validity and reliability need to be assessed to determine whether the instrument is suitable for validation in patient samples. Twenty healthy volunteers participated. Center-of-pressure data were collected under eyes-open and eyes-closed conditions with participants performing unilateral stance trials on the gold standard (AMTI OR6-7 force plate) while wearing the OpenGo. Temporal gait data (stance time, gait cycle time, and cadence) were collected at a self-selected comfortable walking speed with participants performing test-retest trials on an instrumented treadmill while wearing the OpenGo. Validity was assessed using Bland-Altman plots. Reliability was assessed with the intraclass correlation coefficient (2,1), and smallest detectable changes were calculated. Negative means of differences were found in all measured parameters, illustrating lower scores for the OpenGo on average. The OpenGo showed negative upper limits of agreement in center-of-pressure parameters on the mediolateral axis. Temporal reliability ICCs ranged from 0.90 to 0.93. Smallest detectable changes were 0.04 s (left) and 0.05 s (right) for stance time, 0.08 s for gait cycle time, and 4.5 steps per minute for cadence. The OpenGo is valid and reliable for the measurement of temporal gait parameters during walking. Measurements of center-of-pressure parameters during unilateral stance are not considered valid. The OpenGo seems a promising instrument for clinically screening and monitoring temporal gait parameters in patients; however, validation in patient populations is needed. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. 40 CFR 60.4410 - How do I establish a valid parameter range if I have chosen to continuously monitor parameters?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    40 CFR Protection of Environment, Vol. 6 (2011-07-01). Section 60.4410: How do I establish a valid parameter range if I have chosen to continuously monitor parameters? Environmental Protection Agency (continued), Air Programs (continued), Standards of Performance for New Stationary Sources.

  19. Is the smile line a valid parameter for esthetic evaluation? A systematic literature review.

    PubMed

    Passia, Nicole; Blatz, Markus; Strub, Jörg Rudolf

    2011-01-01

    The "smile line" is commonly used as a parameter to evaluate and categorize a person's smile. This systematic literature review assessed the existing evidence on the validity and universal applicability of this parameter. The latter was evaluated based on studies on smile perception by orthodontists, general clinicians, and laypeople. A review of the literature published between October 1973 and January 2010 was conducted with the electronic database Pubmed and the search terms "smile," "smile line," "smile arc," and "smile design." The search yielded 309 articles, of which nine studies were included based on the selection criteria. The selected studies typically correlate the smile line with the position of the upper lip during a smile while, on average, 75 to 100% of the maxillary anterior teeth are exposed. A virtual line that connects the incisal edges of the maxillary anterior teeth commonly follows the upper border of the lower lip. Average and parallel smile lines are most common, influenced by the age and gender of a person. Orthodontists, general clinicians, and laypeople have similar preferences and rate average smile lines as most attractive. The smile line is a valid tool to assess the esthetic appearance of a smile. It can be applied universally as clinicians and laypersons perceive and judge it similarly.

  20. Validity and repeatability of inertial measurement units for measuring gait parameters.

    PubMed

    Washabaugh, Edward P; Kalyanaraman, Tarun; Adamczyk, Peter G; Claflin, Edward S; Krishnan, Chandramouli

    2017-06-01

    Inertial measurement units (IMUs) are small wearable sensors that have tremendous potential for application to clinical gait analysis. They allow objective evaluation of gait and movement disorders outside the clinic and research laboratory, and permit evaluation over large numbers of steps. However, repeatability and validity data for these systems are sparse for gait metrics. The purpose of this study was to determine the validity and between-day repeatability of spatiotemporal metrics (gait speed, stance percent, swing percent, gait cycle time, stride length, cadence, and step duration) as measured with the APDM Opal IMUs and Mobility Lab system. We collected data on 39 healthy subjects. Subjects were tested over two days while walking on a standard treadmill, split-belt treadmill, or overground, with IMUs placed in two locations: on both feet and on both ankles. The spatiotemporal measurements taken with the IMU system were validated against data from an instrumented treadmill or using standard clinical procedures. Repeatability and minimal detectable change (MDC) of the system were calculated between days. IMUs displayed high to moderate validity when measuring most of the gait metrics tested. Additionally, these measurements appear to be repeatable when used on the treadmill and overground. The foot configuration of the IMUs appeared to better measure gait parameters; however, both the foot and ankle configurations demonstrated good repeatability. In conclusion, the IMU system in this study appears to be both accurate and repeatable for measuring spatiotemporal gait parameters in healthy young adults. Copyright © 2017 Elsevier B.V. All rights reserved.
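
    The minimal detectable change reported in studies like this follows from the ICC in two standard steps; the numbers below are illustrative:

    ```python
    import math

    sd = 0.12      # between-subject SD of a gait metric, e.g. stride time (s)
    icc = 0.90     # test-retest reliability
    sem = sd * math.sqrt(1 - icc)        # standard error of measurement
    mdc95 = 1.96 * math.sqrt(2) * sem    # smallest real change at 95% confidence
    print(f"SEM = {sem:.3f} s, MDC95 = {mdc95:.3f} s")
    ```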

  1. Parameter Validation for Evaluation of Spaceflight Hardware Reusability

    NASA Technical Reports Server (NTRS)

    Childress-Thompson, Rhonda; Dale, Thomas L.; Farrington, Phillip

    2017-01-01

    Within recent years, there has been an influx of companies around the world pursuing reusable systems for space flight. Much like NASA, many of these new entrants are learning that reusable systems are complex and difficult to achieve. For instance, in its first attempts to retrieve spaceflight hardware for future reuse, SpaceX unsuccessfully tried to land on a barge at sea, resulting in a crash landing. As this new generation of launch developers continues to develop concepts for reusable systems, having a systematic approach for determining the most effective systems for reuse is paramount. Three factors that influence the effective implementation of reusability are cost, operability and reliability. Therefore, a method that integrates these factors into the decision-making process must be utilized to adequately determine whether hardware used in space flight should be reused or discarded. Previous research has identified seven features that contribute to the successful implementation of reusability for space flight applications, defined reusability for space flight applications, highlighted the importance of reusability, and presented areas that hinder its successful implementation. The next step is to ensure that the list of reusability parameters previously identified is comprehensive and that any duplication is either removed or consolidated. The characteristics needed to judge the seven features as good indicators of successful reuse are identified and then assessed using multi-attribute decision making. Next, discriminators in the form of metrics or descriptors are assigned to each parameter. This paper explains the approach used to evaluate these parameters, define the Measures of Effectiveness (MOEs) for reusability, and quantify these parameters. Using the MOEs, each parameter is assessed for its contribution to the reusability of the hardware. Potential data sources needed to validate the approach will be identified.

  2. Support Vector Data Description Model to Map Specific Land Cover with Optimal Parameters Determined from a Window-Based Validation Set.

    PubMed

    Zhang, Jinshui; Yuan, Zhoumiqi; Shuai, Guanyuan; Pan, Yaozhong; Zhu, Xiufang

    2017-04-26

    This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for the support vector data description (SVDD) model to map a specific land cover by integrating training and window-based validation sets. Compared to the conventional approach, where the validation set included target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tightened hypersphere because of the compact constraint imposed by the outlier pixels, which were located neighboring the target class in the spectral feature space. The overall accuracies achieved for wheat and bare land were as high as 89.25% and 83.65%, respectively. However, the target class was underestimated because the validation set covered only a small fraction of the heterogeneous spectra of the target class. Different window sizes were then tested to acquire more wheat pixels for the validation set. The results showed that classification accuracy increased with increasing window size, and the overall accuracies were higher than 88% at all window sizes. Moreover, WVS-SVDD showed much less sensitivity to untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits in using the optimal parameters, the tradeoff coefficient (C) and kernel width (s), in mapping a homogeneous specific land cover.
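
    With an RBF kernel, SVDD can be sketched using scikit-learn's OneClassSVM (a closely related one-class formulation); nu and gamma below stand in for the paper's tradeoff coefficient C and kernel width s, and all pixel data are synthetic:

    ```python
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(3)
    target = rng.normal(0.40, 0.05, (200, 4))    # pixels of the target class
    outliers = rng.normal(0.55, 0.05, (50, 4))   # neighboring non-target pixels
    train, val_t = target[:150], target[150:]

    best_score, best_model = -np.inf, None
    for nu in (0.01, 0.05, 0.10):                # plays the role of C
        for gamma in (0.5, 1.0, 2.0):            # plays the role of width s
            m = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(train)
            # Window-based idea: reward accepting held-out target pixels while
            # rejecting outliers that neighbor the class in feature space.
            score = m.predict(val_t).mean() - m.predict(outliers).mean()
            if score > best_score:
                best_score, best_model = score, m

    rejected = (best_model.predict(outliers) == -1).sum()
    print(f"best validation score {best_score:.2f}; outliers rejected: {rejected}/50")
    ```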

  3. DES Y1 Results: Validating Cosmological Parameter Estimation Using Simulated Dark Energy Surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacCrann, N.; et al.

    We use mock galaxy survey simulations designed to resemble the Dark Energy Survey Year 1 (DES Y1) data to validate and inform cosmological parameter estimation. When similar analysis tools are applied to both simulations and real survey data, they provide powerful validation tests of the DES Y1 cosmological analyses presented in companion papers. We use two suites of galaxy simulations produced using different methods, which therefore provide independent tests of our cosmological parameter inference. The cosmological analysis we aim to validate is presented in DES Collaboration et al. (2017) and uses angular two-point correlation functions of galaxy number counts and weak lensing shear, as well as their cross-correlation, in multiple redshift bins. While our constraints depend on the specific set of simulated realizations available, for both suites of simulations we find that the input cosmology is consistent with the combined constraints from multiple simulated DES Y1 realizations in the Ω_m-σ_8 plane. For one of the suites, we are able to show with high confidence that any biases in the inferred S_8 = σ_8(Ω_m/0.3)^0.5 and Ω_m are smaller than the DES Y1 1σ uncertainties. For the other suite, for which we have fewer realizations, we are unable to be this conclusive; we infer a roughly 70% probability that systematic biases in the recovered Ω_m and S_8 are subdominant to the DES Y1 uncertainty. As cosmological analyses of this kind become increasingly precise, validation of parameter inference using survey simulations will be essential to demonstrate robustness.

  4. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study.

    PubMed

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Noh, Kyoung Jin; Shim, Hackjoon; Seol, Hae Young

    2017-05-01

    We developed a semi-automated volumetric software package, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, a measure of capillary permeability), of brain tumors were generated by commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors, which showed perfusion trends consistent with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρ_c) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limits of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates their perfusion parameters. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP.

  5. Validating a large geophysical data set: Experiences with satellite-derived cloud parameters

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie

    1992-01-01

    We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed

  6. Multi-parameter Observations and Validation of Pre-earthquake Atmospheric Signals

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Pulinets, S. A.; Hattori, K.; Mogi, T.; Kafatos, M.

    2014-12-01

    We present the latest developments in multi-sensor observations of short-term pre-earthquake phenomena preceding major earthquakes. We are exploring the potential of pre-seismic atmospheric and ionospheric signals to alert for large earthquakes. To achieve this, we have started validating anomalous ionospheric/atmospheric signals in retrospective and prospective modes. The integrated satellite and terrestrial framework (ISTF) is our method for validation and is based on a joint analysis of several physical and environmental parameters (satellite thermal infrared radiation (OLR), electron concentration in the ionosphere (GPS/TEC), VHF-band radio waves, radon/ion activities, air temperature, and seismicity patterns) that have been found to be associated with earthquakes. The scientific rationale for multidisciplinary analysis is based on the concept of Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) [Pulinets and Ouzounov, 2011], which explains the synergy of different geospace processes and anomalous variations, usually named short-term pre-earthquake anomalies. Our validation process consists of two steps: (1) a continuous retrospective analysis performed over two different regions with high seismicity, Taiwan and Japan, for 2003-2009; the retrospective tests (100+ major earthquakes, M > 5.9, in Taiwan and Japan) show anomalous OLR behavior before all of these events with no false negatives, and the false-alarm ratio for false positives is less than 25%; (2) prospective testing of multiple parameters for M5.5+ events. The initial prospective testing shows a systematic appearance of atmospheric anomalies one to several days prior to the largest M5.5+ events for Taiwan and Japan (Honshu and Hokkaido areas). This feature could be further studied and tested to advance the multi-sensor detection of pre-earthquake atmospheric signals.

  7. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    1999-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include ASTER, CERES, MISR, MODIS and MOPITT. In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high-quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low-altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described, with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community. Detailed information about the EOS Terra Validation Program can be found on the EOS Validation program

  8. The Model Human Processor and the Older Adult: Parameter Estimation and Validation Within a Mobile Phone Task

    PubMed Central

    Jastrzembski, Tiffany S.; Charness, Neil

    2009-01-01

    The authors estimate weighted mean values for nine information-processing parameters for older adults using the Card, Moran, and Newell (1983) Model Human Processor model. The authors validate a subset of these parameters by modeling two mobile phone tasks using two different phones and comparing model predictions to a sample of younger (N = 20; M_age = 20) and older (N = 20; M_age = 69) adults. Older adult models fit keystroke-level performance at the aggregate grain of analysis extremely well (R = 0.99) and produced fits equivalent to previously validated younger adult models. Critical path analyses highlighted points of poor design as a function of cognitive workload, hardware/software design, and user characteristics. The findings demonstrate that the estimated older adult information-processing parameters are valid for modeling purposes, can help designers understand age-related performance using existing interfaces, and may support the development of age-sensitive technologies. PMID:18194048

  9. The Model Human Processor and the older adult: parameter estimation and validation within a mobile phone task.

    PubMed

    Jastrzembski, Tiffany S; Charness, Neil

    2007-12-01

    The authors estimate weighted mean values for nine information-processing parameters for older adults using the Card, Moran, and Newell (1983) Model Human Processor model. The authors validate a subset of these parameters by modeling two mobile phone tasks using two different phones and comparing model predictions to a sample of younger (N = 20; M_age = 20) and older (N = 20; M_age = 69) adults. Older adult models fit keystroke-level performance at the aggregate grain of analysis extremely well (R = 0.99) and produced fits equivalent to previously validated younger adult models. Critical path analyses highlighted points of poor design as a function of cognitive workload, hardware/software design, and user characteristics. The findings demonstrate that the estimated older adult information-processing parameters are valid for modeling purposes, can help designers understand age-related performance using existing interfaces, and may support the development of age-sensitive technologies.
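
    The keystroke-level use of such parameters reduces to summing serial stage times. A toy sketch: the young-adult stage times are the nominal Card, Moran, and Newell values, while the older-adult values here are placeholders, not the paper's estimates:

    ```python
    params_young = {"perceive": 0.100, "cycle": 0.070, "motor": 0.070}  # CMN nominal values (s)
    params_older = {"perceive": 0.178, "cycle": 0.118, "motor": 0.146}  # placeholder values

    def dial_time(p, digits=10):
        """Serial stage model: perceive a digit, decide, press a key, repeat."""
        return digits * (p["perceive"] + p["cycle"] + p["motor"])

    print(f"young: {dial_time(params_young):.2f} s, "
          f"older: {dial_time(params_older):.2f} s")
    ```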

  10. Validation of Essential Acoustic Parameters for Highly Urgent In-Vehicle Collision Warnings.

    PubMed

    Lewis, Bridget A; Eisert, Jesse L; Baldwin, Carryl L

    2018-03-01

    Objective: The aim of this study was to validate the importance of key acoustic criteria for use in in-vehicle forward collision warning (FCW) systems. Background: Despite recent advances in vehicle safety, automobile crashes remain one of the leading causes of death. As automation allows for more control of noncritical functions by the vehicle, the potential for disengagement and distraction from the driving task also increases. It is, therefore, as important as ever that in-vehicle safety-critical interfaces are intuitive and unambiguous, promoting effective collision avoidance responses upon first exposure, even under divided-attention conditions. Method: The current study used a driving simulator to assess the effectiveness of two warnings, one that met all essential acoustic parameters and one that met only some, against a no-warning control in the context of a lead-vehicle-following task in conjunction with a cognitive distractor task and a collision event. Results: Participants receiving an FCW comprising five essential acoustic components had improved collision avoidance responses relative to a no-warning condition and an FCW missing essential elements on their first exposure. Responses to a consistently good warning (GMU Prime) improved with subsequent exposures, whereas continued exposure to the less optimal FCW (GMU Sub-Prime) resulted in poorer performance, even relative to receiving no warning at all. Conclusions: This study provides support for previous warning design studies and for the validity of five key acoustic parameters essential to the design of effective in-vehicle FCWs. Application: Results from this study have implications for the design of auditory FCWs and in-vehicle display design.

  11. Validity of a questionnaire measuring motives for choosing foods including sustainable concerns.

    PubMed

    Sautron, Valérie; Péneau, Sandrine; Camilleri, Géraldine M; Muller, Laurent; Ruffieux, Bernard; Hercberg, Serge; Méjean, Caroline

    2015-04-01

    Since the 1990s, the sustainability of diet has become an increasingly important concern for consumers. However, no validated multidimensional measure of food choice motives that includes sustainability concerns is currently available. In the present study, we developed a questionnaire that measures food choice motives during purchasing, and we tested its psychometric properties. The questionnaire included 104 items divided into four predefined dimensions (environmental, health and well-being, economic, and miscellaneous). It was administered to 1000 randomly selected subjects participating in the Nutrinet-Santé cohort study. Among the 637 responders, one-third found the questionnaire complex or too long, while one-quarter found it difficult to fill in. Its underlying structure was determined by exploratory factor analysis and then internally validated by confirmatory factor analysis. Reliability was also assessed by the internal consistency of the selected dimensions and by test-retest repeatability. After selection of the most relevant items, first-order analysis highlighted nine main dimensions: labeled ethics and environment, local and traditional production, taste, price, environmental limitations, health, convenience, innovation, and absence of contaminants. The model demonstrated excellent internal validity (adjusted goodness-of-fit index = 0.97; standardized root mean square residual = 0.07) and satisfactory reliability (internal consistency = 0.96; test-retest repeatability coefficients ranged between 0.31 and 0.68 over a mean 4-week period). This study enabled precise identification of the various dimensions of food choice motives and proposes an original, internally valid tool applicable to large populations for assessing consumer food motivation during purchasing, particularly in terms of sustainability. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Discrimination of Clover and Citrus Honeys from Egypt According to Floral Type Using Easily Assessable Physicochemical Parameters and Discriminant Analysis: An External Validation of the Chemometric Approach.

    PubMed

    Karabagias, Ioannis K; Karabournioti, Sofia

    2018-05-03

    Twenty-two honey samples, namely clover and citrus honeys, were collected from the greater Cairo area during the harvesting year 2014-2015. The main purpose of the present study was to characterize the aforementioned honey types and to investigate whether the use of easily assessable physicochemical parameters, including color attributes, in combination with chemometrics could differentiate honey floral origin. Parameters taken into account were: pH, electrical conductivity, ash, free acidity, lactonic acidity, total acidity, moisture content, total sugars (degrees Brix, °Bx), total dissolved solids and their ratio to total acidity, salinity, and CIELAB color parameters, along with browning index values. Results showed that all honey samples analyzed met the European quality standards set for honey and varied in the aforementioned physicochemical parameters depending on floral origin. Application of linear discriminant analysis showed that eight physicochemical parameters, including color, could classify Egyptian honeys according to floral origin (p < 0.05). The correct classification rate was 95.5% using the original method and 90.9% using the cross-validation method. The discriminatory ability of the developed model was further validated using unknown honey samples; the overall correct classification rate was not affected. Analysis of specific physicochemical parameters in combination with chemometrics has the potential to highlight the differences among floral honeys produced in a given geographical zone.
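
    The chemometric step has this general shape in code: LDA on the physicochemical features with cross-validation. The 22-sample, 8-feature setup mirrors the paper, but the feature values below are synthetic:

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(7)
    X = np.vstack([rng.normal(0.0, 1.0, (11, 8)),   # "clover" feature vectors
                   rng.normal(1.5, 1.0, (11, 8))])  # "citrus" feature vectors
    y = np.r_[np.zeros(11, dtype=int), np.ones(11, dtype=int)]

    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                          cv=LeaveOneOut()).mean()
    print(f"cross-validated correct classification rate: {acc:.1%}")
    ```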

  13. Discrimination of Clover and Citrus Honeys from Egypt According to Floral Type Using Easily Assessable Physicochemical Parameters and Discriminant Analysis: An External Validation of the Chemometric Approach

    PubMed Central

    Karabournioti, Sofia

    2018-01-01

    Twenty-two honey samples, namely clover and citrus honeys, were collected from the greater Cairo area during the harvesting year 2014–2015. The main purpose of the present study was to characterize the aforementioned honey types and to investigate whether the use of easily assessable physicochemical parameters, including color attributes, in combination with chemometrics could differentiate honey floral origin. Parameters taken into account were: pH, electrical conductivity, ash, free acidity, lactonic acidity, total acidity, moisture content, total sugars (degrees Brix, °Bx), total dissolved solids and their ratio to total acidity, salinity, and CIELAB color parameters, along with browning index values. Results showed that all honey samples analyzed met the European quality standards set for honey and varied in the aforementioned physicochemical parameters depending on floral origin. Application of linear discriminant analysis showed that eight physicochemical parameters, including color, could classify Egyptian honeys according to floral origin (p < 0.05). The correct classification rate was 95.5% using the original method and 90.9% using the cross-validation method. The discriminatory ability of the developed model was further validated using unknown honey samples; the overall correct classification rate was not affected. Analysis of specific physicochemical parameters in combination with chemometrics has the potential to highlight the differences among floral honeys produced in a given geographical zone. PMID:29751543

  14. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    2000-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and the Earth's Radiant Energy System (CERES), Multi-Angle Imaging Spectroradiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS) and Measurements of Pollution in the Troposphere (MOPITT). In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high-quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low-altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra

  15. C-parameter distribution at N³LL′ including power corrections

    DOE PAGES

    Hoang, André H.; Kolodrubetz, Daniel W.; Mateu, Vicent; ...

    2015-05-15

    We compute the e⁺e⁻ C-parameter distribution using the soft-collinear effective theory with a resummation to next-to-next-to-next-to-leading-log prime accuracy of the most singular partonic terms. This includes the known fixed-order QCD results up to O(α_s^3), a numerical determination of the two-loop nonlogarithmic term of the soft function, and all logarithmic terms in the jet and soft functions up to three loops. Our result holds for C in the peak, tail, and far tail regions. Additionally, we treat hadronization effects using a field-theoretic nonperturbative soft function, with moments Ω_n. To eliminate an O(Λ_QCD) renormalon ambiguity in the soft function, we switch from the MS-bar to a short-distance "Rgap" scheme to define the leading power correction parameter Ω_1. We show how to simultaneously account for running effects in Ω_1 due to renormalon subtractions and hadron-mass effects, enabling power correction universality between C-parameter and thrust to be tested in our setup. We discuss in detail the impact of resummation and renormalon subtractions on the convergence. In the relevant fit region for α_s(m_Z) and Ω_1, the perturbative uncertainty in our cross section is ≅ 2.5% at Q = m_Z.

  16. The Model Human Processor and the Older Adult: Parameter Estimation and Validation within a Mobile Phone Task

    ERIC Educational Resources Information Center

    Jastrzembski, Tiffany S.; Charness, Neil

    2007-01-01

    The authors estimate weighted mean values for nine information processing parameters for older adults using the Card, Moran, and Newell (1983) Model Human Processor model. The authors validate a subset of these parameters by modeling two mobile phone tasks using two different phones and comparing model predictions to a sample of younger (N = 20;…

  17. Generator Dynamic Model Validation and Parameter Calibration Using Phasor Measurements at the Point of Connection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhenyu; Du, Pengwei; Kosterev, Dmitry

    2013-05-01

    Disturbance data recorded by phasor measurement units (PMUs) offer opportunities to improve the integrity of dynamic models. However, manually tuning parameters through play-back of events demands significant effort and engineering experience. In this paper, a calibration method using the extended Kalman filter (EKF) technique is proposed. The formulation of the EKF with parameter calibration is discussed. Case studies are presented to demonstrate its validity. The proposed calibration method is cost-effective and complementary to traditional equipment testing for improving dynamic model quality.
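
    As a rough illustration of the calibration idea, the sketch below appends an unknown parameter to the state vector of an extended Kalman filter and estimates it from noisy "play-back" measurements. The plant is a toy damped oscillator, not a generator model; all dynamics, noise levels, and initial guesses are assumptions for the example.

```python
# EKF joint state/parameter estimation on a toy oscillator (illustrative only).
import numpy as np

dt, k, d_true, steps = 0.01, 4.0, 0.6, 2000
rng = np.random.default_rng(1)

# Simulate "measured" disturbance data (noisy position of the oscillator).
x = np.array([1.0, 0.0])
zs = []
for _ in range(steps):
    x = x + dt * np.array([x[1], -k * x[0] - d_true * x[1]])
    zs.append(x[0] + 0.01 * rng.normal())

s = np.array([1.0, 0.0, 0.1])     # augmented state [x, v, d]; poor guess for d
P = np.diag([0.1, 0.1, 1.0])
Q = np.diag([1e-6, 1e-6, 1e-6])
R = np.array([[1e-4]])
H = np.array([[1.0, 0.0, 0.0]])
for z in zs:
    xk, vk, dk = s                                   # predict step
    s = s + dt * np.array([vk, -k * xk - dk * vk, 0.0])
    F = np.eye(3) + dt * np.array([[0.0, 1.0, 0.0],
                                   [-k, -dk, -vk],
                                   [0.0, 0.0, 0.0]])
    P = F @ P @ F.T + Q
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # update step
    s = s + K @ (np.atleast_1d(z) - H @ s)
    P = (np.eye(3) - K @ H) @ P
print("calibrated damping:", s[2])  # should approach d_true = 0.6
```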

  18. Fault-tolerant clock synchronization validation methodology. [in computer systems

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Palumbo, Daniel L.; Johnson, Sally C.

    1987-01-01

    A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight-crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating the clock synchronization system of the Software Implemented Fault Tolerance computer. The design proof of the algorithm includes a theorem that defines the maximum skew between any two nonfaulty clocks in the system in terms of specific system parameters. Most of these parameters are deterministic. One crucial parameter is the upper bound on the clock read error, which is stochastic. The probability that this upper bound is exceeded is calculated from data obtained by the measurement of system parameters. This probability is then included in a detailed reliability analysis of the system.

  19. An individual and dynamic Body Segment Inertial Parameter validation method using ground reaction forces.

    PubMed

    Hansen, Clint; Venture, Gentiane; Rezzoug, Nasser; Gorce, Philippe; Isableu, Brice

    2014-05-07

    Over the last decades a variety of research has been conducted with the goal of improving Body Segment Inertial Parameter (BSIP) estimation, but to our knowledge a real validation has never been completely successful, because no ground truth is available. The aim of this paper is to propose a validation method for a BSIP identification method (IM) and to confirm the results by comparing contact forces recalculated through inverse dynamics with those obtained by a force plate. Furthermore, the results are compared with the recently proposed estimation method by Dumas et al. (2007). Additionally, the results are cross-validated with a high-velocity overarm throwing movement. Across all conditions, higher correlations, smaller error metrics, and smaller RMSE values are found for the proposed BSIP estimation (IM), which shows its advantage over recently proposed methods such as that of Dumas et al. (2007). The purpose of the paper is to validate an already proposed method and to show that this method can be of significant advantage compared to conventional methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
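
    A bare-bones version of the comparison step might look like the following: compute the measured CoP from force-plate channels with the usual plate relations (CoP_x = -M_y/F_z, CoP_y = M_x/F_z for a plate with its origin at the surface) and score a model-reconstructed trajectory against it by RMSE. All signals below are synthetic stand-ins for real plate and inverse-dynamics output.

```python
# CoP comparison sketch; plate signals and "model" CoP are synthetic.
import numpy as np

def cop_from_plate(fz, mx, my):
    """CoP on the plate surface (plate origin assumed at the surface)."""
    return np.stack([-my / fz, mx / fz], axis=1)

t = np.linspace(0.0, 5.0, 500)
fz = 700 + 30 * np.sin(2 * np.pi * t)          # N, vertical force
mx = 20 * np.sin(np.pi * t)                    # N*m, plate moments
my = -15 * np.cos(np.pi * t)

cop_meas = cop_from_plate(fz, mx, my)
# Stand-in for the CoP reconstructed from a BSIP set via inverse dynamics:
cop_model = cop_meas + 0.002 * np.random.default_rng(2).normal(size=cop_meas.shape)

rmse = np.sqrt(np.mean(np.sum((cop_meas - cop_model) ** 2, axis=1)))
print(f"CoP RMSE: {rmse * 1000:.2f} mm")
```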

  20. Preliminary validation of assays to measure parameters of calcium metabolism in captive Asian and African elephants in western Europe.

    PubMed

    van Sonsbeek, Gerda R; van der Kolk, Johannes H; van Leeuwen, Johannes P T M; Schaftenaar, Willem

    2011-05-01

    Hypocalcemia is a well-known cause of dystocia in animals, including elephants in captivity. In order to study calcium metabolism in elephants, it is of utmost importance to use properly validated assays, as these might be prone to specific matrix effects in elephant blood. The aim of the current study was to conduct preliminary work for the validation of various parameters involved in calcium metabolism in both blood and urine of captive elephants. Basal values of these parameters were compared between Asian elephants (Elephas maximus) and African elephants (Loxodonta africana). Preliminary testing of total calcium, inorganic phosphorus, and creatinine appeared valid for use in plasma, and of creatinine in urine, in both species. Furthermore, measurements of bone alkaline phosphatase and N-terminal telopeptide of type I collagen appeared valid for use in Asian elephants. Mean heparinized plasma ionized calcium concentration and pH were not significantly affected by 3 cycles of freezing and thawing. Storage at 4 °C, room temperature, and 37 °C for 6, 12, and 24 hr did not alter the heparinized plasma ionized calcium concentration in Asian elephants. The following linear regression equation using pH (range: 6.858-7.887) and ionized calcium concentration in heparinized plasma was utilized: iCa(7.4) (mmol/l) = -2.1075 + 0.3130·pH(actual) + 0.8296·iCa(actual) (mmol/l). Mean basal pH values in Asian elephant whole blood and plasma were 7.40 ± 0.048 and 7.49 ± 0.077, respectively. The urinary specific gravity and creatinine concentrations in both Asian and African elephants were significantly correlated, and both were significantly lower in Asian elephants. © 2011 The Author(s)
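
    The reported pH-correction equation can be transcribed directly; the helper below does only that, and the example values are illustrative, not study data.

```python
# Standardize ionized calcium to pH 7.4 using the regression reported above
# (measured pH assumed to lie in the stated 6.858-7.887 range).
def ica_at_ph74(ph_actual: float, ica_actual: float) -> float:
    """Return iCa(7.4) in mmol/l from measured pH and iCa (mmol/l)."""
    return -2.1075 + 0.3130 * ph_actual + 0.8296 * ica_actual

print(ica_at_ph74(7.40, 1.25))  # ~1.25 mmol/l for a sample already at pH 7.4
```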

  1. The dynamical core of the Aeolus 1.0 statistical-dynamical atmosphere model: validation and parameter optimization

    NASA Astrophysics Data System (ADS)

    Totz, Sonja; Eliseev, Alexey V.; Petri, Stefan; Flechsig, Michael; Caesar, Levke; Petoukhov, Vladimir; Coumou, Dim

    2018-02-01

    We present and validate a set of equations for representing the atmosphere's large-scale general circulation in an Earth system model of intermediate complexity (EMIC). These dynamical equations have been implemented in Aeolus 1.0, which is a statistical-dynamical atmosphere model (SDAM) and includes radiative transfer and cloud modules (Coumou et al., 2011; Eliseev et al., 2013). The statistical-dynamical approach is computationally efficient and thus enables us to perform climate simulations at multimillennia timescales, which is a prime aim of our model development. Further, this computational efficiency enables us to scan large and high-dimensional parameter spaces to tune the model parameters, e.g., for sensitivity studies. Here, we present novel equations for the large-scale zonal-mean wind as well as those for planetary waves. Together with the synoptic parameterization (as presented by Coumou et al., 2011), these form the mathematical description of the dynamical core of Aeolus 1.0. We optimize the dynamical core parameter values by tuning all relevant dynamical fields to ERA-Interim reanalysis data (1983-2009), forcing the dynamical core with prescribed surface temperature, surface humidity and cumulus cloud fraction. We test the model's performance in reproducing the seasonal cycle and the influence of the El Niño-Southern Oscillation (ENSO). We use a simulated annealing optimization algorithm, which approximates the global minimum of a high-dimensional function. With non-tuned parameter values, the model performs reasonably in terms of its representation of zonal-mean circulation, planetary waves and storm tracks. The simulated annealing optimization improves in particular the model's representation of the Northern Hemisphere jet stream and storm tracks as well as the Hadley circulation. The regions of high azonal wind velocities (planetary waves) are accurately captured for all validation experiments. The zonal-mean zonal wind and the integrated lower
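
    The tuning step can be pictured as below: a simulated-annealing search minimizing the mismatch between model output and reference data. The "model" and the reference target here are toy stand-ins, not Aeolus or ERA-Interim; scipy's dual_annealing plays the role of the optimizer.

```python
# Simulated-annealing parameter tuning against a reference (toy example).
import numpy as np
from scipy.optimize import dual_annealing

target = np.array([12.0, -3.5])          # stand-in for reference fields

def model(params):
    a, b = params
    return np.array([a ** 2 + b, a - b ** 2])   # toy "dynamical core" response

def cost(params):
    return np.sqrt(np.mean((model(params) - target) ** 2))  # RMSE to reference

result = dual_annealing(cost, bounds=[(-5.0, 5.0), (-5.0, 5.0)], seed=3)
print("tuned parameters:", result.x, "RMSE:", result.fun)
```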

  2. On the Simulation of Sea States with High Significant Wave Height for the Validation of Parameter Retrieval Algorithms for Future Altimetry Missions

    NASA Astrophysics Data System (ADS)

    Kuschenerus, Mieke; Cullen, Robert

    2016-08-01

    To ensure reliability and precision of wave height estimates for future satellite altimetry missions such as Sentinel 6, reliable parameter retrieval algorithms that can extract significant wave heights up to 20 m have to be established. The retrieval methods therefore need to be validated extensively over a wide range of possible significant wave heights. Although current missions require wave height retrievals up to 20 m, there is little evidence of systematic validation of parameter retrieval methods for sea states with wave heights above 10 m. This paper provides a definition of a set of simulated sea states with significant wave heights up to 20 m, which allow simulation of radar altimeter response echoes for extreme sea states in SAR and low-resolution mode. The simulated radar responses are used to derive significant wave height estimates, which can be compared with the initial models, allowing precision estimates for the applied parameter retrieval methods. We thus establish a validation method for significant wave height retrieval for sea states causing high significant wave heights, to allow improved understanding and planning of future satellite altimetry mission validation.

  3. Validity criteria for the diagnosis of fatty liver by M probe-based controlled attenuation parameter.

    PubMed

    Wong, Vincent Wai-Sun; Petta, Salvatore; Hiriart, Jean-Baptiste; Cammà, Calogero; Wong, Grace Lai-Hung; Marra, Fabio; Vergniol, Julien; Chan, Anthony Wing-Hung; Tuttolomondo, Antonino; Merrouche, Wassil; Chan, Henry Lik-Yuen; Le Bail, Brigitte; Arena, Umberto; Craxì, Antonio; de Lédinghen, Victor

    2017-09-01

    Controlled attenuation parameter (CAP) can be performed together with liver stiffness measurement (LSM) by transient elastography (TE) and is often used to diagnose fatty liver. We aimed to define the validity criteria of CAP. CAP was measured by the M probe prior to liver biopsy in 754 consecutive patients with different liver diseases at three centers in Europe and Hong Kong (derivation cohort, n=340; validation cohort, n=414; 101 chronic hepatitis B, 154 chronic hepatitis C, 349 non-alcoholic fatty liver disease, 37 autoimmune hepatitis, 49 cholestatic liver disease, 64 others; 277 F3-4; age 52±14; body mass index 27.2±5.3 kg/m²). The primary outcome was the diagnosis of fatty liver, defined as steatosis involving ≥5% of hepatocytes. The area under the receiver-operating characteristic curve (AUROC) for CAP diagnosis of fatty liver was 0.85 (95% CI 0.82-0.88). The interquartile range (IQR) of CAP had a negative correlation with CAP (r=-0.32, p<0.001), suggesting the IQR-to-median ratio of CAP would be an inappropriate validity parameter. In the derivation cohort, the IQR of CAP was associated with the accuracy of CAP (AUROC 0.86, 0.89 and 0.76 in patients with IQR of CAP <20 dB/m [15% of patients], 20-39 dB/m [51%], and ≥40 dB/m [33%], respectively). Likewise, the AUROC of CAP in the validation cohort was 0.90 and 0.77 in patients with IQR of CAP <40 and ≥40 dB/m, respectively (p=0.004). The accuracy of CAP in detecting grade 2 and 3 steatosis was lower among patients with body mass index ≥30 kg/m² and F3-4 fibrosis. The validity of CAP for the diagnosis of fatty liver is lower if the IQR of CAP is ≥40 dB/m. Lay summary: Controlled attenuation parameter (CAP) is measured by transient elastography (TE) for the detection of fatty liver. In this large study, using liver biopsy as a reference, we show that the variability of CAP measurements based on its interquartile range can reflect the accuracy of fatty liver diagnosis. In contrast, other clinical factors such
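
    The validity rule reported above reduces to a simple stratified check: flag measurements with an IQR of CAP ≥ 40 dB/m and compare diagnostic accuracy within each stratum. The sketch below does this on simulated placeholder data (so the AUROC gap seen in the study will not necessarily reproduce).

```python
# Stratify CAP accuracy by the IQR-based validity criterion (synthetic data).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 400
fatty = rng.integers(0, 2, n)                    # biopsy: steatosis >= 5%
cap = 240 + 60 * fatty + rng.normal(0, 40, n)    # CAP, dB/m
cap_iqr = rng.uniform(5, 70, n)                  # IQR of repeated CAP readings

valid = cap_iqr < 40                             # validity criterion
print("AUROC, IQR < 40 dB/m :", roc_auc_score(fatty[valid], cap[valid]))
print("AUROC, IQR >= 40 dB/m:", roc_auc_score(fatty[~valid], cap[~valid]))
```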

  4. Deriving aerosol parameters from in-situ spectrometer measurements for validation of remote sensing products

    NASA Astrophysics Data System (ADS)

    Riedel, Sebastian; Janas, Joanna; Gege, Peter; Oppelt, Natascha

    2017-10-01

    Uncertainties in aerosol parameters are the limiting factor for atmospheric correction over inland and coastal waters. For validating remote sensing products from these optically complex and spatially inhomogeneous waters, the spatial resolution of automated sun photometer networks like AERONET is too coarse, and additional measurements at the test site are required. We have developed a method which allows the derivation of aerosol parameters from measurements with any spectrometer of suitable spectral range and resolution. This method uses a pair of downwelling irradiance and sky radiance measurements to extract the turbidity coefficient and the aerosol Ångström exponent. The data can be acquired quickly and reliably at almost any location under a wide range of weather conditions. A comparison with aerosol parameters measured by a Cimel sun photometer provided by AERONET shows reasonable agreement for the Ångström exponent. The turbidity coefficient did not agree well with AERONET values due to fit ambiguities, indicating that future research should focus on methods to handle parameter correlations within the underlying model.
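
    The extraction step rests on the Ångström law τ(λ) = β·λ^(−α); given aerosol optical thickness at a few wavelengths, α and β follow from a log-log linear fit. The sketch below assumes such optical thickness values are already available (the actual method derives them from irradiance/radiance pairs), and the numbers are illustrative.

```python
# Fit the Angstrom exponent and turbidity coefficient from spectral AOT.
import numpy as np

wl_um = np.array([0.44, 0.50, 0.675, 0.87])   # wavelengths, micrometres
tau = np.array([0.21, 0.18, 0.12, 0.085])     # aerosol optical thickness

# ln(tau) = ln(beta) - alpha * ln(lambda)  ->  linear in log-log space
slope, intercept = np.polyfit(np.log(wl_um), np.log(tau), 1)
alpha, beta = -slope, np.exp(intercept)       # beta = tau at 1 micrometre
print(f"Angstrom exponent alpha = {alpha:.2f}, turbidity beta = {beta:.3f}")
```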

  5. Reliability and Validity of Kinetic and Kinematic Parameters Determined With Force Plates Embedded Under a Soil-Filled Baseball Mound.

    PubMed

    Yanai, Toshimasa; Matsuo, Akifumi; Maeda, Akira; Nakamoto, Hiroki; Mizutani, Mirai; Kanehisa, Hiroaki; Fukunaga, Tetsuo

    2017-08-01

    We developed a force measurement system in a soil-filled mound for measuring the ground reaction forces (GRFs) acting on baseball pitchers and examined the reliability and validity of kinetic and kinematic parameters determined from the GRFs. Three soil-filled trays of dimensions that satisfied the official baseball rules were fixed onto 3 force platforms. Eight collegiate pitchers wearing baseball shoes with metal cleats were asked to throw 5 fastballs with maximum effort from the mound toward a catcher. The reliability of each parameter was determined for each subject as the coefficient of variation across the 5 pitches. The validity of the measurements was tested by comparing the outcomes either with the true values or with the corresponding values computed from a motion capture system. The coefficients of variation in the repeated measurements of the peak forces ranged from 0.00 to 0.17, and were smaller for the pivot foot than for the stride foot. The mean absolute errors in the impulses determined over the entire duration of the pitching motion were 5.3 N·s, 1.9 N·s, and 8.2 N·s for the X-, Y-, and Z-directions, respectively. These results suggest that the present method is reliable and valid for determining selected kinetic and kinematic parameters for analyzing pitching performance.
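
    The reliability metric is just the coefficient of variation of a parameter across the five pitches; a minimal sketch (with made-up peak-force values) is shown below.

```python
# Coefficient of variation of a peak GRF across 5 pitches (made-up values).
import numpy as np

peak_fz = np.array([812.0, 798.0, 805.0, 820.0, 801.0])  # N, pivot foot
cv = peak_fz.std(ddof=1) / peak_fz.mean()
print(f"coefficient of variation: {cv:.3f}")
```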

  6. Hydrological Relevant Parameters from Remote Sensing - Spatial Modelling Input and Validation Basis

    NASA Astrophysics Data System (ADS)

    Hochschild, V.

    2012-12-01

    This keynote paper will demonstrate how multisensor remote sensing data are used as spatial input for mesoscale hydrological modeling as well as for sophisticated validation purposes. The tasks of water resources management are addressed, as well as the role of remote sensing in regional catchment modeling. Parameters derived from remote sensing discussed in this presentation are land cover, topographical information from digital elevation models, biophysical vegetation parameters, surface soil moisture, evapotranspiration estimates, lake level measurements, determination of snow-covered area, lake ice cycles, soil erosion type, mass wasting monitoring, sealed area, and flash flood estimation. The actual possibilities of recent satellite and airborne systems are discussed, and the integration of the data into GIS and hydrological modeling, scaling issues, and quality assessment are addressed. The presentation provides an overview of the author's research examples from Germany, Tibet and Africa (Ethiopia, South Africa) as well as other international research activities. Finally, the paper gives an outlook on upcoming sensors and summarizes the possibilities of remote sensing in hydrology.

  7. Parasitic Parameters Extraction for InP DHBT Based on EM Method and Validation up to H-Band

    NASA Astrophysics Data System (ADS)

    Li, Oupeng; Zhang, Yong; Wang, Lei; Xu, Ruimin; Cheng, Wei; Wang, Yuan; Lu, Haiyan

    2017-05-01

    This paper presents a small-signal model for an InGaAs/InP double heterojunction bipolar transistor (DHBT). Parasitic parameters of the access vias and electrode fingers are extracted by 3-D electromagnetic (EM) simulation. By analyzing the equivalent circuits of seven special structures and using the EM simulation results, the parasitic parameters are extracted systematically. Compared with a multi-port s-parameter EM model, the equivalent circuit model has a clear physical interpretation and avoids complex internal port settings. The model is validated on a 0.5 × 7 μm² InP DHBT up to 325 GHz. The model provides a good fit between measured and simulated multi-bias s-parameters over the full band. Finally, an H-band amplifier was designed and fabricated for further verification. The measured amplifier performance agrees well with the model prediction, which indicates the model has good accuracy in the submillimeter-wave band.

  8. Power extraction calculation improvement when local parameters are included

    NASA Astrophysics Data System (ADS)

    Flores-Mateos, L. M.; Hartnett, M.

    2016-02-01

    The improvement of tidal resource assessment is studied by comparing two approaches in a two-dimensional, finite-difference hydrodynamic model, DIVAST-ADI, in a channel of non-varying cross-sectional area that connects two large basins. The first strategy considers a constant thrust coefficient; the second uses the local field parameters around the turbine. These parameters are obtained by applying open channel theory to the tidal stream and by considering the turbine as a linear momentum actuator disk. The parameters correspond to the speeds and depths upstream and downstream of the turbine, as well as the blockage ratio, the wake velocity coefficient, and the bypass coefficient; they have already been incorporated in the model. Figure (a) shows the numerical configuration at high tide developed with DIVAST-ADI. The experiment uses two open boundary conditions: the first is a sinusoidal forcing introduced as a water level located at (I, J=1), and the second keeps a zero velocity and a constant water depth at (I, J=362); when the turbine is introduced, it is placed in the middle of the channel (I=161, J=181). The influence of the turbine on the velocity and elevation around the turbine region is evident; figures (b) and (c) show that the turbine produces a discontinuity in the depth and velocity profiles when a transect is plotted along the channel. Finally, the configuration implemented reproduced the quasi-steady flow condition with satisfactory accuracy, even without shock-capturing capability, for the parameter range 0.01 < α₄ < 0.55

  9. F-18 High Alpha Research Vehicle (HARV) parameter identification flight test maneuvers for optimal input design validation and lateral control effectiveness

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1995-01-01

    Flight test maneuvers are specified for the F-18 High Alpha Research Vehicle (HARV). The maneuvers were designed for open-loop parameter identification purposes, specifically for optimal input design validation at 5 degrees angle of attack, identification of individual strake effectiveness at 40 and 50 degrees angle of attack, and study of lateral dynamics and lateral control effectiveness at 40 and 50 degrees angle of attack. Each maneuver is to be realized by applying square wave inputs to specific control effectors using the On-Board Excitation System (OBES). Maneuver descriptions and complete specifications of the time/amplitude points that define each input are included, along with plots of the input time histories.

  10. Cloud parameters from zenith transmittances measured by sky radiometer at surface: Method development and satellite product validation

    NASA Astrophysics Data System (ADS)

    Khatri, Pradeep; Hayasaka, Tadahiro; Iwabuchi, Hironobu; Takamura, Tamio; Irie, Hitoshi; Nakajima, Takashi Y.; Letu, Husi; Kai, Qin

    2017-04-01

    Clouds are known to have profound impacts on atmospheric radiation, the water budget, climate change, atmosphere-surface interaction, and so on. Cloud optical thickness (COT) and effective radius (Re) are two fundamental cloud parameters required to study clouds from climatological and hydrological points of view. The large spatial and temporal coverage of these cloud parameters from space observations has proved very useful for cloud research; however, validation of space-based products is still a challenging task due to the lack of reliable reference data. Ground-based remote sensing instruments, such as the sky radiometers distributed around the world through the international observation networks SKYNET (http://atmos2.cr.chiba-u.jp/skynet/) and AERONET (https://aeronet.gsfc.nasa.gov/), have great potential to produce ground-truth cloud parameters in different parts of the globe to validate satellite products. For the sky radiometers of SKYNET and AERONET, a few cloud retrieval methods exist, but those methods have difficulty when the cloud is optically thin. This is because the observed transmittances at two wavelengths can originate from more than one set of COT and Re, and the choice of the most plausible set is difficult. At the same time, calibration, especially for the near-infrared (NIR) wavelength that is important for retrieving Re, is also a difficult task at present. As a result, instruments need to be calibrated on a high mountain or calibration terms need to be transferred from a standard instrument. Taking those points into account, we developed a new retrieval method designed to overcome the above-mentioned difficulties. We used observed transmittances at multiple wavelengths to overcome the first problem. We further proposed a method to obtain the calibration constant of the NIR wavelength channel using observation data. Our cloud retrieval method is found to produce relatively accurate COT and Re when validated using
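
    A schematic of a multi-wavelength matching retrieval of this kind: forward-model transmittances on a (COT, Re) grid and pick the pair that best fits the observations in a least-squares sense. The forward model below is a made-up analytic stand-in, not a radiative-transfer code, and the wavelengths and grids are arbitrary assumptions.

```python
# Toy look-up-table retrieval of (COT, Re) from multi-wavelength transmittance.
import numpy as np

wl = np.array([0.87, 1.02, 1.6])                 # micrometres (assumed bands)

def forward(cot, re):
    # Made-up transmittance model: decays with COT, weak Re dependence.
    return np.exp(-0.1 * cot * (1 + 0.02 * re * wl))

obs = forward(17.0, 12.0)
obs = obs * (1 + 0.005 * np.random.default_rng(9).normal(size=wl.size))

cots = np.linspace(1, 60, 120)
res = np.linspace(4, 30, 53)
cost = np.array([[np.sum((forward(c, r) - obs) ** 2) for r in res] for c in cots])
ic, ir = np.unravel_index(cost.argmin(), cost.shape)
print(f"retrieved COT ~ {cots[ic]:.1f}, Re ~ {res[ir]:.1f} um")
```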

  11. Validation of photosynthetic-fluorescence parameters as biomarkers for isoproturon toxic effect on alga Scenedesmus obliquus.

    PubMed

    Dewez, David; Didur, Olivier; Vincent-Héroux, Jonathan; Popovic, Radovan

    2008-01-01

    Photosynthetic-fluorescence parameters were investigated as valid biomarkers of toxicity for the alga Scenedesmus obliquus exposed to isoproturon [3-(4-isopropylphenyl)-1,1-dimethylurea]. Chlorophyll fluorescence induction of algal cells treated with isoproturon showed inactivation of photosystem II (PSII) reaction centers and strong inhibition of PSII electron transport. A linear correlation was found (R² ≥ 0.861) between the change in cell density affected by isoproturon and the change in the effective PSII quantum yield (ΦM'), photochemical quenching (qP) and relative photochemical quenching (qP(rel)) values. The cell density was also linearly dependent (R² = 0.838) on the relative unquenched fluorescence parameter (UQF(rel)). A non-linear correlation was found (R² = 0.937) only between cell density and the energy transfer efficiency from absorbed light to the PSII reaction center (ABS/RC). The order of sensitivity determined by the EC-50% was: UQF(rel) > ΦM' > qP > qP(rel) > ABS/RC. The correlations between cell density and these photosynthetic-fluorescence parameters provide supporting evidence for their use as biomarkers of toxicity for environmental pollutants.

  12. Validation of Slosh Model Parameters and Anti-Slosh Baffle Designs of Propellant Tanks by Using Lateral Slosh Testing

    NASA Technical Reports Server (NTRS)

    Perez, Jose G.; Parks, Russel A.; Lazor, Daniel R.

    2012-01-01

    The slosh dynamics of propellant tanks can be represented by an equivalent mass-pendulum-dashpot mechanical model. The parameters of this equivalent model, identified as slosh mechanical model parameters, are slosh frequency, slosh mass, and pendulum hinge point location. They can be obtained by both analysis and testing for discrete fill levels. Anti-slosh baffles are usually needed in propellant tanks to control the movement of the fluid inside the tank. Lateral slosh testing, involving both random excitation testing and free-decay testing, are performed to validate the slosh mechanical model parameters and the damping added to the fluid by the anti-slosh baffles. Traditional modal analysis procedures were used to extract the parameters from the experimental data. Test setup of sub-scale tanks will be described. A comparison between experimental results and analysis will be presented.

  13. Validation of Slosh Model Parameters and Anti-Slosh Baffle Designs of Propellant Tanks by Using Lateral Slosh Testing

    NASA Technical Reports Server (NTRS)

    Perez, Jose G.; Parks, Russel A.; Lazor, Daniel R.

    2012-01-01

    The slosh dynamics of propellant tanks can be represented by an equivalent pendulum-mass mechanical model. The parameters of this equivalent model, identified as slosh model parameters, are slosh mass, slosh mass center of gravity, slosh frequency, and smooth-wall damping. They can be obtained by both analysis and testing for discrete fill heights. Anti-slosh baffles are usually needed in propellant tanks to control the movement of the fluid inside the tank. Lateral slosh testing, involving both random testing and free-decay testing, are performed to validate the slosh model parameters and the damping added to the fluid by the anti-slosh baffles. Traditional modal analysis procedures are used to extract the parameters from the experimental data. Test setup of sub-scale test articles of cylindrical and spherical shapes will be described. A comparison between experimental results and analysis will be presented.

  14. Results and Validation of MODIS Aerosol Retrievals Over Land and Ocean

    NASA Technical Reports Server (NTRS)

    Remer, Lorraine; Einaudi, Franco (Technical Monitor)

    2001-01-01

    The MODerate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the Terra spacecraft has been retrieving aerosol parameters since late February 2000. Initial qualitative checking of the products showed very promising results including matching of land and ocean retrievals at coastlines. Using AERONET ground-based radiometers as our primary validation tool, we have established quantitative validation as well. Our results show that for most aerosol types, the MODIS products fall within the pre-launch estimated uncertainties. Surface reflectance and aerosol model assumptions appear to be sufficiently accurate for the optical thickness retrieval. Dust provides a possible exception, which may be due to non-spherical effects. Over ocean the MODIS products include information on particle size, and these parameters are also validated with AERONET retrievals.

  15. Results and Validation of MODIS Aerosol Retrievals over Land and Ocean

    NASA Technical Reports Server (NTRS)

    Remer, L. A.; Kaufman, Y. J.; Tanre, D.; Ichoku, C.; Chu, D. A.; Mattoo, S.; Levy, R.; Martins, J. V.; Li, R.-R.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The MODerate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the Terra spacecraft has been retrieving aerosol parameters since late February 2000. Initial qualitative checking of the products showed very promising results including matching of land and ocean retrievals at coastlines. Using AERONET ground-based radiometers as our primary validation tool, we have established quantitative validation as well. Our results show that for most aerosol types, the MODIS products fall within the pre-launch estimated uncertainties. Surface reflectance and aerosol model assumptions appear to be sufficiently accurate for the optical thickness retrieval. Dust provides a possible exception, which may be due to non-spherical effects. Over ocean the MODIS products include information on particle size, and these parameters are also validated with AERONET retrievals.

  16. Computer simulation of Cerebral Arteriovenous Malformation-validation analysis of hemodynamics parameters.

    PubMed

    Kumar, Y Kiran; Mehta, Shashi Bhushan; Ramachandra, Manjunath

    2017-01-01

    The purpose of this work is to provide validation methods for evaluating the hemodynamic assessment of Cerebral Arteriovenous Malformation (CAVM). This article emphasizes the importance of validating noninvasive measurements for CAVM patients, which are designed using lumped models for the complex vessel structure. The validation of the hemodynamic assessment is based on invasive clinical measurements and on cross-validation against Philips' proprietary validated software packages Qflow and 2D Perfusion. The modeling results were validated for 30 CAVM patients at 150 vessel locations. Mean flow, diameter, and pressure were compared between the modeling results and the clinical/cross-validation measurements, using an independent two-tailed Student t test. Exponential regression analysis was used to assess the relationships among blood flow, vessel diameter, and pressure. Univariate analyses of the relationships among vessel diameter, vessel cross-sectional area, AVM volume, AVM pressure, and AVM flow were performed with linear or exponential regression. Modeling results were compared with clinical measurements from vessel locations of cerebral regions, and the model was cross-validated against the Qflow and 2D Perfusion packages. Our results show that the modeling and clinical results match closely, with only small deviations. In this article, we have validated our modeling results with clinical measurements, and a new approach for cross-validation is proposed, demonstrating the accuracy of our results against a validated product in a clinical environment.

  17. Simulation verification techniques study. Subsystem simulation validation techniques

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1974-01-01

    Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters is presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions, and recommendations is also given.

  18. DBCG hypo trial validation of radiotherapy parameters from a national data bank versus manual reporting.

    PubMed

    Brink, Carsten; Lorenzen, Ebbe L; Krogh, Simon Long; Westberg, Jonas; Berg, Martin; Jensen, Ingelise; Thomsen, Mette Skovhus; Yates, Esben Svitzer; Offersen, Birgitte Vrou

    2018-01-01

    The current study evaluates the data quality achievable using a national data bank for reporting radiotherapy parameters, relative to the classical method of manually reporting selected parameters. The data comparison is based on 1522 Danish patients of the DBCG hypo trial with data stored in the Danish national radiotherapy data bank. In line with standard DBCG trial practice, selected parameters were also reported manually to the DBCG database. Categorical variables are compared using contingency tables, and comparisons of continuous parameters are presented in scatter plots. For categorical variables, 25 differences between the data bank and the manual values were located. Of these, 23 were related to mistakes in the manually reported values, whilst the remaining two were wrong classifications in the data bank. The wrong classifications in the data bank were related to a lack of dose information: the two patients had been treated with an electron boost based on a manual calculation, so the data were not exported to the data bank, and this was not detected prior to comparison with the manual data. For a few database fields in the manual data, an ambiguity in the definition of the specific field is seen in the data. This was not the case for the data bank, which extracts all data consistently. In terms of data quality, the data bank is superior to manually reported values. However, there is a need to allocate resources for checking the validity of the available data as well as ensuring that all relevant data are present. The data bank contains more detailed information, and thus facilitates research related to the actual dose distribution in the patients.

  19. Validity of segmental bioelectrical impedance analysis for estimating fat-free mass in children including overweight individuals.

    PubMed

    Ohta, Megumi; Midorikawa, Taishi; Hikihara, Yuki; Masuo, Yoshihisa; Sakamoto, Shizuo; Torii, Suguru; Kawakami, Yasuo; Fukunaga, Tetsuo; Kanehisa, Hiroaki

    2017-02-01

    This study examined the validity of segmental bioelectrical impedance (BI) analysis for predicting the fat-free masses (FFMs) of the whole body and body segments in children, including overweight individuals. The FFM and impedance (Z) values of the arms, trunk, legs, and whole body were determined using dual-energy X-ray absorptiometry and segmental BI analyses, respectively, in 149 boys and girls aged 6 to 12 years, who were divided into model-development (n = 74), cross-validation (n = 35), and overweight (n = 40) groups. Simple regression analysis was applied to (length)²/Z (BI index) for each of the whole body and the 3 segments to develop prediction equations for the measured FFM of the related body part. In the model-development group, the BI index of each of the 3 segments and the whole body was significantly correlated with the measured FFM (R² = 0.867-0.932, standard error of estimation = 0.18-1.44 kg (5.9%-8.7%)). There was no significant difference between the measured and predicted FFM values, and no systematic error. The application of each equation derived in the model-development group to the cross-validation and overweight groups did not produce significant differences between the measured and predicted FFM values or systematic errors, with the exception that the arm FFM in the overweight group was overestimated. Segmental bioelectrical impedance analysis is useful for predicting the FFM of the whole body and of body segments in children, including overweight individuals, although its application for estimating arm FFM in overweight individuals requires a certain modification.
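
    In outline, the prediction-equation step is a simple regression of DXA-measured FFM on the BI index, checked in a held-out group; the sketch below mimics that with synthetic data and an assumed toy relation.

```python
# Develop an FFM prediction equation from the BI index, then cross-validate.
import numpy as np

rng = np.random.default_rng(10)

def make_group(n):
    bi = rng.uniform(10, 40, n)                     # (length)^2 / Z, toy units
    ffm = 0.6 * bi + 2.0 + rng.normal(0, 1.0, n)    # kg, assumed relation
    return bi, ffm

bi_dev, ffm_dev = make_group(74)                    # model-development group
slope, intercept = np.polyfit(bi_dev, ffm_dev, 1)

bi_cv, ffm_cv = make_group(35)                      # cross-validation group
pred = slope * bi_cv + intercept
see = np.sqrt(np.sum((ffm_cv - pred) ** 2) / (len(pred) - 2))
print(f"FFM = {slope:.2f} * BI + {intercept:.2f}; SEE = {see:.2f} kg")
```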

  20. Excavator Design Validation

    NASA Technical Reports Server (NTRS)

    Pholsiri, Chalongrath; English, James; Seberino, Charles; Lim, Yi-Je

    2010-01-01

    The Excavator Design Validation tool verifies excavator designs by automatically generating control systems and modeling their performance in an accurate simulation of their expected environment. Part of this software design includes interfacing with human operations that can be included in simulation-based studies and validation. This is essential for assessing productivity, versatility, and reliability. This software combines automatic control system generation from CAD (computer-aided design) models, rapid validation of complex mechanism designs, and detailed models of the environment including soil, dust, temperature, remote supervision, and communication latency to create a system of high value. Unique algorithms have been created for controlling and simulating complex robotic mechanisms automatically from just a CAD description. These algorithms are implemented as a commercial cross-platform C++ software toolkit that is configurable using the Extensible Markup Language (XML). The algorithms work with virtually any mobile robotic mechanisms using module descriptions that adhere to the XML standard. In addition, high-fidelity, real-time physics-based simulation algorithms have also been developed that include models of internal forces and the forces produced when a mechanism interacts with the outside world. This capability is combined with an innovative organization for simulation algorithms, new regolith simulation methods, and a unique control and study architecture to make powerful tools with the potential to transform the way NASA verifies and compares excavator designs. Energid's Actin software has been leveraged for this design validation. The architecture includes parametric and Monte Carlo studies tailored for validation of excavator designs and their control by remote human operators. It also includes the ability to interface with third-party software and human-input devices. Two types of simulation models have been adapted: high-fidelity discrete

  1. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    PubMed

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important for obtaining quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those of the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCh) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as the international guidelines remain nonbinding protocols that depend on the applied analytical technique and that need to be adapted to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.

  2. Validation of Cardiovascular Parameters during NASA's Functional Task Test

    NASA Technical Reports Server (NTRS)

    Arzeno, N. M.; Stenger, M. B.; Bloomberg, J. J.; Platts, S. H.

    2009-01-01

    Microgravity exposure causes physiological deconditioning and impairs crewmember task performance. The Functional Task Test (FTT) is designed to correlate these physiological changes to performance in a series of operationally-relevant tasks. One of these, the Recovery from Fall/Stand Test (RFST), tests both the ability to recover from a prone position and cardiovascular responses to orthostasis. PURPOSE: Three minutes were chosen for the duration of this test, yet it is unknown if this is long enough to induce cardiovascular responses similar to the operational 5 min stand test. The purpose of this study was to determine the validity and reliability of heart rate variability (HRV) analysis of a 3 min stand and to examine the effect of spaceflight on these measures. METHODS: To determine the validity of using 3 vs. 5 min of standing to assess HRV, ECG was collected from 7 healthy subjects who participated in a 6 min RFST. Mean R-R interval (RR) and spectral HRV were measured in minutes 0-3 and 0-5 following the heart rate transient due to standing. Significant differences between the segments were determined by a paired t-test. To determine the reliability of the 3-min stand test, 13 healthy subjects completed 3 trials of the FTT on separate days, including the RFST with a 3 min stand. Analysis of variance (ANOVA) was performed on the HRV measures. One crewmember completed the FTT before a 14-day mission, on landing day (R+0) and one (R+1) day after returning to Earth. RESULTS VALIDITY: HRV measures reflecting autonomic activity were not significantly different during the 0-3 and 0-5 min segments. RELIABILITY: The average coefficient of variation for RR, systolic (SBP) and diastolic blood pressures during the RFST were less than 8% for the 3 sessions. ANOVA results yielded a greater inter-subject variability (p<0.006) than inter-session variability (p>0.05) for HRV in the RFST. SPACEFLIGHT: Lower RR and higher SBP were observed on R+0 in rest and stand. On R+1
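
    For orientation, spectral HRV of the kind referenced here is commonly computed by resampling the R-R series evenly, estimating the power spectral density, and integrating the LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) bands; the sketch below follows that generic recipe on synthetic beats and is not the study's processing chain.

```python
# Generic LF/HF computation from a synthetic R-R interval series.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(5)
rr = 0.74 + 0.03 * rng.normal(size=300)        # R-R intervals, s (~300 beats)
t_beats = np.cumsum(rr)

fs = 4.0                                        # even resampling rate, Hz
t_even = np.arange(t_beats[0], t_beats[-1], 1 / fs)
rr_even = np.interp(t_even, t_beats, rr)

f, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
lf_band = (f >= 0.04) & (f < 0.15)
hf_band = (f >= 0.15) & (f < 0.40)
lf = np.trapz(psd[lf_band], f[lf_band])
hf = np.trapz(psd[hf_band], f[hf_band])
print(f"LF/HF = {lf / hf:.2f}")
```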

  3. Systems and methods for measuring a parameter of a landfill including a barrier cap and wireless sensor systems and methods

    DOEpatents

    Kunerth, Dennis C.; Svoboda, John M.; Johnson, James T.

    2007-03-06

    A method of measuring a parameter of a landfill including a cap, without passing wires through the cap, includes burying a sensor apparatus in the landfill prior to closing the landfill with the cap; providing a reader capable of communicating with the sensor apparatus via radio frequency (RF); placing an antenna above the barrier, spaced apart from the sensor apparatus; coupling the antenna to the reader either before or after placing the antenna above the barrier; providing power to the sensor apparatus, via the antenna, by generating a field using the reader; accumulating and storing power in the sensor apparatus; sensing a parameter of the landfill using the sensor apparatus while using power; and transmitting the sensed parameter to the reader via a wireless response signal. A system for measuring a parameter of a landfill is also provided.

  4. Fault parameter constraints using relocated earthquakes: A validation of first-motion focal-mechanism data

    USGS Publications Warehouse

    Kilb, Debi; Hardebeck, J.L.

    2006-01-01

    We estimate the strike and dip of three California fault segments (Calaveras, Sargent, and a portion of the San Andreas near San Juan Bautista) based on principal component analysis of accurately located microearthquakes. We compare these fault orientations with two different first-motion focal mechanism catalogs: the Northern California Earthquake Data Center (NCEDC) catalog, calculated using the FPFIT algorithm (Reasenberg and Oppenheimer, 1985), and a catalog created using the HASH algorithm that tests mechanism stability relative to seismic velocity model variations and earthquake location (Hardebeck and Shearer, 2002). We assume any disagreement (misfit >30° in strike, dip, or rake) indicates inaccurate focal mechanisms in the catalogs. With this assumption, we can quantify the parameters that identify the most optimally constrained focal mechanisms. For the NCEDC/FPFIT catalogs, we find that the best quantitative discriminator of quality focal mechanisms is the station distribution ratio (STDR) parameter, an indicator of how the stations are distributed about the focal sphere. Requiring STDR > 0.65 increases the acceptable mechanisms from 34%–37% to 63%–68%. This suggests stations should be uniformly distributed around, rather than aligned along, known fault traces. For the HASH catalogs, the fault plane uncertainty (FPU) parameter is the best discriminator, increasing the percentage of acceptable mechanisms from 63%–78% to 81%–83% when FPU ≤ 35°. The overall higher percentage of acceptable mechanisms and the usefulness of the formal uncertainty in identifying quality mechanisms validate the HASH approach of testing for mechanism stability.
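
    The geometric core of the approach can be sketched compactly: fit a plane to the relocated hypocenters by taking the eigenvector of their covariance with the smallest eigenvalue as the plane normal, then convert the normal to strike and dip. The event cloud below is simulated around a known plane; coordinates are (east, north, up), and the recovered strike carries the usual 180° ambiguity.

```python
# Strike/dip of a seismicity plane via principal component analysis (toy data).
import numpy as np

rng = np.random.default_rng(6)
phi, delta = np.radians(45.0), np.radians(80.0)          # true strike, dip
s_vec = np.array([np.sin(phi), np.cos(phi), 0.0])        # along-strike
d_vec = np.array([np.sin(phi + np.pi / 2) * np.cos(delta),
                  np.cos(phi + np.pi / 2) * np.cos(delta),
                  -np.sin(delta)])                       # down-dip
pts = (rng.normal(0, 5, (500, 1)) * s_vec
       + rng.normal(0, 2, (500, 1)) * d_vec
       + rng.normal(0, 0.1, (500, 3)))                   # off-plane scatter, km

evals, evecs = np.linalg.eigh(np.cov(pts.T))
n = evecs[:, 0]                                   # smallest-eigenvalue axis
dip = np.degrees(np.arccos(abs(n[2])))
strike = np.degrees(np.arctan2(n[1], -n[0])) % 360  # azimuth of n x z-hat
print(f"strike ~ {strike:.1f} deg (mod 180), dip ~ {dip:.1f} deg")
```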

  5. Development and validity of methods for the estimation of temporal gait parameters from heel-attached inertial sensors in younger and older adults.

    PubMed

    Misu, Shogo; Asai, Tsuyoshi; Ono, Rei; Sawa, Ryuichi; Tsutsumimoto, Kota; Ando, Hiroshi; Doi, Takehiko

    2017-09-01

    The heel is likely a suitable location for attaching inertial sensors for the detection of gait events. However, few studies have detected gait events and determined temporal gait parameters using sensors attached to the heels. We developed two methods to determine temporal gait parameters: detecting heel-contact using acceleration data and toe-off using angular velocity data (acceleration-angular velocity method; A-V method), and detecting both heel-contact and toe-off using angular velocity data (angular velocity-angular velocity method; V-V method). The aim of this study was to examine the concurrent validity of the A-V and V-V methods against the standard method, and to compare their accuracy. Temporal gait parameters were measured in 10 younger and 10 older adults. The intra-class correlation coefficients of both methods compared with the standard method were excellent (0.80 to 1.00). The root mean square errors of stance and swing time with the A-V method were smaller than with the V-V method in older adults, although there were no significant discrepancies in the other comparisons. Our study suggests that inertial sensors attached to the heels, using the A-V method in particular, provide a valid measurement of temporal gait parameters. Copyright © 2017 Elsevier B.V. All rights reserved.
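
    Conceptually, the A-V method reduces to event picking on two channels: heel-contact from sharp acceleration peaks and toe-off from an angular-velocity burst, with stance time as the difference. The sketch below applies that idea to synthetic signals with an assumed 1.1 s stride; real detection rules are more elaborate.

```python
# A-V style gait event picking on synthetic heel-sensor signals.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                  # sampling rate, Hz
t = np.arange(0.0, 10.0, 1 / fs)
stride = 1.1                                # s, assumed stride time
acc = np.exp(-((t % stride) / 0.03) ** 2)            # impact at heel-contact
gyro = np.exp(-(((t - 0.7) % stride) / 0.05) ** 2)   # burst near toe-off

hc, _ = find_peaks(acc, height=0.5, distance=int(0.5 * fs))   # heel-contacts
to, _ = find_peaks(gyro, height=0.5, distance=int(0.5 * fs))  # toe-offs

hc_t, to_t = t[hc], t[to]
stance = [toff - hc_t[hc_t < toff].max() for toff in to_t if np.any(hc_t < toff)]
print(f"mean stance time: {np.mean(stance):.2f} s")   # expect ~0.70 s here
```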

  6. The SEGUE Stellar Parameter Pipeline. II. Validation with Galactic Globular and Open Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Y.S.; Beers, T.C.; Sivarani, T.

    2007-10-01

    The authors validate the performance and accuracy of the current SEGUE (Sloan Extension for Galactic Understanding and Exploration) Stellar Parameter Pipeline (SSPP), which determines stellar atmospheric parameters (effective temperature, surface gravity, and metallicity) by comparing derived overall metallicities and radial velocities from selected likely members of three globular clusters (M 13, M 15, and M 2) and two open clusters (NGC 2420 and M 67) to the literature values. Spectroscopic and photometric data obtained during the course of the original Sloan Digital Sky Survey (SDSS-I) and its first extension (SDSS-II/SEGUE) are used to determine stellar radial velocities and atmospheric parameter estimates for stars in these clusters. Based on the scatter in the metallicities derived for the members of each cluster, they quantify the typical uncertainty of the SSPP values, σ([Fe/H]) = 0.13 dex for stars in the range 4500 K ≤ Teff ≤ 7500 K and 2.0 ≤ log g ≤ 5.0, at least over the metallicity interval spanned by the clusters studied (-2.3 ≤ [Fe/H] < 0). The surface gravities and effective temperatures derived by the SSPP are also compared with those estimated from the comparison of the color-magnitude diagrams with stellar evolution models; they find satisfactory agreement. At present, the SSPP underestimates [Fe/H] for near-solar-metallicity stars, represented by members of M 67 in this study, by ≈ 0.3 dex.

  7. Parameter screening: the use of a dummy parameter to identify non-influential parameters in a global sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Khorashadi Zadeh, Farkhondeh; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2017-04-01

    Parameter estimation is a major concern in hydrological modeling, which may limit the use of complex simulators with a large number of parameters. To support the selection of parameters to include in or exclude from the calibration process, Global Sensitivity Analysis (GSA) is widely applied in modeling practice. Based on the results of GSA, the influential and the non-influential parameters are identified (i.e., parameter screening). Nevertheless, the choice of the screening threshold below which parameters are considered non-influential is a critical issue, which has recently received more attention in the GSA literature. In theory, the sensitivity index of a non-influential parameter has a value of zero. However, since numerical approximations, rather than analytical solutions, are utilized in GSA methods to calculate the sensitivity indices, small but non-zero indices may be obtained for non-influential parameters. In order to assess the threshold that identifies non-influential parameters in GSA methods, we propose to calculate the sensitivity index of a "dummy parameter". This dummy parameter has no influence on the model output, but will have a non-zero sensitivity index, representing the error due to the numerical approximation. Hence, the parameters whose indices are above the sensitivity index of the dummy parameter can be classified as influential, whereas the parameters whose indices are below this index are within the range of the numerical error and should be considered non-influential. To demonstrate the effectiveness of the proposed "dummy parameter approach", 26 parameters of a Soil and Water Assessment Tool (SWAT) model are selected to be analyzed and screened, using the variance-based Sobol' and moment-independent PAWN methods. The sensitivity index of the dummy parameter is calculated from sampled data, without changing the model equations. Moreover, the calculation does not even require additional model evaluations for the Sobol
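
    The idea transfers directly to a basic Saltelli-type estimator: append an extra input column the model ignores and use its estimated index as the numerical noise floor. The toy below is a generic variance-based demonstration, not SWAT and not the PAWN method.

```python
# Dummy-parameter screening with a simple first-order Sobol' estimator.
import numpy as np

def model(X):
    return X[:, 0] ** 2 + 0.2 * X[:, 1]   # x3 (the dummy) is ignored

rng = np.random.default_rng(7)
N, k = 20000, 3
A = rng.uniform(size=(N, k))
B = rng.uniform(size=(N, k))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i, name in enumerate(["x1", "x2", "dummy"]):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                   # swap column i from the B sample
    Si = np.mean(fB * (model(ABi) - fA)) / var   # Saltelli (2010) estimator
    print(f"S_{name} = {Si:.3f}")         # dummy index ~ numerical noise floor
```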

  8. Validation and Parameter Sensitivity Tests for Reconstructing Swell Field Based on an Ensemble Kalman Filter

    PubMed Central

    Wang, Xuan; Tandeo, Pierre; Fablet, Ronan; Husson, Romain; Guan, Lei; Chen, Ge

    2016-01-01

    The swell propagation model built on geometric optics is known to work well when simulating swells radiated from a distant storm. Based on this simple approximation, satellites have acquired a large number of samples of basin-traversing swells induced by fierce storms situated in mid-latitudes. How to routinely reconstruct swell fields from these irregularly sampled space-borne observations via known swell propagation principles requires further examination. In this study, we apply 3-h interval pseudo SAR observations in an ensemble Kalman filter (EnKF) to reconstruct a swell field in an ocean basin, and compare it with buoy swell partitions and polynomial regression results. As validated against in situ measurements, the EnKF works well in terms of spatial-temporal consistency in far-field swell propagation scenarios. Using this framework, we further address the influence of the EnKF parameters, and perform a sensitivity analysis to evaluate estimations made under different sets of parameters. Such analysis is of key interest with respect to future multiple-source routinely recorded swell field data. Satellite-derived swell data can serve as a valuable complementary dataset to in situ or wave re-analysis datasets. PMID:27898005
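
    At its core the reconstruction relies on the standard stochastic EnKF analysis step; a bare-bones version is sketched below with arbitrary state and observation sizes (placeholders, not the swell-field configuration).

```python
# Stochastic EnKF analysis step with perturbed observations (toy sizes).
import numpy as np

rng = np.random.default_rng(8)
n, m, Ne = 10, 3, 50                     # state dim, obs dim, ensemble size
H = np.zeros((m, n))
H[0, 0] = H[1, 4] = H[2, 9] = 1.0        # observe three state components
R = 0.05 * np.eye(m)

truth = np.sin(np.linspace(0, np.pi, n))
obs = H @ truth + rng.multivariate_normal(np.zeros(m), R)

Xf = (truth[None, :] + rng.normal(0, 0.3, (Ne, n))).T    # forecast ensemble
anom = Xf - Xf.mean(axis=1, keepdims=True)
Pf = anom @ anom.T / (Ne - 1)                            # sample covariance

K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)           # Kalman gain
obs_pert = obs[:, None] + rng.multivariate_normal(np.zeros(m), R, Ne).T
Xa = Xf + K @ (obs_pert - H @ Xf)                        # analysis ensemble
print("forecast RMSE:", np.sqrt(np.mean((Xf.mean(1) - truth) ** 2)))
print("analysis RMSE:", np.sqrt(np.mean((Xa.mean(1) - truth) ** 2)))
```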

  9. Evaluation of MuSyQ land surface albedo based on LAnd surface Parameters VAlidation System (LAPVAS)

    NASA Astrophysics Data System (ADS)

    Dou, B.; Wen, J.; Xinwen, L.; Zhiming, F.; Wu, S.; Zhang, Y.

    2016-12-01

    Satellite-derived land surface albedo is an essential climate variable which controls the Earth's energy budget, and it can be used in applications such as climate change, hydrology, and numerical weather prediction. However, the accuracy and uncertainty of surface albedo products should be evaluated against reliable reference truth data prior to application. A new comprehensive and systematic project in China, called the Remote Sensing Application Network (CRSAN), has been launched in recent years. Two subjects of this project are the development of a Multi-source data Synergized Quantitative remote sensing production system (MuSyQ) and of a web-based validation system named the LAnd surface Parameters VAlidation System (LAPVAS), which aim, respectively, to generate quantitative remote sensing products for ecosystem and environmental monitoring and to validate them against reference validation data within a standard validation system. Land surface BRDF/albedo is one of the product datasets of MuSyQ; it has a pentad (5-day) period with 1 km spatial resolution and is derived by the Multi-sensor Combined BRDF Inversion (MCBI) model. In this evaluation of the MuSyQ albedo, a multi-validation strategy is implemented by LAPVAS, including direct and multi-scale validation with field-measured albedo and cross validation with the MODIS albedo product over different land covers. The results reveal that the MuSyQ albedo data, with their 5-day temporal resolution, show higher sensitivity and accuracy during periods of land cover change, e.g., snowfall. Disregarding snow and changed land cover, the MuSyQ albedo is generally of similar accuracy to the MODIS albedo and meets the climate modeling requirement of an absolute accuracy of 0.05.

  10. Fracture mechanics validity limits

    NASA Technical Reports Server (NTRS)

    Lambert, Dennis M.; Ernst, Hugo A.

    1994-01-01

    Fracture behavior is characterized by a dramatic loss of strength compared to elastic deformation behavior. Fracture parameters have been developed, and each exhibits a range within which it is valid for predicting growth. Each is limited by the assumptions made in its development: all are defined within a specific context. For example, the stress intensity parameter, K, and the crack driving force, G, are derived using an assumption of linear elasticity. To use K or G, the zone of plasticity must be small compared to the physical dimensions of the object being loaded. This ensures an elastic response, and in this context, K and G work well. Rice's J-integral has been used beyond the limits imposed on K and G. J requires an assumption of nonlinear elasticity, which is not characteristic of real material behavior, but is thought to be a reasonable approximation if unloading is kept to a minimum. As well, the constraint cannot change dramatically (typically, the crack extension is limited to ten percent of the initial remaining ligament length). Rice et al. investigated the properties required of J-type parameters, J(sub x), and showed that the time rate, dJ(sub x)/dt, must not be a function of the crack extension rate, da/dt. Ernst devised the modified-J parameter, J(sub M), that meets this criterion. J(sub M) correlates fracture data to much higher crack growth than does J. Ultimately, a limit to the validity of J(sub M) is anticipated, and this has been estimated to be at a crack extension of about 40 percent of the initial remaining ligament length. None of the various parameters can be expected to describe fracture in an environment of gross plasticity, in which case the process is better described by deformation parameters, e.g., stress and strain. In the current study, various schemes to identify the onset of plasticity-dominated behavior, i.e., the end of fracture mechanics validity, are presented. Each validity limit parameter is developed in
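
    For reference, the standard linear-elastic relation between the stress intensity factor K and the crack driving force G mentioned above (E is Young's modulus, \nu is Poisson's ratio):

        G = \frac{K^2}{E} \quad \text{(plane stress)}, \qquad
        G = \frac{K^2 (1 - \nu^2)}{E} \quad \text{(plane strain)}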

  11. Validation of Cardiovascular Parameters During NASA's Functional Task Test

    NASA Technical Reports Server (NTRS)

    Arzeno, N. M.; Stenger, M. B.; Bloomberg, J. J.; Platts, Steven H.

    2008-01-01

    Microgravity-induced physiological changes, including cardiovascular deconditioning, may impair crewmembers' capabilities during exploration missions on the Moon and Mars. The Functional Task Test (FTT), which will be used to assess task performance in short- and long-duration astronauts, consists of 7 functional tests to evaluate crewmembers' ability to perform activities to be conducted in a partial-gravity environment or following an emergency landing on Earth. The Recovery from Fall/Stand Test (RFST) tests both the subject's ability to get up from a prone position and orthostatic intolerance. PURPOSE: Crewmembers have never become presyncopal in the first 3 min of quiet stand, yet it is unknown whether 3 min is long enough to cause heart rate fluctuations similar to a 5-min stand. The purpose of this study was to validate and test the reliability of heart rate variability (HRV) analysis of a 3-min quiet stand. METHODS: To determine the validity of using 3 vs. 5 min of standing to assess HRV, 7 healthy subjects remained in a prone position for 2 min, stood up quickly and stood quietly for 6 min. ECG and continuous blood pressure data were recorded. Mean R-R interval and spectral HRV were measured in minutes 0-3 and 0-5 following the heart rate transient due to standing. Significant differences between the segments were determined by a paired t-test. To determine the reliability of the 3-min stand test, 13 healthy subjects completed 3 trials of the complete FTT on separate days, including the RFST with a 3-min stand test. Analysis of variance (ANOVA) was performed on the HRV measures. RESULTS: Spectral HRV measures reflecting autonomic activity were not different (p>0.05) during the 0-3 and 0-5 min segments (mean R-R interval: 738+/-74 ms, 728+/-69 ms; low frequency to high frequency ratio: 6.5+/-2.2, 7.7+/-2.7; normalized high frequency: 0.19+/-0.03, 0.18+/-0.04). The average coefficient of variation for mean R-R interval, systolic and diastolic blood pressures
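
    As an illustration of the spectral HRV measures reported above, a short sketch using SciPy (a generic frequency-domain HRV computation on a synthetic tachogram, not the study's analysis pipeline):

        import numpy as np
        from scipy.signal import welch
        from scipy.interpolate import interp1d

        def spectral_hrv(rr_ms, fs=4.0):
            """LF/HF ratio and normalized HF from a sequence of R-R intervals (ms)."""
            t = np.cumsum(rr_ms) / 1000.0                  # beat times in seconds
            # Resample the unevenly spaced tachogram onto a uniform grid
            grid = np.arange(t[0], t[-1], 1.0 / fs)
            rr_even = interp1d(t, rr_ms, kind="cubic")(grid)
            f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(grid)))
            lf_band = (f >= 0.04) & (f < 0.15)
            hf_band = (f >= 0.15) & (f < 0.40)
            lf = np.trapz(pxx[lf_band], f[lf_band])
            hf = np.trapz(pxx[hf_band], f[hf_band])
            return {"mean_rr": rr_ms.mean(), "lf_hf": lf / hf, "hf_norm": hf / (lf + hf)}

        # Synthetic beats at roughly 75 bpm
        rr = 60000.0 / np.random.default_rng(0).normal(75, 3, 300)
        print(spectral_hrv(rr))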

  12. Diagnostic validity of hematologic parameters in evaluation of massive pulmonary embolism.

    PubMed

    Ates, Hale; Ates, Ihsan; Kundi, Harun; Yilmaz, Fatma Meric

    2017-09-01

    The aim of this study was to determine the hematologic parameter with the highest diagnostic differentiation in the identification of massive acute pulmonary embolism (APE). A retrospective study was performed on patients diagnosed with APE between June 2014 and June 2016. All radiological and laboratory parameters of the patients were retrieved through the electronic information management system of the hospital. PLR was obtained as the ratio of platelet count to lymphocyte count, NLR as the ratio of neutrophil count to lymphocyte count, WMR as the ratio of white blood cell count to mean platelet volume, MPR as the ratio of mean platelet volume to platelet count, and RPR as the ratio of red cell distribution width to platelet count. Six hundred and thirty-nine patients consisting of 292 males (45.7%) and 347 females (54.3%) were included in the research. Independent predictors of massive risk as compared to the sub-massive group were: pulmonary arterial systolic pressure (PASP) (OR=1.40; P=.001), PLR (OR=1.59; P<.001), NLR (OR=2.22; P<.001), WMR (OR=1.22; P<.001), MPR (OR=0.33; P<.001), and RPR (OR=0.68; P<.001). Upon evaluation of the diagnostic differentiation of these risk factors for massive APE by receiver operating characteristic curve analysis, it was determined that PLR (AUC±SE=0.877±0.015; P<.001) and NLR (AUC±SE=0.893±0.013; P<.001) have similar diagnostic differentiation in diagnosing massive APE, and that these two parameters are superior to PASP, MPR, WMR, and RPR. We determined that the levels of NLR and PLR are superior to other parameters in the determination of clinical severity in APE cases. © 2016 Wiley Periodicals, Inc.
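
    The AUC comparison described above can be reproduced in outline with scikit-learn (synthetic marker values and labels, purely illustrative):

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        y = rng.integers(0, 2, 639)                   # 1 = massive APE (toy labels)
        nlr = rng.normal(loc=3 + 2 * y, scale=1.0)    # marker elevated in massive cases
        plr = rng.normal(loc=120 + 60 * y, scale=30.0)

        print("AUC NLR:", roc_auc_score(y, nlr))
        print("AUC PLR:", roc_auc_score(y, plr))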

  13. Comparative assessment of bioanalytical method validation guidelines for pharmaceutical industry.

    PubMed

    Kadian, Naveen; Raju, Kanumuri Siva Rama; Rashid, Mamunur; Malik, Mohd Yaseen; Taneja, Isha; Wahajuddin, Muhammad

    2016-07-15

    The concepts, importance, and application of bioanalytical method validation have been discussed for a long time, and validation of bioanalytical methods is widely accepted as pivotal before they are taken into routine use. The United States Food and Drug Administration (USFDA) guidelines issued in 2001 have been referred to by every guideline released ever since, be it the European Medicines Agency (EMA) in Europe, the National Health Surveillance Agency (ANVISA) in Brazil, the Ministry of Health, Labour and Welfare (MHLW) in Japan, or any other guideline on bioanalytical method validation. After 12 years, the USFDA released its new draft guideline for comments in 2013, which covers the latest parameters and topics encountered in bioanalytical method validation and moves toward harmonization of bioanalytical method validation across the globe. Even though the regulatory agencies are in general agreement, significant variations exist in acceptance criteria and methodology. The present review highlights the variations, similarities and comparisons between bioanalytical method validation guidelines issued by major regulatory authorities worldwide. Additionally, other evaluation parameters such as matrix effect and incurred sample reanalysis, including other stability aspects, are discussed to ease the design of a bioanalytical method and its validation in compliance with the majority of drug authority guidelines. Copyright © 2016. Published by Elsevier B.V.

  14. Validating proposed migration equation and parameters' values as a tool to reproduce and predict 137Cs vertical migration activity in Spanish soils.

    PubMed

    Olondo, C; Legarda, F; Herranz, M; Idoeta, R

    2017-04-01

    This paper shows the procedure performed to validate the migration equation and the migration parameters' values presented in a previous paper (Legarda et al., 2011) regarding the migration of 137Cs in Spanish mainland soils. In this paper, this model validation has been carried out by checking experimentally obtained activity concentration values against those predicted by the model. The experimental data come from the measured vertical activity profiles of 8 new sampling points located in northern Spain. Before testing the predicted values of the model, the uncertainty of those values was assessed with an appropriate uncertainty analysis. Once the uncertainty of the model was established, the experimental activity concentration values were compared with the model-predicted ones. Model validation was performed by analyzing the model's accuracy, both as a whole and at different depth intervals. As a result, this model has been validated as a tool to predict 137Cs behaviour in a Mediterranean environment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Validation conform ISO-15189 of assays in the field of autoimmunity: Joint efforts in The Netherlands.

    PubMed

    Mulder, Leontine; van der Molen, Renate; Koelman, Carin; van Leeuwen, Ester; Roos, Anja; Damoiseaux, Jan

    2018-05-01

    ISO 15189:2012 requires validation of methods used in the medical laboratory, and lists a series of performance parameters to consider for inclusion. Although these performance parameters are feasible for clinical chemistry analytes, their application in the validation of autoimmunity tests is a challenge. The lack of gold standards or reference methods, in combination with the scarcity of well-defined diagnostic samples from patients with rare diseases, makes validation of new assays difficult. The present manuscript describes the initiative of Dutch medical immunology laboratory specialists to combine efforts and perform multi-center validation studies of new assays in the field of autoimmunity. Validation data and reports are made available to interested Dutch laboratory specialists. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Cross-validation pitfalls when selecting and assessing regression and classification models.

    PubMed

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
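
    A compact scikit-learn sketch of the two ideas named in the abstract, repeated grid-search cross-validation and nested cross-validation (toy data; the estimator, grid and repeat counts are placeholders, not the authors' QSAR setup):

        import numpy as np
        from sklearn.datasets import make_regression
        from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
        from sklearn.linear_model import Ridge

        X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)
        param_grid = {"alpha": np.logspace(-3, 3, 13)}

        # Repeated grid search: average tuning results over several CV splits
        # to reduce the variance introduced by any single partitioning.
        selected = []
        for seed in range(10):
            inner = KFold(n_splits=5, shuffle=True, random_state=seed)
            gs = GridSearchCV(Ridge(), param_grid, cv=inner).fit(X, y)
            selected.append(gs.best_params_["alpha"])
        print("selected alphas over repeats:", selected)

        # Nested CV for an unbiased estimate of the prediction error of the
        # whole (tuning + fitting) procedure.
        outer = KFold(n_splits=5, shuffle=True, random_state=42)
        nested = cross_val_score(GridSearchCV(Ridge(), param_grid, cv=5), X, y, cv=outer)
        print("nested CV score: %.3f +/- %.3f" % (nested.mean(), nested.std()))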

  17. Validity of the temporal-to-nasal macular ganglion cell-inner plexiform layer thickness ratio as a diagnostic parameter in early glaucoma.

    PubMed

    Park, Jung-Won; Jung, Hyun-Ho; Heo, Hwan; Park, Sang-Woo

    2015-08-01

    To evaluate the diagnostic validity of the temporal-to-nasal macular ganglion cell-inner plexiform layer thickness (TNM) ratio using Cirrus high-definition optical coherence tomography (HD-OCT) in patients with early glaucomatous damage. Enrolled participants included 130 normal controls, 50 patients with preperimetric glaucoma and 106 patients with early glaucoma. The patients with early glaucoma were classified into two subgroups according to the pattern of the visual field (VF) defects: paracentral scotoma (PCS, n = 54) and peripheral scotoma (PPS, n = 52). The thickness of the macular ganglion cell-inner plexiform layer (mGCIPL) and circumpapillary retinal nerve fibre layer (cpRNFL) was measured by Cirrus HD-OCT, and the average, superior and inferior TNM ratios were calculated. The average TNM ratio is the sum of the superotemporal and inferotemporal mGCIPL thicknesses divided by the sum of the superonasal and inferonasal mGCIPL thicknesses. The area under the receiver operating characteristic curve (AROC) of each parameter was compared between the groups. The parameters with the best AROC were the average TNM ratio and inferotemporal mGCIPL thickness in the PCS group and the average cpRNFL thickness in the PPS group. The AROCs of the average, superior and inferior TNM ratios (p < 0.001, p = 0.007 and p < 0.001, respectively) and of the minimum, average, inferotemporal and inferior mGCIPL thicknesses (p = 0.004, p = 0.003, p = 0.002 and p = 0.001, respectively) of the PCS group were significantly higher than those of the PPS group. However, the AROCs of the cpRNFL thickness parameters did not show statistically significant differences between the two subgroups. Asymmetry of temporal-to-nasal mGCIPL thickness could be an important parameter in the diagnosis of early glaucoma with paracentral VF defects. © 2015 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  18. A PARMELA model of the CEBAF injector valid over a wide range of beam parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuhong Zhang; Kevin Beard; Jay Benesch; Yu-Chiu Chao; Arne Freyberger; Joseph Grames; Reza Kazimi; Geoff Krafft; Rui Li; Lia Merminga; Matt Poelker; Michael Tiefenback; Byung Yunn

    An earlier PARMELA model of the Jefferson Lab CEBAF photoinjector was recently revised. The initial phase space distribution of an electron bunch was determined by measuring the spot size and pulse length of the driver laser and by beam emittance measurements. The improved model has been used for simulations of the simultaneous delivery of the Hall A beam required for a hypernuclear experiment and the Hall C beam required for the G0 parity violation experiment.

  19. Utility of pedometers for assessing physical activity: construct validity.

    PubMed

    Tudor-Locke, Catrine; Williams, Joel E; Reis, Jared P; Pluto, Delores

    2004-01-01

    Valid assessment of physical activity is necessary to fully understand this important health-related behaviour for research, surveillance, intervention and evaluation purposes. This article is the second in a companion set exploring the validity of pedometer-assessed physical activity. The previous article, published in Sports Medicine, dealt with convergent validity (i.e. the extent to which an instrument's output is associated with that of other instruments intended to measure the same exposure of interest). The present focus is on construct validity, the extent to which the measurement corresponds with other measures of theoretically related parameters. Construct validity is typically evaluated by correlational analysis, that is, by the magnitude of concordance between two measures (e.g. pedometer-determined steps/day and a theoretically related parameter such as age, anthropometric measures or fitness). A systematic literature review produced 29 articles published in or after 1980 directly relevant to the construct validity of pedometers in relation to age, anthropometric measures and fitness. Reported correlations were combined and a median r-value was computed. Overall, there was a weak inverse relationship (median r = -0.21) between age and pedometer-determined physical activity. A weak inverse relationship was also apparent with both body mass index and percentage overweight (median r = -0.27 and r = -0.22, respectively). Positive relationships regarding indicators of fitness ranged from weak to moderate depending on the fitness measure utilised: 6-minute walk test (median r = 0.69), timed treadmill test (median r = 0.41) and estimated maximum oxygen uptake (median r = 0.22). Studies are warranted to assess the relationship of pedometer-determined physical activity with other important health-related outcomes including blood pressure and physiological parameters such as blood glucose and lipid profiles. The aggregated evidence of convergent

  20. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    PubMed

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as the development and maintenance of expertise, the maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  1. HIV Model Parameter Estimates from Interruption Trial Data including Drug Efficacy and Reservoir Dynamics

    PubMed Central

    Luo, Rutao; Piovoso, Michael J.; Martinez-Picado, Javier; Zurakowski, Ryan

    2012-01-01

    Mathematical models based on ordinary differential equations (ODE) have had significant impact on understanding HIV disease dynamics and optimizing patient treatment. A model that characterizes the essential disease dynamics can be used for prediction only if the model parameters are identifiable from clinical data. Most previous parameter identification studies for HIV have used sparsely sampled data from the decay phase following the introduction of therapy. In this paper, model parameters are identified from frequently sampled viral-load data taken from ten patients enrolled in the previously published AutoVac HAART interruption study, providing between 69 and 114 viral load measurements from 3–5 phases of viral decay and rebound for each patient. This dataset is considerably larger than those used in previously published parameter estimation studies. Furthermore, the measurements come from two separate experimental conditions, which allows for the direct estimation of drug efficacy and reservoir contribution rates, two parameters that cannot be identified from decay-phase data alone. A Markov-Chain Monte-Carlo method is used to estimate the model parameter values, with initial estimates obtained using nonlinear least-squares methods. The posterior distributions of the parameter estimates are reported and compared for all patients. PMID:22815727
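
    As a pointer to the estimation machinery, a minimal sketch of fitting a single-exponential viral-decay model to synthetic viral-load data by nonlinear least squares (a deliberately simplified model; the paper's full ODE system and MCMC sampler are not reproduced here):

        import numpy as np
        from scipy.optimize import least_squares

        def viral_load(t, v0, c):
            # Simplest decay-phase model: V(t) = V0 * exp(-c t)
            return v0 * np.exp(-c * t)

        t_obs = np.linspace(0, 14, 15)                  # days
        rng = np.random.default_rng(1)
        v_obs = viral_load(t_obs, 5e4, 0.45) * rng.lognormal(0, 0.2, t_obs.size)

        def residuals(theta):
            v0, c = np.exp(theta)                       # log-parameterization keeps params positive
            return np.log(viral_load(t_obs, v0, c)) - np.log(v_obs)

        fit = least_squares(residuals, x0=np.log([1e4, 0.1]))
        v0_hat, c_hat = np.exp(fit.x)
        print(f"V0 = {v0_hat:.3g}, clearance = {c_hat:.3f} /day")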

  2. A systematic review of validated methods for identifying anaphylaxis, including anaphylactic shock and angioneurotic edema, using administrative and claims data.

    PubMed

    Schneider, Gary; Kachroo, Sumesh; Jones, Natalie; Crean, Sheila; Rotella, Philip; Avetisyan, Ruzan; Reynolds, Matthew W

    2012-01-01

    The Food and Drug Administration's Mini-Sentinel pilot program initially aims to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest from administrative and claims data. This article summarizes the process and findings of the algorithm review of anaphylaxis. PubMed and Iowa Drug Information Service searches were conducted to identify citations applicable to the anaphylaxis health outcome of interest. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles that used administrative and claims data to identify anaphylaxis and that included validation estimates of the coding algorithms. Our search revealed limited literature focusing on anaphylaxis that provided administrative and claims data-based algorithms and validation estimates. Only four studies identified via the literature searches provided validated algorithms; however, two additional studies were identified by Mini-Sentinel collaborators and were incorporated. The International Classification of Diseases, Ninth Revision, codes varied, as did the positive predictive value, depending on the cohort characteristics and the specific codes used to identify anaphylaxis. Research needs to be conducted on designing validation studies to test anaphylaxis algorithms and estimating their predictive power, sensitivity, and specificity. Copyright © 2012 John Wiley & Sons, Ltd.

  3. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    NASA Astrophysics Data System (ADS)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One advantage of analytical method validation is the level of confidence it provides in the measurement results reported for a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used across extremely wide areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a set of binary Cu-Au alloys of different compositions, was used during the validation protocol, following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested for accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed, including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
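
    The kind of uncertainty model the record describes, combining independent systematic and random contributions and expanding with a coverage factor k, is conventionally written in standard GUM notation (assumed here, since the record does not reproduce the authors' budget) as:

        u_c(y) = \sqrt{u_{\text{sys}}^2(y) + u_{\text{rand}}^2(y)}, \qquad
        U = k \, u_c(y), \quad k \approx 2 \ (\sim 95\%\ \text{coverage})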

  4. The 4-parameter Compressible Packing Model (CPM) including a critical cavity size ratio

    NASA Astrophysics Data System (ADS)

    Roquier, Gerard

    2017-06-01

    The 4-parameter Compressible Packing Model (CPM) has been developed to predict the packing density of mixtures of bidisperse spherical particles. The four parameters are: the wall effect and the loosening effect coefficients, the compaction index and a critical cavity size ratio. The two geometrical interactions have been studied theoretically on the basis of a spherical cell centered on a secondary-class bead. For the loosening effect, a critical cavity size ratio, below which a fine particle can be inserted into a small cavity created by touching coarser particles, is introduced. This is the only parameter which requires adaptation to extend the model to other types of particles. The 4-parameter CPM demonstrates its efficiency on frictionless glass beads (300 values), numerically simulated spherical particles (20 values), round natural particles (125 values) and crushed particles (335 values), with correlation coefficients of 99.0%, 98.7%, 97.8% and 96.4%, respectively, and mean deviations of 0.007, 0.006, 0.007 and 0.010, respectively.

  5. Calculation of Optical Parameters of Liquid Crystals

    NASA Astrophysics Data System (ADS)

    Kumar, A.

    2007-12-01

    Validation of a modified four-parameter model describing the temperature effect on liquid crystal refractive indices is reported in the present article. This model is based upon the Vuks equation. Experimental data on the ordinary and extraordinary refractive indices of two liquid crystal samples, MLC-9200-000 and MLC-6608, are used to validate the above-mentioned theoretical model. Using these experimental data, the birefringence, order parameter, normalized polarizabilities, and temperature gradient of the refractive indices are determined. Two methods, one directly using birefringence measurements and the other using Haller's extrapolation procedure, are adopted for the determination of the order parameter. Both approaches to order parameter calculation are compared. The temperature dependences of all these parameters are discussed. A close agreement between theory and experiment is obtained.
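
    For context, the Vuks equation underlying the model is commonly quoted in the form below (an assumption, since the record does not reproduce it); n_e and n_o are the extraordinary and ordinary refractive indices, N the molecular packing density and \alpha_{e,o} the corresponding polarizabilities:

        \frac{n_{e,o}^2 - 1}{\langle n^2 \rangle + 2} = \frac{4\pi}{3} N \alpha_{e,o},
        \qquad \langle n^2 \rangle = \frac{n_e^2 + 2 n_o^2}{3}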

  6. Validation of refraction and anterior segment parameters by a new multi-diagnostic platform (VX120).

    PubMed

    Gordon-Shaag, Ariela; Piñero, David P; Kahloun, Cyril; Markov, David; Parnes, Tzadok; Gantz, Liat; Shneor, Einat

    2018-03-08

    The VX120 (Visionix Luneau, France) is a novel multi-diagnostic platform that combines Hartmann-Shack based autorefraction, Placido-disk based corneal topography and anterior segment measurements made with a stationary Scheimpflug camera. We investigate the agreement between different parameters measured by the VX120 and accepted or gold-standard techniques to test whether they are interchangeable, and we evaluate repeatability and reproducibility. The right eyes of healthy subjects were included in the study. VX120 autorefraction was compared to subjective refraction. Anterior segment parameters, including autokeratometry, central corneal thickness (CCT) and iridocorneal angle (IA), were compared to the Sirius (CSO, Italy). Inter- and intra-test repeatability of the above parameters was assessed. Results were analyzed using Bland-Altman analyses. A total of 164 eyes were evaluated. The mean difference between VX120 autorefraction and subjective refraction for sphere, spherical equivalent (SE), and cylinder was 0.01±0.43D, 0.14±0.47D, and -0.26±0.30D, respectively, and high correlation was found for all parameters (r>0.75) except J45 (r=0.61). The mean difference between the VX120 and the Sirius system for CCT, IA, and keratometry (k1 and k2) was -3.51±8.64μm, 7.6±4.2°, 0.003±0.06mm and 0.004±0.04mm, respectively, and high correlation was found for all parameters (r>0.97) except IA (r=0.67). Intrasession repeatability of VX120 refraction, CCT, IA and keratometry yielded low within-subject standard deviations. Inter-session repeatability showed no statistically significant difference for most of the parameters measured. The VX120 provides consistent refraction and most anterior segment measurements in normal healthy eyes, with high levels of intra- and inter-session repeatability. Copyright © 2018. Published by Elsevier España, S.L.U.

  7. Quantitative validation of carbon-fiber laminate low velocity impact simulations

    DOE PAGES

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    2015-09-26

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed and described in conjunction. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.

  8. 40 CFR 761.389 - Testing parameter requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... variable testing parameters described in this section which may be used in the validation study. The conditions demonstrated in the validation study for these variables shall become the required conditions for.... During the validation study, use the same ratio of contaminated surface area to soak solvent volume as...

  9. 40 CFR 761.389 - Testing parameter requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... variable testing parameters described in this section which may be used in the validation study. The conditions demonstrated in the validation study for these variables shall become the required conditions for.... During the validation study, use the same ratio of contaminated surface area to soak solvent volume as...

  10. 40 CFR 761.389 - Testing parameter requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... variable testing parameters described in this section which may be used in the validation study. The conditions demonstrated in the validation study for these variables shall become the required conditions for.... During the validation study, use the same ratio of contaminated surface area to soak solvent volume as...

  11. Lumped-parameters equivalent circuit for condenser microphones modeling.

    PubMed

    Esteves, Josué; Rufer, Libor; Ekeom, Didace; Basrour, Skandar

    2017-10-01

    This work presents a lumped-parameter equivalent model of a condenser microphone based on analogies between the acoustic, mechanical, fluidic, and electrical domains. Parameters of the model were determined mainly through analytical relations and/or finite element method (FEM) simulations. Special attention was paid to the modeling of the air gap and to the use of proper boundary conditions. The corresponding lumped parameters were obtained as results of FEM simulations. Because of its simplicity, the model allows fast simulation and is readily usable for microphone design. This work shows the validation of the equivalent circuit on three real cases of capacitive microphones, including both traditional and Micro-Electro-Mechanical Systems structures. In all cases, it has been demonstrated that the sensitivity and other related data obtained from the equivalent circuit are in very good agreement with available measurement data.

  12. Determination of polarimetric parameters of honey by near-infrared transflectance spectroscopy.

    PubMed

    García-Alvarez, M; Ceresuela, S; Huidobro, J F; Hermida, M; Rodríguez-Otero, J L

    2002-01-30

    NIR transflectance spectroscopy was used to determine polarimetric parameters (direct polarization, polarization after inversion, specific rotation in dry matter, and polarization due to nonmonosaccharides) and sucrose in honey. In total, 156 honey samples were collected during 1992 (45 samples), 1995 (56 samples), and 1996 (55 samples). Samples were analyzed by NIR spectroscopy and by polarimetric methods. Calibration (118 samples) and validation (38 samples) sets were made up; honeys from the three years were included in both sets. Calibrations were performed by modified partial least-squares regression, with scatter correction by standard normal variate (SNV) and detrend methods. For direct polarization, polarization after inversion, specific rotation in dry matter, and polarization due to nonmonosaccharides, good statistics (bias, SEV, and R(2)) were obtained for the validation set, and no statistically significant (p = 0.05) differences were found between the instrumental and polarimetric methods for these parameters. Statistical data for sucrose were not as good as those for the other parameters. Therefore, NIR spectroscopy is not an effective method for quantitative analysis of sucrose in these honey samples. However, NIR spectroscopy may be an acceptable method for semiquantitative evaluation of sucrose in honeys, such as those in our study, containing up to 3% sucrose. Further work is necessary to validate the uncertainty at higher levels.
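
    A schematic calibration/validation workflow of the kind described, with SNV scatter correction and PLS regression (scikit-learn's standard PLS rather than the modified PLS used in the study; data are synthetic):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics import r2_score

        def snv(spectra):
            # Standard normal variate: center and scale each spectrum individually
            mu = spectra.mean(axis=1, keepdims=True)
            sd = spectra.std(axis=1, keepdims=True)
            return (spectra - mu) / sd

        rng = np.random.default_rng(0)
        X = rng.normal(size=(156, 700))           # 156 samples x 700 wavelengths (synthetic)
        y = X[:, 100] * 2.0 + rng.normal(scale=0.1, size=156)  # synthetic reference values

        X_cal, y_cal = snv(X[:118]), y[:118]      # 118-sample calibration set
        X_val, y_val = snv(X[118:]), y[118:]      # 38-sample validation set

        pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
        y_hat = pls.predict(X_val).ravel()
        sev = np.sqrt(np.mean((y_val - y_hat) ** 2))   # standard error of validation
        print(f"R2 = {r2_score(y_val, y_hat):.3f}, SEV = {sev:.3f}, "
              f"bias = {(y_hat - y_val).mean():.3f}")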

  13. Experimental Validation and Combustion Modeling of a JP-8 Surrogate in a Single Cylinder Diesel Engine

    DTIC Science & Technology

    2014-04-15

    Shrestha, Amit; Joshi, Umashankar; Zheng, Ziliang; Badawy, Tamer; Henein, Naeim A. (Wayne State University, Detroit, MI, USA). 2014-03-13. The report validates a two-component JP-8 surrogate in a single-cylinder diesel engine; validation parameters include ignition delay.

  14. Description of the CERES Ocean Validation Experiment (COVE), A Dedicated EOS Validation Test Site

    NASA Astrophysics Data System (ADS)

    Rutledge, K.; Charlock, T.; Smith, B.; Jin, Z.; Rose, F.; Denn, F.; Rutan, D.; Haeffelin, M.; Su, W.; Xhang, T.; Jay, M.

    2001-12-01

    A unique test site located in mid-Atlantic coastal marine waters has been used by several EOS projects for validation measurements. A common theme across these projects is the need for a stable measurement site within the marine environment for long-term, high quality radiation measurements. The site was initiated by NASA's Clouds and the Earth's Radiant Energy System (CERES) project. One of CERES's challenging goals is to provide upwelling and downwelling shortwave fluxes at several pressure altitudes within the atmosphere and at the surface. Operationally, the radiative transfer model of Fu and Liou (1996, 1998), the CERES instrument-measured radiances and various other EOS platform data are being used to accomplish this goal. We present here a component of the CERES/EOS validation effort that is focused on verifying and optimizing the prediction algorithms for radiation parameters associated with the marine coastal and oceanic surface types of the planet. For this validation work, the CERES Ocean Validation Experiment (COVE) was developed to provide detailed high-frequency and long-duration measurements of radiation and its associated dependent variables. The CERES validations also include analytical efforts which will not be described here (but see Charlock et al., Su et al. and Smith et al., Fall 2001 AGU Meeting). The COVE activity is based on a rigid ocean platform located approximately twenty kilometers off the coast of Virginia Beach, Virginia. The once-manned US Coast Guard facility rises 35 meters from the ocean surface, allowing the radiation instruments to be well above the splash zone. The depth of the sea is eleven meters at the site. A power and communications system has been installed for present and future requirements. Scientific measurements at the site have primarily been developed within the framework of established national and international monitoring programs. These include the Baseline Surface Radiation Network of the World

  15. Validation of whitecap fraction and breaking wave parameters from WAVEWATCH-III using in situ and remote-sensing data

    NASA Astrophysics Data System (ADS)

    Leckler, F.; Hanafin, J. A.; Ardhuin, F.; Filipot, J.; Anguelova, M. D.; Moat, B. I.; Yelland, M.; Prytherch, J.

    2012-12-01

    Whitecaps are the main sink of wave energy. Although the exact processes are still unknown, it is clear that they play a significant role in momentum exchange between atmosphere and ocean, and also influence gas and aerosol exchange. Recently, modeling of whitecap properties was implemented in the spectral wave model WAVEWATCH-III®. This modeling takes place in the context of the Oceanflux-Greenhouse Gas project, to provide a climatology of breaking waves for gas transfer studies. We present here a validation study for two different wave breaking parameterizations implemented in WAVEWATCH-III®. The model parameterizations use different approaches related to the steepness of the carrying waves to estimate breaking wave probabilities. That of Ardhuin et al. (2010) is based on the hypothesis that breaking probabilities become significant when the saturation spectrum exceeds a threshold, and includes a modification to allow for greater breaking in the mean wave direction, to agree with observations. It also includes suppression of shorter waves by longer breaking waves. In the second (Filipot and Ardhuin, 2012), breaking probabilities are defined at different scales using wave steepness, and the breaking wave height distribution is then integrated over all scales. We also propose an adaptation of the latter to make it self-consistent. The breaking probabilities parameterized by Filipot and Ardhuin (2012) are much larger for dominant waves than those from the other parameterization, and show better agreement with modeled statistics of breaking crest lengths measured during the FAIRS experiment. This stronger breaking also has an impact on the shorter waves due to the parameterization of short wave damping associated with large breakers, and results in a different distribution of the breaking crest lengths. Converted to whitecap coverage using Reul and Chapron (2003), both parameterizations agree reasonably well with commonly-used empirical fits

  16. DEM modeling of ball mills with experimental validation: influence of contact parameters on charge motion and power draw

    NASA Astrophysics Data System (ADS)

    Boemer, Dominik; Ponthot, Jean-Philippe

    2017-01-01

    Discrete element method simulations of a 1:5-scale laboratory ball mill are presented in this paper to study the influence of the contact parameters on the charge motion and the power draw. The position density limit is introduced as an efficient mathematical tool to describe and to compare the macroscopic charge motion in different scenarios, i.a. with different values of the contact parameters. While the charge motion and the power draw are relatively insensitive to the stiffness and the damping coefficient of the linear spring-slider-damper contact law, the coefficient of friction has a strong influence since it controls the sliding propensity of the charge. Based on the experimental calibration and validation by charge motion photographs and power draw measurements, the descriptive and predictive capabilities of the position density limit and the discrete element method are demonstrated, i.e. the real position of the charge is precisely delimited by the respective position density limit and the power draw can be predicted with an accuracy of about 5 %.

  17. Dynamic Parameter Identification of Subject-Specific Body Segment Parameters Using Robotics Formalism: Case Study Head Complex.

    PubMed

    Díaz-Rodríguez, Miguel; Valera, Angel; Page, Alvaro; Besa, Antonio; Mata, Vicente

    2016-05-01

    Accurate knowledge of body segment inertia parameters (BSIP) improves the assessment of dynamic analyses based on biomechanical models, which is of paramount importance in fields such as sport activities or impact crash testing. Early approaches for BSIP identification relied on experiments conducted on cadavers or on imaging techniques applied to living subjects. Recent approaches for BSIP identification rely on inverse dynamic modeling. However, most of these approaches focus on the entire body, and verification of BSIP for dynamic analysis of a distal segment or chain of segments, which has proven to be of significant importance in impact test studies, is rarely established. Previous studies have suggested that BSIP should be obtained by using subject-specific identification techniques. To this end, our paper develops a novel approach for estimating subject-specific BSIP based on static and dynamic identification models (SIM, DIM). We test the validity of SIM and DIM by comparing the results with parameters obtained from the regression model proposed by De Leva (1996, "Adjustments to Zatsiorsky-Seluyanov's Segment Inertia Parameters," J. Biomech., 29(9), pp. 1223-1230). Both SIM and DIM are developed using robotics formalism. First, the static model allows the mass and center of gravity (COG) to be estimated. Second, the results from the static model are included in the dynamics equation, allowing us to estimate the moment of inertia (MOI). As a case study, we applied the approach to evaluate the dynamic modeling of the head complex. Findings provide some insight into the validity not only of the proposed method but also of the application of De Leva's regression model for dynamic modeling of body segments.

  18. An Efficient Data Partitioning to Improve Classification Performance While Keeping Parameters Interpretable.

    PubMed

    Korjus, Kristjan; Hebart, Martin N; Vicente, Raul

    2016-01-01

    Supervised machine learning methods typically require splitting data into multiple chunks for training, validating, and finally testing classifiers. For finding the best parameters of a classifier, training and validation are usually carried out with cross-validation. This is followed by application of the classifier with optimized parameters to a separate test set for estimating the classifier's generalization performance. With limited data, this separation of test data creates a difficult trade-off between having more statistical power in estimating generalization performance versus choosing better parameters and fitting a better model. We propose a novel approach that we term "Cross-validation and cross-testing" improving this trade-off by re-using test data without biasing classifier performance. The novel approach is validated using simulated data and electrophysiological recordings in humans and rodents. The results demonstrate that the approach has a higher probability of discovering significant results than the standard approach of cross-validation and testing, while maintaining the nominal alpha level. In contrast to nested cross-validation, which is maximally efficient in re-using data, the proposed approach additionally maintains the interpretability of individual parameters. Taken together, we suggest an addition to currently used machine learning approaches which may be particularly useful in cases where model weights do not require interpretation, but parameters do.

  19. Optimization of multilayer neural network parameters for speaker recognition

    NASA Astrophysics Data System (ADS)

    Tovarek, Jaromir; Partila, Pavol; Rozhon, Jan; Voznak, Miroslav; Skapa, Jan; Uhrin, Dominik; Chmelikova, Zdenka

    2016-05-01

    This article discusses the impact of multilayer neural network parameters on speaker identification. The main task of speaker identification is to find a specific person in a known set of speakers, i.e., to determine whether the voice of an unknown speaker (the wanted person) matches one of the reference speakers in the voice database. One of the requirements was to develop a text-independent system, which means classifying the wanted person regardless of speech content and language. A multilayer neural network has been used for speaker identification in this research. An artificial neural network (ANN) requires setting parameters such as the activation function of the neurons, the steepness of the activation functions, the learning rate, the maximum number of iterations and the number of neurons in the hidden and output layers. ANN accuracy and validation time are directly influenced by these parameter settings, and different tasks require different settings. Identification accuracy and ANN validation time were evaluated with the same input data but different parameter settings. The goal was to find parameters for the neural network with the highest precision and shortest validation time. The inputs to the neural network are Mel-frequency cepstral coefficients (MFCCs), which describe the properties of the vocal tract. Audio samples were recorded for all speakers in a laboratory environment. The data were split into training, testing and validation sets of 70%, 15% and 15%. The result of the research described in this article is a set of parameter settings for the multilayer neural network for four speakers.
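
    A minimal sketch of the pipeline described, MFCC features feeding a small multilayer perceptron (librosa and scikit-learn assumed; the file-loading step is indicated but the example runs on random stand-in features):

        import numpy as np
        import librosa
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        def mfcc_features(path, n_mfcc=13):
            y, sr = librosa.load(path, sr=16000)
            m = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
            return m.mean(axis=1)            # one fixed-length vector per utterance

        # In practice: X = np.vstack([mfcc_features(p) for p in wav_paths])
        # Here, random stand-in features for 4 speakers x 10 utterances:
        X = np.random.default_rng(0).normal(size=(40, 13))
        labels = np.repeat(np.arange(4), 10)

        X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(32,), activation="relu",
                            learning_rate_init=1e-3, max_iter=500).fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))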

  20. Adaptive firefly algorithm: parameter analysis and its application.

    PubMed

    Cheung, Ngaam J; Ding, Xue-Ming; Shen, Hong-Bin

    2014-01-01

    As a nature-inspired search algorithm, the firefly algorithm (FA) has several control parameters, which may have great effects on its performance. In this study, we investigate the parameter selection and adaptation strategies in a modified firefly algorithm - the adaptive firefly algorithm (AdaFa). There are three strategies in AdaFa, including (1) a distance-based light absorption coefficient; (2) a gray coefficient enabling fireflies to share difference information from attractive ones efficiently; and (3) five different dynamic strategies for the randomization parameter. Promising selections of parameters in the strategies are analyzed to guarantee the efficient performance of AdaFa. AdaFa is validated over widely used benchmark functions, and the numerical experiments and statistical tests yield useful conclusions on the strategies and the parameter selections affecting the performance of AdaFa. When applied to the real-world problem of protein tertiary structure prediction, the results demonstrated that the improved variants can rebuild the tertiary structure with an average root mean square deviation of less than 0.4Å and 1.5Å from the native constraints under noise-free conditions and with 10% Gaussian white noise, respectively.

  1. Adaptive Firefly Algorithm: Parameter Analysis and its Application

    PubMed Central

    Shen, Hong-Bin

    2014-01-01

    As a nature-inspired search algorithm, the firefly algorithm (FA) has several control parameters, which may have great effects on its performance. In this study, we investigate the parameter selection and adaptation strategies in a modified firefly algorithm - the adaptive firefly algorithm (AdaFa). There are three strategies in AdaFa, including (1) a distance-based light absorption coefficient; (2) a gray coefficient enabling fireflies to share difference information from attractive ones efficiently; and (3) five different dynamic strategies for the randomization parameter. Promising selections of parameters in the strategies are analyzed to guarantee the efficient performance of AdaFa. AdaFa is validated over widely used benchmark functions, and the numerical experiments and statistical tests yield useful conclusions on the strategies and the parameter selections affecting the performance of AdaFa. When applied to the real-world problem of protein tertiary structure prediction, the results demonstrated that the improved variants can rebuild the tertiary structure with an average root mean square deviation of less than 0.4Å and 1.5Å from the native constraints under noise-free conditions and with 10% Gaussian white noise, respectively. PMID:25397812
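
    For reference, the core update rule of the standard firefly algorithm that AdaFa modifies (a generic sketch; AdaFa's adaptive absorption, gray coefficient and randomization strategies are not reproduced here):

        import numpy as np

        def firefly_minimize(f, bounds, n=25, iters=200, beta0=1.0, gamma=1.0,
                             alpha=0.2, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = bounds
            X = rng.uniform(lo, hi, size=(n, len(lo)))
            for _ in range(iters):
                I = np.array([f(x) for x in X])      # brightness = objective (minimized)
                for i in range(n):
                    for j in range(n):
                        if I[j] < I[i]:              # move firefly i toward brighter j
                            r2 = np.sum((X[i] - X[j]) ** 2)
                            beta = beta0 * np.exp(-gamma * r2)
                            X[i] += beta * (X[j] - X[i]) \
                                    + alpha * rng.uniform(-0.5, 0.5, X.shape[1])
                            X[i] = np.clip(X[i], lo, hi)
                alpha *= 0.97                        # simple randomization decay
            best = min(range(n), key=lambda k: f(X[k]))
            return X[best], f(X[best])

        # Example: sphere function in 2-D
        x_best, f_best = firefly_minimize(lambda x: np.sum(x ** 2),
                                          (np.array([-5., -5.]), np.array([5., 5.])))
        print(x_best, f_best)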

  2. Retrieval with Infrared Atmospheric Sounding Interferometer and Validation during JAIVEx

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Liu, Xu; Larar, Allen M.; Smith, William L.; Taylor, Jonathan P.; Schluessel, Peter; Strow, L. Larrabee; Mango, Stephen A.

    2008-01-01

    A state-of-the-art IR-only retrieval algorithm has been developed, consisting of an all-season, global EOF physical regression followed by a 1-D Var physical iterative retrieval, for IASI, AIRS, and NAST-I. The benefits of this retrieval are atmospheric structure at single-FOV horizontal resolution (approx. 15 km for IASI and AIRS), accurate profiles above cloud (at least) or down to the surface, surface parameters, and/or cloud microphysical parameters. An initial case study and validation indicate that surface, cloud, and atmospheric structure (including the TBL) are well captured by IASI and AIRS measurements. Coincident dropsondes during the IASI and AIRS overpasses are used to validate atmospheric conditions, and accurate retrievals are obtained with the expected vertical resolution. JAIVEx has provided the data needed to validate the retrieval algorithm and its products, which allows us to assess the instrument's ability and performance. Retrievals with global coverage are under investigation for detailed retrieval assessment. It is greatly desired that these products be used for testing their impact on atmospheric data assimilation and/or numerical weather prediction.

  3. Development and Validation of Limited-Sampling Strategies for Predicting Amoxicillin Pharmacokinetic and Pharmacodynamic Parameters

    PubMed Central

    Suarez-Kurtz, Guilherme; Ribeiro, Frederico Mota; Vicente, Flávio L.; Struchiner, Claudio J.

    2001-01-01

    Amoxicillin plasma concentrations (n = 1,152) obtained from 48 healthy subjects in two bioequivalence studies were used to develop limited-sampling strategy (LSS) models for estimating the area under the concentration-time curve (AUC), the maximum concentration of drug in plasma (Cmax), and the time interval of concentration above MIC susceptibility breakpoints in plasma (T>MIC). Each subject received 500-mg amoxicillin, as reference and test capsules or suspensions, and plasma concentrations were measured by a validated microbiological assay. Linear regression analysis and a “jack-knife” procedure revealed that three-point LSS models accurately estimated (R2, 0.92; precision, <5.8%) the AUC from 0 h to infinity (AUC0-∞) of amoxicillin for the four formulations tested. Validation tests indicated that a three-point LSS model (1, 2, and 5 h) developed for the reference capsule formulation predicts the following accurately (R2, 0.94 to 0.99): (i) the individual AUC0-∞ for the test capsule formulation in the same subjects, (ii) the individual AUC0-∞ for both reference and test suspensions in 24 other subjects, and (iii) the average AUC0-∞ following single oral doses (250 to 1,000 mg) of various amoxicillin formulations in 11 previously published studies. A linear regression equation was derived, using the same sampling time points of the LSS model for the AUC0-∞, but using different coefficients and intercept, for estimating Cmax. Bioequivalence assessments based on LSS-derived AUC0-∞'s and Cmax's provided results similar to those obtained using the original values for these parameters. Finally, two-point LSS models (R2 = 0.86 to 0.95) were developed for T>MICs of 0.25 or 2.0 μg/ml, which are representative of microorganisms susceptible and resistant to amoxicillin. PMID:11600352
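
    The LSS idea, regressing full AUC on a few concentration timepoints, can be sketched as follows (synthetic one-compartment profiles; the 1, 2 and 5 h timepoints follow the record, but the fitted coefficients are illustrative, not the published ones):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        t = np.array([0.25, 0.5, 1, 1.5, 2, 3, 4, 5, 6, 8])   # rich sampling grid (h)
        # Simplified oral-absorption profiles with random ka/ke per subject
        ka = rng.uniform(1.2, 2.5, 48)
        ke = rng.uniform(0.4, 0.7, 48)
        C = (np.exp(-np.outer(ke, t)) - np.exp(-np.outer(ka, t))) \
            * (500 / (ka - ke))[:, None]
        auc_full = np.trapz(C, t, axis=1)          # reference AUC by trapezoid rule

        # Keep only the 1, 2 and 5 h concentrations as LSS predictors
        idx = [np.where(t == 1)[0][0], np.where(t == 2)[0][0], np.where(t == 5)[0][0]]
        lss = LinearRegression().fit(C[:, idx], auc_full)
        print("R^2 of 3-point LSS:", lss.score(C[:, idx], auc_full))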

  4. Effect of Preload Alterations on Left Ventricular Systolic Parameters Including Speckle-Tracking Echocardiography Radial Strain During General Anesthesia.

    PubMed

    Weber, Ulrike; Base, Eva; Ristl, Robin; Mora, Bruno

    2015-08-01

    Frequently used parameters for evaluation of left ventricular systolic function are load-sensitive. However, the impact of preload alterations on speckle-tracking echocardiographic parameters during anesthesia has not been validated. Therefore, two-dimensional (2D) speckle-tracking echocardiography radial strain (RS) was assessed during general anesthesia, simulating 3 different preload conditions. Single-center prospective observational study. University hospital. Thirty-three patients with normal left ventricular systolic function undergoing major surgery. Transgastric views of the midpapillary level of the left ventricle were acquired at 3 different positions. Fractional shortening (FS), fractional area change (FAC), and 2D speckle-tracking echocardiography RS were analyzed in the transgastric midpapillary view. Considerable correlation above 0.5 was found for FAC and FS in the zero and Trendelenburg positions (r = 0.629, r = 0.587), and for RS and FAC in the anti-Trendelenburg position (r = 0.518). In the repeated-measures analysis, significant differences among the values measured at the 3 positions were found for FAC and FS. For FAC, there were differences up to 2.8 percentage points between the anti-Trendelenburg position and the other 2 positions. For FS, only the difference between position zero and anti-Trendelenburg was significant, with an observed change of 1.66. Two-dimensional RS was not significantly different at all positions, with observed changes below 1 percentage point. Alterations in preload did not result in clinically relevant changes of RS, FS, or FAC. Observed changes for RS were smallest; however, the variation of RS was larger than that of FS or FAC. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. In vitro flow assessment: from PC-MRI to computational fluid dynamics including fluid-structure interaction

    NASA Astrophysics Data System (ADS)

    Kratzke, Jonas; Rengier, Fabian; Weis, Christian; Beller, Carsten J.; Heuveline, Vincent

    2016-04-01

    Initiation and development of cardiovascular diseases can be highly correlated to specific biomechanical parameters. To examine and assess biomechanical parameters, numerical simulation of cardiovascular dynamics has the potential to complement and enhance medical measurement and imaging techniques. As such, computational fluid dynamics (CFD) has been shown to be suitable for evaluating blood velocity and pressure in scenarios where vessel wall deformation plays a minor role. However, there is a need for further validation studies and for the inclusion of vessel wall elasticity for morphologies subject to large displacements. In this work, we consider a fluid-structure interaction (FSI) model including the full elasticity equation to take the deformability of aortic wall soft tissue into account. We present a numerical framework in which either a CFD study can be performed for less deformable aortic segments or an FSI simulation for regions of large displacement such as the aortic root and arch. Both methods are validated by means of an aortic phantom experiment. The computational results are in good agreement with 2D phase-contrast magnetic resonance imaging (PC-MRI) velocity measurements as well as catheter-based pressure measurements. The FSI simulation shows a characteristic vessel compliance effect on the flow field induced by the elasticity of the vessel wall, which the CFD model is not capable of capturing. The in vitro validated FSI simulation framework can enable the computation of complementary biomechanical parameters such as the stress distribution within the vessel wall.

  6. In silico target prediction for elucidating the mode of action of herbicides including prospective validation.

    PubMed

    Chiddarwar, Rucha K; Rohrer, Sebastian G; Wolf, Antje; Tresch, Stefan; Wollenhaupt, Sabrina; Bender, Andreas

    2017-01-01

    The rapid emergence of pesticide resistance has given rise to a demand for herbicides with new modes of action (MoA). In the agrochemical sector, with the availability of experimental high throughput screening (HTS) data, it is now possible to utilize in silico target prediction methods in the early discovery phase to suggest the MoA of a compound via data mining of bioactivity data. While established in the pharmaceutical context, in the agrochemical area this approach poses rather different challenges, as we have found in this work, partially due to different chemistry, but even more so due to different (usually smaller) amounts of data and different ways of conducting HTS. With the aim of applying computational methods to facilitate herbicide target identification, 48,000 bioactivity data points against 16 herbicide targets were processed to train Laplacian-modified Naïve Bayesian (NB) classification models. The herbicide target prediction model ("HerbiMod") is an ensemble of 16 binary classification models which are evaluated by internal, external and prospective validation sets. In addition to the experimental inactives, 10,000 random agrochemical inactives were included in the training process, which was shown to improve the overall balanced accuracy of our models by up to 40%. For all the models, performance in terms of balanced accuracy of ≥80% was achieved in five-fold cross validation. Ranking target predictions was addressed by means of z-scores, which improved predictivity over using raw scores alone. An external test set of 247 compounds from ChEMBL and a prospective test set of 394 compounds from BASF SE tested against five well-studied herbicide targets (ACC, ALS, HPPD, PDS and PROTOX) were used for further validation. Only 4% of the compounds in the external test set lay in the applicability domain and extrapolation (and correct prediction) was hence impossible, which on the one hand was surprising, and on the other hand illustrated the utilization of
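
    A toy sketch of the modeling approach named above, a per-target Bernoulli Naive Bayes classifier with Laplace smoothing over binary fingerprints, plus z-score ranking of the raw scores (synthetic data; not the HerbiMod implementation):

        import numpy as np
        from sklearn.naive_bayes import BernoulliNB

        rng = np.random.default_rng(0)
        fps = rng.integers(0, 2, size=(1000, 256))     # 1000 compounds x 256-bit fingerprints
        active = (fps[:, 3] & fps[:, 17]).astype(int)  # toy activity rule for one target

        clf = BernoulliNB(alpha=1.0).fit(fps, active)  # alpha=1.0 gives Laplace smoothing
        raw = clf.predict_log_proba(fps)[:, 1]

        # Rank predictions by z-score of the raw scores, as the record suggests
        z = (raw - raw.mean()) / raw.std()
        print("top-5 ranked compounds:", np.argsort(-z)[:5])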

  7. Incorporating priors on expert performance parameters for segmentation validation and label fusion: a maximum a posteriori STAPLE

    PubMed Central

    Commowick, Olivier; Warfield, Simon K

    2010-01-01

    In order to evaluate the quality of segmentations of an image and assess intra- and inter-expert variability in segmentation performance, an Expectation Maximization (EM) algorithm for Simultaneous Truth And Performance Level Estimation (STAPLE) was recently developed. This algorithm, originally presented for segmentation validation, has since been used for many applications, such as atlas construction and decision fusion. However, the manual delineation of structures of interest is a very time-consuming and burdensome task. Further, as the time required for and burden of manual delineation increase, the accuracy of the delineation decreases. Therefore, it may be desirable to ask the experts to delineate only a reduced number of structures, or the segmentation of all structures by all experts may simply not be achieved. Fusion from data with some structures not segmented by each expert should be carried out in a manner that accounts for the missing information. In other applications, locally inconsistent segmentations may drive the STAPLE algorithm into an undesirable local optimum, leading to misclassifications or misleading expert performance parameters. We present a new algorithm that allows fusion with partial delineation and which can avoid convergence to undesirable local optima in the presence of strongly inconsistent segmentations. The algorithm extends STAPLE by incorporating prior probabilities for the expert performance parameters. This is achieved through a Maximum A Posteriori formulation, where the prior probabilities for the performance parameters are modeled by a beta distribution. We demonstrate that this new algorithm enables dramatically improved fusion from data with partial delineation by each expert in comparison to fusion with STAPLE. PMID:20879379

  9. Hyperspectral signature analysis of skin parameters

    NASA Astrophysics Data System (ADS)

    Vyas, Saurabh; Banerjee, Amit; Garza, Luis; Kang, Sewon; Burlina, Philippe

    2013-02-01

    The temporal analysis of changes in biological skin parameters, including melanosome concentration, collagen concentration and blood oxygenation, may serve as a valuable tool in diagnosing the progression of malignant skin cancers and in understanding the pathophysiology of cancerous tumors. Quantitative knowledge of these parameters can also be useful in applications such as wound assessment and point-of-care diagnostics, amongst others. We propose an approach to estimate in vivo skin parameters using a forward computational model based on Kubelka-Munk theory and the Fresnel equations. We use this model to map the skin parameters to their corresponding hyperspectral signature. We then use machine-learning-based regression to develop an inverse map from hyperspectral signatures to skin parameters. In particular, we employ support vector machine based regression to estimate the in vivo skin parameters given their corresponding hyperspectral signature. We build on our work from SPIE 2012 and validate our methodology on an in vivo dataset. This dataset consists of 241 signatures collected from in vivo hyperspectral imaging of patients of both genders and Caucasian, Asian and African American ethnicities. In addition, we also extend our methodology past the visible region and through the short-wave infrared region of the electromagnetic spectrum. We find promising results when comparing the estimated skin parameters to the ground truth, demonstrating good agreement with well-established physiological precepts. This methodology has potential use in non-invasive skin anomaly detection and for developing minimally invasive pre-screening tools.
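
    As a companion to the inverse-mapping step described above, here is a minimal sketch of support-vector regression from hyperspectral signatures to a single skin parameter. The synthetic data, band count, and hyperparameters are illustrative assumptions, not the authors' configuration.

    ```python
    # Sketch: SVR-based inverse map from hyperspectral signatures to one skin
    # parameter (e.g., melanosome concentration). Data are synthetic stand-ins
    # for forward-model (Kubelka-Munk) simulations; dimensions are assumptions.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    n_samples, n_bands = 500, 128          # hypothetical spectra with 128 bands
    X = rng.random((n_samples, n_bands))   # stand-in reflectance spectra
    y = X[:, :16].mean(axis=1) + 0.05 * rng.standard_normal(n_samples)

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
    model.fit(X, y)
    print("training R^2:", round(model.score(X, y), 3))
    ```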

  10. An Efficient Data Partitioning to Improve Classification Performance While Keeping Parameters Interpretable

    PubMed Central

    Korjus, Kristjan; Hebart, Martin N.; Vicente, Raul

    2016-01-01

    Supervised machine learning methods typically require splitting data into multiple chunks for training, validating, and finally testing classifiers. For finding the best parameters of a classifier, training and validation are usually carried out with cross-validation. This is followed by application of the classifier with optimized parameters to a separate test set for estimating the classifier’s generalization performance. With limited data, this separation of test data creates a difficult trade-off between having more statistical power in estimating generalization performance versus choosing better parameters and fitting a better model. We propose a novel approach, termed “cross-validation and cross-testing,” that improves this trade-off by re-using test data without biasing classifier performance. The novel approach is validated using simulated data and electrophysiological recordings in humans and rodents. The results demonstrate that the approach has a higher probability of discovering significant results than the standard approach of cross-validation and testing, while maintaining the nominal alpha level. In contrast to nested cross-validation, which is maximally efficient in re-using data, the proposed approach additionally maintains the interpretability of individual parameters. Taken together, we suggest an addition to currently used machine learning approaches which may be particularly useful in cases where model weights do not require interpretation, but parameters do. PMID:27564393
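
    For contrast, a minimal sketch of the conventional "cross-validation then testing" pipeline that the proposed approach improves upon: parameters are tuned by cross-validation on the training split only, and the held-out test split is used once for the generalization estimate. The dataset and parameter grid are illustrative assumptions; the paper's test-data re-use scheme is not reproduced here.

    ```python
    # Sketch of the standard baseline: inner cross-validation for parameter
    # selection, then a single evaluation on a separate test set.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    search = GridSearchCV(SVC(), {"C": [0.1, 1.0, 10.0]}, cv=5)  # inner CV
    search.fit(X_tr, y_tr)
    print("CV-selected C:", search.best_params_["C"])
    print("held-out accuracy:", search.score(X_te, y_te))        # used only once
    ```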

  11. Generalized Grueneisen tensor from solid nonlinearity parameters

    NASA Technical Reports Server (NTRS)

    Cantrell, J. H., Jr.

    1980-01-01

    Anharmonic effects in solids are often described in terms of generalized Grueneisen parameters which measure the strain dependence of the lattice vibrational frequencies. The relationship between these parameters and the solid nonlinearity parameters measured directly in ultrasonic harmonic generation experiments is derived using an approach valid for normal-mode elastic wave propagation in any crystalline direction. The resulting generalized Grueneisen parameters are purely isentropic in contrast to the Brugger-Grueneisen parameters which are of a mixed thermodynamic state. Experimental data comparing the isentropic generalized Grueneisen parameters and the Brugger-Grueneisen parameters are presented.
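
    In the notation commonly used for this problem (an assumed standard form, since the abstract quotes no formulas), a generalized Grüneisen parameter measures the normalized strain dependence of a lattice mode frequency:

    ```latex
    % Generalized Grueneisen parameter for mode (p, N): negative, normalized
    % derivative of the mode frequency with respect to strain. Textbook
    % definition, assumed rather than quoted from the paper.
    \gamma_{ij}(p, N) = -\frac{1}{\omega_p(N)}
                         \frac{\partial \omega_p(N)}{\partial \epsilon_{ij}}
    ```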

  12. Reliable prediction of clinical outcome in patients with chronic HCV infection and compensated advanced hepatic fibrosis: a validated model using objective and readily available clinical parameters.

    PubMed

    van der Meer, Adriaan J; Hansen, Bettina E; Fattovich, Giovanna; Feld, Jordan J; Wedemeyer, Heiner; Dufour, Jean-François; Lammert, Frank; Duarte-Rojo, Andres; Manns, Michael P; Ieluzzi, Donatella; Zeuzem, Stefan; Hofmann, W Peter; de Knegt, Robert J; Veldt, Bart J; Janssen, Harry L A

    2015-02-01

    Reliable tools to predict long-term outcome among patients with well compensated advanced liver disease due to chronic HCV infection are lacking. Risk scores for mortality and for cirrhosis-related complications were constructed with Cox regression analysis in a derivation cohort and evaluated in a validation cohort, both including patients with chronic HCV infection and advanced fibrosis. In the derivation cohort, 100/405 patients died during a median of 8.1 (IQR 5.7-11.1) years of follow-up. Multivariate Cox analyses showed that age (HR=1.06, 95% CI 1.04 to 1.09, p<0.001), male sex (HR=1.91, 95% CI 1.10 to 3.29, p=0.021), platelet count (HR=0.91, 95% CI 0.87 to 0.95, p<0.001) and log10 aspartate aminotransferase/alanine aminotransferase ratio (HR=1.30, 95% CI 1.12 to 1.51, p=0.001) were independently associated with mortality (C statistic=0.78, 95% CI 0.72 to 0.83). In the validation cohort, 58/296 patients with cirrhosis died during a median of 6.6 (IQR 4.4-9.0) years. Among patients with estimated 5-year mortality risks <5%, 5-10% and >10%, the observed 5-year mortality rates in the derivation cohort and validation cohort were 0.9% (95% CI 0.0 to 2.7) and 2.6% (95% CI 0.0 to 6.1), 8.1% (95% CI 1.8 to 14.4) and 8.0% (95% CI 1.3 to 14.7), and 21.8% (95% CI 13.2 to 30.4) and 20.9% (95% CI 13.6 to 28.1), respectively (C statistic in validation cohort = 0.76, 95% CI 0.69 to 0.83). The risk score for cirrhosis-related complications also incorporated HCV genotype (C statistic = 0.80, 95% CI 0.76 to 0.83 in the derivation cohort; and 0.74, 95% CI 0.68 to 0.79 in the validation cohort). The prognosis of patients with chronic HCV infection and compensated advanced liver disease can be accurately assessed with risk scores including readily available objective clinical parameters. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
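
    Purely as an illustration, a risk score of this kind is typically applied as a Cox linear predictor that sums log hazard ratios times covariate values. The coefficients below are the log-HRs implied by the abstract; the covariate scaling, centering, and baseline survival of the published score are unknown and omitted, so this sketch preserves only the relative risk ordering.

    ```python
    # Illustrative Cox-type linear predictor from the hazard ratios quoted in
    # the abstract (coefficient = ln HR). Not the published score itself.
    import math

    def linear_predictor(age_years, male, platelet_count, ast_alt_ratio):
        lp = math.log(1.06) * age_years                   # HR 1.06 per year of age
        lp += math.log(1.91) * (1 if male else 0)         # HR 1.91 for male sex
        lp += math.log(0.91) * platelet_count             # HR 0.91 per unit platelets
        lp += math.log(1.30) * math.log10(ast_alt_ratio)  # HR 1.30 per log10 AST/ALT
        return lp

    # Higher values indicate higher estimated mortality risk (ordering only).
    print(round(linear_predictor(age_years=60, male=True,
                                 platelet_count=120, ast_alt_ratio=1.2), 2))
    ```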

  13. Scaling of hydrologic and erosion parameters derived from rainfall simulation

    NASA Astrophysics Data System (ADS)

    Sheridan, Gary; Lane, Patrick; Noske, Philip; Sherwin, Christopher

    2010-05-01

    Rainfall simulation experiments conducted at the temporal scale of minutes and the spatial scale of meters are often used to derive parameters for erosion and water quality models that operate at much larger temporal and spatial scales. While such parameterization is convenient, there has been little effort to validate this approach via nested experiments across these scales. In this paper we first review the literature relevant to some of these long acknowledged issues. We then present rainfall simulation and erosion plot data from a range of sources, including mining, roading, and forestry, to explore the issues associated with the scaling of parameters such as infiltration properties and erodibility coefficients.

  14. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.

  15. Breakdown parameter for kinetic modeling of multiscale gas flows.

    PubMed

    Meng, Jianping; Dongari, Nishanth; Reese, Jason M; Zhang, Yonghao

    2014-06-01

    Multiscale methods built purely on the kinetic theory of gases provide information about the molecular velocity distribution function. It is therefore both important and feasible to establish new breakdown parameters for assessing the appropriateness of a fluid description at the continuum level by utilizing kinetic information rather than macroscopic flow quantities alone. We propose a new kinetic criterion to indirectly assess the errors introduced by a continuum-level description of the gas flow. The analysis, which includes numerical demonstrations, focuses on the validity of the Navier-Stokes-Fourier equations and corresponding kinetic models and reveals that the new criterion can consistently indicate the validity of continuum-level modeling in both low-speed and high-speed flows at different Knudsen numbers.
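
    The paper's new kinetic criterion is not spelled out in the abstract; for orientation only, the conventional macroscopic breakdown parameter it is meant to improve upon, a gradient-length-local Knudsen number, can be computed as below (a common definition, assumed here).

    ```python
    # Conventional gradient-length-local breakdown parameter:
    #   Kn_GLL = (lambda / Q) * |dQ/dx|  for a flow quantity Q (density,
    # temperature, ...). This is the classical macroscopic criterion, not the
    # paper's new kinetic one.
    import numpy as np

    def kn_gll(q, x, mean_free_path):
        """Local breakdown parameter for a 1D profile q(x)."""
        return mean_free_path * np.abs(np.gradient(q, x)) / np.abs(q)

    x = np.linspace(0.0, 1.0, 101)
    rho = 1.0 + 0.5 * np.tanh((x - 0.5) / 0.02)   # shock-like density profile
    kn = kn_gll(rho, x, mean_free_path=1e-3)
    print("max Kn_GLL:", round(kn.max(), 4))      # values above ~0.05 often
                                                  # flag continuum breakdown
    ```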

  16. Early prediction of intensive care unit-acquired weakness using easily available parameters: a prospective observational study.

    PubMed

    Wieske, Luuk; Witteveen, Esther; Verhamme, Camiel; Dettling-Ihnenfeldt, Daniela S; van der Schaaf, Marike; Schultz, Marcus J; van Schaik, Ivo N; Horn, Janneke

    2014-01-01

    An early diagnosis of Intensive Care Unit-acquired weakness (ICU-AW) using muscle strength assessment is not possible in most critically ill patients. We hypothesized that development of ICU-AW can be predicted reliably two days after ICU admission, using patient characteristics, early available clinical parameters, laboratory results and use of medication as parameters. Newly admitted ICU patients mechanically ventilated ≥2 days were included in this prospective observational cohort study. Manual muscle strength was measured according to the Medical Research Council (MRC) scale, when patients were awake and attentive. ICU-AW was defined as an average MRC score <4. A prediction model was developed by selecting predictors from an a priori defined set of candidate predictors, based on known risk factors. The discriminative performance of the prediction model was evaluated, validated internally and compared to the APACHE IV and SOFA scores. Of 212 included patients, 103 developed ICU-AW. The highest lactate level, treatment with any aminoglycoside in the first two days after admission, and age were selected as predictors. The area under the receiver operating characteristic curve of the prediction model was 0.71 after internal validation. The new prediction model improved discrimination compared to the APACHE IV and the SOFA score. The new early prediction model for ICU-AW, using a set of 3 easily available parameters, has fair discriminative performance. This model needs external validation.
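
    A minimal sketch of a three-predictor logistic model with cross-validated discrimination, in the spirit of the study design; the synthetic data and resulting coefficients are placeholders, not the published model.

    ```python
    # Sketch: logistic prediction of ICU-AW from three early parameters
    # (highest lactate, aminoglycoside use, age). Synthetic placeholder data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n = 212
    X = np.column_stack([
        rng.gamma(2.0, 1.5, n),   # highest lactate (mmol/L), synthetic
        rng.integers(0, 2, n),    # aminoglycoside in first two days (0/1)
        rng.normal(62, 15, n),    # age (years), synthetic
    ])
    y = (0.4 * X[:, 0] + 1.0 * X[:, 1] + 0.03 * X[:, 2]
         + rng.logistic(size=n) > 4.0).astype(int)   # synthetic outcome

    auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                          cv=5, scoring="roc_auc")
    print("cross-validated AUC:", round(auc.mean(), 2))  # study reported ~0.71
    ```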

  17. Groundwater Model Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of

  18. Validation of sterilizing grade filtration.

    PubMed

    Jornitz, M W; Meltzer, T H

    2003-01-01

    Validation considerations for sterilizing grade filters, namely 0.2 micron, changed when the FDA voiced concerns about the validity of bacterial challenge tests performed in the past. Such validation exercises are nowadays considered to be filter qualification. Filter validation requires more thorough analysis, especially bacterial challenge testing with the actual drug product under process conditions. To do so, viability testing is a necessity to determine the bacterial challenge test methodology. In addition to these two compulsory tests, other evaluations such as extractables, adsorption and chemical compatibility tests should be considered. PDA Technical Report #26, Sterilizing Filtration of Liquids, describes all parameters and aspects required for the comprehensive validation of filters. The report is a most helpful tool for the validation of liquid filters used in the biopharmaceutical industry. It sets the cornerstones of validation requirements and other filtration considerations.

  19. Simplification and Validation of a Spectral-Tensor Model for Turbulence Including Atmospheric Stability

    NASA Astrophysics Data System (ADS)

    Chougule, Abhijit; Mann, Jakob; Kelly, Mark; Larsen, Gunner C.

    2018-06-01

    A spectral-tensor model of non-neutral, atmospheric-boundary-layer turbulence is evaluated using Eulerian statistics from single-point measurements of the wind speed and temperature at heights up to 100 m, assuming constant vertical gradients of mean wind speed and temperature. The model has been previously described in terms of the dissipation rate ε, the length scale of energy-containing eddies L, a turbulence anisotropy parameter Γ, the Richardson number Ri, and the normalized rate of destruction of temperature variance η_θ ≡ ε_θ/ε. Here, the latter two parameters are collapsed into a single atmospheric stability parameter z/L using Monin-Obukhov similarity theory, where z is the height above the Earth's surface, and L is the Obukhov length corresponding to Ri and η_θ. Model outputs of the one-dimensional velocity spectra, as well as cospectra of the streamwise and/or vertical velocity components, and/or temperature, and cross-spectra for the spatial separation of all three velocity components and temperature, are compared with measurements. As a function of the four model parameters, spectra and cospectra are reproduced quite well, but horizontal temperature fluxes are slightly underestimated in stable conditions. In moderately unstable stratification, our model reproduces spectra only up to a scale of ~1 km. The model also overestimates coherences for vertical separations, but less severely in unstable than in stable cases.

  20. Prognostic value of exercise echocardiography: validation of a new risk index combining echocardiographic, treadmill, and exercise electrocardiographic parameters.

    PubMed

    Mazur, Wojciech; Rivera, Jose M; Khoury, Alexander F; Basu, Abhijeet G; Perez-Verdia, Alejandro; Marks, Gary F; Chang, Su Min; Olmos, Leopoldo; Quiñones, Miguel A; Zoghbi, William A

    2003-04-01

    Exercise (Ex) echocardiography has been shown to have significant prognostic power, independent of other known predictors of risk from an Ex stress test. The purpose of this study was to evaluate a risk index, incorporating echocardiographic and conventional Ex variables, for a more comprehensive risk stratification and identification of a very low-risk group. Two consecutive, mutually exclusive populations referred for treadmill Ex echocardiography with the Bruce protocol were investigated: hypothesis-generating (388 patients; 268 males; age 55 +/- 13 years) and hypothesis-testing (105 patients; 61 males; age 54 +/- 14 years). Cardiac events included cardiac death, myocardial infarction, late revascularization (>90 days), hospital admission for unstable angina, and admission for heart failure. Mean follow-up in the hypothesis-generating population was 3.1 years. There were 38 cardiac events. Independent predictors of events by multivariate analysis were: Ex wall motion score index (odds ratio [OR] = 2.77/Unit; P <.001); ischemic S-T depression > or = 1 mm (OR = 2.84; P =.002); and treadmill time (OR = 0.87/min; P =.037). A risk index was generated on the basis of the multivariate Cox regression model as: risk index = 1.02 (Ex wall motion score index) + 1.04 (S-T change) - 0.14 (treadmill time). The validity of this index was tested in the hypothesis-testing population. Event rates at 3 years were lowest (0%) in the lower quartile of risk index (-1.22 to -0.47), highest (29.6%) in the upper quartile (+0.66 to +2.02), and intermediate (19.2% to 15.3%) in the intermediate quartiles. The OR of the risk index for predicting cardiac events was 2.94/Unit ([95% confidence interval: 1.4 to 6.2]; P =.0043). Echocardiographic and Ex parameters are independent powerful predictors of cardiac events after treadmill stress testing. A risk index can be derived with these parameters for a more comprehensive risk stratification with Ex echocardiography.
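
    Because the abstract states the risk index explicitly, it can be computed directly; the quartile cut-points below are those reported for the hypothesis-generating population, and applying them to new patients is an assumption of this sketch.

    ```python
    # Risk index from the abstract:
    #   1.02 * (Ex wall motion score index) + 1.04 * (ST change)
    #   - 0.14 * (treadmill time, minutes)
    # where ST change = 1 if ischemic ST depression >= 1 mm, else 0.
    def risk_index(ex_wmsi, st_depression_ge_1mm, treadmill_time_min):
        return (1.02 * ex_wmsi
                + 1.04 * (1 if st_depression_ge_1mm else 0)
                - 0.14 * treadmill_time_min)

    def risk_group(ri):
        # Quartile bounds reported in the abstract: lower quartile -1.22 to
        # -0.47 (0% 3-year events), upper quartile +0.66 to +2.02 (29.6%).
        if ri <= -0.47:
            return "low risk"
        if ri >= 0.66:
            return "high risk"
        return "intermediate risk (15.3-19.2% 3-year events)"

    ri = risk_index(ex_wmsi=1.1, st_depression_ge_1mm=False, treadmill_time_min=9.0)
    print(round(ri, 2), "->", risk_group(ri))
    ```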

  1. Empirical flow parameters : a tool for hydraulic model validity

    USGS Publications Warehouse

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

    The objectives of this project were (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1 and to produce empirical distributions of the various flow parameters to provide a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, to provide a secondary way to compare such values to a conventional hydraulic modeling approach; and (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.

  2. On the soft supersymmetry-breaking parameters in gauge-mediated models

    NASA Astrophysics Data System (ADS)

    Wagner, C. E. M.

    1998-09-01

    Gauge mediation of supersymmetry breaking in the observable sector is an attractive idea, which naturally alleviates the flavor changing neutral current problem of supersymmetric theories. Quite generally, however, the number and quantum numbers of the messengers are not known; nor is their characteristic mass scale determined by the theory. Using the recently proposed method to extract supersymmetry-breaking parameters from wave-function renormalization, we derive general formulae for the soft supersymmetry-breaking parameters in the observable sector, valid in the small and moderate tan β regimes, for the case of split messengers. The full leading-order effects of top Yukawa and gauge couplings on the soft supersymmetry-breaking parameters are included. We give a simple interpretation of the general formulae in terms of the renormalization group evolution of the soft supersymmetry-breaking parameters. As a by-product of this analysis, the one-loop renormalization group evolution of the soft supersymmetry-breaking parameters is obtained for arbitrary boundary conditions of the scalar and gaugino mass parameters at high energies.

  3. Characteristic parameters of superconductor-coolant interaction including high Tc current density limits

    NASA Technical Reports Server (NTRS)

    Frederking, T. H. K.

    1989-01-01

    In the area of basic mechanisms of helium heat transfer and their related influence on superconducting magnet stability, thermal boundary conditions are important constraints. Characteristic lengths are considered along with other parameters of the superconducting composite-coolant system. Based on helium temperature range developments, limiting critical current densities are assessed at low fields for high transition temperature superconductors.

  4. Changes in bone mineral metabolism parameters, including FGF23, after discontinuing cinacalcet at kidney transplantation.

    PubMed

    Barros, Xoana; Fuster, David; Paschoalin, Raphael; Oppenheimer, Federico; Rubello, Domenico; Perlaza, Pilar; Pons, Francesca; Torregrosa, Jose V

    2015-05-01

    Little is known about the effects of the administration of cinacalcet in patients on dialysis who are scheduled for kidney transplantation, and in particular about the changes in FGF23 and other mineral metabolism parameters after surgery compared with recipients not on cinacalcet at kidney transplantation. We performed a prospective observational cohort study with recruitment of consecutive kidney transplant recipients at our institution. Patients were classified according to whether they were under treatment with cinacalcet before transplantation. Bone mineral metabolism parameters, including C-terminal FGF23, were measured at baseline, on day 15, and at 1, 3, and 6 months after transplantation. In previously cinacalcet-treated patients, cinacalcet therapy was discontinued on the day of surgery and was not restarted after transplantation. A total of 48 kidney transplant recipients, 20 on cinacalcet at surgery and 28 not treated with cinacalcet, completed the follow-up. Serum phosphate declined significantly in the first 15 days after transplantation with no differences between the two groups, whereas cinacalcet-treated patients showed higher FGF23 levels, although not significantly so. After transplantation, PTH and serum calcium were significantly higher in cinacalcet-treated patients. We conclude that patients receiving cinacalcet on dialysis presented similar serum phosphate levels but higher PTH and serum calcium levels during the initial six months after kidney transplantation than patients not treated with cinacalcet. The group treated with cinacalcet before transplantation showed higher FGF23 levels without significant differences, so further studies should investigate the relevance of this in the management of these patients.

  5. Validation of the i-STAT and HemoCue systems for the analysis of blood parameters in the bar-headed goose, Anser indicus

    PubMed Central

    Harter, T. S.; Reichert, M.; Brauner, C. J.; Milsom, W. K.

    2015-01-01

    Every year, bar-headed geese (Anser indicus) perform some of the most remarkable trans-Himalayan migrations, and researchers are increasingly interested in understanding the physiology underlying their high-altitude flight performance. A major challenge is generating reliable measurements of blood parameters on wild birds in the field, where established analytical techniques are often not available. Therefore, we validated two commonly used portable clinical analysers (PCAs), the i-STAT and the HemoCue systems, for the analysis of blood parameters in bar-headed geese. The pH, partial pressures of O2 and CO2 (PO2 and PCO2), haemoglobin O2 saturation (sO2), haematocrit (Hct) and haemoglobin concentration [Hb] were simultaneously measured with the two PCA systems (i-STAT for all parameters; HemoCue for [Hb]) and with conventional laboratory techniques over a physiological range of PO2, PCO2 and Hct. Our results indicate that the i-STAT system can generate reliable values on bar-headed goose whole blood pH, PO2, PCO2 and Hct, but we recommend correcting the obtained values using the linear equations determined here for higher accuracy. The i-STAT is probably not able to produce meaningful measurements of sO2 and [Hb] over a range of physiologically relevant environmental conditions. However, we can recommend the use of the HemoCue to measure [Hb] in the bar-headed goose, if results are corrected. We emphasize that the equations that we provide to correct PCA results are applicable only to bar-headed goose whole blood under the conditions that we tested. We encourage researchers to validate i-STAT or HemoCue results thoroughly for their specific study conditions and species in order to yield accurate results. PMID:27293706

  6. Challenges in Rotorcraft Acoustic Flight Prediction and Validation

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.

    2003-01-01

    Challenges associated with rotorcraft acoustic flight prediction and validation are examined. First, an outline of a state-of-the-art rotorcraft aeroacoustic prediction methodology is presented. Components including rotorcraft aeromechanics, high resolution reconstruction, and rotorcraft acoustic prediction are discussed. Next, to illustrate the challenges and issues involved, a case study is presented in which an analysis of flight data from a specific XV-15 tiltrotor acoustic flight test is discussed in detail. Issues related to validation of methodologies using flight test data are discussed. Primary flight parameters such as velocity, altitude, and attitude are discussed and compared for repeated flight conditions. Other measured steady-state flight conditions are examined for consistency and steadiness. A representative example prediction is presented and suggestions are made for future research.

  7. [Validation and verification of microbiology methods].

    PubMed

    Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción

    2015-01-01

    Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory that performs them. In this sense, the use of recognized and accepted reference methods is the most effective tool to provide these guarantees. Activities related to the verification and validation of analytical methods have become very important, given the continuous development and updating of techniques, increasingly complex analytical equipment, and the interest of professionals in ensuring quality processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in Microbiology. The importance of promoting the use of reference strains as controls in Microbiology and the use of standard controls is stressed, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. The emphasis is on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the SEIMC microbiological procedure number 48: «Validation and verification of microbiological methods» (www.seimc.org/protocols/microbiology). Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  8. Validation of simulated earthquake ground motions based on evolution of intensity and frequency content

    USGS Publications Warehouse

    Rezaeian, Sanaz; Zhong, Peng; Hartzell, Stephen; Zareian, Farzin

    2015-01-01

    Simulated earthquake ground motions can be used in many recent engineering applications that require time series as input excitations. However, the applicability and validation of simulations are subjects of debate in the seismological and engineering communities. We propose a validation methodology at the waveform level, directly based on characteristics that are expected to influence most structural and geotechnical response parameters. In particular, three time-dependent validation metrics are used to evaluate the evolving intensity, frequency, and bandwidth of a waveform. These validation metrics capture nonstationarities in the intensity and frequency content of waveforms, making them well suited to address the nonlinear response of structural systems. A two-component error vector is proposed to quantify the average and shape differences between these validation metrics for a simulated and recorded ground-motion pair. Because these metrics are directly related to the waveform characteristics, they provide easily interpretable feedback to seismologists for modifying their ground-motion simulation models. To further simplify the use and interpretation of these metrics for engineers, it is shown how six scalar key parameters, including duration, intensity, and predominant frequency, can be extracted from the validation metrics. The proposed validation methodology is a step forward in paving the road for the utilization of simulated ground motions in engineering practice and is demonstrated using examples of recorded and simulated ground motions from the 1994 Northridge, California, earthquake.
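
    As one concrete instance of an evolving-intensity metric (the abstract names the metric family but not its formula, so the Husid/Arias-type cumulative intensity below is an assumption), a recorded and a simulated record can be compared through their normalized cumulative squared acceleration.

    ```python
    # Sketch: evolving intensity as normalized cumulative squared acceleration,
    # compared between a recorded and a simulated waveform. The paper's exact
    # metrics and two-component error vector are not reproduced here.
    import numpy as np

    def evolving_intensity(accel, dt):
        """Normalized cumulative integral of a(t)^2 (0 at start, 1 at end)."""
        cum = np.cumsum(accel**2) * dt
        return cum / cum[-1]

    dt = 0.01
    t = np.arange(0.0, 20.0, dt)
    recorded = np.exp(-0.3 * (t - 5.0)**2) * np.sin(2 * np.pi * 2.0 * t)   # toy
    simulated = np.exp(-0.3 * (t - 6.0)**2) * np.sin(2 * np.pi * 2.2 * t)  # toy

    mismatch = np.sum(np.abs(evolving_intensity(recorded, dt)
                             - evolving_intensity(simulated, dt))) * dt
    print("intensity-evolution mismatch:", round(float(mismatch), 3))
    ```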

  9. Phase Diagrams and the Non-Linear Dielectric Constant in the Landau-Type Potential Including the Linear-Quadratic Coupling between Order Parameters

    NASA Astrophysics Data System (ADS)

    Iwata, Makoto; Orihara, Hiroshi; Ishibashi, Yoshihiro

    1997-04-01

    The phase diagrams in the Landau-type thermodynamic potential including the linear-quadratic coupling between order parameters p and q, i.e., qp^2, which is applicable to the phase transitions in benzil and phospholipid bilayers and the isotropic-nematic phase transition in liquid crystals, are studied. It was found that the phase diagram in the extreme case has one tricritical point c1, one critical end point e1, and two triple points t1 and t2. The linear and nonlinear dielectric constants in the potential are discussed for the case in which the order parameter p is the polarization.
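
    A minimal form of such a potential (an assumed standard form; the paper's exact coefficients and any higher-order terms are not given in the abstract) is:

    ```latex
    % Landau-type potential with the linear-quadratic coupling q p^2 between
    % order parameters p and q; alpha, beta, a, b, gamma are phenomenological
    % coefficients. Assumed standard form, not quoted from the paper.
    F(p, q) = \frac{\alpha}{2} p^{2} + \frac{\beta}{4} p^{4}
            + \frac{a}{2} q^{2} + \frac{b}{4} q^{4}
            + \gamma \, q \, p^{2}
    ```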

  10. Normal- and oblique-shock flow parameters in equilibrium air including attached-shock solutions for surfaces at angles of attack, sweep, and dihedral

    NASA Technical Reports Server (NTRS)

    Hunt, J. L.; Souders, S. W.

    1975-01-01

    Normal- and oblique-shock flow parameters for air in thermochemical equilibrium are tabulated as a function of shock angle for altitudes ranging from 15.24 km to 91.44 km in increments of 7.62 km at selected hypersonic speeds. Post-shock parameters tabulated include flow-deflection angle, velocity, Mach number, compressibility factor, isentropic exponent, viscosity, Reynolds number, entropy difference, and static pressure, temperature, density, and enthalpy ratios across the shock. A procedure is presented for obtaining oblique-shock flow properties in equilibrium air on surfaces at various angles of attack, sweep, and dihedral by use of the two-dimensional tabulations. Plots of the flow parameters against flow-deflection angle are presented at altitudes of 30.48, 60.96, and 91.44 km for various stream velocities.

  11. Predicting distant failure in early stage NSCLC treated with SBRT using clinical parameters.

    PubMed

    Zhou, Zhiguo; Folkert, Michael; Cannon, Nathan; Iyengar, Puneeth; Westover, Kenneth; Zhang, Yuanyuan; Choy, Hak; Timmerman, Robert; Yan, Jingsheng; Xie, Xian-J; Jiang, Steve; Wang, Jing

    2016-06-01

    The aim of this study was to predict early distant failure in early stage non-small cell lung cancer (NSCLC) treated with stereotactic body radiation therapy (SBRT) using clinical parameters and machine learning algorithms. The dataset used in this work includes 81 early stage NSCLC patients with at least 6 months of follow-up who underwent SBRT between 2006 and 2012 at a single institution. The clinical parameters (n=18) for each patient include demographic parameters, tumor characteristics, treatment fraction schemes, and pretreatment medications. Three predictive models were constructed based on different machine learning algorithms: (1) artificial neural network (ANN), (2) logistic regression (LR) and (3) support vector machine (SVM). Furthermore, to select an optimal clinical parameter set for model construction, three strategies were adopted: (1) a clonal selection algorithm (CSA) based selection strategy; (2) the sequential forward selection (SFS) method; and (3) a statistical analysis (SA) based strategy. Five-fold cross-validation was used to validate the performance of each predictive model. Accuracy was assessed by the area under the receiver operating characteristic (ROC) curve (AUC); the sensitivity and specificity of the system were also evaluated. The AUCs for ANN, LR and SVM were 0.75, 0.73, and 0.80, respectively. The sensitivity values for ANN, LR and SVM were 71.2%, 72.9% and 83.1%, while the specificity values for ANN, LR and SVM were 59.1%, 63.6% and 63.6%, respectively. Meanwhile, the CSA based strategy outperformed SFS and SA in terms of AUC, sensitivity and specificity. Based on clinical parameters, the SVM with the CSA optimal parameter set selection strategy achieves better performance than other strategies for predicting distant failure in lung SBRT patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
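
    A minimal sketch of the skeleton of the best-performing configuration (an SVM scored by five-fold cross-validated AUC); the data are synthetic stand-ins, and the clonal selection feature-selection step is omitted.

    ```python
    # Sketch: SVM classifier for distant-failure prediction, scored by 5-fold
    # cross-validated AUC. Synthetic data; the paper's CSA feature selection
    # is not implemented here.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    X = rng.standard_normal((81, 18))   # 81 patients, 18 clinical parameters
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.standard_normal(81) > 0.8).astype(int)

    svm = make_pipeline(StandardScaler(), SVC())
    auc = cross_val_score(svm, X, y, cv=5, scoring="roc_auc")
    print("5-fold AUC:", round(auc.mean(), 2))  # the study reported 0.80 for SVM
    ```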

  12. Kinetic modelling of anaerobic hydrolysis of solid wastes, including disintegration processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    García-Gen, Santiago; Sousbie, Philippe; Rangaraj, Ganesh

    2015-01-15

    Highlights: • Fractionation of solid wastes into readily and slowly biodegradable fractions. • Kinetic coefficients estimation from mono-digestion batch assays. • Validation of kinetic coefficients with a co-digestion continuous experiment. • Simulation of batch and continuous experiments with an ADM1-based model. - Abstract: A methodology to estimate disintegration and hydrolysis kinetic parameters of solid wastes and validate an ADM1-based anaerobic co-digestion model is presented. Kinetic parameters of the model were calibrated from batch reactor experiments treating individually fruit and vegetable wastes (among other residues) following a new protocol for batch tests. In addition, decoupled disintegration kinetics for readily and slowly biodegradable fractions of solid wastes was considered. Calibrated parameters from batch assays of individual substrates were used to validate the model for a semi-continuous co-digestion operation treating simultaneously 5 fruit and vegetable wastes. The semi-continuous experiment was carried out in a lab-scale CSTR reactor for 15 weeks at an organic loading rate ranging between 2.0 and 4.7 g VS/L d. The model (built in Matlab/Simulink) fit the experimental results to a large extent in both batch and semi-continuous mode and served as a powerful tool to simulate the digestion or co-digestion of solid wastes.

  13. Reliability and validity of a smartphone-based assessment of gait parameters across walking speed and smartphone locations: Body, bag, belt, hand, and pocket.

    PubMed

    Silsupadol, Patima; Teja, Kunlanan; Lugade, Vipul

    2017-10-01

    The assessment of spatiotemporal gait parameters is a useful clinical indicator of health status. Unfortunately, most assessment tools require controlled laboratory environments which can be expensive and time consuming. As smartphones with embedded sensors are becoming ubiquitous, this technology can provide a cost-effective, easily deployable method for assessing gait. Therefore, the purpose of this study was to assess the reliability and validity of a smartphone-based accelerometer in quantifying spatiotemporal gait parameters when attached to the body or in a bag, belt, hand, and pocket. Thirty-four healthy adults were asked to walk at self-selected comfortable, slow, and fast speeds over a 10-m walkway while carrying a smartphone. Step length, step time, gait velocity, and cadence were computed from smartphone-based accelerometers and validated with GAITRite. Across all walking speeds, smartphone data had excellent reliability (ICC(2,1) ≥ 0.90) for the body and belt locations, with bag, hand, and pocket locations having good to excellent reliability (ICC(2,1) ≥ 0.69). Correlations between the smartphone-based and GAITRite-based systems were very high for the body (r=0.89, 0.98, 0.96, and 0.87 for step length, step time, gait velocity, and cadence, respectively). Similarly, Bland-Altman analysis demonstrated that the bias approached zero, particularly in the body, bag, and belt conditions under comfortable and fast speeds. Thus, smartphone-based assessments of gait are most valid when placed on the body, in a bag, or on a belt. The use of a smartphone to assess gait can provide relevant data to clinicians without encumbering the user and allow for data collection in the free-living environment. Copyright © 2017 Elsevier B.V. All rights reserved.
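
    For orientation, step timing is typically extracted from smartphone acceleration by peak detection on the (roughly periodic) gait signal; the synthetic signal and thresholds below are assumptions, not the authors' published pipeline.

    ```python
    # Sketch: step-time and cadence estimation from accelerometer data via
    # peak detection. The published algorithm is more elaborate; this shows
    # the core idea only.
    import numpy as np
    from scipy.signal import find_peaks

    fs = 100.0                          # assumed sampling rate (Hz)
    t = np.arange(0.0, 10.0, 1.0 / fs)
    rng = np.random.default_rng(3)
    accel = 9.81 + np.sin(2 * np.pi * 1.8 * t) + 0.1 * rng.standard_normal(t.size)

    peaks, _ = find_peaks(accel, height=10.0, distance=int(0.4 * fs))
    step_times = np.diff(t[peaks])      # seconds between successive steps
    print("mean step time: %.2f s, cadence: %.0f steps/min"
          % (step_times.mean(), 60.0 / step_times.mean()))
    ```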

  14. Two methods for modeling vibrations of planetary gearboxes including faults: Comparison and validation

    NASA Astrophysics Data System (ADS)

    Parra, J.; Vicuña, Cristián Molina

    2017-08-01

    Planetary gearboxes are important components of many industrial applications. Vibration analysis can extend their lifetime and prevent expensive repairs and safety concerns. However, an effective analysis is only possible if the vibration features of planetary gearboxes are properly understood. In this paper, models are used to study the frequency content of planetary gearbox vibrations under non-fault and different fault conditions. Two different models are considered: a phenomenological model, which is an analytical-mathematical formulation based on observation, and a lumped-parameter model, which is based on the solution of the equations of motion of the system. Results of both models are not directly comparable, because the phenomenological model provides the vibration in a fixed radial direction, such as the measurements of a vibration sensor mounted on the outer part of the ring gear. On the other hand, the lumped-parameter model provides the vibrations in a rotating reference frame fixed to the carrier. To overcome this situation, a function to decompose the lumped-parameter model solutions to a fixed reference frame is presented. Finally, comparisons of results from both model perspectives and experimental measurements are presented.

  15. A Formal Approach to Empirical Dynamic Model Optimization and Validation

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.

  16. A combined QC methodology in Ebro Delta HF radar system: real time web monitoring of diagnostic parameters and offline validation of current data

    NASA Astrophysics Data System (ADS)

    Lorente, Pablo; Piedracoba, Silvia; Soto-Navarro, Javier; Ruiz, Maria Isabel; Alvarez Fanjul, Enrique

    2015-04-01

    Over recent years, special attention has been focused on the development of protocols for near real-time quality control (QC) of HF radar derived current measurements. However, no standardized QC methodology has been agreed worldwide to date, although a number of valuable international initiatives have been launched. In this context, Puertos del Estado (PdE) aims to implement a fully operational HF radar network with four different Codar SeaSonde HF radar systems by means of: - The development of a best-practices robust protocol for data processing and QC procedures to routinely monitor site performance under a wide variety of ocean conditions. - The execution of validation work with in-situ observations to assess the accuracy of HF radar-derived current measurements. The main goal of the present work is to show this combined methodology for the specific case of the Ebro HF radar (although it is easily expandable to the rest of the PdE radar systems), deployed to manage the Ebro River deltaic area and promote the conservation of an important aquatic ecosystem exposed to severe erosion and reshaping. To this aim, a web interface has been developed to efficiently monitor in real time the evolution of several diagnostic parameters provided by the manufacturer (CODAR) and used as indicators of HF radar system health. This web interface, updated automatically every hour, examines site performance on different time bases in terms of: - Hardware parameters: power and temperature. - Radial parameters, among others: Signal-to-Noise Ratio (SNR), number of radial vectors provided per time step, maximum radial range and bearing. - Total uncertainty metrics provided by CODAR: zonal and meridional standard deviations and the covariance between both components. - Additionally, a widget embedded in the web interface executes queries against the PdE database, providing the chance to compare current time series observed by the Tarragona buoy (located within the Ebro HF radar spatial domain) and

  17. Real-time Retrieving Atmospheric Parameters from Multi-GNSS Constellations

    NASA Astrophysics Data System (ADS)

    Li, X.; Zus, F.; Lu, C.; Dick, G.; Ge, M.; Wickert, J.; Schuh, H.

    2016-12-01

    Multi-constellation GNSS (e.g., GPS, GLONASS, Galileo, and BeiDou) brings great opportunities and challenges for the real-time retrieval of atmospheric parameters in support of numerical weather prediction (NWP) nowcasting and severe weather event monitoring. In this study, the observations from different GNSS are combined for atmospheric parameter retrieval based on the real-time precise point positioning technique. The atmospheric parameters retrieved from multi-GNSS observations, including zenith total delay (ZTD), integrated water vapor (IWV), horizontal gradients (especially high-resolution gradient estimates) and slant total delays (STD), are carefully analyzed and evaluated against VLBI, radiosonde, water vapor radiometer and numerical weather model data to independently validate the performance of individual GNSS and also demonstrate the benefits of multi-constellation GNSS for real-time atmospheric monitoring. Numerous results show that multi-GNSS processing can provide real-time atmospheric products with higher accuracy, stronger reliability and better distribution, which would be beneficial for atmospheric sounding systems, especially for the nowcasting of extreme weather.
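
    For reference, ZTD is linked to IWV through the zenith wet delay and a conversion factor that depends on the weighted mean temperature of the atmosphere; the Bevis-style constants below are commonly used values, assumed here rather than taken from this study.

    ```python
    # Sketch: converting a GNSS zenith total delay (ZTD) to integrated water
    # vapor (IWV): ZWD = ZTD - ZHD, IWV = Pi * ZWD, with Pi ~ 0.15 for typical
    # mean temperatures. Constants are widely used values, for illustration.
    RHO_W = 1000.0   # density of liquid water, kg/m^3
    R_V = 461.5      # specific gas constant of water vapor, J/(kg K)
    K2P = 0.221      # refractivity constant k2', K/Pa
    K3 = 3.739e3     # refractivity constant k3, K^2/Pa

    def iwv_from_ztd(ztd_m, zhd_m, tm_kelvin):
        """IWV in kg/m^2 from zenith delays (m) and mean temperature Tm (K)."""
        zwd_mm = (ztd_m - zhd_m) * 1000.0
        pi_factor = 1.0e6 / (RHO_W * R_V * (K3 / tm_kelvin + K2P))  # ~0.15
        return pi_factor * zwd_mm        # 1 mm precipitable water = 1 kg/m^2

    # Example: ZTD 2.45 m with hydrostatic part 2.28 m at Tm = 270 K
    print(round(iwv_from_ztd(2.45, 2.28, 270.0), 1), "kg/m^2")
    ```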

  18. Validation of lumbar spine loading from a musculoskeletal model including the lower limbs and lumbar spine.

    PubMed

    Actis, Jason A; Honegger, Jasmin D; Gates, Deanna H; Petrella, Anthony J; Nolasco, Luis A; Silverman, Anne K

    2018-02-08

    Low back mechanics are important to quantify to study injury, pain and disability. As in vivo forces are difficult to measure directly, modeling approaches are commonly used to estimate these forces. Validation of model estimates is critical to gain confidence in modeling results across populations of interest, such as people with lower-limb amputation. Motion capture, ground reaction force and electromyographic data were collected from ten participants without an amputation (five male/five female) and five participants with a unilateral transtibial amputation (four male/one female) during trunk-pelvis range of motion trials in flexion/extension, lateral bending and axial rotation. A musculoskeletal model with a detailed lumbar spine and the legs including 294 muscles was used to predict L4-L5 loading and muscle activations using static optimization. Model estimates of L4-L5 intervertebral joint loading were compared to measured intradiscal pressures from the literature and muscle activations were compared to electromyographic signals. Model loading estimates were only significantly different from experimental measurements during trunk extension for males without an amputation and for people with an amputation, which may suggest a greater portion of L4-L5 axial load transfer through the facet joints, as facet loads are not captured by intradiscal pressure transducers. Pressure estimates between the model and previous work were not significantly different for flexion, lateral bending or axial rotation. Timing of model-estimated muscle activations compared well with electromyographic activity of the lumbar paraspinals and upper erector spinae. Validated estimates of low back loading can increase the applicability of musculoskeletal models to clinical diagnosis and treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.
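
    The static optimization step used above conventionally solves, at each time frame, a muscle-redundancy problem of the following standard form (assumed here; the study may use different exponents or constraints):

    ```latex
    % Standard static optimization: choose muscle activations a_i minimizing
    % the summed activations raised to a power p (often p = 2), subject to
    % reproducing the net joint moments M_j through moment arms r_ij and
    % maximal isometric forces F_i^max. Assumed textbook formulation.
    \min_{a_1,\dots,a_m} \; \sum_{i=1}^{m} a_i^{\,p}
    \quad \text{s.t.} \quad
    \sum_{i=1}^{m} a_i \, F_i^{\max} \, r_{ij} = M_j \;\; \forall j,
    \qquad 0 \le a_i \le 1 .
    ```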

  19. Web based tools for data manipulation, visualisation and validation with interactive georeferenced graphs

    NASA Astrophysics Data System (ADS)

    Ivankovic, D.; Dadic, V.

    2009-04-01

    Some oceanographic parameters have to be inserted into the database manually; others (for example, data from a CTD probe) are inserted from various files. All these parameters require visualization, validation and manipulation from the research vessel or scientific institution, as well as public presentation. For these purposes, a web-based system has been developed, containing dynamic SQL procedures and Java applets. The technology background is an Oracle 10g relational database and Oracle Application Server. Web interfaces are developed using PL/SQL stored database procedures (mod_plsql). Additional parts for data visualization include Java applets and JavaScript. The mapping tool is the Google Maps API (JavaScript), with a Java applet as an alternative. The graph is realized as a dynamically generated web page containing a Java applet. Mapping tool and graph are georeferenced: a click on some part of the graph automatically initiates a zoom or marker at the location where the parameter was measured. This feature is very useful for data validation. The code for data manipulation and visualization is partially realized with dynamic SQL, which allows us to separate the data definition from the code for data manipulation. Adding a new parameter to the system requires only data definition and description, without programming an interface for this kind of data.

  20. Validity of the Kinect for Gait Assessment: A Focused Review

    PubMed Central

    Springer, Shmuel; Yogev Seligmann, Galit

    2016-01-01

    Gait analysis may enhance clinical practice. However, its use is limited due to the need for expensive equipment which is not always available in clinical settings. Recent evidence suggests that the Microsoft Kinect may provide a low-cost gait analysis method. The purpose of this report is to critically evaluate the literature describing the concurrent validity of using the Kinect as a gait analysis instrument. An online search of PubMed, CINAHL, and ProQuest databases was performed. Included were studies in which walking was assessed with the Kinect and another gold-standard device, and which reported at least one numerical finding of spatiotemporal or kinematic measures. Our search identified 366 papers, from which 12 relevant studies were retrieved. The results demonstrate that the Kinect is valid only for some spatiotemporal gait parameters. Although the kinematic parameters measured by the Kinect followed the trend of the joint trajectories, they showed poor validity and large errors. In conclusion, the Kinect may have the potential to be used as a tool for measuring spatiotemporal aspects of gait, yet standardized methods should be established, and future examinations with both healthy subjects and clinical participants are required in order to integrate the Kinect as a clinical gait analysis tool. PMID:26861323

  1. Validation of hsp70 stress gene expression as a marker of metal effects in Deroceras reticulatum (Pulmonata): Correlation with demographic parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koehler, H.R.; Eckwert, H.; Rahman, B.

    1998-11-01

    The presence of a stress gene comprising a motif homologous to the hsp70 consensus sequence was proven for the grey garden slug, Deroceras reticulatum (Mueller). The induction of stress gene transcription (including mRNA stability) and the accumulation of the corresponding stress protein, Hsp70, were quantified in slugs exposed to cadmium- or zinc-enriched food for 2 to 3 weeks. To validate the suitability of these two aspects of the cellular stress response to act as early-warning markers for metal effects on life-history parameters, the fecundity, offspring number, longevity, and mortality of slugs were recorded in life-cycle experiments. Quantitative reverse transcription-polymerase chain reaction and a standardized immunoblotting technique revealed a higher sensitivity of changes in hsp70 transcription than of stress protein accumulation in response to both metals. The elevation of the hsp70-mRNA level caused by short-term (14 d) metal exposure coincided with both diminished fecundity and reduced offspring production due to chronic metal exposure in terms of threshold concentrations for cadmium effects. As well, accumulation of Hsp70 after 3 weeks of exposure can be considered an early-warning signal for increased mortality when cadmium or zinc exposure continues throughout the entire lifetime of the slugs.

  2. Targeted estimation of nuisance parameters to obtain valid statistical inference.

    PubMed

    van der Laan, Mark J

    2014-01-01

    In order to obtain concrete results, we focus on estimation of the treatment-specific mean, controlling for all measured baseline covariates, based on observing independent and identically distributed copies of a random variable consisting of baseline covariates, a subsequently assigned binary treatment, and a final outcome. The statistical model only assumes possible restrictions on the conditional distribution of treatment, given the covariates, the so-called propensity score. Estimators of the treatment-specific mean involve estimation of the propensity score and/or estimation of the conditional mean of the outcome, given the treatment and covariates. In order to make these estimators asymptotically unbiased at any data distribution in the statistical model, it is essential to use data-adaptive estimators of these nuisance parameters such as ensemble learning, and specifically super-learning. Because such estimators involve an optimal trade-off of bias and variance with respect to the infinite-dimensional nuisance parameter itself, they result in a sub-optimal bias/variance trade-off for the resulting real-valued estimator of the estimand. We demonstrate that additional targeting of the estimators of these nuisance parameters guarantees that this bias for the estimand is second order and thereby allows us to prove theorems that establish asymptotic linearity of the estimator of the treatment-specific mean under regularity conditions. These insights result in novel targeted minimum loss-based estimators (TMLEs) that use ensemble learning with additional targeted bias reduction to construct estimators of the nuisance parameters. In particular, we construct collaborative TMLEs (C-TMLEs) with known influence curve allowing for statistical inference, even though these C-TMLEs involve variable selection for the propensity score based on a criterion that measures how effective the resulting fit of the propensity score is in removing bias for the estimand. As a particular special
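
    To make the role of the two nuisance parameters concrete, here is a minimal doubly robust (AIPW-type) estimator of the treatment-specific mean. It is a simpler relative of the TMLE/C-TMLE machinery described above, not the paper's estimator, and the nuisance fits are plain parametric regressions rather than super-learning.

    ```python
    # Sketch: augmented inverse-probability-weighted (AIPW) estimator of the
    # treatment-specific mean E[Y(1)], combining an outcome regression and a
    # propensity score. Doubly robust, but NOT the paper's (C-)TMLE.
    import numpy as np
    from sklearn.linear_model import LinearRegression, LogisticRegression

    rng = np.random.default_rng(4)
    n = 2000
    W = rng.standard_normal((n, 3))                       # baseline covariates
    A = rng.binomial(1, 1.0 / (1.0 + np.exp(-W[:, 0])))   # binary treatment
    Y = 2.0 + A + W[:, 1] + rng.standard_normal(n)        # outcome; E[Y(1)] = 3

    g_hat = LogisticRegression().fit(W, A).predict_proba(W)[:, 1]  # propensity
    Q_fit = LinearRegression().fit(np.c_[A, W], Y)                 # outcome fit
    Q1 = Q_fit.predict(np.c_[np.ones(n), W])                       # E[Y | A=1, W]

    aipw = np.mean(Q1 + A / g_hat * (Y - Q1))
    print("AIPW estimate of E[Y(1)]:", round(float(aipw), 3))  # ~3 in this toy
    ```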

  3. AIRS Retrieval Validation During the EAQUATE

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L.; Cuomo, Vincenzo; Taylor, Jonathan P.; Barnet, Christopher D.; DiGirolamo, Paolo; Pappalardo, Gelsomina; Larar, Allen M.; Liu, Xu; Newman, Stuart M.

    2006-01-01

    Atmospheric and surface thermodynamic parameters retrieved with advanced hyperspectral remote sensors on Earth-observing satellites are critical for weather prediction and scientific research. The retrieval algorithms and retrieved parameters from satellite sounders must be validated to demonstrate the capability and accuracy of both observation and data processing systems. The European AQUA Thermodynamic Experiment (EAQUATE) was conducted mainly for validation of the Atmospheric InfraRed Sounder (AIRS) on the AQUA satellite, but also for assessment of validation systems of both ground-based and aircraft-based instruments, which will be used for other satellite systems such as the Infrared Atmospheric Sounding Interferometer (IASI) on the European MetOp satellite and the Cross-track Infrared Sounder (CrIS) from the NPOESS Preparatory Project and the following NPOESS series of satellites. Detailed inter-comparisons were conducted and presented using different retrieval methodologies and independent measurements: airborne ultraspectral Fourier transform spectrometers, aircraft in-situ instruments, dedicated dropsondes and radiosondes, and ground-based Raman lidar, as well as modeled thermal structures from the European Centre for Medium-Range Weather Forecasts (ECMWF). The results of this study not only illustrate the quality of the measurements and retrieval products but also demonstrate the capability of these validation systems, which are put in place to validate current and future hyperspectral sounding instruments and their scientific products.

  4. Face and content validity of Xperience™ Team Trainer: bed-side assistant training simulator for robotic surgery.

    PubMed

    Sessa, Luca; Perrenot, Cyril; Xu, Song; Hubert, Jacques; Bresler, Laurent; Brunaud, Laurent; Perez, Manuela

    2018-03-01

    In robotic surgery, the coordination between the console-side surgeon and the bed-side assistant is crucial, more so than in standard surgery or laparoscopy, where the surgical team works in close contact. Xperience™ Team Trainer (XTT) is a new optional component for the dv-Trainer® platform that simulates the patient-side working environment. We present preliminary results on face validity, content validity, and the workload imposed by the XTT virtual reality platform for the psychomotor and communication skills training of the bed-side assistant in robot-assisted surgery. Participants were categorized into "Beginners" and "Experts". They tested a series of exercises (Pick & Place Laparoscopic Demo, Pick & Place 2 and Team Match Board 1) and completed face validity questionnaires. "Experts" assessed content validity on another questionnaire. All the participants completed a NASA Task Load Index questionnaire to assess the workload imposed by XTT. Twenty-one consenting participants were included (12 "Beginners" and 9 "Experts"). XTT was shown to possess face and content validity, as evidenced by the rankings given on the simulator's ease of use and realism and on its usefulness for training. Eight out of nine "Experts" judged the visualization of metrics after the exercises useful. However, face validity showed some weaknesses regarding the interactions and the instruments. Reasonable workload parameters were registered. XTT demonstrated excellent face and content validity with acceptable workload parameters. XTT could become a useful tool for robotic surgery team training.

  5. Validation of balance-quality assessment using a modified bathroom scale.

    PubMed

    Hewson, D J; Duchêne, J; Hogrel, J-Y

    2015-02-01

    The balance quality tester (BQT), based on a standard electronic bathroom scale, has been developed in order to assess balance quality. The BQT includes automatic detection of the person to be tested by means of an infrared detector and Bluetooth communication capability for remote assessment when linked to a long-distance communication device such as a mobile phone. The BQT was compared to a standard force plate for validity and agreement. The two most widely reported parameters in the balance literature, the area of the centre of pressure (COP) displacement and the velocity of the COP displacement, were compared for 12 subjects, each of whom was tested ten times on each of two days. No significant differences were observed between the BQT and the force plate for either of the two parameters. In addition, a high level of agreement was observed between the two devices. The BQT is a valid device for remote assessment of balance quality, and could provide a useful tool for long-term monitoring of people with balance problems, particularly during home monitoring.
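
    Both parameters named above follow directly from the sampled COP trajectory. A minimal sketch, assuming "area" means the 95% confidence ellipse of the trajectory (the abstract does not state which area definition the BQT uses):

    ```python
    # Sway parameters from a COP trajectory: mean path velocity and the
    # 95% confidence-ellipse area (one common definition of "COP area").
    import numpy as np

    def cop_parameters(cop_xy, fs):
        """cop_xy: (N, 2) COP positions [m]; fs: sampling rate [Hz]."""
        # Mean velocity: total path length divided by trial duration
        path = np.linalg.norm(np.diff(cop_xy, axis=0), axis=1).sum()
        velocity = path / (len(cop_xy) / fs)
        # Ellipse area from the trajectory covariance; 5.991 = chi2(0.95, df=2)
        l1, l2 = np.linalg.eigvalsh(np.cov(cop_xy.T))
        area = np.pi * 5.991 * np.sqrt(l1 * l2)
        return velocity, area
    ```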

  6. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
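
    For reference, the "efficient" discrimination statistics named above reduce to closed-form expressions once a model has been fitted by (weighted) least squares. The sketch below uses the standard Gaussian-likelihood definitions; it is illustrative and does not reproduce the study's exact weighting.

    ```python
    # Standard information criteria from a least-squares residual vector;
    # k = number of estimated parameters, Gaussian error model assumed.
    import numpy as np

    def information_criteria(residuals, k):
        n = len(residuals)
        loglik = -0.5 * n * (np.log(2 * np.pi * residuals.var()) + 1)
        aic = -2 * loglik + 2 * k
        aicc = aic + 2 * k * (k + 1) / (n - k - 1)  # small-sample correction
        bic = -2 * loglik + k * np.log(n)
        return aicc, bic
    ```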

  7. Validating Satellite-Retrieved Cloud Properties for Weather and Climate Applications

    NASA Astrophysics Data System (ADS)

    Minnis, P.; Bedka, K. M.; Smith, W., Jr.; Yost, C. R.; Bedka, S. T.; Palikonda, R.; Spangenberg, D.; Sun-Mack, S.; Trepte, Q.; Dong, X.; Xi, B.

    2014-12-01

    Cloud properties determined from satellite imager radiances are increasingly used in weather and climate applications, particularly in nowcasting, model assimilation and validation, trend monitoring, and precipitation and radiation analyses. The value of using the satellite-derived cloud parameters is determined by the accuracy of the particular parameter for a given set of conditions, such as viewing and illumination angles, surface background, and cloud type and structure. Because of the great variety of those conditions and of the sensors used to monitor clouds, determining the accuracy or uncertainties in the retrieved cloud parameters is a daunting task. Sensitivity studies of the retrieved parameters to the various inputs for a particular cloud type are helpful for understanding the errors associated with the retrieval algorithm relative to the plane-parallel world assumed in most of the model clouds that serve as the basis for the retrievals. Real-world clouds, however, rarely fit the plane-parallel mold and generate radiances that likely produce much greater errors in the retrieved parameters than can be inferred from sensitivity analyses. Thus, independent, empirical methods are used to provide a more reliable uncertainty analysis. At NASA Langley, cloud properties have been retrieved from both geostationary (GEO) and low-Earth-orbiting (LEO) satellite imagers for climate monitoring and model validation as part of the NASA CERES project since 2000, and from AVHRR data since 1978 as part of the NOAA CDR program. Cloud properties are also being retrieved in near-real time globally from both GEO and LEO satellites for weather model assimilation and nowcasting of hazards such as aircraft icing. This paper discusses the various independent datasets and approaches that are used to assess the imager-based satellite cloud retrievals, including, but not limited to, data from ARM sites, CloudSat, and CALIPSO. This paper discusses the use of the various

  8. Delineating parameter unidentifiabilities in complex models

    NASA Astrophysics Data System (ADS)

    Raman, Dhruva V.; Anderson, James; Papachristodoulou, Antonis

    2017-03-01

    Scientists use mathematical modeling as a tool for understanding and predicting the properties of complex physical systems. In highly parametrized models there often exist relationships between parameters over which model predictions are identical, or nearly identical. These are known as structural or practical unidentifiabilities, respectively. They are hard to diagnose and make reliable parameter estimation from data impossible. They furthermore imply the existence of an underlying model simplification. We describe a scalable method for detecting unidentifiabilities, as well as the functional relations defining them, for generic models. This allows for model simplification, and appreciation of which parameters (or functions thereof) cannot be estimated from data. Our algorithm can identify features such as redundant mechanisms and fast time-scale subsystems, as well as the regimes in parameter space over which such approximations are valid. We base our algorithm on a quantification of regional parametric sensitivity that we call 'multiscale sloppiness'. Traditionally, the link between parametric sensitivity and the conditioning of the parameter estimation problem is made locally, through the Fisher information matrix. This is valid in the regime of infinitesimal measurement uncertainty. We demonstrate the duality between multiscale sloppiness and the geometry of confidence regions surrounding parameter estimates made where measurement uncertainty is non-negligible. Further theoretical relationships are provided linking multiscale sloppiness to the likelihood-ratio test. From this, we show that a local sensitivity analysis (as typically done) is insufficient for determining the reliability of parameter estimation, even for simple (non)linear systems. Our algorithm can provide a tractable alternative. We finally apply our methods to a large-scale, benchmark systems biology model of nuclear factor κB (NF-κB), uncovering unidentifiabilities.
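
    The local analysis that the abstract contrasts with multiscale sloppiness fits in a few lines: eigen-decompose a Gauss-Newton approximation of the Fisher information matrix and read near-zero eigenvalues as candidate unidentifiable parameter combinations. The sketch below assumes additive Gaussian measurement noise and a precomputed output-sensitivity Jacobian; the paper's regional method deliberately goes beyond this infinitesimal picture.

    ```python
    # Local identifiability screen via the Fisher information matrix (FIM).
    # Valid only for infinitesimal measurement uncertainty, as noted above.
    import numpy as np

    def sloppy_directions(jacobian, sigma=1.0, tol=1e-8):
        """jacobian: (n_obs, n_params) output sensitivities d y_i / d theta_j."""
        fim = jacobian.T @ jacobian / sigma**2      # Gaussian-noise FIM
        evals, evecs = np.linalg.eigh(fim)
        sloppy = evals < tol * evals.max()          # relative cutoff
        return evals, evecs[:, sloppy]              # directions the data cannot constrain
    ```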

  9. Personalized Nutrition-Genes, Diet, and Related Interactive Parameters as Predictors of Cancer in Multiethnic Colorectal Cancer Families.

    PubMed

    Shiao, S Pamela K; Grayson, James; Lie, Amanda; Yu, Chong Ho

    2018-06-20

    To personalize nutrition, the purpose of this study was to examine five key genes in the folate metabolism pathway, together with dietary parameters and related interactive parameters, as predictors of colorectal cancer (CRC) by measuring the healthy eating index (HEI) in multiethnic families. The five genes included methylenetetrahydrofolate reductase (MTHFR) 677 and 1298, methionine synthase (MTR) 2756, methionine synthase reductase (MTRR) 66, and dihydrofolate reductase (DHFR) 19bp, and they were used to compute a total gene mutation score. We included 53 families, 53 CRC patients and 53 paired family-friend members of diverse population groups in Southern California. We measured multidimensional data using the ensemble bootstrap forest method to identify variables of importance within the genetic, demographic, and dietary domains to achieve dimension reduction. We then constructed predictive generalized regression (GR) models with a supervised machine learning validation procedure, with the target variable (cancer status) specified, to validate the results and allow enhanced prediction and reproducibility. The results showed that the CRC group had increased total gene mutation scores compared to the family members (p < 0.05). Using the Akaike information criterion and leave-one-out cross-validation GR methods, the HEI interacted with thiamine (vitamin B1), a new finding in the literature. The natural food sources of thiamine include whole grains, legumes, and some meats and fish, which HEI scoring included as part of healthy portions (versus limiting portions of salt, saturated fat and empty calories). Additional predictors included age, as well as gender and the interaction of MTHFR 677 with overweight status (measured by body mass index) in predicting CRC, with the cancer group having more men and overweight cases. The HEI score was significant when split at the median score of 77 into greater or lesser scores, confirmed through

  10. Quantitative model validation of manipulative robot systems

    NASA Astrophysics Data System (ADS)

    Kartowisastro, Iman Herwidiana

    This thesis is concerned with applying the distortion quantitative validation technique to a robot manipulative system with revolute joints. When the distortion technique is used to validate a model quantitatively, the model parameter uncertainties are taken into account in assessing the faithfulness of the model, and this approach is relatively more objective than the common visual comparison method. The industrial robot is represented by the TQ MA2000 robot arm. Details of the mathematical derivation of the distortion technique are given, which explain the required distortion of the constant parameters within the model and the assessment of model adequacy. Due to the complexity of a robot model, only the first three degrees of freedom are considered, and all links are assumed rigid. The modelling involves the Newton-Euler approach to obtain the dynamics model, and the Denavit-Hartenberg convention is used throughout the work. A conventional feedback control system is used in developing the model. The system's behavior under parameter changes is investigated, as some parameters are redundant. This step is important so that the most important parameters to be distorted can be selected; these are termed the fundamental parameters. The transfer function approach has been chosen to validate an industrial robot quantitatively against the measured data due to its practicality. Initially, the assessment of the model fidelity criterion indicated that the model was not capable of explaining the transient record in terms of the model parameter uncertainties. Further investigations led to significant improvements of the model and better understanding of the model properties. After several improvements in the model, the fidelity criterion obtained was almost satisfied. Although the fidelity criterion is slightly less than unity, it has been shown that the distortion technique can be applied to a robot manipulative system. Using the validated model, the importance of

  11. Objectifying Content Validity: Conducting a Content Validity Study in Social Work Research.

    ERIC Educational Resources Information Center

    Rubio, Doris McGartland; Berg-Weger, Marla; Tebb, Susan S.; Lee, E. Suzanne; Rauch, Shannon

    2003-01-01

    The purpose of this article is to demonstrate how to conduct a content validity study. Instructions on how to calculate a content validity index, a factorial validity index, and an interrater reliability index, along with a guide for interpreting these indices, are included. Implications regarding the value of conducting a content validity study for…

  12. Kinetic modelling of anaerobic hydrolysis of solid wastes, including disintegration processes.

    PubMed

    García-Gen, Santiago; Sousbie, Philippe; Rangaraj, Ganesh; Lema, Juan M; Rodríguez, Jorge; Steyer, Jean-Philippe; Torrijos, Michel

    2015-01-01

    A methodology to estimate the disintegration and hydrolysis kinetic parameters of solid wastes and to validate an ADM1-based anaerobic co-digestion model is presented. Kinetic parameters of the model were calibrated from batch reactor experiments treating fruit and vegetable wastes individually (among other residues), following a new protocol for batch tests. In addition, decoupled disintegration kinetics for the readily and slowly biodegradable fractions of solid wastes were considered. Calibrated parameters from batch assays of individual substrates were used to validate the model for a semi-continuous co-digestion operation treating 5 fruit and vegetable wastes simultaneously. The semi-continuous experiment was carried out in a lab-scale CSTR reactor for 15 weeks at organic loading rates ranging between 2.0 and 4.7 gVS/(L·d). The model (built in Matlab/Simulink) fit the experimental results to a large extent in both batch and semi-continuous mode and served as a powerful tool to simulate the digestion or co-digestion of solid wastes. Copyright © 2014 Elsevier Ltd. All rights reserved.
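
    The decoupled disintegration kinetics described above can be illustrated by a two-fraction first-order model fitted to cumulative batch methane data. This is a hedged sketch of the general approach rather than the authors' ADM1 implementation; the function, parameter names and starting values are hypothetical.

    ```python
    # Two-pool first-order disintegration sketch: readily (fast) and slowly
    # degradable fractions of the substrate, fitted to batch-test data.
    import numpy as np
    from scipy.optimize import curve_fit

    def two_fraction_model(t, B0, f_fast, k_fast, k_slow):
        """Cumulative methane B(t) [mL/gVS] from fast and slow pools."""
        return B0 * (f_fast * (1 - np.exp(-k_fast * t)) +
                     (1 - f_fast) * (1 - np.exp(-k_slow * t)))

    # Hypothetical usage with batch data (t_days, B_obs):
    # params, cov = curve_fit(two_fraction_model, t_days, B_obs,
    #                         p0=[400.0, 0.6, 0.5, 0.05])
    ```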

  13. Simulation of ultrasonic arrays for industrial and civil engineering applications including validation

    NASA Astrophysics Data System (ADS)

    Spies, M.; Rieder, H.; Orth, Th.; Maack, S.

    2012-05-01

    In this contribution we address the beam field simulation of 2D ultrasonic arrays using the Generalized Point Source Synthesis technique. Aiming at the inspection of cylindrical components (e.g., pipes), the influence of concave and convex surface curvatures, respectively, has been evaluated for a commercial probe. We have compared these results with those obtained using a commercial simulation tool. In civil engineering, the ultrasonic inspection of highly attenuating concrete structures has been advanced by the development of dry contact point transducers, mainly applied in array arrangements. Our respective simulations for a widely used commercial probe are validated using experimental results acquired on concrete half-spheres with diameters from 200 mm up to 650 mm.

  14. Parameters of metabolic quantification in clinical practice. Is it now time to include them in reports?

    PubMed

    Mucientes, J; Calles, L; Rodríguez, B; Mitjavila, M

    2018-01-18

    Qualitative techniques have traditionally been the standard for diagnostic assessment with 18F-FDG PET studies. Since the introduction of the technique, more accurate quantitative parameters with better diagnostic precision have been sought that may offer relevant information on the behavior, aggressiveness or prognosis of tumors. Nowadays, more and more high-quality studies show the utility of metabolic parameters other than the maximum SUV, which, despite being widely used in clinical practice, is controversial, and many physicians still do not know its real meaning. The objective of this paper has been to review the key concepts of those metabolic parameters that could be relevant in routine practice in the future. There is growing evidence for the complete evaluation of the metabolism of a lesion through volumetric parameters, which more adequately reflect the patient's tumor burden. Basically, these parameters calculate the volume of tumor that fulfills certain characteristics. Software available on the majority of workstations has been used for this purpose, allowing these volumes to be calculated using more or less complex criteria. The simplest threshold-based segmentation methods are available on most systems, are easy to calculate, and have been shown in many studies to have important prognostic significance. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.
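
    A minimal sketch of the volumetric parameters the article reviews: metabolic tumor volume (MTV) and total lesion glycolysis (TLG) obtained by threshold-based segmentation of an SUV image. The 41%-of-SUVmax cutoff is a widely used convention assumed here for illustration; the article does not prescribe a specific threshold.

    ```python
    # Threshold-based volumetric PET metrics: MTV and TLG (= SUVmean x MTV).
    import numpy as np

    def mtv_tlg(suv, voxel_ml, frac=0.41):
        """suv: 3D SUV array; voxel_ml: voxel volume in mL; frac: threshold."""
        mask = suv >= frac * suv.max()       # simplest fixed-fraction segmentation
        mtv = mask.sum() * voxel_ml          # metabolic tumor volume [mL]
        tlg = suv[mask].mean() * mtv         # total lesion glycolysis
        return mtv, tlg
    ```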

  15. Application of regional physically-based landslide early warning model: tuning of the input parameters and validation of the results

    NASA Astrophysics Data System (ADS)

    D'Ambrosio, Michele; Tofani, Veronica; Rossi, Guglielmo; Salvatici, Teresa; Tacconi Stefanelli, Carlo; Rosi, Ascanio; Benedetta Masi, Elena; Pazzi, Veronica; Vannocci, Pietro; Catani, Filippo; Casagli, Nicola

    2017-04-01

    runs in real-time by assimilating weather data and uses Monte Carlo simulation techniques to manage the geotechnical and hydrological input parameters. In this context, an assessment of the factors controlling the geotechnical and hydrological features is crucial in order to understand the occurrence of slope instability mechanisms and to provide reliable forecasting of hydrogeological hazards, especially in relation to weather events. In particular, the model and the soil characterization were applied in back analysis in order to assess the reliability of the model through validation of the results against landslide events that occurred during the period. The validation was performed on four past events of intense rainfall that affected the Valle d'Aosta region between 2008 and 2010, triggering fast shallow landslides. The simulations show a substantial improvement in the reliability of the results compared to the use of literature parameters. A statistical analysis of the HIRESSS outputs in terms of failure probability has been carried out in order to define reliable alert levels for regional landslide early warning systems.

  16. Nonclinical dose formulation analysis method validation and sample analysis.

    PubMed

    Whitmire, Monica Lee; Bryan, Peter; Henry, Teresa R; Holbrook, John; Lehmann, Paul; Mollitor, Thomas; Ohorodnik, Susan; Reed, David; Wietgrefe, Holly D

    2010-12-01

    Nonclinical dose formulation analysis methods are used to confirm test article concentration and homogeneity in formulations and determine formulation stability in support of regulated nonclinical studies. There is currently no regulatory guidance for nonclinical dose formulation analysis method validation or sample analysis. Regulatory guidance for the validation of analytical procedures has been developed for drug product/formulation testing; however, verification of the formulation concentrations falls under the framework of GLP regulations (not GMP). The only related regulatory guidance currently available is the bioanalytical guidance for method validation. The fundamental parameters that overlap between bioanalysis and formulation analysis validations include recovery, accuracy, precision, specificity, selectivity, carryover, sensitivity, and stability. Divergence between bioanalytical and drug product validations typically centers on the acceptance criteria used. As the dose formulation samples are not true "unknowns", the concept of quality control samples covering the entire range of the standard curve, serving as the indication of confidence in the data generated from the "unknown" study samples, may not always be necessary. Also, the standard bioanalytical acceptance criteria may not be directly applicable, especially when the determined concentration does not match the target concentration. This paper attempts to reconcile the different practices being performed in the community and to provide recommendations of best practices and proposed acceptance criteria for nonclinical dose formulation method validation and sample analysis.

  17. Implementation and application of an interactive user-friendly validation software for RADIANCE

    NASA Astrophysics Data System (ADS)

    Sundaram, Anand; Boonn, William W.; Kim, Woojin; Cook, Tessa S.

    2012-02-01

    RADIANCE extracts CT dose parameters from dose sheets using optical character recognition and stores the data in a relational database. To facilitate validation of RADIANCE's performance, a simple user interface was initially implemented and about 300 records were evaluated. Here, we extend this interface to achieve a wider variety of functions and perform a larger-scale validation. The validator uses some data from the RADIANCE database to prepopulate quality-testing fields, such as correspondence between calculated and reported total dose-length product. The interface also displays relevant parameters from the DICOM headers. A total of 5,098 dose sheets were used to test the performance accuracy of RADIANCE in dose data extraction. Several search criteria were implemented. All records were searchable by accession number, study date, or dose parameters beyond chosen thresholds. Validated records were searchable according to additional criteria from validation inputs. An error rate of 0.303% was demonstrated in the validation. Dose monitoring is increasingly important and RADIANCE provides an open-source solution with a high level of accuracy. The RADIANCE validator has been updated to enable users to test the integrity of their installation and verify that their dose monitoring is accurate and effective.

  18. Wearable vital parameters monitoring system

    NASA Astrophysics Data System (ADS)

    Caramaliu, Radu Vadim; Vasile, Alexandru; Bacis, Irina

    2015-02-01

    The system we propose monitors body temperature and heart rate and, in addition, detects whether the person wearing it suffers a faint. It uses a digital temperature sensor, a pulse sensor and a gravitational acceleration sensor to detect possible faints or free falls from small heights. The system continuously tracks the GPS position when available and stores the last valid data. Thus, when abnormal vital parameters are measured, the module sends an SMS over the GSM cellular network with the person's social security number, the last valid GPS position for that person, the heart rate, the body temperature and, where applicable, a valid or non-valid fall alert. Although such systems exist, they include only faint detection or heart rate detection. Usually there is a strong correlation between a low/high heart rate and an eventual faint. Combining both features into one system results in a more reliable detection device.

  19. Identifiability of altimetry-based rating curve parameters in function of river morphological parameters

    NASA Astrophysics Data System (ADS)

    Paris, Adrien; Garambois, Pierre-André; Calmant, Stéphane; Paiva, Rodrigo; Collischonn, Walter; Santos da Silva, Joecila; Medeiros Moreira, Daniel; Bonnet, Marie-Paule; Seyler, Frédérique; Monnier, Jérôme

    2016-04-01

    Estimating river discharge for ungauged river reaches from satellite measurements is not straightforward, given the nonlinearity of flow behavior with respect to measurable and non-measurable hydraulic parameters. As a matter of fact, current satellite datasets do not give access to key parameters such as river bed topography and roughness. A unique set of almost one thousand altimetry-based rating curves was built by fitting ENVISAT and Jason-2 water stages to discharges obtained from the MGB-IPH rainfall-runoff model in the Amazon basin. These rated discharges were successfully validated against simulated discharges (Ens = 0.70) and in-situ discharges (Ens = 0.71) and are not mission-dependent. The rating curve writes Q = a(Z - Z0)^b * sqrt(S), with Z the water surface elevation and S its slope, both gained from satellite altimetry, a and b the power-law coefficient and exponent, and Z0 the river bed elevation such that Q(Z0) = 0. For several river reaches in the Amazon basin where ADCP measurements are available, the Z0 values are fairly well validated, with a relative error lower than 10%. The present contribution aims at relating the identifiability and the physical meaning of a, b and Z0 given various hydraulic and geomorphologic conditions. Synthetic river bathymetries sampling a wide range of rivers and inflow discharges are used to perform twin experiments. A shallow water model is run to generate synthetic satellite observations, and rating curve parameters are then determined for each river section thanks to an MCMC algorithm. Thanks to the twin experiments, it is shown that a rating curve formulation with water surface slope, i.e. closer to the Manning equation form, improves parameter identifiability. The compensation between parameters is limited, especially for reaches with little water surface variability. Rating curve parameters are analyzed for riffles and pools, for small to large rivers, and for different river slopes and cross-section shapes. It is shown that the river bed
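
    The quoted rating curve translates directly into code; the sketch below evaluates Q = a(Z - Z0)^b * sqrt(S) for altimetric stages and slopes. Parameter values would come from the MCMC fit described above and are placeholders here.

    ```python
    # Altimetry-based rating curve Q = a (Z - Z0)^b sqrt(S), as quoted above.
    import numpy as np

    def rating_curve_discharge(Z, S, a, b, Z0):
        """Z: water surface elevation [m]; S: surface slope [-]; Q in m3/s."""
        depth = np.maximum(Z - Z0, 0.0)   # enforces Q(Z0) = 0
        return a * depth**b * np.sqrt(S)
    ```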

  20. Reliability and validity of pressure and temporal parameters recorded using a pressure-sensitive insole during running.

    PubMed

    Mann, Robert; Malisoux, Laurent; Brunner, Roman; Gette, Paul; Urhausen, Axel; Statham, Andrew; Meijer, Kenneth; Theisen, Daniel

    2014-01-01

    Running biomechanics has received increasing interest in recent literature on running-related injuries, calling for new, portable methods for large-scale measurements. Our aims were to define running strike pattern based on output of a new pressure-sensitive measurement device, the Runalyser, and to test its validity regarding temporal parameters describing running gait. Furthermore, reliability of the Runalyser measurements was evaluated, as well as its ability to discriminate different running styles. Thirty-one healthy participants (30.3 ± 7.4 years, 1.78 ± 0.10 m and 74.1 ± 12.1 kg) were involved in the different study parts. Eleven participants were instructed to use a rearfoot (RFS), midfoot (MFS) and forefoot (FFS) strike pattern while running on a treadmill. Strike pattern was subsequently defined using a linear regression (R(2)=0.89) between foot strike angle, as determined by motion analysis (1000 Hz), and strike index (SI, point of contact on the foot sole, as a percentage of foot sole length), as measured by the Runalyser. MFS was defined by the 95% confidence interval of the intercept (SI=43.9-49.1%). High agreement (overall mean difference 1.2%) was found between stance time, flight time, stride time and duty factor as determined by the Runalyser and a force-measuring treadmill (n=16 participants). Measurements of the two devices were highly correlated (R ≥ 0.80) and not significantly different. Test-retest intra-class correlation coefficients for all parameters were ≥ 0.94 (n=14 participants). Significant differences (p<0.05) between FFS, RFS and habitual running were detected regarding SI, stance time and stride time (n=24 participants). The Runalyser is suitable for, and easily applicable in large-scale studies on running biomechanics. Copyright © 2013 Elsevier B.V. All rights reserved.
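
    The strike-pattern definition derived in the study reduces to a simple rule on the strike index SI (point of first contact as a percentage of foot-sole length, measured from the heel): the midfoot band is the reported 95% confidence interval, and values below or above it are taken here as rearfoot and forefoot strikes, respectively.

    ```python
    # Strike-pattern rule implied by the reported MFS band (SI 43.9-49.1%).
    def classify_strike(si_percent):
        """si_percent: strike index as % of foot-sole length from the heel."""
        if si_percent < 43.9:
            return "rearfoot"
        if si_percent <= 49.1:
            return "midfoot"
        return "forefoot"
    ```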

  1. A validation procedure for a LADAR system radiometric simulation model

    NASA Astrophysics Data System (ADS)

    Leishman, Brad; Budge, Scott; Pack, Robert

    2007-04-01

    The USU LadarSIM software package is a ladar system engineering tool that has recently been enhanced to include modeling of the radiometry of ladar beam footprints. This paper discusses our validation of the radiometric model and presents a practical approach to future validation work. In order to validate complicated and interrelated factors affecting radiometry, a systematic approach had to be developed. Data for known parameters were first gathered; unknown parameters of the system were then determined from simulation test scenarios. This was done in a way that isolates as many unknown variables as possible and then builds on the previously obtained results. First, the appropriate voltage threshold levels of the discrimination electronics were set by analyzing the number of false alarms seen in actual data sets. With this threshold set, the system noise was then adjusted to achieve the appropriate number of dropouts. Once a suitable noise level was found, the range errors of the simulated and actual data sets were compared and studied. Predicted errors in range measurements were analyzed using two methods: first by examining the range error of a surface with known reflectivity, and second by examining the range errors for specific detectors with known responsivities. This provided insight into the discrimination method and receiver electronics used in the actual system.

  2. Determination of serum levels of imatinib mesylate in patients with chronic myeloid leukemia: validation and application of a new analytical method to monitor treatment compliance

    PubMed Central

    Rezende, Vinícius Marcondes; Rivellis, Ariane Julio; Gomes, Melissa Medrano; Dörr, Felipe Augusto; Novaes, Mafalda Megumi Yoshinaga; Nardinelli, Luciana; Costa, Ariel Lais de Lima; Chamone, Dalton de Alencar Fisher; Bendit, Israel

    2013-01-01

    Objective The goal of this study was to monitor imatinib mesylate therapeutically in the Tumor Biology Laboratory, Department of Hematology and Hemotherapy, Hospital das Clínicas, Faculdade de Medicina, Universidade de São Paulo (USP). A simple and sensitive method to quantify imatinib and its metabolite (CGP74588) in human serum was developed and fully validated in order to monitor treatment compliance. Methods The method used to quantify these compounds in serum included protein precipitation extraction followed by instrumental analysis using high performance liquid chromatography coupled with mass spectrometry. The method was validated for several parameters, including selectivity, precision, accuracy, recovery and linearity. Results The parameters evaluated during the validation stage exhibited satisfactory results based on the Food and Drug Administration and the Brazilian Health Surveillance Agency (ANVISA) guidelines for validating bioanalytical methods. These parameters also showed a linear correlation greater than 0.99 for the concentration range between 0.500 µg/mL and 10.0 µg/mL and a total analysis time of 13 minutes per sample. This study includes results (imatinib serum concentrations) for 308 samples from patients being treated with imatinib mesylate. Conclusion The method developed in this study was successfully validated and is being efficiently used to measure imatinib concentrations in samples from chronic myeloid leukemia patients to check treatment compliance. The imatinib serum levels of patients achieving a major molecular response were significantly higher than those of patients who did not achieve this result. These results are thus consistent with published reports concerning other populations. PMID:23741187

  3. Collocation mismatch uncertainties in satellite aerosol retrieval validation

    NASA Astrophysics Data System (ADS)

    Virtanen, Timo H.; Kolmonen, Pekka; Sogacheva, Larisa; Rodríguez, Edith; Saponaro, Giulia; de Leeuw, Gerrit

    2018-02-01

    Satellite-based aerosol products are routinely validated against ground-based reference data, usually obtained from sun photometer networks such as AERONET (AEROsol RObotic NETwork). In a typical validation exercise a spatial sample of the instantaneous satellite data is compared against a temporal sample of the point-like ground-based data. The observations do not correspond to exactly the same column of the atmosphere at the same time, and the representativeness of the reference data depends on the spatiotemporal variability of the aerosol properties in the samples. The associated uncertainty is known as the collocation mismatch uncertainty (CMU). The validation results depend on the sampling parameters. While small samples involve less variability, they are more sensitive to the inevitable noise in the measurement data. In this paper we systematically study the effect of the sampling parameters in the validation of the AATSR (Advanced Along-Track Scanning Radiometer) aerosol optical depth (AOD) product against AERONET data and the associated collocation mismatch uncertainty. To this end, we study the spatial AOD variability in the satellite data, compare it against the corresponding values obtained from densely located AERONET sites, and assess the possible reasons for observed differences. We find that the spatial AOD variability in the satellite data is approximately 2 times larger than in the ground-based data, and that the spatial variability correlates only weakly with that of AERONET for short distances. We interpret that only half of the variability in the satellite data is due to the natural variability in the AOD, and the rest is noise due to retrieval errors. However, for larger distances (~0.5°) the correlation is improved as the noise is averaged out, and the day-to-day changes in regional AOD variability are well captured. Furthermore, we assess the usefulness of the spatial variability of the satellite AOD data as an estimate of CMU by comparing the

  4. Cloud and Thermodynamic Parameters Retrieved from Satellite Ultraspectral Infrared Measurements

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L.; Larar, Allen M.; Liu, Xu; Taylor, Jonathan P.; Schluessel, Peter; Strow, L. Larrabee; Mango, Stephen A.

    2008-01-01

    Atmospheric-thermodynamic parameters and surface properties are basic meteorological parameters for weather forecasting. A physical geophysical parameter retrieval scheme dealing with cloudy and cloud-free radiance observed with satellite ultraspectral infrared sounders has been developed and applied to the Infrared Atmospheric Sounding Interferometer (IASI) and the Atmospheric InfraRed Sounder (AIRS). The retrieved parameters presented herein are from radiance data gathered during the Joint Airborne IASI Validation Experiment (JAIVEx). JAIVEx provided intensive aircraft observations obtained from airborne Fourier Transform Spectrometer (FTS) systems, in-situ measurements, and dedicated dropsonde and radiosonde measurements for the validation of the IASI products. Here, IASI atmospheric profile retrievals are compared with those obtained from dedicated dropsondes, radiosondes, and the airborne FTS system. The IASI examples presented here demonstrate the ability to retrieve fine-scale horizontal features with high vertical resolution from satellite ultraspectral sounder radiance spectra.

  5. Development and validation of RT-qPCR for vesicular stomatitis virus detection (Alagoas vesiculovirus).

    PubMed

    de Oliveira, Anapolino Macedo; Fonseca, Antônio Augusto; Camargos, Marcelo Fernandes; Orzil, Lívia Maria; Laguardia-Nascimento, Mateus; Oliveira, Anna Gabriella Guimarães; Rodrigues, Jacqueline Gomes; Sales, Mariana Lázaro; de Oliveira, Tatiana Flávia Pinheiro; de Melo, Cristiano Barros

    2018-07-01

    Vesicular stomatitis is an infectious disease that occurs mainly in countries of the Western Hemisphere and affects cattle, swine and horses. The clinical symptoms in cattle and swine are similar to those of foot-and-mouth disease and include vesicular ulceration of the tongue and mouth. The disease requires a rapid and accurate differential diagnosis so that control measures can be implemented immediately. The objective of the present study was to develop and validate multiplex RT-qPCRs for the detection of RNA from Alagoas vesiculovirus, considering the parameters of sensitivity and analytical specificity, analytical performance (repeatability and reproducibility criteria) and the uncertainty of the measurement. The threshold cycle values obtained in triplicate from each sample were evaluated by considering the variations between days, analysts and equipment in an analysis of variance aimed at determining the variances of repeatability and reproducibility. The results showed that the RT-qPCRs had excellent sensitivity and specificity in the detection of RNA of the Alagoas vesiculovirus. The validation parameters showed low coefficients of variation and were equivalent to those found in other validation studies, indicating that the tests presented excellent repeatability and reproducibility. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Parameter Selection Methods in Inverse Problem Formulation

    DTIC Science & Technology

    2010-11-03

    The examples considered include a recently developed in-host model for HIV dynamics, which has been successfully validated with clinical data and used for prediction [4, 8], and a model for the reaction of the cardiovascular system to an ergometric workload. Key Words: Parameter selection.

  7. Validation of experimental molecular crystal structures with dispersion-corrected density functional theory calculations.

    PubMed

    van de Streek, Jacco; Neumann, Marcus A

    2010-10-01

    This paper describes the validation of a dispersion-corrected density functional theory (d-DFT) method for the purpose of assessing the correctness of experimental organic crystal structures and enhancing the information content of purely experimental data. 241 experimental organic crystal structures from the August 2008 issue of Acta Cryst. Section E were energy-minimized in full, including unit-cell parameters. The differences between the experimental and the minimized crystal structures were subjected to statistical analysis. The r.m.s. Cartesian displacement excluding H atoms upon energy minimization with flexible unit-cell parameters is selected as a pertinent indicator of the correctness of a crystal structure. All 241 experimental crystal structures are reproduced very well: the average r.m.s. Cartesian displacement for the 241 crystal structures, including 16 disordered structures, is only 0.095 Å (0.084 Å for the 225 ordered structures). R.m.s. Cartesian displacements above 0.25 Å either indicate incorrect experimental crystal structures or reveal interesting structural features such as exceptionally large temperature effects, incorrectly modelled disorder or symmetry-breaking H atoms. After validation, the method is applied to nine examples that are known to be ambiguous or subtly incorrect.
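
    The indicator selected above is straightforward to compute once the experimental and minimized structures are expressed in a common Cartesian frame with matched atom ordering; that alignment step is assumed here.

    ```python
    # r.m.s. Cartesian displacement between experimental and energy-minimized
    # coordinates, excluding H atoms; ~0.25 Angstrom is the flag level cited.
    import numpy as np

    def rms_displacement(xyz_exp, xyz_min, elements):
        """xyz_*: (N, 3) coordinates [Angstrom]; elements: atomic symbols."""
        heavy = np.array([e != "H" for e in elements])
        d = np.linalg.norm(xyz_exp[heavy] - xyz_min[heavy], axis=1)
        return np.sqrt((d ** 2).mean())
    ```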

  8. Predicting stress urinary incontinence during pregnancy: combination of pelvic floor ultrasound parameters and clinical factors.

    PubMed

    Chen, Ling; Luo, Dan; Yu, Xiajuan; Jin, Mei; Cai, Wenzhi

    2018-05-12

    The aim of this study was to develop and validate a predictive tool combining pelvic floor ultrasound parameters and clinical factors for stress urinary incontinence during pregnancy. A total of 535 women in the first or second trimester were included for an interview and transperineal ultrasound assessment at two hospitals. Imaging data sets were analyzed offline to assess bladder neck vertical position, urethra angles (α, β, and γ angles), hiatal area and bladder neck funneling. All continuous variables significant in univariable analysis were analyzed by receiver-operating characteristics. Three multivariable logistic models were built on clinical factors alone and combined with ultrasound parameters. The final predictive model with the best performance and fewest variables was selected to establish a nomogram. Internal and external validation of the nomogram were performed in terms of both discrimination, represented by the C-index, and calibration, measured by the Hosmer-Lemeshow test. A decision curve analysis was conducted to determine the clinical utility of the nomogram. After excluding 14 women with invalid data, 521 women were analyzed. The β angle, γ angle and hiatal area had limited predictive value for stress urinary incontinence during pregnancy, with areas under the curve of 0.558-0.648. The final predictive model included body mass index gain since pregnancy, constipation, previous delivery mode, β angle at rest, and bladder neck funneling. The nomogram based on the final model showed good discrimination, with a C-index of 0.789, and satisfactory calibration (P=0.828), both of which were supported by external validation. Decision curve analysis showed that the nomogram was clinically useful. The nomogram incorporating both the pelvic floor ultrasound parameters and clinical factors has been validated to show good discrimination and calibration, and could be an important tool for stress urinary incontinence risk prediction at an early stage of pregnancy. This article is
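
    The discrimination statistic reported for the nomogram (the C-index, equivalent to the AUC for a binary outcome) is the fraction of case-control pairs in which the model assigns the higher predicted risk to the case. A minimal sketch, assuming predicted probabilities and 0/1 outcomes are already available:

    ```python
    # Pairwise C-index for a binary outcome; ties in risk count as 0.5.
    import numpy as np

    def c_index(risk, outcome):
        """risk: predicted probabilities; outcome: 0/1 event indicator."""
        pos, neg = risk[outcome == 1], risk[outcome == 0]
        diff = pos[:, None] - neg[None, :]
        return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size
    ```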

  9. Validating a driving simulator using surrogate safety measures.

    PubMed

    Yan, Xuedong; Abdel-Aty, Mohamed; Radwan, Essam; Wang, Xuesong; Chilakapati, Praveen

    2008-01-01

    Traffic crash statistics and previous research have shown an increased risk of traffic crashes at signalized intersections. How to diagnose safety problems and develop effective countermeasures to reduce the crash rate at intersections is a key task for traffic engineers and researchers. This study aims at investigating whether a driving simulator can be used as a valid tool to assess traffic safety at signalized intersections. In support of the research objective, this simulator validity study was conducted from two perspectives, a traffic parameter (speed) and a safety parameter (crash history). A signalized intersection was replicated in a high-fidelity driving simulator with as many important features as possible (including roadway geometries, traffic control devices, intersection surroundings, and buildings). A driving simulator experiment with eight scenarios at the intersection was conducted to determine whether the subjects' speed behavior and traffic risk patterns in the driving simulator were similar to those found at the real intersection. The experiment results showed that the speed data observed in the field and in the simulator experiment both follow normal distributions and have equal means for each intersection approach, which validated the driving simulator in absolute terms. Furthermore, this study used an innovative approach of using surrogate safety measures from the simulator to contrast with the crash analysis for the field data. The simulator experiment results indicated that, compared to the right-turn lane with the low rear-end crash history record (2 crashes), subjects showed a series of riskier behaviors at the right-turn lane with the high rear-end crash history record (16 crashes), including a higher deceleration rate (1.80+/-1.20 m/s(2) versus 0.80+/-0.65 m/s(2)), a higher non-stop right-turn rate on red (81.67% versus 57.63%), a higher right-turn speed at the stop line (18.38+/-8.90 km/h versus 14.68+/-6.04 km/h), and a shorter following distance (30

  10. Utilizing the social media data to validate 'climate change' indices

    NASA Astrophysics Data System (ADS)

    Molodtsova, T.; Kirilenko, A.; Stepchenkova, S.

    2013-12-01

    Reporting observed and modeled changes in climate to the public requires measures understandable by a general audience. For example, the NASA GISS Common Sense Climate Index (Hansen et al., 1998) reports the change in climate based on six practically observable parameters, such as the air temperature exceeding the norm by one standard deviation. The utility of the constructed indices for reporting climate change depends, however, on the assumption that the selected parameters are felt and connected with the changing climate by a non-expert, which needs to be validated. Dynamic discussion of climate change issues in social media may provide data for this validation. We connected the intensity of public discussion of climate change in social networks with regional weather variations for the territory of the USA. We collected the entire 2012 population of Twitter microblogging activity on the climate change topic, accumulating over 1.8 million separate records (tweets) globally. We identified the geographic location of the tweets and associated the daily and weekly intensity of twitting with the following parameters of weather for these locations: temperature anomalies, 'hot' temperature anomalies, 'cold' temperature anomalies, and heavy rain/snow events. To account for non-weather-related events we included articles on climate change from the 'prestige press', a collection of major newspapers. We found that regional changes in weather parameters significantly affect the number of tweets published on climate change. This effect, however, is short-lived and varies throughout the country. We found that in different locations different weather parameters had the most significant effect on climate change microblogging activity. Overall, 'hot' temperature anomalies had a significant influence on climate change twitting intensity.

  11. Multi-Response Parameter Interval Sensitivity and Optimization for the Composite Tape Winding Process.

    PubMed

    Deng, Bo; Shi, Yaoyao; Yu, Tao; Kang, Chao; Zhao, Pan

    2018-01-31

    The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performance of the winding products. In this article, two different objective values of winding products, a mechanical performance (tensile strength) and a physical property (void content), were respectively calculated. Thereafter, the paper presents an integrated methodology combining multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding process. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensive optimized intervals of the winding parameters. A verification test validated that the optimized intervals of the process parameters were reliable and stable for the manufacturing of winding products.

  12. Empirical flow parameters - a tool for hydraulic model validity assessment: [summary].

    DOT National Transportation Integrated Search

    2013-10-01

    Hydraulic modeling assembles models based on generalizations of parameter values from textbooks, professional literature, computer program documentation, and engineering experience. Actual measurements adjacent to the model location are seldom availa...

  13. Development and validation of brief scales to measure emotional and behavioural problems among Chinese adolescents

    PubMed Central

    Shen, Minxue; Hu, Ming; Sun, Zhenqiu

    2017-01-01

    Objectives To develop and validate brief scales to measure common emotional and behavioural problems among adolescents in the examination-oriented education system and collectivistic culture of China. Setting Middle schools in Hunan province. Participants 5442 middle school students aged 11–19 years were sampled. 4727 valid questionnaires were collected and used for validation of the scales. The final sample included 2408 boys and 2319 girls. Primary and secondary outcome measures The tools were assessed by item response theory, classical test theory (reliability and construct validity) and differential item functioning. Results Four scales to measure anxiety, depression, study problems and sociality problems were established. Exploratory factor analysis yielded a two-factor solution for each scale. Confirmatory factor analysis showed acceptable to good model fit for each scale. Internal consistency and test–retest reliability of all scales were above 0.7. Item response theory showed that all items had acceptable discrimination parameters and most items had appropriate difficulty parameters. 10 items demonstrated differential item functioning with respect to gender. Conclusions Four brief scales were developed and validated among adolescents in middle schools of China. The scales have good psychometric properties with minor differential item functioning. They can be used in middle school settings, and will help school officials to assess the students' emotional/behavioural problems.

  14. Validation and uncertainty analysis of a pre-treatment 2D dose prediction model

    NASA Astrophysics Data System (ADS)

    Baeza, Jose A.; Wolfs, Cecile J. A.; Nijsten, Sebastiaan M. J. J. G.; Verhaegen, Frank

    2018-02-01

    Independent verification of complex treatment delivery with megavolt photon beam radiotherapy (RT) has been effectively used to detect and prevent errors. This work presents the validation and uncertainty analysis of a model that predicts 2D portal dose images (PDIs) without a patient or phantom in the beam. The prediction model is based on an exponential point dose model with separable primary and secondary photon fluence components. The model includes a scatter kernel, off-axis ratio map, transmission values and penumbra kernels for beam-delimiting components. These parameters were derived through a model fitting procedure supplied with point dose and dose profile measurements of radiation fields. The model was validated against a treatment planning system (TPS; Eclipse) and radiochromic film measurements for complex clinical scenarios, including volumetric modulated arc therapy (VMAT). Confidence limits on fitted model parameters were calculated based on simulated measurements. A sensitivity analysis was performed to evaluate the effect of the parameter uncertainties on the model output. For the maximum uncertainty, the maximum deviating measurement sets were propagated through the fitting procedure and the model. The overall uncertainty was assessed using all simulated measurements. The validation of the prediction model against the TPS and the film showed a good agreement, with on average 90.8% and 90.5% of pixels passing a (2%,2 mm) global gamma analysis respectively, with a low dose threshold of 10%. The maximum and overall uncertainty of the model is dependent on the type of clinical plan used as input. The results can be used to study the robustness of the model. A model for predicting accurate 2D pre-treatment PDIs in complex RT scenarios can be used clinically and its uncertainties can be taken into account.
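
    The (2%, 2 mm) global gamma criterion used in the comparison above can be sketched as a brute-force search over in-plane shifts within the distance-to-agreement radius. This naive version skips sub-pixel interpolation and wraps at image borders, so it is a conceptual illustration only; validated tools should be used clinically.

    ```python
    # Global gamma pass rate (dose-difference dd, distance-to-agreement dta)
    # between two portal dose images on the same grid. Conceptual sketch.
    import numpy as np

    def gamma_pass_rate(ref, ev, pixel_mm, dd=0.02, dta_mm=2.0, cutoff=0.10):
        norm = dd * ref.max()                     # global 2% criterion
        r = int(np.ceil(dta_mm / pixel_mm))       # search radius in pixels
        g2 = np.full(ref.shape, np.inf)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                dist2 = ((dy * pixel_mm) ** 2 + (dx * pixel_mm) ** 2) / dta_mm ** 2
                if dist2 > 1.0:                   # beyond DTA, gamma > 1 regardless
                    continue
                shifted = np.roll(np.roll(ev, dy, axis=0), dx, axis=1)  # wraps at edges
                g2 = np.minimum(g2, (shifted - ref) ** 2 / norm ** 2 + dist2)
        mask = ref >= cutoff * ref.max()          # 10% low-dose threshold
        return (np.sqrt(g2[mask]) <= 1.0).mean()
    ```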

  15. Validated numerical simulation model of a dielectric elastomer generator

    NASA Astrophysics Data System (ADS)

    Foerster, Florentine; Moessinger, Holger; Schlaak, Helmut F.

    2013-04-01

    Dielectric elastomer generators (DEGs) produce electrical energy by converting mechanical energy into electrical energy. Efficient operation requires homogeneous deformation of each single layer. However, internal and external influences, such as supports or the shape of the DEG, make the deformation inhomogeneous and hence reduce the amount of generated electrical energy. Optimizing the deformation behavior improves the efficiency of the DEG and consequently yields higher energy gain. In this work a numerical simulation model of a multilayer dielectric elastomer generator is developed using the FEM software ANSYS. The analyzed multilayer DEG consists of 49 active dielectric layers with layer thicknesses of 50 μm. The elastomer is silicone (PDMS), while the compliant electrodes are made of graphite powder. The simulation needs to include the real material parameters of the PDMS and the graphite electrodes. Therefore, the mechanical and electrical material parameters of the PDMS are determined by experimental investigations of test samples, while the electrode parameters are determined by numerical simulations of test samples. The numerical simulation of the DEG is carried out as a coupled electro-mechanical simulation for the constant-voltage energy harvesting cycle. Finally, the derived numerical simulation model is validated by comparison with analytical calculations and further simulated DEG configurations. The comparison of the results shows good agreement with regard to the deformation of the DEG. Based on the validated model it is now possible to optimize the DEG layout for improved deformation behavior with further simulations.
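
    For intuition about the constant-voltage harvesting cycle mentioned above: for an ideal lossless cycle, the net electrical energy gained per layer equals half the capacitance change times the squared voltage. The numbers below (permittivity, voltage, areas, stretch) are illustrative assumptions, not values from the paper:

```python
# Minimal sketch, assuming idealized parallel-plate behavior of one layer.
EPS0 = 8.854e-12          # vacuum permittivity, F/m
eps_r = 2.8               # assumed relative permittivity of PDMS
V = 1000.0                # assumed harvesting voltage, V

def layer_capacitance(area_m2, thickness_m):
    """Parallel-plate capacitance of one elastomer layer."""
    return EPS0 * eps_r * area_m2 / thickness_m

# Relaxed vs. stretched state: volume is conserved, so the 50 um layer
# thins and its area grows when stretched (assumed area stretch of 1.5).
C_min = layer_capacitance(area_m2=1.0e-4, thickness_m=50e-6)          # relaxed
C_max = layer_capacitance(area_m2=1.5e-4, thickness_m=50e-6 / 1.5)    # stretched

# Ideal constant-voltage cycle: charge at C_max, relax at constant V,
# discharge at C_min; the net electrical gain is 0.5 * (C_max - C_min) * V^2.
E_gain = 0.5 * (C_max - C_min) * V**2
print(f"energy gain per layer and cycle: {E_gain * 1e6:.1f} uJ")
```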

  17. Finite Element Analysis of Influence of Axial Position of Center of Rotation of a Cervical Total Disc Replacement on Biomechanical Parameters: Simulated 2-Level Replacement Based on a Validated Model.

    PubMed

    Li, Yang; Zhang, Zhenjun; Liao, Zhenhua; Mo, Zhongjun; Liu, Weiqiang

    2017-10-01

    Finite element models have been widely used to predict biomechanical parameters of the cervical spine. Previous studies investigated the influence of the position of the rotational centers of prostheses on cervical biomechanical parameters after 1-level total disc replacement. The purpose of this study was to explore the effects of the axial position of the rotational centers of prostheses on cervical biomechanics after 2-level total disc replacement. A validated finite element model of the C3-C7 segments and 2 prostheses, one with the rotational center located at the superior endplate (SE) and one at the inferior endplate (IE), was developed. Four total disc replacement models were used: 1) IE inserted at the C4-C5 disc space and IE inserted at the C5-C6 disc space (IE-IE), 2) IE-SE, 3) SE-IE, and 4) SE-SE. All models were subjected to displacement control combined with a 50 N follower load to simulate flexion and extension motions in the sagittal plane. For each case, biomechanical parameters, including predicted moments, range of rotation at each level, facet joint stress, and von Mises stress on the ultra-high-molecular-weight polyethylene core of the prostheses, were calculated. The SE-IE model resulted in significantly lower stress at the cartilage level during extension and at the ultra-high-molecular-weight polyethylene cores when compared with the SE-SE construct, and did not generate hypermotion at the C4-C5 level compared with the IE-SE and IE-IE constructs. Based on the present analysis, the SE-IE construct is recommended for treating cervical disease at the C4-C6 level. This study may provide a useful model to inform clinical operations. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Validation of biomarkers to predict response to immunotherapy in cancer: Volume I - pre-analytical and analytical validation.

    PubMed

    Masucci, Giuseppe V; Cesano, Alessandra; Hawtin, Rachael; Janetzki, Sylvia; Zhang, Jenny; Kirsch, Ilan; Dobbin, Kevin K; Alvarez, John; Robbins, Paul B; Selvan, Senthamil R; Streicher, Howard Z; Butterfield, Lisa H; Thurin, Magdalena

    2016-01-01

    Immunotherapies have emerged as one of the most promising approaches to treat patients with cancer. Recently, there have been many clinical successes using checkpoint receptor blockade, including T cell inhibitory receptors such as cytotoxic T-lymphocyte-associated antigen 4 (CTLA-4) and programmed cell death-1 (PD-1). Despite demonstrated successes in a variety of malignancies, responses only typically occur in a minority of patients in any given histology. Additionally, treatment is associated with inflammatory toxicity and high cost. Therefore, determining which patients would derive clinical benefit from immunotherapy is a compelling clinical question. Although numerous candidate biomarkers have been described, there are currently three FDA-approved assays based on PD-1 ligand expression (PD-L1) that have been clinically validated to identify patients who are more likely to benefit from a single-agent anti-PD-1/PD-L1 therapy. Because of the complexity of the immune response and tumor biology, it is unlikely that a single biomarker will be sufficient to predict clinical outcomes in response to immune-targeted therapy. Rather, the integration of multiple tumor and immune response parameters, such as protein expression, genomics, and transcriptomics, may be necessary for accurate prediction of clinical benefit. Before a candidate biomarker and/or new technology can be used in a clinical setting, several steps are necessary to demonstrate its clinical validity. Although regulatory guidelines provide general roadmaps for the validation process, their applicability to biomarkers in the cancer immunotherapy field is somewhat limited. Thus, Working Group 1 (WG1) of the Society for Immunotherapy of Cancer (SITC) Immune Biomarkers Task Force convened to address this need. In this two volume series, we discuss pre-analytical and analytical (Volume I) as well as clinical and regulatory (Volume II) aspects of the validation process as applied to predictive biomarkers

  19. Atmospheric stellar parameters from cross-correlation functions

    NASA Astrophysics Data System (ADS)

    Malavolta, L.; Lovis, C.; Pepe, F.; Sneden, C.; Udry, S.

    2017-08-01

    The increasing number of spectra gathered by spectroscopic sky surveys and transiting exoplanet follow-up has pushed the community to develop automated tools for the determination of atmospheric stellar parameters. Here we present a novel approach that allows the measurement of temperature (Teff), metallicity ([Fe/H]) and gravity (log g) within a few seconds and in a completely automated fashion. Rather than performing comparisons with spectral libraries, our technique is based on the determination of several cross-correlation functions (CCFs) obtained by including spectral features with different sensitivities to the photospheric parameters. We use literature stellar parameters of high signal-to-noise (SNR), high-resolution HARPS spectra of FGK main-sequence stars to calibrate Teff, [Fe/H] and log g as functions of CCF parameters. Our technique is validated using low-SNR spectra obtained with the same instrument. For FGK stars we achieve a precision of σ(Teff) = 50 K, σ(log g) = 0.09 dex and σ([Fe/H]) = 0.035 dex at SNR = 50, while the precision for observations with SNR ≳ 100 and the overall accuracy are constrained by the literature values used to calibrate the CCFs. Our approach can easily be extended to other instruments with similar spectral range and resolution, or to other spectral ranges and to stars other than FGK dwarfs, if a large sample of reference stars is available for the calibration. Additionally, we provide the mathematical formulation to convert synthetic equivalent widths to CCF parameters as an alternative to direct calibration. We have made our tool publicly available.

  20. Stability evaluation of quality parameters for palm oil products at low temperature storage.

    PubMed

    Ramli, Nur Aainaa Syahirah; Mohd Noor, Mohd Azmil; Musa, Hajar; Ghazali, Razmah

    2018-07-01

    Palm oil is one of the major oils and fats produced and traded worldwide. The value of palm oil products is mainly influenced by their quality. According to ISO 17025:2005, accredited laboratories require a quality control procedure with respect to monitoring the validity of tests for determination of quality parameters. This includes the regular use of internal quality control using secondary reference materials. Unfortunately, palm oil reference materials are not currently available. To establish internal quality control samples, the stability of quality parameters needs to be evaluated. In the present study, the stability of quality parameters for palm oil products was examined over 10 months at low temperature storage (6 ± 2 °C). The palm oil products tested included crude palm oil (CPO); refined, bleached and deodorized (RBD) palm oil (RBDPO); RBD palm olein (RBDPOo); and RBD palm stearin (RBDPS). The quality parameters of the oils [i.e. moisture content, free fatty acid content (FFA), iodine value (IV), fatty acids composition (FAC) and slip melting point (SMP)] were determined prior to and throughout the storage period. The moisture, FFA, IV, FAC and SMP for palm oil products changed significantly (P < 0.05), whereas the moisture content for CPO, IV for RBDPO and RBDPOo, stearic acid composition for CPO and linolenic acid composition for CPO, RBDPO, RBDPOo and RBDPS did not (P > 0.05). The stability study indicated that the quality of the palm oil products was stable within the specified limits throughout the storage period at low temperature. The storage conditions preserved the quality of palm oil products throughout the storage period. These findings qualify the use of the palm oil products CPO, RBDPO, RBDPOo and RBDPS as control samples in the validation of test results. © 2017 Society of Chemical Industry.

  1. Statistical Validation of Surrogate Endpoints: Another Look at the Prentice Criterion and Other Criteria.

    PubMed

    Saraf, Sanatan; Mathew, Thomas; Roy, Anindya

    2015-01-01

    For the statistical validation of surrogate endpoints, an alternative formulation is proposed for testing Prentice's fourth criterion, under a bivariate normal model. In such a setup, the criterion involves inference concerning an appropriate regression parameter, and the criterion holds if the regression parameter is zero. Testing such a null hypothesis has been criticized in the literature since it can only be used to reject a poor surrogate, and not to validate a good surrogate. In order to circumvent this, an equivalence hypothesis is formulated for the regression parameter, namely the hypothesis that the parameter is equivalent to zero. Such an equivalence hypothesis is formulated as an alternative hypothesis, so that the surrogate endpoint is statistically validated when the null hypothesis is rejected. Confidence intervals for the regression parameter and tests for the equivalence hypothesis are proposed using bootstrap methods and small sample asymptotics, and their performances are numerically evaluated and recommendations are made. The choice of the equivalence margin is a regulatory issue that needs to be addressed. The proposed equivalence testing formulation is also adopted for other parameters that have been proposed in the literature on surrogate endpoint validation, namely, the relative effect and proportion explained.
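
    The equivalence formulation inverts the usual test: non-equivalence is the null hypothesis, and the surrogate is validated only when the data show the regression parameter is demonstrably inside a margin. The paper uses bootstrap methods and small-sample asymptotics; the sketch below instead shows the basic two one-sided t-tests (TOST) version for an OLS slope, with a hypothetical margin:

```python
import numpy as np
from scipy import stats

def tost_slope(x, y, margin):
    """Two one-sided tests that the OLS slope of y on x lies within
    (-margin, +margin); returns the slope and the limiting p-value."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    res = stats.linregress(x, y)
    df = len(x) - 2
    t_lower = (res.slope + margin) / res.stderr   # H1: slope > -margin
    t_upper = (res.slope - margin) / res.stderr   # H1: slope < +margin
    p_lower = stats.t.sf(t_lower, df)
    p_upper = stats.t.cdf(t_upper, df)
    return res.slope, max(p_lower, p_upper)

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.01 * x + rng.normal(scale=0.5, size=200)    # slope practically zero
slope, p = tost_slope(x, y, margin=0.15)          # margin is a regulatory choice
print(f"slope={slope:.3f}, TOST p={p:.4f}")       # p < 0.05 -> equivalent to zero
```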

  2. Real-Time Gait Cycle Parameter Recognition Using a Wearable Accelerometry System

    PubMed Central

    Yang, Che-Chang; Hsu, Yeh-Liang; Shih, Kao-Shang; Lu, Jun-Ming

    2011-01-01

    This paper presents the development of a wearable accelerometry system for real-time gait cycle parameter recognition. Using a tri-axial accelerometer, the wearable motion detector is a single waist-mounted device to measure trunk accelerations during walking. Several gait cycle parameters, including cadence, step regularity, stride regularity and step symmetry can be estimated in real-time by using autocorrelation procedure. For validation purposes, five Parkinson’s disease (PD) patients and five young healthy adults were recruited in an experiment. The gait cycle parameters among the two subject groups of different mobility can be quantified and distinguished by the system. Practical considerations and limitations for implementing the autocorrelation procedure in such a real-time system are also discussed. This study can be extended to the future attempts in real-time detection of disabling gaits, such as festinating or freezing of gait in PD patients. Ambulatory rehabilitation, gait assessment and personal telecare for people with gait disorders are also possible applications. PMID:22164019
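
    The autocorrelation procedure referred to above (in the spirit of the Moe-Nilssen autocorrelation method for trunk accelerometry) reads cadence and regularity off the first two dominant autocorrelation peaks: the first peak lag corresponds to one step, the second to one stride. The search-window bounds and the synthetic signal below are assumptions for illustration:

```python
import numpy as np

def gait_parameters(vert_acc, fs):
    """Estimate cadence and step/stride regularity from trunk acceleration
    via the normalized autocorrelation of the signal."""
    a = np.asarray(vert_acc, float)
    a = a - a.mean()
    ac = np.correlate(a, a, mode="full")[a.size - 1:]
    ac /= ac[0]                                   # normalize so ac[0] == 1
    min_lag = int(0.3 * fs)                       # assume steps longer than 0.3 s
    step_lag = min_lag + np.argmax(ac[min_lag:int(1.0 * fs)])
    lo, hi = int(1.2 * step_lag), int(2.5 * step_lag)
    stride_lag = lo + np.argmax(ac[lo:hi])        # ~ twice the step lag
    cadence_spm = 60.0 * fs / step_lag            # steps per minute
    step_reg = ac[step_lag]                       # ~1 for very regular steps
    stride_reg = ac[stride_lag]
    return cadence_spm, step_reg, stride_reg, step_reg / stride_reg

# Synthetic 30 s walking signal at 100 Hz with a 1.8 Hz step frequency
fs = 100
t = np.arange(0, 30, 1 / fs)
acc = np.sin(2 * np.pi * 1.8 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
print(gait_parameters(acc, fs))
```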

  3. Parameter Design in Fusion Welding of AA 6061 Aluminium Alloy using Desirability Grey Relational Analysis (DGRA) Method

    NASA Astrophysics Data System (ADS)

    Adalarasan, R.; Santhanakumar, M.

    2015-01-01

    In the present work, the yield strength, ultimate strength and micro-hardness of lap joints formed in Al 6061 alloy sheets by Tungsten Inert Gas (TIG) welding and Metal Inert Gas (MIG) welding were studied for various combinations of the welding parameters. The parameters taken for study include welding current, voltage, welding speed and inert gas flow rate. Taguchi's L9 orthogonal array was used to conduct the experiments, and an integrated technique of desirability grey relational analysis was presented for optimizing the welding parameters. The robustness ignored in the desirability approach is compensated for by the grey relational approach, which predicts the optimal setting of input parameters for the TIG and MIG welding processes; the predictions were validated through confirmation experiments.
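
    The grey relational step of such an analysis can be sketched as follows: normalize each response, compute grey relational coefficients against the ideal sequence, and average them into a grade per trial. The data and the larger-is-better assumption below are illustrative, not the paper's measurements:

```python
import numpy as np

def grey_relational_grade(responses, larger_is_better=True, zeta=0.5):
    """Grey relational grade for an (n_trials, n_responses) matrix
    (equal weights; zeta is the distinguishing coefficient)."""
    x = np.asarray(responses, float)
    if larger_is_better:                          # grey relational normalization
        norm = (x - x.min(0)) / (x.max(0) - x.min(0))
    else:
        norm = (x.max(0) - x) / (x.max(0) - x.min(0))
    delta = np.abs(1.0 - norm)                    # deviation from ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                     # higher grade = better trial

# Example: 9 L9 trials, responses = [yield strength, ultimate strength,
# micro-hardness] (illustrative random numbers)
rng = np.random.default_rng(0)
data = rng.uniform([150, 200, 60], [220, 280, 95], size=(9, 3))
grades = grey_relational_grade(data)
print("best trial:", int(np.argmax(grades)) + 1)
```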

  4. Sensor data validation and reconstruction. Phase 1: System architecture study

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The sensor validation and data reconstruction task reviewed relevant literature and selected applicable validation and reconstruction techniques for further study; analyzed the selected techniques and emphasized those which could be used for both validation and reconstruction; analyzed Space Shuttle Main Engine (SSME) hot fire test data to determine statistical and physical relationships between various parameters; developed statistical and empirical correlations between parameters to perform validation and reconstruction tasks, using a computer aided engineering (CAE) package; and conceptually designed an expert system based knowledge fusion tool, which allows the user to relate diverse types of information when validating sensor data. The host hardware for the system is intended to be a Sun SPARCstation, but could be any RISC workstation with a UNIX operating system and a windowing/graphics system such as Motif or Dataviews. The information fusion tool is intended to be developed using the NEXPERT Object expert system shell, and the C programming language.

  5. Review of surface steam sterilization for validation purposes.

    PubMed

    van Doornmalen, Joost; Kopinga, Klaas

    2008-03-01

    Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.
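
    The physical parameters used as validation criteria are essentially time-temperature records. One standard way such records are reduced to a single lethality figure (an assumption here; the abstract does not name the metric) is the F0 equivalent-time calculation, which integrates the lethal rate 10^((T - 121.1)/z) over the exposure:

```python
import numpy as np

def f0_minutes(temps_c, dt_s, z=10.0, t_ref=121.1):
    """Equivalent sterilization time F0 (minutes at 121.1 degC) from a
    time series of load temperatures sampled every dt_s seconds."""
    temps_c = np.asarray(temps_c, float)
    lethal_rates = 10.0 ** ((temps_c - t_ref) / z)
    return lethal_rates.sum() * dt_s / 60.0

# Example: a plateau of 4 minutes at 134 degC sampled once per second
plateau = np.full(240, 134.0)
print(f"F0 = {f0_minutes(plateau, dt_s=1.0):.0f} min")  # ~78 min equivalent
```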

  6. Laboratory parameter-based machine learning model for excluding non-alcoholic fatty liver disease (NAFLD) in the general population.

    PubMed

    Yip, T C-F; Ma, A J; Wong, V W-S; Tse, Y-K; Chan, H L-Y; Yuen, P-C; Wong, G L-H

    2017-08-01

    Non-alcoholic fatty liver disease (NAFLD) affects 20%-40% of the general population in developed countries and is an increasingly important cause of hepatocellular carcinoma. Electronic medical records facilitate large-scale epidemiological studies, but existing NAFLD scores often require clinical and anthropometric parameters that may not be captured in those databases. Our aim was to develop and validate a laboratory parameter-based machine learning model to detect NAFLD in the general population. We randomly divided 922 subjects from a population screening study into training and validation groups; NAFLD was diagnosed by proton-magnetic resonance spectroscopy. On the basis of machine learning from 23 routine clinical and laboratory parameters after elastic net regularization, we evaluated logistic regression, ridge regression, AdaBoost and decision tree models. The areas under the receiver-operating characteristic curve (AUROC) of the models in the validation group were compared. Six predictors, including alanine aminotransferase, high-density lipoprotein cholesterol, triglycerides, haemoglobin A1c, white blood cell count and the presence of hypertension, were selected. The NAFLD ridge score achieved an AUROC of 0.87 (95% CI 0.83-0.90) and 0.88 (0.84-0.91) in the training and validation groups respectively. Using dual cut-offs of 0.24 and 0.44, the NAFLD ridge score achieved 92% (86%-96%) sensitivity and 90% (86%-93%) specificity, with corresponding negative and positive predictive values of 96% (91%-98%) and 69% (59%-78%), and 87% overall accuracy among the 70% of classifiable subjects in the validation group; 30% of subjects remained indeterminate. The NAFLD ridge score is a simple and robust reference comparable to existing NAFLD scores for excluding NAFLD in epidemiological studies. © 2017 John Wiley & Sons Ltd.
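
    To make the modelling pipeline concrete, here is a sketch of a ridge-penalized logistic score with the abstract's dual cut-offs (0.24 and 0.44), run on synthetic stand-ins for the six selected predictors. This is an illustration of the pattern, not the published NAFLD ridge score:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 900
X = rng.normal(size=(n, 6))   # stand-ins: ALT, HDL-C, TG, HbA1c, WBC, hypertension
beta = np.array([1.0, -0.8, 0.7, 0.6, 0.4, 0.5])   # assumed effect directions
y = (X @ beta + rng.logistic(size=n) > 0).astype(int)

model = LogisticRegression(penalty="l2", C=1.0)    # L2 penalty = ridge
model.fit(X[:600], y[:600])
p = model.predict_proba(X[600:])[:, 1]

# Dual cut-offs: below the lower one -> rule out, above the upper -> rule in,
# in between -> indeterminate (the abstract reports 0.24 and 0.44).
lo, hi = 0.24, 0.44
ruled_out, ruled_in = p < lo, p > hi
print(f"classifiable fraction: {np.mean(ruled_out | ruled_in):.0%}")
```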

  7. Assessing the performance of community-available global MHD models using key system parameters and empirical relationships

    NASA Astrophysics Data System (ADS)

    Gordeev, E.; Sergeev, V.; Honkonen, I.; Kuznetsova, M.; Rastätter, L.; Palmroth, M.; Janhunen, P.; Tóth, G.; Lyon, J.; Wiltberger, M.

    2015-12-01

    Global magnetohydrodynamic (MHD) modeling is a powerful tool in space weather research and predictions. There are several advanced and still developing global MHD (GMHD) models that are publicly available via the Community Coordinated Modeling Center's (CCMC) Run on Request system, which allows users to simulate the magnetospheric response to different solar wind conditions, including extraordinary events like geomagnetic storms. Systematic validation of GMHD models against observations continues to be a challenge, as does comparative benchmarking of different models against each other. In this paper we describe and test a new approach in which (i) a set of critical large-scale system parameters is explored/tested, which are produced by (ii) a specially designed set of computer runs that simulate realistic statistical distributions of critical solar wind parameters, and are compared to (iii) observation-based empirical relationships for these parameters. Being tested in approximately similar conditions (similar inputs, comparable grid resolution, etc.), the four models publicly available at the CCMC predict rather well the absolute values and variations of those key parameters (magnetospheric size, magnetic field, and pressure) which are directly related to the large-scale magnetospheric equilibrium in the outer magnetosphere, for which MHD is supposed to be a valid approach. At the same time, the models show systematic differences in other parameters, being especially different in predicting the global convection rate, total field-aligned current, and magnetic flux loading into the magnetotail after a north-south interplanetary magnetic field turning. According to the validation results, none of the models emerges as an absolute leader. The new approach suggested for evaluating model performance against reality may be used by model users while planning their investigations, as well as by model developers and those interested in quantitatively benchmarking the models.

  8. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical value of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and, ultimately, reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and is therefore valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as radar track data, into the analysis.

  9. Cross Validation Through Two-Dimensional Solution Surface for Cost-Sensitive SVM.

    PubMed

    Gu, Bin; Sheng, Victor S; Tay, Keng Yeow; Romano, Walter; Li, Shuo

    2017-06-01

    Model selection plays an important role in cost-sensitive SVM (CS-SVM). It has been proven that the global minimum cross-validation (CV) error can be efficiently computed based on the solution path for one-parameter learning problems. However, it is a challenge to obtain the global minimum CV error for CS-SVM based on a one-dimensional solution path and traditional grid search, because CS-SVM has two regularization parameters. In this paper, we propose a solution- and error-surfaces-based CV approach (CV-SES). More specifically, we first compute a two-dimensional solution surface for CS-SVM based on a bi-parameter space partition algorithm, which can fit solutions of CS-SVM for all values of both regularization parameters. Then, we compute a two-dimensional validation error surface for each CV fold, which can fit validation errors of CS-SVM for all values of both regularization parameters. Finally, we obtain the CV error surface by superposing the K validation error surfaces, which allows us to find the global minimum CV error of CS-SVM. Experiments are conducted on seven datasets for cost-sensitive learning and on four datasets for imbalanced learning. Experimental results not only show that the proposed CV-SES has better generalization ability than CS-SVM with various hybrids between grid search and solution path methods, and than the recently proposed cost-sensitive hinge loss SVM with three-dimensional grid search, but also show that CV-SES uses less running time.
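
    For orientation, the grid-search baseline that CV-SES improves on can be sketched with an off-the-shelf SVM, treating the two cost-sensitive regularization parameters as an overall C plus a per-class cost weight (a simplification of the paper's formulation, not its algorithm):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_classification

# Imbalanced toy problem standing in for a cost-sensitive learning task
X, y = make_classification(n_samples=400, weights=[0.8, 0.2], random_state=0)

param_grid = {
    "C": np.logspace(-2, 2, 9),                       # overall regularization
    "class_weight": [{1: w} for w in (1, 2, 5, 10)],  # minority-class cost
}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5,
                      scoring="balanced_accuracy")
search.fit(X, y)
print(search.best_params_, f"CV score={search.best_score_:.3f}")
```

    The limitation motivating the paper is visible here: the grid only samples a finite lattice of the two-parameter space, whereas the solution-surface approach covers all values of both parameters.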

  10. Validation of the alternating conditional estimation algorithm for estimation of flexible extensions of Cox's proportional hazards model with nonlinear constraints on the parameters.

    PubMed

    Wynant, Willy; Abrahamowicz, Michal

    2016-11-01

    Standard optimization algorithms for maximizing likelihood may not be applicable to the estimation of those flexible multivariable models that are nonlinear in their parameters. For applications where the model's structure permits separating estimation of mutually exclusive subsets of parameters into distinct steps, we propose the alternating conditional estimation (ACE) algorithm. We validate the algorithm, in simulations, for estimation of two flexible extensions of Cox's proportional hazards model where the standard maximum partial likelihood estimation does not apply, with simultaneous modeling of (1) nonlinear and time-dependent effects of continuous covariates on the hazard, and (2) nonlinear interaction and main effects of the same variable. We also apply the algorithm in real-life analyses to estimate nonlinear and time-dependent effects of prognostic factors for mortality in colon cancer. Analyses of both simulated and real-life data illustrate good statistical properties of the ACE algorithm and its ability to yield new potentially useful insights about the data structure. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
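
    The alternating structure of the ACE algorithm can be illustrated generically: hold one parameter subset fixed, estimate the other by optimizing the conditional objective, then swap, iterating until the objective stabilizes. The sketch below uses a toy least-squares objective and generic numerical optimizers as a stand-in for the paper's Cox-model estimation steps:

```python
import numpy as np
from scipy.optimize import minimize

def ace(neg_obj, theta1, theta2, tol=1e-6, max_iter=100):
    """Alternate conditional estimation over two mutually exclusive
    parameter subsets theta1 and theta2."""
    prev = np.inf
    for _ in range(max_iter):
        theta1 = minimize(lambda a: neg_obj(a, theta2), theta1).x
        res = minimize(lambda b: neg_obj(theta1, b), theta2)
        theta2 = res.x
        if abs(prev - res.fun) < tol:     # conditional optima stabilized
            break
        prev = res.fun
    return theta1, theta2

# Toy model nonlinear in its parameters: y = a * exp(-b * x) + noise
rng = np.random.default_rng(3)
x = np.linspace(0, 4, 80)
y = 2.0 * np.exp(-0.7 * x) + rng.normal(scale=0.05, size=x.size)
neg_obj = lambda a, b: np.sum((y - a[0] * np.exp(-b[0] * x)) ** 2)
a_hat, b_hat = ace(neg_obj, np.array([1.0]), np.array([1.0]))
print(a_hat, b_hat)   # should approach (2.0, 0.7)
```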

  11. An uncertainty model of acoustic metamaterials with random parameters

    NASA Astrophysics Data System (ADS)

    He, Z. C.; Hu, J. Y.; Li, Eric

    2018-01-01

    Acoustic metamaterials (AMs) are man-made composite materials. However, random uncertainties are unavoidable in the application of AMs, owing to manufacturing and material errors, and they lead to variance in the physical responses of AMs. In this paper, an uncertainty model based on the change-of-variable perturbation stochastic finite element method (CVPS-FEM) is formulated to predict the probability density functions of the physical responses of AMs with random parameters. Three types of physical responses, including the band structure, mode shapes and frequency response function of AMs, are studied in the uncertainty model, which is of great interest in the design of AMs. In this computation, the physical responses of stochastic AMs are expressed as linear functions of the pre-defined random parameters by using a first-order Taylor series expansion and perturbation technique. Then, based on the linear relationships between parameters and responses, the probability density functions of the responses can be calculated by the change-of-variable technique. Three numerical examples are employed to demonstrate the effectiveness of the CVPS-FEM for stochastic AMs, and the results are successfully validated against the Monte Carlo method.
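
    The two ingredients named above, a first-order Taylor linearization and the change-of-variable rule, fit in a few lines: with a response linearized as Y = y0 + g(X - x0) and X Gaussian, pdf_Y(y) = pdf_X(x(y)) |dx/dy|. All numbers below are illustrative assumptions, not values from the paper's examples:

```python
import numpy as np

def response_pdf(y, y0, x0, grad, sigma):
    """PDF of a response linearized as Y = y0 + grad * (X - x0),
    with X ~ N(x0, sigma^2), via the change-of-variable rule."""
    x = x0 + (y - y0) / grad                      # inverse of the linear map
    pdf_x = np.exp(-0.5 * ((x - x0) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return pdf_x / abs(grad)                      # Jacobian factor |dx/dy|

# Example: a band-edge frequency with nominal 1200 Hz, an assumed sensitivity
# of 30 Hz per percent stiffness change, and stiffness sigma = 0.5 percent.
ys = np.linspace(1150, 1250, 5)
print(response_pdf(ys, y0=1200.0, x0=0.0, grad=30.0, sigma=0.5))
```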

  12. Sensitivity of predicted bioaerosol exposure from open windrow composting facilities to ADMS dispersion model parameters.

    PubMed

    Douglas, P; Tyrrel, S F; Kinnersley, R P; Whelan, M; Longhurst, P J; Walsh, K; Pollard, S J T; Drew, G H

    2016-12-15

    Bioaerosols are released in elevated quantities from composting facilities and are associated with negative health effects, although dose-response relationships are not well understood, and require improved exposure classification. Dispersion modelling has great potential to improve exposure classification, but has not yet been extensively used or validated in this context. We present a sensitivity analysis of the ADMS dispersion model specific to input parameter ranges relevant to bioaerosol emissions from open windrow composting. This analysis provides an aid for model calibration by prioritising parameter adjustment and targeting independent parameter estimation. Results showed that predicted exposure was most sensitive to the wet and dry deposition modules and the majority of parameters relating to emission source characteristics, including pollutant emission velocity, source geometry and source height. This research improves understanding of the accuracy of model input data required to provide more reliable exposure predictions. Copyright © 2016. Published by Elsevier Ltd.

  13. Reliability, validity and feasibility of nail ultrasonography in psoriatic arthritis.

    PubMed

    Arbault, Anaïs; Devilliers, Hervé; Laroche, Davy; Cayot, Audrey; Vabres, Pierre; Maillefert, Jean-Francis; Ornetti, Paul

    2016-10-01

    To determine the feasibility, reliability and validity of nail ultrasonography in psoriatic arthritis as an outcome measure. Pilot prospective single-centre study of eight ultrasonography parameters in B mode and power Doppler concerning the distal interphalangeal (DIP) joint, the matrix, the bed and the nail plate. Intra-observer and inter-observer reliability was evaluated for the seven quantitative parameters (ICC and kappa). Correlations between ultrasonography and clinical variables were sought to assess external validity. Feasibility was assessed by the time needed to carry out the examination and the percentage of missing data. Twenty-seven patients with psoriatic arthritis (age 55.0±16.2 years, disease duration 13.4±9.4 years) were included. Of these, 67% presented nail involvement on ultrasonography vs 37% on physical examination (P<0.05). Reliability was good (ICC and weighted kappa>0.75) for the seven quantitative parameters, except for synovitis of the DIP joint in B mode. Synovitis of the DIP joint revealed by ultrasonography correlated with the total number of clinical synovitis and with Doppler US of the nail (matrix and bed). Doppler US of the matrix correlated with VAS pain but not with the ASDAS-CRP or with clinical enthesitis. No significant correlation was found with US nail thickness. The feasibility and reliability of ultrasonography of the nail in psoriatic arthritis appear to be satisfactory. Among the eight parameters evaluated, power Doppler of the matrix, which correlated with local inflammation (DIP joint and bed) and with VAS pain, could become an interesting outcome measure, provided that it is also sensitive to change. Copyright © 2015 Société française de rhumatologie. Published by Elsevier SAS. All rights reserved.

  14. Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.

    We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters, as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.

  15. Selection of regularization parameter in total variation image restoration.

    PubMed

    Liao, Haiyong; Li, Fang; Ng, Michael K

    2009-11-01

    We consider and study total variation (TV) image restoration. In the literature there are several regularization parameter selection methods for Tikhonov regularization problems (e.g., the discrepancy principle and the generalized cross-validation method). However, to our knowledge, these selection methods have not been applied to TV regularization problems. The main aim of this paper is to develop a fast TV image restoration method with an automatic regularization parameter selection scheme to restore blurred and noisy images. The method exploits the generalized cross-validation (GCV) technique to determine inexpensively how much regularization to use in each restoration step. By updating the regularization parameter in each iteration, the restored image can be obtained. Our experimental results for different kinds of noise show that the visual quality and SNRs of images restored by the proposed method are promising. We also demonstrate that the method is efficient, as it can restore images of size 256 x 256 in approximately 20 s in the MATLAB computing environment.
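
    The GCV principle is easiest to state for the quadratic (Tikhonov/ridge) case that the paper generalizes from: pick the lambda minimizing GCV(lambda) = n ||(I - H) b||^2 / tr(I - H)^2, where H is the hat matrix of the regularized solve. A small illustrative sketch (the quadratic case, not the paper's TV variant):

```python
import numpy as np

def gcv_score(A, b, lam):
    """GCV score for Tikhonov/ridge regularization
    x_lam = argmin ||Ax - b||^2 + lam * ||x||^2."""
    n = A.shape[0]
    # Hat matrix mapping b to the fitted values A @ x_lam
    H = A @ np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T)
    resid = (np.eye(n) - H) @ b
    return n * (resid @ resid) / np.trace(np.eye(n) - H) ** 2

rng = np.random.default_rng(7)
A = rng.normal(size=(60, 30))
b = A @ rng.normal(size=30) + 0.1 * rng.normal(size=60)
lams = np.logspace(-4, 2, 25)
best = lams[int(np.argmin([gcv_score(A, b, lam) for lam in lams]))]
print(f"GCV-selected lambda: {best:.4g}")
```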

  16. Analysis and validation of severe storm parameters derived from TITAN in Southeast Brazil

    NASA Astrophysics Data System (ADS)

    Gomes, Ana Maria; Held, Gerhard; Vernini, Rafael; Demetrio Souza, Caio

    2014-05-01

    The implementation of the TITAN (Thunderstorm Identification, Tracking, Analysis and Nowcasting) system at IPMet in December 2005 has provided real-time access to storm severity parameters derived from radar reflectivity, which are being used to identify and warn of potentially severe storms within the 240 km quantitative ranges of the Bauru and Presidente Prudente S-band radars. The potential of the tools available in the TITAN system is being evaluated by using the hail reports received from voluntary hail observers to cross-check the occurrence of hail within the radar range against the TITAN predictions. Part of the ongoing research at IPMet aims to determine "signatures" of severe events; therefore, from 2008 onwards, an online standard form was introduced, allowing for greater detail on the occurrence of a severe event within the 240 km ranges of both radars. The model for the hail report was based on the one initially deployed by the Alberta Hail Program in Canada, and also by the Hail Observer Network established by the CSIR (Council for Scientific and Industrial Research) in Pretoria, South Africa, where it was used for more than 25 years. The TITAN system was deployed to obtain the tracking properties of storms for this analysis. A cell was defined by thresholds of 40 dBZ for the reflectivity and 16 km3 for the volume, observed in at least two consecutive volume scans (15 minutes). Besides tracking and nowcasting the movement of storm cells, TITAN comprises algorithms that allow the identification of potentially severe storm "signatures", such as the hail metrics, which indicate the probability of hail (POH) based on a combination of radar data and knowledge of the vertical temperature distribution of the atmosphere. Two further parameters related to hail-producing storms, the FOKR (Foote-Krauss) index and the HMA (Hail Mass Aloft) index, are also included. The period from 2008 to 2013 was used to process all available information about storm events.

  17. Online cross-validation-based ensemble learning.

    PubMed

    Benkeser, David; Ju, Cheng; Lendle, Sam; van der Laan, Mark

    2018-01-30

    Online estimators update a current estimate with a new incoming batch of data without having to revisit past data, thereby providing streaming estimates that are scalable to big data. We develop flexible, ensemble-based online estimators of an infinite-dimensional target parameter, such as a regression function, in the setting where data are generated sequentially by a common conditional data distribution given summary measures of the past. This setting encompasses a wide range of time-series models and, as a special case, models for independent and identically distributed data. Our estimator considers a large library of candidate online estimators and uses online cross-validation to identify the algorithm with the best performance. We show that by basing estimates on the cross-validation-selected algorithm, we are asymptotically guaranteed to perform as well as the true, unknown best-performing algorithm. We provide extensions of this approach, including online estimation of the optimal ensemble of candidate online estimators. We illustrate excellent performance of our methods using simulations and a real data example where we make streaming predictions of infectious disease incidence using data from a large database. Copyright © 2017 John Wiley & Sons, Ltd.
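
    The core loop can be sketched compactly: each candidate learner is scored on every incoming batch *before* training on it, so the accumulated loss is an honest online CV estimate, and predictions come from the current loss minimizer (the discrete selector; the paper also builds optimal ensembles). A sketch with scikit-learn's partial_fit estimators, using squared error as an assumed loss:

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

class OnlineCVSelector:
    """Discrete online CV selector over a library of online learners."""
    def __init__(self, learners):
        self.learners = learners            # objects with partial_fit / predict
        self.cv_loss = np.zeros(len(learners))

    def update(self, X, y):
        for i, learner in enumerate(self.learners):
            if hasattr(learner, "coef_"):   # initialized by an earlier batch
                pred = learner.predict(X)   # score on held-out batch first...
                self.cv_loss[i] += np.mean((pred - y) ** 2)
            learner.partial_fit(X, y)       # ...then learn from the batch

    def predict(self, X):
        return self.learners[int(np.argmin(self.cv_loss))].predict(X)

# Stream of batches from a linear model; two candidate learning rates
sel = OnlineCVSelector([SGDRegressor(eta0=0.01), SGDRegressor(eta0=0.1)])
rng = np.random.default_rng(0)
for _ in range(50):
    X = rng.normal(size=(32, 5))
    y = X @ np.arange(1.0, 6.0) + rng.normal(size=32)
    sel.update(X, y)
print("accumulated online CV losses:", sel.cv_loss.round(1))
```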

  18. Online Cross-Validation-Based Ensemble Learning

    PubMed Central

    Benkeser, David; Ju, Cheng; Lendle, Sam; van der Laan, Mark

    2017-01-01

    Online estimators update a current estimate with a new incoming batch of data without having to revisit past data thereby providing streaming estimates that are scalable to big data. We develop flexible, ensemble-based online estimators of an infinite-dimensional target parameter, such as a regression function, in the setting where data are generated sequentially by a common conditional data distribution given summary measures of the past. This setting encompasses a wide range of time-series models and as special case, models for independent and identically distributed data. Our estimator considers a large library of candidate online estimators and uses online cross-validation to identify the algorithm with the best performance. We show that by basing estimates on the cross-validation-selected algorithm, we are asymptotically guaranteed to perform as well as the true, unknown best-performing algorithm. We provide extensions of this approach including online estimation of the optimal ensemble of candidate online estimators. We illustrate excellent performance of our methods using simulations and a real data example where we make streaming predictions of infectious disease incidence using data from a large database. PMID:28474419

  19. Effects of space environment on composites: An analytical study of critical experimental parameters

    NASA Technical Reports Server (NTRS)

    Gupta, A.; Carroll, W. F.; Moacanin, J.

    1979-01-01

    A generalized methodology, currently employed at JPL, was used to develop an analytical model for the effects of high-energy electrons and for interactions between electron and ultraviolet effects. Chemical kinetic concepts were applied in defining quantifiable parameters; the need for determining short-lived transient species and their concentrations was demonstrated. The results demonstrate a systematic and cost-effective means of addressing the issues and show applicable qualitative and quantitative relationships between space radiation and simulation parameters. An equally important result is the identification of critical initial experiments necessary to further clarify these relationships. Topics discussed include facility and test design; rastered versus diffuse continuous e-beam; valid acceleration levels; simultaneous versus sequential exposure to different types of radiation; and interruption of test continuity.

  20. Performance Validation Approach for the GTX Air-Breathing Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Trefny, Charles J.; Roche, Joseph M.

    2002-01-01

    The primary objective of the GTX effort is to determine whether or not air-breathing propulsion can enable a launch vehicle to achieve orbit in a single stage. Structural weight, vehicle aerodynamics, and propulsion performance must be accurately known over the entire flight trajectory in order to make a credible assessment. Structural, aerodynamic, and propulsion parameters are strongly interdependent, which necessitates a system approach to design, evaluation, and optimization of a single-stage-to-orbit concept. The GTX reference vehicle serves this purpose, by allowing design, development, and validation of components and subsystems in a system context. The reference vehicle configuration (including propulsion) was carefully chosen so as to provide high potential for structural and volumetric efficiency, and to allow the high specific impulse of air-breathing propulsion cycles to be exploited. Minor evolution of the configuration has occurred as analytical and experimental results have become available. With this development process comes increasing validation of the weight and performance levels used in system performance determination. This paper presents an overview of the GTX reference vehicle and the approach to its performance validation. Subscale test rigs and numerical studies used to develop and validate component performance levels and unit structural weights are outlined. The sensitivity of the equivalent, effective specific impulse to key propulsion component efficiencies is presented. The role of flight demonstration in development and validation is discussed.

  1. Parameter estimation for the 4-parameter Asymmetric Exponential Power distribution by the method of L-moments using R

    USGS Publications Warehouse

    Asquith, William H.

    2014-01-01

    The implementation characteristics of two method of L-moments (MLM) algorithms for parameter estimation of the 4-parameter Asymmetric Exponential Power (AEP4) distribution are studied using the R environment for statistical computing. The objective is to validate the algorithms for general application of the AEP4 using R. An algorithm was introduced in the original study of the L-moments for the AEP4. A second or alternative algorithm is shown to have a larger L-moment-parameter domain than the original. The alternative algorithm is shown to provide reliable parameter production and recovery of L-moments from fitted parameters. A proposal is made for AEP4 implementation in conjunction with the 4-parameter Kappa distribution to create a mixed-distribution framework encompassing the joint L-skew and L-kurtosis domains. The example application provides a demonstration of pertinent algorithms with L-moment statistics and two 4-parameter distributions (AEP4 and the Generalized Lambda) for MLM fitting to a modestly asymmetric and heavy-tailed dataset using R.
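
    The sample L-moments at the heart of an MLM fit can be computed directly from order statistics. The study's tooling is R (e.g., the author's lmomco package provides these estimators); for consistency with the other examples here, a Python sketch of the standard unbiased probability-weighted-moment estimators:

```python
import numpy as np

def sample_lmoments(x):
    """First four sample L-moments as (lambda1, lambda2, tau3, tau4),
    via the unbiased probability-weighted moments b0..b3."""
    x = np.sort(np.asarray(x, float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) * x) / (n * (n - 1))
    b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    l1 = b0                                   # L-location
    l2 = 2 * b1 - b0                          # L-scale
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2           # ratios: L-skew, L-kurtosis

rng = np.random.default_rng(11)
print(sample_lmoments(rng.gumbel(size=5000)))  # Gumbel: tau3 ~ 0.17, tau4 ~ 0.15
```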

  2. Seasonal evolution of soil and plant parameters on the agricultural Gebesee test site: a database for the set-up and validation of EO-LDAS and satellite-aided retrieval models

    NASA Astrophysics Data System (ADS)

    Truckenbrodt, Sina C.; Schmullius, Christiane C.

    2018-03-01

    Ground reference data are a prerequisite for the calibration, update, and validation of retrieval models facilitating the monitoring of land parameters based on Earth Observation data. Here, we describe the acquisition of a comprehensive ground reference database which was created to test and validate the recently developed Earth Observation Land Data Assimilation System (EO-LDAS) and products derived from remote sensing observations in the visible and infrared range. In situ data were collected for seven crop types (winter barley, winter wheat, spring wheat, durum, winter rape, potato, and sugar beet) cultivated on the agricultural Gebesee test site, central Germany, in 2013 and 2014. The database contains information on hyperspectral surface reflectance factors, the evolution of biophysical and biochemical plant parameters, phenology, surface conditions, atmospheric states, and a set of ground control points. Ground reference data were gathered at an approximately weekly resolution and on different spatial scales to investigate variations within and between acreages. In situ data collected less than 1 day apart from satellite acquisitions (RapidEye, SPOT 5, Landsat-7 and -8) with a cloud coverage ≤ 25 % are available for 10 and 15 days in 2013 and 2014, respectively. The measurements show that the investigated growing seasons were characterized by distinct meteorological conditions causing interannual variations in the parameter evolution. Here, the experimental design of the field campaigns, and methods employed in the determination of all parameters, are described in detail. Insights into the database are provided and potential fields of application are discussed. The data will contribute to a further development of crop monitoring methods based on remote sensing techniques. The database is freely available at PANGAEA (https://doi.org/10.1594/PANGAEA.874251).

  3. A low-cost three-dimensional laser surface scanning approach for defining body segment parameters.

    PubMed

    Pandis, Petros; Bull, Anthony Mj

    2017-11-01

    Body segment parameters are used in many different applications in ergonomics as well as in dynamic modelling of the musculoskeletal system. Body segment parameters can be defined using different methods, including techniques that involve time-consuming manual measurements of the human body, used in conjunction with models or equations. In this study, a scanning technique for measuring subject-specific body segment parameters in an easy, fast, accurate and low-cost way was developed and validated. The scanner can obtain the body segment parameters in a single scanning operation, which takes between 8 and 10 s. The results obtained with the system show a standard deviation of 2.5% in volumetric measurements of the upper limb of a mannequin and 3.1% difference between scanning volume and actual volume. Finally, the maximum mean error for the moment of inertia by scanning a standard-sized homogeneous object was 2.2%. This study shows that a low-cost system can provide quick and accurate subject-specific body segment parameter estimates.

  4. Validation of a novel virtual reality simulator for robotic surgery.

    PubMed

    Schreuder, Henk W R; Persson, Jan E U; Wolswijk, Richard G H; Ihse, Ingmar; Schijven, Marlies P; Verheijen, René H M

    2014-01-01

    With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for the use in training of robot-assisted surgery. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were "time to complete" and "economy of motion" (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, visual graphics, movements of instruments, interaction with objects, and the depth perception were all rated as being realistic. The simulator is considered to be a very useful training tool for residents and medical specialist starting with robotic surgery. Face and construct validity for the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery.

  5. Validating the simulation of large-scale parallel applications using statistical characteristics

    DOE PAGES

    Zhang, Deli; Wilke, Jeremiah; Hendry, Gilbert; ...

    2016-03-01

    Simulation is a widely adopted method to analyze and predict the performance of large-scale parallel applications. Validating the hardware model is highly important for complex simulations with a large number of parameters. Common practice involves calculating the percent error between the projected and the real execution time of a benchmark program. However, in a high-dimensional parameter space, this coarse-grained approach often suffers from parameter insensitivity, which may not be known a priori. Moreover, the traditional approach cannot be applied to the validation of software models, such as application skeletons used in online simulations. In this work, we present a methodology and a toolset for validating both hardware and software models by quantitatively comparing fine-grained statistical characteristics obtained from execution traces. Although statistical information has been used in tasks like performance optimization, this is the first attempt to apply it to simulation validation. Lastly, our experimental results show that the proposed evaluation approach offers significant improvement in fidelity when compared to evaluation using total execution time, and the proposed metrics serve as reliable criteria that progress toward automating the simulation tuning process.

  6. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for the quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification, to examine errors associated with the code's solution techniques, and model validation, to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameter sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and to determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.

  7. Concerning the Development of the Wide-Field Optics for WFXT Including Methods of Optimizing X-Ray Optical Prescriptions for Wide-Field Applications

    NASA Technical Reports Server (NTRS)

    Weisskopf, M. C.; Elsner, R. F.; O'Dell, S. L.; Ramsey, B. D.

    2010-01-01

    We present a progress report on the various endeavors we are undertaking at MSFC in support of the Wide Field X-Ray Telescope development. In particular we discuss assembly and alignment techniques, in-situ polishing corrections, and the results of our efforts to optimize mirror prescriptions, including polynomial coefficients, relative shell displacements, and detector placements and tilts. This optimization does not require a blind search through the multi-dimensional parameter space. Under the assumption that the parameters are small enough that second-order expansions are valid, we show that the performance at the detector can be expressed as a quadratic function with numerical coefficients derived from a ray trace through the underlying Wolter I optic. The optimal values for the parameters are found by solving the linear system of equations created by setting the derivatives of this function with respect to each parameter to zero.
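
    In code, that final step is literally one linear solve: for f(p) = f0 + g.p + 0.5 p.H.p, setting grad f = g + H p = 0 gives p* = -H^{-1} g. The gradient and Hessian coefficients below are hypothetical stand-ins for the ray-trace-derived values:

```python
import numpy as np

def quadratic_optimum(g, H):
    """Stationary point of f(p) = f0 + g.p + 0.5 * p.H.p."""
    return np.linalg.solve(H, -g)

# Illustrative 3-parameter example (coefficients would come from ray traces)
g = np.array([0.4, -1.2, 0.3])
H = np.array([[2.0, 0.1, 0.0],
              [0.1, 1.5, 0.2],
              [0.0, 0.2, 1.0]])
p_opt = quadratic_optimum(g, H)
# Positive-definite H confirms the stationary point is a minimum
print(p_opt, np.all(np.linalg.eigvalsh(H) > 0))
```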

  8. Estimation of genetic parameters for heat stress, including dominance gene effects, on milk yield in Thai Holstein dairy cattle.

    PubMed

    Boonkum, Wuttigrai; Duangjinda, Monchai

    2015-03-01

    Heat stress in tropical regions is a major factor that strongly and negatively affects milk production in dairy cattle. Genetic selection for heat tolerance in dairy cattle is a powerful technique to improve genetic performance. Therefore, the current study aimed to estimate genetic parameters and investigate the threshold point of heat stress for milk yield. Data included 52 701 test-day milk yield records for the first parity from 6247 Thai Holstein dairy cattle, covering the period 1990 to 2007. A random regression test-day model with EM-REML was used to estimate variance components, genetic parameters and milk production loss. A decline in milk production was found when the temperature and humidity index (THI) exceeded a threshold of 74, and the decline was associated with a high percentage of Holstein genetics. All variance component estimates increased with THI. The estimate of heritability of test-day milk yield was 0.231. Dominance variance as a proportion of additive variance (0.035) indicated that non-additive effects might not be of concern for milk genetics studies in Thai Holstein cattle. Correlations between genetic and permanent environmental effects, for regular conditions and due to heat stress, were -0.223 and -0.521, respectively. The heritability and genetic correlations from this study show that simultaneous selection for milk production and heat tolerance is possible. © 2014 Japanese Society of Animal Science.

  9. Twelve hour reproducibility of choroidal blood flow parameters in healthy subjects.

    PubMed

    Polska, E; Polak, K; Luksch, A; Fuchsjager-Mayrl, G; Petternel, V; Findl, O; Schmetterer, L

    2004-04-01

    To investigate the reproducibility and potential diurnal variation of choroidal blood flow parameters in healthy subjects over a period of 12 hours. The choroidal blood flow parameters of 16 healthy non-smoking subjects were measured at five time points during the day (8:00, 11:00, 14:00, 17:00, and 20:00). Outcome parameters were pulsatile ocular blood flow as assessed by pneumotonometry, fundus pulsation amplitude as assessed by laser interferometry, blood velocities in the ophthalmic and posterior ciliary arteries as assessed by colour Doppler imaging, and choroidal blood flow, volume, and velocity as assessed by fundus-camera-based laser Doppler flowmetry. The coefficient of variation and the maximum change from baseline in an individual were calculated for each outcome parameter. None of the techniques used found a diurnal variation in choroidal blood flow. Coefficients of variation were between 2.9% and 13.6% for all outcome parameters. The maximum change from baseline in an individual was much higher, ranging from 11.2% to 58.8%. These data indicate that in healthy subjects the selected techniques provide adequate reproducibility for use in clinical studies. Variability may, however, be considerably higher in older subjects or subjects with ocular disease. The larger individual differences in flow parameter readings limit the use of the techniques in clinical practice. To overcome problems with measurement validity, a clinical trial should include as many choroidal blood flow outcome parameters as possible to check for consistency.

  10. Comparing Parameter Estimation Techniques for an Electrical Power Transformer Oil Temperature Prediction Model

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    1999-01-01

    This paper examines various sources of error in MIT's improved top oil temperature rise over ambient temperature model and estimation process. The sources of error are the current parameter estimation technique, quantization noise, and post-processing of the transformer data. Results from this paper will show that an output error parameter estimation technique should be selected to replace the current least squares estimation technique. The output error technique obtained accurate predictions of transformer behavior, revealed the best error covariance, obtained consistent parameter estimates, and provided for valid and sensible parameters. This paper will also show that the output error technique should be used to minimize errors attributed to post-processing (decimation) of the transformer data. Models used in this paper are validated using data from a large transformer in service.
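
    The distinction driving the paper's recommendation is that an output-error estimator simulates the model forward from its own predicted outputs and fits the whole simulated trajectory to the data, rather than regressing each sample on the measured previous sample (which biases estimates when measurements are noisy). A toy sketch with an assumed first-order discrete-time top-oil temperature model, a stand-in rather than MIT's actual model:

```python
import numpy as np
from scipy.optimize import least_squares

def simulate(theta, u, t0):
    """Simulate t[k] = a * t[k-1] + b * u[k-1] from the model's own outputs."""
    a, b = theta
    t = np.empty_like(u)
    t[0] = t0
    for k in range(1, u.size):
        t[k] = a * t[k - 1] + b * u[k - 1]
    return t

# Synthetic data: u is a squared per-unit load current driving temperature rise
rng = np.random.default_rng(5)
u = rng.uniform(0.2, 1.0, 300) ** 2
t_meas = simulate((0.95, 1.5), u, t0=10.0) + rng.normal(scale=0.3, size=300)

# Output-error fit: minimize residuals between the simulated trajectory
# and the measured one, over the model parameters (a, b)
fit = least_squares(lambda th: simulate(th, u, t_meas[0]) - t_meas, x0=(0.5, 0.5))
print("estimated (a, b):", fit.x)   # should approach (0.95, 1.5)
```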

  11. International validation of quality indicators for evaluating priority setting in low income countries: process and key lessons.

    PubMed

    Kapiriri, Lydia

    2017-06-19

    While there have been efforts to develop frameworks to guide healthcare priority setting, there has been limited focus on evaluation frameworks. Moreover, while the few existing frameworks identify quality indicators for successful priority setting, they do not provide users with strategies to verify these indicators. Kapiriri and Martin (Health Care Anal 18:129-147, 2010) developed a framework for evaluating priority setting in low and middle income countries. This framework provides BOTH parameters for successful priority setting and proposed means of their verification. This paper presents results from a validation of the framework carried out before its use in real-life contexts. The framework validation involved 53 policy makers and priority setting researchers at the global, national and sub-national levels (in Uganda). They were requested to indicate the relative importance of the proposed parameters as well as the feasibility of obtaining the related information. We also pilot tested the proposed means of verification. Almost all the respondents evaluated all the parameters, including the contextual factors, as 'very important'. However, some respondents at the global level rated 'presence of incentives to comply', 'reduced disagreements', 'increased public understanding', 'improved institutional accountability' and 'meeting the ministry of health objectives' as less important, which could be a reflection of their levels of decision making. All the proposed means of verification were assessed as feasible, with the exception of meeting observations, which would require an insider. These findings were consistent with those obtained from the pilot testing, and they are relevant to policy makers and researchers involved in priority setting in low and middle income countries. To the best of our knowledge, this is one of the few initiatives that has involved the potential users of a framework (at the global level and in a low income country) in its validation. The favorable validation results support the framework's subsequent application in real-life priority setting contexts.

  12. Critical discussion of evaluation parameters for inter-observer variability in target definition for radiation therapy.

    PubMed

    Fotina, I; Lütgendorf-Caucig, C; Stock, M; Pötter, R; Georg, D

    2012-02-01

    Inter-observer studies represent a valid method for the evaluation of target definition uncertainties and contouring guidelines. However, data from the literature do not yet give clear guidelines for reporting contouring variability. Thus, the purpose of this work was to compare and discuss various methods to determine variability on the basis of clinical cases and a literature review. In this study, 7 prostate and 8 lung cases were contoured on CT images by 8 experienced observers. Analysis of variability included descriptive statistics, calculation of overlap measures, and statistical measures of agreement. Cross tables with ratios and correlations were established for overlap parameters. It was shown that the minimal set of parameters to be reported should include at least one of three volume overlap measures (i.e., generalized conformity index, Jaccard coefficient, or conformation number). High correlation between these parameters and scatter of the results was observed. A combination of descriptive statistics, overlap measure, and statistical measure of agreement or reliability analysis is required to fully report the interrater variability in delineation.
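
    Of the three volume overlap measures named above, the Jaccard coefficient has a fixed definition; the pairwise form of the generalized conformity index below is one common construction and is an assumption here, since the abstract does not define it. The masks are toy rasterized contours.

      import numpy as np

      def jaccard(a, b):
          """Jaccard coefficient |A ∩ B| / |A ∪ B| for two boolean masks."""
          a, b = np.asarray(a, bool), np.asarray(b, bool)
          union = np.logical_or(a, b).sum()
          return np.logical_and(a, b).sum() / union if union else 1.0

      def generalized_conformity(masks):
          """Sum of pairwise intersections over sum of pairwise unions."""
          pairs = [(m1, m2) for i, m1 in enumerate(masks) for m2 in masks[i + 1:]]
          inter = sum(np.logical_and(m1, m2).sum() for m1, m2 in pairs)
          union = sum(np.logical_or(m1, m2).sum() for m1, m2 in pairs)
          return inter / union

      m1 = np.zeros((50, 50), bool); m1[10:30, 10:30] = True
      m2 = np.zeros((50, 50), bool); m2[12:32, 12:32] = True
      print(jaccard(m1, m2), generalized_conformity([m1, m2]))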

  13. Psychometric properties including reliability, validity and responsiveness of the Majeed pelvic score in patients with chronic sacroiliac joint pain.

    PubMed

    Bajada, Stefan; Mohanty, Khitish

    2016-06-01

    The Majeed scoring system is a disease-specific outcome measure that was originally designed to assess pelvic injuries. The aim of this study was to determine the psychometric properties of the Majeed scoring system for chronic sacroiliac joint pain. Internal consistency, content validity, criterion validity, construct validity and responsiveness to change was assessed prospectively for the Majeed scoring system in a cohort of 60 patients diagnosed with sacroiliac joint pain. This diagnosis was confirmed with CT-guided sacroiliac joint anaesthetic block. The overall Majeed score showed acceptable internal consistency (Cronbach alpha = 0.63). Similarly, it showed acceptable floor (0 %) and ceiling (0 %) effects. On the other hand, the domains of pain, work, sitting and sexual intercourse had high (>30 %) floor effects. Significant correlation with the physical component of the Short Form-36 (p = 0.005) and Oswestry disability index (p ≤ 0.001) was found indicating acceptable criterion validity. The overall Majeed score showed acceptable construct validity with all five developed hypotheses showing significance (p ≤ 0.05). The overall Majeed score showed acceptable responsiveness to change with a large (≥0.80) effect size and standardized response mean. Overall the Majeed scoring system demonstrated acceptable psychometric properties for outcome assessment in chronic sacroiliac joint pain. Thus, its use in this condition is adequate. However, some domains demonstrated suboptimal performance indicating that improvement might be achieved with the development of an outcome measure specific for sacroiliac joint dysfunction and degeneration.
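
    The internal consistency and floor/ceiling statistics quoted above are simple to compute. A minimal sketch with simulated questionnaire data (the formula is the standard Cronbach alpha; all numbers are invented):

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_subjects, k_items); alpha = k/(k-1) * (1 - sum item var / total var)."""
          items = np.asarray(items, float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_var / total_var)

      def floor_ceiling(scores, lo, hi):
          """Percent of respondents at the scale minimum (floor) and maximum (ceiling)."""
          scores = np.asarray(scores, float)
          return 100.0 * (scores == lo).mean(), 100.0 * (scores == hi).mean()

      rng = np.random.default_rng(1)
      latent = rng.normal(0.0, 1.0, (60, 1))            # common trait
      items = latent + rng.normal(0.0, 1.0, (60, 7))    # 7 noisy domain scores
      print(cronbach_alpha(items))                      # ~0.87 in expectation here

      domain = np.clip(np.round(10 + 20 * latent.ravel()), 0, 100)
      print(floor_ceiling(domain, 0, 100))              # a domain with a large floor effect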

  14. Genetic parameter estimates for carcass traits and visual scores including or not genomic information.

    PubMed

    Gordo, D G M; Espigolan, R; Tonussi, R L; Júnior, G A F; Bresolin, T; Magalhães, A F Braga; Feitosa, F L; Baldi, F; Carvalheiro, R; Tonhati, H; de Oliveira, H N; Chardulo, L A L; de Albuquerque, L G

    2016-05-01

    The objective of this study was to determine whether visual scores used as selection criteria in Nellore breeding programs are effective indicators of carcass traits measured after slaughter. Additionally, this study evaluated the effect of different structures of the relationship matrix ( and ) on the estimation of genetic parameters and on the prediction accuracy of breeding values. There were 13,524 animals for visual scores of conformation (CS), finishing precocity (FP), and muscling (MS) and 1,753, 1,747, and 1,564 for LM area (LMA), backfat thickness (BF), and HCW, respectively. Of these, 1,566 animals were genotyped using a high-density panel containing 777,962 SNP. Six analyses were performed using multitrait animal models, each including the 3 visual scores and 1 carcass trait. For the visual scores, the model included direct additive genetic and residual random effects and the fixed effects of contemporary group (defined by year of birth, management group at yearling, and farm) and the linear effect of age of animal at yearling. The same model was used for the carcass traits, replacing the effect of age of animal at yearling with the linear effect of age of animal at slaughter. The variance and covariance components were estimated by the REML method in analyses using the numerator relationship matrix () or combining the genomic and the numerator relationship matrices (). The heritability estimates for the visual scores obtained with the 2 methods were similar and of moderate magnitude (0.23-0.34), indicating that these traits should response to direct selection. The heritabilities for LMA, BF, and HCW were 0.13, 0.07, and 0.17, respectively, using matrix and 0.29, 0.16, and 0.23, respectively, using matrix . The genetic correlations between the visual scores and carcass traits were positive, and higher correlations were generally obtained when matrix was used. Considering the difficulties and cost of measuring carcass traits postmortem, visual scores of

  15. Validation of column-based chromatography processes for the purification of proteins. Technical report No. 14.

    PubMed

    2008-01-01

    PDA Technical Report No. 14 has been written to provide current best practices, such as application of risk-based decision making, based in sound science to provide a foundation for the validation of column-based chromatography processes and to expand upon information provided in Technical Report No. 42, Process Validation of Protein Manufacturing. The intent of this technical report is to provide an integrated validation life-cycle approach that begins with the use of process development data for the definition of operational parameters as a basis for validation, confirmation, and/or minor adjustment to these parameters at manufacturing scale during production of conformance batches and maintenance of the validated state throughout the product's life cycle.

  16. Validation of the i-STAT system for the analysis of blood parameters in fish

    PubMed Central

    Harter, T. S.; Shartau, R. B.; Brauner, C. J.; Farrell, A. P.

    2014-01-01

    Portable clinical analysers, such as the i-STAT system, are increasingly being used for blood analysis in animal ecology and physiology because of their portability and easy operation. Although originally conceived for clinical application and to replace robust but lengthy techniques, researchers have extended the use of the i-STAT system outside of humans and even to poikilothermic fish, with only limited validation. The present study analysed a range of blood parameters [pH, haematocrit (Hct), haemoglobin (Hb), HCO3−, partial pressure of CO2 (PCO2), partial pressure of O2 (PO2), Hb saturation (sO2) and Na+ concentration] in a model teleost fish (rainbow trout, Oncorhynchus mykiss) using the i-STAT system (CG8+ cartridges) and established laboratory techniques. This methodological comparison was performed at two temperatures (10 and 20°C), two haematocrits (low and high) and three PCO2 levels (0.5, 1.0 and 1.5%). Our results indicate that pH was measured accurately with the i-STAT system over a physiological pH range and using the i-STAT temperature correction. Haematocrit was consistently underestimated by the i-STAT, while the measurements of Na+, PCO2, HCO3− and PO2 were variably inaccurate over the range of values typically found in fish. The algorithm that the i-STAT uses to calculate sO2 did not yield meaningful results on rainbow trout blood. Application of conversion factors to correct i-STAT measurements is not recommended, owing to the significant effects of temperature, Hct and PCO2 on the measurement errors and to the complex interactions that may exist among them. In conclusion, the i-STAT system can easily generate fast results from rainbow trout whole blood, but many of the values are inaccurate. PMID:27293658
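
    The abstract does not name the agreement statistics used, but a Bland-Altman bias with 95% limits of agreement is a typical way to compare a portable analyser against a bench method; the paired pH readings below are made up.

      import numpy as np

      def bland_altman(device, reference):
          """Mean difference (bias) and 95% limits of agreement."""
          diff = np.asarray(device, float) - np.asarray(reference, float)
          bias, sd = diff.mean(), diff.std(ddof=1)
          return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

      istat = np.array([7.81, 7.75, 7.92, 7.66, 7.88])   # hypothetical i-STAT pH
      bench = np.array([7.80, 7.78, 7.90, 7.70, 7.86])   # hypothetical bench pH
      print(bland_altman(istat, bench))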

  17. Cleaning verification: A five parameter study of a Total Organic Carbon method development and validation for the cleaning assessment of residual detergents in manufacturing equipment.

    PubMed

    Li, Xue; Ahmad, Imad A Haidar; Tam, James; Wang, Yan; Dao, Gina; Blasko, Andrei

    2018-02-05

    A Total Organic Carbon (TOC) based analytical method to quantitate trace residues of clean-in-place (CIP) detergents CIP100® and CIP200® on the surfaces of pharmaceutical manufacturing equipment was developed and validated. Five factors affecting the development and validation of the method were identified: diluent composition, diluent volume, extraction method, location for TOC sample preparation, and oxidant flow rate. Key experimental parameters were optimized to minimize contamination and to improve the sensitivity, recovery, and reliability of the method. The optimized concentration of the phosphoric acid in the swabbing solution was 0.05 M, and the optimal volume of the sample solution was 30 mL. The swab extraction method was 1 min of sonication. The use of a clean room, as compared to an isolated lab environment, was not required for method validation. The method was demonstrated to be linear with a correlation coefficient (R) of 0.9999. The average recoveries from stainless steel surfaces at multiple spike levels were >90%. The repeatability and intermediate precision results were ≤5% across the 2.2-6.6 ppm range (50-150% of the target maximum carry over, MACO, limit). The method was also shown to be sensitive with a detection limit (DL) of 38 ppb and a quantitation limit (QL) of 114 ppb. The method validation demonstrated that the developed method is suitable for its intended use. The methodology developed in this study is generally applicable to the cleaning verification of any organic detergents used for the cleaning of pharmaceutical manufacturing equipment made of electropolished stainless steel material. Copyright © 2017 Elsevier B.V. All rights reserved.
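
    Figures of the kind reported above (R, DL, QL) follow from an ordinary calibration regression; the ICH-style limits DL = 3.3·σ/S and QL = 10·σ/S use the residual standard deviation σ and slope S. The calibration points below are invented.

      import numpy as np

      conc = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])             # ppm (made up)
      resp = np.array([0.052, 0.101, 0.199, 0.405, 0.601, 0.803])

      slope, intercept = np.polyfit(conc, resp, 1)
      pred = slope * conc + intercept
      resid_sd = np.sqrt(((resp - pred) ** 2).sum() / (len(conc) - 2))
      r = np.corrcoef(conc, resp)[0, 1]

      dl = 3.3 * resid_sd / slope      # detection limit, ppm
      ql = 10.0 * resid_sd / slope     # quantitation limit, ppm
      print(f"R = {r:.4f}, DL = {dl * 1000:.0f} ppb, QL = {ql * 1000:.0f} ppb")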

  18. Dynamic imaging model and parameter optimization for a star tracker.

    PubMed

    Yan, Jinyun; Jiang, Jie; Zhang, Guangjun

    2016-03-21

    Under dynamic conditions, star spots move across the image plane of a star tracker and form a smeared star image. This smearing effect increases errors in star position estimation and degrades attitude accuracy. First, an analytical energy distribution model of a smeared star spot is established based on a line segment spread function because the dynamic imaging process of a star tracker is equivalent to the static imaging process of linear light sources. The proposed model, which has a clear physical meaning, explicitly reflects the key parameters of the imaging process, including incident flux, exposure time, velocity of a star spot in an image plane, and Gaussian radius. Furthermore, an analytical expression of the centroiding error of the smeared star spot is derived using the proposed model. An accurate and comprehensive evaluation of centroiding accuracy is obtained based on the expression. Moreover, analytical solutions of the optimal parameters are derived to achieve the best performance in centroid estimation. Finally, we perform numerical simulations and a night sky experiment to validate the correctness of the dynamic imaging model, the centroiding error expression, and the optimal parameters.
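
    The paper derives an analytical line-segment-spread model and optimal parameters, which are not reproduced here. The sketch below only shows the generic intensity-weighted centroid applied to a synthetic smeared spot (a Gaussian integrated along a short motion segment), which is the quantity whose error the model predicts.

      import numpy as np

      def centroid(image):
          """Intensity-weighted centre of mass of a (possibly smeared) star spot."""
          image = np.asarray(image, float)
          ys, xs = np.mgrid[0:image.shape[0], 0:image.shape[1]]
          return (xs * image).sum() / image.sum(), (ys * image).sum() / image.sum()

      # synthetic smear: Gaussian spot displaced along x during the exposure
      y, x = np.mgrid[0:32, 0:32]
      spot = sum(np.exp(-((x - (12 + s)) ** 2 + (y - 16) ** 2) / (2 * 2.0 ** 2))
                 for s in np.linspace(0, 6, 25))
      print(centroid(spot))    # ~(15, 16): midpoint of the smear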

  19. NASA Sea Ice and Snow Validation Program for the DMSP SSM/I: NASA DC-8 flight report

    NASA Technical Reports Server (NTRS)

    Cavalieri, D. J.

    1988-01-01

    In June 1987 a new microwave sensor called the Special Sensor Microwave Imager (SSM/I) was launched as part of the Defense Meteorological Satellite Program (DMSP). In recognition of the importance of this sensor to the polar research community, NASA developed a program to acquire the data, to convert the data into sea ice parameters, and finally to validate and archive both the SSM/I radiances and the derived sea ice parameters. Central to NASA's sea ice validation program was a series of SSM/I aircraft underflights with the NASA DC-8 airborne Laboratory. The mission (the Arctic '88 Sea Ice Mission) was completed in March 1988. This report summarizes the mission and includes a summary of aircraft instrumentation, coordination with participating Navy aircraft, flight objectives, flight plans, data collected, SSM/I orbits for each day during the mission, and lists several piggyback experiments supported during this mission.

  20. Validation Testing of a Peridynamic Impact Damage Model Using NASA's Micro-Particle Gun

    NASA Technical Reports Server (NTRS)

    Baber, Forrest E.; Zelinski, Brian J.; Guven, Ibrahim; Gray, Perry

    2017-01-01

    Through a collaborative effort between the Virginia Commonwealth University and Raytheon, a peridynamic model for sand impact damage has been developed [1-3]. Model development has focused on simulating impacts of sand particles on ZnS traveling at velocities consistent with aircraft take-off and landing speeds. The model reproduces common features of impact damage including pit and radial cracks, and, under some conditions, lateral cracks. This study focuses on a preliminary validation exercise in which simulation results from the peridynamic model are compared to a limited experimental data set generated by NASA's recently developed micro-particle gun (MPG). The MPG facility measures the dimensions and incoming and rebound velocities of the impact particles. It also links each particle to a specific impact site and its associated damage. In this validation exercise parameters of the peridynamic model are adjusted to fit the experimentally observed pit diameter, average length of radial cracks and rebound velocities for 4 impacts of 300 µm glass beads on ZnS. Results indicate that a reasonable fit of these impact characteristics can be obtained by suitable adjustment of the peridynamic input parameters, demonstrating that the MPG can be used effectively as a validation tool for impact modeling and that the peridynamic sand impact model described herein possesses not only a qualitative but also a quantitative ability to simulate sand impact events.

  1. Automatic tree parameter extraction by a Mobile LiDAR System in an urban context.

    PubMed

    Herrero-Huerta, Mónica; Lindenbergh, Roderik; Rodríguez-Gonzálvez, Pablo

    2018-01-01

    In an urban context, tree data are used in city planning, in locating hazardous trees and in environmental monitoring. This study focuses on developing an innovative methodology to automatically estimate the most relevant individual structural parameters of urban trees sampled by a Mobile LiDAR System at city level. These parameters include the Diameter at Breast Height (DBH), which was estimated by circle fitting of the points belonging to different height bins using RANSAC. In the case of non-circular trees, DBH is calculated by the maximum distance between extreme points. Tree sizes were extracted through a connectivity analysis. Crown Base Height, defined as the length until the bottom of the live crown, was calculated by voxelization techniques. For estimating Canopy Volume, procedures of mesh generation and α-shape methods were implemented. Also, tree location coordinates were obtained by means of Principal Component Analysis. The workflow has been validated on 29 trees of different species sampling a stretch of road 750 m long in Delft (The Netherlands) and tested on a larger dataset containing 58 individual trees. The validation was done against field measurements. The DBH parameter had a correlation R2 value of 0.92 for the 20 cm height bin, which provided the best results. Moreover, the influence of the number of points used for DBH estimation, considering different height bins, was investigated. The assessment of the other inventory parameters yielded correlation coefficients higher than 0.91. The quality of the results confirms the feasibility of the proposed methodology, providing scalability to a comprehensive analysis of urban trees.
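
    The abstract fits circles to height-bin points with RANSAC. The sketch below shows only the algebraic (Kåsa) least-squares circle fit that could serve as the inner model of such a RANSAC loop; the stem points are synthetic.

      import numpy as np

      def fit_circle(xy):
          """Kåsa fit: solve x² + y² = a·x + b·y + c, centre (a/2, b/2)."""
          x, y = xy[:, 0], xy[:, 1]
          A = np.column_stack([x, y, np.ones_like(x)])
          (a, b, c), *_ = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)
          cx, cy = a / 2.0, b / 2.0
          return cx, cy, np.sqrt(c + cx ** 2 + cy ** 2)

      rng = np.random.default_rng(2)
      theta = rng.uniform(0, 2 * np.pi, 120)     # noisy slice of a 0.18 m radius stem
      pts = np.column_stack([0.18 * np.cos(theta), 0.18 * np.sin(theta)])
      pts += rng.normal(0, 0.005, pts.shape)
      cx, cy, r = fit_circle(pts)
      print(f"DBH ≈ {2 * r:.3f} m")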

  2. Automatic tree parameter extraction by a Mobile LiDAR System in an urban context

    PubMed Central

    Lindenbergh, Roderik; Rodríguez-Gonzálvez, Pablo

    2018-01-01

    In an urban context, tree data are used in city planning, in locating hazardous trees and in environmental monitoring. This study focuses on developing an innovative methodology to automatically estimate the most relevant individual structural parameters of urban trees sampled by a Mobile LiDAR System at city level. These parameters include the Diameter at Breast Height (DBH), which was estimated by circle fitting of the points belonging to different height bins using RANSAC. In the case of non-circular trees, DBH is calculated by the maximum distance between extreme points. Tree sizes were extracted through a connectivity analysis. Crown Base Height, defined as the length until the bottom of the live crown, was calculated by voxelization techniques. For estimating Canopy Volume, procedures of mesh generation and α-shape methods were implemented. Also, tree location coordinates were obtained by means of Principal Component Analysis. The workflow has been validated on 29 trees of different species sampling a stretch of road 750 m long in Delft (The Netherlands) and tested on a larger dataset containing 58 individual trees. The validation was done against field measurements. The DBH parameter had a correlation R2 value of 0.92 for the 20 cm height bin, which provided the best results. Moreover, the influence of the number of points used for DBH estimation, considering different height bins, was investigated. The assessment of the other inventory parameters yielded correlation coefficients higher than 0.91. The quality of the results confirms the feasibility of the proposed methodology, providing scalability to a comprehensive analysis of urban trees. PMID:29689076

  3. Trunk-acceleration based assessment of gait parameters in older persons: a comparison of reliability and validity of four inverted pendulum based estimations.

    PubMed

    Zijlstra, Agnes; Zijlstra, Wiebren

    2013-09-01

    Inverted pendulum (IP) models of human walking allow for wearable motion-sensor based estimations of spatio-temporal gait parameters during unconstrained walking in daily-life conditions. At present it is unclear to what extent different IP based estimations yield different results, and reliability and validity have not been investigated in older persons without a specific medical condition. The aim of this study was to compare reliability and validity of four different IP based estimations of mean step length in independent-living older persons. Participants were assessed twice and walked at different speeds while wearing a tri-axial accelerometer at the lower back. For all step-length estimators, test-retest intra-class correlations approached or were above 0.90. Intra-class correlations with reference step length were above 0.92 with a mean error of 0.0 cm when (1) multiplying the estimated center-of-mass displacement during a step by an individual correction factor in a simple IP model, or (2) adding an individual constant for bipedal stance displacement to the estimated displacement during single stance in a 2-phase IP model. When applying generic corrections or constants in all subjects (i.e. multiplication by 1.25, or adding 75% of foot length), correlations were above 0.75 with a mean error of respectively 2.0 and 1.2 cm. Although the results indicate that an individual adjustment of the IP models provides better estimations of mean step length, the ease of a generic adjustment can be favored when merely evaluating intra-individual differences. Further studies should determine the validity of these IP based estimations for assessing gait in daily life. Copyright © 2013 Elsevier B.V. All rights reserved.
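
    The geometry behind such inverted pendulum estimators is compact: with pendulum length l and vertical centre-of-mass excursion h during a step, the step length is d = 2·sqrt(2·l·h − h²), scaled by a correction factor (the generic value 1.25 is the one quoted above). A sketch with invented inputs:

      import numpy as np

      def ip_step_length(h, pendulum_length, correction=1.25):
          """Inverted-pendulum step length 2*sqrt(2*l*h - h^2), times a
          correction factor (generic 1.25 here; individual factors did better)."""
          return correction * 2.0 * np.sqrt(2.0 * pendulum_length * h - h ** 2)

      # e.g. 2.5 cm CoM excursion, 0.9 m pendulum length (both made up)
      print(f"{ip_step_length(h=0.025, pendulum_length=0.9):.2f} m")   # ≈ 0.53 m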

  4. Progress in Validation of Wind-US for Ramjet/Scramjet Combustion

    NASA Technical Reports Server (NTRS)

    Engblom, William A.; Frate, Franco C.; Nelson, Chris C.

    2005-01-01

    Validation of the Wind-US flow solver against two sets of experimental data involving high-speed combustion is attempted. First, the well-known Burrows-Kurkov supersonic hydrogen-air combustion test case is simulated, and the sensitivity of ignition location and combustion performance to key parameters is explored. Second, a numerical model is developed for simulation of an X-43B candidate, full-scale, JP-7-fueled, internal flowpath operating in ramjet mode. Numerical results using an ethylene-air chemical kinetics model are directly compared against previously existing pressure-distribution data along the entire flowpath, obtained in direct-connect testing conducted at NASA Langley Research Center. Comparison to derived quantities such as burn efficiency and thermal throat location are also made. Reasonable to excellent agreement with experimental data is demonstrated for key parameters in both simulation efforts. Additional Wind-US features needed to improve simulation efforts are described herein, including maintaining stagnation conditions at inflow boundaries for multi-species flow. An open issue regarding the sensitivity of isolator unstart to key model parameters is briefly discussed.

  5. Face validity, construct validity and training benefits of a virtual reality TURP simulator.

    PubMed

    Bright, Elizabeth; Vine, Samuel; Wilson, Mark R; Masters, Rich S W; McGrath, John S

    2012-01-01

    To assess face validity, construct validity and the training benefits of a virtual reality TURP simulator. 11 novices (no TURP experience) and 7 experts (>200 TURPs) completed a virtual reality median lobe prostate resection task on the TURPsim™ (Simbionix USA Corp., Cleveland, OH). Performance indicators (percentage of prostate resected (PR), percentage of capsular resection (CR) and time diathermy loop active without tissue contact (TAWC)) were recorded via the TURPsim™ and compared between novices and experts to assess construct validity. Verbal comments provided by experts following task completion were used to assess face validity. Repeated attempts of the task by the novices were analysed to assess the training benefits of the TURPsim™. Experts resected a significantly greater percentage of prostate per minute (p < 0.01) and had significantly less active diathermy time without tissue contact (p < 0.01) than novices. After practice, novices were able to perform the simulation more effectively, with significant improvement in all measured parameters. Improvement in performance was noted in novices following repetitive training, as evidenced by improved TAWC scores that were not significantly different from the expert group (p = 0.18). This study has established face and construct validity for the TURPsim™. The potential benefit in using this tool to train novices has also been demonstrated. Copyright © 2012 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
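
    The abstract reports group comparisons without naming the test; for small expert/novice samples a non-parametric Mann-Whitney comparison is a common choice, sketched here with invented performance scores.

      import numpy as np
      from scipy.stats import mannwhitneyu

      # hypothetical % of prostate resected per minute
      novices = np.array([1.1, 0.9, 1.3, 1.0, 0.8, 1.2, 1.0, 1.1, 0.9, 1.2, 1.0])
      experts = np.array([2.4, 2.1, 2.8, 2.2, 2.6, 2.0, 2.5])

      u, p = mannwhitneyu(experts, novices, alternative="greater")
      print(f"U = {u}, p = {p:.4f}")   # construct validity: experts outperform novices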

  6. Model development and validation of geometrically complex eddy current coils using finite element methods

    NASA Astrophysics Data System (ADS)

    Brown, Alexander; Eviston, Connor

    2017-02-01

    Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Realistic simulations of eddy current inspections are required for model assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real world situations including varying probe dimensions and orientations along with complex probe geometries. This will also enable creation of a probe model library database with variable parameters. Verification and validation was performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models were able to correctly model the probe and conductor interactions and accurately calculate the change in impedance of several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library to give experimenters easy access to powerful parameter based eddy current models for alternate project applications.

  7. Validation of a Novel Virtual Reality Simulator for Robotic Surgery

    PubMed Central

    Schreuder, Henk W. R.; Persson, Jan E. U.; Wolswijk, Richard G. H.; Ihse, Ingmar; Schijven, Marlies P.; Verheijen, René H. M.

    2014-01-01

    Objective. With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for the use in training of robot-assisted surgery. Methods. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Results. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were “time to complete” and “economy of motion” (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, visual graphics, movements of instruments, interaction with objects, and the depth perception were all rated as being realistic. The simulator is considered to be a very useful training tool for residents and medical specialist starting with robotic surgery. Conclusions. Face and construct validity for the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery. PMID:24600328

  8. Robust linear parameter-varying control of blood pressure using vasoactive drugs

    NASA Astrophysics Data System (ADS)

    Luspay, Tamas; Grigoriadis, Karolos

    2015-10-01

    Resuscitation of emergency care patients requires fast restoration of blood pressure to a target value to achieve hemodynamic stability and vital organ perfusion. A robust control design methodology is presented in this paper for regulating the blood pressure of hypotensive patients by means of the closed-loop administration of vasoactive drugs. To this end, a dynamic first-order delay model is utilised to describe the vasoactive drug response with varying parameters that represent intra-patient and inter-patient variability. The proposed framework consists of two components: first, an online model parameter estimation is carried out using a multiple-model extended Kalman-filter. Second, the estimated model parameters are used for continuously scheduling a robust linear parameter-varying (LPV) controller. The closed-loop behaviour is characterised by parameter-varying dynamic weights designed to regulate the mean arterial pressure to a target value. Experimental data of blood pressure response of anesthetised pigs to phenylephrine injection are used for validating the LPV blood pressure models. Simulation studies are provided to validate the online model estimation and the LPV blood pressure control using phenylephrine drug injection models representing patients showing sensitive, nominal and insensitive response to the drug.
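
    A first-order delay (first-order-plus-dead-time) drug-response model of the kind described can be simulated in a few lines; the gain, time constant, and delay below are illustrative, not values from the paper.

      import numpy as np

      # T * dP/dt = -P + K * u(t - d): P = change in mean arterial pressure,
      # u = vasoactive drug infusion rate. Parameters are illustrative only.
      T, K, d, dt = 40.0, 15.0, 10.0, 0.5        # s, mmHg per unit dose, s, s
      t = np.arange(0.0, 300.0, dt)
      u = (t >= 30.0).astype(float)              # unit-step infusion at t = 30 s
      lag = int(d / dt)

      P = np.zeros_like(t)
      for k in range(1, len(t)):                 # forward-Euler integration
          u_delayed = u[k - 1 - lag] if k - 1 - lag >= 0 else 0.0
          P[k] = P[k - 1] + dt * (-P[k - 1] + K * u_delayed) / T
      print(P[-1])                               # approaches the steady-state gain K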

  9. Dynamic model including piping acoustics of a centrifugal compression system

    NASA Astrophysics Data System (ADS)

    van Helvoirt, Jan; de Jager, Bram

    2007-04-01

    This paper deals with low-frequency pulsation phenomena in full-scale centrifugal compression systems associated with compressor surge. The Greitzer lumped parameter model is applied to describe the dynamic behavior of an industrial compressor test rig and experimental evidence is provided for the presence of acoustic pulsations in the compression system under study. It is argued that these acoustic phenomena are common for full-scale compression systems where pipe system dynamics have a significant influence on the overall system behavior. The main objective of this paper is to extend the basic compressor model in order to include the relevant pipe system dynamics. For this purpose a pipeline model is proposed, based on previous developments for fluid transmission lines. The connection of this model to the lumped parameter model is accomplished via the selection of appropriate boundary conditions. Validation results will be presented, showing a good agreement between simulation and measurement data. The results indicate that the damping of piping transients depends on the nominal, time-varying pressure and flow velocity. Therefore, model parameters are made dependent on the momentary pressure and a switching nonlinearity is introduced into the model to vary the acoustic damping as a function of flow velocity. These modifications have limited success and the results indicate that a more sophisticated model is required to fully describe all (nonlinear) acoustic effects. However, the very good qualitative results show that the model adequately combines compressor and pipe system dynamics. Therefore, the proposed model forms a step forward in the analysis and modeling of surge in full-scale centrifugal compression systems and opens the path for further developments in this field.
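
    For reference, the basic (pre-acoustics) Greitzer model mentioned above is a two-state nondimensional system; the compressor and throttle characteristics and all coefficients below are illustrative, chosen only to exhibit a surge oscillation.

      import numpy as np
      from scipy.integrate import solve_ivp

      B = 1.8                                    # Greitzer stability parameter
      PSI0, H, W = 0.3, 0.18, 0.25               # cubic characteristic coefficients

      def psi_c(phi):                            # compressor characteristic
          x = phi / W - 1.0
          return PSI0 + H * (1.0 + 1.5 * x - 0.5 * x ** 3)

      def phi_t(psi):                            # throttle characteristic
          return 0.5 * np.sqrt(np.maximum(psi, 0.0))

      def rhs(t, state):
          phi, psi = state                       # flow and pressure-rise coefficients
          return [B * (psi_c(phi) - psi), (phi - phi_t(psi)) / B]

      sol = solve_ivp(rhs, (0.0, 100.0), [0.3, 0.6], max_step=0.05)
      print(sol.y[:, -1])                        # sustained surge cycle for large B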

  10. Practical input optimization for aircraft parameter estimation experiments. Ph.D. Thesis, 1990

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1993-01-01

    The object of this research was to develop an algorithm for the design of practical, optimal flight test inputs for aircraft parameter estimation experiments. A general, single pass technique was developed which allows global optimization of the flight test input design for parameter estimation using the principles of dynamic programming with the input forms limited to square waves only. Provision was made for practical constraints on the input, including amplitude constraints, control system dynamics, and selected input frequency range exclusions. In addition, the input design was accomplished while imposing output amplitude constraints required by model validity and considerations of safety during the flight test. The algorithm has multiple input design capability, with optional inclusion of a constraint allowing only one control to move at a time, so that a human pilot can implement the inputs. It is shown that the technique can be used to design experiments for estimation of open loop model parameters from closed loop flight test data. The report includes a new formulation of the optimal input design problem, a description of a new approach to the solution, and a summary of the characteristics of the algorithm, followed by three example applications of the new technique which demonstrate the quality and expanded capabilities of the input designs produced by the new technique. In all cases, the new input design approach showed significant improvement over previous input design methods in terms of achievable parameter accuracies.

  11. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  12. 40 CFR 60.4410 - How do I establish a valid parameter range if I have chosen to continuously monitor parameters?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... NEW STATIONARY SOURCES Standards of Performance for Stationary Combustion Turbines Performance Tests... of NOX emission controls in accordance with § 60.4340, the appropriate parameters must be...

  13. 40 CFR 60.4410 - How do I establish a valid parameter range if I have chosen to continuously monitor parameters?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... NEW STATIONARY SOURCES Standards of Performance for Stationary Combustion Turbines Performance Tests... of NOX emission controls in accordance with § 60.4340, the appropriate parameters must be...

  14. 40 CFR 60.4410 - How do I establish a valid parameter range if I have chosen to continuously monitor parameters?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... NEW STATIONARY SOURCES Standards of Performance for Stationary Combustion Turbines Performance Tests... of NOX emission controls in accordance with § 60.4340, the appropriate parameters must be...

  15. 40 CFR 60.4410 - How do I establish a valid parameter range if I have chosen to continuously monitor parameters?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... NEW STATIONARY SOURCES Standards of Performance for Stationary Combustion Turbines Performance Tests... of NOX emission controls in accordance with § 60.4340, the appropriate parameters must be...

  16. A flexible and qualitatively stable model for cell cycle dynamics including DNA damage effects.

    PubMed

    Jeffries, Clark D; Johnson, Charles R; Zhou, Tong; Simpson, Dennis A; Kaufmann, William K

    2012-01-01

    This paper includes a conceptual framework for cell cycle modeling into which the experimenter can map observed data and evaluate mechanisms of cell cycle control. The basic model exhibits qualitative stability, meaning that regardless of magnitudes of system parameters its instances are guaranteed to be stable in the sense that all feasible trajectories converge to a certain trajectory. Qualitative stability can also be described by the signs of real parts of eigenvalues of the system matrix. On the biological side, the resulting model can be tuned to approximate experimental data pertaining to human fibroblast cell lines treated with ionizing radiation, with or without disabled DNA damage checkpoints. Together these properties validate a fundamental, first order systems view of cell dynamics. Classification Codes: 15A68.
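
    The stability criterion mentioned above (negative real parts of all eigenvalues of the system matrix) is a one-line numerical check; the matrix below is an illustrative damped cycle, not the paper's cell cycle model.

      import numpy as np

      def is_stable(A):
          """dx/dt = A·x is asymptotically stable iff all eigenvalues of A
          have negative real parts."""
          return bool(np.all(np.linalg.eigvals(A).real < 0))

      A = np.array([[-1.0,  0.0,  0.5],    # three compartments in a cycle,
                    [ 0.5, -1.0,  0.0],    # each with a negative self-loop
                    [ 0.0,  0.5, -1.0]])
      print(is_stable(A))                  # True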

  17. Validation of Interannual Differences of AIRS Monthly Mean Parameters

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Iredell, Lena; Keita, Fricky; Molnar, Gyula

    2005-01-01

    Monthly mean fields of select geophysical parameters derived from analysis of AIRS/AMSU data, and their interannual differences, are shown and compared with analogous fields derived from other sources. All AIRS fields are derived using the AIRS Science Team Version 4 algorithm. Monthly mean results are shown for January 2004, as are interannual differences between January 2004 and January 2003. AIRS temperature and water vapor profile fields are compared with monthly mean collocated ECMWF 3 hour forecast and monthly mean TOVS Pathfinder Path A data. AIRS Tropospheric and Stratospheric coarse climate indicators are compared with analogous MSU products derived by Spencer and Christy and found in the TOVS Pathfinder Path A data set. Total ozone is compared with results produced by TOMS. OLR is compared with OLR derived using CERES data and found in the TOVS Pathfinder Path A data set. AIRS results agree well in all cases, especially in the interannual difference sense.

  18. Validating and comparing GNSS antenna calibrations

    NASA Astrophysics Data System (ADS)

    Kallio, Ulla; Koivula, Hannu; Lahtinen, Sonja; Nikkonen, Ville; Poutanen, Markku

    2018-03-01

    GNSS antennas have no fixed electrical reference point. The variation of the phase centre is modelled and tabulated in antenna calibration tables, which include the offset vector (PCO) and phase centre variation (PCV) for each frequency according to the elevations and azimuths of the incoming signal. Used together, PCV and PCO reduce the phase observations to the antenna reference point. The remaining biases, called the residual offsets, can be revealed by circulating and rotating the antennas on pillars. The residual offsets are estimated as additional parameters when combining the daily GNSS network solutions with full covariance matrix. We present a procedure for validating the antenna calibration tables. The dedicated test field, called Revolver, was constructed at Metsähovi. We used the procedure to validate the calibration tables of 17 antennas. Tables from the IGS and three different calibration institutions were used. The tests show that we were able to separate the residual offsets at the millimetre level. We also investigated the influence of the calibration tables from the different institutions on site coordinates by performing kinematic double-difference baseline processing of the data from one site with different antenna tables. We found small but significant differences between the tables.
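
    The reduction of a phase observation to the antenna reference point described above is the projection of the PCO onto the line of sight plus an interpolated PCV. The sketch below assumes an elevation-only PCV table and invented calibration values; sign conventions vary between formats.

      import numpy as np

      def phase_centre_correction(pco_enu, pcv_elev_deg, pcv_mm, elev_deg, az_deg):
          """Correction (m) for one signal direction: dot(PCO, line of sight)
          plus PCV interpolated from an elevation-dependent table."""
          el, az = np.radians(elev_deg), np.radians(az_deg)
          los = np.array([np.cos(el) * np.sin(az),    # east
                          np.cos(el) * np.cos(az),    # north
                          np.sin(el)])                # up
          pcv = np.interp(elev_deg, pcv_elev_deg, pcv_mm) * 1e-3
          return float(np.dot(pco_enu, los) + pcv)

      pco = np.array([0.0006, -0.0011, 0.0874])       # ENU offsets, metres (made up)
      elev_grid = np.arange(0, 91, 5)
      pcv_grid = 2.0 * np.sin(np.radians(elev_grid))  # mm, made-up pattern
      print(phase_centre_correction(pco, elev_grid, pcv_grid, elev_deg=35, az_deg=120))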

  19. Cosmological Parameters and Hyper-Parameters: The Hubble Constant from Boomerang and Maxima

    NASA Astrophysics Data System (ADS)

    Lahav, Ofer

    Recently several studies have jointly analysed data from different cosmological probes with the motivation of estimating cosmological parameters. Here we generalise this procedure to allow freedom in the relative weights of various probes. This is done by including in the joint likelihood function a set of `Hyper-Parameters', which are dealt with using Bayesian considerations. The resulting algorithm, which assumes uniform priors on the log of the Hyper-Parameters, is very simple to implement. We illustrate the method by estimating the Hubble constant H0 from different sets of recent CMB experiments (including Saskatoon, Python V, MSAM1, TOCO, Boomerang and Maxima). The approach can be generalised for a combination of cosmic probes, and for other priors on the Hyper-Parameters. Reference: Lahav, Bridle, Hobson, Lasenby & Sodre, 2000, MNRAS, in press (astro-ph/9912105)
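
    The cited reference shows that, with uniform priors on the log of the Hyper-Parameters, marginalising them out gives −2 ln P = Σ_j N_j ln χ²_j over the data sets. A toy scan over H0 with two invented chi-square curves:

      import numpy as np

      def joint_log_prob(chi2s, ns):
          """-2 ln P = sum_j N_j ln(chi2_j), up to an additive constant."""
          return -0.5 * sum(n * np.log(c) for c, n in zip(chi2s, ns))

      h0 = np.linspace(50, 90, 81)
      chi2_a = ((h0 - 68.0) / 4.0) ** 2 + 30.0   # hypothetical experiment A, N = 32
      chi2_b = ((h0 - 74.0) / 6.0) ** 2 + 12.0   # hypothetical experiment B, N = 14
      logp = [joint_log_prob((a, b), (32, 14)) for a, b in zip(chi2_a, chi2_b)]
      print("best H0 =", h0[int(np.argmax(logp))])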

  20. Twelve hour reproducibility of choroidal blood flow parameters in healthy subjects

    PubMed Central

    Polska, E; Polak, K; Luksch, A; Fuchsjager-Mayrl, G; Petternel, V; Findl, O; Schmetterer, L

    2004-01-01

    Aims/background: To investigate the reproducibility and potential diurnal variation of choroidal blood flow parameters in healthy subjects over a period of 12 hours. Methods: The choroidal blood flow parameters of 16 healthy non-smoking subjects were measured at five time points during the day (8:00, 11:00, 14:00, 17:00, and 20:00). Outcome parameters were pulsatile ocular blood flow as assessed by pneumotonometry, fundus pulsation amplitude as assessed by laser interferometry, blood velocities in the ophthalmic and posterior ciliary arteries as assessed by colour Doppler imaging, and choroidal blood flow, volume, and velocity as assessed by fundus camera based laser Doppler flowmetry. The coefficient of variation and the maximum change from baseline in an individual were calculated for each outcome parameter. Results: None of the techniques used found a diurnal variation in choroidal blood flow. Coefficients of variation ranged from 2.9% to 13.6% for all outcome parameters. The maximum change from baseline in an individual was much higher, ranging from 11.2% to 58.8%. Conclusions: These data indicate that in healthy subjects the selected techniques provide adequate reproducibility to be used in clinical studies. Variability may, however, be considerably higher in older subjects or subjects with ocular disease. The high individual differences in flow parameter readings limit the use of the techniques in clinical practice. To overcome problems with measurement validity, a clinical trial should include as many choroidal blood flow outcome parameters as possible to check for consistency. PMID:15031172

  1. A comment on the validity of fragmentation parameters measured in nuclear emulsions. [cosmic ray nuclei

    NASA Technical Reports Server (NTRS)

    Waddington, C. J.

    1978-01-01

    Evidence is reexamined that has been cited as suggesting serious errors in the use of fragmentation parameters appropriate to an airlike medium, deduced from measurements made in nuclear emulsions, to evaluate corrections for certain effects in balloon-borne observations of cosmic-ray nuclei. Fragmentation parameters for hydrogenlike interactions are calculated and shown to be in overall good agreement with those obtained previously for air. Experimentally measured fragmentation parameters in emulsion are compared with values computed semiempirically, and reasonable agreement is indicated.

  2. Gaia DR2 documentation Chapter 8: Astrophysical Parameters

    NASA Astrophysics Data System (ADS)

    Manteiga, M.; Andrae, R.; Fouesneau, M.; Creevey, O.; Ordenovic, C.; Mary, N.; Jean-Antoine-Piccolo, A.; Bailer-Jones, C. A. L.

    2018-04-01

    This chapter of the Gaia DR2 documentation describes Apsis, the Astrophysical Parameters Inference System used for processing Gaia DR2 data. Beyond this documentation, a complete description of the processing and the results, as well as additional validations, have been published in Andrae et al. (2018).

  3. Validation of Cloud Optical Parameters from Passive Remote Sensing in the Arctic by using the Aircraft Measurements

    NASA Astrophysics Data System (ADS)

    Chen, H.; Schmidt, S.; Coddington, O.; Wind, G.; Bucholtz, A.; Segal-Rosenhaimer, M.; LeBlanc, S. E.

    2017-12-01

    Cloud Optical Parameters (COPs: e.g., cloud optical thickness and cloud effective radius) and surface albedo are the most important inputs for determining the Cloud Radiative Effect (CRE) at the surface. In the Arctic, the COPs derived from passive remote sensing such as from the Moderate Resolution Imaging Spectroradiometer (MODIS) are difficult to obtain with adequate accuracy owing mainly to insufficient knowledge about the snow/ice surface, but also because of the low solar zenith angle. This study aims to validate COPs derived from passive remote sensing in the Arctic by using aircraft measurements collected during two field campaigns based in Fairbanks, Alaska. During both experiments, ARCTAS (Arctic Research of the Composition of the Troposphere from Aircraft and Satellites) and ARISE (Arctic Radiation-IceBridge Sea and Ice Experiment), the Solar Spectral Flux Radiometer (SSFR) measured upwelling and downwelling shortwave spectral irradiances, which can be used to derive surface and cloud albedo, as well as the irradiance transmitted by clouds. We assess the variability of the Arctic sea ice/snow surfaces albedo through these aircraft measurements and incorporate this variability into cloud retrievals for SSFR. We then compare COPs as derived from SSFR and MODIS for all suitable aircraft underpasses of the satellites. Finally, the sensitivities of the COPs to surface albedo and solar zenith angle are investigated.

  4. Rainfall-Runoff Parameter Uncertainty

    NASA Astrophysics Data System (ADS)

    Heidari, A.; Saghafian, B.; Maknoon, R.

    2003-04-01

    Karkheh river basin, located in the southwest of Iran, drains an area of over 40000 km2 and is considered a flood active basin. A flood forecasting system is under development for the basin, which consists of a rainfall-runoff model, a river routing model, a reservoir simulation model, and a real time data gathering and processing module. SCS, Clark synthetic unit hydrograph, and Modclark methods are the main subbasin rainfall-runoff transformation options included in the rainfall-runoff model. Infiltration schemes, such as exponential and SCS-CN methods, account for infiltration losses. Simulation of snow melt is based on a degree-day approach. River flood routing is performed by the FLDWAV model based on the one-dimensional full dynamic equation. Calibration and validation of the rainfall-runoff model on Karkheh subbasins are ongoing while the river routing model awaits cross section surveys. Real time hydrometeorological data are collected by a telemetry network. The telemetry network is equipped with automatic sensors and an INMARSAT-C communication system. A geographic information system (GIS) stores and manages the spatial data while a database holds the hydroclimatological historical and updated time series. Rainfall-runoff parameter uncertainty is analyzed by Monte Carlo and GLUE approaches.

  5. Validation of design procedure and performance modeling of a heat and fluid transport field experiment in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Nir, A.; Doughty, C.; Tsang, C. F.

    Validation methods that were developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no

  6. Development and Validation of a 3-Dimensional CFB Furnace Model

    NASA Astrophysics Data System (ADS)

    Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

    At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development of the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory and pilot scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of 3-dimensional process modeling, which provides a chain of knowledge that is utilized as feedback for phenomenon research. Knowledge gathered by model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to the modeling of combustion and the formation of char and volatiles for various fuel types in CFB conditions. Also a new model for predicting the formation of nitrogen oxides is presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window for characterization of fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operation conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace and are used, together with lateral temperature profiles at the bed and upper parts of the furnace, for determination of solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed by the lower furnace design and the mixing characteristics of fuel and combustion airs, which affect the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents

  7. Validation tools for image segmentation

    NASA Astrophysics Data System (ADS)

    Padfield, Dirk; Ross, James

    2009-02-01

    A large variety of image analysis tasks require the segmentation of various regions in an image. For example, segmentation is required to generate accurate models of brain pathology that are important components of modern diagnosis and therapy. While the manual delineation of such structures gives accurate information, the automatic segmentation of regions such as the brain and tumors from such images greatly enhances the speed and repeatability of quantifying such structures. The ubiquitous need for such algorithms has led to a wide range of image segmentation algorithms with various assumptions, parameters, and robustness. The evaluation of such algorithms is an important step in determining their effectiveness. Therefore, rather than developing new segmentation algorithms, we here describe validation methods for segmentation algorithms. Using similarity metrics comparing the automatic to manual segmentations, we demonstrate methods for optimizing the parameter settings for individual cases and across a collection of datasets using the Design of Experiment framework. We then employ statistical analysis methods to compare the effectiveness of various algorithms. We investigate several region-growing algorithms from the Insight Toolkit and compare their accuracy to that of a separate statistical segmentation algorithm. The segmentation algorithms are used with their optimized parameters to automatically segment the brain and tumor regions in MRI images of 10 patients. The validation tools indicate that none of the ITK algorithms studied are able to outperform with statistical significance the statistical segmentation algorithm although they perform reasonably well considering their simplicity.
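
    A minimal stand-in for the validation workflow described — a similarity metric plus a one-parameter grid search against a manual reference — is sketched below; Dice is used as the metric, and the image, masks, and threshold range are toy values.

      import numpy as np

      def dice(auto, manual):
          """Dice similarity 2|A ∩ M| / (|A| + |M|)."""
          auto, manual = np.asarray(auto, bool), np.asarray(manual, bool)
          denom = auto.sum() + manual.sum()
          return 2.0 * np.logical_and(auto, manual).sum() / denom if denom else 1.0

      def best_parameter(manual, segment, params):
          """Pick the parameter whose segmentation best matches the manual one."""
          return max(((p, dice(segment(p), manual)) for p in params),
                     key=lambda s: s[1])

      rng = np.random.default_rng(3)
      img = rng.normal(0, 0.1, (64, 64)); img[20:40, 20:40] += 1.0
      manual = np.zeros(img.shape, bool); manual[20:40, 20:40] = True
      print(best_parameter(manual, lambda t: img > t, np.linspace(0.2, 0.8, 13)))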

  8. Design and validation of diffusion MRI models of white matter

    NASA Astrophysics Data System (ADS)

    Jelescu, Ileana O.; Budde, Matthew D.

    2017-11-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open

  9. Design and validation of diffusion MRI models of white matter

    PubMed Central

    Jelescu, Ileana O.; Budde, Matthew D.

    2018-01-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open

  10. Estimation of spatial-temporal gait parameters using a low-cost ultrasonic motion analysis system.

    PubMed

    Qi, Yongbin; Soh, Cheong Boon; Gunawan, Erry; Low, Kay-Soon; Thomas, Rijil

    2014-08-20

    In this paper, a low-cost motion analysis system using a wireless ultrasonic sensor network is proposed and investigated. A methodology has been developed to extract spatial-temporal gait parameters including stride length, stride duration, stride velocity, stride cadence, and stride symmetry from 3D foot displacements estimated by the combination of a spherical positioning technique and an unscented Kalman filter. The performance of this system is validated against a camera-based system in the laboratory with 10 healthy volunteers. Numerical results show the feasibility of the proposed system, with an average error of 2.7% for all the estimated gait parameters. The influence of walking speed on the measurement accuracy of the proposed system is also evaluated. Statistical analysis demonstrates its capability of being used as a gait assessment tool for some medical applications.
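
    Given the 3D foot displacements the abstract describes, the spatial-temporal parameters follow directly from successive foot-contact events (event detection itself is omitted here); the straight-line walk below is synthetic.

      import numpy as np

      def stride_parameters(pos, t, events):
          """pos: (n, 3) foot positions; t: timestamps (s); events: indices of
          successive foot-flat instants. Returns per-stride length, duration,
          velocity, cadence, and a simple alternating symmetry ratio."""
          length = np.array([np.linalg.norm(pos[j, :2] - pos[i, :2])
                             for i, j in zip(events[:-1], events[1:])])
          duration = np.diff(t[events])
          symmetry = length[1::2].mean() / length[0::2].mean() if len(length) > 1 else 1.0
          return length, duration, length / duration, 60.0 / duration, symmetry

      t = np.arange(0, 5, 0.01)
      pos = np.column_stack([1.3 * t, np.zeros_like(t), np.zeros_like(t)])
      events = np.arange(0, len(t), 110)                   # ~1.1 s per stride
      print(stride_parameters(pos, t, events)[2].mean())   # ≈ 1.3 m/s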

  11. Validity and validation of expert (Q)SAR systems.

    PubMed

    Hulzebos, E; Sijm, D; Traas, T; Posthumus, R; Maslankiewicz, L

    2005-08-01

    At a recent workshop in Setubal (Portugal) principles were drafted to assess the suitability of (quantitative) structure-activity relationships ((Q)SARs) for assessing the hazards and risks of chemicals. In the present study we applied some of the Setubal principles to test the validity of three (Q)SAR expert systems and validate the results. These principles include a mechanistic basis, the availability of a training set and validation. ECOSAR, BIOWIN and DEREK for Windows have a mechanistic or empirical basis. ECOSAR has a training set for each QSAR. For half of the structural fragments the number of chemicals in the training set is >4. Based on structural fragments and log Kow, ECOSAR uses linear regression to predict ecotoxicity. Validating ECOSAR for three 'valid' classes results in predictivity of ≥64%. BIOWIN uses (non-)linear regressions to predict the probability of biodegradability based on fragments and molecular weight. It has a large training set and predicts non-ready biodegradability well. DEREK for Windows predictions are supported by a mechanistic rationale and literature references. The structural alerts in this program have been developed with a training set of positive and negative toxicity data. However, to support the prediction only a limited number of chemicals in the training set is presented to the user. DEREK for Windows predicts effects by 'if-then' reasoning. The program predicts best for mutagenicity and carcinogenicity. Each structural fragment in ECOSAR and DEREK for Windows needs to be evaluated and validated separately.

  12. Validation of Magnetospheric Magnetohydrodynamic Models

    NASA Astrophysics Data System (ADS)

    Curtis, Brian

    Magnetospheric magnetohydrodynamic (MHD) models are commonly used for both prediction and modeling of Earth's magnetosphere. To date, very little validation has been performed to determine their limits, uncertainties, and differences. In this work, we performed a comprehensive analysis, applying several validation techniques commonly used in the atmospheric sciences to MHD-based models of Earth's magnetosphere for the first time. The validation techniques of parameter variability/sensitivity analysis and comparison to other models were used on the OpenGGCM, BATS-R-US, and SWMF magnetospheric MHD models to answer several questions about how these models compare. The questions include: (1) the differences between the models' predictions prior to and following a reversal of Bz in the upstream interplanetary field (IMF) from positive to negative, (2) the influence of the preconditioning duration, and (3) the differences between models under extreme solar wind conditions. A differencing visualization tool was developed and used to address these three questions. We find: (1) For a reversal in IMF Bz from positive to negative, the OpenGGCM magnetopause is closest to Earth as it has the weakest magnetic pressure near-Earth. The differences in magnetopause positions between BATS-R-US and SWMF are explained by the influence of the ring current, which is included in SWMF. Densities are highest for SWMF and lowest for OpenGGCM. The OpenGGCM tail currents differ significantly from BATS-R-US and SWMF; (2) A longer preconditioning time allowed the magnetosphere to relax more, giving different positions for the magnetopause with all three models before the IMF Bz reversal. There were differences greater than 100% for all three models before the IMF Bz reversal. The differences in the current sheet region for the OpenGGCM were small after the IMF Bz reversal. The BATS-R-US and SWMF differences decreased after the IMF Bz reversal to near zero; (3) For extreme conditions in the solar

  13. Density functional theory calculations of 95Mo NMR parameters in solid-state compounds.

    PubMed

    Cuny, Jérôme; Furet, Eric; Gautier, Régis; Le Pollès, Laurent; Pickard, Chris J; d'Espinose de Lacaillerie, Jean-Baptiste

    2009-12-21

    The application of periodic density functional theory-based methods to the calculation of (95)Mo electric field gradient (EFG) and chemical shift (CS) tensors in solid-state molybdenum compounds is presented. Calculations of EFG tensors are performed using the projector augmented-wave (PAW) method. Comparison of the results with those obtained using the augmented plane wave + local orbitals (APW+lo) method and with available experimental values shows the reliability of the approach for (95)Mo EFG tensor calculation. CS tensors are calculated using the recently developed gauge-including projector augmented-wave (GIPAW) method. This work is the first application of the GIPAW method to a 4d transition-metal nucleus. The effects of ultra-soft pseudo-potential parameters, exchange-correlation functionals and structural parameters are precisely examined. Comparison with experimental results allows the validation of this computational formalism.

  14. Three-dimensional registration of intravascular optical coherence tomography and cryo-image volumes for microscopic-resolution validation.

    PubMed

    Prabhu, David; Mehanna, Emile; Gargesha, Madhusudhana; Brandt, Eric; Wen, Di; van Ditzhuijzen, Nienke S; Chamie, Daniel; Yamamoto, Hirosada; Fujino, Yusuke; Alian, Ali; Patel, Jaymin; Costa, Marco; Bezerra, Hiram G; Wilson, David L

    2016-04-01

    Evidence suggests high-resolution, high-contrast, [Formula: see text] intravascular optical coherence tomography (IVOCT) can distinguish plaque types, but further validation is needed, especially for automated plaque characterization. We developed experimental and three-dimensional (3-D) registration methods to provide validation of IVOCT pullback volumes using microscopic, color, and fluorescent cryo-image volumes with optional registered cryo-histology. A specialized registration method matched IVOCT pullback images acquired in the catheter reference frame to a true 3-D cryo-image volume. Briefly, an 11-parameter registration model including a polynomial virtual catheter was initialized within the cryo-image volume, and perpendicular images were extracted, mimicking IVOCT image acquisition. Virtual catheter parameters were optimized to maximize cryo and IVOCT lumen overlap. Multiple assessments suggested that the registration error was better than the [Formula: see text] spacing between IVOCT image frames. Tests on a digital synthetic phantom gave a registration error of only [Formula: see text] (signed distance). Visual assessment of randomly presented nearby frames suggested registration accuracy within 1 IVOCT frame interval ([Formula: see text]). This would eliminate potential misinterpretations encountered with typical histological approaches to validation, which carry estimated errors of 1 mm. The method can be used to create annotated datasets for developing automated plaque classification methods and can be extended to other intravascular imaging modalities.

  15. Predictive model for inflammation grades of chronic hepatitis B: Large-scale analysis of clinical parameters and gene expressions.

    PubMed

    Zhou, Weichen; Ma, Yanyun; Zhang, Jun; Hu, Jingyi; Zhang, Menghan; Wang, Yi; Li, Yi; Wu, Lijun; Pan, Yida; Zhang, Yitong; Zhang, Xiaonan; Zhang, Xinxin; Zhang, Zhanqing; Zhang, Jiming; Li, Hai; Lu, Lungen; Jin, Li; Wang, Jiucun; Yuan, Zhenghong; Liu, Jie

    2017-11-01

    Liver biopsy is the gold standard to assess pathological features (e.g., inflammation grades) for hepatitis B virus-infected patients although it is invasive and traumatic; meanwhile, several gene profiles of chronic hepatitis B (CHB) have been separately described in relatively small hepatitis B virus (HBV)-infected samples. We aimed to analyse correlations among inflammation grades, gene expressions and clinical parameters (serum alanine aminotransferase, aspartate aminotransferase and HBV-DNA) in large-scale CHB samples and to predict inflammation grades by using clinical parameters and/or gene expressions. We analysed gene expressions with three clinical parameters in 122 CHB samples by an improved regression model. Principal component analysis and machine-learning methods including Random Forest, K-nearest neighbour and support vector machine were used for analysis and further diagnosis models. Six normal samples were used to validate the predictive model. Significant genes related to clinical parameters were found to be enriched in immune system, interferon-stimulated, cytokine production regulation, and anti-apoptosis pathways, among others. A panel of these genes with clinical parameters can effectively predict binary classifications of inflammation grade (area under the ROC curve [AUC]: 0.88, 95% confidence interval [CI]: 0.77-0.93), validated by normal samples. A panel with only clinical parameters was also valuable (AUC: 0.78, 95% CI: 0.65-0.86), indicating that a liquid-biopsy method for detecting the pathology of CHB is possible. This is the first study to systematically elucidate the relationships among gene expressions, clinical parameters and pathological inflammation grades in CHB, and to build models predicting inflammation grades by gene expressions and/or clinical parameters as well. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
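
    The classification step described above can be illustrated with a small, self-contained sketch: a random forest trained on synthetic stand-ins for the three clinical parameters, evaluated by cross-validated AUC. The feature distributions and labels are invented for illustration and do not represent the study's cohort or its gene-expression panel.

      # Hedged sketch of a clinical-parameter classifier with AUC evaluation.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_predict
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      n = 122
      # Synthetic ALT, AST, log10 HBV-DNA values (illustrative only)
      X = np.c_[rng.lognormal(3.5, 0.6, n), rng.lognormal(3.4, 0.5, n),
                rng.normal(5.0, 1.5, n)]
      # Binary inflammation grade, loosely tied to the first feature
      y = (X[:, 0] + rng.normal(0, 15, n) > np.median(X[:, 0])).astype(int)

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
      print("cross-validated AUC:", round(roc_auc_score(y, proba), 3))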

  16. The effect of uphill and downhill walking on gait parameters: A self-paced treadmill study.

    PubMed

    Kimel-Naor, Shani; Gottlieb, Amihai; Plotnik, Meir

    2017-07-26

    It has been shown that gait parameters vary systematically with the slope of the surface when walking uphill (UH) or downhill (DH) (Andriacchi et al., 1977; Crowe et al., 1996; Kawamura et al., 1991; Kirtley et al., 1985; McIntosh et al., 2006; Sun et al., 1996). However, gait trials performed on inclined surfaces have been subject to certain technical limitations, including the use of fixed-speed treadmills (TMs) or, alternatively, sampling only a few gait cycles on inclined ramps. Further, prior work has not analyzed upper body kinematics. This study aims to investigate the effects of slope on gait parameters using a self-paced TM (SPTM), which facilitates more natural walking, including measurement of upper body kinematics and gait coordination parameters. Gait of 11 young healthy participants was sampled during walking at steady-state speed. Measurements were made at slopes of +10°, 0° and -10°. Force plates and a motion capture system were used to reconstruct twenty spatiotemporal gait parameters. For validation, previously described parameters were compared with the literature, and novel parameters measuring upper body kinematics and bilateral gait coordination were also analyzed. Results showed that most lower and upper body gait parameters were affected by walking slope angle. Specifically, UH walking had a higher impact on gait kinematics than DH walking. However, gait coordination parameters were not affected by walking slope, suggesting that gait asymmetry, left-right coordination and gait variability are robust characteristics of walking. The findings of the study are discussed in reference to a potential combined effect of slope and gait speed. Follow-up studies are needed to explore the relative effects of each of these factors. Copyright © 2017. Published by Elsevier Ltd.

  17. Measuring the statistical validity of summary meta-analysis and meta-regression results for use in clinical practice.

    PubMed

    Willis, Brian H; Riley, Richard D

    2017-09-20

    An important question for clinicians appraising a meta-analysis is: are the findings likely to be valid in their own practice: does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity-where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple ('leave-one-out') cross-validation technique, we demonstrate how we may test meta-analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta-analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta-analysis and a tailored meta-regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within-study variance, between-study variance, study sample size, and the number of studies in the meta-analysis. Finally, we apply Vn to two published meta-analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta-analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
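
    The leave-one-out idea underlying Vn can be sketched in a few lines: each study is held out in turn, the remaining studies are pooled, and the held-out effect is compared with the pooled prediction. This toy version uses a fixed-effect inverse-variance pool with invented numbers; the paper's Vn statistic and its null distribution are derived formally and are not reproduced here.

      # Illustrative leave-one-out consistency check for a meta-analysis.
      import numpy as np

      y = np.array([0.30, 0.25, 0.42, 0.10, 0.35])  # study effect estimates
      v = np.array([0.02, 0.03, 0.04, 0.02, 0.05])  # within-study variances

      for i in range(y.size):
          keep = np.arange(y.size) != i
          w = 1.0 / v[keep]
          pooled = np.sum(w * y[keep]) / np.sum(w)
          var_pooled = 1.0 / np.sum(w)
          # Standardized discrepancy between held-out study and prediction
          z = (y[i] - pooled) / np.sqrt(v[i] + var_pooled)
          print(f"study {i}: z = {z:+.2f}")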

  18. AMSR Validation Program

    NASA Astrophysics Data System (ADS)

    Lobl, E. S.

    2003-12-01

    AMSR and AMSR-E are passive microwave radiometers built by NASDA in Japan. AMSR flies on ADEOS II, launched December 14, 2002, and AMSR-E flies on NASA's Aqua satellite, launched May 4, 2002. The Science teams in both countries have developed algorithms to retrieve different atmospheric parameters from the data obtained by these radiometers. The US Science team has developed a Validation plan that involved several campaigns. In fact, most of these campaigns have taken place this year: 2003, nicknamed the "Golden Year" for AMSR Validation. The first campaign started in January 2003 with the Extra-tropical precipitation campaign, followed by IOP3 for Cold Lands Processes Experiment (CLPX) in Colorado. After the change-out of some of the instruments, the Validation program continued with the Arctic Sea Ice campaign based in Alaska, followed by CLPX IOP 4, back in Colorado. Soil Moisture EXperiment 03 (SMEX03) started in late June in Alabama and Georgia, and then completed in Oklahoma mid-July. The last campaign in this series is AMSR Antarctic Sea Ice (AASI)/SMEX in Brazil. The major goals of each campaign and very preliminary data will be shown. Most of these campaigns were in collaboration with the Japanese AMSR scientists.

  19. Validity and reliability of four language mapping paradigms.

    PubMed

    Wilson, Stephen M; Bautista, Alexa; Yen, Melodie; Lauderdale, Stefanie; Eriksson, Dana K

    2017-01-01

    Language areas of the brain can be mapped in individual participants with functional MRI. We investigated the validity and reliability of four language mapping paradigms that may be appropriate for individuals with acquired aphasia: sentence completion, picture naming, naturalistic comprehension, and narrative comprehension. Five neurologically normal older adults were scanned on each of the four paradigms on four separate occasions. Validity was assessed in terms of whether activation patterns reflected the known typical organization of language regions, that is, lateralization to the left hemisphere, and involvement of the left inferior frontal gyrus and the left middle and/or superior temporal gyri. Reliability (test-retest reproducibility) was quantified in terms of the Dice coefficient of similarity, which measures overlap of activations across time points. We explored the impact of different absolute and relative voxelwise thresholds, a range of cluster size cutoffs, and limitation of analyses to a priori potential language regions. We found that the narrative comprehension and sentence completion paradigms offered the best balance of validity and reliability. However, even with optimal combinations of analysis parameters, there were many scans on which known features of typical language organization were not demonstrated, and test-retest reproducibility was only moderate for realistic parameter choices. These limitations in terms of validity and reliability may constitute significant limitations for many clinical or research applications that depend on identifying language regions in individual participants.
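
    The Dice coefficient used here to quantify test-retest reproducibility has a compact definition: twice the overlap of two binary maps divided by the sum of their sizes. A minimal sketch, with random maps standing in for thresholded activation volumes:

      # Dice coefficient of two binary (thresholded) activation maps.
      import numpy as np

      def dice(a, b):
          a, b = a.astype(bool), b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      scan1 = np.random.default_rng(1).random((64, 64, 40)) > 0.8
      scan2 = np.random.default_rng(2).random((64, 64, 40)) > 0.8
      print("Dice:", round(dice(scan1, scan2), 3))  # near 0.2 for random maps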

  20. Proline puckering parameters for collagen structure simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Di, E-mail: diwu@fudan.edu.cn

    Collagen is made of triple helices rich in proline residues, and hence is influenced by the conformational motions of prolines. Because the backbone motions of prolines are restricted by the helical structures, the only side chain motion—proline puckering—becomes an influential factor that may affect the stability of collagen structures. In molecular simulations, a proper proline puckering population is desired so as to yield valid results for the collagen properties. Here we design the proline puckering parameters in order to yield suitable proline puckering populations as demonstrated in the experimental results. We test these parameters in collagen and proline dipeptide simulations. Compared with the results of the PDB and the quantum calculations, we propose the proline puckering parameters for the selected collagen model simulations.

  1. Measuring the statistical validity of summary meta‐analysis and meta‐regression results for use in clinical practice

    PubMed Central

    Riley, Richard D.

    2017-01-01

    An important question for clinicians appraising a meta‐analysis is: are the findings likely to be valid in their own practice—does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity—where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple (‘leave‐one‐out’) cross‐validation technique, we demonstrate how we may test meta‐analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta‐analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta‐analysis and a tailored meta‐regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within‐study variance, between‐study variance, study sample size, and the number of studies in the meta‐analysis. Finally, we apply Vn to two published meta‐analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta‐analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28620945

  2. Cross-validation to select Bayesian hierarchical models in phylogenetics.

    PubMed

    Duchêne, Sebastián; Duchêne, David A; Di Giallonardo, Francesca; Eden, John-Sebastian; Geoghegan, Jemma L; Holt, Kathryn E; Ho, Simon Y W; Holmes, Edward C

    2016-05-26

    Recent developments in Bayesian phylogenetic models have increased the range of inferences that can be drawn from molecular sequence data. Accordingly, model selection has become an important component of phylogenetic analysis. Methods of model selection generally consider the likelihood of the data under the model in question. In the context of Bayesian phylogenetics, the most common approach involves estimating the marginal likelihood, which is typically done by integrating the likelihood across model parameters, weighted by the prior. Although this method is accurate, it is sensitive to the presence of improper priors. We explored an alternative approach based on cross-validation that is widely used in evolutionary analysis. This involves comparing models according to their predictive performance. We analysed simulated data and a range of viral and bacterial data sets using a cross-validation approach to compare a variety of molecular clock and demographic models. Our results show that cross-validation can be effective in distinguishing between strict- and relaxed-clock models and in identifying demographic models that allow growth in population size over time. In most of our empirical data analyses, the model selected using cross-validation was able to match that selected using marginal-likelihood estimation. The accuracy of cross-validation appears to improve with longer sequence data, particularly when distinguishing between relaxed-clock models. Cross-validation is a useful method for Bayesian phylogenetic model selection. This method can be readily implemented even when considering complex models where selecting an appropriate prior for all parameters may be difficult.

  3. The Association between Parameters of Malnutrition and Diagnostic Measures of Sarcopenia in Geriatric Outpatients

    PubMed Central

    Reijnierse, Esmee M.; Trappenburg, Marijke C.; Leter, Morena J.; Blauw, Gerard Jan; de van der Schueren, Marian A. E.; Meskers, Carel G. M.; Maier, Andrea B.

    2015-01-01

    Objectives: Diagnostic criteria for sarcopenia include measures of muscle mass, muscle strength and physical performance. Consensus on the definition of sarcopenia has not been reached yet. To improve insight into the most clinically valid definition of sarcopenia, this study aimed to compare the association between parameters of malnutrition, as a risk factor in sarcopenia, and diagnostic measures of sarcopenia in geriatric outpatients. Material and Methods: This study is based on data from a cross-sectional study conducted in a geriatric outpatient clinic including 185 geriatric outpatients (mean age 82 years). Parameters of malnutrition included risk of malnutrition (assessed by the Short Nutritional Assessment Questionnaire), loss of appetite, unintentional weight loss and underweight (body mass index <22 kg/m2). Diagnostic measures of sarcopenia included relative muscle mass (lean mass and appendicular lean mass [ALM] as percentages), absolute muscle mass (total lean mass and ALM/height2), handgrip strength and walking speed. All diagnostic measures of sarcopenia were standardized. Associations between parameters of malnutrition (independent variables) and diagnostic measures of sarcopenia (dependent variables) were analysed using multivariate linear regression models adjusted for age, body mass, fat mass and height in separate models. Results: None of the parameters of malnutrition was consistently associated with diagnostic measures of sarcopenia. The strongest associations were found for both relative and absolute muscle mass; less stronger associations were found for muscle strength and physical performance. Underweight (p = <0.001) and unintentional weight loss (p = 0.031) were most strongly associated with higher lean mass percentage after adjusting for age. Loss of appetite (p = 0.003) and underweight (p = 0.021) were most strongly associated with lower total lean mass after adjusting for age and fat mass. Conclusion: Parameters of malnutrition relate

  4. Assessment Methodology for Process Validation Lifecycle Stage 3A.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana

    2017-07-01

    The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small molecule as well as bio-pharma products, where substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: determination of the number of Stage 3A batches; evaluation of critical material attributes, critical process parameters, and critical quality attributes; in vivo-in vitro correlation; estimation of inherent process variability (IPV) and the PaCS index; process capability and quality dashboard (PCQd); and enhanced control strategy. The US FDA guidance on Process Validation: General Principles and Practices (January 2011) encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. Elements of 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.

  5. Initial Retrieval Validation from the Joint Airborne IASI Validation Experiment (JAIVEx)

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Liu, Xu; Smith, William L.; Larar, Allen M.; Taylor, Jonathan P.; Revercomb, Henry E.; Mango, Stephen A.; Schluessel, Peter; Calbet, Xavier

    2007-01-01

    The Joint Airborne IASI Validation Experiment (JAIVEx) was conducted during April 2007 mainly for validation of the Infrared Atmospheric Sounding Interferometer (IASI) on the MetOp satellite, but also included a strong component focusing on validation of the Atmospheric InfraRed Sounder (AIRS) aboard the AQUA satellite. The cross validation of IASI and AIRS is important for the joint use of their data in the global Numerical Weather Prediction process. Initial inter-comparisons of geophysical products have been conducted from different aspects, such as using different measurements from airborne ultraspectral Fourier transform spectrometers (specifically, the NPOESS Airborne Sounder Testbed Interferometer (NAST-I) and the Scanning-High resolution Interferometer Sounder (S-HIS) aboard the NASA WB-57 aircraft), UK Facility for Airborne Atmospheric Measurements (FAAM) BAe146-301 aircraft in situ instruments, dedicated dropsondes, radiosondes, and ground-based Raman lidar. An overview of the JAIVEx retrieval validation plan and some initial results of this field campaign are presented.

  6. Parameter extraction of coupling-of-modes equations including coupling between two surface acoustic waves on SiO2/Cu/LiNbO3 structures

    NASA Astrophysics Data System (ADS)

    Huang, Yulin; Bao, Jingfu; Li, Xinyi; Zhang, Benfeng; Omori, Tatsuya; Hashimoto, Ken-ya

    2018-07-01

    This paper describes the extraction of parameters for an extended coupling-of-modes (COM) model including coupling between Rayleigh and shear-horizontal (SH) surface acoustic waves (SAW) on the SiO2-overlay/Cu-grating/LiNbO3-substrate structure. First, the dispersion characteristics of the two SAWs are calculated by the finite element method (FEM) and are fitted with those given by the extended COM. Then the variation of the COM parameters is expressed as polynomials in terms of the SiO2 and Cu thicknesses and the rotation angle Θ of LiNbO3. It is then shown how the optimal Θ giving SH SAW suppression changes with the thicknesses. The result agrees well with that obtained directly by FEM. It is also shown that the optimal Θ changes abruptly at a certain Cu thickness, which is due to decoupling between the two SAW modes.
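
    The polynomial-expression step can be pictured with a least-squares fit of a mock COM parameter over the two film thicknesses. The quadratic form, parameter ranges, and values below are invented placeholders, not the paper's extracted coefficients or its FEM data.

      # Fit a COM parameter as a quadratic polynomial in two thicknesses.
      import numpy as np

      rng = np.random.default_rng(0)
      h_sio2 = rng.uniform(0.20, 0.35, 50)  # normalized SiO2 thickness (mock)
      h_cu = rng.uniform(0.02, 0.10, 50)    # normalized Cu thickness (mock)
      kappa = (0.5 + 1.2 * h_sio2 - 3.0 * h_cu + 0.8 * h_sio2 * h_cu
               + rng.normal(0, 0.005, 50))  # mock coupling coefficient

      A = np.c_[np.ones_like(h_sio2), h_sio2, h_cu,
                h_sio2**2, h_cu**2, h_sio2 * h_cu]
      coef, *_ = np.linalg.lstsq(A, kappa, rcond=None)
      print("fitted polynomial coefficients:", np.round(coef, 3))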

  7. NPOESS Preparatory Project Validation Program for the Cross-track Infrared Sounder

    NASA Astrophysics Data System (ADS)

    Barnet, C.; Gu, D.; Nalli, N. R.

    2009-12-01

    The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Program, in partnership with the National Aeronautics and Space Administration (NASA), will launch the NPOESS Preparatory Project (NPP), a risk reduction and data continuity mission, prior to the first operational NPOESS launch. The NPOESS Program, in partnership with Northrop Grumman Aerospace Systems, will execute the NPP Calibration and Validation (Cal/Val) program to ensure the data products comply with the requirements of the sponsoring agencies. The Cross-track Infrared Sounder (CrIS) and the Advanced Technology Microwave Sounder (ATMS) are two of the instruments that make up the suite of sensors on NPP. Together, CrIS and ATMS will produce three Environmental Data Records (EDRs) including the Atmospheric Vertical Temperature Profile (AVTP), Atmospheric Vertical Moisture Profile (AVMP), and the Atmospheric Vertical Pressure Profile (AVPP). The AVTP and the AVMP are both NPOESS Key Performance Parameters (KPPs). The validation plans establish science and user community leadership and participation, as well as demonstrated, cost-effective Cal/Val approaches. This presentation will provide an overview of the collaborative data, techniques, and schedule for the validation of the NPP CrIS and ATMS environmental data products.

  8. Verification and Validation of Residual Stresses in Bi-Material Composite Rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Stacy Michelle; Hanson, Alexander Anthony; Briggs, Timothy

    Process-induced residual stresses commonly occur in composite structures composed of dissimilar materials. These residual stresses form due to differences in the composite materials’ coefficients of thermal expansion and the shrinkage upon cure exhibited by polymer matrix materials. Depending upon the specific geometric details of the composite structure and the materials’ curing parameters, it is possible that these residual stresses could result in interlaminar delamination or fracture within the composite. Therefore, the consideration of potential residual stresses is important when designing composite parts and their manufacturing processes. However, the experimental determination of residual stresses in prototype parts can be time and cost prohibitive. As an alternative to physical measurement, it is possible for computational tools to be used to quantify potential residual stresses in composite prototype parts. Therefore, the objectives of the presented work are to demonstrate a simplistic method for simulating residual stresses in composite parts, as well as the potential value of sensitivity and uncertainty quantification techniques during analyses for which material property parameters are unknown. Specifically, a simplified residual stress modeling approach, which accounts for coefficient of thermal expansion mismatch and polymer shrinkage, is implemented within the Sandia National Laboratories’ developed SIERRA/SolidMechanics code. Concurrent with the model development, two simple, bi-material structures composed of a carbon fiber/epoxy composite and aluminum, a flat plate and a cylinder, are fabricated and the residual stresses are quantified through the measurement of deformation. Then, in the process of validating the developed modeling approach with the experimental residual stress data, manufacturing process simulations of the two simple structures are developed and undergo a formal verification and validation process, including a mesh

  9. Evaluation of the Validated Soil Moisture Product from the SMAP Radiometer

    NASA Technical Reports Server (NTRS)

    O'Neill, P.; Chan, S.; Colliander, A.; Dunbar, S.; Njoku, E.; Bindlish, R.; Chen, F.; Jackson, T.; Burgin, M.; Piepmeier, J.; hide

    2016-01-01

    NASA's Soil Moisture Active Passive (SMAP) mission launched on January 31, 2015 into a sun-synchronous 6 am/6 pm orbit with an objective to produce global mapping of high-resolution soil moisture and freeze-thaw state every 2-3 days using an L-band (active) radar and an L-band (passive) radiometer. The SMAP radiometer began acquiring routine science data on March 31, 2015 and continues to operate nominally. SMAP's radiometer-derived soil moisture product (L2_SM_P) provides soil moisture estimates posted on a 36 km fixed Earth grid using brightness temperature observations from descending (6 am) passes and ancillary data. A beta quality version of L2_SM_P was released to the public in September, 2015, with the fully validated L2_SM_P soil moisture data expected to be released in May, 2016. Additional improvements (including optimization of retrieval algorithm parameters and upscaling approaches) and methodology expansions (including increasing the number of core sites, model-based intercomparisons, and results from several intensive field campaigns) are anticipated in moving from accuracy assessment of the beta quality data to an evaluation of the fully validated L2_SM_P data product.

  10. The shape parameter and its modification for defining coastal profiles

    NASA Astrophysics Data System (ADS)

    Türker, Umut; Kabdaşli, M. Sedat

    2009-03-01

    The shape parameter is important for the theoretical description of sandy coastal profiles. This parameter has previously been defined as a function of the sediment-settling velocity. However, the settling velocity cannot be characterized over a wide range of sediment grains, which in turn limits the calculation of the shape parameter over a wide range. This paper provides a simpler and faster analytical equation to describe the shape parameter. The validity of the equation has been tested and compared with the previously estimated values given in both graphical and tabular forms. The results of this study indicate that the analytical solution of the shape parameter is more usable than the graphical solutions, predicting better results both in the surf zone and offshore.

  11. GNSS-Based Space Weather Systems Including COSMIC Ionospheric Measurements

    NASA Technical Reports Server (NTRS)

    Komjathy, Attila; Mandrake, Lukas; Wilson, Brian; Iijima, Byron; Pi, Xiaoqing; Hajj, George; Mannucci, Anthony J.

    2006-01-01

    The presentation outline includes University Corporation for Atmospheric Research (UCAR) and Jet Propulsion Laboratory (JPL) product comparisons, assimilation of ground-based Global Positioning System (GPS) and COSMIC data into the JPL/University of Southern California (USC) Global Assimilative Ionospheric Model (GAIM), and JPL/USC GAIM validation. The discussion of comparisons examines Abel profiles and calibrated TEC. The JPL/USC GAIM validation uses Arecibo ISR, Jason-2 VTEC, and Abel profiles.

  12. Systematic parameter estimation and sensitivity analysis using a multidimensional PEMFC model coupled with DAKOTA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Chao Yang; Luo, Gang; Jiang, Fangming

    2010-05-01

    Current computational models for proton exchange membrane fuel cells (PEMFCs) include a large number of parameters such as boundary conditions, material properties, and numerous parameters used in sub-models for membrane transport, two-phase flow and electrochemistry. In order to successfully use a computational PEMFC model in design and optimization, it is important to identify critical parameters under a wide variety of operating conditions, such as relative humidity, current load, temperature, etc. Moreover, when experimental data is available in the form of polarization curves or local distribution of current and reactant/product species (e.g., O2, H2O concentrations), critical parameters can be estimated in order to enable the model to better fit the data. Sensitivity analysis and parameter estimation are typically performed using manual adjustment of parameters, which is also common in parameter studies. We present work to demonstrate a systematic approach based on using a widely available toolkit developed at Sandia called DAKOTA that supports many kinds of design studies, such as sensitivity analysis as well as optimization and uncertainty quantification. In the present work, we couple a multidimensional PEMFC model (which is being developed, tested and later validated in a joint effort by a team from Penn State Univ. and Sandia National Laboratories) with DAKOTA through the mapping of model parameters to system responses. Using this interface, we demonstrate the efficiency of performing simple parameter studies as well as identifying critical parameters using sensitivity analysis. Finally, we show examples of optimization and parameter estimation using the automated capability in DAKOTA.

  13. Validation study and routine control monitoring of moist heat sterilization procedures.

    PubMed

    Shintani, Hideharu

    2012-06-01

    The proposed approach to validation of steam sterilization in autoclaves follows the basic life cycle concepts applicable to all validation programs. Understand the function of the sterilization process, develop and understand the cycles to carry out the process, and define a suitable test or series of tests to confirm that the function of the process is suitably ensured by the structure provided. Sterilization of product and of components and parts that come in direct contact with sterilized product is the most critical of pharmaceutical processes. Consequently, this process requires a most rigorous and detailed approach to validation. An understanding of the process requires a basic understanding of microbial death, the parameters that facilitate that death, the accepted definition of sterility, and the relationship between the definition and sterilization parameters. Autoclaves and support systems need to be designed, installed, and qualified in a manner that ensures their continued reliability. Lastly, the test program must be complete and definitive. In this paper, in addition to the validation study, the documentation of IQ, OQ, and PQ is described in concrete terms.

  14. Estimating Finite Rate of Population Increase for Sharks Based on Vital Parameters

    PubMed Central

    Liu, Kwang-Ming; Chin, Chien-Pang; Chen, Chun-Hui; Chang, Jui-Han

    2015-01-01

    The vital parameter data for 62 stocks, covering 38 species, collected from the literature, including parameters of age, growth, and reproduction, were log-transformed and analyzed using multivariate analyses. Three groups were identified and empirical equations were developed for each to describe the relationships between the predicted finite rates of population increase (λ’) and the vital parameters, maximum age (Tmax), age at maturity (Tm), annual fecundity (f/Rc), size at birth (Lb), size at maturity (Lm), and asymptotic length (L∞). Group (1) included species with slow growth rates (0.034 yr⁻¹ < k < 0.103 yr⁻¹) and extended longevity (26 yr < Tmax < 81 yr), e.g., shortfin mako Isurus oxyrinchus, dusky shark Carcharhinus obscurus, etc.; Group (2) included species with fast growth rates (0.103 yr⁻¹ < k < 0.358 yr⁻¹) and short longevity (9 yr < Tmax < 26 yr), e.g., starspotted smoothhound Mustelus manazo, gray smoothhound M. californicus, etc.; Group (3) included late maturing species (Lm/L∞ ≥ 0.75) with moderate longevity (Tmax < 29 yr), e.g., pelagic thresher Alopias pelagicus, sevengill shark Notorynchus cepedianus. The empirical equation for all data pooled was also developed. The λ’ values estimated by these empirical equations showed good agreement with those calculated using conventional demographic analysis. The predictability was further validated by an independent data set of three species. The empirical equations developed in this study not only reduce the uncertainties in estimation but also account for the difference in life history among groups. This method therefore provides an efficient and effective approach to the implementation of precautionary shark management measures. PMID:26576058
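
    The empirical-equation idea, regressing the log of the finite rate of increase on log-transformed vital parameters, can be sketched as follows. All values and coefficients are synthetic placeholders; the paper's group-specific equations are not reproduced.

      # Log-linear regression of lambda' on vital parameters (mock data).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 20
      Tmax = rng.uniform(9, 81, n)                 # maximum age (yr)
      Tm = 0.35 * Tmax * rng.uniform(0.8, 1.2, n)  # age at maturity (yr)
      fec = rng.uniform(2, 40, n)                  # annual fecundity

      # Mock relation: later maturity lowers lambda', fecundity raises it
      lam = np.exp(0.6 * np.log(fec) - 0.8 * np.log(Tm)) \
            * rng.lognormal(0, 0.05, n)

      X = np.c_[np.ones(n), np.log(Tmax), np.log(Tm), np.log(fec)]
      beta, *_ = np.linalg.lstsq(X, np.log(lam), rcond=None)
      print("log-linear coefficients:", np.round(beta, 3))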

  15. Influence of Powder Injection Parameters in High-Pressure Cold Spray

    NASA Astrophysics Data System (ADS)

    Ozdemir, Ozan C.; Widener, Christian A.

    2017-10-01

    High-pressure cold spray systems are becoming widely accepted for use in the structural repair of surface defects of expensive machinery parts used in industrial and military equipment. The deposition quality of cold spray repairs is typically validated using coupon testing and through destructive analysis of mock-ups or first articles for a defined set of parameters. In order to provide a reliable repair, it is important to not only maintain the same processing parameters, but also to have optimum fixed parameters, such as the particle injection location. This study is intended to provide insight into how sensitive high-pressure cold spray deposition is to the way powder is injected upstream of the supersonic nozzle, and into the effects of variations in injection parameters on powder particle kinetics. Experimentally validated three-dimensional computational fluid dynamics (3D CFD) models are implemented to study the effects of varying powder feeder tube size, powder feeder tube axial misalignment, and radial powder feeder injection location on particle impact conditions, particle velocity, and the deposition shape of aluminum alloy 6061. Outputs of the models are statistically analyzed to explore the shape of the spray plume distribution and the resulting coating buildup.

  16. A practical iterative PID tuning method for mechanical systems using parameter chart

    NASA Astrophysics Data System (ADS)

    Kang, M.; Cheong, J.; Do, H. M.; Son, Y.; Niculescu, S.-I.

    2017-10-01

    In this paper, we propose a method of iterative proportional-integral-derivative parameter tuning for mechanical systems that possibly possess hidden mechanical resonances, using a parameter chart which visualises the closed-loop characteristics in a 2D parameter space. We employ a hypothetical assumption that the considered mechanical systems have their upper limit of the derivative feedback gain, from which the feasible region in the parameter chart becomes fairly reduced and thus the gain selection can be extremely simplified. Then, a two-directional parameter search is carried out within the feasible region in order to find the best set of parameters. Experimental results show the validity of the assumption used and the proposed parameter tuning method.
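
    The parameter-chart idea can be illustrated by scanning a (Kp, Kd) grid for a rigid-body plant 1/(J s^2) under PD control and marking the gains whose closed-loop poles meet a damping specification, with a ceiling on the derivative gain standing in for the paper's hypothesized upper limit. The plant, limits, and damping band below are all assumptions for illustration, not the authors' criteria.

      # Toy parameter chart: feasible (Kp, Kd) region for PD control of
      # a rigid-body plant, with an assumed derivative-gain ceiling.
      import numpy as np

      J = 0.01            # inertia (mock)
      KD_MAX = 2.0        # assumed derivative-gain upper limit
      kp_grid = np.linspace(1, 400, 80)
      kd_grid = np.linspace(0.05, 4.0, 80)

      feasible = np.zeros((kd_grid.size, kp_grid.size), dtype=bool)
      for i, kd in enumerate(kd_grid):
          for j, kp in enumerate(kp_grid):
              poles = np.roots([J, kd, kp])   # J s^2 + Kd s + Kp = 0
              wn = np.abs(poles[0])
              zeta = -poles[0].real / wn      # damping ratio
              feasible[i, j] = kd <= KD_MAX and 0.5 <= zeta <= 0.9
      print("feasible fraction of chart:", feasible.mean())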

  17. Automated ensemble assembly and validation of microbial genomes.

    PubMed

    Koren, Sergey; Treangen, Todd J; Hill, Christopher M; Pop, Mihai; Phillippy, Adam M

    2014-05-03

    The continued democratization of DNA sequencing has sparked a new wave of development of genome assembly and assembly validation methods. As individual research labs, rather than centralized centers, begin to sequence the majority of new genomes, it is important to establish best practices for genome assembly. However, recent evaluations such as GAGE and the Assemblathon have concluded that there is no single best approach to genome assembly. Instead, it is preferable to generate multiple assemblies and validate them to determine which is most useful for the desired analysis; this is a labor-intensive process that is often impossible or infeasible. To encourage best practices supported by the community, we present iMetAMOS, an automated ensemble assembly pipeline; iMetAMOS encapsulates the process of running, validating, and selecting a single assembly from multiple assemblies. iMetAMOS packages several leading open-source tools into a single binary that automates parameter selection and execution of multiple assemblers, scores the resulting assemblies based on multiple validation metrics, and annotates the assemblies for genes and contaminants. We demonstrate the utility of the ensemble process on 225 previously unassembled Mycobacterium tuberculosis genomes as well as a Rhodobacter sphaeroides benchmark dataset. On these real data, iMetAMOS reliably produces validated assemblies and identifies potential contamination without user intervention. In addition, intelligent parameter selection produces assemblies of R. sphaeroides comparable to or exceeding the quality of those from the GAGE-B evaluation, affecting the relative ranking of some assemblers. Ensemble assembly with iMetAMOS provides users with multiple, validated assemblies for each genome. Although computationally limited to small or mid-sized genomes, this approach is the most effective and reproducible means for generating high-quality assemblies and enables users to select an assembly best tailored to

  18. System and method for motor parameter estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luhrs, Bin; Yan, Ting

    2014-03-18

    A system and method for determining unknown values of certain motor parameters includes a motor input device connectable to an electric motor having associated therewith values for known motor parameters and an unknown value of at least one motor parameter. The motor input device includes a processing unit that receives a first input from the electric motor comprising values for the known motor parameters for the electric motor and receives a second input comprising motor data on a plurality of reference motors, including values for motor parameters corresponding to the known motor parameters of the electric motor and values for motor parameters corresponding to the at least one unknown motor parameter value of the electric motor. The processor determines the unknown value of the at least one motor parameter from the first input and the second input and determines a motor management strategy for the electric motor based thereon.
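
    A drastically simplified sketch of the idea, not the patented method itself: match the motor's known parameters against a table of reference motors and estimate the unknown parameter from the nearest entries. The reference values and the "rotor resistance" target are invented for illustration.

      # Nearest-neighbor estimate of an unknown motor parameter.
      import numpy as np

      # Columns: rated power (kW), rated current (A), efficiency,
      # and the unknown of interest, e.g. rotor resistance (ohm) -- mock.
      refs = np.array([[5.5, 11.0, 0.89, 0.95],
                       [7.5, 14.5, 0.90, 0.72],
                       [11.0, 21.0, 0.91, 0.48],
                       [15.0, 28.0, 0.92, 0.35]])
      known = np.array([9.0, 17.0, 0.905])  # our motor's known parameters

      mu, sd = refs[:, :3].mean(0), refs[:, :3].std(0)
      d = np.linalg.norm((refs[:, :3] - mu) / sd - (known - mu) / sd, axis=1)
      estimate = refs[np.argsort(d)[:2], 3].mean()  # average 2 nearest
      print("estimated rotor resistance:", round(estimate, 3), "ohm")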

  19. An approach to measure parameter sensitivity in watershed ...

    EPA Pesticide Factsheets

    Hydrologic responses vary spatially and temporally according to watershed characteristics. In this study, the hydrologic models that we developed earlier for the Little Miami River (LMR) and Las Vegas Wash (LVW) watersheds were used for detailed sensitivity analyses. To compare the relative sensitivities of the hydrologic parameters of these two models, we used the Normalized Root Mean Square Error (NRMSE). By combining the NRMSE index with flow duration curve analysis, we derived an approach to measure parameter sensitivities under different flow regimes. Results show that the parameters related to groundwater are highly sensitive in the LMR watershed, whereas the LVW watershed is primarily sensitive to near-surface and impervious parameters. The high and medium flows are more impacted by most of the parameters. The low-flow regime was highly sensitive to groundwater-related parameters. Moreover, our approach is found to be useful in facilitating model development and calibration. This journal article describes hydrological modeling of the effects of climate change and land use changes on stream hydrology and elucidates the importance of hydrological model construction in generating valid modeling results.
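
    The NRMSE-by-flow-regime approach can be sketched directly: sort flows into a flow duration curve, split it into regimes, and compute NRMSE per regime. The regime thresholds and data below are assumptions for illustration, not the study's calibrated models.

      # NRMSE computed separately over flow duration curve regimes.
      import numpy as np

      def nrmse(obs, sim):
          return np.sqrt(np.mean((sim - obs) ** 2)) / (obs.max() - obs.min())

      rng = np.random.default_rng(0)
      obs = np.sort(rng.lognormal(1.0, 0.8, 365))[::-1]  # daily flows, sorted
      sim = obs * rng.normal(1.0, 0.1, 365)              # perturbed model run

      n = obs.size  # assumed splits: top 20% high flow, bottom 30% low flow
      high, low = slice(0, int(0.2 * n)), slice(int(0.7 * n), n)
      print("NRMSE, high flows:", round(nrmse(obs[high], sim[high]), 3))
      print("NRMSE, low flows: ", round(nrmse(obs[low], sim[low]), 3))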

  20. Test validity and performance validity: considerations in providing a framework for development of an ability-focused neuropsychological test battery.

    PubMed

    Larrabee, Glenn J

    2014-11-01

    Literature on test validity and performance validity is reviewed to propose a framework for specification of an ability-focused battery (AFB). Factor analysis supports six domains of ability: (1) verbal symbolic; (2) visuoperceptual and visuospatial judgment and problem solving; (3) sensorimotor skills; (4) attention/working memory; (5) processing speed; and (6) learning and memory (which can be divided into verbal and visual subdomains). The AFB should include at least three measures for each of the six domains, selected based on various criteria for validity including sensitivity to presence of disorder, sensitivity to severity of disorder, correlation with important activities of daily living, and containing embedded/derived measures of performance validity. Criterion groups should include moderate and severe traumatic brain injury, and Alzheimer's disease. Validation groups should also include patients with left and right hemisphere stroke, to determine measures sensitive to lateralized cognitive impairment and so that the moderating effects of auditory comprehension impairment and neglect can be analyzed on AFB measures. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Validation of Bayesian analysis of compartmental kinetic models in medical imaging.

    PubMed

    Sitek, Arkadiusz; Li, Quanzheng; El Fakhri, Georges; Alpert, Nathaniel M

    2016-10-01

    Kinetic compartmental analysis is frequently used to compute physiologically relevant quantitative values from time series of images. In this paper, a new approach based on Bayesian analysis to obtain information about these parameters is presented and validated. The closed form of the posterior distribution of kinetic parameters is derived with a hierarchical prior to model the standard deviation of normally distributed noise. Markov chain Monte Carlo methods are used for numerical estimation of the posterior distribution. Computer simulations of the kinetics of F18-fluorodeoxyglucose (FDG) are used to demonstrate drawing statistical inferences about kinetic parameters and to validate the theory and implementation. Additionally, point estimates of kinetic parameters and covariance of those estimates are determined using the classical non-linear least squares approach. Posteriors obtained using the methods proposed in this work are accurate, as no significant deviation from the expected shape of the posterior was found (one-sided P>0.08). It is demonstrated that the results obtained by the standard non-linear least-squares methods fail to provide accurate estimation of uncertainty for the same data set (P<0.0001). The results of this work validate the new methods using computer simulations of FDG kinetics. Results show that in situations where the classical approach fails in accurate estimation of uncertainty, Bayesian estimation provides accurate information about the uncertainties in the parameters. Although a particular example of FDG kinetics was used in the paper, the methods can be extended for different pharmaceuticals and imaging modalities. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
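
    The flavor of the approach can be conveyed with a toy Metropolis sampler for a one-parameter washout model with an unknown noise scale; the paper's hierarchical prior, closed-form posterior, and FDG compartmental model are richer than this. Everything below (model, priors, proposal widths) is an illustrative assumption.

      # Toy Metropolis MCMC for the posterior of a washout rate k in
      # C(t) = exp(-k t) with Gaussian noise of unknown sigma.
      import numpy as np

      rng = np.random.default_rng(0)
      t = np.linspace(0, 60, 30)
      data = np.exp(-0.05 * t) + rng.normal(0, 0.02, t.size)

      def log_post(k, sigma):
          if k <= 0 or sigma <= 0:
              return -np.inf
          resid = data - np.exp(-k * t)
          # Flat prior on k; 1/sigma prior on the noise scale
          return -(t.size + 1) * np.log(sigma) - 0.5 * np.sum(resid**2) / sigma**2

      k, sig = 0.1, 0.05
      lp, samples = log_post(k, sig), []
      for _ in range(20000):
          k_p, sig_p = k + rng.normal(0, 0.005), sig + rng.normal(0, 0.005)
          lp_p = log_post(k_p, sig_p)
          if np.log(rng.random()) < lp_p - lp:
              k, sig, lp = k_p, sig_p, lp_p
          samples.append(k)
      post = np.array(samples[5000:])
      print(f"posterior mean k = {post.mean():.4f}, sd = {post.std():.4f}")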

  2. Sensitivity of the model error parameter specification in weak-constraint four-dimensional variational data assimilation

    NASA Astrophysics Data System (ADS)

    Shaw, Jeremy A.; Daescu, Dacian N.

    2017-08-01

    This article presents the mathematical framework to evaluate the sensitivity of a forecast error aspect to the input parameters of a weak-constraint four-dimensional variational data assimilation system (w4D-Var DAS), extending the established theory from strong-constraint 4D-Var. Emphasis is placed on the derivation of the equations for evaluating the forecast sensitivity to parameters in the DAS representation of the model error statistics, including bias, standard deviation, and correlation structure. A novel adjoint-based procedure for adaptive tuning of the specified model error covariance matrix is introduced. Results from numerical convergence tests establish the validity of the model error sensitivity equations. Preliminary experiments providing a proof-of-concept are performed using the Lorenz multi-scale model to illustrate the theoretical concepts and potential benefits for practical applications.

  3. Matrix Extension and Multilaboratory Validation of Arsenic Speciation Method EAM §4.10 to Include Wine.

    PubMed

    Tanabe, Courtney K; Hopfer, Helene; Ebeler, Susan E; Nelson, Jenny; Conklin, Sean D; Kubachka, Kevin M; Wilson, Robert A

    2017-05-24

    A multilaboratory validation (MLV) was performed to extend the U.S. Food and Drug Administration's (FDA) analytical method Elemental Analysis Manual (EAM) §4.10, High Performance Liquid Chromatography-Inductively Coupled Plasma-Mass Spectrometric Determination of Four Arsenic Species in Fruit Juice, to include wine. Several method modifications were examined to optimize the method for the analysis of dimethylarsinic acid, monomethylarsonic acid, arsenate (AsV), and arsenite (AsIII) in various wine matrices with a range of ethanol concentrations by liquid chromatography-inductively coupled plasma-mass spectrometry. The optimized method was used for the analysis of five wines of different classifications (red, white, sparkling, rosé, and fortified) by three laboratories. Additionally, the samples were fortified in duplicate at levels of approximately 5, 10, and 30 μg kg⁻¹ and analyzed by each participating laboratory. The combined average fortification recoveries of dimethylarsinic acid, monomethylarsonic acid, and inorganic arsenic (iAs, the sum of AsV and AsIII) in these samples were 101, 100, and 100%, respectively. To further demonstrate the method, 46 additional wine samples were analyzed. The total As levels of all the wines analyzed in this study were between 1.0 and 38.2 μg kg⁻¹. The overall average mass balance based on the sum of the species recovered from the chromatographic separation compared to the total As measured was 89%, with a range of 51-135%. In the 51 analyzed samples, iAs accounted for an average of 91% of the sum of the species, with a range of 37-100%.

  4. Molybdenum disulfide and water interaction parameters

    NASA Astrophysics Data System (ADS)

    Heiranian, Mohammad; Wu, Yanbin; Aluru, Narayana R.

    2017-09-01

    Understanding the interaction between water and molybdenum disulfide (MoS2) is of crucial importance to investigate the physics of various applications involving MoS2 and water interfaces. An accurate force field is required to describe water and MoS2 interactions. In this work, water-MoS2 force field parameters are derived using the high-accuracy random phase approximation (RPA) method and validated by comparison with experiments. The parameters obtained from the RPA method result in water-MoS2 interface properties (solid-liquid work of adhesion) in good agreement with the experimental measurements. An accurate description of the MoS2-water interaction will facilitate the study of MoS2 in applications such as DNA sequencing, sea water desalination, and power generation.

  5. 21 CFR 1271.230 - Process validation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Process validation. 1271.230 Section 1271.230 Food..., AND CELLULAR AND TISSUE-BASED PRODUCTS Current Good Tissue Practice § 1271.230 Process validation. (a... validation activities and results must be documented, including the date and signature of the individual(s...

  6. 21 CFR 1271.230 - Process validation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Process validation. 1271.230 Section 1271.230 Food..., AND CELLULAR AND TISSUE-BASED PRODUCTS Current Good Tissue Practice § 1271.230 Process validation. (a... validation activities and results must be documented, including the date and signature of the individual(s...

  7. 21 CFR 1271.230 - Process validation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Process validation. 1271.230 Section 1271.230 Food..., AND CELLULAR AND TISSUE-BASED PRODUCTS Current Good Tissue Practice § 1271.230 Process validation. (a... validation activities and results must be documented, including the date and signature of the individual(s...

  8. 21 CFR 1271.230 - Process validation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Process validation. 1271.230 Section 1271.230 Food..., AND CELLULAR AND TISSUE-BASED PRODUCTS Current Good Tissue Practice § 1271.230 Process validation. (a... validation activities and results must be documented, including the date and signature of the individual(s...

  9. 21 CFR 1271.230 - Process validation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Process validation. 1271.230 Section 1271.230 Food..., AND CELLULAR AND TISSUE-BASED PRODUCTS Current Good Tissue Practice § 1271.230 Process validation. (a... validation activities and results must be documented, including the date and signature of the individual(s...

  10. Optimization of operating parameters in polysilicon chemical vapor deposition reactor with response surface methodology

    NASA Astrophysics Data System (ADS)

    An, Li-sha; Liu, Chun-jiao; Liu, Ying-wen

    2018-05-01

    In the polysilicon chemical vapor deposition reactor, the operating parameters interact in complex ways to affect the polysilicon output. Therefore, it is very important to address the coupling problem of multiple parameters and solve the optimization in a computationally efficient manner. Here, we adopted Response Surface Methodology (RSM) to analyze the complex coupling effects of different operating parameters on the silicon deposition rate (R) and further achieve effective optimization of the silicon CVD system. Based on finite numerical experiments, an accurate RSM regression model is obtained and applied to predict R under different operating parameters, including temperature (T), pressure (P), inlet velocity (V), and inlet mole fraction of H2 (M). The analysis of variance is conducted to describe the rationality of the regression model and examine the statistical significance of each factor. Consequently, the optimum combination of operating parameters for the silicon CVD reactor is: T = 1400 K, P = 3.82 atm, V = 3.41 m/s, M = 0.91. The validation tests and optimum solution show that the results are in good agreement with those from the CFD model and that the deviations of the predicted values are less than 4.19%. This work provides theoretical guidance for operating the polysilicon CVD process.
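
    The RSM step can be sketched in miniature: fit a second-order polynomial to a handful of sampled runs and take the optimum of the fitted surface. For brevity, only two factors (T, P) are used, and the "deposition rate" is a mock function, not the reactor CFD model.

      # Quadratic response surface fit and grid search for the optimum.
      import numpy as np
      from itertools import product

      rng = np.random.default_rng(0)
      T = rng.uniform(1300, 1400, 30)   # temperature (K), sampled designs
      P = rng.uniform(1.0, 4.0, 30)     # pressure (atm)
      # Mock response with an interior optimum plus noise
      R = -((T - 1380) / 50)**2 - ((P - 3.5) / 1.0)**2 + rng.normal(0, 0.01, 30)

      Tn, Pn = (T - T.mean()) / T.std(), (P - P.mean()) / P.std()
      A = np.c_[np.ones_like(Tn), Tn, Pn, Tn**2, Pn**2, Tn * Pn]
      b, *_ = np.linalg.lstsq(A, R, rcond=None)

      pts = list(product(np.linspace(-2, 2, 81), repeat=2))
      vals = [b @ np.array([1, tn, pn, tn**2, pn**2, tn * pn]) for tn, pn in pts]
      tn_o, pn_o = pts[int(np.argmax(vals))]
      print("optimum near T =", round(tn_o * T.std() + T.mean(), 1), "K,",
            "P =", round(pn_o * P.std() + P.mean(), 2), "atm")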

  11. Validation of Student and Parent Reported Data on the Basic Grant Application Form, 1978-79 Comprehensive Validation Guide. Procedural Manual for: Validation of Cases Referred by Institutions; Validation of Cases Referred by the Office of Education; Recovery of Overpayments.

    ERIC Educational Resources Information Center

    Smith, Karen; And Others

    Procedures for validating data reported by students and parents on an application for Basic Educational Opportunity Grants were developed in 1978 for the U.S. Office of Education (OE). Validation activities include: validation of flagged Student Eligibility Reports (SERs) for students whose schools are part of the Alternate Disbursement System;…

  12. On approaches to analyze the sensitivity of simulated hydrologic fluxes to model parameters in the community land model

    DOE PAGES

    Bao, Jie; Hou, Zhangshuan; Huang, Maoyi; ...

    2015-12-04

    Here, effective sensitivity analysis approaches are needed to identify important parameters or factors and their uncertainties in complex Earth system models composed of multi-phase multi-component phenomena and multiple biogeophysical-biogeochemical processes. In this study, the impacts of 10 hydrologic parameters in the Community Land Model on simulations of runoff and latent heat flux are evaluated using data from a watershed. Different metrics, including residual statistics, the Nash-Sutcliffe coefficient, and log mean square error, are used as alternative measures of the deviations between the simulated and field observed values. Four sensitivity analysis (SA) approaches, including analysis of variance based on the generalized linear model, generalized cross validation based on the multivariate adaptive regression splines model, standardized regression coefficients based on a linear regression model, and analysis of variance based on support vector machine, are investigated. Results suggest that these approaches show consistent measurement of the impacts of major hydrologic parameters on response variables, but with differences in the relative contributions, particularly for the secondary parameters. The convergence behaviors of the SA with respect to the number of sampling points are also examined with different combinations of input parameter sets and output response variables and their alternative metrics. This study helps identify the optimal SA approach, provides guidance for the calibration of the Community Land Model parameters to improve the model simulations of land surface fluxes, and approximates the magnitudes to be adjusted in the parameter values during parametric model optimization.
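
    As an illustration of one of the four SA approaches named above (standardized regression coefficients from a linear regression model), the sketch below computes SRCs for a toy response; the parameter names and the toy model are assumptions, not CLM quantities.

```python
# Standardized regression coefficients (SRC): standardize inputs and output,
# then the linear-regression coefficients rank parameter importance.
import numpy as np

rng = np.random.default_rng(1)
n, names = 500, ["fdrai", "fover", "bsw"]        # hypothetical parameter names
X = rng.uniform(0, 1, size=(n, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.2, n)  # toy response (e.g. runoff)

Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, s in zip(names, src):
    print(f"{name}: SRC = {s:+.3f}")
```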

  13. Experimental validation of structural optimization methods

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M.

    1992-01-01

    The topic of validating structural optimization methods by use of experimental results is addressed. The need for validating the methods, as a way of effecting a greater and accelerated acceptance of formal optimization methods by practicing engineering designers, is described. The range of validation strategies is defined, which includes comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and finally experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described. The examples include experimental validation of the following: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low vibration helicopter rotor.

  14. FastaValidator: an open-source Java library to parse and validate FASTA formatted sequences.

    PubMed

    Waldmann, Jost; Gerken, Jan; Hankeln, Wolfgang; Schweer, Timmy; Glöckner, Frank Oliver

    2014-06-14

    Advances in sequencing technologies challenge the efficient importing and validation of FASTA formatted sequence data, which is still a prerequisite for most bioinformatic tools and pipelines. Comparative analysis of commonly used Bio*-frameworks (BioPerl, BioJava and Biopython) shows that their scalability and accuracy are hampered. FastaValidator represents a platform-independent, standardized, light-weight software library written in the Java programming language. It targets computer scientists and bioinformaticians writing software that needs to parse large amounts of sequence data quickly and accurately. For end-users, FastaValidator includes an interactive out-of-the-box validation of FASTA formatted files, as well as a non-interactive mode designed for high-throughput validation in software pipelines. The accuracy and performance of the FastaValidator library qualify it for large data sets such as those commonly produced by massively parallel (NGS) technologies. It offers scientists a fast, accurate and standardized method for parsing and validating FASTA formatted sequence data.
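
    The FastaValidator API itself is not reproduced here; the following is a generic Python sketch of the kind of checks such a validator performs (header syntax, residue alphabet), under the assumption of a simple nucleotide alphabet.

```python
# Generic FASTA validation sketch (not the FastaValidator Java API):
# flag empty headers, sequence data before any header, and invalid residues.
import re

VALID_SEQ = re.compile(r"^[ACGTUNRYSWKMBDHVacgtunryswkmbdhv\-\*]+$")

def validate_fasta(lines):
    """Yield (line_number, error) tuples for malformed FASTA input."""
    seen_header = False
    for i, line in enumerate(lines, start=1):
        line = line.rstrip("\n")
        if not line:
            continue
        if line.startswith(">"):
            seen_header = True
            if len(line) == 1:
                yield i, "empty header"
        elif not seen_header:
            yield i, "sequence data before first header"
        elif not VALID_SEQ.match(line):
            yield i, "invalid residue characters"

record = [">seq1 demo", "ACGTACGT", "XXinvalid!!"]
for lineno, err in validate_fasta(record):
    print(f"line {lineno}: {err}")
```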

  15. Parameters estimation for reactive transport: A way to test the validity of a reactive model

    NASA Astrophysics Data System (ADS)

    Aggarwal, Mohit; Cheikh Anta Ndiaye, Mame; Carrayrou, Jérôme

    The chemical parameters used in reactive transport models are not known accurately, due to the complexity and heterogeneous conditions of a real domain. We present an efficient algorithm for estimating the chemical parameters using a Monte-Carlo method. Monte-Carlo methods are very robust for the optimisation of the highly non-linear mathematical models describing reactive transport. Reactive transport of tributyltin (TBT) through natural quartz sand at seven different pHs is taken as the test case. Our algorithm is used to estimate the chemical parameters of the sorption of TBT onto the natural quartz sand. By testing and comparing three surface complexation models, we show that the proposed adsorption model cannot explain the experimental data.
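
    A minimal sketch of the Monte-Carlo estimation idea described above: draw candidate parameter sets from prior bounds, score each against the observations, and keep the best. The forward model, bounds, and data below are illustrative stand-ins, not the TBT transport model.

```python
# Monte-Carlo parameter estimation sketch: random search over prior bounds,
# ranked by sum of squared errors against (here synthetic) observations.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 50)

def forward(logK, qmax):                 # stand-in for the reactive transport model
    return qmax * t / (10**-logK * 50 + t)

obs = forward(1.5, 0.8) + rng.normal(0, 0.02, t.size)   # synthetic "data"

samples = rng.uniform([0.0, 0.1], [3.0, 2.0], size=(20000, 2))
sse = [np.sum((forward(lk, q) - obs) ** 2) for lk, q in samples]
best = samples[np.argmin(sse)]
print("best (logK, qmax):", best)
```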

  16. Determining Parameters of Fractional-Exponential Heredity Kernels of Nonlinear Viscoelastic Materials

    NASA Astrophysics Data System (ADS)

    Golub, V. P.; Pavlyuk, Ya. V.; Fernati, P. V.

    2017-07-01

    The problem of determining the parameters of fractional-exponential heredity kernels of nonlinear viscoelastic materials is solved. The methods for determining the parameters that are used in the cubic theory of viscoelasticity and the nonlinear theories based on the conditions of similarity of primary creep curves and isochronous creep diagrams are analyzed. The parameters of fractional-exponential heredity kernels are determined and experimentally validated for the oriented polypropylene, FM3001 and FM10001 nylon fibers, microplastics, TC 8/3-250 glass-reinforced plastic, SWAM glass-reinforced plastic, and contact molding glass-reinforced plastic.

  17. γ parameter and Solar System constraint in chameleon-Brans-Dicke theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saaidi, Kh.; Mohammadi, A.; Sheikhahmadi, H.

    2011-05-15

    The post-Newtonian parameter is considered in the chameleon-Brans-Dicke model. In the first step, the general form of this parameter and also the effective gravitational constant are obtained. An arbitrary function f(Φ), which describes the coupling between matter and the scalar field, is introduced to investigate the validity of the Solar System constraint. It is shown that the chameleon-Brans-Dicke model can satisfy the Solar System constraint and gives an ω parameter of order 10^4, which is comparable to the constraint indicated in [19].

  18. Validation of Satellite Derived Cloud Properties Over the Southeastern Pacific

    NASA Astrophysics Data System (ADS)

    Ayers, J.; Minnis, P.; Zuidema, P.; Sun-Mack, S.; Palikonda, R.; Nguyen, L.; Fairall, C.

    2005-12-01

    Satellite measurements of cloud properties and the radiation budget are essential for understanding meso- and large-scale processes that determine the variability in climate over the southeastern Pacific. Of particular interest in this region is the prevalent stratocumulus cloud deck. The stratocumulus albedos are directly related to cloud microphysical properties that need to be accurately characterized in Global Climate Models (GCMs) to properly estimate the Earth's radiation budget. Meteorological observations in this region are sparse, causing large uncertainties in initialized model fields. Remote sensing from satellites can provide a wealth of information about the clouds in this region, but it is vital to validate the remotely sensed parameters and to understand their relationship to other parameters that are not directly observed by the satellites. The variety of measurements from the R/V Roger Revelle during the 2003 STRATUS cruise and from the R/V Ron Brown during EPIC 2001 and the 2004 STRATUS cruises is suitable for validating and improving the interpretation of the satellite-derived cloud properties. In this study, satellite-derived cloud properties including coverage, height, optical depth, and liquid water path are compared with in situ measurements taken during the EPIC and STRATUS cruises. The remotely sensed values are derived from Geostationary Operational Environmental Satellite (GOES) imager data, Moderate Resolution Imaging Spectroradiometer (MODIS) data from the Terra and Aqua satellites, and from the Visible and Infrared Scanner (VIRS) aboard the Tropical Rainfall Measuring Mission (TRMM) satellite. The products from this study will include regional monthly cloud climatologies derived from the GOES data for the 2003 and 2004 cruises as well as micro- and macrophysical cloud property retrievals centered over the ship tracks from MODIS and VIRS.

  19. Evidence-Based Diagnostic Algorithm for Glioma: Analysis of the Results of Pathology Panel Review and Molecular Parameters of EORTC 26951 and 26882 Trials.

    PubMed

    Kros, Johan M; Huizer, Karin; Hernández-Laín, Aurelio; Marucci, Gianluca; Michotte, Alex; Pollo, Bianca; Rushing, Elisabeth J; Ribalta, Teresa; French, Pim; Jaminé, David; Bekka, Nawal; Lacombe, Denis; van den Bent, Martin J; Gorlia, Thierry

    2015-06-10

    With the rapid discovery of prognostic and predictive molecular parameters for glioma, the status of histopathology in the diagnostic process should be scrutinized. Our project aimed to construct a diagnostic algorithm for gliomas based on molecular and histologic parameters with independent prognostic values. The pathology slides of 636 patients with gliomas who had been included in EORTC 26951 and 26882 trials were reviewed using virtual microscopy by a panel of six neuropathologists who independently scored 18 histologic features and provided an overall diagnosis. The molecular data for IDH1, 1p/19q loss, EGFR amplification, loss of chromosome 10 and chromosome arm 10q, gain of chromosome 7, and hypermethylation of the promoter of MGMT were available for some of the cases. The slides were divided into discovery (n = 426) and validation (n = 210) sets. The diagnostic algorithm resulting from analysis of the discovery set was validated in the latter. In 66% of cases, consensus on the overall diagnosis was present. A diagnostic algorithm consisting of two molecular markers and one consensus histologic feature was created by conditional inference tree analysis. The order of prognostic significance was: 1p/19q loss, EGFR amplification, and astrocytic morphology, which resulted in the identification of four diagnostic nodes. Validation of the nodes in the validation set confirmed the prognostic value (P < .001). We succeeded in the creation of a timely diagnostic algorithm for anaplastic glioma based on multivariable analysis of consensus histopathology and molecular parameters. © 2015 by American Society of Clinical Oncology.

  20. Tube-Load Model Parameter Estimation for Monitoring Arterial Hemodynamics

    PubMed Central

    Zhang, Guanqun; Hahn, Jin-Oh; Mukkamala, Ramakrishna

    2011-01-01

    A useful model of the arterial system is the uniform, lossless tube with parametric load. This tube-load model is able to account for wave propagation and reflection (unlike lumped-parameter models such as the Windkessel) while being defined by only a few parameters (unlike comprehensive distributed-parameter models). As a result, the parameters may be readily estimated by accurate fitting of the model to available arterial pressure and flow waveforms so as to permit improved monitoring of arterial hemodynamics. In this paper, we review tube-load model parameter estimation techniques that have appeared in the literature for monitoring wave reflection, large artery compliance, pulse transit time, and central aortic pressure. We begin by motivating the use of the tube-load model for parameter estimation. We then describe the tube-load model, its assumptions and validity, and approaches for estimating its parameters. We next summarize the various techniques and their experimental results while highlighting their advantages over conventional techniques. We conclude the review by suggesting future research directions and describing potential applications. PMID:22053157

  1. The validation of a generalized Hooke's law for coronary arteries.

    PubMed

    Wang, Chong; Zhang, Wei; Kassab, Ghassan S

    2008-01-01

    The exponential form of constitutive model is widely used in biomechanical studies of blood vessels. There are two main issues, however, with this model: 1) the curve fits of experimental data are not always satisfactory, and 2) the material parameters may be oversensitive. A new type of strain measure in a generalized Hooke's law for blood vessels was recently proposed by our group to address these issues. The new model has one nonlinear parameter and six linear parameters. In this study, the stress-strain equation is validated by fitting the model to experimental data of porcine coronary arteries. Material constants of left anterior descending artery and right coronary artery for the Hooke's law were computed with a separable nonlinear least-squares method with an excellent goodness of fit. A parameter sensitivity analysis shows that the stability of material constants is improved compared with the exponential model and a biphasic model. A boundary value problem was solved to demonstrate that the model prediction can match the measured arterial deformation under experimental loading conditions. The validated constitutive relation will serve as a basis for the solution of various boundary value problems of cardiovascular biomechanics.
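
    A minimal sketch of the separable nonlinear least-squares idea used above: for each trial value of the single nonlinear parameter, the linear coefficients have a closed-form solution, so only a one-dimensional search is needed. The basis functions and data are illustrative, not the paper's strain measures.

```python
# Separable NLLS sketch: inner linear solve (lstsq) nested in an outer 1-D
# search over the single nonlinear parameter beta.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
x = np.linspace(0.1, 1.0, 60)
y = 2.0 * x**1.7 + 0.5 * x + rng.normal(0, 0.01, x.size)   # synthetic stress data

def sse(beta):
    A = np.column_stack([x**beta, x])      # basis depends on the nonlinear beta
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.sum((A @ c - y) ** 2)

beta = minimize_scalar(sse, bounds=(0.5, 3.0), method="bounded").x
print("estimated nonlinear parameter:", round(beta, 3))
```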

  2. Real time identification of the internal combustion engine combustion parameters based on the vibration velocity signal

    NASA Astrophysics Data System (ADS)

    Zhao, Xiuliang; Cheng, Yong; Wang, Limei; Ji, Shaobo

    2017-03-01

    Accurate combustion parameters are the foundation of effective closed-loop control of the engine combustion process. Some combustion parameters, including the start of combustion, the location of peak pressure, and the maximum pressure rise rate and its location, can be identified from the engine block vibration signals. These signals often include non-combustion-related contributions, which limit the prompt computational acquisition of the combustion parameters. The main component of these non-combustion-related contributions is considered to be caused by the reciprocating inertia force excitation (RIFE) of the engine crank train. A mathematical model is established to describe the response of the RIFE. The parameters of the model are recognized with a pattern recognition algorithm, the response of the RIFE is predicted, and the related contributions are then removed from the measured vibration velocity signals. The combustion parameters are extracted from the feature points of the renovated vibration velocity signals. There are angle deviations between the feature points in the vibration velocity signals and those in the cylinder pressure signals. For the start of combustion, a system bias is adopted to correct the deviation, and the error bound of the predicted parameters is within 1.1°. To predict the location of the maximum pressure rise rate and the location of the peak pressure, algorithms based on the proportion of high frequency components in the vibration velocity signals are introduced. Test results show that the two parameters can be predicted within 0.7° and 0.8° error bounds, respectively. The increase from the knee point preceding the peak value point to the peak value in the vibration velocity signals is used to predict the value of the maximum pressure rise rate. Finally, a monitoring framework is inferred to realize the combustion parameter prediction. Satisfactory prediction for combustion parameters in successive cycles is achieved, which…

  3. Automatic control system generation for robot design validation

    NASA Technical Reports Server (NTRS)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product, and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  4. New fundamental parameters for attitude representation

    NASA Astrophysics Data System (ADS)

    Patera, Russell P.

    2017-08-01

    A new attitude parameter set is developed to clarify the geometry of combining finite rotations in a rotational sequence and of combining infinitesimal angular increments generated by angular rate. The resulting set of six Pivot Parameters represents a rotation as a great circle arc on a unit sphere that can be located at any clocking location in the rotation plane. Two rotations are combined by linking their arcs at either of the two intersection points of the respective rotation planes. In a similar fashion, linking rotational increments produced by angular rate is used to derive the associated kinematical equations, which are linear and have no singularities. Included in this paper is the derivation of twelve Pivot Parameter elements that represent all twelve Euler Angle sequences, which enables efficient conversions between Pivot Parameters and any Euler Angle sequence. Applications of this new parameter set include the derivation of quaternions and the quaternion composition rule, as well as the derivation of the analytical solution to time-dependent coning motion. The relationships between Pivot Parameters and traditional parameter sets are included in this work. Pivot Parameters are well suited for a variety of aerospace applications due to their effective composition rule, singularity-free kinematic equations, efficient conversion to and from Euler Angle sequences, and the clarity of their geometrical foundation.

  5. Universal Parameter Measurement and Sensorless Vector Control of Induction and Permanent Magnet Synchronous Motors

    NASA Astrophysics Data System (ADS)

    Yamamoto, Shu; Ara, Takahiro

    Recently, induction motors (IMs) and permanent-magnet synchronous motors (PMSMs) have been used in various industrial drive systems. The features of the hardware device used for controlling the adjustable-speed drive in these motors are almost identical. Despite this, different techniques are generally used for parameter measurement and speed-sensorless control of these motors. If the same technique can be used for parameter measurement and sensorless control, a highly versatile adjustable-speed-drive system can be realized. In this paper, the authors describe a new universal sensorless control technique for both IMs and PMSMs (including salient pole and nonsalient pole machines). A mathematical model applicable for IMs and PMSMs is discussed. Using this model, the authors derive the proposed universal sensorless vector control algorithm on the basis of estimation of the stator flux linkage vector. All the electrical motor parameters are determined by a unified test procedure. The proposed method is implemented on three test machines. The actual driving test results demonstrate the validity of the proposed method.

  6. Blind Source Parameters for Performance Evaluation of Despeckling Filters.

    PubMed

    Biradar, Nagashettappa; Dewal, M L; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI), and beta metric. The need for noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images.

  7. Blind Source Parameters for Performance Evaluation of Despeckling Filters

    PubMed Central

    Biradar, Nagashettappa; Dewal, M. L.; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI), and beta metric. The need for noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images. PMID:27298618
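
    As an illustration of one blind metric named above, the sketch below computes the speckle suppression index (SSI): the coefficient of variation of the filtered image relative to that of the noisy image, where values below 1 indicate suppression. A median filter and gamma-distributed multiplicative noise stand in for the filters and speckle studied in the paper.

```python
# SSI sketch: SSI = (std/mean of filtered) / (std/mean of noisy).
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(8)
clean = np.full((128, 128), 100.0)
noisy = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)  # speckle-like noise

filtered = median_filter(noisy, size=5)
ssi = (filtered.std() / filtered.mean()) / (noisy.std() / noisy.mean())
print(f"SSI = {ssi:.3f}")
```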

  8. A convenient and accurate wide-range parameter relationship between Buckingham and Morse potential energy functions

    NASA Astrophysics Data System (ADS)

    Lim, Teik-Cheng; Dawson, James Alexander

    2018-05-01

    This study explores the close-range, short-range and long-range relationships between the parameters of the Morse and Buckingham potential energy functions. The results show that the close-range and short-range relationships are valid for bond compression and for very small changes in bond length, respectively, while the long-range relationship is valid for bond stretching. A wide-range relationship is proposed to combine the comparative advantages of the close-range, short-range and long-range parameter relationships. The wide-range relationship is useful for replacing the close-range, short-range and long-range parameter relationships, thereby preventing the undesired effects of potential energy jumps resulting from functional switching between the close-range, short-range and long-range interaction energies.
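
    For reference, the two textbook functional forms under comparison are sketched below with illustrative parameter values; the paper's specific close-, short-, and long-range parameter mappings are not reproduced.

```python
# Morse: U(r) = D*(exp(-2a(r-r0)) - 2*exp(-a(r-r0)))
# Buckingham: U(r) = A*exp(-B*r) - C/r^6
# All parameter values below are toy values for illustration only.
import numpy as np

D, a, r0 = 4.6, 1.9, 0.74        # Morse well depth, width, equilibrium distance
A, B, C = 2000.0, 3.5, 15.0      # Buckingham repulsion and dispersion parameters

def morse(r):
    return D * (np.exp(-2 * a * (r - r0)) - 2 * np.exp(-a * (r - r0)))

def buckingham(r):
    return A * np.exp(-B * r) - C / r**6

for ri in np.linspace(0.6, 3.0, 7):
    print(f"r = {ri:4.2f}  Morse = {morse(ri):9.3f}  Buckingham = {buckingham(ri):9.3f}")
```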

  9. Parameter Estimation and Model Validation of Nonlinear Dynamical Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, Henry; Gill, Philip

    In the performance period of this work under a DOE contract, the co-PIs, Philip Gill and Henry Abarbanel, developed new methods of statistical data assimilation for problems of DOE interest, including geophysical and biological problems. This included numerical optimization algorithms for variational principles and new parallel-processing Monte Carlo routines for performing the path integrals of statistical data assimilation. These results are summarized in the monograph "Predicting the Future: Completing Models of Observed Complex Systems" by Henry Abarbanel, published by Springer-Verlag in June 2013. Additional results and details have appeared in the peer-reviewed literature.

  10. AMSR2 Soil Moisture Product Validation

    NASA Technical Reports Server (NTRS)

    Bindlish, R.; Jackson, T.; Cosh, M.; Koike, T.; Fuiji, X.; de Jeu, R.; Chan, S.; Asanuma, J.; Berg, A.; Bosch, D.

    2017-01-01

    The Advanced Microwave Scanning Radiometer 2 (AMSR2) is part of the Global Change Observation Mission-Water (GCOM-W) mission. AMSR2 fills the void left by the loss of the Advanced Microwave Scanning Radiometer Earth Observing System (AMSR-E) after almost 10 years. Both missions provide brightness temperature observations that are used to retrieve soil moisture. Merging AMSR-E and AMSR2 will help build a consistent long-term dataset. Before tackling the integration of AMSR-E and AMSR2, it is necessary to conduct a thorough validation and assessment of the AMSR2 soil moisture products. This study focuses on validation of the AMSR2 soil moisture products by comparison with in situ reference data from a set of core validation sites. Three products that rely on different algorithms were evaluated: the JAXA Soil Moisture Algorithm (JAXA), the Land Parameter Retrieval Model (LPRM), and the Single Channel Algorithm (SCA). Results indicate that, overall, the SCA has the best performance based upon the metrics considered.
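
    A minimal sketch of the kind of site-level comparison metrics such validation studies report (bias, RMSE, unbiased RMSE, correlation); the soil moisture arrays are synthetic stand-ins for retrievals and in situ data.

```python
# Soil moisture validation metrics: bias, RMSE, ubRMSE = sqrt(RMSE^2 - bias^2),
# and Pearson correlation between retrievals and in situ observations.
import numpy as np

rng = np.random.default_rng(10)
in_situ = rng.uniform(0.05, 0.35, 365)                    # m3/m3, daily (synthetic)
retrieval = in_situ + 0.02 + rng.normal(0, 0.04, 365)     # biased, noisy retrieval

bias = np.mean(retrieval - in_situ)
rmse = np.sqrt(np.mean((retrieval - in_situ) ** 2))
ubrmse = np.sqrt(rmse**2 - bias**2)
r = np.corrcoef(retrieval, in_situ)[0, 1]
print(f"bias = {bias:.3f}, RMSE = {rmse:.3f}, ubRMSE = {ubrmse:.3f}, R = {r:.2f}")
```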

  11. Determination of some phenolic compounds in red wine by RP-HPLC: method development and validation.

    PubMed

    Burin, Vívian Maria; Arcari, Stefany Grützmann; Costa, Léa Luzia Freitas; Bordignon-Luiz, Marilde T

    2011-09-01

    A methodology employing reversed-phase high-performance liquid chromatography (RP-HPLC) was developed and validated for the simultaneous determination of five phenolic compounds in red wine. The chromatographic separation was carried out on a C18 column, with water acidified with acetic acid (pH 2.6) as solvent A and a mixture of 20% solvent A and 80% acetonitrile as solvent B as the mobile phase. The validation parameters included selectivity, linearity, range, limits of detection and quantitation, precision and accuracy, using an internal standard. All calibration curves were linear (R2 > 0.999) within the range, and good precision (RSD < 2.6%) and recovery (80-120%) were obtained for all compounds. The method was applied to quantify phenolics in red wine samples from Santa Catarina State, Brazil, and good peak separation for the phenolic compounds in these wines was observed.
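
    A minimal sketch of the linearity, LOD and LOQ portion of such a validation, using the common ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, with σ the residual standard deviation and S the calibration slope; the calibration data below are synthetic.

```python
# Calibration linearity plus ICH-style LOD/LOQ from a linear fit.
import numpy as np

conc = np.array([1, 5, 10, 20, 40, 50.0])                   # ug/mL standards
area = np.array([12.1, 60.4, 119.8, 241.0, 479.5, 601.2])   # peak areas (toy data)

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)
sigma = np.std(area - pred, ddof=2)                         # residual SD

print(f"R^2 = {r2:.4f}")
print(f"LOD = {3.3 * sigma / slope:.3f} ug/mL, LOQ = {10 * sigma / slope:.3f} ug/mL")
```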

  12. Estimating physiological skin parameters from hyperspectral signatures

    NASA Astrophysics Data System (ADS)

    Vyas, Saurabh; Banerjee, Amit; Burlina, Philippe

    2013-05-01

    We describe an approach for estimating human skin parameters, such as melanosome concentration, collagen concentration, oxygen saturation, and blood volume, using hyperspectral radiometric measurements (signatures) obtained from in vivo skin. We use a computational model based on Kubelka-Munk theory and the Fresnel equations. This model forward maps the skin parameters to a corresponding multiband reflectance spectra. Machine-learning-based regression is used to generate the inverse map, and hence estimate skin parameters from hyperspectral signatures. We test our methods using synthetic and in vivo skin signatures obtained in the visible through the short wave infrared domains from 24 patients of both genders and Caucasian, Asian, and African American ethnicities. Performance validation shows promising results: good agreement with the ground truth and well-established physiological precepts. These methods have potential use in the characterization of skin abnormalities and in minimally-invasive prescreening of malignant skin cancers.

  13. A new model for including the effect of fly ash on biochemical methane potential.

    PubMed

    Gertner, Pablo; Huiliñir, César; Pinto-Villegas, Paula; Castillo, Alejandra; Montalvo, Silvio; Guerrero, Lorna

    2017-10-01

    The modelling of the effect of trace elements on anaerobic digestion, and specifically the effect of fly ash (FA), has scarcely been studied. The present work was therefore aimed at developing a new function that allows accumulated-methane models to predict the effect of FA on the volume of methane accumulated. For this purpose, five fly ash concentrations (10, 25, 50, 250 and 500 mg/L) with raw and pre-treated sewage sludge were used to calibrate the new function, while three fly ash concentrations (40, 150 and 350 mg/L) were used for validation. Three models for accumulated methane volume (the modified Gompertz equation, the logistic function, and the transfer function) were evaluated. The results showed that methane production increased in the presence of FA when the sewage sludge was not pre-treated, while with pre-treated sludge methane production was inhibited at FA concentrations higher than 50 mg/L. In the calibration, the proposed function fits the experimental data well under all conditions, including the inhibition and stimulation zones, with the parameter values of the methane production models falling in the range of those reported in the literature. In the validation experiments, the model succeeded in representing the behavior of new experiments in both the stimulation and inhibition zones, with NRMSE and R2 ranging from 0.3577 to 0.03714 and from 0.2209 to 0.9911, respectively. Thus, the proposed model is robust and valid for the studied conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
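
    A minimal sketch of fitting one of the three evaluated models, the modified Gompertz equation M(t) = P·exp(−exp(Rm·e/P·(λ − t) + 1)); the data and starting values are synthetic, and the paper's new fly-ash function is not reproduced.

```python
# Modified Gompertz fit for a cumulative methane curve:
# P = methane potential, Rm = maximum production rate, lam = lag time.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, P, Rm, lam):
    return P * np.exp(-np.exp(Rm * np.e / P * (lam - t) + 1))

t = np.linspace(0, 30, 31)                 # days
rng = np.random.default_rng(4)
y = gompertz(t, 250.0, 20.0, 3.0) + rng.normal(0, 3.0, t.size)   # synthetic data

(P, Rm, lam), _ = curve_fit(gompertz, t, y, p0=[200, 10, 1])
print(f"P = {P:.1f} mL, Rm = {Rm:.1f} mL/d, lag = {lam:.2f} d")
```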

  14. Estimation of real-time runway surface contamination using flight data recorder parameters

    NASA Astrophysics Data System (ADS)

    Curry, Donovan

    Within this research effort, the development of an analytic process for friction coefficient estimation is presented. Under static equilibrium, the sum of forces and moments acting on the aircraft, in the aircraft body coordinate system, while on the ground at any instant is equal to zero. Under this premise the longitudinal, lateral and normal forces due to landing are calculated, along with the individual deceleration components present when an aircraft comes to rest during the ground roll. In order to validate this hypothesis, a six-degree-of-freedom aircraft model had to be created and landing tests had to be simulated on different surfaces. The simulated aircraft model includes a high-fidelity aerodynamic model, thrust model, landing gear model, friction model and antiskid model. Three main surfaces were defined in the friction model: dry, wet and snow/ice. Only the parameters recorded by an FDR are used directly from the aircraft model; all others are estimated or known a priori. The estimation of the unknown parameters is also presented in this research effort. With all needed parameters, a comparison and validation with simulated and estimated data, under different runway conditions, is performed. Finally, this report presents the results of a sensitivity analysis in order to provide a measure of reliability of the analytic estimation process. Linear and non-linear sensitivity analyses have been performed in order to quantify the level of uncertainty implicit in modeling estimated parameters and how they can affect the calculation of the instantaneous coefficient of friction. Using the approach of force and moment equilibrium about the CG at landing to reconstruct the instantaneous coefficient of friction appears to give a reasonably accurate estimate when compared to the simulated friction coefficient. This is also true when the FDR and estimated parameters are subjected to white noise and when crosswind is introduced to the simulation. After the linear analysis the…

  15. Validating the Heat Stress Indices for Using In Heavy Work Activities in Hot and Dry Climates.

    PubMed

    Hajizadeh, Roohalah; Golbabaei, Farideh; Farhang Dehghan, Somayeh; Beheshti, Mohammad Hossein; Jafari, Sayed Mohammad; Taheri, Fereshteh

    2016-01-01

    The necessity of evaluating heat stress in the workplace requires validation of the indices and selection of an optimal index. The present study aimed to assess the precision and validity of some heat stress indices and to select the optimum index for use in heavy work activities in hot and dry climates. It was carried out on 184 workers from 40 brick kiln workshops in the city of Qom, central Iran (representative of a hot and dry climate). After reviewing the working process and evaluating the workers' activity and type of work, environmental and physiological parameters were measured according to standards recommended by the International Organization for Standardization (ISO), including ISO 7243 and ISO 9886, and the indices were calculated. Workers engaged in indoor kilns experienced the highest values of natural wet-bulb temperature, dry temperature, globe temperature and relative humidity among the studied sections (P<0.05). Indoor workplaces had higher levels of all environmental parameters than outdoors (P=0.0001), except for air velocity. The wet-bulb globe temperature (WBGT) and heat stress index (HSI) had the highest correlations with the physiological parameters among the studied heat stress indices. The relationship between the WBGT index and carotid artery temperature (r=0.49), skin temperature (r=0.319), and oral temperature (r=0.203) was statistically significant (P=0.006). Since the WBGT index is approved by ISO as the most applicable index for evaluating heat stress in workplaces, and given the positive features of WBGT such as ease of measurement and calculation, together with some limitations in the application of HSI, WBGT can be introduced as the most valid empirical index of heat stress in brick workshops.
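
    For reference, the ISO 7243 WBGT formulas used in such assessments are straightforward to compute; the temperature readings below are illustrative values, not the study's measurements.

```python
# ISO 7243 wet-bulb globe temperature (WBGT), in degrees Celsius.
def wbgt_indoor(t_nwb, t_g):
    """WBGT without solar load: 0.7*Tnwb + 0.3*Tg."""
    return 0.7 * t_nwb + 0.3 * t_g

def wbgt_outdoor(t_nwb, t_g, t_db):
    """WBGT with solar load: 0.7*Tnwb + 0.2*Tg + 0.1*Tdb."""
    return 0.7 * t_nwb + 0.2 * t_g + 0.1 * t_db

print(wbgt_indoor(26.0, 45.0))          # e.g. indoor kiln readings (toy values)
print(wbgt_outdoor(24.0, 40.0, 33.0))   # e.g. outdoor readings (toy values)
```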

  16. UV Spectrophotometric Determination and Validation of Hydroquinone in Liposome.

    PubMed

    Khoshneviszadeh, Rabea; Fazly Bazzaz, Bibi Sedigheh; Housaindokht, Mohammad Reza; Ebrahim-Habibi, Azadeh; Rajabi, Omid

    2015-01-01

    A method was developed and validated for the determination of hydroquinone in a liposomal formulation. The samples were dissolved in methanol and measured at 293 nm. Validation parameters such as linearity, accuracy, precision, specificity, limit of detection (LOD) and limit of quantitation (LOQ) were determined. The calibration curve was linear over the 1-50 µg/mL range of hydroquinone with a regression coefficient of 0.9998. This study showed that the liposomal hydroquinone, composed of phospholipid (7.8%), cholesterol (1.5%), alpha-tocopherol (0.17%) and hydroquinone (0.5%), did not absorb at 293 nm when diluted 500-fold with methanol; after this dilution the hydroquinone concentration was 10 µg/mL. Furthermore, the various validation parameters were tested as per the ICH Q2B guideline and found to comply. The recoveries of liposomal hydroquinone were 102 ± 0.8%, 99 ± 0.2% and 98 ± 0.4% at the 80%, 100% and 120% levels, respectively. The relative standard deviations of inter- and intra-day precision were <2%. The LOD and LOQ were 0.24 and 0.72 µg/mL, respectively.

  17. Development and Validation of an Agency for Healthcare Research and Quality Indicator for Mortality After Congenital Heart Surgery Harmonized With Risk Adjustment for Congenital Heart Surgery (RACHS-1) Methodology.

    PubMed

    Jenkins, Kathy J; Koch Kupiec, Jennifer; Owens, Pamela L; Romano, Patrick S; Geppert, Jeffrey J; Gauvreau, Kimberlee

    2016-05-20

    The National Quality Forum previously approved a quality indicator for mortality after congenital heart surgery developed by the Agency for Healthcare Research and Quality (AHRQ). Several parameters of the validated Risk Adjustment for Congenital Heart Surgery (RACHS-1) method were included, but others differed. As part of the National Quality Forum endorsement maintenance process, developers were asked to harmonize the 2 methodologies. Parameters that were identical between the 2 methods were retained. AHRQ's Healthcare Cost and Utilization Project State Inpatient Databases (SID) 2008 were used to select optimal parameters where differences existed, with a goal to maximize model performance and face validity. Inclusion criteria were not changed and included all discharges for patients <18 years with International Classification of Diseases, Ninth Revision, Clinical Modification procedure codes for congenital heart surgery or nonspecific heart surgery combined with congenital heart disease diagnosis codes. The final model includes procedure risk group, age (0-28 days, 29-90 days, 91-364 days, 1-17 years), low birth weight (500-2499 g), other congenital anomalies (Clinical Classifications Software 217, except for 758.xx), multiple procedures, and transfer-in status. Among 17 945 eligible cases in the SID 2008, the c statistic for model performance was 0.82. In the SID 2013 validation data set, the c statistic was 0.82. Risk-adjusted mortality rates by center ranged from 0.9% to 4.1% (5th-95th percentile). Congenital heart surgery programs can now obtain national benchmarking reports by applying AHRQ Quality Indicator software to hospital administrative data, based on the harmonized RACHS-1 method, with high discrimination and face validity. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  18. Effects of Including Misidentified Sharks in Life History Analyses: A Case Study on the Grey Reef Shark Carcharhinus amblyrhynchos from Papua New Guinea.

    PubMed

    Smart, Jonathan J; Chin, Andrew; Baje, Leontine; Green, Madeline E; Appleyard, Sharon A; Tobin, Andrew J; Simpfendorfer, Colin A; White, William T

    2016-01-01

    Fisheries observer programs are used around the world to collect crucial information and samples that inform fisheries management. However, observers may misidentify similar-looking shark species. This raises questions about the level of error that species misidentifications could introduce to estimates of species' life history parameters. This study addressed these questions using the Grey Reef Shark Carcharhinus amblyrhynchos as a case study. Observer misidentification rates were quantified by validating species identifications using diagnostic photographs taken on board, supplemented with DNA barcoding. Length-at-age and maturity ogive analyses were then estimated and compared with and without the misidentified individuals. Vertebrae were retained from a total of 155 sharks identified by observers as C. amblyrhynchos. However, 22 (14%) of these sharks were misidentified by the observers and were subsequently re-identified based on photographs and/or DNA barcoding. Of the 22 individuals misidentified as C. amblyrhynchos, 16 (73%) were detected using photographs and a further 6 via genetic validation. If the misidentified individuals had been included, substantial error would have been introduced to both the length-at-age and the maturity estimates. Validating the species identifications thus increased the accuracy of the estimated life history parameters for C. amblyrhynchos. From the corrected sample, a multi-model inference approach was used to estimate growth for C. amblyrhynchos using three candidate models. The model averaged length-at-age parameters for C. amblyrhynchos with the sexes combined were L∞ = 159 cm TL and L0 = 72 cm TL. Females mature at a greater length (l50 = 136 cm TL) and older age (A50 = 9.1 years) than males (l50 = 123 cm TL; A50 = 5.9 years). The inclusion of techniques to reduce misidentification in observer programs will improve the results of life history studies and ultimately improve management through the use of more accurate data.

  19. Effects of Including Misidentified Sharks in Life History Analyses: A Case Study on the Grey Reef Shark Carcharhinus amblyrhynchos from Papua New Guinea

    PubMed Central

    Smart, Jonathan J.; Chin, Andrew; Baje, Leontine; Green, Madeline E.; Appleyard, Sharon A.; Tobin, Andrew J.; Simpfendorfer, Colin A.; White, William T.

    2016-01-01

    Fisheries observer programs are used around the world to collect crucial information and samples that inform fisheries management. However, observers may misidentify similar-looking shark species. This raises questions about the level of error that species misidentifications could introduce to estimates of species' life history parameters. This study addressed these questions using the Grey Reef Shark Carcharhinus amblyrhynchos as a case study. Observer misidentification rates were quantified by validating species identifications using diagnostic photographs taken on board, supplemented with DNA barcoding. Length-at-age and maturity ogive analyses were then estimated and compared with and without the misidentified individuals. Vertebrae were retained from a total of 155 sharks identified by observers as C. amblyrhynchos. However, 22 (14%) of these sharks were misidentified by the observers and were subsequently re-identified based on photographs and/or DNA barcoding. Of the 22 individuals misidentified as C. amblyrhynchos, 16 (73%) were detected using photographs and a further 6 via genetic validation. If the misidentified individuals had been included, substantial error would have been introduced to both the length-at-age and the maturity estimates. Validating the species identifications thus increased the accuracy of the estimated life history parameters for C. amblyrhynchos. From the corrected sample, a multi-model inference approach was used to estimate growth for C. amblyrhynchos using three candidate models. The model averaged length-at-age parameters for C. amblyrhynchos with the sexes combined were L∞ = 159 cm TL and L0 = 72 cm TL. Females mature at a greater length (l50 = 136 cm TL) and older age (A50 = 9.1 years) than males (l50 = 123 cm TL; A50 = 5.9 years). The inclusion of techniques to reduce misidentification in observer programs will improve the results of life history studies and ultimately improve management through the use of more…
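
    A minimal sketch of fitting one candidate growth model of the kind used above (a von Bertalanffy form parameterized by L∞ and L0); the length-at-age data are synthetic, not the observer-program observations.

```python
# Von Bertalanffy growth fit: L(a) = L_inf - (L_inf - L0) * exp(-k*a).
import numpy as np
from scipy.optimize import curve_fit

def vbgf(age, l_inf, l0, k):
    return l_inf - (l_inf - l0) * np.exp(-k * age)

rng = np.random.default_rng(9)
age = rng.uniform(0, 18, 120)                                    # years (synthetic)
length = vbgf(age, 159.0, 72.0, 0.18) + rng.normal(0, 5.0, age.size)

(l_inf, l0, k), _ = curve_fit(vbgf, age, length, p0=[150, 60, 0.2])
print(f"L_inf = {l_inf:.0f} cm TL, L0 = {l0:.0f} cm TL, k = {k:.3f}/yr")
```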

  20. Validation of learning assessments: A primer.

    PubMed

    Peeters, Michael J; Martin, Beth A

    2017-09-01

    The Accreditation Council for Pharmacy Education's Standards 2016 has placed greater emphasis on validating educational assessments. In this paper, we describe validity, reliability, and validation principles, drawing attention to the conceptual change that highlights one validity with multiple evidence sources; to this end, we recommend abandoning historical (confusing) terminology associated with the term validity. Further, we describe and apply Kane's framework (scoring, generalization, extrapolation, and implications) for the process of validation, with its inferences and conclusions from varied uses of assessment instruments by different colleges and schools of pharmacy. We then offer five practical recommendations that can improve reporting of validation evidence in pharmacy education literature. We describe application of these recommendations, including examples of validation evidence in the context of pharmacy education. After reading this article, the reader should be able to understand the current concept of validation, and use a framework as they validate and communicate their own institution's learning assessments. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Pulsatility Index as a Diagnostic Parameter of Reciprocating Wall Shear Stress Parameters in Physiological Pulsating Waveforms

    PubMed Central

    Avrahami, Idit; Kersh, Dikla

    2016-01-01

    Arterial wall shear stress (WSS) parameters are widely used for prediction of the initiation and development of atherosclerosis and arterial pathologies. Traditional clinical evaluation of arterial condition relies on correlations of WSS parameters with average flow rate (Q) and heart rate (HR) measurements. We show that for pulsating flow waveforms in a straight tube with flow reversals that lead to significant reciprocating WSS, the measurements of HR and Q are not sufficient for prediction of WSS parameters. Therefore, we suggest adding a third quantity—known as the pulsatility index (PI)—which is defined as the peak-to-peak flow rate amplitude normalized by Q. We examine several pulsating flow waveforms with and without flow reversals using a simulation of a Womersley model in a straight rigid tube and validate the simulations through experimental study using particle image velocimetry (PIV). The results indicate that clinically relevant WSS parameters such as the percentage of negative WSS (P[%]), oscillating shear index (OSI) and the ratio of minimum to maximum shear stress rates (min/max), are better predicted when the PI is used in conjunction with HR and Q. Therefore, we propose to use PI as an additional and essential diagnostic quantity for improved predictability of the reciprocating WSS. PMID:27893801
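
    A minimal sketch of the quantities discussed above, computing the pulsatility index of a flow waveform and two reciprocating-WSS parameters (OSI and P[%]); the sinusoidal waveform and the proportionality of WSS to flow are simplifying assumptions, not the Womersley model of the paper.

```python
# PI = peak-to-peak flow amplitude / mean flow;
# OSI = 0.5 * (1 - |mean WSS| / mean |WSS|); P[%] = % of cycle with WSS < 0.
import numpy as np

t = np.linspace(0, 1, 1000, endpoint=False)     # one cardiac cycle
q = 5.0 + 8.0 * np.sin(2 * np.pi * t)           # flow rate with reversal (toy waveform)

pi = (q.max() - q.min()) / q.mean()
print(f"PI = {pi:.2f}")

wss = 0.4 * q                                   # WSS proportional to Q (Poiseuille-like)
osi = 0.5 * (1 - abs(np.mean(wss)) / np.mean(np.abs(wss)))
p_neg = 100 * np.mean(wss < 0)
print(f"OSI = {osi:.3f}, P[%] = {p_neg:.1f}")
```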

  2. Bibliography for aircraft parameter estimation

    NASA Technical Reports Server (NTRS)

    Iliff, Kenneth W.; Maine, Richard E.

    1986-01-01

    An extensive bibliography in the field of aircraft parameter estimation has been compiled. This list contains definitive works related to most aircraft parameter estimation approaches. Theoretical studies as well as practical applications are included. Many of these publications are pertinent to subjects peripherally related to parameter estimation, such as aircraft maneuver design or instrumentation considerations.

  3. A Permutation Approach for Selecting the Penalty Parameter in Penalized Model Selection

    PubMed Central

    Sabourin, Jeremy A; Valdar, William; Nobel, Andrew B

    2015-01-01

    We describe a simple, computationally efficient, permutation-based procedure for selecting the penalty parameter in LASSO penalized regression. The procedure, permutation selection, is intended for applications where variable selection is the primary focus, and can be applied in a variety of structural settings, including that of generalized linear models. We briefly discuss connections between permutation selection and existing theory for the LASSO. In addition, we present a simulation study and an analysis of real biomedical data sets in which permutation selection is compared with selection based on the following: cross-validation (CV), the Bayesian information criterion (BIC), Scaled Sparse Linear Regression, and a selection method based on recently developed testing procedures for the LASSO. PMID:26243050
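
    A minimal sketch of the permutation idea: for standardized predictors, the smallest LASSO penalty that selects no variables is max|X'y|/n, so permuting the response and summarizing this quantity over permutations yields a data-driven penalty. The number of permutations and the summary statistic here are assumptions, not the paper's exact procedure.

```python
# Permutation-based penalty selection for the LASSO (sketch).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)
n, p = 200, 50
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] + rng.normal(size=n)          # one true signal variable

X = (X - X.mean(0)) / X.std(0)                  # standardize predictors
yc = y - y.mean()

# Smallest null-model penalty under each permutation of the response
lam_perm = [np.max(np.abs(X.T @ rng.permutation(yc))) / n for _ in range(100)]
lam = np.median(lam_perm)

coef = Lasso(alpha=lam).fit(X, yc).coef_
print(f"chosen penalty = {lam:.3f}, variables selected = {np.sum(coef != 0)}")
```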

  4. Inversion group (IG) fitting: A new T1 mapping method for modified look-locker inversion recovery (MOLLI) that allows arbitrary inversion groupings and rest periods (including no rest period).

    PubMed

    Sussman, Marshall S; Yang, Issac Y; Fok, Kai-Ho; Wintersperger, Bernd J

    2016-06-01

    The Modified Look-Locker Inversion Recovery (MOLLI) technique is used for T1 mapping in the heart. However, a drawback of this technique is that it requires lengthy rest periods between inversion groupings to allow for complete magnetization recovery. In this work, a new MOLLI fitting algorithm (inversion group [IG] fitting) is presented that allows for arbitrary combinations of inversion groupings and rest periods (including no rest period). Conventional MOLLI algorithms use a three-parameter fitting model. In IG fitting, the number of parameters is two plus the number of inversion groupings. This increased number of parameters permits any inversion grouping/rest period combination. Validation was performed through simulation, phantom, and in vivo experiments. IG fitting provided T1 values with less than 1% discrepancy across a range of inversion grouping/rest period combinations. By comparison, conventional three-parameter fits exhibited up to 30% discrepancy for some combinations. The one drawback of IG fitting was a loss of precision, approximately 30% worse than the three-parameter fits. IG fitting permits arbitrary inversion grouping/rest period combinations (including no rest period). The cost of the algorithm is a loss of precision relative to conventional three-parameter fits. Magn Reson Med 75:2332-2340, 2016. © 2015 Wiley Periodicals, Inc.
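
    For contrast with IG fitting, here is a minimal sketch of the conventional three-parameter MOLLI fit, S(TI) = A − B·exp(−TI/T1*), with the Look-Locker correction T1 = T1*(B/A − 1); the inversion times and signal values are synthetic.

```python
# Conventional three-parameter MOLLI fit with Look-Locker correction.
import numpy as np
from scipy.optimize import curve_fit

def model(ti, A, B, t1_star):
    return A - B * np.exp(-ti / t1_star)

ti = np.array([100, 180, 260, 900, 980, 1700, 1780, 2500, 3300, 4100.0])  # ms
rng = np.random.default_rng(6)
sig = model(ti, 1.0, 1.9, 800.0) + rng.normal(0, 0.01, ti.size)  # synthetic signal

(A, B, t1_star), _ = curve_fit(model, ti, sig, p0=[1, 2, 1000])
t1 = t1_star * (B / A - 1)                       # Look-Locker correction
print(f"T1* = {t1_star:.0f} ms, corrected T1 = {t1:.0f} ms")
```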

  5. Detecting severity of delamination in a lap joint using S-parameters

    NASA Astrophysics Data System (ADS)

    Islam, M. M.; Huang, H.

    2018-03-01

    The scattering parameters (S-parameters) represent the frequency response of a two-port linear time-invariant network. Treating a lap joint structure instrumented with two piezoelectric wafer active transducers (PWaTs) as such a network, this paper investigates the application of S-parameters for detecting the severity of delamination in the lap joint. The pulse-echo signal calculated from the reflection coefficients, namely the S11 and S22 parameters, can be divided into three signals, i.e. the excitation, resonant, and echo signals, based on their respective time spans. Analyzing the effects of the delamination on the resonant signal enables us to identify the resonance at which the resonant characteristics of the PWaTs are least sensitive to the delamination. Only at this resonance did we find that the reflection coefficients and the amplitude of the first-arrival echo signal changed monotonically with increasing delamination length. This finding is further validated by the time-domain pitch-catch signal calculated from the transmission coefficient (the S21 parameter). In addition, comparing the pulse-echo signals obtained from the two PWaTs enables us to determine on which side of the lap joint the delamination is located. This work establishes the S-parameters as an effective tool for evaluating the effects of damage on the PWaT resonant characteristics, based on which the PWaT resonance can be selected judiciously for damage severity detection. Correlating the reflection and transmission coefficients also provides additional validation that increases the detection confidence.

  6. Calibration of sea ice dynamic parameters in an ocean-sea ice model using an ensemble Kalman filter

    NASA Astrophysics Data System (ADS)

    Massonnet, F.; Goosse, H.; Fichefet, T.; Counillon, F.

    2014-07-01

    The choice of parameter values is crucial in the course of sea ice model development, since parameters largely affect the modeled mean sea ice state. Manual tuning of parameters will soon become impractical, as sea ice models will likely include more parameters to calibrate, leading to an exponential increase of the number of possible combinations to test. Objective and automatic methods for parameter calibration are thus progressively called on to replace the traditional heuristic, "trial-and-error" recipes. Here a method for calibration of parameters based on the ensemble Kalman filter is implemented, tested and validated in the ocean-sea ice model NEMO-LIM3. Three dynamic parameters are calibrated: the ice strength parameter P*, the ocean-sea ice drag parameter Cw, and the atmosphere-sea ice drag parameter Ca. In twin, perfect-model experiments, the default parameter values are retrieved within 1 year of simulation. Using 2007-2012 real sea ice drift data, the calibration of the ice strength parameter P* and the oceanic drag parameter Cw improves clearly the Arctic sea ice drift properties. It is found that the estimation of the atmospheric drag Ca is not necessary if P* and Cw are already estimated. The large reduction in the sea ice speed bias with calibrated parameters comes with a slight overestimation of the winter sea ice areal export through Fram Strait and a slight improvement in the sea ice thickness distribution. Overall, the estimation of parameters with the ensemble Kalman filter represents an encouraging alternative to manual tuning for ocean-sea ice models.
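
    A minimal sketch of ensemble-Kalman-filter parameter estimation in the spirit of the study above: a parameter ensemble is updated through its cross-covariance with predicted observations. A scalar toy forward model stands in for NEMO-LIM3, and the noise levels and ensemble size are illustrative.

```python
# EnKF parameter calibration sketch: iterate ensemble updates toward the
# parameter value that reproduces the (perturbed) observations.
import numpy as np

rng = np.random.default_rng(7)
true_p, obs_err = 2.0, 0.1
forward = lambda p: 3.0 * np.tanh(p)            # toy stand-in for the sea-ice model

ens = rng.normal(1.0, 0.5, size=100)            # initial parameter ensemble
for _ in range(10):                             # assimilation cycles
    y_obs = forward(true_p) + rng.normal(0, obs_err)
    hx = forward(ens)                           # predicted observations
    cov_py = np.cov(ens, hx)[0, 1]              # parameter-observation covariance
    gain = cov_py / (np.var(hx) + obs_err**2)   # scalar Kalman gain
    ens = ens + gain * (y_obs + rng.normal(0, obs_err, ens.size) - hx)

print(f"estimated parameter = {ens.mean():.2f} (truth {true_p})")
```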

  7. Validation of updated neutronic calculation models proposed for Atucha-II PHWR. Part I: Benchmark comparisons of WIMS-D5 and DRAGON cell and control rod parameters with MCNP5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mollerach, R.; Leszczynski, F.; Fink, J.

    2006-07-01

    In 2005 the Argentine Government took the decision to complete the construction of the Atucha-II nuclear power plant, which had been progressing slowly during the previous ten years. Atucha-II is a 745 MWe nuclear station of German (Siemens) design located in Argentina, moderated and cooled with heavy water. It has a pressure-vessel design with 451 vertical coolant channels, and the fuel assemblies (FA) are clusters of 37 natural UO2 rods with an active length of 530 cm. In the reactor physics area, a revision and update of the calculation methods and models was recently carried out, covering cell, supercell (control rod) and core calculations. As a validation of the new models, some benchmark comparisons were made against Monte Carlo calculations with MCNP5. This paper presents comparisons of cell and supercell benchmark problems, based on a slightly idealized model of the Atucha-I core, between results obtained with the WIMS-D5 and DRAGON codes and MCNP5 results. The Atucha-I core was selected because it is smaller, similar from a neutronic point of view, and more symmetric than Atucha-II. Cell parameters compared include cell k-infinity, relative power levels of the different rings of fuel rods, and some two-group macroscopic cross sections. Supercell comparisons include supercell k-infinity changes due to the control rods (tubes) of steel and hafnium. (authors)

  8. A comparison between two powder compaction parameters of plasticity: the effective medium A parameter and the Heckel 1/K parameter.

    PubMed

    Mahmoodi, Foad; Klevan, Ingvild; Nordström, Josefina; Alderborn, Göran; Frenning, Göran

    2013-09-10

    The purpose of the research was to introduce a procedure to derive a powder compression parameter (EM A) representing particle yield stress using an effective medium equation, and to compare the EM A parameter with the Heckel compression parameter (1/K). Sixteen pharmaceutical powders, including drugs and excipients, were compressed in a materials testing instrument and powder compression profiles were derived using the EM and Heckel equations. The compression profiles thus obtained could be sub-divided into regions, among which one region was approximately linear, and from this region the compression parameters EM A and 1/K were calculated. A linear relationship between the EM A parameter and the 1/K parameter was obtained with a strong correlation. The slope of the plot was close to 1 (0.84) and the intercept of the plot was small in comparison to the range of parameter values obtained. The relationship between the theoretical EM A parameter and the 1/K parameter supports the interpretation of the empirical Heckel parameter as being a measure of yield stress. It is concluded that the combination of Heckel and EM equations represents a suitable procedure to derive a value of particle plasticity from powder compression data. Copyright © 2013 Elsevier B.V. All rights reserved.
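
    A minimal sketch of the Heckel analysis behind the 1/K parameter: regress ln(1/(1 − D)) on compaction pressure over the linear region, and report the reciprocal slope as the yield pressure; the density-pressure data below are synthetic.

```python
# Heckel analysis: ln(1/(1-D)) = K*P + A, with yield pressure Py = 1/K.
import numpy as np

pressure = np.linspace(50, 250, 9)                     # MPa, assumed linear region
rel_density = 1 - 0.4 * np.exp(-0.008 * pressure)      # synthetic D(P) profile

heckel_y = np.log(1.0 / (1.0 - rel_density))
K, A = np.polyfit(pressure, heckel_y, 1)
print(f"Heckel 1/K (yield pressure) = {1/K:.0f} MPa, intercept A = {A:.2f}")
```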

  9. A Method of Q-Matrix Validation for the Linear Logistic Test Model

    PubMed Central

    Baghaei, Purya; Hohensinn, Christine

    2017-01-01

    The linear logistic test model (LLTM) is a well-recognized psychometric model for examining the components of difficulty in cognitive tests and validating construct theories. The plausibility of the construct model, summarized in a matrix of weights, known as the Q-matrix or weight matrix, is tested by (1) comparing the fit of LLTM with the fit of the Rasch model (RM) using the likelihood ratio (LR) test and (2) examining the correlation between the Rasch model item parameters and LLTM reconstructed item parameters. The problem with the LR test is that it is almost always significant and, consequently, LLTM is rejected. The drawback of examining the correlation coefficient is that there is no cut-off value or lower bound for the magnitude of the correlation coefficient. In this article we suggest a simulation method to set a minimum benchmark for the correlation between item parameters from the Rasch model and those reconstructed by the LLTM. If the cognitive model is valid, then the correlation coefficient between the RM-based item parameters and the LLTM-reconstructed item parameters derived from the theoretical weight matrix should be greater than the coefficients derived from the simulated matrices. PMID:28611721
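
    A minimal sketch of the suggested benchmark, assuming least-squares reconstruction of item difficulties from a weight matrix and random binary matrices of the same density as the theoretical Q-matrix; the authors' simulation scheme may differ in detail.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def lltm_correlation(beta, Q):
        """Correlation between Rasch difficulties beta and the LLTM
        reconstruction Q @ eta, with eta fit by least squares."""
        eta, *_ = np.linalg.lstsq(Q, beta, rcond=None)
        return np.corrcoef(beta, Q @ eta)[0, 1]

    def simulated_benchmark(beta, Q, n_sim=1000, q=0.95):
        """Upper quantile of correlations from random Q-matrices of the same
        shape and density -- the minimum the theoretical Q should exceed."""
        density = Q.mean()
        sims = [lltm_correlation(beta, rng.binomial(1, density, Q.shape))
                for _ in range(n_sim)]
        return np.quantile(sims, q)
    ```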

  10. [Development of an analyzing system for soil parameters based on NIR spectroscopy].

    PubMed

    Zheng, Li-Hua; Li, Min-Zan; Sun, Hong

    2009-10-01

    A rapid estimation system for soil parameters based on spectral analysis was developed using object-oriented (OO) technology. A SOIL class was designed; an instance of the SOIL class is a soil-sample object with a particular type, specific physical properties and spectral characteristics. By extracting the effective information from the modeling spectral data of a soil object, a mapping model is established between the soil parameters and the spectral data, and the mapping model parameters can be saved in the model database. When forecasting the content of any soil parameter, the prediction model corresponding to the same soil type and similar soil physical properties can be selected; after the target soil-sample object is passed into the prediction model and processed by the system, an accurate forecast of the target sample's content is obtained. The system includes modules for file operations, spectra pretreatment, sample analysis, calibration and validation, and sample content forecasting. The system was designed to run independently of the measurement equipment. The parameters and spectral data files (*.xls) of known soil samples can be input into the system. Various data pretreatments can be selected according to the concrete conditions; the predicted contents appear in the terminal, and the forecasting model can be stored in the model database. The system reads the prediction models and their parameters saved in the model database through the module interface, and the data of the tested samples are then passed into the selected model. Finally, the content of soil parameters can be predicted by the developed system. The system was programmed with Visual C++ 6.0 and Matlab 7.0, and Access XP was used to create and manage the model database.
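
    The calibrate-store-predict workflow described above can be sketched as follows, with a PLS regression standing in for the spectra-to-parameter mapping model; the actual system was written in Visual C++ and Matlab, so every name here is illustrative.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    class Soil:
        """A soil-sample object: spectra plus one reference parameter."""
        def __init__(self, spectra, values):
            self.spectra = np.asarray(spectra)  # (n_samples, n_wavelengths)
            self.values = np.asarray(values)    # (n_samples,)

    models = {}  # stands in for the model database

    def calibrate(soil_type, soil, n_components=5):
        model = PLSRegression(n_components=n_components)
        model.fit(soil.spectra, soil.values)
        models[soil_type] = model               # "save to the model database"

    def predict(soil_type, spectra):
        """Select the stored model for this soil type and forecast content."""
        return models[soil_type].predict(np.asarray(spectra)).ravel()
    ```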

  11. Machining Parameters Optimization using Hybrid Firefly Algorithm and Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Farahlina Johari, Nur; Zain, Azlan Mohd; Haszlinna Mustaffa, Noorfa; Udin, Amirmudin

    2017-09-01

    The Firefly Algorithm (FA) is a metaheuristic algorithm inspired by the flashing behavior of fireflies and the phenomenon of bioluminescent communication; in this research it is used to optimize the machining parameters (feed rate, depth of cut, and spindle speed). The algorithm is hybridized with Particle Swarm Optimization (PSO) to discover better solutions when exploring the search space. The objective function of previous research is used to optimize the machining parameters in the turning operation. The optimal machining cutting parameters estimated by FA that lead to a minimum surface roughness are validated using an ANOVA test.
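
    For illustration, the core FA move is sketched below: each firefly is attracted to every brighter one with distance-dependent attractiveness plus a random step. The constants are placeholders, and the exact FA-PSO blending used in the paper is not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def firefly_step(pos, fitness, beta0=1.0, gamma=1.0, alpha=0.1):
        """One FA sweep over an (n, d) swarm; lower fitness = brighter.

        In an FA-PSO hybrid, a PSO-style pull toward the global best (and a
        velocity memory term) would be blended into this move.
        """
        n, d = pos.shape
        new = pos.copy()
        for i in range(n):
            for j in range(n):
                if fitness[j] < fitness[i]:   # firefly j is brighter
                    r2 = np.sum((pos[j] - new[i]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    new[i] += beta * (pos[j] - new[i]) + alpha * rng.normal(size=d)
        return new
    ```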

  12. The New Millennium Program: Validating Advanced Technologies for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Minning, Charles P.; Luers, Philip

    1999-01-01

    This presentation reviews the activities of the New Millennium Program (NMP) in validating advanced technologies for space missions. The focus of these breakthrough technologies is to enable new capabilities to fulfill the science needs, while reducing costs of future missions. There is a broad spectrum of NMP partners, including government agencies, universities and private industry. DS-1 was launched on October 24, 1998. Amongst the technologies validated by the NMP on DS-1 are a Low Power Electronics Experiment, the Power Activation and Switching Module, and Multi-Functional Structures. The first two of these technologies are operational and the data analysis is still ongoing. The third is also operational, and its performance parameters have been verified. The second mission, DS-2, was launched January 3, 1999, and is expected to impact near Mars' southern polar region on December 3, 1999. The technologies on this mission awaiting validation are an advanced microcontroller, a power microelectronics unit, an evolved water experiment and soil thermal conductivity experiment, Lithium-Thionyl Chloride batteries, a flexible cable interconnect, an aeroshell/entry system, and a compact telecom system. EO-1, on schedule for launch in December 1999, carries several technologies to be validated, amongst them a Carbon-Carbon Radiator, an X-band Phased Array Antenna, a pulsed plasma thruster, a wideband advanced recorder processor, an atmospheric corrector, lightweight flexible solar arrays, the Advanced Land Imager and the Hyperion instrument.

  13. Edge Modeling by Two Blur Parameters in Varying Contrasts.

    PubMed

    Seo, Suyoung

    2018-06-01

    This paper presents a method of modeling edge profiles with two blur parameters, and estimating and predicting those edge parameters with varying brightness combinations and camera-to-object distances (COD). First, the validity of the edge model is proven mathematically. Then, it is proven experimentally with edges from a set of images captured for specifically designed target sheets and with edges from natural images. Estimation of the two blur parameters for each observed edge profile is performed with a brute-force method to find parameters that produce global minimum errors. Then, using the estimated blur parameters, actual blur parameters of edges with arbitrary brightness combinations are predicted using a surface interpolation method (i.e., kriging). The predicted surfaces show that the two blur parameters of the proposed edge model depend on both dark-side edge brightness and light-side edge brightness following a certain global trend. This is similar across varying CODs. The proposed edge model is compared with a one-blur parameter edge model using experiments of the root mean squared error for fitting the edge models to each observed edge profile. The comparison results suggest that the proposed edge model has superiority over the one-blur parameter edge model in most cases where edges have varying brightness combinations.

  14. Validation and upgrading of physically based mathematical models

    NASA Technical Reports Server (NTRS)

    Duval, Ronald

    1992-01-01

    The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

  15. Realistic sampling of anisotropic correlogram parameters for conditional simulation of daily rainfields

    NASA Astrophysics Data System (ADS)

    Gyasi-Agyei, Yeboah

    2018-01-01

    This paper has established a link between the spatial structure of radar rainfall, which more robustly describes the spatial structure, and gauge rainfall for improved daily rainfield simulation conditioned on the limited gauged data for regions with or without radar records. A two-dimensional anisotropic exponential function that has parameters of major and minor axes lengths, and direction, is used to describe the correlogram (spatial structure) of daily rainfall in the Gaussian domain. The link is a copula-based joint distribution of the radar-derived correlogram parameters that uses the gauge-derived correlogram parameters and maximum daily temperature as covariates of the Box-Cox power exponential margins and Gumbel copula. While the gauge-derived, radar-derived and the copula-derived correlogram parameters reproduced the mean estimates similarly using leave-one-out cross-validation of ordinary kriging, the gauge-derived parameters yielded higher standard deviation (SD) of the Gaussian quantile which reflects uncertainty in over 90% of cases. However, the distribution of the SD generated by the radar-derived and the copula-derived parameters could not be distinguished. For the validation case, the percentage of cases of higher SD by the gauge-derived parameter sets decreased to 81.2% and 86.6% for the non-calibration and the calibration periods, respectively. It has been observed that 1% reduction in the Gaussian quantile SD can cause over 39% reduction in the SD of the median rainfall estimate, actual reduction being dependent on the distribution of rainfall of the day. Hence the main advantage of using the most correct radar correlogram parameters is to reduce the uncertainty associated with conditional simulations that rely on SD through kriging.
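
    The two-dimensional anisotropic exponential correlogram described above, parameterized by major and minor axis lengths and a direction, can be written compactly; names are illustrative.

    ```python
    import numpy as np

    def anisotropic_exp_correlogram(dx, dy, major, minor, theta):
        """Correlation at lag (dx, dy): rotate the lag by direction theta
        (radians), scale by the axis lengths, and apply exp(-h)."""
        u = np.cos(theta) * dx + np.sin(theta) * dy
        v = -np.sin(theta) * dx + np.cos(theta) * dy
        h = np.sqrt((u / major) ** 2 + (v / minor) ** 2)
        return np.exp(-h)
    ```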

  16. Photospheric properties and fundamental parameters of M dwarfs

    NASA Astrophysics Data System (ADS)

    Rajpurohit, A. S.; Allard, F.; Teixeira, G. D. C.; Homeier, D.; Rajpurohit, S.; Mousis, O.

    2018-02-01

    Context: M dwarfs are an important source of information when studying and probing the lower end of the Hertzsprung-Russell (HR) diagram, down to the hydrogen-burning limit. Being the most numerous and oldest stars in the galaxy, they carry fundamental information on its chemical history. The presence of molecules in their atmospheres, along with various condensed species, complicates our understanding of their physical properties and thus makes the determination of their fundamental stellar parameters more challenging. Aims: The aim of this study is to perform a detailed spectroscopic analysis of the high-resolution H-band spectra of M dwarfs in order to determine their fundamental stellar parameters and to validate atmospheric models. The present study will also help us to understand various processes, including dust formation and depletion of metals onto dust grains in M dwarf atmospheres. The high spectral resolution also provides a unique opportunity to constrain other chemical and physical processes that occur in a cool atmosphere. Methods: The high-resolution APOGEE spectra of M dwarfs, covering the entire H-band, provide a unique opportunity to measure their fundamental parameters. We have performed a detailed spectral synthesis by comparing these high-resolution H-band spectra to the most recent BT-Settl models and have obtained fundamental parameters such as effective temperature, surface gravity, and metallicity (Teff, log g, and [Fe/H]), respectively. Results: We have determined Teff, log g, and [Fe/H] for 45 M dwarfs using high-resolution H-band spectra. The derived Teff for the sample ranges from 3100 to 3900 K, values of log g lie in the range 4.5 ≤ log g ≤ 5.5, and the resulting metallicities lie in the range -0.5 ≤ [Fe/H] ≤ +0.5. We have explored systematic differences between effective temperature and metallicity calibrations with other studies using the same sample of M dwarfs. We have also shown that the stellar

  17. Further Validation of the Coach Identity Prominence Scale

    ERIC Educational Resources Information Center

    Pope, J. Paige; Hall, Craig R.

    2014-01-01

    This study was designed to examine select psychometric properties of the Coach Identity Prominence Scale (CIPS), including the reliability, factorial validity, convergent validity, discriminant validity, and predictive validity. Coaches (N = 338) who averaged 37 (SD = 12.27) years of age, had a mean of 13 (SD = 9.90) years of coaching experience,…

  18. REVIEW OF INDOOR EMISSION SOURCE MODELS: PART 2. PARAMETER ESTIMATION

    EPA Science Inventory

    This review consists of two sections. Part 1 provides an overview of 46 indoor emission source models. Part 2 (this paper) focuses on parameter estimation, a topic that is critical to modelers but has never been systematically discussed. A perfectly valid model may not be a usefu...

  19. Bias in error estimation when using cross-validation for model selection.

    PubMed

    Varma, Sudhir; Simon, Richard

    2006-02-23

    Cross-validation (CV) is an effective method for estimating the prediction error of a classifier. Some recent articles have proposed methods for optimizing classifiers by choosing classifier parameter values that minimize the CV error estimate. We have evaluated the validity of using the CV error estimate of the optimized classifier as an estimate of the true error expected on independent data. We used CV to optimize the classification parameters for two kinds of classifiers: Shrunken Centroids and Support Vector Machines (SVM). Random training datasets were created, with no difference in the distribution of the features between the two classes. Using these "null" datasets, we selected classifier parameter values that minimized the CV error estimate. 10-fold CV was used for Shrunken Centroids while Leave-One-Out CV (LOOCV) was used for the SVM. Independent test data were created to estimate the true error. With "null" and "non-null" (with differential expression between the classes) data, we also tested a nested CV procedure, where an inner CV loop is used to perform the tuning of the parameters while an outer CV is used to compute an estimate of the error. The CV error estimate for the classifier with the optimal parameters was found to be a substantially biased estimate of the true error that the classifier would incur on independent data. Even though there is no real difference between the two classes for the "null" datasets, the CV error estimate for the Shrunken Centroid with the optimal parameters was less than 30% on 18.5% of simulated training datasets. For SVM with optimal parameters the estimated error rate was less than 30% on 38% of "null" datasets. Performance of the optimized classifiers on the independent test set was no better than chance. The nested CV procedure reduces the bias considerably and gives an estimate of the error that is very close to that obtained on the independent testing set for both Shrunken Centroids and SVM classifiers for
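
    A minimal sketch of the nested CV procedure on a "null" dataset, using a linear SVM in scikit-learn (the study also used Shrunken Centroids and LOOCV; those variants are omitted here).

    ```python
    import numpy as np
    from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(3)
    X = rng.normal(size=(60, 500))      # "null" data: no class signal
    y = np.repeat([0, 1], 30)

    inner = KFold(n_splits=5, shuffle=True, random_state=0)  # tunes C
    outer = KFold(n_splits=5, shuffle=True, random_state=1)  # estimates error

    tuned = GridSearchCV(SVC(kernel="linear"), {"C": [0.01, 0.1, 1, 10]}, cv=inner)
    nested_err = 1.0 - cross_val_score(tuned, X, y, cv=outer).mean()
    print(f"nested CV error: {nested_err:.2f}")  # should hover near 0.5 (chance)
    ```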

  20. Electronegativity Equalization Method: Parameterization and Validation for Large Sets of Organic, Organohalogene and Organometal Molecule

    PubMed Central

    Vařeková, Radka Svobodová; Jiroušková, Zuzana; Vaněk, Jakub; Suchomel, Šimon; Koča, Jaroslav

    2007-01-01

    The Electronegativity Equalization Method (EEM) is a fast approach for charge calculation. A challenging part of the EEM is the parameterization, which is performed using ab initio charges obtained for a set of molecules. The goal of our work was to perform the EEM parameterization for selected sets of organic, organohalogen and organometal molecules. We have performed the most robust parameterization published so far. The EEM parameterization was based on 12 training sets selected from a database of predicted 3D structures (NCI DIS) and from a database of crystallographic structures (CSD). Each set contained from 2000 to 6000 molecules. We have shown that the number of molecules in the training set is very important for quality of the parameters. We have improved EEM parameters (STO-3G MPA charges) for elements that were already parameterized, specifically: C, O, N, H, S, F and Cl. The new parameters provide more accurate charges than those published previously. We have also developed new parameters for elements that were not parameterized yet, specifically for Br, I, Fe and Zn. We have also performed crossover validation of all obtained parameters using all training sets that included relevant elements and confirmed that calculated parameters provide accurate charges.
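
    For orientation, EEM reduces charge calculation to a single linear solve: each atom's effective electronegativity A_i + B_i*q_i + kappa * sum_j q_j/R_ij is equalized to a common unknown value under a total-charge constraint. The sketch below assumes generic per-atom parameters A and B and a generic kappa; the calibrated values are precisely what the parameterization above provides and are not reproduced here.

    ```python
    import numpy as np

    def eem_charges(coords, A, B, total_charge=0.0, kappa=14.4):
        """Solve the EEM linear system for atomic charges.

        coords : (n, 3) positions; A, B : (n,) per-atom EEM parameters.
        kappa=14.4 assumes eV and Angstrom units (an assumption here).
        """
        A, B = np.asarray(A, float), np.asarray(B, float)
        coords = np.asarray(coords, float)
        n = len(A)
        R = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
        M = np.zeros((n + 1, n + 1))
        off = kappa / np.where(R == 0.0, 1.0, R)      # diagonal replaced below
        M[:n, :n] = np.where(np.eye(n, dtype=bool), B, off)
        M[:n, n] = -1.0   # common (equalized) electronegativity, unknown
        M[n, :n] = 1.0    # sum of charges = total_charge
        rhs = np.concatenate([-A, [total_charge]])
        return np.linalg.solve(M, rhs)[:n]
    ```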

  1. RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    NASA Astrophysics Data System (ADS)

    Capote, R.; Herman, M.; Obložinský, P.; Young, P. G.; Goriely, S.; Belgya, T.; Ignatyuk, A. V.; Koning, A. J.; Hilaire, S.; Plujko, V. A.; Avrigeanu, M.; Bersillon, O.; Chadwick, M. B.; Fukahori, T.; Ge, Zhigang; Han, Yinlu; Kailas, S.; Kopecky, J.; Maslov, V. M.; Reffo, G.; Sin, M.; Soukhovitskii, E. Sh.; Talou, P.

    2009-12-01

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input, therefore the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains

  4. Parameter extraction using global particle swarm optimization approach and the influence of polymer processing temperature on the solar cell parameters

    NASA Astrophysics Data System (ADS)

    Kumar, S.; Singh, A.; Dhar, A.

    2017-08-01

    The accurate estimation of photovoltaic parameters is fundamental to gain an insight into the physical processes occurring inside a photovoltaic device and thereby to optimize its design, fabrication processes, and quality. A simulative approach to accurately determining the device parameters is crucial for cell array and module simulation when applied in practical on-field applications. In this work, we have developed a global particle swarm optimization (GPSO) approach to estimate the different solar cell parameters, viz., ideality factor (η), short circuit current (Isc), open circuit voltage (Voc), shunt resistance (Rsh), and series resistance (Rs), with a wide search range of over ±100% for each model parameter. After validating the accuracy and global search power of the proposed approach with synthetic and noisy data, we applied the technique to extract the PV parameters of ZnO/PCDTBT based hybrid solar cells (HSCs) prepared under different annealing conditions. Further, we examine the variation of extracted model parameters to unveil the physical processes occurring when different annealing temperatures are employed during the device fabrication and establish the role of improved charge transport in polymer films from independent FET measurements. The evolution of surface morphology, optical absorption, and chemical compositional behaviour of PCDTBT co-polymer films as a function of processing temperature has also been captured in the study and correlated with the findings from the PV parameters extracted using the GPSO approach.
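
    A sketch of the idea using a plain global-best PSO (not the paper's GPSO variant) to minimize a single-diode-model residual, with the measured current reused in the implicit terms, a common simplification in extraction work; the bounds, constants and residual form are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    VT = 0.025852  # thermal voltage at ~300 K (V)

    def diode_residual(theta, V, I):
        """Sum-of-squares residual of the single-diode model."""
        Iph, I0, n, Rs, Rsh = theta
        I_model = Iph - I0 * (np.exp((V + I * Rs) / (n * VT)) - 1.0) \
                  - (V + I * Rs) / Rsh
        return np.sum((I - I_model) ** 2)

    def pso(obj, lo, hi, n_part=40, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Plain global-best PSO over the box [lo, hi]."""
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        x = rng.uniform(lo, hi, (n_part, lo.size))
        v = np.zeros_like(x)
        pbest, pval = x.copy(), np.array([obj(p) for p in x])
        g = pbest[pval.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([obj(p) for p in x])
            better = f < pval
            pbest[better], pval[better] = x[better], f[better]
            g = pbest[pval.argmin()].copy()
        return g, pval.min()
    ```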

  5. Synthetic Jet Flow Field Database for CFD Validation

    NASA Technical Reports Server (NTRS)

    Yao, Chung-Sheng; Chen, Fang Jenq; Neuhart, Dan; Harris, Jerome

    2004-01-01

    An oscillatory zero net mass flow jet was generated by a cavity-pumping device, namely a synthetic jet actuator. This basic oscillating jet flow field was selected as the first of the three test cases for the Langley workshop on CFD Validation of Synthetic Jets and Turbulent Separation Control. The purpose of this workshop was to assess the current CFD capabilities to predict unsteady flow fields of synthetic jets and separation control. This paper describes the characteristics and flow field database of a synthetic jet in a quiescent fluid. In this experiment, Particle Image Velocimetry (PIV), Laser Doppler Velocimetry (LDV), and hot-wire anemometry were used to measure the jet velocity field. In addition, the actuator operating parameters including diaphragm displacement, internal cavity pressure, and internal cavity temperature were also documented to provide boundary conditions for CFD modeling.

  6. Validity of a basketball-specific complex test in female professional players.

    PubMed

    Schwesig, René; Hermassi, Souhail; Lauenroth, Andreas; Laudner, Kevin; Koke, Alexander; Bartels, Thomas; Delank, Stefan; Schulze, Stephan

    2018-06-01

    The purpose of this study was to assess the validity of a new basketball-specific complex test (BBCT) based on the ascertained match performance. Fourteen female professional basketball players (ages: 23.4 ± 1.8 years) performed the BBCT and a treadmill test (TT) at the beginning of pre-season training. Lactate, heart rate (HR), time, shooting precision and number of errors were measured during the four test sequences of the BBCT (short distance sprinting with direction changes, with and without a ball; fast break; lay-up parcours; sprint endurance test). In addition, lactate threshold (LT) and HR were assessed at selected times throughout the TT and the BBCT and over 6 (TT) or 10 (BBCT) minutes after the tests. The match performance score (mps) was calculated from specific parameters (e.g., points) collected during all matches of the subsequent season (22 matches). The mps served as the "gold standard" within the validation process for the BBCT and the TT. TT parameters demonstrated an explained variance (EV) between 0% (HR recovery) and 11% (running speed at 6 mmol/l LT). The EV from the BBCT was higher and ranged from 0% (HR recovery 6 minutes after end of exercise) to 28% (sprint endurance test after 8 of 10 sprints). Ten out of 21 BBCT parameters (48%) and 2 out of 5 TT parameters (40%) demonstrated an EV higher than 10%. Average EV for all parameters was 12% (BBCT) and 6% (TT), respectively. The BBCT had a higher validity than the TT for predicting match performance. These findings suggest that coaches and scientists should consider using the BBCT testing protocol to estimate the match performance abilities of elite female players. © Georg Thieme Verlag KG Stuttgart · New York.

  7. Iced Aircraft Flight Data for Flight Simulator Validation

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas P.; Blankenship, Kurt; Rieke, William; Brinker, David J.

    2003-01-01

    NASA is developing and validating technology to incorporate aircraft icing effects into a flight training device concept demonstrator. Flight simulation models of a DHC-6 Twin Otter were developed from wind tunnel data using a subscale, complete aircraft model with and without simulated ice, and from previously acquired flight data. The validation of the simulation models required additional aircraft response time histories of the airplane configured with simulated ice similar to the subscale model testing. Therefore, a flight test was conducted using the NASA Twin Otter Icing Research Aircraft. Over 500 maneuvers of various types were conducted in this flight test. The validation data consisted of aircraft state parameters, pilot inputs, propulsion, weight, center of gravity, and moments of inertia with the airplane configured with different amounts of simulated ice. Emphasis was placed on acquiring data at wing stall and tailplane stall, since these events are of primary interest to model accurately in the flight training device. Analyses of several datasets are described regarding wing and tailplane stall. Key findings from these analyses are that the simulated wing ice shapes significantly reduced the maximum lift coefficient (CL,max), while the simulated tail ice caused elevator control force anomalies and tailplane stall when flaps were deflected 30 deg or greater. This effectively reduced the safe operating margins between iced wing and iced tail stall as flap deflection and thrust were increased. This flight test demonstrated that the critical aspects to be modeled in the icing effects flight training device include: iced wing and tail stall speeds, flap and thrust effects, control forces, and control effectiveness.

  8. Performing a Content Validation Study.

    ERIC Educational Resources Information Center

    Spool, Mark D.

    Content validity is concerned with three components: (1) the job content; (2) the test content, and (3) the strength of the relationship between the two. A content validation study, to be considered adequate and defensible should include at least the following four procedures: (1) A thorough and accurate job analysis (to define the job content);…

  9. PSI-Center Validation Studies

    NASA Astrophysics Data System (ADS)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Sutherland, D. A.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2014-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with 3D extended MHD simulations using the NIMROD, HiFi, and PSI-TET codes. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), HBT-EP (Columbia), HIT-SI (U Wash-UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). The PSI-Center is exploring application of validation metrics between experimental data and simulation results. Biorthogonal decomposition (BOD) is used to compare experiments with simulations. BOD separates data sets into spatial and temporal structures, giving greater weight to dominant structures. Several BOD metrics are being formulated with the goal of quantitative validation. Results from these simulation and validation studies, as well as an overview of the PSI-Center status, will be presented.

  10. LAnd surface remote sensing Products VAlidation System (LAPVAS) and its preliminary application

    NASA Astrophysics Data System (ADS)

    Lin, Xingwen; Wen, Jianguang; Tang, Yong; Ma, Mingguo; Dou, Baocheng; Wu, Xiaodan; Meng, Lumin

    2014-11-01

    The long-term record of remote sensing products captures land surface parameters and their spatial and temporal change, widely supporting regional and global scientific research. Remote sensing products derived from different sensors and different algorithms need to be validated to ensure high product quality. Investigation of remote sensing product validation shows that it is a complex process, involving both quality requirements on the in-situ data and methods for precision assessment. A comprehensive validation covering long time series and multiple land surface types is therefore needed. A system named the LAnd surface remote sensing Products VAlidation System (LAPVAS) is designed in this paper to assess the uncertainty of remote sensing products based on a large amount of in-situ data and validation techniques. The designed validation system platform consists of three parts: a validation database, a precision analysis subsystem, and internal and external system interfaces. These three parts are built from essential service modules, such as data-read, data-insert, data-association, precision-analysis, and scale-change services. To run the validation system platform, users order these service modules and choreograph them through user interaction, and then complete the validation tasks for remote sensing products (such as LAI, albedo, or VI). A service-oriented architecture (SOA) is taken as the framework of this system; its benefit is that the service modules remain independent of any development environment through standards such as the Web Service Description Language (WSDL). C++ and Java are used as the primary programming languages to create the service modules. One key land surface parameter, albedo, is selected as an example of the system application, illustrating that LAPVAS performs well in validating land surface remote sensing products.

  11. The validation index: a new metric for validation of segmentation algorithms using two or more expert outlines with application to radiotherapy planning.

    PubMed

    Juneja, Prabhjot; Evans, Philp M; Harris, Emma J

    2013-08-01

    Validation is required to ensure automated segmentation algorithms are suitable for radiotherapy target definition. In the absence of true segmentation, algorithmic segmentation is validated against expert outlining of the region of interest. Multiple experts are used to overcome inter-expert variability. Several approaches have been studied in the literature, but the most appropriate approach to combine the information from multiple expert outlines, to give a single metric for validation, is unclear. None consider a metric that can be tailored to case-specific requirements in radiotherapy planning. The validation index (VI), a new validation metric which uses experts' level of agreement, was developed. A control parameter was introduced for the validation of segmentations required for different radiotherapy scenarios: for targets close to organs-at-risk and for difficult-to-discern targets, where large variation between experts is expected. The VI was evaluated using two simulated idealized cases and data from two clinical studies. The VI was compared with the commonly used Dice similarity coefficient (pairwise DSC) and found to be more sensitive than the pairwise DSC to changes in agreement between experts. The VI was shown to be adaptable to specific radiotherapy planning scenarios.
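
    The pairwise-DSC comparator referred to above is easy to state in code; the VI itself additionally weights by the experts' level of agreement through a control parameter, and its exact form is given in the paper, so only the baseline is sketched here.

    ```python
    import numpy as np

    def dice(a, b):
        """Dice similarity coefficient between two binary masks."""
        a, b = np.asarray(a, bool), np.asarray(b, bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    def dsc_pairwise(auto_seg, expert_segs):
        """Mean DSC between an automated segmentation and each expert outline."""
        return float(np.mean([dice(auto_seg, e) for e in expert_segs]))
    ```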

  12. Development of a validation model for the defense meteorological satellite program's special sensor microwave imager

    NASA Technical Reports Server (NTRS)

    Swift, C. T.; Goodberlet, M. A.; Wilkerson, J. C.

    1990-01-01

    For the Defense Meteorological Satellite Program's (DMSP) Special Sensor Microwave/Imager (SSM/I), an operational wind speed algorithm was developed. The algorithm is based on the D-matrix approach, which seeks a linear relationship between measured SSM/I brightness temperatures and environmental parameters. D-matrix performance was validated by comparing algorithm-derived wind speeds with near-simultaneous and co-located measurements made by off-shore ocean buoys. Other topics include error budget modeling, alternate wind speed algorithms, and D-matrix performance with one or more inoperative SSM/I channels.

  13. Identification of two clinical hepatocellular carcinoma patient phenotypes from results of standard screening parameters

    PubMed Central

    Carr, Brian I.; Giannini, Edoardo G.; Farinati, Fabio; Ciccarese, Francesca; Rapaccini, Gian Ludovico; Marco, Maria Di; Benvegnù, Luisa; Zoli, Marco; Borzio, Franco; Caturelli, Eugenio; Chiaramonte, Maria; Trevisani, Franco

    2014-01-01

    Background Previous work has shown that 2 general processes contribute to hepatocellular cancer (HCC) prognosis. They are: a. liver damage, monitored by indices such as blood bilirubin, prothrombin time and AST; as well as b. tumor biology, monitored by indices such as tumor size, tumor number, presence of PVT and blood AFP levels. These 2 processes may affect one another, with prognostically significant interactions between multiple tumor and host parameters. These interactions form a context that provide personalization of the prognostic meaning of these factors for every patient. Thus, a given level of bilirubin or tumor diameter might have a different significance in different personal contexts. We previously applied Network Phenotyping Strategy (NPS) to characterize interactions between liver function indices of Asian HCC patients and recognized two clinical phenotypes, S and L, differing in tumor size and tumor nodule numbers. Aims To validate the applicability of the NPS-based HCC S/L classification on an independent European HCC cohort, for which survival information was additionally available. Methods Four sets of peripheral blood parameters, including AFP-platelets, derived from routine blood parameter levels and tumor indices from the ITA.LI.CA database, were analyzed using NPS, a graph-theory based approach, which compares personal patterns of complete relationships between clinical data values to reference patterns with significant association to disease outcomes. Results Without reference to the actual tumor sizes, patients were classified by NPS into 2 subgroups with S and L phenotypes. These two phenotypes were recognized using solely the HCC screening test results, consisting of eight common blood parameters, paired by their significant correlations, including an AFP-Platelets relationship. These trends were combined with patient age, gender and self-reported alcoholism into NPS personal patient profiles. We subsequently validated (using actual

  14. Reliability and Construct Validity of the Psychopathic Personality Inventory-Revised in a Swedish Non-Criminal Sample - A Multimethod Approach including Psychophysiological Correlates of Empathy for Pain.

    PubMed

    Sörman, Karolina; Nilsonne, Gustav; Howner, Katarina; Tamm, Sandra; Caman, Shilan; Wang, Hui-Xin; Ingvar, Martin; Edens, John F; Gustavsson, Petter; Lilienfeld, Scott O; Petrovic, Predrag; Fischer, Håkan; Kristiansson, Marianne

    2016-01-01

    Cross-cultural investigation of psychopathy measures is important for clarifying the nomological network surrounding the psychopathy construct. The Psychopathic Personality Inventory-Revised (PPI-R) is one of the most extensively researched self-report measures of psychopathic traits in adults. To date however, it has been examined primarily in North American criminal or student samples. To address this gap in the literature, we examined PPI-R's reliability, construct validity and factor structure in non-criminal individuals (N = 227) in Sweden, using a multimethod approach including psychophysiological correlates of empathy for pain. PPI-R construct validity was investigated in subgroups of participants by exploring its degree of overlap with (i) the Psychopathy Checklist: Screening Version (PCL:SV), (ii) self-rated empathy and behavioral and physiological responses in an experiment on empathy for pain, and (iii) additional self-report measures of alexithymia and trait anxiety. The PPI-R total score was significantly associated with PCL:SV total and factor scores. The PPI-R Coldheartedness scale demonstrated significant negative associations with all empathy subscales and with rated unpleasantness and skin conductance responses in the empathy experiment. The PPI-R higher order Self-Centered Impulsivity and Fearless Dominance dimensions were associated with trait anxiety in opposite directions (positively and negatively, respectively). Overall, the results demonstrated solid reliability (test-retest and internal consistency) and promising but somewhat mixed construct validity for the Swedish translation of the PPI-R.

  15. The MODIS Aerosol Algorithm, Products and Validation

    NASA Technical Reports Server (NTRS)

    Remer, L. A.; Kaufman, Y. J.; Tanre, D.; Mattoo, S.; Chu, D. A.; Martins, J. V.; Li, R.-R.; Ichoku, C.; Levy, R. C.; Kleidman, R. G.

    2003-01-01

    The MODerate resolution Imaging Spectroradiometer (MODIS) aboard both NASA's Terra and Aqua satellites is making near global daily observations of the earth in a wide spectral range. These measurements are used to derive spectral aerosol optical thickness and aerosol size parameters over both land and ocean. The aerosol products available over land include aerosol optical thickness at three visible wavelengths, a measure of the fraction of aerosol optical thickness attributed to the fine mode and several derived parameters including reflected spectral solar flux at top of atmosphere. Over ocean, the aerosol optical thickness is provided in seven wavelengths from 0.47 microns to 2.13 microns. In addition, quantitative aerosol size information includes effective radius of the aerosol and quantitative fraction of optical thickness attributed to the fine mode. Spectral aerosol flux, mass concentration and number of cloud condensation nuclei round out the list of available aerosol products over the ocean. The spectral optical thickness and effective radius of the aerosol over the ocean are validated by comparison with two years of AERONET data gleaned from 133 AERONET stations. 8000 MODIS aerosol retrievals colocated with AERONET measurements confirm that one standard deviation of MODIS optical thickness retrievals falls within the predicted uncertainty of Δτ ≈ ±0.03 ± 0.05τ over ocean and Δτ = ±0.05 ± 0.15τ over land. 271 MODIS aerosol retrievals co-located with AERONET inversions at island and coastal sites suggest that one standard deviation of MODIS effective radius retrievals falls within Δr_eff ≈ 0.11 microns. The accuracy of the MODIS retrievals suggests that the product can be used to help narrow the uncertainties associated with aerosol radiative forcing of global climate.
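
    The quoted over-ocean envelope translates directly into a per-retrieval validation check; a one-line sketch (the over-land check would use 0.05 + 0.15*tau instead):

    ```python
    def within_ocean_envelope(tau_modis, tau_aeronet):
        """True if a retrieval falls inside the expected-uncertainty envelope
        |tau_MODIS - tau_AERONET| <= 0.03 + 0.05 * tau_AERONET."""
        return abs(tau_modis - tau_aeronet) <= 0.03 + 0.05 * tau_aeronet
    ```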

  16. Spectral Line Parameters Including Temperature Dependences of Self- and Air-Broadening in the 2 (left arrow) 0 Band of CO at 2.3 micrometers

    NASA Technical Reports Server (NTRS)

    Devi, V. Malathy; Benner, D. Chris; Smith, M. A. H.; Mantz, A. W.; Sung, K.; Brown, L. R.; Predoi-Cross, A.

    2012-01-01

    Temperature dependences of pressure-broadened half-width and pressure-induced shift coefficients along with accurate positions and intensities have been determined for transitions in the 2←0 band of 12C16O from analyzing high-resolution and high signal-to-noise spectra recorded with two different Fourier transform spectrometers. A total of 28 spectra, 16 self-broadened and 12 air-broadened, recorded using high-purity (≥99.5% 12C-enriched) CO samples and CO diluted with dry air (research grade) at different temperatures and pressures, were analyzed simultaneously to maximize the accuracy of the retrieved parameters. The sample temperatures ranged from 150 to 298 K and the total pressures varied between 5 and 700 Torr. A multispectrum nonlinear least squares spectrum fitting technique was used to adjust the rovibrational constants (G, B, D, etc.) and intensity parameters (including Herman-Wallis coefficients), rather than determining individual line positions and intensities. Self- and air-broadened Lorentz half-width coefficients, their temperature dependence exponents, self- and air-pressure-induced shift coefficients, their temperature dependences, self- and air-line mixing coefficients, their temperature dependences and speed dependence have been retrieved from the analysis. Speed-dependent line shapes with line mixing employing off-diagonal relaxation matrix element formalism were needed to minimize the fit residuals. This study presents a precise and complete set of spectral line parameters that consistently reproduce the spectrum of carbon monoxide over terrestrial atmospheric conditions.

  17. A method for landing gear modeling and simulation with experimental validation

    NASA Technical Reports Server (NTRS)

    Daniels, James N.

    1996-01-01

    This document presents an approach for modeling and simulating landing gear systems. Specifically, a nonlinear model of an A-6 Intruder main gear is developed, simulated, and validated against static and dynamic test data. This model includes nonlinear effects such as a polytropic gas model, velocity-squared damping, a geometry-governed model for the discharge coefficients, stick-slip friction effects and a nonlinear tire spring and damping model. An Adams-Moulton predictor-corrector was used to integrate the equations of motion until a discontinuity caused by the stick-slip friction model was reached, at which point a Runge-Kutta routine integrated past the discontinuity and returned the problem solution back to the predictor-corrector. Run times of this software are around 2 minutes per 1 second of simulation under dynamic circumstances. To validate the model, engineers at the Aircraft Landing Dynamics facility at NASA Langley Research Center installed one A-6 main gear on a drop carriage and used a hydraulic shaker table to provide simulated runway inputs to the gear. Model parameters were tuned to produce excellent agreement for many cases.

  18. FLASH Interface; a GUI for managing runtime parameters in FLASH simulations

    NASA Astrophysics Data System (ADS)

    Walker, Christopher; Tzeferacos, Petros; Weide, Klaus; Lamb, Donald; Flocke, Norbert; Feister, Scott

    2017-10-01

    We present FLASH Interface, a novel graphical user interface (GUI) for managing runtime parameters in simulations performed with the FLASH code. FLASH Interface supports full text search of available parameters; provides descriptions of each parameter's role and function; allows for the filtering of parameters based on categories; performs input validation; and maintains all comments and non-parameter information already present in existing parameter files. The GUI can be used to edit existing parameter files or generate new ones. FLASH Interface is open source and was implemented with the Electron framework, making it available on Mac OSX, Windows, and Linux operating systems. The new interface lowers the entry barrier for new FLASH users and provides an easy-to-use tool for experienced FLASH simulators. U.S. Department of Energy (DOE), NNSA ASC/Alliances Center for Astrophysical Thermonuclear Flashes, U.S. DOE NNSA ASC through the Argonne Institute for Computing in Science, U.S. National Science Foundation.

  19. 100-lbf LO2/CH4 RCS Thruster Testing and Validation

    NASA Technical Reports Server (NTRS)

    Barnes, Frank; Cannella, Matthew; Gomez, Carlos; Hand, Jeffrey; Rosenberg, David

    2009-01-01

    A 100-pound-thrust liquid oxygen/methane thruster sized for RCS (Reaction Control System) applications. Innovative design characteristics include: a) simple compact design with minimal part count; b) gaseous or liquid propellant operation; c) affordable and reusable; d) greater flexibility than existing systems; e) part of NASA's study of "Green Propellants." Hot-fire testing validated the performance and functionality of the thruster. The thruster's dependence on mixture ratio has been evaluated. Data have been used to calculate performance parameters such as thrust and Isp. Data have been compared with previous test results to verify reliability and repeatability. The thruster was found to have an Isp of 131 s and 82 lbf thrust at a mixture ratio of 1.62.

  20. Quality Assessment Parameters for Student Support at Higher Education Institutions

    ERIC Educational Resources Information Center

    Sajiene, Laima; Tamuliene, Rasa

    2012-01-01

    The research presented in this article aims to validate quality assessment parameters for student support at higher education institutions. Student support is discussed as the system of services provided by a higher education institution which helps to develop student-centred curriculum and fulfils students' emotional, academic, social needs, and…

  1. Time Domain Tool Validation Using ARES I-X Flight Data

    NASA Technical Reports Server (NTRS)

    Hough, Steven; Compton, James; Hannan, Mike; Brandon, Jay

    2011-01-01

    The ARES I-X vehicle was launched from NASA's Kennedy Space Center (KSC) on October 28, 2009 at approximately 11:30 EDT. ARES I-X was the first test flight for NASA's ARES I launch vehicle, and it was the first non-Shuttle launch vehicle designed and flown by NASA since Saturn. The ARES I-X had a 4-segment solid rocket booster (SRB) first stage and a dummy upper stage (US) to emulate the properties of the ARES I US. During ARES I-X pre-flight modeling and analysis, six (6) independent time domain simulation tools were developed and cross validated. Each tool represents an independent implementation of a common set of models and parameters in a different simulation framework and architecture. Post-flight data and reconstructed models provide the means to validate a subset of the simulations against actual flight data and to assess the accuracy of pre-flight dispersion analysis. Post-flight data consist of telemetered Operational Flight Instrumentation (OFI) data primarily focused on flight computer outputs and sensor measurements, as well as Best Estimated Trajectory (BET) data that estimates vehicle state information from all available measurement sources. While pre-flight models were found to provide a reasonable prediction of the vehicle flight, reconstructed models were generated to better represent and simulate the ARES I-X flight. Post-flight reconstructed models include: SRB propulsion model, thrust vector bias models, mass properties, base aerodynamics, and Meteorological Estimated Trajectory (wind and atmospheric data). The result of the effort is a set of independently developed, high fidelity, time-domain simulation tools that have been cross validated and validated against flight data. This paper presents the process and results of high fidelity aerospace modeling, simulation, analysis and tool validation in the time domain.

  2. Validation of the Acoustic Voice Quality Index in the Japanese Language.

    PubMed

    Hosokawa, Kiyohito; Barsties, Ben; Iwahashi, Toshihiko; Iwahashi, Mio; Kato, Chieri; Iwaki, Shinobu; Sasai, Hisanori; Miyauchi, Akira; Matsushiro, Naoki; Inohara, Hidenori; Ogawa, Makoto; Maryn, Youri

    2017-03-01

    The Acoustic Voice Quality Index (AVQI) is a multivariate construct for quantification of overall voice quality based on the analysis of continuous speech and a sustained vowel. The stability and validity of the AVQI are well established in several language families. However, the Japanese language has distinct characteristics with respect to several parameters of articulatory and phonatory physiology. The aim of the study was to confirm the criterion-related concurrent validity of the AVQI, as well as its responsiveness to change and diagnostic accuracy for voice assessment in the Japanese-speaking population. This is a retrospective study. A total of 336 voice recordings, which included 69 pairs of voice recordings (before and after therapeutic interventions), were eligible for the study. The auditory-perceptual judgment of overall voice quality was evaluated by five experienced raters. The concurrent validity, responsiveness to change, and diagnostic accuracy of the AVQI were estimated. The concurrent validity and responsiveness to change based on the overall voice quality were indicated by high correlation coefficients of 0.828 and 0.767, respectively. Receiver operating characteristic analysis revealed an excellent diagnostic accuracy for discrimination between dysphonic and normophonic voices (area under the curve: 0.905). The best threshold level for the AVQI of 3.15 corresponded with a sensitivity of 72.5% and specificity of 95.2%, with positive and negative likelihood ratios of 15.1 and 0.29, respectively. We demonstrated the validity of the AVQI as a tool for assessment of overall voice quality and of voice therapy outcomes in the Japanese-speaking population. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  3. Validation of Nutritional Risk Screening-2002 in a Hospitalized Adult Population.

    PubMed

    Bolayir, Başak; Arik, Güneş; Yeşil, Yusuf; Kuyumcu, Mehmet Emin; Varan, Hacer Doğan; Kara, Özgür; Güngör, Anil Evrim; Yavuz, Burcu Balam; Cankurtaran, Mustafa; Halil, Meltem Gülhan

    2018-03-30

    Malnutrition in hospitalized patients is a serious problem and is associated with a number of adverse outcomes. The Nutritional Risk Screening-2002 (NRS-2002) tool was designed to identify patients at nutrition risk. The validation of NRS-2002 against detailed clinical assessment of nutrition status had not been studied before in hospitalized Turkish adults. The aim of this study is to determine the validity, sensitivity, and specificity of the Turkish version of NRS-2002 in a hospitalized adult population. A total of 271 consecutive hospitalized patients aged >18 years admitted to surgical and medical wards of a university hospital in Turkey were included in this single-center noninterventional validity study. Assessment by geriatricians was used as the reference method. Two geriatricians experienced in the field of malnutrition interpreted the patients' nutrition status after the evaluation of several parameters. Patients were divided into "at nutrition risk" and "not at nutrition risk" groups by the geriatricians. Concordance between the 2 geriatricians' clinical assessments was analyzed by κ statistics. Excellent concordance was found; therefore, the first geriatrician's decisions were accepted as the gold standard. The correlation between the nutrition status of the patients determined with NRS-2002 and the experienced geriatrician's decisions was evaluated for validity. NRS-2002 has a sensitivity of 88% and specificity of 92% when compared with professional assessment. The positive and negative predictive values were 87% and 92%, respectively. Test-retest agreement was excellent, as represented by a κ coefficient of 0.956. NRS-2002 is a valid tool to assess malnutrition risk in Turkish hospitalized patients. © 2018 American Society for Parenteral and Enteral Nutrition.
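
    The reported screening statistics follow from a standard 2x2 table against the gold standard; a small sketch with illustrative counts (not the study's data):

    ```python
    def screening_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity and predictive values from a 2x2 table."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),   # predictive values depend on prevalence
            "npv": tn / (tn + fn),
        }

    print(screening_metrics(tp=88, fp=8, fn=12, tn=92))
    ```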

  4. A Bayesian Approach to Determination of F, D, and Z Values Used in Steam Sterilization Validation.

    PubMed

    Faya, Paul; Stamey, James D; Seaman, John W

    2017-01-01

    For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the well-known D_T, z, and F_0 values that are used in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these values to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. LAY ABSTRACT: For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the critical process parameters that are evaluated in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these parameters to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can
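
    For context, the classical point estimates that the proposed Bayesian treatment replaces with posterior distributions are simple to compute: the D value from the slope of a log10 survivor curve, and the physical F0 from a time-temperature history. A sketch under those textbook definitions:

    ```python
    import numpy as np

    def d_value(times, log10_survivors):
        """D value (time for a 1-log reduction): slope of log10 N(t) is -1/D."""
        slope, _ = np.polyfit(times, log10_survivors, 1)
        return -1.0 / slope

    def f0(temps_c, dt_min, z=10.0, t_ref=121.1):
        """F0 = sum over the cycle of dt * 10**((T - T_ref) / z), i.e.
        equivalent minutes at 121.1 C for an organism with the given z."""
        return dt_min * np.sum(10.0 ** ((np.asarray(temps_c) - t_ref) / z))
    ```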

  5. CosmoQuest: Using Data Validation for More Than Just Data Validation

    NASA Astrophysics Data System (ADS)

    Lehan, C.; Gay, P.

    2016-12-01

    It is often taken for granted that different scientists completing the same task (e.g., mapping geologic features) will get the same results, and data validation is often skipped or under-utilized due to time and funding constraints. Robbins et al. (2014), however, demonstrated that this is a needed step, as large variation can exist even among collaborating team members completing straightforward tasks like marking craters. Data validation should be much more than a simple post-project verification of results. The CosmoQuest virtual research facility employs regular data validation for a variety of benefits, including real-time user feedback, real-time tracking of user activity as it happens, and the use of pre-solved data to analyze users' progress and help them retain skills. Some creativity in this area can drastically improve project results. We discuss methods of validating data in citizen science projects and outline the variety of uses for validation, which, when used properly, improves the scientific output of the project and the user experience for the citizens doing the work. More than just a tool for scientists, validation can assist users in both learning and retaining important information and skills, improving the quality and quantity of data gathered. Real-time analysis of user data can give key information about the effectiveness of the project that a broad glance would miss, and properly presenting that analysis is vital. Training users to validate their own data, or the data of others, can significantly improve the accuracy of misinformed or novice users.

  6. Unfolding linac photon spectra and incident electron energies from experimental transmission data, with direct independent validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, E. S. M.; McEwen, M. R.; Rogers, D. W. O.

    2012-11-15

    Purpose: In a recent computational study, an improved physics-based approach was proposed for unfolding linac photon spectra and incident electron energies from transmission data. In this approach, energy differentiation is improved by simultaneously using transmission data for multiple attenuators and detectors, and the unfolding robustness is improved by using a four-parameter functional form to describe the photon spectrum. The purpose of the current study is to validate this approach experimentally, and to demonstrate its application on a typical clinical linac. Methods: The validation makes use of the recent transmission measurements performed on the Vickers research linac of National Research Council Canada. For this linac, the photon spectra were previously measured using a NaI detector, and the incident electron parameters are independently known. The transmission data are for eight beams in the range 10-30 MV using thick Be, Al and Pb bremsstrahlung targets. To demonstrate the approach on a typical clinical linac, new measurements are performed on an Elekta Precise linac for 6, 10 and 25 MV beams. The different experimental setups are modeled using EGSnrc, with the newly added photonuclear attenuation included. Results: For the validation on the research linac, the 95% confidence bounds of the unfolded spectra fall within the noise of the NaI data. The unfolded spectra agree with the EGSnrc spectra (calculated using independently known electron parameters) with RMS energy fluence deviations of 4.5%. The accuracy of unfolding the incident electron energy is shown to be ≈3%. A transmission cutoff of only 10% is suitable for accurate unfolding, provided that the other components of the proposed approach are implemented. For the demonstration on a clinical linac, the unfolded incident electron energies and their 68% confidence bounds for the 6, 10 and 25 MV beams are 6.1 ± 0.1, 9.3 ± 0.1, and 19.3 ± 0.2 MeV, respectively. The unfolded
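A heavily simplified sketch of the unfolding idea: describe the spectrum with a small parametric form and fit it so that the predicted narrow-beam transmission through increasing attenuator thicknesses matches measurement. The two-parameter spectral shape, the toy attenuation curve, and the synthetic data are placeholders, not the paper's four-parameter model or its EGSnrc-based analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

E = np.linspace(0.1, 6.0, 60)           # photon energy grid (MeV)
mu = 0.06 + 0.05 * np.exp(-E)           # toy attenuation coefficient (1/mm)

def spectrum(E, a, b):
    """Toy bremsstrahlung-like spectral shape (placeholder form)."""
    return E**a * np.exp(-b * E)

def transmission(t, a, b):
    """Energy-fluence-weighted transmission for attenuator thickness t (mm)."""
    psi = spectrum(E, a, b) * E
    return np.trapz(psi * np.exp(-np.outer(t, mu)), E, axis=1) / np.trapz(psi, E)

t_data = np.linspace(0.0, 100.0, 12)    # attenuator thicknesses (mm)
rng = np.random.default_rng(0)
T_data = transmission(t_data, 0.5, 0.8) * (1 + 0.002 * rng.standard_normal(12))

(a_fit, b_fit), _ = curve_fit(transmission, t_data, T_data, p0=(1.0, 1.0))
print(a_fit, b_fit)                     # recovered spectral parameters
```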

  7. Testing backreaction effects with observational Hubble parameter data

    NASA Astrophysics Data System (ADS)

    Cao, Shu-Lei; Teng, Huan-Yu; Wan, Hao-Yi; Yu, Hao-Ran; Zhang, Tong-Jie

    2018-02-01

    The spatially averaged inhomogeneous Universe includes a kinematical backreaction term Q_D that is related to the averaged spatial Ricci scalar <R>_D in the framework of general relativity. Under the assumption that Q_D and <R>_D obey the scaling laws of the volume scale factor a_D, a direct coupling between them with a scaling index n is remarkable. In order to explore the generic properties of a backreaction model for explaining the accelerated expansion of the Universe, we exploit two metrics to describe the late-time Universe. Since the standard FLRW metric cannot precisely describe the late-time Universe on small scales, the template metric with an evolving curvature parameter κ_D(t) is employed. However, we doubt the validity of the prescription for κ_D, which motivates us to apply observational Hubble parameter data (OHD) to constrain parameters in dust cosmology. First, for the FLRW metric, with best-fit constraints of Ω_m^{D_0} = 0.25^{+0.03}_{-0.03}, n = 0.02^{+0.69}_{-0.66}, and H_{D_0} = 70.54^{+4.24}_{-3.97} km s^{-1} Mpc^{-1}, the evolutions of the parameters are explored. Second, in the template-metric context, by marginalizing over H_{D_0} with a uniform prior, we obtain best-fit values of n = -1.22^{+0.68}_{-0.41} and Ω_m^{D_0} = 0.12^{+0.04}_{-0.02}. Moreover, we utilize three different Gaussian priors on H_{D_0}, which result in different best fits of n, but almost the same best-fit value of Ω_m^{D_0} ≈ 0.12. The absolute constraints without marginalization of the parameter are also obtained: n = -1.1^{+0.58}_{-0.50} and Ω_m^{D_0} = 0.13 ± 0.03. With these constraints, the evolutions of the effective deceleration parameter q^D indicate that the backreaction can account for the accelerated expansion of the Universe without invoking an extra dark energy component in the scaling-solution context. Nevertheless, the results also verify that the prescription for κ_D is insufficient and should be improved.
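A minimal sketch of the kind of chi-square fit used to constrain parameters with OHD. For simplicity, a flat FLRW expansion law stands in for the backreaction template-metric model, and the three data points with errors are invented placeholders, not the actual OHD compilation.

```python
import numpy as np
from scipy.optimize import minimize

z_obs = np.array([0.17, 0.57, 1.30])     # placeholder OHD redshifts
H_obs = np.array([83.0, 92.4, 168.0])    # placeholder H(z), km/s/Mpc
sigma = np.array([8.0, 4.5, 17.0])       # placeholder 1-sigma errors

def H_model(z, H0, Om):
    """Flat-FLRW stand-in for the paper's averaged-cosmology model."""
    return H0 * np.sqrt(Om * (1 + z)**3 + (1 - Om))

def chi2(params):
    H0, Om = params
    return np.sum(((H_obs - H_model(z_obs, H0, Om)) / sigma) ** 2)

best = minimize(chi2, x0=[70.0, 0.3], method="Nelder-Mead")
print(best.x)                            # best-fit (H0, Omega_m)
```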

  8. Quantitative parameters of CT texture analysis as potential markers for early prediction of spontaneous intracranial hemorrhage enlargement.

    PubMed

    Shen, Qijun; Shan, Yanna; Hu, Zhengyu; Chen, Wenhui; Yang, Bing; Han, Jing; Huang, Yanfang; Xu, Wen; Feng, Zhan

    2018-04-30

    To objectively quantify intracranial hematoma (ICH) enlargement by analysing the image texture of head CT scans, and to provide objective and quantitative imaging parameters for predicting early hematoma enlargement. We retrospectively studied 108 ICH patients with baseline non-contrast computed tomography (NCCT) and 24-h follow-up CT available. Image data were assessed by a chief radiologist and a resident radiologist. Consistency between observers was tested. The patients were divided into a training set (75%) and a validation set (25%) by stratified sampling. Patients in the training set were dichotomized according to 24-h hematoma expansion ≥33%. Using the Laplacian of Gaussian band-pass filter, we chose different spatial filter scales, ranging from fine to coarse texture, to obtain a series of derived parameters (mean grayscale intensity, variance, uniformity) with which to quantify and evaluate all data. The parameters were externally validated on the validation set. Significant differences were found between the two groups of patients in variance at V1.0 and in uniformity at U1.0, U1.8 and U2.5. The intraclass correlation coefficients for the texture parameters were between 0.67 and 0.99. The area under the ROC curve between the two groups of ICH cases was between 0.77 and 0.92. The accuracy on the validation set by CTTA was 0.59-0.85. NCCT texture analysis can objectively quantify the heterogeneity of ICH and independently predict early hematoma enlargement. • Heterogeneity is helpful in predicting ICH enlargement. • CTTA could play an important role in predicting early ICH enlargement. • After filtering, fine texture had the best diagnostic performance. • The histogram-based uniformity parameters can independently predict ICH enlargement. • CTTA is more objective, more comprehensive, and more independently operable than previous methods.
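A sketch of the filtration-histogram CT texture analysis described above: a Laplacian-of-Gaussian band-pass filter at several spatial scales, followed by histogram statistics (mean intensity, variance, uniformity). The sigma values echo the fine-to-coarse scales mentioned in the abstract, but the random "image" and binning choices are placeholders.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def texture_parameters(image, sigmas=(1.0, 1.8, 2.5)):
    """LoG-filter an image at several scales and compute histogram statistics."""
    out = {}
    for s in sigmas:
        filtered = gaussian_laplace(image.astype(float), sigma=s)
        hist, _ = np.histogram(filtered, bins=64)
        p = hist / hist.sum()                    # normalized histogram
        out[s] = {
            "mean": float(filtered.mean()),
            "variance": float(filtered.var()),
            "uniformity": float(np.sum(p**2)),   # sum of squared bin probabilities
        }
    return out

roi = np.random.rand(64, 64)  # stand-in for a hematoma region from NCCT
print(texture_parameters(roi))
```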

  9. Impact of transitional care on endocrine and anthropometric parameters in Prader–Willi syndrome

    PubMed Central

    Paepegaey, A C; Coupaye, M; Jaziri, A; Ménesguen, F; Dubern, B; Polak, M; Oppert, J M; Tauber, M; Pinto, G; Poitou, C

    2018-01-01

    Context The transition of patients with Prader–Willi syndrome (PWS) to adult life for medical care is challenging because of multiple comorbidities, including hormone deficiencies, obesity and cognitive and behavioral disabilities. Objective To assess endocrine management, and metabolic and anthropometric parameters of PWS adults who received (n = 31) or not (n = 64) transitional care, defined as specialized pediatric care followed by a structured care pathway to a multidisciplinary adult team. Patients and study design Hormonal and metabolic parameters were retrospectively recorded in 95 adults with PWS (mean ± s.d. age 24.7 ± 8.2 years, BMI: 39.8 ± 12.1 kg/m²) referred to our Reference Center and compared according to transition. Results Among the entire cohort, 35.8% received growth hormone (GH) during childhood and 16.8% had a GH stimulation test after completion of growth. In adulthood, 14.7% were treated with GH, 56.8% received sex-hormone therapy, whereas 91.1% were hypogonadic and 37.9% had undergone valid screening of the corticotropic axis. The main reason for suboptimal endocrine management was marked behavioral disorders. Patients receiving transitional care were more likely to have had a GH stimulation test and hormonal substitutions in childhood. They also had a lower BMI, percentage of fat mass, improved metabolic parameters and fewer antidepressant treatments. Transitional care remained significantly associated with these parameters in multivariate analysis when adjusted on GH treatment. Conclusion A coordinated care pathway with specialized pediatric care and transition to a multidisciplinary adult team accustomed to managing complex disability including psychiatric troubles are associated with a better health status in adults with PWS. PMID:29666169

  10. Impact of transitional care on endocrine and anthropometric parameters in Prader-Willi syndrome.

    PubMed

    Paepegaey, A C; Coupaye, M; Jaziri, A; Ménesguen, F; Dubern, B; Polak, M; Oppert, J M; Tauber, M; Pinto, G; Poitou, C

    2018-05-01

    The transition of patients with Prader-Willi syndrome (PWS) to adult life for medical care is challenging because of multiple comorbidities, including hormone deficiencies, obesity and cognitive and behavioral disabilities. To assess endocrine management, and metabolic and anthropometric parameters of PWS adults who received (n = 31) or not (n = 64) transitional care, defined as specialized pediatric care followed by a structured care pathway to a multidisciplinary adult team. Hormonal and metabolic parameters were retrospectively recorded in 95 adults with PWS (mean ± s.d. age 24.7 ± 8.2 years, BMI: 39.8 ± 12.1 kg/m²) referred to our Reference Center and compared according to transition. Among the entire cohort, 35.8% received growth hormone (GH) during childhood and 16.8% had a GH stimulation test after completion of growth. In adulthood, 14.7% were treated with GH, 56.8% received sex-hormone therapy, whereas 91.1% were hypogonadic and 37.9% had undergone valid screening of the corticotropic axis. The main reason for suboptimal endocrine management was marked behavioral disorders. Patients receiving transitional care were more likely to have had a GH stimulation test and hormonal substitutions in childhood. They also had a lower BMI, percentage of fat mass, improved metabolic parameters and fewer antidepressant treatments. Transitional care remained significantly associated with these parameters in multivariate analysis when adjusted on GH treatment. A coordinated care pathway with specialized pediatric care and transition to a multidisciplinary adult team accustomed to managing complex disability including psychiatric troubles are associated with a better health status in adults with PWS. © 2018 The authors.

  11. The Impact of Multidirectional Item Parameter Drift on IRT Scaling Coefficients and Proficiency Estimates

    ERIC Educational Resources Information Center

    Han, Kyung T.; Wells, Craig S.; Sireci, Stephen G.

    2012-01-01

    Item parameter drift (IPD) occurs when item parameter values change from their original value over time. IPD may pose a serious threat to the fairness and validity of test score interpretations, especially when the goal of the assessment is to measure growth or improvement. In this study, we examined the effect of multidirectional IPD (i.e., some…
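The mechanics of IPD are easy to see in the standard three-parameter logistic (3PL) item response function (standard IRT, not code from the study): when an item's difficulty drifts after calibration, the response probabilities, and hence ability estimates, shift. The drift amount below is invented for illustration.

```python
import numpy as np

def p_3pl(theta, a, b, c):
    """3PL probability of a correct response at ability theta."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)
orig = p_3pl(theta, a=1.2, b=0.0, c=0.2)     # calibration-time parameters
drifted = p_3pl(theta, a=1.2, b=0.4, c=0.2)  # item drifted harder over time

print(np.round(orig - drifted, 3))           # score impact of b-parameter drift
```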

  12. Robust Smoothing: Smoothing Parameter Selection and Applications to Fluorescence Spectroscopy

    PubMed Central

    Lee, Jong Soo; Cox, Dennis D.

    2009-01-01

    Fluorescence spectroscopy has emerged in recent years as an effective way to detect cervical cancer. Investigation of the data preprocessing stage uncovered a need for robust smoothing to extract the signal from the noise. Various robust smoothing methods for estimating fluorescence emission spectra are compared, and data-driven methods for the selection of the smoothing parameter are suggested. The methods currently implemented in R for smoothing parameter selection proved to be unsatisfactory, and a computationally efficient procedure that approximates robust leave-one-out cross validation is presented. PMID:20729976
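A sketch of the brute-force leave-one-out cross validation that such a procedure approximates more cheaply; the robust weighting of the paper's criterion is omitted, and the spline smoother and noisy test signal are placeholders.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 60)
y = np.sin(4 * np.pi * x) + 0.2 * rng.standard_normal(60)

def loocv_score(s):
    """Mean squared leave-one-out prediction error for smoothing factor s."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        fit = UnivariateSpline(x[mask], y[mask], s=s)
        errs.append(float(y[i] - fit(x[i])) ** 2)
    return np.mean(errs)

candidates = [0.5, 1.0, 2.0, 4.0, 8.0]
print(min(candidates, key=loocv_score))  # smoothing factor with lowest LOOCV error
```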

  13. 2nd NASA CFD Validation Workshop

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose of the workshop was to review NASA's progress in CFD validation since the first workshop (held at Ames in 1987) and to affirm the future direction of the NASA CFD validation program. The first session consisted of overviews of CFD validation research at each of the three OAET research centers and at Marshall Space Flight Center. The second session consisted of in-depth technical presentations of the best examples of CFD validation work at each center (including Marshall). On the second day the workshop divided into three working groups to discuss CFD validation progress and needs in the subsonic, high-speed, and hypersonic speed ranges. The emphasis of the working groups was on propulsion.

  14. Validation of Ionosonde Electron Density Reconstruction Algorithms with IONOLAB-RAY in Central Europe

    NASA Astrophysics Data System (ADS)

    Gok, Gokhan; Mosna, Zbysek; Arikan, Feza; Arikan, Orhan; Erdem, Esra

    2016-07-01

    Ionosonde data are used for various purposes, including calculation of actual height and generation of ionograms. In this study, the performance of the electron density reconstruction algorithm of the IONOLAB group and the standard electron density profile algorithms of ionosondes are compared with IONOLAB-RAY wave propagation simulations at near-vertical incidence. The electron density reconstruction and parameter extraction algorithms of ionosondes are validated against the IONOLAB-RAY results for both quiet and disturbed ionospheric states in Central Europe, using ionosonde stations such as Pruhonice and Juliusruh. It is observed that the IONOLAB ionosonde parameter extraction and electron density reconstruction algorithm performs significantly better than the standard algorithms, especially for disturbed ionospheric conditions. IONOLAB-RAY provides an efficient and reliable tool to investigate and validate ionosonde electron density reconstruction algorithms, especially in the determination of the reflection height (true height) of signals and critical parameters of the ionosphere. This study is supported by TUBITAK 114E541, 115E915 and joint TUBITAK 114E092 and AS CR 14/001 projects.

  15. Validation of AIRS/AMSU Cloud Retrievals Using MODIS Cloud Analyses

    NASA Technical Reports Server (NTRS)

    Molnar, Gyula I.; Susskind, Joel

    2005-01-01

    The AIRS/AMSU (flying on the EOS-AQUA satellite) sounding retrieval methodology allows for the retrieval of key atmospheric/surface parameters under partially cloudy conditions (Susskind et al.). In addition, cloud parameters are also derived from the AIRS/AMSU observations. Within each AIRS footprint, cloud parameters at up to 2 cloud layers are determined, with differing cloud top pressures and effective cloud fractions (the product of infrared emissivity at 11 microns and physical cloud fraction). However, so far the AIRS cloud product has not been rigorously evaluated/validated. Fortunately, collocated/coincident radiances measured by MODIS/AQUA (at a much lower spectral resolution but roughly an order-of-magnitude higher spatial resolution than that of AIRS) are used to determine analogous cloud products from MODIS. This allows for a rather rare and interesting possibility: the intercomparison and mutual validation of imager- vs. sounder-based cloud products obtained from the same satellite positions. First, we present results of small-scale (granule-level) instantaneous intercomparisons. Next, we evaluate differences of temporally averaged (monthly) means as well as the representation of inter-annual variability of cloud parameters as presented by the two cloud data sets. In particular, we present statistical differences in the retrieved parameters of cloud fraction and cloud top pressure. We investigate which types of cloud systems are retrieved most consistently (if any) with both retrieval schemes, and attempt to assess the reasons behind statistically significant differences.

  16. Considerations regarding the validation of chromatographic mass spectrometric methods for the quantification of endogenous substances in forensics.

    PubMed

    Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra

    2018-02-01

    The requirement for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may be used as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), and matrix effects and recovery have to be approached differently. Highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Minimal residual method provides optimal regularization parameter for diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Jagannath, Ravi Prasad K.; Yalavarthy, Phaneendra K.

    2012-10-01

    The inverse problem in diffuse optical tomography is known to be nonlinear, ill-posed, and sometimes under-determined, requiring regularization to obtain meaningful results, with Tikhonov-type regularization being the most popular choice. The choice of this regularization parameter dictates the reconstructed optical image quality and is typically made empirically or based on prior experience. An automated method for optimal selection of the regularization parameter, based on the regularized minimal residual method (MRM), is proposed and compared with the traditional generalized cross-validation method. The results obtained using numerical and gelatin phantom data indicate that the MRM-based method is capable of providing the optimal regularization parameter.

  18. Minimal residual method provides optimal regularization parameter for diffuse optical tomography.

    PubMed

    Jagannath, Ravi Prasad K; Yalavarthy, Phaneendra K

    2012-10-01

    The inverse problem in diffuse optical tomography is known to be nonlinear, ill-posed, and sometimes under-determined, requiring regularization to obtain meaningful results, with Tikhonov-type regularization being the most popular choice. The choice of this regularization parameter dictates the reconstructed optical image quality and is typically made empirically or based on prior experience. An automated method for optimal selection of the regularization parameter, based on the regularized minimal residual method (MRM), is proposed and compared with the traditional generalized cross-validation method. The results obtained using numerical and gelatin phantom data indicate that the MRM-based method is capable of providing the optimal regularization parameter.
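A generic linear-inverse sketch of Tikhonov regularization with an automated parameter choice via generalized cross validation (GCV), the baseline method named above; the DOT-specific MRM criterion is not reproduced here, and the test problem is random.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 20))        # toy forward operator
x_true = rng.standard_normal(20)
b = A @ x_true + 0.05 * rng.standard_normal(40)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
beta = U.T @ b

def solve(lam):
    """Tikhonov solution via SVD filter factors s/(s^2 + lam)."""
    return Vt.T @ (s / (s**2 + lam) * beta)

def gcv(lam):
    """GCV score: residual norm over squared effective degrees of freedom."""
    f = s**2 / (s**2 + lam)
    resid = np.linalg.norm(A @ solve(lam) - b) ** 2
    return resid / (A.shape[0] - f.sum()) ** 2

lams = np.logspace(-4, 2, 50)
print(lams[np.argmin([gcv(l) for l in lams])])  # selected regularization parameter
```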

  19. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    NASA Technical Reports Server (NTRS)

    Robinson, Tyler D.; Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard; Hearty, Thomas

    2011-01-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model (Tinetti et al., 2006a,b). This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to the Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square error of typically less than 3% for the multiwavelength lightcurves, and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We extend our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7%, and temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated

  20. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    NASA Astrophysics Data System (ADS)

    Robinson, Tyler D.; Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard K.; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M.; McFadden, Lucy A.; Wellnitz, Dennis D.

    2011-06-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward model can be

  1. Earth as an extrasolar planet: Earth model validation using EPOXI earth observations.

    PubMed

    Robinson, Tyler D; Meadows, Victoria S; Crisp, David; Deming, Drake; A'hearn, Michael F; Charbonneau, David; Livengood, Timothy A; Seager, Sara; Barry, Richard K; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M; McFadden, Lucy A; Wellnitz, Dennis D

    2011-06-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward model can be

  2. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    PubMed Central

    Robinson, Tyler D.; Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard K.; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M.; McFadden, Lucy A.; Wellnitz, Dennis D.

    2011-01-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward

  3. Correlations of Genotype with Climate Parameters Suggest Caenorhabditis elegans Niche Adaptations

    PubMed Central

    Evans, Kathryn S.; Zhao, Yuehui; Brady, Shannon C.; Long, Lijiang; McGrath, Patrick T.; Andersen, Erik C.

    2016-01-01

    Species inhabit a variety of environmental niches, and the adaptation to a particular niche is often controlled by genetic factors, including gene-by-environment interactions. The genes that vary in order to regulate the ability to colonize a niche are often difficult to identify, especially in the context of complex ecological systems and in experimentally uncontrolled natural environments. Quantitative genetic approaches provide an opportunity to investigate correlations between genetic factors and environmental parameters that might define a niche. Previously, we have shown how a collection of 208 whole-genome sequenced wild Caenorhabditis elegans strains can facilitate association mapping approaches. To correlate climate parameters with the variation found in this collection of wild strains, we used geographic data to exhaustively curate daily weather measurements over short-term (3-month), middle-term (one-year), and long-term (three-year) durations surrounding the date of strain isolation. These climate parameters were used as quantitative traits in association mapping approaches, where we identified 11 quantitative trait loci (QTL) for three climatic variables: elevation, relative humidity, and average temperature. We then narrowed the genomic interval of interest to identify gene candidates with variants potentially underlying phenotypic differences. Additionally, we performed two-strain competition assays at high and low temperatures to validate a QTL that could underlie adaptation to temperature and found suggestive evidence supporting that hypothesis. PMID:27866149

  4. Shipborne Missile Fire Frequency with Unconstant Parameters

    NASA Astrophysics Data System (ADS)

    Dong, Shaquan

    2018-01-01

    For the problem of modeling shipborne missile fire frequency, fire-frequency models with unconstant (non-constant) parameters are proposed, including maximum fire-frequency models and actual fire-frequency models, which can be used to calculate the missile fire frequency when the governing parameters are not constant.

  5. Validation of the Social Appearance Anxiety Scale: factor, convergent, and divergent validity.

    PubMed

    Levinson, Cheri A; Rodebaugh, Thomas L

    2011-09-01

    The Social Appearance Anxiety Scale (SAAS) was created to assess fear of overall appearance evaluation. Initial psychometric work indicated that the measure had a single-factor structure and exhibited excellent internal consistency, test-retest reliability, and convergent validity. In the current study, the authors further examined the factor, convergent, and divergent validity of the SAAS in two samples of undergraduates. In Study 1 (N = 323), the authors tested the factor structure, convergent, and divergent validity of the SAAS with measures of the Big Five personality traits, negative affect, fear of negative evaluation, and social interaction anxiety. In Study 2 (N = 118), participants completed a body evaluation that included measurements of height, weight, and body fat content. The SAAS exhibited excellent convergent and divergent validity with self-report measures (i.e., self-esteem, trait anxiety, ethnic identity, and sympathy), predicted state anxiety experienced during the body evaluation, and predicted body fat content. In both studies, results confirmed a single-factor structure as the best fit to the data. These results lend additional support for the use of the SAAS as a valid measure of social appearance anxiety.

  6. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    PubMed

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into it as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.
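A minimal simulated-annealing sketch of the parameter-discovery loop: perturb the parameter vector, score it, and accept worse moves with a temperature-dependent probability. The quadratic cost is a stand-in for the statistical-model-checking score used in the paper, and the target values are invented.

```python
import math
import random

def cost(p):
    """Placeholder cost: distance to hypothetical 'true' parameters."""
    target = [0.7, 1.3]
    return sum((a - b) ** 2 for a, b in zip(p, target))

def anneal(p0, t0=1.0, cooling=0.95, steps=500):
    p, c, t = list(p0), cost(p0), t0
    for _ in range(steps):
        cand = [v + random.gauss(0.0, 0.1) for v in p]  # local perturbation
        cc = cost(cand)
        if cc < c or random.random() < math.exp(-(cc - c) / t):
            p, c = cand, cc  # accept improvements, and occasionally worse moves
        t *= cooling         # cool the temperature
    return p, c

print(anneal([0.0, 0.0]))
```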

  7. Constitutive Equation with Varying Parameters for Superplastic Flow Behavior

    NASA Astrophysics Data System (ADS)

    Guan, Zhiping; Ren, Mingwen; Jia, Hongjie; Zhao, Po; Ma, Pinkui

    2014-03-01

    In this study, constitutive equations for superplastic materials with extra-large elongations were investigated through mechanical analysis. From the viewpoint of phenomenology, firstly, some traditional empirical constitutive relations were standardized by restricting the strain paths and parameter conditions, and the coefficients in these relations were given strict new mechanical definitions. Subsequently, a new, general constitutive equation with varying parameters was theoretically deduced based on the general mechanical equation of state. Superplastic tension test data for Zn-5%Al alloy at 340 °C under different strain rates, velocities, and loads were employed to build the new constitutive equation and examine its validity. The analysis indicated that the constitutive equation with varying parameters can characterize superplastic flow behavior in practical superplastic forming with high prediction accuracy and without any restriction of strain path or deformation condition, which is of industrial and scientific interest. By contrast, the empirical equations have low predictive capability, owing to their constant parameters, and poor applicability, because they are limited to special strain paths or parameter conditions under strict phenomenology.

  8. Validation and calibration of structural models that combine information from multiple sources.

    PubMed

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.

  9. Parameters and Scales Used to Assess and Report Findings From Stroboscopy: A Systematic Review.

    PubMed

    Bonilha, Heather Shaw; Desjardins, Maude; Garand, Kendrea L; Martin-Harris, Bonnie

    2017-11-02

    Laryngeal endoscopy with stroboscopy, a critical component of the assessment of voice disorders, is rarely used as a treatment outcome measure in the scientific literature. We hypothesized that this is because of the lack of a widely used standardized, validated, and reliable method to assess and report laryngeal anatomy and physiology, and undertook a systematic literature review to determine the extent of the inconsistencies of the parameters and scales used in voice treatment outcome studies. Systematic literature review. We searched PubMed, Ovid, and Cochrane for studies where laryngeal endoscopy with stroboscopy was used as a treatment outcome measure with search terms representing "stroboscopy" and "treatment" guided by Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement standards. In the 62 included articles, we identified 141 terms representing 49 different parameters, which were further classified into 20 broad categories. The six most common parameters were magnitude of glottal gap, mucosal wave amplitude, location or shape of glottal gap, regularity of vibration, phase symmetry, and presence and size of specific lesions. Parameters were assessed on scales ranging from binary to 100 points. The number of scales used for each parameter varied from 1 to 24, with an average of four different scales per parameter. There is a lack of agreement in the scientific literature regarding which parameters should be assessed to measure voice treatment outcomes and which terms and scales should be used for each parameter. This greatly diminishes comparison and clinical implementation of the results of treatment outcomes research in voice disorders. We highlight a previously published tool and recommend it for future use in research and clinical settings. Copyright © 2017. Published by Elsevier Inc.

  10. Temperature and heat flux datasets of a complex object in a fire plume for the validation of fire and thermal response codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jernigan, Dann A.; Blanchat, Thomas K.

    It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object, in addition to measuring the thermal response of said object located within the fire plume, for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.

  11. Derivation of global vegetation biophysical parameters from EUMETSAT Polar System

    NASA Astrophysics Data System (ADS)

    García-Haro, Francisco Javier; Campos-Taberner, Manuel; Muñoz-Marí, Jordi; Laparra, Valero; Camacho, Fernando; Sánchez-Zapero, Jorge; Camps-Valls, Gustau

    2018-05-01

    This paper presents the algorithm developed in LSA-SAF (Satellite Application Facility for Land Surface Analysis) for the derivation of global vegetation parameters from the AVHRR (Advanced Very High Resolution Radiometer) sensor on board MetOp (Meteorological-Operational) satellites forming the EUMETSAT (European Organization for the Exploitation of Meteorological Satellites) Polar System (EPS). The suite of LSA-SAF EPS vegetation products includes the leaf area index (LAI), the fractional vegetation cover (FVC), and the fraction of absorbed photosynthetically active radiation (FAPAR). LAI, FAPAR, and FVC characterize the structure and the functioning of vegetation and are key parameters for a wide range of land-biosphere applications. The algorithm is based on a hybrid approach that blends the generalization capabilities offered by physical radiative transfer models with the accuracy and computational efficiency of machine learning methods. One major feature is the implementation of multi-output retrieval methods able to jointly and more consistently estimate all the biophysical parameters at the same time. We propose a multi-output Gaussian process regression (GPRmulti), which outperforms other considered methods over PROSAIL (coupling of PROSPECT and SAIL (Scattering by Arbitrary Inclined Leaves) radiative transfer models) EPS simulations. The global EPS products include uncertainty estimates taking into account the uncertainty captured by the retrieval method and input error propagation. A sensitivity analysis is performed to assess several sources of uncertainties in retrievals and maximize the positive impact of modeling the noise in training simulations. The paper discusses initial validation studies and provides details about the characteristics and overall quality of the products, which can be of interest to assist the successful use of the data by a broad user community. The consistent generation and distribution of the EPS vegetation products will
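A sketch of the multi-output retrieval idea using scikit-learn's Gaussian process regressor, which accepts multi-column targets directly: train on simulated reflectances mapped to (LAI, FVC, FAPAR) and predict all three jointly with uncertainties. The training arrays are random placeholders, not PROSAIL simulations, and the toy target relations are invented.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
X = rng.random((200, 4))             # stand-in reflectances (4 bands)
Y = np.column_stack([
    6.0 * X[:, 0],                   # toy "LAI"
    X[:, 1],                         # toy "FVC"
    0.9 * X[:, 2],                   # toy "FAPAR"
])

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, Y)                        # one joint model for all parameters

mean, std = gpr.predict(rng.random((3, 4)), return_std=True)
print(mean.shape, std.shape)         # (3, 3): joint estimates with uncertainty
```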

  12. Measurement of drill grinding parameters using laser sensor

    NASA Astrophysics Data System (ADS)

    Yanping, Peng; Kumehara, Hiroyuki; Wei, Zhang; Nomura, Takashi

    2005-12-01

    Accurate measurement of the grinding and geometry parameters of a drill point is essential to its design and reconditioning. In recent years, a number of non-contact coordinate measuring apparatuses using CCD cameras or laser sensors have been developed, but much work remains to be done on further improvement. This paper reports another kind of laser coordinate meter. As an example of its application, the method for geometry inspection of the drill flank surface is detailed. Data from laser scanning of the flank surface around selected points, yielding several two-dimensional curves, are analyzed with a mathematical procedure. If one of these curves turns out to be a straight line, it must be a generatrix of the grinding cone. Thus, the grinding parameters are determined by a set of three generatrices. The measurement method and data processing procedure are then proposed, and their validity is assessed by measuring a sample with given parameters. The measured point geometry agrees well with the known values. In comparison with other methods in the published literature, this one is simpler in computation and more accurate in results.

  13. Quantization of parameters and the string landscape problem

    NASA Astrophysics Data System (ADS)

    Bouhmadi-López, Mariam; Vargas Moniz, Paulo

    2007-05-01

    We broaden the domain of application of Brustein and de Alwis's recent paper, in which they introduce a (dynamical) selection principle on the landscape of string solutions using FRW quantum cosmology. More precisely, we (i) explain how their analysis is based on choosing a restrictive range of parameters, thereby affecting the validity of the predictions extracted, and (ii) subsequently provide a wider and more cohesive description of the probability distribution induced by quantum cosmological transition amplitudes. In addition, employing DeWitt's argument for an initial condition on the wavefunction of the Universe, we find that the string and gravitational parameters become related through interesting expressions involving an integer n, suggesting a quantization relation for some of the involved parameters. This research work was supported by the grants POCI/FP/63916/2005, FEDER-POCI/P/FIS/57547/2004 and Acções Integradas (CRUP-CSIC) Luso-Espanholas E-138/04.

  14. Interpolative modeling of GaAs FET S-parameter data bases for use in Monte Carlo simulations

    NASA Technical Reports Server (NTRS)

    Campbell, L.; Purviance, J.

    1992-01-01

    A statistical interpolation technique is presented for modeling GaAs FET S-parameter measurements for use in the statistical analysis and design of circuits. This is accomplished by interpolating among the measurements in a GaAs FET S-parameter data base in a statistically valid manner.

  15. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  16. Validity of Arabic version of the two-question Quick Inventory of Depression (QID-2-Ar): Screening for multiple sclerosis in an Arab environment and during the Syrian war.

    PubMed

    Kubitary, A; Alsaleh, M A

    2018-03-01

    This study aimed to validate the Arabic version of the two-question Quick Inventory of Depression (QID-2-Ar) in multiple sclerosis (MS) patients living in Syria during the war. A total of 100 Syrian MS patients, aged 18-60 years, were recruited at Damascus Hospital and Ibn Al-Nafees Hospital to validate the QID-2-Ar, including analyses of its screening test parameters and its construct validity. The QID-2-Ar's screening parameters for depression were very good, and its construct validity was also favorable (P<0.01). The QID-2-Ar is a good screening test for detecting depression; using a threshold score of ≥1 rather than 2 resulted in more depressed patients being correctly identified. The QID-2-Ar also has highly favorable psychometric properties. It is valid for assessing depression, especially the two main depressive symptoms (depressive mood and anhedonia) listed in DSM-V. It is a useful tool for researchers and practitioners, and a threshold score of 2 on the QID-2-Ar is recommended to be more certain that all those with depression are detected without having to use a complete depression questionnaire such as the Beck Depression Inventory (BDI)-II. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  17. Hot deformation characteristics of AZ80 magnesium alloy: Work hardening effect and processing parameter sensitivities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Y.; Wan, L.; Guo, Z. H.

    Isothermal compression experiment of AZ80 magnesium alloy was conducted by Gleeble thermo-mechanical simulator in order to quantitatively investigate the work hardening (WH), strain rate sensitivity (SRS) and temperature sensitivity (TS) during hot processing of magnesium alloys. The WH, SRS and TS were described by the Zener-Hollomon parameter (Z) coupling of deformation parameters. The relationships between WH rate and true strain as well as true stress were derived from the Kocks-Mecking dislocation model and validated by our measurement data. The slope defined through the linear relationship of WH rate and true stress was only related to the annihilation coefficient Ω. Obvious WH behavior could be exhibited at a higher Z condition. Furthermore, we have identified the correlation between the microstructural evolution including β-Mg17Al12 precipitation and the SRS and TS variations. Intensive dynamic recrystallization and homogeneous distribution of β-Mg17Al12 precipitates resulted in a greater SRS coefficient at higher temperature. The deformation heat effect and β-Mg17Al12 precipitate content can be regarded as the major factors determining the TS behavior. At low Z condition, the SRS becomes stronger, in contrast to the variation of TS. The optimum hot processing window was validated based on the established SRS and TS values distribution maps for AZ80 magnesium alloy.
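The coupling of strain rate and temperature above runs through the Zener-Hollomon parameter, Z = strain_rate * exp(Q/(RT)); a small sketch follows, with an assumed activation energy Q rather than the value fitted for AZ80.

```python
import numpy as np

R = 8.314    # gas constant, J/(mol K)
Q = 150e3    # apparent activation energy, J/mol (assumed, not the AZ80 fit)

def zener_hollomon(strain_rate, temp_c):
    """Z = strain_rate * exp(Q / (R * T)), with T in kelvin."""
    return strain_rate * np.exp(Q / (R * (temp_c + 273.15)))

for rate in (0.001, 0.1, 10.0):  # strain rates in 1/s
    print(rate, f"{zener_hollomon(rate, 350.0):.3e}")
```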

  18. Risk of malnutrition (over and under-nutrition): validation of the JaNuS screening tool.

    PubMed

    Donini, Lorenzo M; Ricciardi, Laura Maria; Neri, Barbara; Lenzi, Andrea; Marchesini, Giulio

    2014-12-01

    Malnutrition (over- and under-nutrition) is highly prevalent in patients admitted to hospital and is a well-known risk factor for increased morbidity and mortality. Nutritional problems are often misdiagnosed, and the coexistence of over- and undernutrition in particular is not usually recognized. We aimed to develop and validate a screening tool for the easy detection and reporting of both undernutrition and overnutrition, specifically identifying the clinical conditions where the two types of malnutrition coexist. The study consisted of three phases: 1) selection of an appropriate study population (estimation sample) and of the hospital admission parameters identifying overnutrition and undernutrition; 2) combination of the selected variables to create a screening tool, usable by non-specialist health care professionals, to assess the nutritional risk of undernutrition, overnutrition, or the co-presence of both conditions; 3) validation of the screening tool in a different patient sample (validation sample). Two groups of variables (12 for undernutrition, 7 for overnutrition) were identified in separate logistic models by their correlation with the outcome variables. Both models showed high efficacy, sensitivity and specificity (overnutrition: 97.7%, 99.6%, 66.6%, respectively; undernutrition: 84.4%, 83.6%, 84.8%). The logistic models were used to construct a two-faced test (named JaNuS - Just A Nutritional Screening) fitting into a two-dimensional Cartesian coordinate system. In the validation sample the JaNuS test confirmed its predictive value. Internal consistency and test-retest analysis provide evidence for the reliability of the test. The study provides a screening tool for the assessment of nutritional risk, based on parameters that are easy to use by health care personnel without specialist nutritional competence and characterized by excellent predictive validity. The test might be confidently applied in the clinical setting to determine the importance of

  19. A unitary convolution approximation for the impact-parameter dependent electronic energy loss

    NASA Astrophysics Data System (ADS)

    Schiwietz, G.; Grande, P. L.

    1999-06-01

    In this work, we propose a simple method to calculate the impact-parameter dependence of the electronic energy loss of bare ions for all impact parameters. This perturbative convolution approximation (PCA) is based on first-order perturbation theory, and thus, it is only valid for fast particles with low projectile charges. Using Bloch's stopping-power result and a simple scaling, we get rid of the restriction to low charge states and derive the unitary convolution approximation (UCA). Results of the UCA are then compared with full quantum-mechanical coupled-channel calculations for the impact-parameter dependent electronic energy loss.

  20. A new approach to the extraction of single exponential diode model parameters

    NASA Astrophysics Data System (ADS)

    Ortiz-Conde, Adelmo; García-Sánchez, Francisco J.

    2018-06-01

    A new integration method is presented for the extraction of the parameters of a single exponential diode model with series resistance from measured forward I-V characteristics. The extraction is performed using auxiliary functions, based on integration of the data, that allow the effects of each of the model parameters to be isolated. A differentiation method is also presented for data with a low level of experimental noise. Measured and simulated data are used to verify the applicability of both proposed methods. Physical insight into the validity of the model is also obtained through the proposed graphical determination of the parameters.
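For context, a plain least-squares sketch of fitting the single-exponential diode model with series resistance, I = Is*(exp((V - I*Rs)/(n*VT)) - 1); inverting the model for V makes it explicit in I and easy to fit. This is a generic stand-in, not the paper's integration-based auxiliary functions, and the synthetic data use assumed parameter values.

```python
import numpy as np
from scipy.optimize import curve_fit

VT = 0.02585  # thermal voltage near 300 K (V)

def v_of_i(I, Is, n, Rs):
    """Diode model solved for V: V = n*VT*ln(I/Is + 1) + I*Rs."""
    return n * VT * np.log(I / Is + 1.0) + I * Rs

I_meas = np.logspace(-6, -2, 25)                  # forward currents (A)
V_meas = v_of_i(I_meas, Is=1e-9, n=1.8, Rs=12.0)  # synthetic "measurement"

(Is, n, Rs), _ = curve_fit(v_of_i, I_meas, V_meas, p0=(1e-8, 1.5, 5.0),
                           bounds=([1e-12, 1.0, 0.0], [1e-6, 3.0, 100.0]))
print(Is, n, Rs)                                  # recovered model parameters
```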

  1. Validation of the Poisson Stochastic Radiative Transfer Model

    NASA Technical Reports Server (NTRS)

    Zhuravleva, Tatiana; Marshak, Alexander

    2004-01-01

    A new approach to validation of the Poisson stochastic radiative transfer method is proposed. In contrast to other validations of stochastic models, the main parameter of the Poisson model responsible for cloud geometrical structure - the cloud aspect ratio - is determined entirely by matching measurements and calculations of the direct solar radiation. If measurements of the direct solar radiation are unavailable, it is shown that there is a range of aspect ratios that allows the stochastic model to accurately approximate the average measurements of surface downward and cloud-top upward fluxes. Realizations of the fractionally integrated cascade model are taken as a prototype of real measurements.

  2. [Soluble interleukin 2 receptor as activity parameter in serum of systemic and discoid lupus erythematosus].

    PubMed

    Blum, C; Zillikens, D; Tony, H P; Hartmann, A A; Burg, G

    1993-05-01

    The evaluation of disease activity in systemic lupus erythematosus (SLE) is important for selection of the appropriate therapeutic regimen. In addition to the clinical picture, various laboratory parameters are taken into account. However, no validated criteria for the evaluation of the disease activity in SLE have yet been established. Recently, serum levels of soluble interleukin-2 receptor (sIL-2R) have been proposed as a potential parameter for disease activity in SLE. However, the studies reported on this subject so far have focused mainly on certain subsets of the disease, and the evaluation of the disease activity was based on a very limited number of parameters. In the present study, we determined serum levels of sIL-2R in 23 patients with SLE and 30 patients with discoid LE (DLE). Evaluation of disease activity in SLE was based on a comprehensive scale which considered numerous clinical signs and laboratory parameters. In SLE, serum levels of sIL-2R showed a better correlation with disease activity than all the other parameters investigated, including proteinuria, erythrocyte sedimentation rate, serum globulin concentration, titre of antibodies against double-stranded DNA, serum albumin concentration, serum complement levels and white blood cell count. For the first time, we report on elevated serum levels of sIL-2R in DLE, which also correlated with disease activity.

  3. Hexagonal boron nitride and water interaction parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Yanbin; Aluru, Narayana R., E-mail: aluru@illinois.edu; Wagner, Lucas K.

    2016-04-28

    The study of hexagonal boron nitride (hBN) in microfluidic and nanofluidic applications at the atomic level requires accurate force field parameters to describe the water-hBN interaction. In this work, we begin with benchmark quality first principles quantum Monte Carlo calculations on the interaction energy between water and hBN, which are used to validate random phase approximation (RPA) calculations. We then proceed with RPA to derive force field parameters, which are used to simulate water contact angle on bulk hBN, attaining a value within the experimental uncertainties. This paper demonstrates that end-to-end multiscale modeling, starting at detailed many-body quantum mechanics and ending with macroscopic properties, with the approximations controlled along the way, is feasible for these systems.
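
    As a purely hypothetical illustration of the force-field-derivation step of such a pipeline, the sketch below fits 12-6 Lennard-Jones parameters to a reference interaction-energy curve, standing in for the RPA water-hBN binding curve; the functional form, separations, and numbers are all assumptions, not values from the paper:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Fit 12-6 Lennard-Jones parameters (epsilon, sigma) to a reference
    # interaction-energy curve E_ref(d); all numbers are invented.
    def lj(d, eps, sigma):
        return 4.0 * eps * ((sigma / d) ** 12 - (sigma / d) ** 6)

    rng = np.random.default_rng(0)
    d_ref = np.linspace(2.8, 6.0, 30)                  # separation [Angstrom]
    e_ref = lj(d_ref, 0.10, 3.2) + rng.normal(0.0, 1e-3, d_ref.size)

    (eps_fit, sigma_fit), _ = curve_fit(lj, d_ref, e_ref, p0=(0.05, 3.0))
    print(f"epsilon = {eps_fit:.3f}, sigma = {sigma_fit:.3f}")
    ```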

  4. Extending a multi-scale parameter regionalization (MPR) method by introducing parameter-constrained optimization and flexible transfer functions

    NASA Astrophysics Data System (ADS)

    Klotz, Daniel; Herrnegger, Mathew; Schulz, Karsten

    2015-04-01

    A multi-scale parameter-estimation method, as presented by Samaniego et al. (2010), is implemented and extended for the conceptual hydrological model COSERO. COSERO is an HBV-type model that is specialized for alpine environments but has been applied over a wide range of basins all over the world (see Kling et al., 2014 for an overview). Within the methodology, available small-scale information (DEM, soil texture, land cover, etc.) is used to estimate the coarse-scale model parameters by applying a set of transfer functions (TFs) and subsequent averaging methods, whereby only TF hyper-parameters are optimized against available observations (e.g. runoff data). The parameter regionalisation approach was extended to allow for a more meta-heuristic handling of the transfer functions. The two main novelties are: 1. An explicit introduction of constraints into the parameter estimation scheme: the constraint scheme replaces invalid parts of the transfer-function solution space with valid solutions. It is inspired by applications in evolutionary algorithms and related to the combination of learning and evolution. This allows the consideration of physical and numerical constraints as well as the incorporation of a priori modeller experience into the parameter estimation. 2. Spline-based transfer functions: spline-based functions enable arbitrary forms of transfer functions. This is of importance since in many cases the general relationship between sub-grid information and parameters is known, but not the form of the transfer function itself. The contribution presents the results and experiences with the adopted method and the introduced extensions. Simulations are performed for the pre-alpine/alpine Traisen catchment in Lower Austria. References: Samaniego, L., Kumar, R., Attinger, S. (2010): Multiscale parameter regionalization of a grid-based hydrologic model at the mesoscale, Water Resour. Res., doi: 10.1029/2008WR007327 Kling, H., Stanzel, P., Fuchs, M., and
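
    A minimal sketch of the two extensions, under invented names and bounds: a spline-based transfer function whose knot values are the optimized hyper-parameters, with an explicit constraint step that replaces invalid parameter values with the nearest valid ones:

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Spline-based transfer function for MPR-style regionalization.  The knot
    # ordinates `theta` are the hyper-parameters optimized against runoff;
    # predictor, bounds, and values are hypothetical.
    clay_knots = np.array([0.0, 0.25, 0.5, 0.75, 1.0])    # sub-grid predictor axis

    def transfer(theta, clay_fraction, bounds=(0.01, 0.95)):
        """Map a sub-grid clay-fraction field to a model parameter field."""
        param = CubicSpline(clay_knots, theta)(clay_fraction)
        # Constraint handling: project invalid values back into the valid
        # range, mimicking the replacement of invalid parts of the solution
        # space with valid solutions.
        return np.clip(param, *bounds)

    theta = np.array([0.10, 0.30, 0.50, 0.60, 0.90])      # candidate hyper-parameters
    clay = np.random.default_rng(1).uniform(0, 1, (10, 10))  # sub-grid field
    coarse_param = transfer(theta, clay).mean()           # upscaling by averaging
    print(coarse_param)
    ```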

  5. A Comprehensive Validation Methodology for Sparse Experimental Data

    NASA Technical Reports Server (NTRS)

    Norman, Ryan B.; Blattnig, Steve R.

    2010-01-01

    A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
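
    The abstract does not spell out the two metrics, so the sketch below implements plausible versions only: a cumulative relative uncertainty aggregated over the whole database, and a median relative uncertainty evaluated per subset of the model parameter space; all data are synthetic:

    ```python
    import numpy as np

    # Plausible versions of the two metrics (the paper's exact definitions
    # are not reproduced here), applied to a synthetic cross-section database.
    rng = np.random.default_rng(0)
    sigma_exp = rng.uniform(10.0, 100.0, 3600)               # experimental values
    sigma_mod = sigma_exp * rng.normal(1.0, 0.15, 3600)      # model predictions
    subset = rng.integers(0, 12, 3600)                       # parameter-space bins

    rel_err = np.abs(sigma_mod - sigma_exp) / sigma_exp

    # Overall model accuracy: cumulative uncertainty over the database.
    cumulative = np.sum(np.abs(sigma_mod - sigma_exp)) / np.sum(sigma_exp)

    # Model-development view: median uncertainty within each subset.
    median_by_subset = {k: float(np.median(rel_err[subset == k])) for k in range(12)}

    print(f"cumulative uncertainty: {cumulative:.3f}")
    print("worst subset:", max(median_by_subset, key=median_by_subset.get))
    ```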

  6. Assessment of validity with polytrauma Veteran populations.

    PubMed

    Bush, Shane S; Bass, Carmela

    2015-01-01

    Veterans with polytrauma have suffered injuries to multiple body parts and organ systems, including the brain. The injuries can generate a triad of physical, neurologic/cognitive, and emotional symptoms. Accurate diagnosis is essential for the treatment of these conditions and for fair allocation of benefits. To accurately diagnose polytrauma disorders and their related problems, clinicians take into account the validity of reported history and symptoms, as well as clinical presentations. The purpose of this article is to describe the assessment of validity with polytrauma Veteran populations. A review of scholarly and other relevant literature, together with clinical experience, is utilized. A multimethod approach to validity assessment that includes objective, standardized measures increases the confidence that can be placed in the accuracy of self-reported symptoms and physical, cognitive, and emotional test results. Due to the multivariate nature of polytrauma and the multiple disciplines that play a role in diagnosis and treatment, an ideal model of validity assessment with polytrauma Veteran populations utilizes neurocognitive, neurological, neuropsychiatric, and behavioral measures of validity. An overview of these validity assessment approaches as applied to polytrauma Veteran populations is presented. Veterans, the VA, and society are best served when accurate diagnoses are made.

  7. Calibration and validation of rockfall models

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Valagussa, Andrea; Zenoni, Stefania; Crosta, Giovanni B.

    2013-04-01

    Calibrating and validating landslide models is extremely difficult due to the particular characteristics of landslides: limited recurrence in time, relatively low frequency of the events, short durability of post-event traces, and poor availability of continuous monitoring data, especially for small landslides and rockfalls. For this reason, most of the rockfall models presented in the literature completely lack calibration and validation of the results. In this contribution, we explore different strategies for rockfall model calibration and validation starting from both a historical event and a full-scale field test. The event occurred in 2012 in Courmayeur (Western Alps, Italy) and caused serious damage to quarrying facilities. This event was studied soon after its occurrence through a field campaign aimed at mapping the blocks arrested along the slope, the shape and location of the detachment area, and the traces of scars associated with impacts of blocks on the slope. The full-scale field test was performed by Geovert Ltd in the Christchurch area (New Zealand) after the 2011 earthquake. During the test, a number of large blocks were mobilized from the upper part of the slope and filmed with high-velocity cameras from different viewpoints. The movies of each released block were analysed to identify the block shape, the propagation path, the location of impacts, the height of the trajectory and the velocity of the block along the path. Both calibration and validation of rockfall models should be based on the optimization of the agreement between the actual trajectories or locations of arrested blocks and the simulated ones. A measure that describes this agreement is therefore needed. For calibration purposes, this measure should be simple enough to allow trial-and-error repetitions of the model for parameter optimization. In this contribution we explore different calibration/validation measures: (1) the percentage of simulated blocks arresting within a buffer of the
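
    A sketch of the first measure named above (the fraction of simulated blocks arresting within a buffer of the observed ones); the geometry and buffer width are invented:

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    # Agreement measure: percentage of simulated arrest positions falling
    # within a buffer distance of any mapped (observed) arrest position.
    rng = np.random.default_rng(2)
    observed = rng.uniform(0.0, 500.0, (40, 2))     # mapped block positions [m]
    simulated = rng.uniform(0.0, 500.0, (1000, 2))  # simulated arrest positions [m]
    buffer_m = 25.0

    dist, _ = cKDTree(observed).query(simulated)    # distance to nearest observed block
    agreement = 100.0 * np.mean(dist <= buffer_m)
    print(f"{agreement:.1f}% of simulated blocks arrest within {buffer_m} m")
    ```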

  8. Power Plant Model Validation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables the automation of the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input information. The tool automatically adjusts all required EPCL scripts and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In Function for generator model validation; a database of projects (model validation studies); a database of the historic events; a database of the power plants; advanced visualization capabilities; and automatic report generation.

  9. 'Mechanical restraint-confounders, risk, alliance score': testing the clinical validity of a new risk assessment instrument.

    PubMed

    Deichmann Nielsen, Lea; Bech, Per; Hounsgaard, Lise; Alkier Gildberg, Frederik

    2017-08-01

    Unstructured risk assessment, as well as confounders (underlying reasons for the patient's risk behaviour and alliance), risk behaviour, and parameters of alliance, have been identified as factors that prolong the duration of mechanical restraint among forensic mental health inpatients. To clinically validate a new, structured short-term risk assessment instrument called the Mechanical Restraint-Confounders, Risk, Alliance Score (MR-CRAS), with the intended purpose of supporting the clinicians' observation and assessment of the patient's readiness to be released from mechanical restraint. The content and layout of MR-CRAS and its user manual were evaluated using face validation by forensic mental health clinicians, content validation by an expert panel, and pilot testing within two, closed forensic mental health inpatient units. The three sub-scales (Confounders, Risk, and a parameter of Alliance) showed excellent content validity. The clinical validations also showed that MR-CRAS was perceived and experienced as a comprehensible, relevant, comprehensive, and useable risk assessment instrument. MR-CRAS contains 18 clinically valid items, and the instrument can be used to support the clinical decision-making regarding the possibility of releasing the patient from mechanical restraint. The present three studies have clinically validated a short MR-CRAS scale that is currently being psychometrically tested in a larger study.

  10. Communications circuit including a linear quadratic estimator

    DOEpatents

    Ferguson, Dennis D.

    2015-07-07

    A circuit includes a linear quadratic estimator (LQE) configured to receive a plurality of measurements of a signal. The LQE is configured to weight the measurements based on their respective uncertainties to produce weighted averages. The circuit further includes a controller coupled to the LQE and configured to selectively adjust at least one data link parameter associated with a communication channel in response to receiving the weighted averages.
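
    For a static quantity, the uncertainty-based weighting described above reduces to inverse-variance weighting; a minimal sketch (values invented):

    ```python
    import numpy as np

    # Inverse-variance weighting: each measurement of the signal is weighted
    # by the reciprocal of its uncertainty (variance), as an LQE does for a
    # static quantity; the weighted average and its variance follow directly.
    z = np.array([1.02, 0.97, 1.10])       # measurements of the signal
    var = np.array([0.01, 0.04, 0.09])     # their respective uncertainties

    w = 1.0 / var
    estimate = np.sum(w * z) / np.sum(w)
    estimate_var = 1.0 / np.sum(w)
    print(estimate, estimate_var)          # the controller would act on this estimate
    ```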

  11. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  12. Upscaling Cement Paste Microstructure to Obtain the Fracture, Shear, and Elastic Concrete Mechanical LDPM Parameters.

    PubMed

    Sherzer, Gili; Gao, Peng; Schlangen, Erik; Ye, Guang; Gal, Erez

    2017-02-28

    Modeling the complex behavior of concrete for a specific mixture is a challenging task, as it requires bridging the cement scale and the concrete scale. We describe a multiscale analysis procedure for the modeling of concrete structures, in which material properties at the macro scale are evaluated based on lower scales. Concrete may be viewed over a range of scale sizes, from the atomic scale (10^-10 m), which is characterized by the behavior of crystalline particles of hydrated Portland cement, to the macroscopic scale (10 m). The proposed multiscale framework is based on several models, including chemical analysis at the cement paste scale, a mechanical lattice model at the cement and mortar scales, geometrical aggregate distribution models at the mortar scale, and the Lattice Discrete Particle Model (LDPM) at the concrete scale. The analysis procedure starts from a known chemical and mechanical set of parameters of the cement paste, which are then used to evaluate the mechanical properties of the LDPM concrete parameters for the fracture, shear, and elastic responses of the concrete. Although a macroscopic validation study of this procedure is presented, future research should include a comparison to additional experiments in each scale.

  13. Upscaling Cement Paste Microstructure to Obtain the Fracture, Shear, and Elastic Concrete Mechanical LDPM Parameters

    PubMed Central

    Sherzer, Gili; Gao, Peng; Schlangen, Erik; Ye, Guang; Gal, Erez

    2017-01-01

    Modeling the complex behavior of concrete for a specific mixture is a challenging task, as it requires bridging the cement scale and the concrete scale. We describe a multiscale analysis procedure for the modeling of concrete structures, in which material properties at the macro scale are evaluated based on lower scales. Concrete may be viewed over a range of scale sizes, from the atomic scale (10^-10 m), which is characterized by the behavior of crystalline particles of hydrated Portland cement, to the macroscopic scale (10 m). The proposed multiscale framework is based on several models, including chemical analysis at the cement paste scale, a mechanical lattice model at the cement and mortar scales, geometrical aggregate distribution models at the mortar scale, and the Lattice Discrete Particle Model (LDPM) at the concrete scale. The analysis procedure starts from a known chemical and mechanical set of parameters of the cement paste, which are then used to evaluate the mechanical properties of the LDPM concrete parameters for the fracture, shear, and elastic responses of the concrete. Although a macroscopic validation study of this procedure is presented, future research should include a comparison to additional experiments in each scale. PMID:28772605

  14. Acoustical characterization and parameter optimization of polymeric noise control materials

    NASA Astrophysics Data System (ADS)

    Homsi, Emile N.

    2003-10-01

    The sound transmission loss (STL) characteristics of polymer-based materials are considered. Analytical models that predict, characterize and optimize the STL of polymeric materials, with respect to physical parameters that affect performance, are developed for a single-layer panel configuration and adapted for layered panel construction with a homogeneous core. An optimum set of material parameters is selected and translated into practical applications for validation. Sound-attenuating thermoplastic materials designed to be used as barrier systems in the automotive and consumer industries have certain acoustical characteristics that vary as a function of the stiffness and density of the selected material. The validity and applicability of existing theory are explored, and since STL is influenced by factors such as the surface mass density of the panel's material, a method is modified to improve STL performance and optimize load-bearing attributes. An experimentally derived function is applied to the model for better correlation. In-phase and out-of-phase motion of top and bottom layers are considered. It was found that layered construction of the co-injection type would exhibit fused planes at the interface and move in-phase. The model for the single-layer case is adapted to the layered case, where the panel would behave as a single panel. Primary physical parameters that affect STL are identified and manipulated. Theoretical analysis is linked to the resin's matrix attribute. High-STL material with representative characteristics is evaluated versus standard resins. It was found that high STL could be achieved by altering the material's matrix and by integrating design solutions in the low-frequency range. A suggested numerical approach is described for STL evaluation of simple and complex geometries. In practice, validation on actual vehicle systems proved the adequacy of the acoustical characterization process.

  15. Validation of a new screening, determinative, and confirmatory multi-residue method for nitroimidazoles and their hydroxy metabolites in turkey muscle tissue by liquid chromatography-tandem mass spectrometry.

    PubMed

    Boison, Joe O; Asea, Philip A; Matus, Johanna L

    2012-08-01

    A new and sensitive multi-residue method (MRM) with detection by LC-MS/MS was developed and validated for the screening, determination, and confirmation of residues of 7 nitroimidazoles and 3 of their metabolites in turkey muscle tissues at concentrations ≥ 0.05 ng/g. The compounds were extracted into a solvent with an alkali salt. Sample clean-up and concentration were then done by solid-phase extraction (SPE), and the compounds were quantified by liquid chromatography-tandem mass spectrometry (LC-MS/MS). The characteristic parameters of the new method, including repeatability, selectivity, ruggedness, stability, level of quantification, and level of confirmation, were determined. Method validation was achieved by independent verification of the parameters measured during method characterization. The seven nitroimidazoles included are metronidazole (MTZ), ronidazole (RNZ), dimetridazole (DMZ), tinidazole (TNZ), ornidazole (ONZ), ipronidazole (IPR), and carnidazole (CNZ). It was discovered during the single laboratory validation of the method that five of the seven nitroimidazoles (i.e. metronidazole, dimetridazole, tinidazole, ornidazole and ipronidazole) and the 3 metabolites (1-(2-hydroxyethyl)-2-hydroxymethyl-5-nitroimidazole (MTZ-OH), 2-hydroxymethyl-1-methyl-5-nitroimidazole (HMMNI, the common metabolite of ronidazole and dimetridazole), and 1-methyl-2-(2'-hydroxyisopropyl)-5-nitroimidazole (IPR-OH)) included in this study could be detected, confirmed, and quantified accurately, whereas RNZ and CNZ could only be detected and confirmed but not accurately quantified. © Her Majesty the Queen in Right of Canada as Represented by the Minister of Agriculture and Agri-food Canada 2012.

  16. An improved swarm optimization for parameter estimation and biological model selection.

    PubMed

    Abdullah, Afnizanfaizal; Deris, Safaai; Mohamad, Mohd Saberi; Anwar, Sohail

    2013-01-01

    One of the key aspects of computational systems biology is the investigation of dynamic biological processes within cells. Computational models are often required to elucidate the mechanisms and principles driving the processes because of their nonlinearity and complexity. The models usually incorporate a set of parameters that signify the physical properties of the actual biological systems. In most cases, these parameters are estimated by fitting the model outputs with the corresponding experimental data. However, this is a challenging task because the available experimental data are frequently noisy and incomplete. In this paper, a new hybrid optimization method is proposed to estimate these parameters from the noisy and incomplete experimental data. The proposed method, called Swarm-based Chemical Reaction Optimization, integrates the evolutionary search strategy employed by Chemical Reaction Optimization into the neighbourhood search strategy of the Firefly Algorithm. The effectiveness of the method was evaluated using a simulated nonlinear model and two biological models: synthetic transcriptional oscillators, and extracellular protease production models. The results showed that the accuracy and computational speed of the proposed method were better than the existing Differential Evolution, Firefly Algorithm and Chemical Reaction Optimization methods. The reliability of the estimated parameters was statistically validated, which suggests that the model outputs produced by these parameters were valid even when noisy and incomplete experimental data were used. Additionally, the Akaike Information Criterion was employed to evaluate the model selection, which highlighted the capability of the proposed method in choosing a plausible model based on the experimental data. In conclusion, this paper presents the effectiveness of the proposed method for parameter estimation and model selection problems using noisy and incomplete experimental data. This

  17. An Improved Swarm Optimization for Parameter Estimation and Biological Model Selection

    PubMed Central

    Abdullah, Afnizanfaizal; Deris, Safaai; Mohamad, Mohd Saberi; Anwar, Sohail

    2013-01-01

    One of the key aspects of computational systems biology is the investigation of dynamic biological processes within cells. Computational models are often required to elucidate the mechanisms and principles driving the processes because of their nonlinearity and complexity. The models usually incorporate a set of parameters that signify the physical properties of the actual biological systems. In most cases, these parameters are estimated by fitting the model outputs with the corresponding experimental data. However, this is a challenging task because the available experimental data are frequently noisy and incomplete. In this paper, a new hybrid optimization method is proposed to estimate these parameters from the noisy and incomplete experimental data. The proposed method, called Swarm-based Chemical Reaction Optimization, integrates the evolutionary search strategy employed by Chemical Reaction Optimization into the neighbourhood search strategy of the Firefly Algorithm. The effectiveness of the method was evaluated using a simulated nonlinear model and two biological models: synthetic transcriptional oscillators, and extracellular protease production models. The results showed that the accuracy and computational speed of the proposed method were better than the existing Differential Evolution, Firefly Algorithm and Chemical Reaction Optimization methods. The reliability of the estimated parameters was statistically validated, which suggests that the model outputs produced by these parameters were valid even when noisy and incomplete experimental data were used. Additionally, the Akaike Information Criterion was employed to evaluate the model selection, which highlighted the capability of the proposed method in choosing a plausible model based on the experimental data. In conclusion, this paper presents the effectiveness of the proposed method for parameter estimation and model selection problems using noisy and incomplete experimental data. This

  18. Simultaneously extracting multiple parameters via multi-distance and multi-exposure diffuse speckle contrast analysis

    PubMed Central

    Liu, Jialin; Zhang, Hongchao; Lu, Jian; Ni, Xiaowu; Shen, Zhonghua

    2017-01-01

    Recent advancements in diffuse speckle contrast analysis (DSCA) have opened the path for noninvasive acquisition of deep tissue microvasculature blood flow. In fact, in addition to the blood flow index αDB, variations of the tissue optical absorption μa and reduced scattering coefficient μs′, as well as the coherence factor β, can modulate temporal fluctuations of speckle patterns. In this study, we use multi-distance and multi-exposure DSCA (MDME-DSCA) to simultaneously extract multiple parameters such as μa, μs′, αDB, and β. MDME-DSCA has been validated against simulated data and phantom experiments. Moreover, as a comparison, the results also show that it is impractical to simultaneously obtain multiple parameters by multi-exposure DSCA (ME-DSCA). PMID:29082083

  19. Reliability and Construct Validity of the Psychopathic Personality Inventory-Revised in a Swedish Non-Criminal Sample – A Multimethod Approach including Psychophysiological Correlates of Empathy for Pain

    PubMed Central

    Sörman, Karolina; Nilsonne, Gustav; Howner, Katarina; Tamm, Sandra; Caman, Shilan; Wang, Hui-Xin; Ingvar, Martin; Edens, John F.; Gustavsson, Petter; Lilienfeld, Scott O; Petrovic, Predrag; Fischer, Håkan; Kristiansson, Marianne

    2016-01-01

    Cross-cultural investigation of psychopathy measures is important for clarifying the nomological network surrounding the psychopathy construct. The Psychopathic Personality Inventory-Revised (PPI-R) is one of the most extensively researched self-report measures of psychopathic traits in adults. To date, however, it has been examined primarily in North American criminal or student samples. To address this gap in the literature, we examined the PPI-R's reliability, construct validity and factor structure in non-criminal individuals (N = 227) in Sweden, using a multimethod approach including psychophysiological correlates of empathy for pain. PPI-R construct validity was investigated in subgroups of participants by exploring its degree of overlap with (i) the Psychopathy Checklist: Screening Version (PCL:SV), (ii) self-rated empathy and behavioral and physiological responses in an experiment on empathy for pain, and (iii) additional self-report measures of alexithymia and trait anxiety. The PPI-R total score was significantly associated with PCL:SV total and factor scores. The PPI-R Coldheartedness scale demonstrated significant negative associations with all empathy subscales and with rated unpleasantness and skin conductance responses in the empathy experiment. The PPI-R higher-order Self-Centered Impulsivity and Fearless Dominance dimensions were associated with trait anxiety in opposite directions (positively and negatively, respectively). Overall, the results demonstrated solid reliability (test-retest and internal consistency) and promising but somewhat mixed construct validity for the Swedish translation of the PPI-R. PMID:27300292

  20. Estimating system parameters for solvent-water and plant cuticle-water using quantum chemically estimated Abraham solute parameters.

    PubMed

    Liang, Yuzhen; Torralba-Sanchez, Tifany L; Di Toro, Dominic M

    2018-04-18

    Polyparameter Linear Free Energy Relationships (pp-LFERs) using Abraham system parameters have many useful applications. However, developing the Abraham system parameters depends on the availability and quality of the Abraham solute parameters. Using Quantum Chemically estimated Abraham solute Parameters (QCAP) is shown to produce pp-LFERs that have lower root mean square errors (RMSEs) of prediction for solvent-water partition coefficients than parameters estimated using other presently available methods. pp-LFER system parameters are estimated for solvent-water and plant cuticle-water systems, and for novel compounds, using QCAP solute parameters and experimental partition coefficients. Refitting the system parameters improves the calculation accuracy and eliminates the bias. Refitted models for solvent-water partition coefficients using QCAP solute parameters give better results (RMSE = 0.278 to 0.506 log units for 24 systems) than those based on ABSOLV (0.326 to 0.618) and QSPR (0.294 to 0.700) solute parameters. For munition constituents and munition-like compounds not included in the calibration of the refitted model, QCAP solute parameters produce pp-LFER models with much lower RMSEs for solvent-water partition coefficients (RMSE = 0.734 and 0.664 for the original and refitted models, respectively) than ABSOLV (4.46 and 5.98) and QSPR (2.838 and 2.723). Refitting the plant cuticle-water pp-LFER including munition constituents using QCAP solute parameters also results in a lower RMSE (RMSE = 0.386) than using ABSOLV (0.778) and QSPR (0.512) solute parameters. Therefore, for fitting a model in situations for which experimental data exist and system parameters can be re-estimated, or for which system parameters do not exist and need to be developed, QCAP is the quantum chemical method of choice.
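
    For reference, the standard Abraham pp-LFER that the system and solute parameters enter is:

    ```latex
    % Abraham pp-LFER: lowercase letters are the system parameters fitted per
    % partitioning system (solvent-water or plant cuticle-water), uppercase
    % letters the solute descriptors (here estimated quantum chemically, QCAP).
    \[
      \log K \;=\; c + eE + sS + aA + bB + vV
    \]
    ```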

  1. Parameters for the Operation of Bacterial Thiosalt Oxidation Ponds

    PubMed Central

    Silver, M.

    1985-01-01

    Shake flask and pH-controlled reactor tests were used to determine the mathematical parameters for a mixed-culture bacterial thiosalt treatment pond. Values determined were as follows: Km and Vmax (thiosulfate), 9.83 g/liter and 243.9 mg/liter per h, respectively; Ki (lead), 3.17 mg/liter; Ki (copper), 1.27 mg/liter; Q10 between 10 and 30°C, 1.95. From these parameters, the required biooxidation pond volume and residence time could be calculated. Soluble zinc (0.2 g/liter) and particulate mill products and by-products (0.25 g/liter) were not inhibitory. Correlation with an operating thiosalt biooxidation pond showed the parameters used to be valid for thiosalt concentrations up to at least 2 g/liter, lead concentrations of at least 10 mg/liter, and temperatures of >2°C. PMID:16346885
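
    A plausible rate law consistent with the constants reported above (the exact form used in the paper is an assumption) is Michaelis-Menten oxidation with noncompetitive metal inhibition plus Q10 temperature scaling:

    ```latex
    % Thiosulfate oxidation rate v as a function of substrate S, an inhibitory
    % metal concentration [M], and temperature T (assumed functional form):
    \[
      v \;=\; \frac{V_{\max}\, S}{K_m + S}\cdot\frac{K_i}{K_i + [M]},
      \qquad
      v(T) \;=\; v(T_0)\, Q_{10}^{\,(T - T_0)/10}.
    \]
    ```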

  2. Calculations of key magnetospheric parameters using the isotropic and anisotropic SPSU global MHD code

    NASA Astrophysics Data System (ADS)

    Samsonov, Andrey; Gordeev, Evgeny; Sergeev, Victor

    2017-04-01

    As recently suggested (e.g., Gordeev et al., 2015), the global magnetospheric configuration can be characterized by a set of key parameters, such as the magnetopause distance at the subsolar point and on the terminator plane, the magnetic field in the magnetotail lobe, the plasma sheet thermal pressure, the cross polar cap electric potential drop, and the total field-aligned current. For given solar wind conditions, the values of these parameters can be obtained from both empirical models and global MHD simulations. We validate the recently developed global MHD code SPSU-16 using the key magnetospheric parameters mentioned above. The code SPSU-16 can solve both the isotropic and the anisotropic MHD equations. In the anisotropic version, we use modified double-adiabatic equations in which T⊥/T∥ (the ratio of perpendicular to parallel thermal pressure) is bounded from above by the mirror and ion-cyclotron thresholds and from below by the firehose threshold. The results of validation for the SPSU-16 code agree well with previously published results of other global codes. Some key parameters coincide in the isotropic and anisotropic MHD simulations, while others differ.
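
    The anisotropy bounds mentioned above are commonly written as approximate instability thresholds of the following shape (whether SPSU-16 uses these exact expressions is an assumption; the ion-cyclotron bound is usually an empirical fit of similar form):

    ```latex
    % Firehose (lower) and mirror (upper) limits on the temperature
    % anisotropy, with beta the ratio of thermal to magnetic pressure for
    % each component:
    \[
      1 - \frac{2}{\beta_\parallel}
      \;\le\; \frac{T_\perp}{T_\parallel} \;\le\;
      1 + \frac{1}{\beta_\perp}.
    \]
    ```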

  3. Optimisation of lateral car dynamics taking into account parameter uncertainties

    NASA Astrophysics Data System (ADS)

    Busch, Jochen; Bestle, Dieter

    2014-02-01

    Simulation studies on an active all-wheel-steering car show that disturbances of vehicle parameters have a strong influence on lateral car dynamics. This motivates the need for a design that is robust against such parameter uncertainties. A specific parametrisation is established combining deterministic, velocity-dependent steering control parameters with partly uncertain, velocity-independent vehicle parameters for simultaneous use in a numerical optimisation process. Model-based objectives are formulated and summarised in a multi-objective optimisation problem, where especially the lateral steady-state behaviour is improved by an adaptation strategy based on measurable uncertainties. The normally distributed uncertainties are generated by optimal Latin hypercube sampling, and a response-surface-based strategy helps to cut down time-consuming model evaluations, which makes it possible to use a genetic optimisation algorithm. Optimisation results are discussed in different criterion spaces, and the achieved improvements confirm the validity of the proposed procedure.

  4. A COMPUTATIONAL FRAMEWORK FOR EVALUATION OF NPS MANAGEMENT SCENARIOS: ROLE OF PARAMETER UNCERTAINTY

    EPA Science Inventory

    Utility of complex distributed-parameter watershed models for evaluation of the effectiveness of non-point source sediment and nutrient abatement scenarios such as Best Management Practices (BMPs) often follows the traditional {calibrate ---> validate ---> predict} procedure. Des...

  5. Physiological, physical and behavioural changes in dogs (Canis familiaris) when kennelled: testing the validity of stress parameters.

    PubMed

    Part, C E; Kiddie, J L; Hayes, W A; Mills, D S; Neville, R F; Morton, D B; Collins, L M

    2014-06-22

    Domestic dogs (Canis familiaris) housed in kennelling establishments are considered at risk of suffering poor welfare. Previous research supporting this hypothesis has typically used cortisol:creatinine ratios (C/Cr) to measure acute and chronic stress in kennelled dogs. However, the value of C/Cr as a welfare indicator has been questioned. This study aimed to test the validity of a range of physiological, physical and behavioural welfare indicators and to establish baseline values reflecting good dog welfare. Measurements were taken from 29 privately owned dogs (14 males, 15 females), ranging in age and breed, in their own home and in a boarding kennel environment, following a within-subjects, counterbalanced design. Pairwise comparisons revealed that C/Cr and vanillylmandelic acid:creatinine ratios (VMA/Cr) were higher in the kennel than in the home environment (P=0.003; P=0.01, respectively) and were not associated with differences in movement/exercise between environments. Dogs' surface temperature was lower in kennels (P=0.001) and was not associated with ambient temperature. No associations with age, and no effects of kennel establishment, kennelling experience, sex or source, were found. Dogs were generally more active in kennels, but showed considerable individual variability. C/Cr and 5-HIAA:creatinine ratios (5-HIAA/Cr) were negatively correlated with lip licking in kennels. Baseline values for each parameter are presented. The emotional valence of the responses was ambiguous, and no definitive evidence was found to suggest that dogs were negatively stressed by kennelling. It was concluded that C/Cr and, particularly, VMA/Cr and surface temperature provide robust indicators of psychological arousal in dogs, while spontaneous behaviour might be better used to facilitate interpretation of physiological and physical data on an individual level. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Statistical classifiers on multifractal parameters for optical diagnosis of cervical cancer

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sabyasachi; Pratiher, Sawon; Kumar, Rajeev; Krishnamoorthy, Vigneshram; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2017-06-01

    An augmented set of multifractal parameters with physical interpretations has been proposed to quantify the varying distribution and shape of the multifractal spectrum. A statistical classifier with an accuracy of 84.17% validates the adequacy of multi-feature MFDFA characterization of elastic scattering spectroscopy for the optical diagnosis of cancer.

  7. The γ parameter of the stretched-exponential model is influenced by internal gradients: validation in phantoms.

    PubMed

    Palombo, Marco; Gabrielli, Andrea; De Santis, Silvia; Capuani, Silvia

    2012-03-01

    In this paper, we investigate the image contrast that characterizes anomalous and non-Gaussian diffusion images obtained using the stretched-exponential model. This model is based on the introduction of the stretching parameter γ, which quantifies the deviation from mono-exponential decay of the diffusion signal as a function of the b-value. To date, the biophysical substrate underpinning the contrast observed in γ maps - in other words, the biophysical interpretation of the γ parameter (or of the fractional order derivative in space, the β parameter) - is still not fully understood, although it has already been applied to investigate both animal models and the human brain. Due to the ability of γ maps to reflect additional microstructural information which cannot be obtained using diffusion procedures based on Gaussian diffusion, some authors propose this parameter as a measure of diffusion heterogeneity or water compartmentalization in biological tissues. Based on our recent work, we suggest here that the coupling between internal and diffusion gradients produces pseudo-superdiffusion effects which are quantified by the stretching exponential parameter γ. This means that the image contrast of Mγ maps reflects local magnetic susceptibility differences (Δχ_m), thus highlighting, better than T2* contrast, the interfaces between compartments characterized by different Δχ_m. Thanks to this characteristic, Mγ imaging may represent an interesting tool for developing contrast-enhanced MRI for molecular imaging. The spectroscopic and imaging experiments (performed in controlled micro-bead dispersions) that are reported here strongly suggest internal gradients, and as a consequence Δχ_m, to be an important factor in fully understanding the source of contrast in anomalous diffusion methods that are based on a stretched-exponential model analysis of diffusion data obtained at varying gradient strengths g. Copyright © 2012 Elsevier Inc. All rights reserved.
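
    The model referred to above writes the diffusion-weighted signal decay as a stretched exponential:

    ```latex
    % Stretched-exponential diffusion model: DDC is the distributed diffusion
    % coefficient and gamma the stretching parameter; gamma = 1 recovers
    % mono-exponential (Gaussian) decay.
    \[
      \frac{S(b)}{S_0} \;=\; \exp\!\left[ -\,(b\,\mathrm{DDC})^{\gamma} \right],
      \qquad 0 < \gamma \le 1 .
    \]
    ```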

  8. Development and Validation of the Behavioral Tendencies Questionnaire.

    PubMed

    Van Dam, Nicholas T; Brown, Anna; Mole, Tom B; Davis, Jake H; Britton, Willoughby B; Brewer, Judson A

    2015-01-01

    At a fundamental level, taxonomy of behavior and behavioral tendencies can be described in terms of approach, avoid, or equivocate (i.e., neither approach nor avoid). While there are numerous theories of personality, temperament, and character, few seem to take advantage of parsimonious taxonomy. The present study sought to implement this taxonomy by creating a questionnaire based on a categorization of behavioral temperaments/tendencies first identified in Buddhist accounts over fifteen hundred years ago. Items were developed using historical and contemporary texts of the behavioral temperaments, described as "Greedy/Faithful", "Aversive/Discerning", and "Deluded/Speculative". To both maintain this categorical typology and benefit from the advantageous properties of forced-choice response format (e.g., reduction of response biases), binary pairwise preferences for items were modeled using Latent Class Analysis (LCA). One sample (n1 = 394) was used to estimate the item parameters, and the second sample (n2 = 504) was used to classify the participants using the established parameters and cross-validate the classification against multiple other measures. The cross-validated measure exhibited good nomothetic span (construct-consistent relationships with related measures) that seemed to corroborate the ideas present in the original Buddhist source documents. The final 13-block questionnaire created from the best performing items (the Behavioral Tendencies Questionnaire or BTQ) is a psychometrically valid questionnaire that is historically consistent, based in behavioral tendencies, and promises practical and clinical utility particularly in settings that teach and study meditation practices such as Mindfulness Based Stress Reduction (MBSR).

  9. The multi-parameter remote measurement of rainfall

    NASA Technical Reports Server (NTRS)

    Atlas, D.; Ulbrich, C. W.; Meneghini, R.

    1982-01-01

    The measurement of rainfall by remote sensors is investigated. One-parameter radar rainfall measurement is limited because both reflectivity and rain rate depend on at least two parameters of the drop size distribution (DSD), i.e., representative raindrop size and number concentration. A generalized rain parameter diagram is developed which includes a third distribution parameter, the breadth of the DSD, to better specify rain rate and all possible remote variables. Simulations show the improvement in accuracy attainable through the use of combinations of two and three remote measurables. The spectrum of remote measurables is reviewed. These include the path-integrated techniques of radiometry and of microwave and optical attenuation.
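
    The underlying ambiguity can be made concrete with a gamma drop size distribution (a common parameterization; the paper's exact parameter diagram is not reproduced here), since reflectivity and rain rate weight different moments of the DSD:

    ```latex
    % Gamma DSD with intercept N_0, shape mu, and slope Lambda; reflectivity Z
    % and rain rate R are different moments of N(D) (v(D) is the drop fall
    % speed), so a single radar measurable cannot fix the rain rate by itself.
    \[
      N(D) = N_0\, D^{\mu} e^{-\Lambda D}, \qquad
      Z = \int_0^{\infty} N(D)\, D^{6}\, \mathrm{d}D, \qquad
      R = \frac{\pi}{6}\int_0^{\infty} N(D)\, D^{3}\, v(D)\, \mathrm{d}D .
    \]
    ```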

  10. Noise normalization and windowing functions for VALIDAR in wind parameter estimation

    NASA Astrophysics Data System (ADS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Li, Zhiwen

    2006-05-01

    The wind parameter estimates from a state-of-the-art 2-μm coherent lidar system located at NASA Langley, Virginia, named VALIDAR (validation lidar), were compared after normalizing the noise by its estimated power spectra via the periodogram and the linear predictive coding (LPC) scheme. The power spectra and the Doppler shift estimates were the main parameter estimates compared. Different types of windowing functions were implemented in the VALIDAR data processing algorithm and their impact on the wind parameter estimates was observed. Time- and frequency-independent windowing functions such as Rectangular, Hanning, and Kaiser-Bessel, and a time- and frequency-dependent apodized windowing function, were compared. A briefing on current nonlinear algorithm development for Doppler shift correction follows.

  11. Parameter estimation of the copernicus decompression model with venous gas emboli in human divers.

    PubMed

    Gutvik, Christian R; Dunford, Richard G; Dujic, Zeljko; Brubakk, Alf O

    2010-07-01

    Decompression Sickness (DCS) may occur when divers decompress from a hyperbaric environment. To prevent this, decompression procedures are used to get safely back to the surface. The models from which such procedures are calculated are traditionally validated using clinical symptoms as an endpoint. However, DCS is an uncommon phenomenon, and the wide variation in individual response to decompression stress is poorly understood. Generally, using clinical examination alone for validation is disadvantageous from a modeling perspective. Currently, the only objective and quantitative measure of decompression stress is Venous Gas Emboli (VGE), measured by either ultrasonic imaging or Doppler. VGE has been shown to be statistically correlated with DCS and is now widely used in science to evaluate decompression stress from a dive. Until recently, no mathematical model existed to predict VGE from a dive, which motivated the development of the Copernicus model. The present article compiles a selection of experimental dives and field data containing computer-recorded depth profiles associated with ultrasound measurements of VGE. It describes a parameter estimation problem to fit the model to these data. A total of 185 square bounce dives from DCIEM, Canada, 188 recreational dives with a mix of single, repetitive and multi-day exposures from DAN USA, and 84 experimentally designed decompression dives from Split, Croatia were used, giving a total of 457 dives. Five selected parameters in the Copernicus bubble model were assigned for estimation and a non-linear optimization problem was formalized with a weighted least-squares cost function. A bias factor for the DCIEM chamber dives was also included. A Quasi-Newton algorithm (BFGS) from the TOMLAB numerical package solved the problem, which was shown to be convex. With the parameter set presented in this article, Copernicus can be implemented in any programming language to estimate VGE from an air dive.
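
    The estimation set-up can be sketched as follows; the linear "model" below is a placeholder, not the Copernicus bubble model, and scipy stands in for the TOMLAB package used in the paper (all numbers invented):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Weighted least-squares fit of five model parameters to observed VGE,
    # solved with a Quasi-Newton (BFGS) method.
    rng = np.random.default_rng(3)
    dive_features = rng.uniform(0.0, 1.0, (457, 5))   # summarized dive profiles
    theta_true = np.array([1.2, -0.8, 0.5, 2.0, -1.5])

    def model(theta, x):
        return x @ theta                              # placeholder VGE prediction

    vge_obs = model(theta_true, dive_features) + rng.normal(0.0, 0.2, 457)
    weights = np.where(np.arange(457) < 185, 0.8, 1.0)  # e.g. bias factor on chamber dives

    def cost(theta):
        r = model(theta, dive_features) - vge_obs
        return np.sum(weights * r ** 2)

    res = minimize(cost, x0=np.zeros(5), method="BFGS")
    print(res.x)                                      # recovered parameters
    ```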

  12. Reliability and validity of gait analysis by android-based smartphone.

    PubMed

    Nishiguchi, Shu; Yamada, Minoru; Nagai, Koutatsu; Mori, Shuhei; Kajiwara, Yuu; Sonoda, Takuya; Yoshimura, Kazuya; Yoshitomi, Hiroyuki; Ito, Hiromu; Okamoto, Kazuya; Ito, Tatsuaki; Muto, Shinyo; Ishihara, Tatsuya; Aoyama, Tomoki

    2012-05-01

    Smartphones are very common devices in daily life that have a built-in tri-axial accelerometer. Similar to previously developed accelerometers, smartphones can be used to assess gait patterns. However, few gait analyses have been performed using smartphones, and their reliability and validity have not been evaluated yet. The purpose of this study was to evaluate the reliability and validity of a smartphone accelerometer. Thirty healthy young adults participated in this study. They walked 20 m at their preferred speeds, and their trunk accelerations were measured using a smartphone and a tri-axial accelerometer that was secured over the L3 spinous process. We developed a gait analysis application and installed it in the smartphone to measure the acceleration. After signal processing, we calculated the gait parameters of each measurement terminal: peak frequency (PF), root mean square (RMS), autocorrelation peak (AC), and coefficient of variance (CV) of the acceleration peak intervals. Remarkable consistency was observed in the test-retest reliability of all the gait parameter results obtained by the smartphone (p<0.001). All the gait parameter results obtained by the smartphone showed statistically significant and considerable correlations with the same parameter results obtained by the tri-axial accelerometer (PF r=0.99, RMS r=0.89, AC r=0.85, CV r=0.82; p<0.01). Our study indicates that the smartphone with gait analysis application used in this study has the capacity to quantify gait parameters with a degree of accuracy that is comparable to that of the tri-axial accelerometer.
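
    The four parameters named above can be computed from an acceleration trace roughly as follows (synthetic signal; peak-detection thresholds are illustrative, not those of the study's application):

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    fs = 100.0                                   # sampling rate [Hz]
    t = np.arange(0.0, 20.0, 1.0 / fs)
    rng = np.random.default_rng(4)
    acc = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.normal(size=t.size)  # trunk accel.

    # Peak frequency (PF): dominant frequency of the spectrum.
    spec = np.abs(np.fft.rfft(acc - acc.mean()))
    pf = np.fft.rfftfreq(acc.size, 1.0 / fs)[np.argmax(spec)]

    rms = np.sqrt(np.mean(acc ** 2))             # root mean square (RMS)

    # Autocorrelation peak (AC): first non-zero-lag peak of the normalized ACF.
    acf = np.correlate(acc, acc, mode="full")[acc.size - 1:]
    acf /= acf[0]
    ac = acf[find_peaks(acf, prominence=0.1)[0][0]]

    # Coefficient of variance (CV) of acceleration peak intervals.
    peaks, _ = find_peaks(acc, height=0.5, distance=int(0.3 * fs))
    intervals = np.diff(peaks) / fs
    cv = 100.0 * intervals.std() / intervals.mean()

    print(f"PF={pf:.2f} Hz  RMS={rms:.2f}  AC={ac:.2f}  CV={cv:.1f}%")
    ```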

  13. Including gauge-group parameters into the theory of interactions: an alternative mass-generating mechanism for gauge fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldaya, V.; Lopez-Ruiz, F. F.; Sanchez-Sastre, E.

    2006-11-03

    We reformulate the gauge theory of interactions by introducing the gauge group parameters into the model. The dynamics of the new 'Goldstone-like' bosons is accomplished through a non-linear {sigma}-model Lagrangian. They are minimally coupled according to a proper prescription which provides mass terms to the intermediate vector bosons without spoiling gauge invariance. The present formalism is explicitly applied to the Standard Model of electroweak interactions.

  14. Quantitative Determination of Spring Water Quality Parameters via Electronic Tongue.

    PubMed

    Carbó, Noèlia; López Carrero, Javier; Garcia-Castillo, F Javier; Tormos, Isabel; Olivas, Estela; Folch, Elisa; Alcañiz Fillol, Miguel; Soto, Juan; Martínez-Máñez, Ramón; Martínez-Bisbal, M Carmen

    2017-12-25

    The use of a voltammetric electronic tongue for the quantitative analysis of quality parameters in spring water is proposed here. The voltammetric electronic tongue consisted of a set of four noble-metal electrodes (iridium, rhodium, platinum, and gold) housed inside a stainless steel cylinder. These noble metals have high durability and are not demanding in maintenance, features required for the development of future automated equipment. A pulse voltammetry study was conducted in 83 spring water samples to determine concentrations of nitrate (range: 6.9-115 mg/L), sulfate (32-472 mg/L), fluoride (0.08-0.26 mg/L), chloride (17-190 mg/L), and sodium (11-94 mg/L) as well as pH (7.3-7.8). These parameters were also determined by routine analytical methods in the spring water samples. A partial least squares (PLS) analysis was run to obtain a model to predict these parameters. Orthogonal signal correction (OSC) was applied in the preprocessing step. Calibration (67%) and validation (33%) sets were selected randomly. The electronic tongue showed good predictive power for determining the concentrations of nitrate, sulfate, chloride, and sodium as well as pH, but displayed a lower R² and slope in the validation set for fluoride. Nitrate and fluoride concentrations were estimated with errors lower than 15%, whereas chloride, sulfate, and sodium concentrations as well as pH were estimated with errors below 10%.
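
    A minimal sketch of the calibration/validation scheme, omitting the OSC preprocessing step and using synthetic stand-ins for the 83 samples:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    # PLS regression from voltammetric responses X to water-quality
    # parameters Y (nitrate, sulfate, fluoride, chloride, sodium, pH),
    # with a random 67/33 calibration/validation split; data are synthetic.
    rng = np.random.default_rng(5)
    X = rng.normal(size=(83, 120))                   # pulse-voltammetry features
    Y = X @ rng.normal(size=(120, 6)) + rng.normal(0.0, 0.1, (83, 6))

    X_cal, X_val, Y_cal, Y_val = train_test_split(X, Y, test_size=0.33,
                                                  random_state=0)
    pls = PLSRegression(n_components=8).fit(X_cal, Y_cal)
    print("validation R^2:", pls.score(X_val, Y_val))
    ```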

  15. Development, Validation, and Verification of a Self-Assessment Tool to Estimate Agnibala (Digestive Strength).

    PubMed

    Singh, Aparna; Singh, Girish; Patwardhan, Kishor; Gehlot, Sangeeta

    2017-01-01

    According to Ayurveda, the traditional system of healthcare of Indian origin, Agni is the factor responsible for digestion and metabolism. Four functional states (Agnibala) of Agni have been recognized: regular, irregular, intense, and weak. The objective of the present study was to develop and validate a self-assessment tool to estimate Agnibala. The developed tool was evaluated for its reliability and validity by administering it to 300 healthy volunteers of either gender belonging to the 18 to 40-year age group. Besides confirming the statistical validity and reliability, the practical utility of the newly developed tool was also evaluated by recording the serum lipid parameters of all the volunteers. The results show that the lipid parameters vary significantly according to the status of Agni. The tool, therefore, may be used to screen the normal population to look for possible susceptibility to certain health conditions. © The Author(s) 2016.

  16. Selection, calibration, and validation of models of tumor growth.

    PubMed

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory

  17. PRA (Probabilistic Risk Assessments) Participation versus Validation

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Banke, Richard

    2013-01-01

    Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations. PRAs are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems, and failure data. This methodology essentially validates the risk parameters of the project or system design. For high-risk or high-dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate, or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process in which the PRA model is used to determine whether a mitigation technique effectively reduces risk, which can result in more efficient and cost-effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information that can positively influence final system and equipment design, the PRA tool can also participate in design development, helping deliver a safe and cost-effective product.

  18. Genetic programming-based mathematical modeling of influence of weather parameters in BOD5 removal by Lemna minor.

    PubMed

    Chandrasekaran, Sivapragasam; Sankararajan, Vanitha; Neelakandhan, Nampoothiri; Ram Kumar, Mahalakshmi

    2017-11-04

    This study, through extensive experiments and mathematical modeling, reveals that, in addition to retention time and wastewater temperature (T_w), atmospheric parameters also play an important role in the effective functioning of an aquatic macrophyte-based treatment system. The duckweed species Lemna minor is considered in this study. It is observed that the combined effect of atmospheric temperature (T_atm), wind speed (U_w), and relative humidity (RH) can be reflected through one parameter, namely the "apparent temperature" (T_a). A total of eight different models are considered based on combinations of input parameters, and the best mathematical model is identified and validated through a new experimental set-up outside the modeling period. The validation results are highly encouraging. Genetic programming (GP)-based models are found to reveal deeper understanding of the wetland process.
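
    One common formulation of apparent temperature (Steadman's form, as used by the Australian Bureau of Meteorology) combines exactly these three inputs; whether the study used this exact expression is an assumption:

    ```python
    import math

    def apparent_temperature(t_atm_c: float, rh_pct: float, wind_ms: float) -> float:
        """Steadman apparent temperature [C] from air temperature [C],
        relative humidity [%], and wind speed [m/s]; e is the water vapour
        pressure in hPa."""
        e = (rh_pct / 100.0) * 6.105 * math.exp(17.27 * t_atm_c / (237.7 + t_atm_c))
        return t_atm_c + 0.33 * e - 0.70 * wind_ms - 4.00

    print(apparent_temperature(30.0, 60.0, 2.0))
    ```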

  19. A Hardware Model Validation Tool for Use in Complex Space Systems

    NASA Technical Reports Server (NTRS)

    Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.

    2010-01-01

    One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.

  20. Resonance Parameter Adjustment Based on Integral Experiments

    DOE PAGES

    Sobes, Vladimir; Leal, Luiz; Arbanas, Goran; ...

    2016-06-02

    Our project seeks to allow coupling of differential and integral data evaluation in a continuous-energy framework and to use the generalized linear least-squares (GLLS) methodology in the TSURFER module of the SCALE code package to update the parameters of a resolved resonance region evaluation. Because the GLLS methodology in TSURFER is identical to the mathematical description of a Bayesian update in SAMMY, the SAMINT code was created to use the mathematical machinery of SAMMY to update resolved resonance parameters based on integral data. Traditionally, SAMMY used differential experimental data to adjust nuclear data parameters. Integral experimental data, such as in the International Criticality Safety Benchmark Experiments Project, remain a tool for validation of completed nuclear data evaluations. SAMINT extracts information from integral benchmarks to aid the nuclear data evaluation process. Integral data can then be used to resolve any remaining ambiguity between differential data sets, highlight troublesome energy regions, determine key nuclear data parameters for integral benchmark calculations, and improve the nuclear data covariance matrix evaluation. Moreover, SAMINT is not intended to bias nuclear data toward specific integral experiments but should be used to supplement the evaluation of differential experimental data. Using GLLS ensures proper weight is given to the differential data.
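
    The GLLS update described above has the same algebraic form as a standard Bayesian update of Gaussian-distributed parameters. A minimal sketch of that update (all matrices are illustrative placeholders, not SAMMY/TSURFER data structures):

      import numpy as np

      # Prior parameters p0 with covariance C, sensitivity matrix S of integral
      # responses to parameters, measured responses m with covariance V, and
      # computed responses t(p0). All values here are illustrative.
      p0 = np.array([1.00, 2.50])
      C = np.diag([0.01, 0.04])
      S = np.array([[0.8, 0.3],
                    [0.2, 0.9]])
      V = np.diag([0.02, 0.02])
      m = np.array([1.85, 2.55])
      t = S @ p0                      # linearized model prediction

      # GLLS / Bayesian update: gain K = C S^T (S C S^T + V)^-1
      K = C @ S.T @ np.linalg.inv(S @ C @ S.T + V)
      p1 = p0 + K @ (m - t)           # updated parameters
      C1 = C - K @ S @ C              # updated (reduced) covariance
      print(p1, np.diag(C1))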

  1. Optimal SVM parameter selection for non-separable and unbalanced datasets.

    PubMed

    Jiang, Peng; Missoum, Samy; Chen, Zhao

    2014-10-01

    This article presents a study of three validation metrics used for the selection of optimal parameters of a support vector machine (SVM) classifier in the case of non-separable and unbalanced datasets. This situation is often encountered when the data is obtained experimentally or clinically. The three metrics selected in this work are the area under the ROC curve (AUC), accuracy, and balanced accuracy. These validation metrics are tested using computational data only, which enables the creation of fully separable sets of data. This way, non-separable datasets, representative of a real-world problem, can be created by projection onto a lower dimensional sub-space. The knowledge of the separable dataset, unknown in real-world problems, provides a reference to compare the three validation metrics using a quantity referred to as the "weighted likelihood". As an application example, the study investigates a classification model for hip fracture prediction. The data is obtained from a parameterized finite element model of a femur. The performance of the various validation metrics is studied for several levels of separability, ratios of unbalance, and training set sizes.
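
    For reference, all three validation metrics compared in this study are available in scikit-learn; a minimal sketch on synthetic unbalanced data (an illustrative stand-in, not the authors' code):

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC
      from sklearn.metrics import roc_auc_score, accuracy_score, balanced_accuracy_score

      # Synthetic, unbalanced, non-separable data (illustrative stand-in).
      X, y = make_classification(n_samples=2000, weights=[0.9, 0.1],
                                 class_sep=0.5, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

      clf = SVC(C=1.0, gamma="scale", probability=True).fit(X_tr, y_tr)
      scores = clf.predict_proba(X_te)[:, 1]
      labels = clf.predict(X_te)

      print("AUC              :", roc_auc_score(y_te, scores))
      print("accuracy         :", accuracy_score(y_te, labels))
      print("balanced accuracy:", balanced_accuracy_score(y_te, labels))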

  2. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four
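
    The model-averaging step described here can be illustrated with information-criterion weights; a toy sketch (hypothetical scores and predictions, not the report's implementation):

      import numpy as np

      # Hypothetical BIC-like scores for seven alternative variogram models and
      # each model's prediction of a quantity of interest (illustrative values).
      scores = np.array([102.1, 103.4, 104.0, 110.9, 118.2, 125.5, 131.0])
      preds = np.array([-13.2, -13.0, -13.5, -12.8, -13.9, -12.5, -14.1])

      # Posterior model probabilities from score differences (Occam-like weights).
      delta = scores - scores.min()
      prob = np.exp(-0.5 * delta)
      prob /= prob.sum()

      # Models with negligible updated probability are dropped; the rest averaged.
      keep = prob > 0.01
      p = prob[keep] / prob[keep].sum()
      print("retained models:", np.nonzero(keep)[0], "weights:", p)
      print("model-averaged prediction:", np.dot(p, preds[keep]))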

  3. Shift Verification and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes, and found very good agreement in a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident in Shift's ability to provide reference results for CASL benchmarking.

  4. Experimental validation of the RATE tool for inferring HLA restrictions of T cell epitopes.

    PubMed

    Paul, Sinu; Arlehamn, Cecilia S Lindestam; Schulten, Veronique; Westernberg, Luise; Sidney, John; Peters, Bjoern; Sette, Alessandro

    2017-06-21

    The RATE tool was recently developed to computationally infer the HLA restriction of given epitopes from immune response data of HLA-typed subjects without additional cumbersome experimentation. Here, RATE was validated using experimentally defined restriction data from a set of 191 tuberculosis-derived epitopes and 63 healthy individuals with MTB infection from the Western Cape Region of South Africa. Using this experimental dataset, the parameters utilized by the RATE tool to infer restriction were optimized; these included the relative frequency (RF) of subjects responding to a given epitope and expressing a given allele, as compared to the general test population, and the associated p-value in a Fisher's exact test. We also examined the potential for further optimization based on the predicted binding affinity of epitopes to potential restricting HLA alleles, and on the absolute number of individuals expressing a given allele and responding to the specific epitope. Different statistical measures, including the Matthews correlation coefficient, accuracy, sensitivity and specificity, were used to evaluate the performance of RATE as a function of these criteria. Based on our results we recommend selection of HLA restrictions with cutoffs of p-value < 0.01 and RF ≥ 1.3. The usefulness of the tool was demonstrated by inferring new HLA restrictions for epitope sets where restrictions could not be experimentally determined due to lack of the necessary cell lines, and for an additional data set related to recognition of pollen-derived epitopes in allergic patients. Experimental data sets were used to validate the RATE tool, and the parameters used by the tool to infer restriction were optimized. New HLA restrictions were identified using the optimized RATE tool.
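
    The two selection cutoffs recommended above (p-value < 0.01 from Fisher's exact test and RF ≥ 1.3) are easy to express compactly; a hedged sketch with illustrative counts (the function and the exact RF convention below are assumptions, not the published RATE code):

      from scipy.stats import fisher_exact

      def infer_restriction(resp_with, resp_without, nonresp_with, nonresp_without,
                            p_cut=0.01, rf_cut=1.3):
          """Apply RATE-style cutoffs to one epitope/allele pair.

          The 2x2 table crosses response to the epitope with expression of the
          allele; counts here are illustrative, not the published data."""
          table = [[resp_with, resp_without],
                   [nonresp_with, nonresp_without]]
          _, p = fisher_exact(table, alternative="greater")
          n_resp = resp_with + resp_without
          n_allele = resp_with + nonresp_with
          n_total = sum(sum(row) for row in table)
          # RF: allele frequency among responders relative to the whole population.
          rf = (resp_with / n_resp) / (n_allele / n_total)
          return p < p_cut and rf >= rf_cut, p, rf

      print(infer_restriction(9, 3, 10, 41))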

  5. PV systems photoelectric parameters determining for field conditions and real operation conditions

    NASA Astrophysics Data System (ADS)

    Shepovalova, Olga V.

    2018-05-01

    In this work, research experience and reference documentation related to the determination of PV system photoelectric parameters (PV array output parameters) have been generalized. A basic method is presented that makes it possible to determine photoelectric parameters with state-of-the-art reliability and repeatability. This method provides an effective tool for comparing PV systems and for evaluating the PV system parameters that the end-user will obtain in the course of real operation, for compliance with those stipulated in the reference documentation. The method takes into consideration all parameters that may possibly affect photoelectric performance and that are supported by sufficiently valid procedures for testing their values. Test conditions and requirements for the equipment subject to tests and for test preparations have been established, and the test procedure for a fully equipped PV system in field tests and in real operating conditions has been described.

  6. Validation of Reverse-Engineered and Additive-Manufactured Microsurgical Instrument Prototype.

    PubMed

    Singh, Ramandeep; Suri, Ashish; Anand, Sneh; Baby, Britty

    2016-12-01

    With advancements in imaging techniques, neurosurgical procedures are becoming highly precise and minimally invasive, thus demanding the development of new ergonomically aesthetic instruments. Conventionally, neurosurgical instruments are manufactured using subtractive manufacturing methods. Such a process is complex, time-consuming, and impractical for prototype development and validation of new designs. Therefore, an alternative design process has been used utilizing blue light scanning, computer-aided design, and additive manufacturing by direct metal laser sintering (DMLS) for microsurgical instrument prototype development. Deviations of the DMLS-fabricated instrument were studied by superimposing scan data of the fabricated instrument with the computer-aided design model. Content and concurrent validity of the fabricated prototypes was assessed by a group of 15 neurosurgeons performing sciatic nerve anastomosis in small laboratory animals. Comparative scoring was obtained for the control and study instruments. A t test was applied to the individual parameters, and the P values for force (P < .0001) and surface roughness (P < .01) were found to be statistically significant. These 2 parameters were further analyzed using objective measures. The results show that additive manufacturing by DMLS provides an effective method for prototype development. However, direct application of these additive-manufactured instruments in the operating room requires further validation. © The Author(s) 2016.

  7. Dengue score as a diagnostic predictor for pleural effusion and/or ascites: external validation and clinical application.

    PubMed

    Suwarto, Suhendro; Hidayat, Mohammad Jauharsyah; Widjaya, Bing

    2018-02-23

    The Dengue Score is a model for predicting pleural effusion and/or ascites and uses the hematocrit (Hct), albumin concentration, platelet count and aspartate aminotransferase (AST) ratio as independent variables. As this metric had not been validated, we conducted a study to validate the Dengue Score and assess its clinical application. A retrospective study was performed at a private hospital in Jakarta, Indonesia. Patients with dengue infection hospitalized from January 2011 through March 2016 were included. The Dengue Score was calculated using four parameters: Hct increase ≥ 15.1%, serum albumin ≤ 3.49 g/dL, platelet count ≤ 49,500/μL and AST ratio ≥ 2.51. Each parameter was scored as 1 if present and 0 if absent. To validate the Dengue Score, goodness-of-fit was used to assess calibration, and the area under the receiver operating characteristic curve (AROC) was used to assess discrimination. Associations between clinical parameters and Dengue Score groups were determined by bivariate analysis. A total of 207 patients were included in this study. The calibration of the Dengue Score was acceptable (Hosmer-Lemeshow test, p = 0.11), and the score's discriminative ability was good (AROC = 0.88 (95% CI: 0.83-0.92)). At a cutoff of ≥ 2, the Dengue Score had a positive predictive value (PPV) of 79.03% and a negative predictive value (NPV) of 90.36% for the diagnostic prediction of pleural effusion and/or ascites. Compared with the Dengue Score ≤ 1 group, the Dengue Score = 2 group was significantly associated with hemoconcentration > 20% (p = 0.029), severe thrombocytopenia (p = 0.029), and increased length of hospital stay (p = 0.003). Compared with the Dengue Score = 2 group, the Dengue Score ≥ 3 group was significantly associated with hemoconcentration > 20% (p = 0.001), severe thrombocytopenia (p = 0.024), severe dengue (p = 0.039), and increased length of hospital stay (p = 0.011). The Dengue Score performed well and can
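
    The score itself is a simple sum of four binary indicators; a minimal sketch of the scoring rule as described in the abstract (the function name is hypothetical):

      def dengue_score(hct_increase_pct, albumin_g_dl, platelets_per_ul, ast_ratio):
          """Sum of the four binary Dengue Score parameters (1 if present, 0 if absent)."""
          return (int(hct_increase_pct >= 15.1) +
                  int(albumin_g_dl <= 3.49) +
                  int(platelets_per_ul <= 49500) +
                  int(ast_ratio >= 2.51))

      score = dengue_score(hct_increase_pct=18.0, albumin_g_dl=3.2,
                           platelets_per_ul=42000, ast_ratio=3.1)
      # A score >= 2 predicted pleural effusion and/or ascites in the validation cohort.
      print(score, "-> predicted effusion/ascites:", score >= 2)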

  8. Remarks on CFD validation: A Boeing Commercial Airplane Company perspective

    NASA Technical Reports Server (NTRS)

    Rubbert, Paul E.

    1987-01-01

    Requirements and meaning of validation of computational fluid dynamics codes are discussed. Topics covered include: validating a code, validating a user, and calibrating a code. All results are presented in viewgraph format.

  9. 40 CFR 761.392 - Preparing validation study samples.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... establish a surface concentration to be included in the standard operating procedure. The surface levels of... Under § 761.79(d)(4) § 761.392 Preparing validation study samples. (a)(1) To validate a procedure to... surfaces must be ≥20 µg/100 cm2. (2) To validate a procedure to decontaminate a specified surface...

  10. 40 CFR 761.392 - Preparing validation study samples.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... establish a surface concentration to be included in the standard operating procedure. The surface levels of... Under § 761.79(d)(4) § 761.392 Preparing validation study samples. (a)(1) To validate a procedure to... surfaces must be ≥20 µg/100 cm2. (2) To validate a procedure to decontaminate a specified surface...

  11. 40 CFR 761.392 - Preparing validation study samples.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... establish a surface concentration to be included in the standard operating procedure. The surface levels of... Under § 761.79(d)(4) § 761.392 Preparing validation study samples. (a)(1) To validate a procedure to... surfaces must be ≥20 µg/100 cm2. (2) To validate a procedure to decontaminate a specified surface...

  12. Experimental validation of plastic constitutive hardening relationship based upon the direction of the Net Burgers Density Vector

    NASA Astrophysics Data System (ADS)

    Sarac, Abdulhamit; Kysar, Jeffrey W.

    2018-02-01

    We present a new methodology for experimental validation of single crystal plasticity constitutive relationships based upon spatially resolved measurements of the direction of the Net Burgers Density Vector, which we refer to as the β-field. The β-variable contains information about the active slip systems as well as the ratios of the Geometrically Necessary Dislocation (GND) densities on the active slip systems. We demonstrate the methodology by comparing single crystal plasticity finite element simulations of plane strain wedge indentations into face-centered cubic nickel to detailed experimental measurements of the β-field. We employ the classical Peirce-Asaro-Needleman (PAN) hardening model in this study due to the straightforward physical interpretation of its constitutive parameters, which include the latent hardening ratio, the initial hardening modulus and the saturation stress. The saturation stress and the initial hardening modulus have a relatively large influence on the β-variable compared to the latent hardening ratio. A change in the initial hardening modulus leads to a shift in the boundaries of plastic slip sectors within the plastically deforming region. As the saturation strength varies, both the magnitude of the β-variable and the boundaries of the plastic slip sectors change. We thus demonstrate that the β-variable is sensitive to changes in the constitutive parameters, making the variable suitable for validation purposes. We identify a set of constitutive parameters that are consistent with the β-field obtained from the experiment.

  13. Ground-water models: Validate or invalidate

    USGS Publications Warehouse

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  14. Validation of Digital Spiral Analysis as Outcome Parameter for Clinical Trials in Essential Tremor

    PubMed Central

    Haubenberger, Dietrich; Kalowitz, Daniel; Nahab, Fatta B.; Toro, Camilo; Ippolito, Dominic; Luckenbaugh, David A.; Wittevrongel, Loretta; Hallett, Mark

    2014-01-01

    Essential tremor, one of the most prevalent movement disorders, is characterized by kinetic and postural tremor affecting activities of daily living. Spiral drawing is commonly used to visually rate tremor intensity, as part of the routine clinical assessment of tremor and as a tool in clinical trials. We present a strategy to quantify tremor severity from spirals drawn on a digitizing tablet. We validate our method against a well-established visual spiral rating method and compare both methods on their capacity to capture a therapeutic effect, as defined by the change in clinical essential tremor rating scale after an ethanol challenge. Fifty-four Archimedes spirals were drawn using a digitizing tablet by nine ethanol-responsive patients with essential tremor before and at five consecutive time points after the administration of ethanol in a standardized treatment intervention. Quantitative spiral tremor severity was estimated from the velocity tremor peak amplitude after numerical derivation and Fourier transformation of pen-tip positions. In randomly ordered sets, spirals were scored by seven trained raters, using Bain and Findley’s 0 to 10 rating scale. Computerized scores correlated with visual ratings (P < 0.0001). The correlation was significant at each time point before and after ethanol (P < 0.005). Quantitative ratings provided better sensitivity than visual rating to capture the effects of an ethanol challenge (P < 0.05). Using a standardized treatment approach, we were able to demonstrate that spirography time-series analysis is a valid, reliable method to document tremor intensity and a more sensitive measure for small effects than currently available visual spiral rating methods. PMID:21714004
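
    The quantitative measure described, the peak amplitude of the velocity spectrum in the tremor band, can be sketched as follows (synthetic pen-tip data; the sampling rate and band limits are assumptions):

      import numpy as np

      fs = 200.0                                   # tablet sampling rate, Hz (assumed)
      t = np.arange(0, 10, 1 / fs)
      # Synthetic pen-tip x-position: slow spiral drift plus a 5 Hz tremor.
      x = 0.5 * t + 0.8 * np.sin(2 * np.pi * 5.0 * t)

      v = np.gradient(x, 1 / fs)                   # numerical derivation -> velocity
      spec = np.abs(np.fft.rfft(v - v.mean())) / len(v)
      freq = np.fft.rfftfreq(len(v), 1 / fs)

      band = (freq >= 3) & (freq <= 12)            # tremor band (assumed limits)
      peak = spec[band].max()
      print("tremor peak at %.1f Hz, amplitude %.3f"
            % (freq[band][spec[band].argmax()], peak))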

  15. Perceptual control models of pursuit manual tracking demonstrate individual specificity and parameter consistency.

    PubMed

    Parker, Maximilian G; Tyson, Sarah F; Weightman, Andrew P; Abbott, Bruce; Emsley, Richard; Mansell, Warren

    2017-11-01

    Computational models that simulate individuals' movements in pursuit-tracking tasks have been used to elucidate mechanisms of human motor control. Whilst there is evidence that individuals demonstrate idiosyncratic control-tracking strategies, it remains unclear whether models can be sensitive to these idiosyncrasies. Perceptual control theory (PCT) provides a unique model architecture with an internally set reference value parameter, and can be optimized to fit an individual's tracking behavior. The current study investigated whether PCT models could show temporal stability and individual specificity over time. Twenty adults completed three blocks of fifteen 1-min pursuit-tracking trials. Two blocks (training and post-training) were completed in one session and the third was completed after 1 week (follow-up). The target moved in a one-dimensional, pseudorandom pattern. PCT models were optimized to the training data using a least-mean-squares algorithm, and validated with data from post-training and follow-up. We found significant inter-individual variability (partial η²: .464-.697) and intra-individual consistency (Cronbach's α: .880-.976) in parameter estimates. Polynomial regression revealed that all model parameters, including the reference value parameter, contribute to simulation accuracy. Participants' tracking performances were significantly more accurately simulated by models developed from their own tracking data than by models developed from other participants' data. We conclude that PCT models can be optimized to simulate the performance of an individual and that the test-retest reliability of individual models is a necessary criterion for evaluating computational models of human performance.
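
    The core PCT architecture, in which output acts to keep a perceived quantity at an internally set reference value, can be sketched in a few lines (the gain, reference, and target signal below are illustrative, not the fitted models):

      import numpy as np

      fs = 60                                    # display/sampling rate, Hz (assumed)
      t = np.arange(0, 60, 1 / fs)
      # Pseudorandom one-dimensional target, as in the tracking task.
      target = np.cumsum(np.random.default_rng(0).normal(0, 0.05, t.size))

      gain, ref, cursor = 8.0, 0.0, 0.0          # ref: internally set reference value
      trace = np.empty(t.size)
      for i in range(t.size):
          perception = cursor - target[i]        # perceived tracking-error signal
          error = ref - perception               # discrepancy from the reference
          cursor += gain * error / fs            # output integrates to reduce error
          trace[i] = cursor

      print("RMS tracking error:", np.sqrt(np.mean((trace - target) ** 2)))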

  16. Experimental validation of a new heterogeneous mechanical test design

    NASA Astrophysics Data System (ADS)

    Aquino, J.; Campos, A. Andrade; Souto, N.; Thuillier, S.

    2018-05-01

    Standard material parameter identification strategies generally use an extensive number of classical tests for collecting the required experimental data. However, a great effort has been made recently by the scientific and industrial communities to base this experimental database on heterogeneous tests. These tests can provide richer information on the material behavior, allowing the identification of a more complete set of material parameters. This is a result of the recent development of full-field measurement techniques, like digital image correlation (DIC), that can capture the heterogeneous deformation fields on the specimen surface during the test. Recently, new specimen geometries were designed to enhance the richness of the strain field and capture supplementary strain states. The butterfly specimen is an example of these new geometries, designed through a numerical optimization procedure based on an indicator capable of evaluating the heterogeneity and the richness of the strain information. However, no experimental validation had yet been performed. The aim of this work is to experimentally validate the heterogeneous butterfly mechanical test in the parameter identification framework. To this end, the DIC technique and a Finite Element Model Updating (FEMU) inverse strategy are used together for the parameter identification of a DC04 steel, as well as for the calculation of the indicator. The experimental tests are carried out in a universal testing machine with the ARAMIS measuring system to provide the strain states on the specimen surface. The identification strategy is applied to the data obtained from the experimental tests, and the results are compared to a reference numerical solution.

  17. A critical analysis of test-retest reliability in instrument validation studies of cancer patients under palliative care: a systematic review

    PubMed Central

    2014-01-01

    Background Patient-reported outcome validation needs to achieve validity and reliability standards. Among reliability analysis parameters, test-retest reliability is an important psychometric property. Retested patients must be in a clinically stable condition. This is particularly problematic in palliative care (PC) settings because advanced cancer patients are prone to a faster rate of clinical deterioration. The aim of this study was to evaluate the methods by which multi-symptom and health-related quality of life (HRQoL) patient-reported outcomes (PROs) have been validated in oncological PC settings with regard to test-retest reliability. Methods A systematic search of PubMed (1966 to June 2013), EMBASE (1980 to June 2013), PsychInfo (1806 to June 2013), CINAHL (1980 to June 2013), and SCIELO (1998 to June 2013), and specific PRO databases was performed. Studies were included if they described a set of validation studies for an instrument developed to measure multi-symptom or multidimensional HRQoL in advanced cancer patients under PC. The COSMIN checklist was used to rate the methodological quality of the study designs. Results We identified 89 validation studies from 746 potentially relevant articles. Of those 89 articles, 31 measured test-retest reliability and were included in this review. Upon critical analysis of the overall quality of the criteria used to determine the test-retest reliability, 6 (19.4%), 17 (54.8%), and 8 (25.8%) of these articles were rated as good, fair, or poor, respectively, and no article was classified as excellent. Multi-symptom instruments were retested over a shorter interval than the HRQoL instruments (median values 24 hours and 168 hours, respectively; p = 0.001). Validation studies that included objective confirmation of clinical stability in their design yielded better results for the test-retest analysis with regard to both

  18. Challenges in validating model results for first year ice

    NASA Astrophysics Data System (ADS)

    Melsom, Arne; Eastwood, Steinar; Xie, Jiping; Aaboe, Signe; Bertino, Laurent

    2017-04-01

    In order to assess the quality of model results for the distribution of first-year ice, a comparison with a product based on observations from satellite-borne instruments has been performed. Such a comparison is not straightforward due to the contrasting algorithms that are used in the model product and the remote sensing product. The implementation of the validation is discussed in light of the differences between this set of products, and validation results are presented. The model product is the daily updated 10-day forecast from the Arctic Monitoring and Forecasting Centre in CMEMS. The forecasts are produced with the assimilative ocean prediction system TOPAZ. Presently, observations of sea ice concentration and sea ice drift are introduced in the assimilation step, but data for sea ice thickness and ice age (or roughness) are not included. The model computes the age of the ice by recording and updating the time passed after ice formation as sea ice grows and deteriorates while it is advected inside the model domain. Ice that is younger than 365 days is classified as first-year ice. The fraction of first-year ice is recorded as a tracer in each grid cell. The Ocean and Sea Ice Thematic Assembly Centre in CMEMS redistributes a daily product from the EUMETSAT OSI SAF of gridded sea ice conditions which includes "ice type", a representation of the separation between regions covered by first-year ice and those covered by multi-year ice. The ice type is parameterized based on data for the gradient ratio GR(19,37) from SSMIS observations and from the ASCAT backscatter parameter. This product also includes information on ambiguity in the processing of the remote sensing data, and on the product's confidence level, both of which have a strong seasonal dependency.

  19. Mapping Surface Cover Parameters Using Aggregation Rules and Remotely Sensed Cover Classes. Version 1.9

    NASA Technical Reports Server (NTRS)

    Arain, Altaf M.; Shuttleworth, W. James; Yang, Z-Liang; Michaud, Jene; Dolman, Johannes

    1997-01-01

    A coupled model, which combines the Biosphere-Atmosphere Transfer Scheme (BATS) with an advanced atmospheric boundary-layer model, was used to validate hypothetical aggregation rules for BATS-specific surface cover parameters. The model was initialized and tested with observations from the Anglo-Brazilian Amazonian Climate Observational Study and used to simulate surface fluxes for rain forest and pasture mixes at a site near Manaus in Brazil. The aggregation rules are shown to estimate parameters which give area-average surface fluxes similar to those calculated with explicit representation of forest and pasture patches for a range of meteorological and surface conditions relevant to this site, but the agreement deteriorates somewhat when there are large patch-to-patch differences in soil moisture. The aggregation rules, validated as above, were then applied to a remotely sensed 1 km land cover data set to obtain grid-average values of BATS vegetation parameters for 2.8 deg x 2.8 deg and 1 deg x 1 deg grids within the conterminous United States. There are significant differences in key vegetation parameters (aerodynamic roughness length, albedo, leaf area index, and stomatal resistance) when aggregate parameters are compared to parameters for the single, dominant cover within the grid. However, the surface energy fluxes calculated by stand-alone BATS with the 2-year forcing data from the International Satellite Land Surface Climatology Project (ISLSCP) CD-ROM were reasonably similar using aggregate-vegetation parameters and dominant-cover parameters, although there were some significant differences, particularly in the western USA.
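
    As an illustration of this kind of parameter aggregation (the actual BATS rules are not reproduced here), a common convention is area-weighted averaging, with the roughness length averaged in log space:

      import numpy as np

      # Fractions of the grid cell covered by each class, with per-class parameters
      # (illustrative values; the BATS aggregation rules themselves are more detailed).
      frac = np.array([0.6, 0.4])        # forest, pasture
      albedo = np.array([0.12, 0.20])
      lai = np.array([5.0, 2.0])         # leaf area index
      z0 = np.array([2.0, 0.05])         # aerodynamic roughness length, m

      agg_albedo = np.sum(frac * albedo)            # linear area weighting
      agg_lai = np.sum(frac * lai)
      agg_z0 = np.exp(np.sum(frac * np.log(z0)))    # logarithmic averaging for z0
      print(agg_albedo, agg_lai, agg_z0)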

  20. Parameter estimation for lithium ion batteries

    NASA Astrophysics Data System (ADS)

    Santhanagopalan, Shriram

    With demand for lithium-based batteries increasing at a rate of about 7% per year, the amount of effort put into improving the performance of these batteries from both experimental and theoretical perspectives is increasing. There exist a number of mathematical models, ranging from simple empirical models to complicated physics-based models, to describe the processes leading to failure of these cells. The literature is also rife with experimental studies that characterize the various properties of the system in an attempt to improve the performance of lithium ion cells. However, very little has been done to quantify the experimental observations and relate these results to the existing mathematical models. In fact, the best of the physics-based models in the literature show as much as 20% discrepancy when compared to experimental data. The reasons for such a big difference include, but are not limited to, numerical complexities involved in extracting parameters from experimental data and inconsistencies in interpreting directly measured values for the parameters. In this work, an attempt has been made to implement simplified models to extract parameter values that accurately characterize the performance of lithium ion cells. The validity of these models under a variety of experimental conditions is verified using a model discrimination procedure. Transport and kinetic properties are estimated using a non-linear estimation procedure. The initial state of charge inside each electrode is also maintained as an unknown parameter, since this value plays a significant role in accurately matching experimental charge/discharge curves with model predictions and is not readily known from experimental data. The second part of the dissertation focuses on parameters that change rapidly with time. For example, in the case of lithium ion batteries used in Hybrid Electric Vehicle (HEV) applications, the prediction of the State of Charge (SOC) of the cell under a variety of
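
    The kind of nonlinear estimation described can be sketched with a simple empirical discharge model (the model form, parameter names, and data below are illustrative, not the dissertation's physics-based model):

      import numpy as np
      from scipy.optimize import least_squares

      def voltage_model(params, capacity):
          """Toy empirical discharge curve: V = E0 - R*I - k/(Q - q)."""
          E0, R, k, Q = params
          I = 1.0                                  # constant discharge current, A (assumed)
          return E0 - R * I - k / (Q - capacity)

      # Synthetic "measured" discharge data from known parameters plus noise.
      rng = np.random.default_rng(0)
      true = np.array([4.1, 0.05, 0.4, 2.2])
      q = np.linspace(0.0, 1.8, 50)
      v_meas = voltage_model(true, q) + rng.normal(0, 0.01, q.size)

      # Non-linear least-squares estimation of the four parameters.
      fit = least_squares(lambda p: voltage_model(p, q) - v_meas,
                          x0=[4.0, 0.1, 0.2, 2.5])
      print("estimated parameters:", fit.x)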

  1. Estimates of the atmospheric parameters of M-type stars: a machine-learning perspective

    NASA Astrophysics Data System (ADS)

    Sarro, L. M.; Ordieres-Meré, J.; Bello-García, A.; González-Marcos, A.; Solano, E.

    2018-05-01

    Estimating the atmospheric parameters of M-type stars has been a difficult task due to the lack of simple diagnostics in the stellar spectra. We aim at uncovering good sets of predictive features of stellar atmospheric parameters (Teff, log(g), [M/H]) in spectra of M-type stars. We define two types of potential features (equivalent widths and integrated flux ratios) able to explain the atmospheric physical parameters. We search the space of feature sets using a genetic algorithm that evaluates solutions by their prediction performance in the framework of the BT-Settl library of stellar spectra. Thereafter, we construct eight regression models using different machine-learning techniques and compare their performances with those obtained using the classical χ2 approach and independent component analysis (ICA) coefficients. Finally, we validate the various alternatives using two sets of real spectra from the NASA Infrared Telescope Facility (IRTF) and Dwarf Archives collections. We find that the cross-validation errors are poor measures of the performance of regression models in the context of physical parameter prediction in M-type stars. For R ≈ 2000 spectra with signal-to-noise ratios typical of the IRTF and Dwarf Archives, feature selection with genetic algorithms or alternative techniques produces only marginal advantages with respect to representation spaces that are unconstrained in wavelength (full spectrum or ICA). We make available the atmospheric parameters for the two collections of observed spectra as online material.

  2. Calibration and validation of a general infiltration model

    NASA Astrophysics Data System (ADS)

    Mishra, Surendra Kumar; Ranjan Kumar, Shashi; Singh, Vijay P.

    1999-08-01

    A general infiltration model proposed by Singh and Yu (1990) was calibrated and validated using a split sampling approach for 191 sets of infiltration data observed in the states of Minnesota and Georgia in the USA. Of the five model parameters, fc (the final infiltration rate), So (the available storage space) and exponent n were found to be more predictable than the other two parameters: m (exponent) and a (proportionality factor). A critical examination of the general model revealed that it is related to the Soil Conservation Service (1956) curve number (SCS-CN) method and its parameter So is equivalent to the potential maximum retention of the SCS-CN method and is, in turn, found to be a function of soil sorptivity and hydraulic conductivity. The general model was found to describe infiltration rate with time varying curve number.

  3. Multi-Sensor Observations of Earthquake Related Atmospheric Signals over Major Geohazard Validation Sites

    NASA Technical Reports Server (NTRS)

    Ouzounov, D.; Pulinets, S.; Davindenko, D.; Hattori, K.; Kafatos, M.; Taylor, P.

    2012-01-01

    We are conducting a scientific validation study involving multi-sensor observations in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several atmospheric and environmental parameters which we found to be associated with earthquakes, namely: thermal infrared radiation, outgoing long-wavelength radiation, ionospheric electron density, and atmospheric temperature and humidity. For the first time, we applied this approach to selected GEOSS sites prone to earthquakes or volcanoes. This provides a new opportunity to cross-validate our results with the dense networks of in-situ and space measurements. We investigated two different seismic aspects. First, we considered sites with recent large earthquakes, viz. Tohoku-oki (M9, 2011, Japan) and the Emilia region (M5.9, 2012, N. Italy); our retrospective analysis of satellite data has shown the presence of anomalies in the atmosphere. Second, we performed a retrospective analysis to check the re-occurrence of similar anomalous behavior in the atmosphere/ionosphere over three regions with distinct geological settings and high seismicity: Taiwan, Japan and Kamchatka, covering 40 major earthquakes (M>5.9) for the period 2005-2009. We found anomalous behavior before all of these events, with no false negatives; false positives were less than 10%. Our initial results suggest that multi-instrument space-borne and ground observations show a systematic appearance of atmospheric anomalies near the epicentral area that could be explained by a coupling between the observed physical parameters and earthquake preparation processes.

  4. The cross-validated AUC for MCP-logistic regression with high-dimensional data.

    PubMed

    Jiang, Dingfeng; Huang, Jian; Zhang, Ying

    2013-10-01

    We propose a cross-validated area under the receiver operating characteristic (ROC) curve (CV-AUC) criterion for tuning parameter selection for penalized methods in sparse, high-dimensional logistic regression models. We use this criterion in combination with the minimax concave penalty (MCP) method for variable selection. The CV-AUC criterion is specifically designed for optimizing the classification performance for binary outcome data. To implement the proposed approach, we derive an efficient coordinate descent algorithm to compute the MCP-logistic regression solution surface. Simulation studies are conducted to evaluate the finite sample performance of the proposed method and to compare it with existing methods, including the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the extended BIC (EBIC). The model selected based on the CV-AUC criterion tends to have a larger predictive AUC and a smaller classification error than those with tuning parameters selected using the AIC, BIC or EBIC. We illustrate the application of MCP-logistic regression with the CV-AUC criterion on three microarray datasets from studies that attempt to identify genes related to cancers. Our simulation studies and data examples demonstrate that the CV-AUC is an attractive method for tuning parameter selection for penalized methods in high-dimensional logistic regression models.
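
    The tuning-parameter selection by CV-AUC can be sketched with scikit-learn, which offers an L1-penalized logistic regression rather than MCP (MCP itself lives in specialized packages); the sketch below only illustrates the criterion:

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      # Sparse, high-dimensional synthetic data (illustrative stand-in).
      X, y = make_classification(n_samples=200, n_features=1000, n_informative=10,
                                 random_state=0)

      # Select the penalty strength that maximizes cross-validated AUC.
      grid = np.logspace(-2, 1, 10)
      cv_auc = [cross_val_score(LogisticRegression(penalty="l1",
                                                   solver="liblinear", C=C),
                                X, y, cv=5, scoring="roc_auc").mean()
                for C in grid]
      best = grid[int(np.argmax(cv_auc))]
      print("best C:", best, "CV-AUC:", max(cv_auc))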

  5. Nursing Care Interpersonal Relationship Questionnaire: elaboration and validation.

    PubMed

    Borges, José Wicto Pereira; Moreira, Thereza Maria Magalhães; Andrade, Dalton Franscisco de

    2018-01-08

    To elaborate an instrument for the measurement of the interpersonal relationship in nursing care through Item Response Theory, and to validate it. A methodological study, which followed the three poles of psychometrics: theoretical, empirical and analytical. The Nursing Care Interpersonal Relationship Questionnaire was developed in light of Imogene King's Interpersonal Conceptual Model, and its psychometric properties were studied through Item Response Theory in a sample of 950 patients attended in Primary, Secondary and Tertiary Health Care. The final instrument consisted of 31 items, with a Cronbach's alpha of 0.90 and a McDonald's omega of 0.92. The Item Response Theory parameters demonstrated high discrimination for 28 items, and a five-level interpretive scale was developed. At the first level, the communication process begins, opening the way to richer interaction. Subsequent levels demonstrate qualitatively the points of effectiveness of the interpersonal relationship, with the involvement of behaviors related to the concepts of transaction and interaction, followed by the concept of role. The instrument was created and proved to be consistent for measuring the interpersonal relationship in nursing care, as it presented adequate reliability and validity parameters.

  6. Experimental parameter identification of a multi-scale musculoskeletal model controlled by electrical stimulation: application to patients with spinal cord injury.

    PubMed

    Benoussaad, Mourad; Poignet, Philippe; Hayashibe, Mitsuhiro; Azevedo-Coste, Christine; Fattal, Charles; Guiraud, David

    2013-06-01

    We investigated the parameter identification of a multi-scale physiological model of skeletal muscle, based on Huxley's formulation. We focused particularly on the knee joint controlled by the quadriceps muscles under electrical stimulation (ES) in subjects with a complete spinal cord injury. A noninvasive and in vivo identification protocol was thus applied through surface stimulation in nine subjects and through neural stimulation in one ES-implanted subject. The identification protocol included initial identification steps, which are adaptations of existing identification techniques, to estimate most of the parameters of our model. We then applied an original and safer identification protocol in dynamic conditions, which required the resolution of a nonlinear programming (NLP) problem to identify the serial element stiffness of the quadriceps. Each identification step, and the cross-validation of the estimated model in dynamic conditions, was evaluated through a quadratic error criterion. The results highlighted good accuracy, the efficiency of the identification protocol, and the ability of the estimated model to predict the subject-specific behavior of the musculoskeletal system. From the comparison of parameter values between subjects, we discussed and explored the inter-subject variability of the parameters in order to select those that have to be identified in each patient.

  7. Validating MEDIQUAL Constructs

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Gun; Min, Jae H.

    In this paper, we validate MEDIQUAL constructs through the different media users in help desk service. In previous research, only two end-user constructs were used: assurance and responsiveness. In this paper, we extend the MEDIQUAL constructs to include reliability, empathy, assurance, tangibles, and responsiveness, which are based on the SERVQUAL theory. The results suggest that: 1) the five MEDIQUAL constructs are validated through factor analysis; that is, measures of the same construct using different methods have relatively high correlations, while measures of constructs that are expected to differ have low correlations; and 2) the five MEDIQUAL constructs are statistically significant predictors of media users' satisfaction in help desk service, as shown by regression analysis.

  8. Parameter recovery, bias and standard errors in the linear ballistic accumulator model.

    PubMed

    Visser, Ingmar; Poessé, Rens

    2017-05-01

    The linear ballistic accumulator (LBA) model (Brown & Heathcote, 2008, Cogn. Psychol., 57, 153) is increasingly popular in modelling response times from experimental data. An R package, glba, has been developed to fit the LBA model using maximum likelihood estimation, and it is validated by means of a parameter recovery study. At sufficient sample sizes, parameter recovery is good, whereas at smaller sample sizes there can be large bias in parameters. In a second simulation study, two methods for computing parameter standard errors are compared. The Hessian-based method is found to be adequate and is (much) faster than the alternative bootstrap method. The use of parameter standard errors in model selection and inference is illustrated in an example using data from an implicit learning experiment (Visser et al., 2007, Mem. Cogn., 35, 1502). It is shown that typical implicit learning effects are captured by different parameters of the LBA model. © 2017 The British Psychological Society.
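
    The comparison of Hessian-based and bootstrap standard errors generalizes beyond the LBA; a toy sketch on a simple maximum likelihood problem (an exponential response-time model standing in for the LBA, whose density is omitted here):

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)
      data = rng.exponential(scale=0.4, size=500)   # toy "response time" data

      # Negative log-likelihood of an exponential model with scale p[0].
      nll = lambda p, x: x.size * np.log(p[0]) + x.sum() / p[0]
      fit = minimize(nll, x0=[1.0], args=(data,), bounds=[(1e-6, None)])
      mle = fit.x[0]

      # Hessian-based SE: sqrt of inverse observed information (analytic here).
      info = data.size / mle**2
      se_hessian = 1 / np.sqrt(info)

      # Bootstrap SE: refit on resampled data many times.
      boots = [minimize(nll, x0=[1.0], args=(rng.choice(data, data.size),),
                        bounds=[(1e-6, None)]).x[0] for _ in range(200)]
      print("Hessian SE:", se_hessian, "bootstrap SE:", np.std(boots))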

  9. PSI-Center Simulations of Validation Platform Experiments

    NASA Astrophysics Data System (ADS)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2013-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with extended MHD simulations. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), FRX-L (Los Alamos National Laboratory), HIT-SI (U Wash - UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), PHD/ELF (UW/MSNW), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). Modifications have been made to the NIMROD, HiFi, and PSI-Tet codes to specifically model these experiments, including mesh generation/refinement, non-local closures, appropriate boundary conditions (external fields, insulating BCs, etc.), and kinetic and neutral particle interactions. The PSI-Center is exploring the application of validation metrics between experimental data and simulation results. Biorthogonal decomposition is proving to be a powerful method to compare global temporal and spatial structures for validation. Results from these simulation and validation studies, as well as an overview of the PSI-Center status, will be presented.

  10. Development and Validation of the Physics Anxiety Rating Scale

    ERIC Educational Resources Information Center

    Sahin, Mehmet; Caliskan, Serap; Dilek, Ufuk

    2015-01-01

    This study reports the development and validation process for an instrument to measure university students' anxiety in physics courses. The development of the Physics Anxiety Rating Scale (PARS) included the following steps: Generation of scale items, content validation, construct validation, and reliability calculation. The results of construct…

  11. Validity and Reliability of Turkish Male Breast Self-Examination Instrument.

    PubMed

    Erkin, Özüm; Göl, İlknur

    2018-04-01

    This study aims to assess the validity and reliability of the Turkish male breast self-examination (MBSE) instrument. The methodological study was performed in 2016 at Ege University, Faculty of Nursing, İzmir, Turkey. The MBSE includes ten steps. For the validity studies, face validity, content validity, and construct validity (exploratory factor analysis) were done. For the reliability study, the Kuder-Richardson coefficient was calculated. The content validity index was found to be 0.94. The Kendall W coefficient was 0.80 (p=0.551). The total variance explained by the two factors was found to be 63.24%. Kuder-Richardson 21 was computed for the reliability study and found to be 0.97 for the instrument. The final instrument included 10 steps and two stages. The Turkish version of the MBSE is a valid and reliable instrument for early diagnosis. The MBSE can be used in Turkish-speaking countries and cultures with two stages and 10 steps.

  12. Validation experiments to determine radiation partitioning of heat flux to an object in a fully turbulent fire.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricks, Allen; Blanchat, Thomas K.; Jernigan, Dann A.

    2006-06-01

    It is necessary to improve understanding and to develop validation data for the heat flux incident on an object located within the fire plume, for the validation of SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE. One key aspect of the validation data sets is the determination of the relative contributions of the radiative and convective heat fluxes. To meet this objective, a cylindrical calorimeter with sufficient instrumentation to measure total and radiative heat flux has been designed and fabricated. This calorimeter will be tested both in the controlled radiative environment of the Penlight facility and in a fire environment in the FLAME/Radiant Heat (FRH) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. A significant question of interest in modeling the heat flux incident on an object in or near a fire is the contribution of the radiation and convection modes of heat transfer. The series of experiments documented in this test plan is designed to provide data on the radiation partitioning, defined as the fraction of the total heat flux that is due to radiation.

  13. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the life-cycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
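
    The MC idea can be shown schematically (not the authors' IPM) by chaining two unit operations so that parameter variation propagates to a final CQA:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Hypothetical process parameters with their observed variation.
      titer = rng.normal(5.0, 0.3, n)            # fermentation titer, g/L
      yield_capture = rng.normal(0.90, 0.03, n)  # capture-step yield

      # Chain the unit operations: the purified amount feeds the next step's CQA.
      captured = titer * yield_capture
      impurity = rng.normal(2.0, 0.4, n) / captured   # hypothetical CQA

      # Process capability: probability the CQA stays within specification.
      spec_limit = 0.60
      print("P(out of specification):", np.mean(impurity > spec_limit))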

  14. A Model Parameter Extraction Method for Dielectric Barrier Discharge Ozone Chamber using Differential Evolution

    NASA Astrophysics Data System (ADS)

    Amjad, M.; Salam, Z.; Ishaque, K.

    2014-04-01

    In order to design an efficient resonant power supply for an ozone gas generator, it is necessary to accurately determine the parameters of the ozone chamber. In the conventional method, information from the Lissajous plot is used to estimate the values of these parameters. However, the experimental setup for this purpose can only predict the parameters at one operating frequency, and there is no guarantee that it results in the highest ozone gas yield. This paper proposes a new approach to determining the parameters using a search and optimization technique known as Differential Evolution (DE). The objective function of the DE is set at the resonance condition, and the chamber parameter values can be searched regardless of experimental constraints. The chamber parameters obtained from the DE technique are validated by experiment.
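
    A hedged sketch of the DE step with SciPy; the chamber model and resonance objective below are simplified placeholders for the paper's equivalent-circuit formulation, so the problem as posed is deliberately illustrative:

      import numpy as np
      from scipy.optimize import differential_evolution

      F_RES = 20e3                 # target resonant frequency, Hz (assumed)
      L_SUPPLY = 1.2e-3            # resonant inductance of the supply, H (assumed)

      def objective(params):
          """Penalize deviation from resonance: f = 1/(2*pi*sqrt(L*Ceq)).

          params: equivalent gap and dielectric capacitances of the chamber (F).
          The series combination is a simplified stand-in for the DBD chamber."""
          c_gap, c_diel = params
          c_eq = c_gap * c_diel / (c_gap + c_diel)
          f = 1.0 / (2 * np.pi * np.sqrt(L_SUPPLY * c_eq))
          return (f - F_RES) ** 2

      result = differential_evolution(objective,
                                      bounds=[(1e-11, 1e-7), (1e-11, 1e-7)],
                                      seed=0, tol=1e-12)
      print("estimated capacitances:", result.x)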

  15. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  16. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE PAGES

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.; ...

    2016-07-28

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  17. Validation of tungsten cross sections in the neutron energy region up to 100 keV

    NASA Astrophysics Data System (ADS)

    Pigni, Marco T.; Žerovnik, Gašper; Leal, Luiz. C.; Trkov, Andrej

    2017-09-01

    Following a series of recent cross section evaluations of tungsten isotopes performed at Oak Ridge National Laboratory (ORNL), this paper presents the validation work carried out to test the performance of the evaluated cross sections based on lead-slowing-down (LSD) benchmarks conducted in Grenoble. ORNL completed the resonance parameter evaluation of four tungsten isotopes - 182,183,184,186W - in August 2014 and submitted it as an ENDF-compatible file to be part of the next release of the ENDF/B-VIII.0 nuclear data library. The evaluations were performed with support from the US Nuclear Criticality Safety Program in an effort to provide improved tungsten cross section and covariance data for criticality safety sensitivity analyses. The validation analysis based on the LSD benchmarks showed an improved agreement with the experimental response when the ORNL tungsten evaluations were included in the ENDF/B-VII.1 library. Comparisons with the results obtained with the JEFF-3.2 nuclear data library are also discussed.

  18. A validation of dynamic causal modelling for 7T fMRI.

    PubMed

    Tak, S; Noh, J; Cheong, C; Zeidman, P; Razi, A; Penny, W D; Friston, K J

    2018-07-15

    There is growing interest in ultra-high field magnetic resonance imaging (MRI) in cognitive and clinical neuroscience studies. However, the benefits offered by higher field strength have not been evaluated in terms of effective connectivity and dynamic causal modelling (DCM). In this study, we address the validity of DCM for 7T functional MRI data at two levels. First, we evaluate the predictive validity of DCM estimates based upon 3T and 7T in terms of reproducibility. Second, we assess improvements in the efficiency of DCM estimates at 7T, in terms of the entropy of the posterior distribution over model parameters (i.e., information gain). Using empirical data recorded during fist-closing movements with 3T and 7T fMRI, we found a high reproducibility of average connectivity and condition-specific changes in connectivity - as quantified by the intra-class correlation coefficient (ICC = 0.862 and 0.936, respectively). Furthermore, we found that the posterior entropy of 7T parameter estimates was substantially less than that of 3T parameter estimates, suggesting the 7T data are more informative and furnish more efficient estimates. In the framework of DCM, we treated field-dependent parameters for the BOLD signal model as free parameters, to accommodate fMRI data at 3T and 7T. In addition, we made the resting blood volume fraction a free parameter, because different brain regions can differ in their vascularization. In this paper, we showed that DCM enables one to infer changes in effective connectivity from 7T data reliably and efficiently. Copyright © 2018 Elsevier B.V. All rights reserved.
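
    The entropy criterion used here has a closed form for Gaussian posteriors; a worked sketch (the covariances are illustrative, not the study's estimates):

      import numpy as np

      def gaussian_entropy(cov):
          """Differential entropy of a multivariate Gaussian:
          H = 0.5 * ln((2*pi*e)^k * |Sigma|) for a k-dimensional covariance Sigma."""
          k = cov.shape[0]
          return 0.5 * (k * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

      # Hypothetical posterior covariances over the same DCM parameters at 3T and 7T.
      cov_3t = np.diag([0.20, 0.15, 0.30])
      cov_7t = np.diag([0.08, 0.06, 0.12])

      h3, h7 = gaussian_entropy(cov_3t), gaussian_entropy(cov_7t)
      print("entropy 3T:", h3, "entropy 7T:", h7, "information gain:", h3 - h7)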

  19. 43 CFR 3427.3 - Validation of information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Validation of information. 3427.3 Section 3427.3 Public Lands: Interior Regulations Relating to Public Lands (Continued) BUREAU OF LAND... § 3427.3 Validation of information. Any person submitting a written consent shall include with his filing...

  20. 43 CFR 3427.3 - Validation of information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Validation of information. 3427.3 Section 3427.3 Public Lands: Interior Regulations Relating to Public Lands (Continued) BUREAU OF LAND... § 3427.3 Validation of information. Any person submitting a written consent shall include with his filing...

  1. 43 CFR 3427.3 - Validation of information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Validation of information. 3427.3 Section 3427.3 Public Lands: Interior Regulations Relating to Public Lands (Continued) BUREAU OF LAND... § 3427.3 Validation of information. Any person submitting a written consent shall include with his filing...

  2. 43 CFR 3427.3 - Validation of information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Validation of information. 3427.3 Section 3427.3 Public Lands: Interior Regulations Relating to Public Lands (Continued) BUREAU OF LAND... § 3427.3 Validation of information. Any person submitting a written consent shall include with his filing...

  3. Impact of Martian atmosphere parameter uncertainties on entry vehicle aerodynamics for hypersonic rarefied conditions

    NASA Astrophysics Data System (ADS)

    Fei, Huang; Xu-hong, Jin; Jun-ming, Lv; Xiao-li, Cheng

    2016-11-01

    An attempt has been made to analyze the impact of Martian atmosphere parameter uncertainties on entry vehicle aerodynamics for hypersonic rarefied conditions with a DSMC code. The code has been validated by comparing the present computational results with Viking vehicle flight data. Then, by simulating flows around the Mars Science Laboratory, the impact of free-stream parameter uncertainties on aerodynamics is investigated. The validation results show that the present numerical approach agrees well with the Viking flight data. The physical and chemical properties of CO2 have a strong impact on the aerodynamics of Mars entry vehicles, so it is necessary to make proper corrections to data obtained with an air model in hypersonic rarefied conditions, consistent with the conclusions drawn in the continuum regime. Uncertainties in free-stream density and velocity weakly influence the aerodynamics and pitching moment, while the aerodynamics is barely influenced by free-stream temperature, with a maximum error below 0.5%. The center-of-pressure position is not sensitive to the free-stream parameters.
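
    The study's uncertainty analysis has the shape of a one-at-a-time parameter scan. The sketch below illustrates that structure only: `axial_force_coefficient` is a hypothetical smooth surrogate introduced for illustration, where a real analysis would run the DSMC solver once per perturbed free-stream condition; all values and uncertainty ranges are made up.

```python
# Illustrative one-at-a-time sensitivity scan of an aerodynamic
# coefficient to free-stream uncertainties. The surrogate model is a
# placeholder; a real study would call a DSMC solver for each case.
import numpy as np

def axial_force_coefficient(rho, v, T):
    # Hypothetical smooth surrogate (illustration only, not physics).
    return 1.5 + 0.02 * np.log(rho / 1e-5) + 1e-5 * (v - 5000) - 1e-4 * (T - 150)

nominal = {"rho": 1e-5, "v": 5000.0, "T": 150.0}      # kg/m^3, m/s, K
rel_uncertainty = {"rho": 0.10, "v": 0.05, "T": 0.20}  # assumed levels

ca0 = axial_force_coefficient(**nominal)
for name, eps in rel_uncertainty.items():
    perturbed = dict(nominal)
    perturbed[name] *= 1.0 + eps
    dca = axial_force_coefficient(**perturbed) - ca0
    print(f"{name}: +{eps:.0%} perturbation -> dCA/CA = {dca / ca0:+.3%}")
```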

  4. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent with the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based, extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model-specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  5. Multi-Scale Low-Entropy Method for Optimizing the Processing Parameters during Automated Fiber Placement

    PubMed Central

    Han, Zhenyu; Sun, Shouzheng; Fu, Hongya; Fu, Yunzhong

    2017-01-01

    The automated fiber placement (AFP) process involves a variety of energy forms and multi-scale effects. This contribution proposes a novel multi-scale low-entropy method for optimizing processing parameters in an AFP process, in which multi-scale effects, energy consumption, energy utilization efficiency, and the mechanical properties of the micro-system can be taken into account together. Taking a carbon fiber/epoxy prepreg as an example, mechanical properties at the macro–meso scale are obtained by the Finite Element Method (FEM). A multi-scale energy transfer model is then established to pass the macroscopic results to the microscopic system as its boundary condition, allowing the different scales to communicate. Furthermore, microscopic characteristics, mainly micro-scale adsorption energy, diffusion coefficient, and entropy–enthalpy values, are calculated under different processing parameters based on the molecular dynamics method. A low-entropy region is then obtained in terms of the interrelation among entropy–enthalpy values, microscopic mechanical properties (interface adsorbability and matrix fluidity), and processing parameters, so as to guarantee better fluidity, stronger adsorption, lower energy consumption, and higher energy quality collaboratively. Finally, nine groups of experiments are carried out to verify the validity of the simulation results. The results show that the low-entropy optimization method can reduce void content effectively and further improve the mechanical properties of laminates. PMID:28869520

  6. Multi-Scale Low-Entropy Method for Optimizing the Processing Parameters during Automated Fiber Placement.

    PubMed

    Han, Zhenyu; Sun, Shouzheng; Fu, Hongya; Fu, Yunzhong

    2017-09-03

    The automated fiber placement (AFP) process involves a variety of energy forms and multi-scale effects. This contribution proposes a novel multi-scale low-entropy method for optimizing processing parameters in an AFP process, in which multi-scale effects, energy consumption, energy utilization efficiency, and the mechanical properties of the micro-system can be taken into account together. Taking a carbon fiber/epoxy prepreg as an example, mechanical properties at the macro-meso scale are obtained by the Finite Element Method (FEM). A multi-scale energy transfer model is then established to pass the macroscopic results to the microscopic system as its boundary condition, allowing the different scales to communicate. Furthermore, microscopic characteristics, mainly micro-scale adsorption energy, diffusion coefficient, and entropy-enthalpy values, are calculated under different processing parameters based on the molecular dynamics method. A low-entropy region is then obtained in terms of the interrelation among entropy-enthalpy values, microscopic mechanical properties (interface adsorbability and matrix fluidity), and processing parameters, so as to guarantee better fluidity, stronger adsorption, lower energy consumption, and higher energy quality collaboratively. Finally, nine groups of experiments are carried out to verify the validity of the simulation results. The results show that the low-entropy optimization method can reduce void content effectively and further improve the mechanical properties of laminates.

  7. Simultaneous measurement of cerebral and muscle tissue parameters during cardiac arrest and cardiopulmonary resuscitation

    NASA Astrophysics Data System (ADS)

    Nosrati, Reyhaneh; Ramadeen, Andrew; Hu, Xudong; Woldemichael, Ermias; Kim, Siwook; Dorian, Paul; Toronov, Vladislav

    2015-03-01

    In this series of animal experiments on resuscitation after cardiac arrest we had a unique opportunity to measure hyperspectral near-infrared spectroscopy (hNIRS) parameters directly on the brain dura, or on the brain through the intact pig skull, and simultaneously the muscle hNIRS parameters. The arterial blood pressure and the carotid and femoral blood flow were recorded in real time using invasive sensors. We used a novel hyperspectral signal-processing algorithm to extract time-dependent concentrations of water, hemoglobin, and the redox state of cytochrome c oxidase during cardiac arrest and resuscitation. In addition, to assess the validity of the non-invasive brain measurements, the results obtained from the open brain were compared with those acquired through the skull. The comparison of hNIRS data acquired on the brain surface and through the adult pig skull shows that in both cases hemoglobin and the redox state of cytochrome c oxidase changed in similar ways in similar situations, in agreement with blood pressure and flow changes. The comparison of simultaneously measured brain and muscle changes showed the expected differences. Overall, the results show the feasibility of transcranial hNIRS measurement of cerebral parameters, including the redox state of cytochrome c oxidase, in human cardiac arrest patients.
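
    The study's hyperspectral algorithm is novel and not given in this record, but the generic step behind such processing is a linear unmixing of absorbance changes into chromophore concentration changes via the modified Beer-Lambert model. The sketch below shows only that least-squares step; the extinction matrix is randomly generated as a stand-in for tabulated spectra of HbO2, Hb, water, and cytochrome c oxidase.

```python
# Sketch: solve the modified Beer-Lambert model dA(lambda) = E @ dc for
# chromophore concentration changes dc by least squares. E is a made-up
# placeholder for tabulated extinction-coefficient spectra.
import numpy as np

wavelengths = np.linspace(700, 900, 81)   # nm (illustrative grid)
n_chrom = 4                                # HbO2, Hb, H2O, CCO

rng = np.random.default_rng(1)
E = np.abs(rng.standard_normal((wavelengths.size, n_chrom)))  # placeholder

true_dc = np.array([2.0, -1.0, 0.1, 0.3])  # "unknown" concentration changes
dA = E @ true_dc + 0.05 * rng.standard_normal(wavelengths.size)  # noisy data

dc_hat, *_ = np.linalg.lstsq(E, dA, rcond=None)
print("recovered concentration changes:", np.round(dc_hat, 2))
```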

  8. Validation of Real-time Modeling of Coronal Mass Ejections Using the WSA-ENLIL+Cone Heliospheric Model

    NASA Astrophysics Data System (ADS)

    Romano, M.; Mays, M. L.; Taktakishvili, A.; MacNeice, P. J.; Zheng, Y.; Pulkkinen, A. A.; Kuznetsova, M. M.; Odstrcil, D.

    2013-12-01

    Modeling coronal mass ejections (CMEs) is of great interest to the space weather research and forecasting communities. We present recent validation work on real-time CME arrival time predictions at different satellites using the WSA-ENLIL+Cone three-dimensional MHD heliospheric model available at the Community Coordinated Modeling Center (CCMC) and performed by the Space Weather Research Center (SWRC). SWRC is an in-house research-based operations team at the CCMC which provides interplanetary space weather forecasting for NASA's robotic missions and performs real-time model validation. The quality of model operation is evaluated by comparing its output to a measurable parameter of interest such as the CME arrival time and geomagnetic storm strength. The Kp index is calculated from the relation given in Newell et al. (2007), using solar wind parameters predicted by the WSA-ENLIL+Cone model at Earth. The CME arrival time error is defined as the difference between the predicted arrival time and the observed in-situ CME shock arrival time at the ACE, STEREO A, or STEREO B spacecraft. This study includes all real-time WSA-ENLIL+Cone model simulations performed between June 2011 and June 2013 (over 400 runs) at the CCMC/SWRC. We report hit, miss, false alarm, and correct rejection statistics for all three spacecraft. For hits we show the average absolute CME arrival time error, and the dependence of this error on CME input parameters such as speed, width, and direction. We also present the error in the predicted geomagnetic storm strength (using the Kp index) for Earth-directed CMEs.
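
    The hit/miss/false-alarm/correct-rejection statistics mentioned above come from a standard 2x2 contingency table. A minimal sketch follows, with hypothetical counts (not the study's numbers) and two common scores derived from the table.

```python
# Contingency-table verification scores for event forecasts
# (hits, misses, false alarms, correct rejections).
def skill_scores(hits, misses, false_alarms, correct_rejections):
    total = hits + misses + false_alarms + correct_rejections
    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    accuracy = (hits + correct_rejections) / total
    return pod, far, accuracy

# Hypothetical counts for one spacecraft:
pod, far, acc = skill_scores(hits=120, misses=25,
                             false_alarms=30, correct_rejections=225)
print(f"POD = {pod:.2f}, FAR = {far:.2f}, accuracy = {acc:.2f}")
```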

  9. Assessment of Spatial Transferability of Process-Based Hydrological Model Parameters in Two Neighboring Catchments in the Himalayan Region

    NASA Astrophysics Data System (ADS)

    Nepal, S.

    2016-12-01

    The spatial transferability of the model parameters of the process-oriented distributed J2000 hydrological model was investigated in two glaciated sub-catchments of the Koshi river basin in eastern Nepal. The basins had a high degree of similarity with respect to their static landscape features. The model was first calibrated (1986-1991) and validated (1992-1997) in the Dudh Koshi sub-catchment. The calibrated and validated model parameters were then transferred to the nearby Tamor catchment (2001-2009). A sensitivity and uncertainty analysis was carried out for both sub-catchments to establish the sensitivity range of the parameters in the two catchments. The model represented the overall hydrograph well in both sub-catchments, including baseflow and medium-range flows (rising and recession limbs). The efficiency results, according to both the Nash-Sutcliffe efficiency and the coefficient of determination, were above 0.84 in both cases. The sensitivity analysis showed that the same parameter was most sensitive for the Nash-Sutcliffe (ENS) and Log Nash-Sutcliffe (LNS) efficiencies in both catchments. There were some differences in sensitivity to ENS and LNS for moderately and weakly sensitive parameters, although the majority (13 out of 16 for ENS and 16 out of 16 for LNS) had a sensitivity response in a similar range. The generalized likelihood uncertainty estimation (GLUE) results suggest that most of the time the observed runoff is within the parameter uncertainty range, although occasionally the values lie outside it, especially during flood peaks and more often in the Tamor. This may be due to the limited input data resulting from the small number of precipitation stations and the lack of representative stations in high-altitude areas, as well as to model structural uncertainty. The results indicate that transfer of the J2000 parameters to a neighboring catchment in the Himalayan region with similar physiographic landscape characteristics is viable. This indicates the
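
    The two objective functions named in this record are simple to state exactly; the sketch below computes both on illustrative arrays (not the study's data). The log transform in LNS emphasizes low-flow (baseflow) performance.

```python
# Nash-Sutcliffe efficiency (ENS) and its log-flow variant (LNS).
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def log_nse(obs, sim, eps=1e-6):
    # eps guards against log(0) for zero-flow records.
    return nse(np.log(np.asarray(obs, float) + eps),
               np.log(np.asarray(sim, float) + eps))

obs = [12.0, 30.0, 55.0, 21.0, 9.0, 7.5]   # illustrative discharges
sim = [10.5, 33.0, 50.0, 24.0, 9.5, 8.0]
print(f"ENS = {nse(obs, sim):.3f}, LNS = {log_nse(obs, sim):.3f}")
```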

  10. Evaluation of spectroscopic databases through radiative transfer simulations compared to observations. Application to the validation of GEISA 2015 with IASI and TCCON

    NASA Astrophysics Data System (ADS)

    Armante, Raymond; Scott, Noelle; Crevoisier, Cyril; Capelle, Virginie; Crepeau, Laurent; Jacquinet, Nicole; Chédin, Alain

    2016-09-01

    of particular interest for several currently exploited or planned Earth space missions: the thermal infrared domain and the short-wave infrared domain, for which observations from the space-borne IASI instrument and from the ground-based FTS instruments at the Park Falls TCCON site are used, respectively. Main results include: (i) the validation of the positions and intensities of line parameters, with overall significantly lower residuals for GEISA-2015 than for GEISA-2011, and (ii) the validation of choices made for parameters (such as pressure shift and air-broadened width) that were not given by the provider but were completed by us. For example, comparisons between residuals obtained with GEISA-2015 and HITRAN-2012 have highlighted a specific issue with some HWHM values in the latter that can be clearly identified in the 'calculated minus observed' residuals.

  11. High Vertically Resolved Atmospheric and Surface/Cloud Parameters Retrieved with Infrared Atmospheric Sounding Interferometer (IASI)

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Liu, Xu; Larar, Allen M.; Smith, WIlliam L.; Taylor, Jonathan P.; Schluessel, Peter; Strow, L. Larrabee; Mango, Stephen A.

    2008-01-01

    The Joint Airborne IASI Validation Experiment (JAIVEx) was conducted during April 2007, mainly for validation of the IASI instrument on the MetOp satellite. IASI possesses an ultra-spectral resolution of 0.25 cm^-1 and a spectral coverage from 645 to 2760 cm^-1. Ultra-spectral resolution infrared spectral radiance obtained from near-nadir observations provides atmospheric, surface, and cloud property information. An advanced retrieval algorithm with a fast radiative transfer model, applicable to both cloud-free and clouded atmospheres, is used for atmospheric profile and cloud parameter retrieval. This physical inversion scheme has been developed to deal with cloudy as well as cloud-free radiance observed with ultra-spectral infrared sounders, and to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A one-dimensional (1-d) variational multi-variable inversion solution is used to improve an iterative background state defined by an eigenvector-regression retrieval, and the solution is iterated in order to account for non-linearity in the 1-d variational solution. It is shown that relatively accurate temperature and moisture retrievals are achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to the cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with relatively high accuracy (i.e., error < 1 km). Preliminary retrievals of atmospheric soundings, surface properties, and cloud optical/microphysical properties from the IASI observations are obtained and presented. These retrievals will be further inter-compared with those obtained from airborne FTS systems, such as the NPOESS Airborne Sounder Testbed - Interferometer (NAST-I), dedicated dropsondes, radiosondes, and ground-based Raman lidar. The
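
    The iterated 1-d variational solution referred to above is conventionally written in the Gauss-Newton optimal-estimation form, x_{i+1} = x_a + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - F(x_i) + K (x_i - x_a)). The sketch below shows that generic update with a toy two-parameter forward model and made-up covariances; it is not the IASI retrieval system itself.

```python
# Generic iterated 1D-var (optimal estimation) update with a toy
# nonlinear forward model F and placeholder covariances.
import numpy as np

def forward(x):                        # toy forward model (placeholder)
    return np.array([x[0] + 0.1 * x[1] ** 2, 0.5 * x[0] * x[1]])

def jacobian(x, h=1e-6):               # finite-difference Jacobian K
    n, m = x.size, forward(x).size
    K = np.zeros((m, n))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = h
        K[:, j] = (forward(x + dx) - forward(x - dx)) / (2 * h)
    return K

x_a = np.array([1.0, 1.0])             # prior (background) state
S_a = np.diag([0.5, 0.5])              # prior covariance
S_e = np.diag([0.01, 0.01])            # measurement-noise covariance
y = forward(np.array([1.3, 0.8]))      # synthetic "observation"

x = x_a.copy()
for _ in range(10):                    # iterate to handle non-linearity
    K = jacobian(x)
    G = np.linalg.inv(K.T @ np.linalg.inv(S_e) @ K + np.linalg.inv(S_a))
    x = x_a + G @ K.T @ np.linalg.inv(S_e) @ (y - forward(x) + K @ (x - x_a))
print("retrieved state:", np.round(x, 3))
```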

  12. A Bayesian-based multilevel factorial analysis method for analyzing parameter uncertainty of hydrological model

    NASA Astrophysics Data System (ADS)

    Liu, Y. R.; Li, Y. P.; Huang, G. H.; Zhang, J. L.; Fan, Y. R.

    2017-10-01

    In this study, a Bayesian-based multilevel factorial analysis (BMFA) method is developed to assess parameter uncertainties and their effects on hydrological model responses. In BMFA, the Differential Evolution Adaptive Metropolis (DREAM) algorithm is employed to approximate the posterior distributions of model parameters with Bayesian inference, while the factorial analysis (FA) technique is used to measure the specific variations of hydrological responses in terms of the posterior distributions, so as to investigate the individual and interactive effects of parameters on model outputs. BMFA is then applied to a case study of the Jinghe River watershed in the Loess Plateau of China to demonstrate its validity and applicability. The uncertainties of four sensitive parameters, including the soil conservation service runoff curve number for moisture condition II (CN2), soil hydraulic conductivity (SOL_K), plant available water capacity (SOL_AWC), and soil depth (SOL_Z), are investigated. Results reveal that (i) CN2 has a positive effect on peak flow, implying that concentrated rainfall during the rainy season can cause infiltration-excess surface flow, which is a considerable contributor to peak flow in this watershed; (ii) SOL_K has a positive effect on average flow, implying that the widely distributed cambisols can lead to medium percolation capacity; (iii) the interaction between SOL_AWC and SOL_Z has a noticeable effect on peak flow, and their effects are dependent upon each other, which discloses that soil depth can significantly influence plant uptake of soil water in this watershed. Based on the above findings, the significant parameters and the relationships among uncertain parameters can be specified, such that the hydrological model's capability for simulating and predicting the water resources of the Jinghe River watershed can be improved.
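
    The Bayesian step can be illustrated compactly. The study uses the DREAM sampler; in the sketch below a plain random-walk Metropolis sampler stands in for DREAM, and a toy linear model replaces the hydrological model, so only the structure of the posterior approximation carries over.

```python
# Random-walk Metropolis sketch of posterior sampling (a simple
# stand-in for DREAM), with a toy linear model in place of the
# hydrological model.
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 50)
y_obs = 2.0 * t + 1.0 + 0.1 * rng.standard_normal(t.size)  # synthetic data

def log_posterior(theta, sigma=0.1):
    a, b = theta
    resid = y_obs - (a * t + b)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2   # flat prior assumed

theta = np.array([0.0, 0.0])
lp = log_posterior(theta)
samples = []
for _ in range(5000):
    prop = theta + 0.05 * rng.standard_normal(2)    # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:         # Metropolis acceptance
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[1000:])                  # discard burn-in
print("posterior means:", np.round(samples.mean(axis=0), 3))
print("posterior stds :", np.round(samples.std(axis=0), 3))
```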

  13. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation

    NASA Astrophysics Data System (ADS)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana

    2017-06-01

    A simulation model of a parabolic-trough solar collector developed in the Modelica® language is calibrated and validated. The calibration is performed in order to bring the behavior of the solar collector model close to that of a real collector, given the uncertainty in some of the system parameters; measured data are used during the calibration process. Afterwards, the calibrated model is validated: the results obtained from the model are compared with those obtained during real operation of a collector at the Plataforma Solar de Almeria (PSA).
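
    Calibration of this kind typically reduces to a least-squares fit of the uncertain parameters against measured operating data. The sketch below shows that generic step; `outlet_temperature` is a deliberately crude placeholder collector model with synthetic measurements, not the study's Modelica model or PSA data.

```python
# Generic calibration step: tune uncertain model parameters so the
# simulated outlet temperature matches (synthetic) measurements.
import numpy as np
from scipy.optimize import least_squares

def outlet_temperature(params, irradiance, t_in):
    # Crude placeholder model: optical efficiency + linear heat loss.
    eta_opt, loss_coeff = params
    return t_in + eta_opt * irradiance / 500.0 - loss_coeff * (t_in - 25.0) / 100.0

rng = np.random.default_rng(3)
irradiance = rng.uniform(600, 1000, 40)     # W/m^2 (synthetic operating points)
t_in = rng.uniform(200, 300, 40)            # degC (synthetic)
t_meas = outlet_temperature((0.75, 4.0), irradiance, t_in) \
         + 0.2 * rng.standard_normal(40)    # "measured" data with noise

fit = least_squares(
    lambda p: outlet_temperature(p, irradiance, t_in) - t_meas,
    x0=(0.5, 1.0),
)
print("calibrated parameters:", np.round(fit.x, 3))  # should approach (0.75, 4.0)
```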

  14. Correlation between radiological parameters and patient-rated wrist dysfunction following fractures of the distal radius.

    PubMed

    Karnezis, I A; Panagiotopoulos, E; Tyllianakis, M; Megas, P; Lambiris, E

    2005-12-01

    The present study investigates the correlation between radiological parameters of wrist fractures and the clinical outcome expressed by objective clinical parameters and the level of patient-rated wrist dysfunction. Thirty consecutive cases of unstable distal radial fractures treated with closed reduction and percutaneous fixation were prospectively studied for a period of one year. The outcome parameters included objective clinical and radiological parameters and the previously described and validated patient-rated wrist evaluation (PRWE) score. Analysis showed that for unstable (AO classification types 23-A2, -A3, -C1 and -C2) fractures the fracture type affects the range of wrist palmarflexion (p=0.04) and that the presence of postoperative articular 'step-off' affects the range of wrist dorsiflexion and the patient-rated wrist function at the final time of the study (p<0.01 and p=0.02, respectively). It is also shown that permanent radial shortening and loss of the palmar angle were associated with prolonged wrist pain (p<0.01 and p=0.03, respectively). Our finding that residual articular incongruity correlates with persisting loss of wrist dorsiflexion and wrist dysfunction contradicts the view that loss of articular congruity is associated with late development of articular degeneration but not with early wrist dysfunction. Additionally, this study failed to show any association between the fracture type and the functional outcome as rated by the patients.

  15. Toward On-line Parameter Estimation of Concentric Tube Robots Using a Mechanics-based Kinematic Model

    PubMed Central

    Jang, Cheongjae; Ha, Junhyoung; Dupont, Pierre E.; Park, Frank Chongwoo

    2017-01-01

    Although existing mechanics-based models of concentric tube robots have been experimentally demonstrated to approximate the actual kinematics, determining accurate estimates of model parameters remains difficult due to the complex relationship between the parameters and available measurements. Further, because the mechanics-based models neglect some phenomena like friction, nonlinear elasticity, and cross section deformation, it is also not clear if model error is due to model simplification or to parameter estimation errors. The parameters of the superelastic materials used in these robots can be slowly time-varying, necessitating periodic re-estimation. This paper proposes a method for estimating the mechanics-based model parameters using an extended Kalman filter as a step toward on-line parameter estimation. Our methodology is validated through both simulation and experiments. PMID:28717554
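
    The filter structure described here can be sketched generically: treat the slowly varying parameters as a state with random-walk dynamics and update them from each new measurement. The measurement model `h` below is a toy stand-in for the mechanics-based kinematic model, so the code illustrates only the EKF recursion, not the robot model.

```python
# Minimal EKF sketch for on-line parameter estimation: parameter theta
# follows a random walk; measurements are z = h(theta) + noise.
import numpy as np

def h(theta):                        # toy nonlinear measurement model
    return np.array([np.sin(theta[0]), theta[0] ** 2])

def H_jac(theta):                    # analytic Jacobian of h
    return np.array([[np.cos(theta[0])], [2 * theta[0]]])

Q = np.array([[1e-6]])               # random-walk (process) noise
R = np.diag([1e-3, 1e-3])            # measurement noise
theta = np.array([0.5])              # initial estimate
P = np.array([[1.0]])                # initial covariance

rng = np.random.default_rng(7)
theta_true = 1.2
for _ in range(200):
    z = h(np.array([theta_true])) + rng.multivariate_normal(np.zeros(2), R)
    P = P + Q                        # predict (identity parameter dynamics)
    Hk = H_jac(theta)
    S = Hk @ P @ Hk.T + R            # innovation covariance
    K = P @ Hk.T @ np.linalg.inv(S)  # Kalman gain
    theta = theta + K @ (z - h(theta))
    P = (np.eye(1) - K @ Hk) @ P
print("estimated parameter:", round(float(theta[0]), 3))  # approaches 1.2
```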

  16. Restraint and the question of validity.

    PubMed

    Paterson, Brodie; Duxbury, Joy

    2007-07-01

    Restraint as an intervention in the management of acute mental distress has a long history that predates the existence of psychiatry. However, it remains a source of controversy with an ongoing debate as to its role. This article critically explores what to date has seemingly been only implicit in the debate surrounding the role of restraint: how should the concept of validity be interpreted when applied to restraint as an intervention? The practice of restraint in mental health is critically examined using two post-positivist constructions of validity, the pragmatic and the psychopolitical, by means of a critical examination of the literature. The current literature provides only weak support for the pragmatic validity of restraint as an intervention and no support to date for its psychopolitical validity. Judgements regarding the validity of any intervention that is coercive must include reference to the psychopolitical dimensions of both practice and policy.

  17. Cardiac magnetic resonance imaging parameters as surrogate endpoints in clinical trials of acute myocardial infarction

    PubMed Central

    2011-01-01

    Cardiac magnetic resonance (CMR) offers a variety of parameters potentially suited as surrogate endpoints in clinical trials of acute myocardial infarction such as infarct size, myocardial salvage, microvascular obstruction or left ventricular volumes and ejection fraction. The present article reviews each of these parameters with regard to the pathophysiological basis, practical aspects, validity, reliability and its relative value (strengths and limitations) as compared to competitive modalities. Randomized controlled trials of acute myocardial infarction which have used CMR parameters as a primary endpoint are presented. PMID:21917147

  18. Stochastic analysis of experimentally determined physical parameters of HPMC:NiCl2 polymer composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thejas, Urs G.; Somashekar, R., E-mail: rs@physics.uni-mysore.ac.in; Sangappa, Y.

    A stochastic approach to explaining the variation of physical parameters in polymer composites is discussed in this study. We present a statistical model that derives the characteristic variation of physical parameters as a function of dopant concentration. Results of X-ray diffraction and conductivity studies were used to validate this function, which can be extended to any physical parameter and polymer composite. For this study we considered polymer composites of HPMC doped with various concentrations of nickel chloride.

  19. Validation of Satellite Aerosol Retrievals from AERONET Ground-Based Measurements

    NASA Technical Reports Server (NTRS)

    Holben, Brent; Remer, Lorraine; Torres, Omar; Zhao, Tom; Smith, David E. (Technical Monitor)

    2001-01-01

    Accurate and comprehensive assessment of the parameters that control key atmospheric and biospheric processes, including assessment of anthropogenic effects on climate change, is a fundamental measurement objective of NASA's EOS program (King and Greenstone, 1999). Satellite assessment programs and associated global climate models require validation and additional parameterization with frequent, reliable ground-based observations. A critical and highly uncertain element of the measurement program is the characterization of tropospheric aerosols, which requires basic observations of aerosol optical and microphysical properties. Unfortunately, the aerosol burden that human activity contributes to the atmosphere is not yet known, so there is no definitive baseline against which future change can be measured. This lack of aerosol assessment is the impetus for some of the EOS measurement activities (Kaufman et al., 1997; King et al., 1999) and for the formation of the AERONET program (Holben et al., 1998). The goals of the AERONET program are to develop long-term monitoring at globally distributed sites, providing critical data on multiannual trends in aerosol loading and optical properties, with the specific goal of providing a database for validation of satellite-derived aerosol optical properties. The AERONET program has evolved into an international federated network of approximately 100 ground-based remote sensing monitoring stations that characterize the optical and microphysical properties of aerosols.

  20. Development and Validation of the Behavioral Tendencies Questionnaire

    PubMed Central

    Van Dam, Nicholas T.; Brown, Anna; Mole, Tom B.; Davis, Jake H.; Britton, Willoughby B.; Brewer, Judson A.

    2015-01-01

    At a fundamental level, a taxonomy of behavior and behavioral tendencies can be described in terms of approach, avoid, or equivocate (i.e., neither approach nor avoid). While there are numerous theories of personality, temperament, and character, few seem to take advantage of this parsimonious taxonomy. The present study sought to implement this taxonomy by creating a questionnaire based on a categorization of behavioral temperaments/tendencies first identified in Buddhist accounts over fifteen hundred years ago. Items were developed using historical and contemporary texts describing the behavioral temperaments, described as “Greedy/Faithful”, “Aversive/Discerning”, and “Deluded/Speculative”. To both maintain this categorical typology and benefit from the advantageous properties of the forced-choice response format (e.g., reduction of response biases), binary pairwise preferences for items were modeled using Latent Class Analysis (LCA). One sample (n1 = 394) was used to estimate the item parameters, and a second sample (n2 = 504) was used to classify the participants using the established parameters and cross-validate the classification against multiple other measures. The cross-validated measure exhibited good nomothetic span (construct-consistent relationships with related measures) that seemed to corroborate the ideas present in the original Buddhist source documents. The final 13-block questionnaire created from the best-performing items (the Behavioral Tendencies Questionnaire, or BTQ) is a psychometrically valid questionnaire that is historically consistent, grounded in behavioral tendencies, and promises practical and clinical utility, particularly in settings that teach and study meditation practices such as Mindfulness Based Stress Reduction (MBSR). PMID:26535904
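
    The classification step described here (scoring sample 2 against item parameters fixed from sample 1) reduces to Bayes' rule over latent classes. The sketch below assumes hypothetical class priors and class-conditional endorsement probabilities for the 13 binary blocks; none of the numbers are the BTQ's estimates.

```python
# Classify a respondent into a latent class given fixed item parameters
# (class priors and class-conditional Bernoulli probabilities).
import numpy as np

priors = np.array([0.35, 0.35, 0.30])   # 3 latent classes (hypothetical)
rng = np.random.default_rng(11)
# P(endorse block k | class c) for 13 binary pairwise-preference blocks:
p = rng.uniform(0.2, 0.8, size=(3, 13))  # placeholder item parameters

def classify(responses):
    responses = np.asarray(responses)
    # log P(class) + sum_k log Bernoulli(r_k | p[class, k])
    loglik = (np.log(priors)
              + (responses * np.log(p)
                 + (1 - responses) * np.log(1 - p)).sum(axis=1))
    post = np.exp(loglik - loglik.max())  # stabilize before normalizing
    return post / post.sum()

respondent = rng.integers(0, 2, size=13)  # one simulated response vector
print("posterior class probabilities:", np.round(classify(respondent), 3))
```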