Aerosol profiling during the large scale field campaign CINDI-2
NASA Astrophysics Data System (ADS)
Apituley, Arnoud; Roozendael, Michel Van; Richter, Andreas; Wagner, Thomas; Friess, Udo; Hendrick, Francois; Kreher, Karin; Tirpitz, Jan-Lukas
2018-04-01
For the validation of spaceborne observations of NO2 and other trace gases from hyperspectral imagers, ground-based instruments using the MAXDOAS technique are an excellent choice, since they rely on retrieval techniques similar to those used for the observations from orbit. To ensure proper traceability of the MAXDOAS observations, thorough validation and intercomparison are mandatory. Advanced MAXDOAS observation and retrieval techniques enable inferring the vertical structure of trace gases and aerosols; these techniques and their results need validation by, e.g., lidar techniques. To properly understand the results from passive remote sensing techniques, independent observations are required that constrain the light paths, i.e. in-situ aerosol observations of optical and microphysical properties and, in particular, vertical profiles of aerosol optical properties from (Raman) lidar. The approach used in the CINDI-2 campaign held in Cabauw in 2016 is presented in this paper, and the results will be discussed in the presentation at the conference.
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study, which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), a survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification database impact. An annotated bibliography of all documents generated during this study is provided.
Vision-based system identification technique for building structures using a motion capture system
NASA Astrophysics Data System (ADS)
Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon
2015-11-01
This paper presents a new vision-based system identification (SI) technique for building structures by using a motion capture system (MCS). The MCS, with outstanding capabilities for dynamic response measurement, can provide gage-free measurements of vibrations through the convenient installation of multiple markers. In this technique, the dynamic characteristics (natural frequency, mode shape, and damping ratio) of building structures are extracted from the dynamic displacement responses measured by the MCS, by converting the displacements to accelerations and conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI directly applying the MCS-measured displacements to FDD was performed and showed results identical to those of the conventional SI method.
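The displacement-to-acceleration conversion and the FDD step can be sketched in a few lines; a minimal version assuming evenly sampled multi-channel marker data (the function name, window length, and simple double differentiation are illustrative, not the authors' implementation):

```python
import numpy as np
from scipy.signal import csd

def fdd_modal_id(displ, fs):
    """Frequency Domain Decomposition from marker displacements (sketch).

    displ : (n_samples, n_channels) displacement time histories
    fs    : sampling rate in Hz
    Peaks of the first singular value indicate natural frequencies;
    the corresponding first singular vectors approximate mode shapes.
    """
    # Convert displacement to acceleration by double differentiation
    acc = np.gradient(np.gradient(displ, axis=0), axis=0) * fs**2

    n_ch = acc.shape[1]
    # Build the cross-spectral density matrix G(f) channel by channel
    f, _ = csd(acc[:, 0], acc[:, 0], fs=fs, nperseg=1024)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(acc[:, i], acc[:, j], fs=fs, nperseg=1024)

    # SVD of G at each frequency line; s1 peaks at the modes
    s1 = np.empty(len(f))
    modes = np.empty((len(f), n_ch), dtype=complex)
    for k in range(len(f)):
        U, S, _ = np.linalg.svd(G[k])
        s1[k], modes[k] = S[0], U[:, 0]
    return f, s1, modes
```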
Validation of Regression-Based Myogenic Correction Techniques for Scalp and Source-Localized EEG
McMenamin, Brenton W.; Shackman, Alexander J.; Maxwell, Jeffrey S.; Greischar, Lawrence L.; Davidson, Richard J.
2008-01-01
EEG and EEG source-estimation are susceptible to electromyographic (EMG) artifacts generated by the cranial muscles. EMG can mask genuine effects or masquerade as a legitimate effect, even in low frequencies such as alpha (8–13 Hz). Although regression-based correction has been used previously, only cursory attempts at validation exist, and its utility for source-localized data is unknown. To address this, EEG was recorded from 17 participants while neurogenic and myogenic activity were factorially varied. We assessed the sensitivity and specificity of four regression-based techniques: between-subjects, between-subjects using difference scores, within-subject condition-wise, and within-subject epoch-wise, on the scalp and in data modeled using the LORETA algorithm. Although the within-subject epoch-wise technique showed superior performance on the scalp, no technique succeeded in the source space. Aside from validating the novel epoch-wise methods on the scalp, we highlight methods requiring further development. PMID:19298626
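Read this way, the within-subject epoch-wise technique amounts to regressing each channel's per-epoch band power on the corresponding EMG estimate and keeping the residuals; a hedged sketch (the names and the re-centering step are assumptions, not the paper's exact procedure):

```python
import numpy as np

def epochwise_emg_correction(eeg_power, emg_power):
    """Within-subject, epoch-wise regression correction (sketch).

    eeg_power, emg_power : (n_epochs,) band power per epoch for one
    channel of one subject. Returns EEG power with the component
    linearly predictable from EMG removed.
    """
    X = np.column_stack([np.ones_like(emg_power), emg_power])
    beta, *_ = np.linalg.lstsq(X, eeg_power, rcond=None)
    residual = eeg_power - X @ beta
    # Add the subject's mean back so corrected values stay on scale
    return residual + eeg_power.mean()
```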
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, A H; Kerr, L A; Cailliet, G M
2007-11-04
Canary rockfish (Sebastes pinniger) have long been an important part of recreational and commercial rockfish fishing from southeast Alaska to southern California, but localized stock abundances have declined considerably. Based on age estimates from otoliths and other structures, lifespan estimates vary from about 20 years to over 80 years. For the purpose of monitoring stocks, age composition is routinely estimated by counting growth zones in otoliths; however, age estimation procedures and lifespan estimates remain largely unvalidated. Typical age validation techniques have limited application for canary rockfish because they are deep dwelling and may be long lived. In this study, the unaged otolith of the pair from fish aged at the Department of Fisheries and Oceans Canada was used in one of two age validation techniques: (1) lead-radium dating and (2) bomb radiocarbon (¹⁴C) dating. Age estimate accuracy and the validity of age estimation procedures were validated based on the results from each technique. Lead-radium dating proved successful in determining a minimum lifespan estimate of 53 years and provided support for age estimation procedures up to about 50-60 years. These findings were further supported by Δ¹⁴C data, which indicated a minimum lifespan estimate of 44 ± 3 years. Both techniques validate, to differing degrees, age estimation procedures and provide support for inferring that canary rockfish can live more than 80 years.
NASA Technical Reports Server (NTRS)
Ray, Ronald J.
1994-01-01
New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
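For the throttle frequency sweep, a standard cross-spectral transfer-function estimate between throttle position and calculated thrust illustrates the kind of frequency domain analysis described; this SciPy sketch uses generic parameters and is not the report's algorithm:

```python
import numpy as np
from scipy.signal import welch, csd

def frequency_response(throttle, thrust, fs):
    """Estimate the frequency response from a throttle sweep maneuver:
    H(f) = cross-spectrum(throttle, thrust) / autospectrum(throttle).
    Comparing |H| and phase across thrust-calculation methods helps
    separate model error from instrumentation response limits."""
    f, Pxx = welch(throttle, fs=fs, nperseg=512)
    _, Pxy = csd(throttle, thrust, fs=fs, nperseg=512)
    H = Pxy / Pxx
    return f, np.abs(H), np.angle(H, deg=True)
```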
Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods
Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil
2015-01-01
We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method, called “Patient Recursive Survival Peeling”, is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called “combined” cross-validation, is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922
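In outline, "combined" cross-validation pools the held-out samples across folds and evaluates once on the pooled set instead of averaging per-fold statistics; a schematic sketch with a generic model interface (`fit` and `predict` are hypothetical stand-ins for the recursive-peeling estimator):

```python
import numpy as np
from sklearn.model_selection import KFold

def combined_cv(fit, predict, X, y, n_splits=5, seed=0):
    """'Combined' cross-validation (sketch): pool held-out predictions
    across folds and evaluate once on the pooled set, rather than
    averaging per-fold statistics."""
    pooled = np.empty(len(y))
    for train, test in KFold(n_splits, shuffle=True,
                             random_state=seed).split(X):
        model = fit(X[train], y[train])
        pooled[test] = predict(model, X[test])
    return pooled  # score these pooled predictions once, e.g. by log-rank
```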
NASA Astrophysics Data System (ADS)
Roushangar, Kiyoumars; Mehrabani, Fatemeh Vojoudi; Shiri, Jalal
2014-06-01
This study presents Artificial Intelligence (AI)-based modeling of total bed material load, aimed at improving the accuracy of the predictions of traditional models. Gene expression programming (GEP) and adaptive neuro-fuzzy inference system (ANFIS)-based models were developed and validated for the estimations. Sediment data from the Qotur River (northwestern Iran) were used for development and validation of the applied techniques. In order to assess the applied techniques against traditional models, stream-power-based and shear-stress-based physical models were also applied to the studied case. The obtained results reveal that the developed AI-based models, using a minimum number of dominant factors, give more accurate results than the other applied models. It was also revealed that the k-fold test is a practical but computationally costly technique for completely scanning the applied data and avoiding over-fitting.
NASA Astrophysics Data System (ADS)
Haddad, Khaled; Rahman, Ataur; A Zaman, Mohammad; Shrestha, Surendra
2013-03-01
In regional hydrologic regression analysis, model selection and validation are regarded as important steps. Here, model selection is usually based on some measure of goodness-of-fit between the model prediction and observed data. In Regional Flood Frequency Analysis (RFFA), leave-one-out (LOO) validation or a fixed-percentage leave-out validation (e.g., 10%) is commonly adopted to assess the predictive ability of regression-based prediction equations. This paper develops a Monte Carlo Cross Validation (MCCV) technique (which has been widely adopted in chemometrics and econometrics) for RFFA using Generalised Least Squares Regression (GLSR) and compares it with the most commonly adopted LOO validation approach. The study uses simulated and regional flood data from the state of New South Wales in Australia. It is found that when developing hydrologic regression models, application of MCCV is likely to result in a more parsimonious model than LOO. It has also been found that MCCV can provide a more realistic estimate of a model's predictive ability when compared with LOO.
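The MCCV-versus-LOO comparison can be sketched with scikit-learn, with ordinary least squares standing in for GLSR and random placeholder data standing in for the New South Wales catchment attributes:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import ShuffleSplit, LeaveOneOut, cross_val_score

# Placeholder catchment attributes and flood quantiles
X, y = np.random.rand(100, 3), np.random.rand(100)

# Monte Carlo Cross Validation: many random train/test splits,
# here leaving out 30% in each of 200 repetitions
mccv = ShuffleSplit(n_splits=200, test_size=0.3, random_state=1)
mccv_mse = -cross_val_score(LinearRegression(), X, y,
                            cv=mccv, scoring="neg_mean_squared_error").mean()

# Leave-one-out validation for comparison
loo_mse = -cross_val_score(LinearRegression(), X, y,
                           cv=LeaveOneOut(), scoring="neg_mean_squared_error").mean()
print(f"MCCV MSE: {mccv_mse:.4f}  LOO MSE: {loo_mse:.4f}")
```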
NASA Technical Reports Server (NTRS)
Hardman, R. R.; Mahan, J. R.; Smith, M. H.; Gelhausen, P. A.; Van Dalsem, W. R.
1991-01-01
The need for a validation technique for computational fluid dynamics (CFD) codes in STOVL applications has led to research efforts to apply infrared thermal imaging techniques to visualize gaseous flow fields. Specifically, a heated, free-jet test facility was constructed. The gaseous flow field of the jet exhaust was characterized using an infrared imaging technique in the 2 to 5.6 micron wavelength band as well as conventional pitot tube and thermocouple methods. These infrared images are compared to computer-generated images using the equations of radiative exchange based on the temperature distribution in the jet exhaust measured with the thermocouple traverses. Temperature and velocity measurement techniques, infrared imaging, and the computer model of the infrared imaging technique are presented and discussed. From the study, it is concluded that infrared imaging techniques coupled with the radiative exchange equations applied to CFD models are a valid method to qualitatively verify CFD codes used in STOVL applications.
Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Muravyov, Alexander A.
2000-01-01
A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.
Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina
2016-06-01
We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for the assessment of working technique during cash register work, with the aim of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained for only one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with acceptable accuracy from short periods of observation by one observer, as is often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol be used for educational purposes only.
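The two reliability criteria quoted above are simple to compute for any one protocol question; a small sketch (variable names are illustrative):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

def agreement_stats(rater_a, rater_b):
    """Proportional agreement and Cohen's kappa for one protocol
    question rated by two observers (or by one observer twice)."""
    rater_a, rater_b = np.asarray(rater_a), np.asarray(rater_b)
    prop_agree = np.mean(rater_a == rater_b)
    kappa = cohen_kappa_score(rater_a, rater_b)
    # Thresholds used in the study: acceptable if >0.7 and >0.4
    return prop_agree, kappa
```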
Blagus, Rok; Lusa, Lara
2015-11-04
Prediction models are used in clinical research to develop rules that can be used to accurately predict the outcome of patients based on some of their characteristics. They represent a valuable tool in the decision making process of clinicians and health policy makers, as they enable them to estimate the probability that patients have or will develop a disease, will respond to a treatment, or that their disease will recur. The interest devoted to prediction models in the biomedical community has been growing in the last few years. Often the data used to develop the prediction models are class-imbalanced, as only a few patients experience the event (and therefore belong to the minority class). Prediction models developed using class-imbalanced data tend to achieve sub-optimal predictive accuracy in the minority class. This problem can be diminished by using sampling techniques aimed at balancing the class distribution. These techniques include under- and oversampling, where a fraction of the majority class samples are retained in the analysis or new samples from the minority class are generated. The correct assessment of how the prediction model is likely to perform on independent data is of crucial importance; in the absence of an independent data set, cross-validation is normally used. While the importance of correct cross-validation is well documented in the biomedical literature, the challenges posed by the joint use of sampling techniques and cross-validation have not been addressed. We show that care must be taken to ensure that cross-validation is performed correctly on sampled data, and that the risk of overestimating the predictive accuracy is greater when oversampling techniques are used. Examples based on the re-analysis of real datasets and simulation studies are provided. We identify some results from the biomedical literature where incorrect cross-validation was performed and where we expect that the performance of oversampling techniques was heavily overestimated.
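The paper's central caution is that resampling must happen inside the cross-validation loop, after splitting, never before it; a sketch with simple random oversampling (logistic regression and the helper below are illustrative choices, not the paper's setup):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold

def oversample(X, y, rng):
    """Random oversampling: replicate minority-class samples until the
    classes balance. Assumes class 1 is the minority class."""
    minority = np.flatnonzero(y == 1)
    extra = rng.choice(minority, size=np.sum(y == 0) - len(minority))
    idx = np.concatenate([np.arange(len(y)), extra])
    return X[idx], y[idx]

def correct_cv_auc(X, y, n_splits=5, seed=0):
    """Correct protocol: split first, then oversample ONLY the training
    fold. Oversampling before splitting leaks copies of test samples
    into training and inflates the accuracy estimate."""
    rng = np.random.default_rng(seed)
    aucs = []
    for train, test in StratifiedKFold(n_splits, shuffle=True,
                                       random_state=seed).split(X, y):
        Xt, yt = oversample(X[train], y[train], rng)
        model = LogisticRegression(max_iter=1000).fit(Xt, yt)
        aucs.append(roc_auc_score(y[test], model.predict_proba(X[test])[:, 1]))
    return np.mean(aucs)
```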
Validation techniques for fault emulation of SRAM-based FPGAs
Quinn, Heather; Wirthlin, Michael
2015-08-07
A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and estimating failure rates. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If a fault emulation system does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.
The Social Validity Assessment of Social Competence Intervention Behavior Goals
ERIC Educational Resources Information Center
Hurley, Jennifer J.; Wehby, Joseph H.; Feurer, Irene D.
2010-01-01
Social validation is the value judgment from society on the importance of a study. The social validity of behavior goals used in the social competence intervention literature was assessed using the Q-sort technique. The stimulus items were 80 different social competence behavior goals taken from 78 classroom-based social competence intervention…
Actor groups, related needs, and challenges at the climate downscaling interface
NASA Astrophysics Data System (ADS)
Rössler, Ole; Benestad, Rasmus; Diamando, Vlachogannis; Heike, Hübener; Kanamaru, Hideki; Pagé, Christian; Margarida Cardoso, Rita; Soares, Pedro; Maraun, Douglas; Kreienkamp, Frank; Christodoulides, Paul; Fischer, Andreas; Szabo, Peter
2016-04-01
At the climate downscaling interface, numerous downscaling techniques and different philosophies compete to be the best method on their own specific terms. However, it remains unclear to what extent and for which purposes these downscaling techniques are valid or even the most appropriate choice. A common validation framework that compares all the different available methods has been missing so far. The VALUE initiative closes this gap with such a common validation framework. An essential part of a validation framework for downscaling techniques is the definition of appropriate validation measures. The selection of validation measures should consider the needs of the stakeholder: some might need a temporal or spatial average of a certain variable, others might need temporal or spatial distributions of some variables, still others might need extremes of the variables of interest or even inter-variable dependencies. Hence, a close interaction of climate data providers and climate data users is necessary, and the challenge in formulating a common validation framework mirrors the challenges between the climate data providers and the impact assessment community. This poster elaborates on the issues and challenges at the downscaling interface as seen within the VALUE community. It suggests three different actor groups: one group consisting of the climate data providers, the other two being climate data users (impact modellers and societal users). Hence, the downscaling interface faces classical transdisciplinary challenges. We depict a graphical illustration of the actors involved and their interactions. In addition, we identified four types of issues that need to be considered: data-based, knowledge-based, communication-based, and structural issues. All of these may, individually or jointly, hinder an optimal exchange of data and information between the actor groups at the downscaling interface. Finally, some possible ways to tackle these issues are discussed.
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets, and performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
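The classification-and-validation stage can be outlined with scikit-learn; the feature matrix and labels below are placeholders for the denoised, attribute-selected signal windows:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder attributes extracted from "smart pig" signal windows;
# labels mark windows that contain a weld
X = np.random.rand(500, 12)
y = np.random.randint(0, 2, 500)

# Support vector machine with feature scaling, scored by ROC AUC
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
print(f"10-fold ROC AUC: {auc.mean():.2f} ± {auc.std():.2f}")
```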
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-01
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis. PMID:28787839
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Huiqiang; Wu, Xizeng, E-mail: xwu@uabmc.edu; Xiao, Tiqiao, E-mail: tqxiao@sinap.ac.cn
Purpose: Propagation-based phase-contrast CT (PPCT) applies highly sensitive phase-contrast technology to x-ray microtomography. Performing phase retrieval on the acquired angular projections can enhance image contrast and enable quantitative imaging. In this work, the authors demonstrate the validity and advantages of a novel technique for high-resolution PPCT using the generalized phase-attenuation duality (PAD) method of phase retrieval. Methods: A high-resolution angular projection data set of a fish head specimen was acquired with a monochromatic 60-keV x-ray beam. In one approach, the projection data were directly used for tomographic reconstruction. In two other approaches, the projection data were preprocessed by phase retrieval based on either the linearized PAD method or the generalized PAD method. The reconstructed images from all three approaches were then compared in terms of tissue contrast-to-noise ratio and spatial resolution. Results: The authors’ experimental results demonstrated the validity of the PPCT technique based on the generalized PAD method. In addition, the results show that the authors’ technique is superior to the direct PPCT technique as well as the linearized PAD-based PPCT technique in terms of their relative capabilities for tissue discrimination and characterization. Conclusions: This novel PPCT technique demonstrates great potential for biomedical imaging, especially for applications that require high spatial resolution and limited radiation exposure.
Noninvasive in vivo glucose sensing using an iris based technique
NASA Astrophysics Data System (ADS)
Webb, Anthony J.; Cameron, Brent D.
2011-03-01
Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane, and an insulin/dextrose protocol was used to control blood glucose concentration. To further restrict eye movement, a custom ocular fixation device was used. During the experimental time frame, near-infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and showed that a high percentage of the predicted glucose concentrations were within 20% of the reference values.
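The calibration and independent validation steps might look like the following, with placeholder features standing in for the near-infrared iris images and a simple within-20% criterion in place of full Clarke Error Grid Analysis:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Placeholder data: one feature vector per iris image (e.g., pixel
# intensities in a region of interest) with a reference glucometer value
X = np.random.rand(60, 300)           # image features
y = 80 + 120 * np.random.rand(60)     # reference glucose, mg/dL

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3,
                                              random_state=0)
pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
y_pred = pls.predict(X_val).ravel()

# Fraction of predictions within 20% of the reference value
within_20 = np.mean(np.abs(y_pred - y_val) / y_val < 0.20)
print(f"Within 20% of reference: {100 * within_20:.0f}%")
```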
Knowledge-based system verification and validation
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1990-01-01
The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBSs cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBSs, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBSs will be developed, and modification of the SSF to support these tools and methods will be addressed.
Definition and Demonstration of a Methodology for Validating Aircraft Trajectory Predictors
NASA Technical Reports Server (NTRS)
Vivona, Robert A.; Paglione, Mike M.; Cate, Karen T.; Enea, Gabriele
2010-01-01
This paper presents a new methodology for validating an aircraft trajectory predictor, inspired by the lessons learned from a number of field trials, flight tests and simulation experiments for the development of trajectory-predictor-based automation. The methodology introduces new techniques and a new multi-staged approach to reduce the effort in identifying and resolving validation failures, avoiding the potentially large costs associated with failures during a single-stage, pass/fail approach. As a case study, the validation effort performed by the Federal Aviation Administration for its En Route Automation Modernization (ERAM) system is analyzed to illustrate the real-world applicability of this methodology. During this validation effort, ERAM initially failed to achieve six of its eight requirements associated with trajectory prediction and conflict probe. The ERAM validation issues have since been addressed, but to illustrate how the methodology could have benefited the FAA effort, additional techniques are presented that could have been used to resolve some of these issues. Using data from the ERAM validation effort, it is demonstrated that these new techniques could have identified trajectory prediction error sources that contributed to several of the unmet ERAM requirements.
NASA Astrophysics Data System (ADS)
Natali, Marco; Reggente, Melania; Passeri, Daniele; Rossi, Marco
2016-06-01
The development of polymer-based nanocomposites to be used in critical thermal environments requires the characterization of their mechanical properties, which are related to their chemical composition, size, morphology and operating temperature. Atomic force microscopy (AFM) has been proven to be a useful tool to develop techniques for the mechanical characterization of these materials, thanks to its nanometer lateral resolution and to the capability of exerting ultra-low loads, down to the piconewton range. In this work, we demonstrate two techniques, one quasi-static, i.e., AFM-based indentation (I-AFM), and one dynamic, i.e., contact resonance AFM (CR-AFM), for the mechanical characterization of compliant materials at variable temperature. A cross-validation of I-AFM and CR-AFM has been performed by comparing the results obtained on two reference materials, i.e., low-density polyethylene (LDPE) and polycarbonate (PC), which demonstrated the accuracy of the techniques.
Justus, Alan L
2015-05-01
This paper presents technically-based techniques to deal with nuisance personnel contamination monitor (PCM) alarms. The techniques derive from the fundamental physical characteristics of radon progeny. Some PCM alarms, although valid alarms and not actually "false," could be due to nuisance naturally-occurring radionuclides (i.e., radon progeny). Based on certain observed characteristics of the radon progeny, several prompt techniques are discussed that could either remediate or at least mitigate the problem of nuisance alarms. Examples are provided which demonstrate the effective use of the techniques.
Graph-based real-time fault diagnostics
NASA Technical Reports Server (NTRS)
Padalkar, S.; Karsai, G.; Sztipanovits, J.
1988-01-01
A real-time fault detection and diagnosis capability is absolutely crucial in the design of large-scale space systems. Some of the existing AI-based fault diagnostic techniques, like expert systems and qualitative modelling, are frequently ill-suited for this purpose. Expert systems are often inadequately structured, difficult to validate, and suffer from knowledge acquisition bottlenecks. Qualitative modelling techniques sometimes generate a large number of failure source alternatives, thus hampering speedy diagnosis. In this paper we present a graph-based technique which is well suited for real-time fault diagnosis, structured knowledge representation and acquisition, and testing and validation. A Hierarchical Fault Model of the system to be diagnosed is developed. At each level of the hierarchy, there exist fault propagation digraphs denoting causal relations between failure modes of subsystems. The edges of such a digraph are weighted with fault propagation time intervals. Efficient and restartable graph algorithms are used for speedy on-line identification of failure source components.
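A toy version of the diagnosis step: each digraph edge carries a fault propagation time interval, and a candidate failure source must be consistent with every observed alarm time (the graph contents and the flat, single-level search are illustrative only):

```python
# Fault propagation digraph (sketch): edges carry (min, max)
# propagation-time intervals between failure modes
GRAPH = {
    "pump_fail":   {"low_pressure": (1, 3), "high_temp": (4, 8)},
    "valve_stuck": {"low_pressure": (0, 2)},
}

def consistent_sources(alarms, graph):
    """alarms: observed failure mode -> time since the first alarm.
    Returns sources whose intervals explain every observed alarm."""
    sources = []
    for src, edges in graph.items():
        ok = all(mode in edges and edges[mode][0] <= t <= edges[mode][1]
                 for mode, t in alarms.items())
        if ok:
            sources.append(src)
    return sources

print(consistent_sources({"low_pressure": 2, "high_temp": 5}, GRAPH))
# -> ['pump_fail']
```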
Statistical methodology: II. Reliability and validity assessment in study design, Part B.
Karras, D J
1997-02-01
Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but they do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires can often be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
ERIC Educational Resources Information Center
Austin, Bryan S.; Leahy, Michael J.
2015-01-01
Purpose: To construct and validate a new self-report instrument, the Clinical Judgment Skill Inventory (CJSI), inclusive of clinical judgment skill competencies that address counselor biases and evidence-based strategies. Method: An Internet-based survey design was used and an exploratory factor analysis was performed on a sample of rehabilitation…
Sensor data validation and reconstruction. Phase 1: System architecture study
NASA Technical Reports Server (NTRS)
1991-01-01
The sensor validation and data reconstruction task reviewed relevant literature and selected applicable validation and reconstruction techniques for further study; analyzed the selected techniques and emphasized those which could be used for both validation and reconstruction; analyzed Space Shuttle Main Engine (SSME) hot fire test data to determine statistical and physical relationships between various parameters; developed statistical and empirical correlations between parameters to perform validation and reconstruction tasks, using a computer aided engineering (CAE) package; and conceptually designed an expert system based knowledge fusion tool, which allows the user to relate diverse types of information when validating sensor data. The host hardware for the system is intended to be a Sun SPARCstation, but could be any RISC workstation with a UNIX operating system and a windowing/graphics system such as Motif or Dataviews. The information fusion tool is intended to be developed using the NEXPERT Object expert system shell, and the C programming language.
Podsakoff, Nathan P; Podsakoff, Philip M; Mackenzie, Scott B; Klinger, Ryan L
2013-01-01
Several researchers have persuasively argued that the most important evidence to consider when assessing construct validity is whether variations in the construct of interest cause corresponding variations in the measures of the focal construct. Unfortunately, the literature provides little practical guidance on how researchers can go about testing this. Therefore, the purpose of this article is to describe how researchers can use video techniques to test whether their scales measure what they purport to measure. First, we discuss how researchers can develop valid manipulations of the focal construct that they hope to measure. Next, we explain how to design a study to use this manipulation to test the validity of the scale. Finally, comparing and contrasting traditional and contemporary perspectives on validation, we discuss the advantages and limitations of video-based validation procedures. PsycINFO Database Record (c) 2013 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2016-05-01
Quality control is critical to manufacturing. Frequently, techniques are used to define object conformity bounds based on historical quality data. This paper considers techniques for bespoke and small-batch jobs that are not based on statistical models. These techniques also serve jobs where 100% validation is needed due to the mission- or safety-critical nature of particular parts. One issue with this type of system is alignment discrepancies between the generated model and the physical part. This paper discusses and evaluates techniques for characterizing and correcting alignment issues between the projected and perceived data sets to prevent errors attributable to misalignment.
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Cunningham, Kevin; Hill, Melissa A.
2013-01-01
Flight test and modeling techniques were developed for efficiently identifying global aerodynamic models that can be used to accurately simulate stall, upset, and recovery on large transport airplanes. The techniques were developed and validated in a high-fidelity fixed-base flight simulator using a wind-tunnel aerodynamic database, realistic sensor characteristics, and a realistic flight deck representative of a large transport aircraft. Results demonstrated that aerodynamic models for stall, upset, and recovery can be identified rapidly and accurately using relatively simple piloted flight test maneuvers. Stall maneuver predictions and comparisons of identified aerodynamic models with data from the underlying simulation aerodynamic database were used to validate the techniques.
Validation techniques of agent based modelling for geospatial simulations
NASA Astrophysics Data System (ADS)
Darvishi, M.; Ahmadi, G.
2014-10-01
One of the most interesting aspects of modelling and simulation study is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.
NASA Astrophysics Data System (ADS)
Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie
2017-12-01
In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied, and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.
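Such a detector can be outlined with scikit-learn; the four features echo the paper's examples (bandwidth, route length, etc.), but the data and the choice of classifier are placeholders:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder connection-request records: features such as requested
# bandwidth, route length, holding time, and request rate; labels mark
# whether a request belongs to an intrusion attempt
X = np.random.rand(2000, 4)
y = np.random.randint(0, 2, 2000)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"Detection accuracy: {acc.mean():.2f}")
```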
Continual Response Measurement: Design and Validation.
ERIC Educational Resources Information Center
Baggaley, Jon
1987-01-01
Discusses reliability and validity of continual response measurement (CRM), a computer-based measurement technique, and its use in social science research. Highlights include the importance of criterion-referencing the data, guidelines for designing studies using CRM, examples typifying their deductive and inductive functions, and a discussion of…
Validating a Geographical Image Retrieval System.
ERIC Educational Resources Information Center
Zhu, Bin; Chen, Hsinchun
2000-01-01
Summarizes a prototype geographical image retrieval system that demonstrates how to integrate image processing and information analysis techniques to support large-scale content-based image retrieval. Describes an experiment to validate the performance of this image retrieval system against that of human subjects by examining similarity analysis…
Validation of highly reliable, real-time knowledge-based systems
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1988-01-01
Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.
Increasing the Rate of Presentation and Use of Signals in Elementary Classroom Teachers
ERIC Educational Resources Information Center
Carnine, Douglas W.; Fink, William T.
1978-01-01
Two issues relevant to competency-based teacher training were investigated with 13 elementary teachers--the specification of acceptable implementation levels for validated techniques and the necessity and feasibility of providing training on those techniques. (Author)
All-sky photogrammetry techniques to georeference a cloud field
NASA Astrophysics Data System (ADS)
Crispel, Pierre; Roberts, Gregory
2018-01-01
In this study, we present a novel method of identifying and geolocalizing cloud field elements from a portable all-sky camera stereo network based on the ground and oriented towards the zenith. The methodology is mainly based on stereophotogrammetry, a 3-D reconstruction technique based on triangulation from corresponding stereo pixels in rectified images. In cases where clouds are horizontally separated, individual positions are identified with segmentation techniques based on hue filtering and contour detection algorithms. Macroscopic cloud field characteristics such as cloud layer base heights and velocity fields are also deduced. In addition, the methodology is fitted to the context of measurement campaigns, which impose simplicity of implementation, auto-calibration, and portability. Camera internal geometry models are determined a priori in the laboratory and validated to ensure a certain accuracy in the peripheral parts of the all-sky image. Then, stereophotogrammetry with dense 3-D reconstruction is applied with cameras spaced 150 m apart for two validation cases. The first validation case is carried out with cumulus clouds having a cloud base height of 1500 m a.g.l. The second validation case is carried out with two cloud layers: a cumulus fractus layer with a base height of 1000 m a.g.l. and an altocumulus stratiformis layer with a base height of 2300 m a.g.l. Velocity fields at cloud base are computed by tracking rectangular image patterns through successive shots. The height uncertainty is estimated by comparison with a Vaisala CL31 ceilometer located on the site. The uncertainties on the horizontal coordinates and on the velocity field are theoretically quantified using the experimental uncertainties of the cloud base height and camera orientation. In the first cumulus case, segmentation of the image is performed to identify individual clouds in the cloud field and determine the horizontal positions of the cloud centers.
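At the core of the stereophotogrammetry step is linear triangulation of corresponding rectified pixels; a standard direct-linear-transform sketch, assuming the 3x4 camera projection matrices have been calibrated beforehand:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3-D point from a stereo pair.

    P1, P2 : 3x4 camera projection matrices
    x1, x2 : corresponding pixel coordinates (u, v) in each image
    """
    # Each pixel contributes two linear constraints on the 3-D point
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # homogeneous -> Euclidean coordinates
```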
Van den Broeck, Joke; Rossi, Gina; De Clercq, Barbara; Dierckx, Eva; Bastiaansen, Leen
2013-01-01
Research on the applicability of the five factor model (FFM) to capture personality pathology coincided with the development of an FFM personality disorder (PD) count technique, which has been validated in adolescent, young, and middle-aged samples. This study extends the literature by validating this technique in an older sample. Five alternative FFM PD counts based upon the Revised NEO Personality Inventory (NEO PI-R) are computed and evaluated in terms of both convergent and divergent validity with the Assessment of DSM-IV Personality Disorders Questionnaire (ADP-IV; DSM-IV, Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition). For the best working count for each PD, normative data are presented, from which cut-off scores are derived. The validity of these cut-offs and their usefulness as a screening tool is tested against both a categorical (i.e., the DSM-IV Text Revision) and a dimensional (i.e., the Dimensional Assessment of Personality Pathology; DAPP) measure of personality pathology. All but the Antisocial and Obsessive-Compulsive counts exhibited adequate convergent and divergent validity, supporting the use of this method in older adults. Using the ADP-IV and the DAPP Short Form as validation criteria, the results corroborate the use of the FFM PD count technique to screen for PDs in older adults, in particular for the Paranoid, Borderline, Histrionic, Avoidant, and Dependent PDs. Given the age-neutrality of the NEO PI-R and the considerable lack of valid personality assessment tools, the current findings appear promising for the assessment of pathology in older adults.
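Mechanically, an FFM PD count is a sum of (possibly reversed) NEO PI-R facet scores compared against a cut-off. In this sketch the facet compositions are illustrative placeholders only; the validated counts use the published FFM prototype facets:

```python
# Hypothetical facet lists: a '-' prefix marks a facet scored in reverse
PD_FACETS = {
    "avoidant": ["anxiety", "self-consciousness", "-gregariousness",
                 "-excitement-seeking", "-assertiveness"],
}

def pd_count(facet_scores, facets):
    """Sum facet T-scores; reversed facets contribute (100 - T)."""
    total = 0.0
    for f in facets:
        if f.startswith("-"):
            total += 100 - facet_scores[f[1:]]
        else:
            total += facet_scores[f]
    return total

scores = {"anxiety": 65, "self-consciousness": 60, "gregariousness": 40,
          "excitement-seeking": 45, "assertiveness": 35}
print(pd_count(scores, PD_FACETS["avoidant"]))  # compare to a cut-off
```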
NASA Astrophysics Data System (ADS)
Adhikari, Nilanjan; Amin, Sk. Abdul; Saha, Achintya; Jha, Tarun
2018-03-01
Matrix metalloproteinase-2 (MMP-2) is a promising pharmacological target for designing potential anticancer drugs. MMP-2 plays critical functions in apoptosis by cleaving the DNA repair enzyme poly (ADP-ribose) polymerase (PARP). Moreover, MMP-2 expression triggers the vascular endothelial growth factor (VEGF), which has a positive influence on tumor size, invasion, and angiogenesis. There is therefore an urgent need to develop potential MMP-2 inhibitors without toxicity but with better pharmacokinetic properties. In this article, robust validated multi-quantitative structure-activity relationship (QSAR) modeling approaches were applied to a dataset of 222 MMP-2 inhibitors to explore the important structural and pharmacophoric requirements for higher MMP-2 inhibition. Different validated regression and classification-based QSARs, pharmacophore mapping, and 3D-QSAR techniques were performed. These results were challenged and subjected to further validation against 24 in-house MMP-2 inhibitors to judge the reliability of the models. All models were individually validated internally as well as externally, and were supported and validated by each other. The results were further justified by molecular docking analysis. The modeling techniques adopted here help not only to explore the necessary structural and pharmacophoric requirements but also to provide overall validation and refinement techniques for designing potential MMP-2 inhibitors.
NASA Astrophysics Data System (ADS)
Chockalingam, Letchumanan
2005-01-01
LANDSAT data of the Gunung Ledang region of Malaysia are considered for mapping certain hydrogeological features. To map these features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods, evaluated in terms of their validity in properly isolating features of hydrogeological interest, are discussed. As these techniques exploit the spectral aspects of the images, they have several limitations in meeting the objectives. To address these limitations, a morphological transformation, which considers structural rather than spectral aspects of the image, is applied to provide comparisons between the results derived from spectral-based and structural-based filtering techniques.
Monitoring fugitive methane and natural gas emissions, validation of measurement techniques.
NASA Astrophysics Data System (ADS)
Robinson, Rod; Innocenti, Fabrizio; Gardiner, Tom; Helmore, Jon; Finlayson, Andrew; Connor, Andy
2017-04-01
The detection and quantification of fugitive and diffuse methane emissions has become an increasing priority in recent years. As the requirements for routine measurement to support industry initiatives increase, there is a growing need to assess and validate the performance of fugitive emission measurement technologies. For reported emissions, traceability and comparability of measurements are important. This talk will present recent work addressing these needs. Differential Absorption Lidar (DIAL) is a laser-based remote sensing technology able to map the concentration of gases in the atmosphere and determine emission fluxes for fugitive emissions. A description of the technique and its application to determining fugitive emissions of methane from oil and gas operations and waste management sites will be given. As DIAL has gained acceptance as a powerful tool for the measurement and quantification of fugitive emissions, and given the rich data it produces, it is increasingly being used to assess and validate other measurement approaches. In addition, to support the validation of technologies, we have developed a portable controlled release facility able to simulate the emissions from area sources. This has been used to assess and validate techniques used to monitor emissions. The development and capabilities of the controlled release facility will be described. This talk will report on recent studies using DIAL and the controlled release facility to validate fugitive emission measurement techniques. This includes side-by-side comparisons of two DIAL systems, the application of both the DIAL technique and the controlled release facility in a major study carried out in 2015 by the South Coast Air Quality Management District (SCAQMD), in which a number of optical techniques were assessed, and the development of a prototype method validation approach for techniques used to measure methane emissions from shale gas sites. In conclusion, the talk will provide an update on the current status of the development of a European Standard for the measurement of fugitive emissions of VOCs and the use of validation data in the standardisation process, and discuss the application of this to methane measurement.
Real-Time Onboard Global Nonlinear Aerodynamic Modeling from Flight Data
NASA Technical Reports Server (NTRS)
Brandon, Jay M.; Morelli, Eugene A.
2014-01-01
Flight test and modeling techniques were developed to accurately identify global nonlinear aerodynamic models onboard an aircraft. The techniques were developed and demonstrated during piloted flight testing of an Aermacchi MB-326M Impala jet aircraft. Advanced piloting techniques and nonlinear modeling techniques based on fuzzy logic and multivariate orthogonal function methods were implemented with efficient onboard calculations and flight operations to achieve real-time maneuver monitoring and analysis, and near-real-time global nonlinear aerodynamic modeling and prediction validation testing in flight. Results demonstrated that global nonlinear aerodynamic models for a large portion of the flight envelope were identified rapidly and accurately using piloted flight test maneuvers during a single flight, with the final identified and validated models available before the aircraft landed.
Development and Validation of Cognitive Screening Instruments.
ERIC Educational Resources Information Center
Jarman, Ronald F.
The author suggests that most research on the early detection of learning disabilities is characterized by an ineffective and atheoretical method of selecting and validating tasks. An alternative technique is proposed, based on a neurological theory of cognitive processes, whereby task analysis is a first step, with empirical analyses as…
Nurmi, Johanna; Hagger, Martin S; Haukkala, Ari; Araújo-Soares, Vera; Hankonen, Nelli
2016-04-01
This study tested the predictive validity of a multitheory process model in which the effect of autonomous motivation from self-determination theory on physical activity participation is mediated by the adoption of self-regulatory techniques based on control theory. Finnish adolescents (N = 411, aged 17-19) completed a prospective survey including validated measures of the predictors and physical activity, at baseline and after one month (N = 177). A subsample used an accelerometer to objectively measure physical activity and further validate the physical activity self-report assessment tool (n = 44). Autonomous motivation statistically significantly predicted action planning, coping planning, and self-monitoring. Coping planning and self-monitoring mediated the effect of autonomous motivation on physical activity, although self-monitoring was the most prominent. Controlled motivation had no effect on self-regulation techniques or physical activity. Developing interventions that support autonomous motivation for physical activity may foster increased engagement in self-regulation techniques and positively affect physical activity behavior.
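The mediation logic can be illustrated with a basic product-of-coefficients test and a bootstrap confidence interval; the study's actual analysis was more elaborate, and the variable names here are generic:

```python
import numpy as np
import statsmodels.api as sm

def indirect_effect(x, m, y, n_boot=2000, seed=0):
    """Single-mediator sketch: the indirect effect of x (autonomous
    motivation) on y (physical activity) through m (e.g., self-
    monitoring) is a*b, with a percentile bootstrap interval."""
    rng = np.random.default_rng(seed)

    def ab(idx):
        # a: effect of x on the mediator m
        a = sm.OLS(m[idx], sm.add_constant(x[idx])).fit().params[1]
        # b: effect of m on y, controlling for x
        b = sm.OLS(y[idx], sm.add_constant(
                np.column_stack([x[idx], m[idx]]))).fit().params[2]
        return a * b

    n = len(x)
    est = ab(np.arange(n))
    boots = np.array([ab(rng.integers(0, n, n)) for _ in range(n_boot)])
    return est, np.percentile(boots, [2.5, 97.5])
```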
Verification and Validation of Autonomy Software at NASA
NASA Technical Reports Server (NTRS)
Pecheur, Charles
2000-01-01
Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques, and concrete experiments at NASA.
Yamaguti, M.; Muller, E.E.; Piffer, A.I.; Kich, J.D.; Klein, C.S.; Kuchiishi, S.S.
2008-01-01
Since Mycoplasma hyopneumoniae isolation in appropriate media is a difficult task and impractical for daily routine diagnostics, Nested-PCR (N-PCR) techniques are currently used to improve the direct diagnostic sensitivity of Swine Enzootic Pneumonia. In a first experiment, this paper describes an N-PCR technique optimization based on three variables (sampling site, sample transport medium, and DNA extraction method) using eight pigs. Based on the optimization results, a second experiment was conducted to test validity using 40 animals. In conclusion, the results of the N-PCR optimization and validation allow us to recommend this test as a routine monitoring diagnostic method for Mycoplasma hyopneumoniae infection in swine herds. PMID:24031248
A scalable correlator for multichannel diffuse correlation spectroscopy.
Stapels, Christopher J; Kolodziejski, Noah J; McAdams, Daniel; Podolsky, Matthew J; Fernandez, Daniel E; Farkas, Dana; Christian, James F
2016-02-01
Diffuse correlation spectroscopy (DCS) is a technique which enables powerful and robust non-invasive optical studies of tissue micro-circulation and vascular blood flow. The technique amounts to autocorrelation analysis of coherent photons after their migration through moving scatterers and subsequent collection by single-mode optical fibers. A primary cost driver of DCS instruments is the commercial hardware-based correlator, which limits the proliferation of multi-channel instruments for validation of perfusion analysis as a clinical diagnostic metric. We present the development of a low-cost scalable correlator enabled by microchip-based time-tagging, and a software-based multi-tau data analysis method. We will discuss the capabilities of the instrument as well as the implementation and validation of 2- and 8-channel systems built for live animal and pre-clinical settings.
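At the core of a software correlator is the normalized intensity autocorrelation g2(τ) computed from the binned photon stream. The sketch below uses plain linear lags; a real multi-tau implementation, as in the instrument described, would progressively coarsen the bin width at longer lags to bound memory and computation. The synthetic Poisson trace is an assumption.

```python
import numpy as np

def g2(counts, max_lag):
    """Normalized intensity autocorrelation g2(tau) = <I(t) I(t+tau)> / <I>^2
    computed from a binned photon-count trace (linear lags only; a multi-tau
    correlator would coarsen the bins at long lags)."""
    counts = np.asarray(counts, dtype=float)
    mean2 = counts.mean() ** 2
    lags = np.arange(1, max_lag + 1)
    return lags, np.array([
        np.mean(counts[:-lag] * counts[lag:]) / mean2 for lag in lags
    ])

# Example with a synthetic count trace
rng = np.random.default_rng(1)
trace = rng.poisson(lam=5.0, size=100_000)
lags, g = g2(trace, max_lag=50)   # ~1 for uncorrelated counts
```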
Development and validation of a sensor-based health monitoring model for the Parkview Bridge deck.
DOT National Transportation Integrated Search
2012-01-31
Accelerated bridge construction (ABC) using full-depth precast deck panels is an innovative technique that brings all the benefits listed under ABC to full fruition. However, this technique needs to be evaluated and the performance of the bridge ...
Walczyk, Jeffrey J.; Igou, Frank P.; Dixon, Alexa P.; Tcholakian, Talar
2013-01-01
This article critically reviews techniques and theories relevant to the emerging field of “lie detection by inducing cognitive load selectively on liars.” To help these techniques benefit from past mistakes, we start with a summary of the polygraph-based Controlled Question Technique (CQT) and the major criticisms of it made by the National Research Council (2003), including that it is not based on a validated theory and that its administration procedures have not been standardized. Lessons from the more successful Guilty Knowledge Test are also considered. The critical review that follows starts with the presentation of models and theories offering insights for cognitive lie detection that can theoretically undergird load-inducing approaches. This is followed by an evaluation of specific research-based, load-inducing proposals, especially for their susceptibility to rehearsal and other countermeasures. To help organize these proposals and suggest new directions for innovation and refinement, a theoretical taxonomy is presented based on the type of cognitive load induced in examinees (intrinsic or extraneous) and how open-ended the responses to test items are. Finally, four recommendations are proffered that can help researchers and practitioners avoid repeating the mistakes made with the CQT and yield new, valid cognitive lie detection technologies. PMID:23378840
A NASA/RAE cooperation in the development of a real-time knowledge-based autopilot
NASA Technical Reports Server (NTRS)
Daysh, Colin; Corbin, Malcolm; Butler, Geoff; Duke, Eugene L.; Belle, Steven D.; Brumbaugh, Randal W.
1991-01-01
As part of a US/UK cooperative aeronautical research program, a joint activity between the NASA Dryden Flight Research Facility and the Royal Aerospace Establishment on knowledge-based systems was established. This joint activity is concerned with tools and techniques for the implementation and validation of real-time knowledge-based systems. The proposed next stage of this research is described, in which some of the problems of implementing and validating a knowledge-based autopilot for a generic high-performance aircraft are investigated.
Pandey, Daya Shankar; Pan, Indranil; Das, Saptarshi; Leahy, James J; Kwapinski, Witold
2015-03-01
A multi-gene genetic programming technique is proposed as a new method to predict syngas yield production and the lower heating value for municipal solid waste gasification in a fluidized bed gasifier. The study shows that the predicted outputs of the municipal solid waste gasification process are in good agreement with the experimental dataset and also generalise well to validation (untrained) data. Published experimental datasets are used for model training and validation purposes. The results show the effectiveness of the genetic programming technique for solving complex nonlinear regression problems. The multi-gene genetic programming model is also compared with a single-gene genetic programming model to show the relative merits and demerits of the technique. This study demonstrates that the genetic programming based data-driven modelling strategy can be a good candidate for developing models for other types of fuels as well. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Bedewi, Nabih E.; Yang, Jackson C. S.
1987-01-01
Identification of the system parameters of a randomly excited structure may be treated using a variety of statistical techniques. Of all these techniques, the Random Decrement is unique in that it provides the homogeneous component of the system response. Using this quality, a system identification technique was developed based on a least-squares fit of the signatures to estimate the mass, damping, and stiffness matrices of a linear randomly excited system. The results of an experiment conducted on an offshore platform scale model to verify the validity of the technique and to demonstrate its application in damage detection are presented.
Wavelet Transform Based Filter to Remove the Notches from Signal Under Harmonic Polluted Environment
NASA Astrophysics Data System (ADS)
Das, Sukanta; Ranjan, Vikash
2017-12-01
The work proposes to eliminate the notches present in the synchronizing signal required for converter operation, which appear due to the switching of semiconductor devices connected to the system in a harmonic-polluted environment. The disturbances in the signal are suppressed by a novel wavelet-based filtering technique. In the proposed technique, the notches in the signal are determined and eliminated by a wavelet-based multi-rate filter using `Daubechies4' (db4) as the mother wavelet. The computational complexity of the adopted technique is much lower than that of conventional notch-filtering techniques. The proposed technique is developed in MATLAB/Simulink and finally validated with a dSPACE-1103 interface. The recovered signal thus obtained is almost free of notches.
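The abstract does not give the filter's internals, so the sketch below shows one generic way a db4 multirate filter can strip fast transients from a slowly varying synchronizing signal: decompose, discard the detail subbands, and reconstruct from the approximation, in effect a multirate low-pass. The sampling rate, decomposition level, and notch model are all assumptions of this sketch.

```python
import numpy as np
import pywt

def wavelet_denotch(signal, wavelet="db4", level=4):
    """Remove fast transients (notches) from a slowly varying synchronizing
    signal by zeroing the detail subbands of a multilevel 'db4' decomposition
    and reconstructing from the approximation only. A generic stand-in for
    the multi-rate filter described in the abstract."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(d) for d in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Example: a 50 Hz synchronizing sine sampled at 20 kHz with injected notches
fs = 20_000
t = np.arange(0, 0.1, 1 / fs)
sig = np.sin(2 * np.pi * 50 * t)
sig[::400] -= 0.8                      # commutation-like notches
recovered = wavelet_denotch(sig)       # close to the clean 50 Hz sine
```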
The Predictive Validity of CBM Writing Indices for Eighth-Grade Students
ERIC Educational Resources Information Center
Amato, Janelle M.; Watkins, Marley W.
2011-01-01
Curriculum-based measurement (CBM) is an alternative to traditional assessment techniques. Technical work has begun to identify CBM writing indices that are psychometrically sound for monitoring older students' writing proficiency. This study examined the predictive validity of CBM writing indices in a sample of 447 eighth-grade students.…
Center of pressure based segment inertial parameters validation
Rezzoug, Nasser; Gorce, Philippe; Isableu, Brice; Venture, Gentiane
2017-01-01
By proposing efficient methods for Body Segment Inertial Parameter (BSIP) estimation and validating them with a force plate, it is possible to improve the inverse dynamic computations that are necessary in multiple research areas. To date, a variety of studies have been conducted to improve BSIP estimation, but to our knowledge a real validation has never been completely successful. In this paper, we propose a validation method using both kinematic and kinetic parameters (contact forces) gathered from an optical motion capture system and a force plate, respectively. To compare BSIPs, we used the measured contact forces (force plate) as the ground truth, and reconstructed the displacements of the Center of Pressure (COP) using inverse dynamics from two different estimation techniques. Only minor differences were seen when comparing the estimated segment masses. Their influence on the COP computation, however, is large, and the results show very distinguishable patterns of the COP movements. Improving BSIP estimation techniques is crucial, as deviations in the estimates can result in large errors. This method could be used as a tool to validate BSIP estimation techniques. An advantage of this approach is that it facilitates the comparison between BSIP estimation methods and, more specifically, it shows the accuracy of those parameters. PMID:28662090
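The ground-truth side of such a comparison is the COP measured directly by the force plate. A minimal sketch of that computation from the standard force-plate equations; neglecting the vertical free moment and treating z0 (plate-surface offset) as known are simplifications of this sketch.

```python
import numpy as np

def center_of_pressure(forces, moments, z0=0.0):
    """Center of pressure in the force-plate frame from the ground reaction
    force F = (Fx, Fy, Fz) and moment M = (Mx, My, Mz) about the plate
    origin; z0 is the height of the plate surface above that origin."""
    Fx, Fy, Fz = forces.T
    Mx, My, Mz = moments.T
    cop_x = (-My + Fx * z0) / Fz
    cop_y = ( Mx + Fy * z0) / Fz
    return np.column_stack([cop_x, cop_y])

# One hypothetical sample of gait data
F = np.array([[20.0, 10.0, 700.0]])   # GRF in newtons
M = np.array([[35.0, -28.0, 1.5]])    # moments about plate origin (N*m)
print(center_of_pressure(F, M))       # [[0.04, 0.05]] metres
```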
NASA Astrophysics Data System (ADS)
Chowdhury, D. P.; Pal, Sujit; Parthasarathy, R.; Mathur, P. K.; Kohli, A. K.; Limaye, P. K.
1998-09-01
A thin layer activation (TLA) technique has been developed for Zr-based alloy materials, e.g., zircaloy II, using 40 MeV α-particles from the Variable Energy Cyclotron Centre at Calcutta. A brief description of the methodology of the TLA technique for determining surface wear is presented. The sensitivity of the measurement of surface wear in zircaloy material is found to be 0.22±0.05 μm. The surface wear determined by the TLA technique in zircaloy material used in pressurised heavy water reactors has been compared with that obtained by a conventional technique for the analytical validation of the TLA technique.
NASA Astrophysics Data System (ADS)
Benalcazar, Wladimir A.; Jiang, Zhi; Marks, Daniel L.; Geddes, Joseph B.; Boppart, Stephen A.
2009-02-01
We validate a molecular imaging technique called Nonlinear Interferometric Vibrational Imaging (NIVI) by comparing vibrational spectra with those acquired from Raman microscopy. This broadband coherent anti-Stokes Raman scattering (CARS) technique uses heterodyne detection and OCT acquisition and design principles to interfere a CARS signal generated by a sample with a local oscillator signal generated separately by a four-wave mixing process. These are mixed and demodulated by spectral interferometry. Its confocal configuration allows the acquisition of 3D images based on endogenous molecular signatures. Images of both phantoms and mammary tissues have been acquired with this instrument, and their spectra are compared with spontaneous Raman signatures.
The object the metaphor the power and evergreen or the eighth way to make a hypermedia project fail
NASA Technical Reports Server (NTRS)
Warren, Bruce A.
1990-01-01
A patented software technique is described that is necessary and sufficient to keep hypermedia data bases current with the manufacturing technology. The technique proved its validity in four years of use in petrochemical plants. This technique is based on the following principles: (1) the data base must be object structured, i.e., all components must retain visible individuality; (2) the author must see and experience the multimedia data objects while creating; and (3) the hypermedia tools must possess power in the form of unlimited capacity.
Comparability of a Paper-Based Language Test and a Computer-Based Language Test.
ERIC Educational Resources Information Center
Choi, Inn-Chull; Kim, Kyoung Sung; Boo, Jaeyool
2003-01-01
Utilizing the Test of English Proficiency developed by Seoul National University (TEPS), this study examined the comparability of a paper-based language test and a computer-based language test in terms of content and construct validation, employing content analyses based on corpus linguistic techniques in addition to such statistical analyses as…
NASA Astrophysics Data System (ADS)
Binol, Hamidullah; Bal, Abdullah; Cukur, Huseyin
2015-10-01
The performance of kernel-based techniques depends on the selection of kernel parameters; suitable parameter selection is therefore an important problem for many kernel-based techniques. This article presents a novel technique to learn the kernel parameters in a kernel Fukunaga-Koontz Transform based (KFKT) classifier. The proposed approach determines the appropriate values of the kernel parameters by optimizing an objective function constructed from the discrimination ability of KFKT. For this purpose we have utilized the differential evolution algorithm (DEA). The new technique overcomes some disadvantages, such as the high time consumption of the traditional cross-validation method, and it can be applied to any type of data. Experiments on target detection in hyperspectral images verify the effectiveness of the proposed method.
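A minimal sketch of the idea: tune an RBF kernel width with SciPy's differential evolution by maximizing a class-separability objective evaluated directly on kernel matrices. The Fisher-like objective below is a stand-in assumption, not the paper's exact KFKT discrimination measure, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.spatial.distance import cdist

def rbf_kernel(X, Y, gamma):
    return np.exp(-gamma * cdist(X, Y, "sqeuclidean"))

def separability(log_gamma, X0, X1):
    """Objective: within-class kernel similarity high, between-class
    similarity low (a Fisher-like surrogate for the KFKT criterion)."""
    gamma = 10.0 ** log_gamma
    within = rbf_kernel(X0, X0, gamma).mean() + rbf_kernel(X1, X1, gamma).mean()
    between = rbf_kernel(X0, X1, gamma).mean()
    return -(within - 2.0 * between)          # minimize the negative margin

rng = np.random.default_rng(2)
X0 = rng.normal(0.0, 1.0, size=(60, 5))       # background class
X1 = rng.normal(1.5, 1.0, size=(60, 5))       # target class
result = differential_evolution(separability, bounds=[(-4, 2)], args=(X0, X1))
best_gamma = 10.0 ** result.x[0]              # learned kernel parameter
```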
NASA Astrophysics Data System (ADS)
Antrakusuma, B.; Masykuri, M.; Ulfa, M.
2018-04-01
Advances in Android technology can be applied to chemistry learning. One complex chemistry concept is solubility equilibrium, which requires science process skills (SPS). This study aims to: 1) characterize a scientific-approach-based chemistry Android module for empowering SPS, and 2) assess the validity of the module based on content validity and a feasibility test. This research uses a Research and Development (RnD) approach. The research subjects were 135 students and three teachers at three high schools in Boyolali, Central Java. The content validity of the module was tested by seven experts using Aiken's V technique, and the module feasibility was tested with students and teachers in each school. The module can be accessed using an Android device. Content validation yielded V = 0.89 (valid), and the feasibility test scores of 81.63% (students) and 73.98% (teachers) indicate that the module meets good criteria.
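Aiken's V for one item is V = Σ(r_i − l) / (n(c − 1)), where the n judges rate on a scale with c categories whose lowest point is l. A small sketch with hypothetical judge scores (chosen here so the result lands near the module's reported value):

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V content-validity coefficient for one item.
    ratings: scores given by expert judges on an lo..hi scale.
    V ranges from 0 to 1; values near 1 indicate strong agreement
    that the item is relevant/valid."""
    n = len(ratings)
    c = hi - lo + 1                      # number of rating categories
    s = sum(r - lo for r in ratings)
    return s / (n * (c - 1))

# Seven hypothetical judges rating one module item on a 1-5 scale
print(round(aikens_v([5, 5, 4, 5, 4, 5, 4]), 2))  # 0.89
```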
Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika
2017-01-01
Computer work is associated with Musculoskeletal Disorders (MSDs). Several methods have been developed to assess computer work risk factors related to MSDs. This review aims to give an overview of current techniques available for pen-and-paper-based observational methods in assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods were focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. This review was developed to assess the risk factors, reliability and validity of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were posture, office components, force and repetition. Of the seven methods, only five had been tested for reliability; these were proven reliable and rated as moderate to good. Only four of the seven had been tested for validity, with moderate results. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and office environment, at office workstations and computer work. Although the most important factor in developing a tool is proper validation of the exposure assessment technique, some existing observational methods have not been tested for reliability and validity. Furthermore, this review provides researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.
Assessing Requirements Quality through Requirements Coverage
NASA Technical Reports Server (NTRS)
Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt
2008-01-01
In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
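For readers unfamiliar with MC/DC: each condition in a decision must be shown to independently affect the outcome, via pairs of tests that differ only in that condition. A small brute-force sketch of finding such independence pairs (the decision and its requirement are hypothetical, not from the paper):

```python
from itertools import product

def mcdc_pairs(predicate, n):
    """For each condition index, find pairs of test vectors that differ only
    in that condition and flip the decision outcome - the independence
    pairs required by Modified Condition/Decision Coverage (MC/DC)."""
    pairs = {i: [] for i in range(n)}
    for bits in product([False, True], repeat=n):
        for i in range(n):
            if not bits[i]:                      # count each pair once
                flipped = list(bits)
                flipped[i] = True
                if predicate(*bits) != predicate(*flipped):
                    pairs[i].append((bits, tuple(flipped)))
    return pairs

# Decision from a hypothetical requirement: engage autopilot if
# (armed and altitude_ok) or manual_override
decision = lambda armed, altitude_ok, manual_override: \
    (armed and altitude_ok) or manual_override
for cond, plist in mcdc_pairs(decision, 3).items():
    print(cond, plist[:1])   # one independence pair per condition suffices
```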
NASA Technical Reports Server (NTRS)
Sellers, William L., III; Dwoyer, Douglas L.
1992-01-01
The design of a hypersonic aircraft poses unique challenges to the engineering community. Problems with duplicating flight conditions in ground-based facilities have made performance predictions risky. Computational fluid dynamics (CFD) has been proposed as an additional means of providing design data. At the present time, CFD codes are being validated based on sparse experimental data and then used to predict performance at flight conditions with generally unknown levels of uncertainty. This paper will discuss the facility and measurement techniques that are required to support CFD development for the design of hypersonic aircraft. Illustrations are given of recent success in combining experiments and direct numerical simulation in CFD model development and validation for hypersonic perfect gas flows.
Study on rapid valid acidity evaluation of apple by fiber optic diffuse reflectance technique
NASA Astrophysics Data System (ADS)
Liu, Yande; Ying, Yibin; Fu, Xiaping; Jiang, Xuesong
2004-03-01
Some issues related to the nondestructive evaluation of valid acidity in intact apples by means of the Fourier transform near infrared (FTNIR) (800-2631 nm) method were addressed. A relationship was established between the diffuse reflectance spectra recorded with a bifurcated optic fiber and the valid acidity. The data were analyzed by multivariate calibration methods such as partial least squares (PLS) analysis and the principal component regression (PCR) technique. A total of 120 Fuji apples were tested, and 80 of them were used to form a calibration data set. The influences of data preprocessing and different spectral treatments were also investigated. Models based on smoothed spectra were slightly worse than models based on derivative spectra, and the best result was obtained when the segment length was 5 and the gap size was 10. Depending on the data preprocessing and multivariate calibration technique, the best prediction model, obtained by PLS analysis, had a high correlation coefficient (0.871), a low RMSEP (0.0677), a low RMSEC (0.056) and a small difference between RMSEP and RMSEC. The results point out the feasibility of FTNIR spectral analysis for predicting fruit valid acidity non-destructively. The ratio of the data standard deviation to the root mean square error of prediction (SDR) should preferably exceed 3 for a usable calibration model; however, the present results do not meet this demand for actual application. Therefore, further study is required for better calibration and prediction.
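A minimal sketch of this calibration workflow with scikit-learn: fit PLS on a calibration set, then report RMSEC, RMSEP and the SDR. The spectra and acidity values below are synthetic stand-ins for the 120-apple data set, and the component count is an assumption.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

# X: preprocessed diffuse-reflectance spectra (n_samples x n_wavelengths),
# y: reference valid acidity. Values below are synthetic stand-ins.
rng = np.random.default_rng(3)
X = rng.normal(size=(120, 300))
y = X[:, :10].sum(axis=1) * 0.01 + rng.normal(scale=0.05, size=120)

train, test = np.arange(80), np.arange(80, 120)  # 80 calibration, 40 prediction
pls = PLSRegression(n_components=8).fit(X[train], y[train])

rmsec = mean_squared_error(y[train], pls.predict(X[train]).ravel()) ** 0.5
rmsep = mean_squared_error(y[test], pls.predict(X[test]).ravel()) ** 0.5
sdr = y[test].std() / rmsep   # SDR (a.k.a. RPD); > 3 is the usual target
print(rmsec, rmsep, sdr)
```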
Yue, Jin-feng; Qiao, Guan-hua; Liu, Ni; Nan, Fa-jun; Gao, Zhao-bing
2016-01-01
Aim: To establish an improved high-throughput screening technique for identifying novel KCNQ2 channel activators. Methods: KCNQ2 channels were stably expressed in CHO cells (KCNQ2 cells). A thallium flux assay was used for primary screening, and a 384-well automated patch-clamp system (IonWorks Barracuda) was used for hit validation. Two validated activators were characterized using a conventional patch-clamp recording technique. Results: From a collection of 80 000 compounds, the primary screening revealed a total of 565 compounds that potentiated the fluorescence signals in the thallium flux assay by more than 150%. When the 565 hits were examined in IonWorks Barracuda, 38 compounds significantly enhanced the outward currents recorded in KCNQ2 cells, and were confirmed as KCNQ2 activators. In conventional patch-clamp recordings, the two validated activators ZG1732 and ZG2083 enhanced KCNQ2 currents with EC50 values of 1.04±0.18 μmol/L and 1.37±0.06 μmol/L, respectively. Conclusion: The combination of the thallium flux assay and the IonWorks Barracuda assay is an efficient high-throughput screening (HTS) route for discovering KCNQ2 activators. PMID:26725738
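EC50 values like those quoted are typically obtained by fitting a Hill (four-parameter logistic) curve to concentration-response data. A sketch with hypothetical response values, not the paper's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(c, bottom, top, ec50, n):
    """Four-parameter logistic (Hill) curve for concentration-response data."""
    return bottom + (top - bottom) / (1.0 + (ec50 / c) ** n)

# Hypothetical normalized current enhancement vs activator concentration
conc = np.array([0.03, 0.1, 0.3, 1.0, 3.0, 10.0])      # umol/L
resp = np.array([0.04, 0.11, 0.30, 0.52, 0.79, 0.95])  # fraction of max

params, _ = curve_fit(hill, conc, resp, p0=[0.0, 1.0, 1.0, 1.0])
ec50 = params[2]   # in umol/L, comparable in spirit to the reported EC50s
print(ec50)
```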
Intrathoracic airway measurement: ex-vivo validation
NASA Astrophysics Data System (ADS)
Reinhardt, Joseph M.; Raab, Stephen A.; D'Souza, Neil D.; Hoffman, Eric A.
1997-05-01
High-resolution x-ray CT (HRCT) provides detailed images of the lungs and bronchial tree. HRCT-based imaging and quantitation of peripheral bronchial airway geometry provides a valuable tool for assessing regional airway physiology. Such measurements have been used to address physiological questions related to the mechanics of airway collapse in sleep apnea, the measurement of airway response to bronchoconstriction agents, and the evaluation and tracking of the progression of diseases affecting the airways, such as asthma and cystic fibrosis. Significant attention has been paid to the measurement of extra- and intra-thoracic airways in 2D sections from volumetric x-ray CT. A variety of manual and semi-automatic techniques have been proposed for airway geometry measurement, including the use of standardized display window and level settings for caliper measurements, methods based on manual or semi-automatic border tracing, and more objective, quantitative approaches such as the use of the 'half-max' criterion. A recently proposed measurement technique uses a model-based deconvolution to estimate the location of the inner and outer airway walls. Validation using a plexiglass phantom indicates that the model-based method is more accurate than the half-max approach for thin-walled structures. In vivo validation of these airway measurement techniques is difficult because of the problems in identifying a reliable measurement 'gold standard.' In this paper we report on ex vivo validation of the half-max and model-based methods using an excised pig lung. The lung is sliced into thin sections of tissue and scanned using an electron beam CT scanner. Airways of interest are measured from the CT images, and also measured with a microscope and micrometer to obtain a measurement gold standard. The results show no significant difference between the model-based measurements and the gold standard, while the half-max estimates exhibited a measurement bias and were significantly different from the gold standard.
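The half-max criterion places each wall boundary where the intensity crosses halfway between the local extremes along a ray cast across the wall. A sketch on a 1-D profile, with sub-pixel crossings by linear interpolation; the synthetic Gaussian wall profile is an assumption for illustration.

```python
import numpy as np

def half_max_edges(profile, x=None):
    """Locate wall positions on a 1-D ray across an airway wall as the points
    where intensity crosses halfway between the profile's minimum and
    maximum - the 'half-max' criterion, with linear interpolation for
    sub-pixel estimates."""
    profile = np.asarray(profile, dtype=float)
    if x is None:
        x = np.arange(profile.size, dtype=float)
    half = 0.5 * (profile.min() + profile.max())
    above = profile >= half
    edges = []
    for i in np.flatnonzero(above[1:] != above[:-1]):
        # interpolate the crossing between samples i and i+1
        frac = (half - profile[i]) / (profile[i + 1] - profile[i])
        edges.append(x[i] + frac * (x[i + 1] - x[i]))
    return edges  # two crossings bracket the wall; spacing = wall thickness

# Example: synthetic bright wall on a dark background
xs = np.linspace(0, 10, 101)
prof = np.exp(-((xs - 5.0) / 1.2) ** 2)
print(half_max_edges(prof, xs))   # approximately [4.0, 6.0]
```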
NASA Astrophysics Data System (ADS)
Yepes, Pablo P.; Eley, John G.; Liu, Amy; Mirkovic, Dragan; Randeniya, Sharmalee; Titt, Uwe; Mohan, Radhe
2016-04-01
Monte Carlo (MC) methods are acknowledged as the most accurate technique to calculate dose distributions. However, due to their lengthy calculation times, they are difficult to utilize in the clinic or for large retrospective studies. Track-repeating algorithms, based on MC-generated particle track data in water, accelerate dose calculations substantially, while essentially preserving the accuracy of MC. In this study, we present the validation of an efficient dose calculation algorithm for intensity modulated proton therapy, the fast dose calculator (FDC), based on a track-repeating technique. We validated the FDC algorithm for 23 patients, which included 7 brain, 6 head-and-neck, 5 lung, 1 spine, 1 pelvis and 3 prostate cases. For validation, we compared FDC-generated dose distributions with those from a full-fledged Monte Carlo code based on GEANT4 (G4). We compared dose-volume histograms and 3D gamma indices, and analyzed a series of dosimetric indices. More than 99% of the voxels in the voxelized phantoms describing the patients have a gamma index smaller than unity for the 2%/2 mm criterion. In addition, the differences between the dosimetric indices calculated with FDC and G4, relative to the prescribed dose, are less than 1%. FDC reduces the calculation times from 5 ms per proton to around 5 μs.
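The 2%/2 mm figure refers to the gamma index, which combines a dose-difference tolerance with a distance-to-agreement (DTA) tolerance. A brute-force sketch of the pass-rate computation; it is suitable only for small arrays, and the wrap-around of np.roll at the borders is a further simplification.

```python
import numpy as np

def gamma_pass_rate(ref, test, spacing, dd=0.02, dta=2.0, cutoff=0.1):
    """Fraction of voxels with 3-D gamma <= 1 (global gamma: dose difference
    dd as a fraction of the reference maximum, DTA in mm). Brute-force
    search over a small neighborhood; far too slow for clinical grids."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    dmax = ref.max()
    reach = [int(np.ceil(2 * dta / s)) for s in spacing]   # search window
    evaluated = ref >= cutoff * dmax                        # ignore low dose
    gamma2 = np.full(ref.shape, np.inf)
    for dz in range(-reach[0], reach[0] + 1):
        for dy in range(-reach[1], reach[1] + 1):
            for dx in range(-reach[2], reach[2] + 1):
                shifted = np.roll(test, (dz, dy, dx), axis=(0, 1, 2))
                r2 = ((dz * spacing[0]) ** 2 + (dy * spacing[1]) ** 2
                      + (dx * spacing[2]) ** 2) / dta ** 2
                d2 = ((shifted - ref) / (dd * dmax)) ** 2
                gamma2 = np.minimum(gamma2, r2 + d2)
    return np.mean(gamma2[evaluated] <= 1.0)

# Tiny synthetic example
rng = np.random.default_rng(4)
dose = rng.random((12, 12, 12))
print(gamma_pass_rate(dose, dose * 1.01, spacing=(2.0, 2.0, 2.0)))
```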
An Application-Based Discussion of Construct Validity and Internal Consistency Reliability.
ERIC Educational Resources Information Center
Taylor, Dianne L.; Campbell, Kathleen T.
Several techniques for conducting studies of measurement integrity are explained and illustrated using a heuristic data set from a study of teachers' participation in decision making (D. L. Taylor, 1991). The sample consisted of 637 teachers. It is emphasized that validity and reliability are characteristics of data, and do not inure to tests as…
A formal approach to validation and verification for knowledge-based control systems
NASA Technical Reports Server (NTRS)
Castore, Glen
1987-01-01
As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.
A semi-automatic method for left ventricle volume estimate: an in vivo validation study
NASA Technical Reports Server (NTRS)
Corsi, C.; Lamberti, C.; Sarti, A.; Saracino, G.; Shiota, T.; Thomas, J. D.
2001-01-01
This study aims at the validation of left ventricular (LV) volume estimates obtained by processing volumetric data with a segmentation model based on the level set technique. The validation has been performed by comparing real-time volumetric echo data (RT3DE) and magnetic resonance imaging (MRI) data. A validation protocol was defined and applied to twenty-four estimates (range 61-467 ml) obtained from normal and pathologic subjects who underwent both RT3DE and MRI. A statistical analysis was performed on each estimate and on clinical parameters such as stroke volume (SV) and ejection fraction (EF). Assuming the MRI estimates (x) as a reference, an excellent correlation was found with the volumes measured using the segmentation procedure (y) (y=0.89x + 13.78, r=0.98). The mean error on SV was 8 ml and the mean error on EF was 2%. This study demonstrated that the segmentation technique is reliably applicable to human hearts in clinical practice.
Levecke, Bruno; De Wilde, Nathalie; Vandenhoute, Els; Vercruysse, Jozef
2009-01-01
Background Soil-transmitted helminths, such as Trichuris trichiura, are of major concern in public health. Current efforts to control these helminth infections involve periodic mass treatment in endemic areas. Since these large-scale interventions are likely to intensify, monitoring drug efficacy will become indispensable. However, studies comparing detection techniques based on sensitivity, fecal egg counts (FEC), feasibility for mass diagnosis and drug efficacy estimates are scarce. Methodology/Principal Findings In the present study, the ether-based concentration, the Parasep Solvent Free (SF), the McMaster and the FLOTAC techniques were compared based on both validity and feasibility for the detection of Trichuris eggs in 100 fecal samples of nonhuman primates. In addition, the drug efficacy estimates of the quantitative techniques were examined using a statistical simulation. Trichuris eggs were found in 47% of the samples. FLOTAC was the most sensitive technique (100%), followed by the Parasep SF (83.0% [95% confidence interval (CI): 82.4–83.6%]) and the ether-based concentration technique (76.6% [95% CI: 75.8–77.3%]). The McMaster was the least sensitive (61.7% [95% CI: 60.7–62.6%]) and failed to detect low FEC. The quantitative comparison revealed a positive correlation between the four techniques (Rs = 0.85–0.93; p<0.0001). However, the ether-based concentration technique and the Parasep SF detected significantly fewer eggs than both the McMaster and the FLOTAC (p<0.0083). Overall, the McMaster was the most feasible technique (3.9 min/sample for preparing, reading and cleaning of the apparatus), followed by the ether-based concentration technique (7.7 min/sample) and the FLOTAC (9.8 min/sample). The Parasep SF was the least feasible (17.7 min/sample). The simulation revealed that sensitivity is less important for monitoring drug efficacy and that both FLOTAC and McMaster were reliable estimators. Conclusions/Significance The results of this study demonstrated that the McMaster is a promising technique when making use of FEC to monitor drug efficacy against Trichuris. PMID:19172171
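The kind of simulation described, comparing how counting techniques estimate drug efficacy, can be sketched by simulating analytic-sensitivity-limited egg counts and computing the fecal egg count reduction (FECR). All distributions and parameters below are assumptions for illustration, not the study's simulation design.

```python
import numpy as np

rng = np.random.default_rng(5)

def mcmaster_fec(true_epg, sensitivity=50):
    """Simulate a McMaster-style egg count: eggs in the counting chambers
    are Poisson distributed, and each counted egg represents 'sensitivity'
    eggs per gram (50 EPG is a common analytic sensitivity)."""
    return rng.poisson(true_epg / sensitivity) * sensitivity

def fecr(pre, post):
    """Fecal egg count reduction, the usual group-mean efficacy estimator:
    FECR = 1 - mean(post) / mean(pre)."""
    return 1.0 - np.mean(post) / np.mean(pre)

# Simulated trial: 40 animals, true efficacy 90%
true_pre = rng.gamma(shape=0.7, scale=500 / 0.7, size=40)  # overdispersed EPGs
pre = np.array([mcmaster_fec(m) for m in true_pre])
post = np.array([mcmaster_fec(0.1 * m) for m in true_pre])
print(fecr(pre, post))   # scatters around 0.90 across simulation runs
```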
Internal Cluster Validation on Earthquake Data in the Province of Bengkulu
NASA Astrophysics Data System (ADS)
Rini, D. S.; Novianti, P.; Fransiska, H.
2018-04-01
The K-means method is an algorithm that clusters n objects, based on their attributes, into k partitions, where k < n. A deficiency of the algorithm is that the k initial points are chosen randomly before execution, so the resulting clustering can differ between runs; if the random initialization is poor, the clustering becomes suboptimal. Cluster validation is a technique to determine the optimum number of clusters without prior information about the data. There are two types of cluster validation: internal cluster validation and external cluster validation. This study aims to examine and apply several internal cluster validation indices, including the Calinski-Harabasz (CH) index, Silhouette (S) index, Davies-Bouldin (DB) index, Dunn (D) index, and S-Dbw index, to earthquake data in Bengkulu Province. Based on internal cluster validation, the CH, S, and S-Dbw indices yield an optimum of k = 2, the DB index k = 6, and the D index k = 15. The optimum clustering (k = 6) based on the DB index gives good results for clustering earthquakes in Bengkulu Province.
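Three of the five indices used in the study are available directly in scikit-learn; the Dunn and S-Dbw indices would need third-party implementations. A sketch scanning k on stand-in data (the real study would use the earthquake catalogue features, e.g. magnitude, depth, coordinates):

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import (calinski_harabasz_score,
                             davies_bouldin_score, silhouette_score)

# Synthetic stand-in for the earthquake catalogue
X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(k,
          round(silhouette_score(X, labels), 3),         # higher is better
          round(calinski_harabasz_score(X, labels), 1),  # higher is better
          round(davies_bouldin_score(X, labels), 3))     # lower is better
```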
Mass spectrometry for fragment screening.
Chan, Daniel Shiu-Hin; Whitehouse, Andrew J; Coyne, Anthony G; Abell, Chris
2017-11-08
Fragment-based approaches in chemical biology and drug discovery have been widely adopted worldwide in both academia and industry. Fragment hits tend to interact weakly with their targets, necessitating the use of sensitive biophysical techniques to detect their binding. Common fragment screening techniques include differential scanning fluorimetry (DSF) and ligand-observed NMR. Validation and characterization of hits is usually performed using a combination of protein-observed NMR, isothermal titration calorimetry (ITC) and X-ray crystallography. In this context, MS is a relatively underutilized technique in fragment screening for drug discovery. MS-based techniques have the advantage of high sensitivity, low sample consumption and being label-free. This review highlights recent examples of the emerging use of MS-based techniques in fragment screening. © 2017 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.
CMOS array design automation techniques. [metal oxide semiconductors
NASA Technical Reports Server (NTRS)
Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.
1975-01-01
A low-cost, quick-turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology and required performance considerations, particularly high circuit speed.
On using sample selection methods in estimating the price elasticity of firms' demand for insurance.
Marquis, M Susan; Louis, Thomas A
2002-01-01
We evaluate a technique based on sample selection models that has been used by health economists to estimate the price elasticity of firms' demand for insurance. We demonstrate that this technique produces inflated estimates of the price elasticity, and we show that alternative methods lead to valid estimates.
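For context, the classic two-step (Heckman-style) sample selection estimator looks like the sketch below; note the paper's point is precisely that estimates from this family of models can be inflated, so this illustrates the technique under scrutiny rather than a recommended method. The data, equations, and coefficients are synthetic assumptions.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 2000

# Synthetic firms: z drives whether a firm offers insurance (selection),
# x (log price) drives how much coverage is demanded (outcome).
z = rng.normal(size=n)
x = rng.normal(size=n)
u = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n)
offers = (0.5 + 1.0 * z + u[:, 0] > 0)           # selection equation
demand = 2.0 - 1.5 * x + u[:, 1]                 # outcome equation

# Step 1: probit for the selection decision, then the inverse Mills ratio
Zmat = sm.add_constant(z)
probit = sm.Probit(offers.astype(float), Zmat).fit(disp=0)
xb = Zmat @ probit.params
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: outcome regression on the selected sample, augmented with the IMR
sel = offers
Xmat = sm.add_constant(np.column_stack([x[sel], imr[sel]]))
ols = sm.OLS(demand[sel], Xmat).fit()
print(ols.params)   # slope on x recovers the price-elasticity term (-1.5)
```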
Performance analysis of clustering techniques over microarray data: A case study
NASA Astrophysics Data System (ADS)
Dash, Rasmita; Misra, Bijan Bihari
2018-03-01
Handling big data is one of the major issues in the field of statistical data analysis. In such investigations, cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To address this problem, a grading approach over many clustering techniques is introduced to identify a stable technique. Because the grading depends on the characteristics of the dataset as well as on the validity indices, a two-stage grading approach is implemented. In this study the grading approach is applied to five clustering techniques, namely hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significantly better is also established by the Nemenyi post-hoc hypothesis test.
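A sketch of the grading-plus-significance idea: rank the techniques per dataset, then test for overall differences with a Friedman test; a Nemenyi post-hoc comparison (e.g. via the scikit-posthocs package) would then locate the pairwise differences. All scores below are hypothetical, and with so few datasets the test is illustrative only.

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical validity-index scores (higher = better) for 5 clustering
# techniques on 5 datasets; the study aggregates several indices per dataset.
scores = {
    "HSC":     [0.71, 0.64, 0.69, 0.73, 0.66],
    "k-means": [0.62, 0.60, 0.61, 0.65, 0.58],
    "PAM":     [0.65, 0.61, 0.66, 0.67, 0.60],
    "VQ":      [0.58, 0.55, 0.59, 0.61, 0.54],
    "AGNES":   [0.60, 0.57, 0.58, 0.63, 0.55],
}
mat = np.array(list(scores.values()))     # techniques x datasets

# Grade: average rank of each technique across datasets (1 = best)
ranks = (-mat).argsort(axis=0).argsort(axis=0) + 1
for name, r in zip(scores, ranks.mean(axis=1)):
    print(name, r)

# Friedman test that the techniques differ overall
stat, p = friedmanchisquare(*mat)
print(stat, p)
```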
Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti
2017-08-11
In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy, suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state-analysis-based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state-based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided in the quantitative estimation of material properties.
Remote sensing of high-latitude ionization profiles by ground-based and spaceborne instrumentation
NASA Technical Reports Server (NTRS)
Vondrak, R. R.
1981-01-01
Ionospheric specification and modeling are now largely based on data provided by active remote sensing with radiowave techniques (ionosondes, incoherent-scatter radars, and satellite beacons). More recently, passive remote sensing techniques have been developed that can be used to monitor quantitatively the spatial distribution of high-latitude E-region ionization. These passive methods depend on the measurement, or inference, of the energy distribution of precipitating kilovolt electrons, the principal source of the nighttime E-region at high latitudes. To validate these techniques, coordinated measurements of the auroral ionosphere have been made with the Chatanika incoherent-scatter radar and a variety of ground-based and spaceborne sensors.
Financial model calibration using consistency hints.
Abu-Mostafa, Y S
2001-01-01
We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to the Japanese yen swaps market and the US dollar yield market.
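In schematic form, the augmented objective described in the abstract is the curve-fitting error plus weighted consistency-hint terms, the weights being balanced via canonical errors. The notation below is ours, not the paper's:

```latex
E(\theta) \;=\; E_{\mathrm{fit}}(\theta) \;+\; \sum_{i} \lambda_i\, E_{\mathrm{hint},i}(\theta),
\qquad
E_{\mathrm{hint},i}(\theta) \;=\; D_{\mathrm{KL}}\!\big(p_i \,\big\|\, q_i(\theta)\big)
\;=\; \sum_k p_i(k)\,\ln\frac{p_i(k)}{q_i(k;\theta)}
```

where \(q_i(\cdot;\theta)\) is a distribution implied by the model parameters and \(p_i\) the corresponding consistency target.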
Empirical validation of an agent-based model of wood markets in Switzerland
Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver
2018-01-01
We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300
Deflection-Based Aircraft Structural Loads Estimation with Comparison to Flight
NASA Technical Reports Server (NTRS)
Lizotte, Andrew M.; Lokos, William A.
2005-01-01
Traditional techniques in structural load measurement entail the correlation of a known load with strain-gage output from the individual components of a structure or machine. The use of strain gages has proved successful and is considered the standard approach for load measurement. However, remotely measuring aerodynamic loads using deflection measurement systems to determine aeroelastic deformation as a substitute for strain gages may yield lower testing costs while improving aircraft performance through reduced instrumentation weight. This technique was examined using a reliable strain and structural deformation measurement system. The objective of this study was to explore the utility of deflection-based load estimation, using the active aeroelastic wing F/A-18 aircraft. Calibration data from ground tests performed on the aircraft were used to derive left wing-root and wing-fold bending-moment and torque load equations based on strain gages; however, for this study, point deflections were used to derive deflection-based load equations. Comparisons between the strain-gage and deflection-based methods are presented. Flight data from the phase-1 active aeroelastic wing flight program were used to validate the deflection-based load estimation method. Flight validation revealed a strong bending-moment correlation and a slightly weaker torque correlation. Development of current techniques and future studies are discussed.
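Whether the regressors are strain-gage outputs or point deflections, the load equations are typically linear fits calibrated on ground-test cases. A least-squares sketch in which all shapes and values are illustrative stand-ins, not the F/A-18 calibration data:

```python
import numpy as np

# Calibration: ground-test cases with known applied loads and measured
# point deflections. Shapes and values are illustrative only.
rng = np.random.default_rng(6)
n_cases, n_sensors = 40, 6
deflections = rng.normal(size=(n_cases, n_sensors))        # measured
true_coeffs = rng.normal(size=n_sensors)
bending_moment = deflections @ true_coeffs \
    + rng.normal(scale=0.01, size=n_cases)                 # "known" loads

# Derive a linear load equation by least squares (as with strain gages)
coeffs, *_ = np.linalg.lstsq(deflections, bending_moment, rcond=None)

# Flight: estimate the wing-root bending moment from new deflection data
flight_defl = rng.normal(size=(5, n_sensors))
estimated_moment = flight_defl @ coeffs
print(estimated_moment)
```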
Deflection-Based Structural Loads Estimation From the Active Aeroelastic Wing F/A-18 Aircraft
NASA Technical Reports Server (NTRS)
Lizotte, Andrew M.; Lokos, William A.
2005-01-01
Traditional techniques in structural load measurement entail the correlation of a known load with strain-gage output from the individual components of a structure or machine. The use of strain gages has proved successful and is considered the standard approach for load measurement. However, remotely measuring aerodynamic loads using deflection measurement systems to determine aeroelastic deformation as a substitute for strain gages may yield lower testing costs while improving aircraft performance through reduced instrumentation weight. This technique was examined using a reliable strain and structural deformation measurement system. The objective of this study was to explore the utility of deflection-based load estimation, using the active aeroelastic wing F/A-18 aircraft. Calibration data from ground tests performed on the aircraft were used to derive left wing-root and wing-fold bending-moment and torque load equations based on strain gages; however, for this study, point deflections were used to derive deflection-based load equations. Comparisons between the strain-gage and deflection-based methods are presented. Flight data from the phase-1 active aeroelastic wing flight program were used to validate the deflection-based load estimation method. Flight validation revealed a strong bending-moment correlation and a slightly weaker torque correlation. Development of current techniques and future studies are discussed.
Validation of motion correction techniques for liver CT perfusion studies
Chandler, A; Wei, W; Anderson, E F; Herron, D H; Ye, Z; Ng, C S
2012-01-01
Objectives Motion in images potentially compromises the evaluation of temporally acquired CT perfusion (CTp) data; image registration should mitigate this, but first requires validation. Our objective was to compare the relative performance of manual, rigid and non-rigid registration techniques to correct anatomical misalignment in acquired liver CTp data sets. Methods 17 data sets in patients with liver tumours who had undergone a CTp protocol were evaluated. Each data set consisted of a cine acquisition during a breath-hold (Phase 1), followed by six further sets of cine scans (each containing 11 images) acquired during free breathing (Phase 2). Phase 2 images were registered to a reference image from Phase 1 cine using two semi-automated intensity-based registration techniques (rigid and non-rigid) and a manual technique (the only option available in the relevant vendor CTp software). The performance of each technique to align liver anatomy was assessed by four observers, independently and blindly, on two separate occasions, using a semi-quantitative visual validation study (employing a six-point score). The registration techniques were statistically compared using an ordinal probit regression model. Results 306 registrations (2448 observer scores) were evaluated. The three registration techniques were significantly different from each other (p=0.03). On pairwise comparison, the semi-automated techniques were significantly superior to the manual technique, with non-rigid significantly superior to rigid (p<0.0001), which in turn was significantly superior to manual registration (p=0.04). Conclusion Semi-automated registration techniques achieved superior alignment of liver anatomy compared with the manual technique. We hope this will translate into more reliable CTp analyses. PMID:22374283
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mistry, Nilesh N., E-mail: nmistry@som.umaryland.edu; Diwanji, Tejan; Shi, Xiutao
2013-11-15
Purpose: Current implementations of methods based on Hounsfield units to evaluate regional lung ventilation do not directly incorporate tissue-based mass changes that occur over the respiratory cycle. To overcome this, we developed a 4-dimensional computed tomography (4D-CT)-based technique to evaluate fractional regional ventilation (FRV) that uses an individualized ratio of tidal volume to end-expiratory lung volume for each voxel. We further evaluated the effect of different breathing maneuvers on regional ventilation. The results from this work will help elucidate the relationship between global and regional lung function. Methods and Materials: Eight patients underwent 3 sets of 4D-CT scans during 1 session using free-breathing, audiovisual guidance, and active breathing control. FRV was estimated using a density-based algorithm with mass correction. Internal validation between global and regional ventilation was performed by use of the imaging data collected during the use of active breathing control. The impact of breathing maneuvers on FRV was evaluated by comparing the tidal volume from the 3 breathing methods. Results: Internal validation through comparison between the global and regional changes in ventilation revealed a strong linear correlation (slope of 1.01, R^2 of 0.97) between the measured global lung volume and the regional lung volume calculated by use of the “mass corrected” FRV. A linear relationship was established between the tidal volume measured with the automated breathing control system and FRV based on 4D-CT imaging. Consistently larger breathing volumes were observed when coached breathing techniques were used. Conclusions: The technique presented improves density-based evaluation of lung ventilation and establishes a link between global and regional lung ventilation volumes. Furthermore, the results obtained are comparable with those of other techniques of functional evaluation such as spirometry and hyperpolarized-gas magnetic resonance imaging. These results were demonstrated on retrospective analysis of patient data, and further research using prospective data is under way to validate this technique against established clinical tests.
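For orientation, the uncorrected density-based estimate that such methods start from derives per-voxel ventilation from the inspiratory/expiratory HU pair, assuming each voxel is an air/tissue mixture (air = -1000 HU, tissue = 0 HU) with conserved tissue mass. The sketch below shows that common formula only; the paper's contribution, the explicit mass-correction step, is not reproduced here.

```python
import numpy as np

def fractional_ventilation(hu_in, hu_ex):
    """Per-voxel fractional ventilation from inspiratory/expiratory HU,
    assuming an air/tissue mixture and tissue-mass conservation between
    phases: dV_air / V_air,ex = 1000 (HU_in - HU_ex) / (HU_ex (1000 + HU_in)).
    This is the common density-based form, not the paper's mass-corrected
    variant."""
    hu_in = np.asarray(hu_in, float)
    hu_ex = np.asarray(hu_ex, float)
    return 1000.0 * (hu_in - hu_ex) / (hu_ex * (1000.0 + hu_in))

# A voxel at -880 HU at inspiration and -800 HU at end-expiration:
print(fractional_ventilation(-880.0, -800.0))  # ~0.83 (large breath)
```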
NASA Astrophysics Data System (ADS)
Gopal Madhav Annamdas, Venu; Kiong Soh, Chee
2017-04-01
The last decade has seen the use of various wired-wireless and contact-contactless sensors in several structural health monitoring (SHM) techniques. Most SHM sensors that are predominantly used for strain measurements may be ineffective for damage detection, and vice versa, indicating the single-purpose applicability of these sensors. However, piezoelectric (PE)-based macro fiber composite (MFC) and lead zirconium titanate (PZT) sensors have been on the rise in SHM, vibration and damping control, etc., due to their superior actuation and sensing abilities. These PE sensors have created much interest for their multi-applicability in various technologies such as electromechanical impedance (EMI)-based SHM. This research employs piezo diaphragms, a cheaper alternative to several expensive types of PZT/MFC sensors, for the EMI technique. These piezo diaphragms were validated last year for their applicability in damage detection using the frequency domain. Here we further validate their applicability in strain monitoring using the real-time domain. Hence, these piezo diaphragms can now be classified as PE sensors and used with PZT and MFC sensors in the EMI technique for monitoring damage and loading. However, no single technique or single type of sensor will be sufficient for large SHM deployments, thus requiring the deployment of more than one technique with different types of sensors, such as a piezoresistive strain gauge based wireless sensor network for strain measurements to complement the EMI technique. Furthermore, we present a novel procedure for converting a regular PE sensor from the ‘frequency domain’ to the ‘real-time domain’ for strain applications.
Dridi, Wafa; Toutain, Jean; Sommier, Alain; Essafi, Wafa; Leal-Calderon, Fernando; Cansell, Maud
2017-09-01
An experimental device based on the measurement of the heat flux dissipated during chemical reactions, previously validated for monitoring lipid oxidation in plant oils, was extended to follow lipid oxidation in water-in-oil emulsions. Firstly, validation of the approach was performed by correlating conjugated diene concentrations measured by spectrophotometry and the heat flux dissipated by oxidation reactions and measured directly in water-in-oil emulsions, in isothermal conditions at 60°C. Secondly, several emulsions based on plant oils differing in their n-3 fatty acid content were compared. The oxidability parameter derived from the enthalpy curves reflected the α-linolenic acid proportion in the oils. On the whole, the micro-calorimetry technique provides a sensitive method to assess lipid oxidation in water-in-oil emulsions without requiring any phase extraction. Copyright © 2017 Elsevier Ltd. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Accurate, nonintrusive, and inexpensive techniques are needed to measure energy expenditure (EE) in free-living populations. Our primary aim in this study was to validate cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on observable participant cha...
NASA Astrophysics Data System (ADS)
Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB
2017-11-01
Increased demand for internet of things (IoT) applications has inadvertently forced a move towards higher complexity of the integrated circuits supporting SoCs. This increase in complexity calls for more sophisticated validation strategies, and researchers have responded with a range of methodologies, including dynamic verification, formal verification and hybrid techniques. Moreover, it is important to discover bugs early in the verification process of an SoC in order to reduce time consumption and achieve fast time to market. In this paper we therefore focus on verification methodology at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as a means of achieving fast time to market. OVM is therefore proposed in this paper as the verification method for larger designs, averting bottlenecks in the validation platform.
Assessment of simulation fidelity using measurements of piloting technique in flight
NASA Technical Reports Server (NTRS)
Clement, W. F.; Cleveland, W. B.; Key, D. L.
1984-01-01
The U.S. Army and NASA joined together on a project to conduct a systematic investigation and validation of a ground-based piloted simulation of the Army/Sikorsky UH-60A helicopter. Flight testing was an integral part of the validation effort. Nap-of-the-Earth (NOE) piloting tasks which were investigated included the bob-up, the hover turn, the dash/quickstop, the sidestep, the dolphin, and the slalom. Results from the simulation indicate that the pilot's NOE task performance in the simulator is noticeably and quantifiably degraded when compared with the task performance results generated in flight test. The results of the flight test and ground-based simulation experiments support a unique rationale for the assessment of simulation fidelity: flight simulation fidelity should be judged quantitatively by measuring the pilot's control strategy and technique as induced by the simulator. A quantitative comparison is offered between the piloting technique observed in a flight simulator and that observed in flight test for the same tasks performed by the same pilots.
Assessing the validity of discourse analysis: transdisciplinary convergence
NASA Astrophysics Data System (ADS)
Jaipal-Jamani, Kamini
2014-12-01
Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.
NASA Astrophysics Data System (ADS)
Yi, Yong; Chen, Zhengying; Wang, Liming
2018-05-01
Corona-originated discharge of DC transmission lines is the main reason for the radiated electromagnetic interference (EMI) field in the vicinity of transmission lines. A joint time-frequency analysis technique was proposed to extract the radiated EMI current (excitation current) of DC corona based on corona current statistical measurements. A reduced-scale experimental platform was setup to measure the statistical distributions of current waveform parameters of aluminum conductor steel reinforced. Based on the measured results, the peak value, root-mean-square value and average value with 9 kHz and 200 Hz band-with of 0.5 MHz radiated EMI current were calculated by the technique proposed and validated with conventional excitation function method. Radio interference (RI) was calculated based on the radiated EMI current and a wire-to-plate platform was built for the validity of the RI computation results. The reason for the certain deviation between the computations and measurements was detailed analyzed.
Pallaro, Anabel; Tarducci, Gabriel
2014-12-01
The application of nuclear techniques in the area of nutrition is safe because they use stable isotopes. The deuterium dilution method is used in body composition and human milk intake analysis. It is a reference method for body fat, and it can validate inexpensive tools because of its accuracy, its simplicity of application in individuals and populations, and its established usefulness in adults and children as an evaluation tool in clinical and health programs. It is a non-invasive technique, as it uses saliva, which facilitates assessment in pediatric populations. Changes in body fat are associated with non-communicable diseases; moreover, normal-weight individuals with high fat deposition have been reported. Furthermore, this technique is the only accurate way to determine whether infants are exclusively breast-fed, and it can validate conventional methods based on surveys of mothers.
Screening for trace explosives by AccuTOF™-DART®: an in-depth validation study.
Sisco, Edward; Dake, Jeffrey; Bridge, Candice
2013-10-10
Ambient ionization mass spectrometry is finding increasing utility as a rapid analysis technique in a number of fields. In forensic science specifically, analysis of many types of samples, including drugs, explosives, inks, bank dye, and lotions, has been shown to be possible using these techniques [1]. This paper focuses on one type of ambient ionization mass spectrometry, Direct Analysis in Real Time Mass Spectrometry (DART-MS or DART), and its viability as a screening tool for trace explosives analysis. In order to assess viability, a validation study was completed which focused on the analysis of trace amounts of nitro and peroxide based explosives. Topics which were studied, and are discussed, include method optimization, reproducibility, sensitivity, development of a search library, discrimination of mixtures, and blind sampling. Advantages and disadvantages of this technique over other similar screening techniques are also discussed.
A new technique for measuring listening and reading literacy in developing countries
NASA Astrophysics Data System (ADS)
Greene, Barbara A.; Royer, James M.; Anzalone, Stephen
1990-03-01
One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4 and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high ability in all three standards performed better than those identified as low ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.
A Pressure Plate-Based Method for the Automatic Assessment of Foot Strike Patterns During Running.
Santuz, Alessandro; Ekizos, Antonis; Arampatzis, Adamantios
2016-05-01
The foot strike pattern (FSP, a description of how the foot touches the ground at impact) is recognized to be a predictor of both performance and injury risk. The objective of the current investigation was to validate an original foot strike pattern assessment technique based on numerical analysis of the foot pressure distribution. We analyzed the strike patterns during running of 145 healthy men and women (85 male, 60 female). The participants ran on a treadmill with an integrated pressure plate at three different speeds: preferred (shod and barefoot, 2.8 ± 0.4 m/s), faster (shod, 3.5 ± 0.6 m/s) and slower (shod, 2.3 ± 0.3 m/s). A custom-designed algorithm allowed automatic footprint recognition and FSP evaluation. Incomplete footprints were identified and corrected by the software itself. The widely used technique of analyzing high-speed video recordings was checked for reliability and was used to validate the numerical technique. The automatic numerical approach showed good conformity with the reference video-based technique (ICC = 0.93, p < 0.01). The great improvement in data throughput and the increased completeness of results allow the use of this software as a powerful feedback tool in a simple experimental setup.
Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet
2011-10-01
The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive, specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations. Only the model for one specimen met the validation criteria for average and peak pressure of both articulations; however, the experimental measures for peak pressure also exhibited high variability. MRI-based modeling can reliably be used for evaluating contact area and contact force with similar confidence as currently available experimental techniques. Average and peak contact pressures were more variable across all measurement techniques, and these measures from MRI-based modeling should be used with some caution.
Review of surface steam sterilization for validation purposes.
van Doornmalen, Joost; Kopinga, Klaas
2008-03-01
Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.
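As a concrete illustration of validation in terms of physical parameters, the sketch below computes the standard F0 accumulated-lethality value from a logged chamber-temperature trace (reference temperature 121.1 °C, z-value 10 °C). This is a textbook steam-sterilization calculation, not code from the review; the sampling rate and hold values are illustrative.

```python
import numpy as np

def f0_lethality(temps_c, dt_s, t_ref=121.1, z=10.0):
    """Accumulated lethality F0, in equivalent minutes at t_ref,
    integrated from a sampled chamber-temperature trace."""
    temps_c = np.asarray(temps_c, dtype=float)
    # Each dt-second sample contributes dt/60 minutes, weighted by 10**((T - t_ref)/z)
    return float(np.sum(10.0 ** ((temps_c - t_ref) / z)) * dt_s / 60.0)

# Illustrative trace: a 15-minute hold at 121.1 C, sampled once per second
trace = np.full(15 * 60, 121.1)
print(f"F0 = {f0_lethality(trace, dt_s=1.0):.1f} min")  # -> F0 = 15.0 min
```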
Detrended fluctuation analysis for major depressive disorder.
Mumtaz, Wajid; Malik, Aamir Saeed; Ali, Syed Saad Azhar; Yasin, Mohd Azhar Mohd; Amin, Hafeezullah
2015-01-01
Clinical utility of electroencephalography (EEG)-based diagnostic studies is less clear for major depressive disorder (MDD). In this paper, a novel machine learning (ML) scheme is presented to discriminate MDD patients from healthy controls. The proposed method involves feature extraction, feature selection, classification and validation. The EEG data acquisition involved eyes-closed (EC) and eyes-open (EO) conditions. At the feature extraction stage, detrended fluctuation analysis (DFA) was performed on the EEG data to obtain scaling exponents; DFA tests for the presence or absence of long-range temporal correlations (LRTC) in the recorded EEG. The scaling exponents were used as input features to the proposed system. At the feature selection stage, three different techniques were used for comparison purposes. A logistic regression (LR) classifier was employed, and the method was validated by 10-fold cross-validation. We observed an effect of three different reference montages on the computed features: DFA performed better on LE data than on the IR and AR data, whereas in Wilcoxon ranking the AR montage performed better than LE and IR. Based on the results, it was concluded that DFA provides useful information to discriminate MDD patients and, with further validation, could be employed in clinics for the diagnosis of MDD.
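For readers unfamiliar with the feature-extraction step, the following is a minimal sketch of how a DFA scaling exponent can be computed from a single EEG channel. The window sizes and the use of first-order (linear) detrending are illustrative choices, not the authors' exact settings.

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis: slope of log F(n) versus log n."""
    y = np.cumsum(np.asarray(x, float) - np.mean(x))  # integrated profile
    fluct = []
    for n in scales:
        n_win = len(y) // n
        segments = y[: n_win * n].reshape(n_win, n)
        t = np.arange(n)
        # RMS of residuals after linear detrending within each window
        rms = [np.sqrt(np.mean((s - np.polyval(np.polyfit(t, s, 1), t)) ** 2))
               for s in segments]
        fluct.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha  # ~0.5 for white noise; >0.5 indicates LRTC

rng = np.random.default_rng(0)
print(dfa_exponent(rng.standard_normal(4096)))  # expect a value near 0.5
```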
A technique for global monitoring of net solar irradiance at the ocean surface. II - Validation
NASA Technical Reports Server (NTRS)
Chertock, Beth; Frouin, Robert; Gautier, Catherine
1992-01-01
The generation and validation of the first satellite-based long-term record of surface solar irradiance over the global oceans are addressed. The record is generated using Nimbus-7 earth radiation budget (ERB) wide-field-of-view planetary-albedo data as input to a numerical algorithm designed and implemented based on radiative transfer theory. The mean monthly values of net surface solar irradiance are computed on a 9-deg latitude-longitude spatial grid for November 1978-October 1985. The new data set is validated in comparisons with short-term, regional, high-resolution, satellite-based records. The ERB-based values of net surface solar irradiance are compared with corresponding values based on radiance measurements taken by the Visible-Infrared Spin Scan Radiometer aboard GOES series satellites. Errors in the new data set are estimated to lie between 10 and 20 W/sq m on monthly time scales.
James S. Han; Theodore Mianowski; Yi-yu Lin
1999-01-01
The efficacy of fiber length measurement techniques such as digitizing, the Kajaani procedure, and NIH Image are compared in order to determine the optimal tool. Kenaf bast fibers, aspen, and red pine fibers were collected from different anatomical parts, and the fiber lengths were compared using various analytical tools. A statistical analysis on the validity of the...
Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets
NASA Astrophysics Data System (ADS)
Goel, Amit; Montgomery, Michele
2015-08-01
Architectures of planetary systems are observable snapshots in time that can indicate formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian motion is defined as P^2 = a^3, where P is the orbital period in units of years and a is the semi-major axis in units of Astronomical Units (AU). Keplerian motion works on small scales such as the size of the Solar System but not on large scales such as the size of the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze Keplerian motion of systems based on stellar age to seek whether Keplerian motion has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods such as probabilistic, linear, and proximity-based models. In probabilistic and statistical models of outliers, the parameters of a closed-form probability distribution are learned in order to detect the outliers. Linear models use regression-analysis-based techniques for detecting outliers. Proximity-based models use distance-based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density-based algorithms such as kernel density estimation. In this work, we use unsupervised learning algorithms with only the proximity-based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is whether the ratio of planetary mass to stellar mass is less than 0.001. In this work, we present our statistical analysis of the outliers thus detected.
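The abstract names k-nearest-neighbour distance as one proximity-based detector; the sketch below shows that idea on synthetic (period, mass-ratio) features, together with the stated validation rule that flagged systems should satisfy a planet-to-star mass ratio below 0.001. All data, column choices, and thresholds here are hypothetical stand-ins for the confirmed-exoplanet catalogue.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
m_star = rng.uniform(0.5, 1.5, 200)        # stellar masses [M_sun]
m_planet = rng.uniform(1e-6, 5e-4, 200)    # planet masses [M_sun]
period = rng.uniform(0.1, 20.0, 200)       # orbital periods [yr]

# Proximity-based outlier score: mean distance to the 5 nearest neighbours
feats = np.column_stack([np.log10(period), np.log10(m_planet / m_star)])
dist, _ = NearestNeighbors(n_neighbors=6).fit(feats).kneighbors(feats)
score = dist[:, 1:].mean(axis=1)           # column 0 is the point itself

candidates = np.argsort(score)[-10:]       # ten most isolated systems
# Validation criterion from the abstract: planetary-to-stellar mass ratio < 0.001
validated = [i for i in candidates if m_planet[i] / m_star[i] < 1e-3]
print(len(validated), "of", len(candidates), "outliers pass the mass-ratio check")
```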
Chafetz, Michael D
2010-08-01
This article is about Social Security Administration (SSA) policy with regard to the Psychological Consultative Examination (PCE) for Social Security Disability, particularly with respect to validation of the responses and findings. First, the nature of the consultation and the importance of understanding the boundaries and ethics of the psychologist's role are described. Issues particular to working with low-functioning claimants usually form a large part of these examinations. The psychologist must understand various forms of non-credible behavior during the PCE, and how malingering might be considered among other non-credible presentations. Issues pertaining to symptom validity testing in low-functioning claimants are further explored. SSA policy with respect to symptom validity testing is carefully examined, with an attempt to answer specific concerns and show how psychological science can be of assistance, particularly with evidence-based practice. Additionally, the nature and importance of techniques to avoid the mislabeling of claimants as malingerers are examined. SSA requires the use of accepted diagnostic techniques with which to establish impairment, and this article describes the implementation of that requirement, particularly with respect to validating the findings.
Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.
2018-01-01
The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of program and schedule constraints, a single modal test of the SLS will be performed while it is bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be used to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from the modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ, UP, and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules of thumb"; this research seeks to derive more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so that same uncertainty can be used in propagating from the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the use of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation metric will provide a quantified confidence and probability of success for the final SLS dynamics model, which will be critical for a successful launch program, and can be applied in the many other industries where an accurate dynamic model is required.
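As a toy version of the limit-state idea, the sketch below propagates assumed dispersions through a one-degree-of-freedom spring-mass model and estimates the probability that the predicted natural frequency violates a ±3% requirement band. The dispersion values and the 3% band are invented for illustration; the paper's SLS models are of course far more detailed.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed (illustrative) dispersions on a 1-DOF spring-mass mode
k = rng.normal(1.0e6, 5.0e4, n)        # stiffness [N/m]
m = rng.normal(100.0, 2.0, n)          # mass [kg]
f = np.sqrt(k / m) / (2.0 * np.pi)     # natural frequency [Hz]

# Limit-state function g: positive when the requirement is met
f_nom = np.sqrt(1.0e6 / 100.0) / (2.0 * np.pi)
g = 0.03 * f_nom - np.abs(f - f_nom)   # +/-3% requirement band
print("estimated P(requirement violated) =", np.mean(g < 0.0))
```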
Orsi, Rebecca
2017-02-01
Concept mapping is now a commonly used technique for articulating and evaluating programmatic outcomes. However, research regarding the validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research.
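The study's cluster-validity step was done in R; the sketch below shows the same decision pattern in Python with scikit-learn, scanning candidate cluster counts and scoring each solution with the Davies-Bouldin index (lower is better). The synthetic points stand in for the MDS coordinates of sorted statements, and k-means stands in for PAM/FANNY/AGNES/DIANA, which scikit-learn does not provide.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import davies_bouldin_score

# Stand-in for the 2-D MDS coordinates of concept-mapping statements
X, _ = make_blobs(n_samples=80, centers=5, cluster_std=1.0, random_state=3)

for k in range(3, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}: Davies-Bouldin = {davies_bouldin_score(X, labels):.3f}")
# Choose the k with the lowest index (here the generating k=5 should win)
```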
Narumi, Ryohei; Tomonaga, Takeshi
2016-01-01
Mass spectrometry-based phosphoproteomics is an indispensable technique used in the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as biological studies. We herein describe the application of a large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.
van der Ploeg, Tjeerd; Nieboer, Daan; Steyerberg, Ewout W
2016-10-01
Prediction of medical outcomes may potentially benefit from using modern statistical modeling techniques. We aimed to externally validate modeling strategies for prediction of 6-month mortality of patients suffering from traumatic brain injury (TBI) with predictor sets of increasing complexity. We analyzed individual patient data from 15 different studies including 11,026 TBI patients. We consecutively considered a core set of predictors (age, motor score, and pupillary reactivity), an extended set with computed tomography scan characteristics, and a further extension with two laboratory measurements (glucose and hemoglobin). With each of these sets, we predicted 6-month mortality using default settings with five statistical modeling techniques: logistic regression (LR), classification and regression trees, random forests (RFs), support vector machines (SVMs) and neural nets. For external validation, a model developed on one of the 15 data sets was applied to each of the 14 remaining sets. This process was repeated 15 times for a total of 630 validations. The area under the receiver operating characteristic curve (AUC) was used to assess the discriminative ability of the models. For the most complex predictor set, the LR models performed best (median validated AUC value, 0.757), followed by RF and support vector machine models (median validated AUC values, 0.735 and 0.732, respectively). With each predictor set, the classification and regression tree models showed poor performance (median validated AUC value, <0.7). The variability in performance across the studies was smallest for the RF- and LR-based models (interquartile range for validated AUC values from 0.07 to 0.10). In the area of predicting mortality from TBI, nonlinear and nonadditive effects are not pronounced enough to make modern prediction methods beneficial.
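A minimal sketch of the cross-study external-validation loop described above: each study in turn serves as the development set, the fitted model is scored on every other study, and discrimination is summarized by the median AUC. Logistic regression stands in for the five techniques, and the synthetic arrays stand in for the pooled patient records with a study identifier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def cross_study_aucs(X, y, study):
    """Develop on each study, validate on every other study."""
    aucs = []
    for dev in np.unique(study):
        model = LogisticRegression(max_iter=1000).fit(X[study == dev],
                                                      y[study == dev])
        for val in np.unique(study):
            if val != dev:
                p = model.predict_proba(X[study == val])[:, 1]
                aucs.append(roc_auc_score(y[study == val], p))
    return np.array(aucs)

# Synthetic stand-in: 15 studies, 3 core predictors, binary 6-month outcome
rng = np.random.default_rng(9)
X = rng.standard_normal((1500, 3))
y = (X @ [1.0, -0.8, 0.5] + rng.standard_normal(1500) > 0).astype(int)
study = np.repeat(np.arange(15), 100)
print("median validated AUC:", round(np.median(cross_study_aucs(X, y, study)), 3))
```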
Validation and upgrading of physically based mathematical models
NASA Technical Reports Server (NTRS)
Duval, Ronald
1992-01-01
The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.
Differential Absorption Lidar Measurements of Fugitive Benzene Emissions
NASA Astrophysics Data System (ADS)
Robinson, R. A.; Innocenti, F.; Helmore, J.; Gardiner, T.; Finlayson, A.; Connor, A.
2016-12-01
The Differential Absorption Lidar (DIAL) technique is based on lidar (light detection and ranging), the optical analogue of radar. It provides the capability to remotely measure the concentration and spatial distribution of compounds in the atmosphere. The ability to scan the optical measurement beam throughout the atmosphere enables pollutant concentrations to be mapped, and emission fluxes to be determined when combined with wind data. The NPL DIAL systems can operate in the UV and infrared spectral regions, enabling the measurement of a range of air pollutants and greenhouse gases (GHGs), including hazardous air pollutants such as benzene. The mobile ground based DIAL systems developed at NPL for pollution monitoring have been in use for over 25 years. They have been deployed for routine monitoring, emission factor studies, research investigations and targeted monitoring campaigns. More recently the NPL DIAL has been used in studies to validate other monitoring techniques. In support of this capability, NPL has developed a portable, configurable controlled release facility (CRF) able to simulate emissions from typical sources, built to enable the validation and assessment of fugitive emission monitoring techniques. Following a brief summary of the technique, we outline recent developments in the use of DIAL for monitoring fugitive and diffuse emissions, including the development of a European Standard Method for fugitive emission monitoring. We will present the results of a number of validation exercises using the CRF, giving an update on the performance of DIAL for emission quantification, and discuss the wider validation of novel technologies. We will report on recent measurements of benzene emissions from industrial sites, including a large scale emissions monitoring study carried out by the South Coast Air Quality Management District (SCAQMD), on measurements of emissions from petrochemical facilities, and on an example of the identification and quantification of a significant benzene release from a facility in Europe. We will also discuss the use of advanced techniques such as DIAL in support of the recently introduced EPA refinery rule (and the long-term sampling approach in EPA Method 325) and explore the role these techniques can have in providing improved data on emissions.
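For context, the standard two-wavelength DIAL retrieval (textbook form, not reproduced from this abstract) infers the mean number density n of the target gas in a range cell from the ratio of the on- and off-resonance backscatter returns:

```latex
n(R) \;=\; \frac{1}{2\,\Delta\sigma\,\Delta R}\,
\ln\!\left[\frac{P_{\mathrm{off}}(R+\Delta R)\,P_{\mathrm{on}}(R)}
                {P_{\mathrm{on}}(R+\Delta R)\,P_{\mathrm{off}}(R)}\right],
\qquad \Delta\sigma = \sigma_{\mathrm{on}} - \sigma_{\mathrm{off}},
```

where P_on and P_off are the received powers at the absorbed and reference wavelengths and Δσ is the differential absorption cross section of the target species (here benzene).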
NASA Technical Reports Server (NTRS)
Smith, Phillip N.
1990-01-01
The automation of low-altitude rotorcraft flight depends on the ability to detect, locate, and navigate around obstacles lying in the rotorcraft's intended flightpath. Computer vision techniques provide a passive method of obstacle detection and range estimation for obstacle avoidance. Several algorithms based on computer vision methods have been developed for this purpose using laboratory data; however, further development and validation of candidate algorithms require data collected from rotorcraft flight. A data base containing low-altitude imagery augmented with the rotorcraft and sensor parameters required for passive range estimation is not readily available. Here, the emphasis is on the methodology used to develop such a data base from flight-test data consisting of imagery, rotorcraft and sensor parameters, and ground-truth range measurements. As part of the data preparation, a technique for obtaining the sensor calibration parameters is described. The data base will enable the further development of algorithms for computer vision-based obstacle detection and passive range estimation, as well as provide a benchmark for verification of range estimates against ground-truth measurements.
A Psychoeducational School-Based Group Intervention for Socially Anxious Children
ERIC Educational Resources Information Center
Vassilopoulos, Stephanos P.; Brouzos, Andreas; Damer, Diana E.; Mellou, Angeliki; Mitropoulou, Alexandra
2013-01-01
This study investigated the impact of a psychoeducational group for social anxiety aimed at elementary children. An 8-week psychoeducational program based on empirically validated risk factors was designed. Interventions included cognitive restructuring, anxiety management techniques, and social skills training. Pre- and posttest data from 3 groups…
Recent literature has shown that bioavailability-based techniques, such as Tenax extraction, can estimate sediment exposure to benthos. In a previous study by the authors, Tenax extraction was used to create and validate a literature-based Tenax model to predict oligochaete bioac...
The study on network security based on software engineering
NASA Astrophysics Data System (ADS)
Jia, Shande; Ao, Qian
2012-04-01
Developing a security policy (SP) is a sensitive task, because the SP itself can introduce security weaknesses if it does not conform to the required security properties. Hence, appropriate techniques are necessary to overcome such problems, and these techniques must accompany the policy throughout its deployment phases. The main contribution of this paper is the proposal of three such activities: validation, testing, and multi-SP conflict management. Our techniques are inspired by the well-established techniques of software engineering, with which we have found some similarities to the security domain.
Jaspers, Mariëlle E H; van Haasterecht, Ludo; van Zuijlen, Paul P M; Mokkink, Lidwine B
2018-06-22
Reliable and valid assessment of burn wound depth or healing potential is essential to treatment decision-making, to providing a prognosis, and to comparing studies evaluating different treatment modalities. The aim of this review was to critically appraise, compare and summarize the quality of relevant measurement properties of techniques that aim to assess burn wound depth or healing potential. A systematic literature search was performed using PubMed, EMBASE and the Cochrane Library. Two reviewers independently evaluated the methodological quality of included articles using an adapted version of the Consensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. A synthesis of evidence was performed to rate the measurement properties for each technique and to draw an overall conclusion on the quality of the techniques. Thirty-six articles were included, evaluating various techniques classified as (1) laser Doppler techniques; (2) thermography or thermal imaging; (3) other measurement techniques. Strong evidence was found for adequate construct validity of laser Doppler imaging (LDI). Moderate evidence was found for adequate construct validity of thermography, videomicroscopy, and spatial frequency domain imaging (SFDI). Only two studies reported on the measurement property of reliability. Furthermore, considerable variation was observed among comparator instruments. Considering the evidence available, LDI appears to be the most favorable technique for assessing burn wound healing potential. Additional research is needed into thermography, videomicroscopy, and SFDI to evaluate their full potential. Future studies should focus on reliability and measurement error, and should provide a precise description of the construct each technique aims to measure.
An improved sample loading technique for cellular metabolic response monitoring under pressure
NASA Astrophysics Data System (ADS)
Gikunda, Millicent Nkirote
To monitor cellular metabolism under pressure, a pressure chamber designed around a simple-to-construct, capillary-based spectroscopic chamber coupled to a microliter-flow perfusion system is used in the laboratory. Although cyanide-induced metabolic responses from Saccharomyces cerevisiae (baker's yeast) could be controllably induced and monitored under pressure, the previously used sample loading technique was not well controlled. An improved cell-loading technique has been developed, based on a secondary inner capillary into which the sample is loaded before being inserted into the capillary pressure chamber. As validation, we demonstrate the ability to measure chemically induced metabolic responses at pressures of up to 500 bar. This technique is shown to be less prone to sample loss due to perfusive flow than the techniques used previously.
Dohrenbusch, R
2009-06-01
Chronic pain accompanied by disability and handicap is a frequent symptom requiring medical assessment. Current guidelines for the assessment of malingering suggest discriminating between explanatory demonstration, aggravation and simulation; however, this distinction has not been clearly operationalized or validated. The need for assessment strategies based on general principles of psychological assessment and testing is emphasized. Standardized and normed psychological assessment methods and symptom validation techniques should be used when assessing subjects with chronic pain problems. An adaptive procedure for assessing the validity of complaints is suggested to minimize effort and costs.
Comparison of recent major short-haul studies
DOT National Transportation Integrated Search
1973-04-01
This report summarizes recent studies relating to the impact of short-haul air-transportation planning, and reviews these studies to determine their assumptions, data bases, and validity of the major analytical techniques employed. Finally, a compari...
Chen, Li; Mossa-Basha, Mahmud; Balu, Niranjan; Canton, Gador; Sun, Jie; Pimentel, Kristi; Hatsukami, Thomas S; Hwang, Jenq-Neng; Yuan, Chun
2018-06-01
To develop a quantitative intracranial artery measurement technique to extract comprehensive artery features from time-of-flight MR angiography (MRA). By semiautomatically tracing arteries based on an open-curve active contour model in a graphical user interface, 12 basic morphometric features and 16 basic intensity features for each artery were identified. Arteries were then classified as one of 24 types using prediction from a probability model. Based on the anatomical structures, features were integrated within 34 vascular groups for regional features of vascular trees. Eight 3D MRA acquisitions with intracranial atherosclerosis were assessed to validate this technique. Arterial tracings were validated by an experienced neuroradiologist who checked agreement at bifurcation and stenosis locations. This technique achieved 94% sensitivity and 85% positive predictive value (PPV) for bifurcations, and 85% sensitivity and PPV for stenosis. Up to 1,456 features, such as length, volume, and averaged signal intensity for each artery, as well as vascular group in each of the MRA images, could be extracted to comprehensively reflect characteristics, distribution, and connectivity of arteries. Length for the M1 segment of the middle cerebral artery extracted by this technique was compared with reviewer-measured results, and the intraclass correlation coefficient was 0.97. A semiautomated quantitative method to trace, label, and measure intracranial arteries from 3D-MRA was developed and validated. This technique can be used to facilitate quantitative intracranial vascular research, such as studying cerebrovascular adaptation to aging and disease conditions.
ERIC Educational Resources Information Center
Hatcher, Tim; Colton, Sharon
2007-01-01
Purpose: The purpose of this article is to highlight the results of the online Delphi research project; in particular the procedures used to establish an online and innovative process of content validation and obtaining "rich" and descriptive information using the internet and current e-learning technologies. The online Delphi was proven to be an…
Quantitative analysis of Sudan dye adulteration in paprika powder using FTIR spectroscopy.
Lohumi, Santosh; Joshi, Ritu; Kandpal, Lalit Mohan; Lee, Hoonsoo; Kim, Moon S; Cho, Hyunjeong; Mo, Changyeun; Seo, Young-Wook; Rahman, Anisur; Cho, Byoung-Kwan
2017-05-01
As adulteration of foodstuffs with Sudan dye, especially paprika- and chilli-containing products, has been reported with some frequency, this issue has become a focal point for addressing food safety. FTIR spectroscopy has been used extensively as an analytical method for quality control and safety determination for food products. Thus, the use of FTIR spectroscopy for rapid determination of Sudan dye in paprika powder was investigated in this study. A net analyte signal (NAS)-based methodology, named HLA/GO (hybrid linear analysis in the literature), was applied to FTIR spectral data to predict Sudan dye concentration. The calibration and validation sets were designed to evaluate the performance of the multivariate method. The obtained results showed a high determination coefficient (R²) of 0.98 and a low root mean square error (RMSE) of 0.026% for the calibration set, and an R² of 0.97 and RMSE of 0.05% for the validation set. The model was further validated using a second validation set and through figures of merit such as sensitivity, selectivity, and limits of detection and quantification. The proposed technique of FTIR combined with HLA/GO is rapid, simple and low cost, making this approach advantageous when compared with the main alternative methods based on liquid chromatography (LC) techniques.
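HLA/GO itself is not available in common Python libraries, so the sketch below uses partial least squares, a different but related multivariate calibration, simply to show how the calibration/validation R² and RMSE figures of merit quoted above are produced. The arrays are random placeholders for real FTIR spectra and Sudan-dye concentrations, so the printed scores are meaningless; only the mechanics are illustrated.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(7)
X_cal = rng.standard_normal((40, 600))   # calibration spectra (placeholder)
y_cal = rng.uniform(0.0, 1.0, 40)        # dye concentration (%, w/w)
X_val = rng.standard_normal((20, 600))   # independent validation spectra
y_val = rng.uniform(0.0, 1.0, 20)

pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
y_hat = pls.predict(X_val).ravel()
print("validation R2  :", round(r2_score(y_val, y_hat), 3))
print("validation RMSE:", round(mean_squared_error(y_val, y_hat) ** 0.5, 3))
```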
Orion FSW V and V and Kedalion Engineering Lab Insight
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.
2010-01-01
NASA, along with its prime Orion contractor and its subcontractors, is adapting an avionics system paradigm borrowed from the manned commercial aircraft industry for use in manned space flight systems. Integrated Modular Avionics (IMA) techniques have been proven as a robust avionics solution for manned commercial aircraft (B737/777/787, MD 10/90). This presentation will outline current approaches to adapting IMA, along with its heritage FSW V&V paradigms, to NASA's manned space flight program for Orion. NASA's Kedalion engineering analysis lab is on the forefront of validating many of these contemporary IMA-based techniques. Kedalion has already validated many of the proposed Orion FSW V&V paradigms using Orion's precursory Flight Test Article (FTA) Pad Abort 1 (PA-1) program. The Kedalion lab will evolve its architectures, tools, and techniques in parallel with the evolving Orion program.
Valavala, Sriram; Seelam, Nareshvarma; Tondepu, Subbaiah; Jagarlapudi, V Shanmukha Kumar; Sundarmurthy, Vivekanandan
2018-01-01
A simple, sensitive, accurate, robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol were used in the manufacturing process of the tartaric acid-based pellets of dipyridamole modified release capsules, considering the solubility of the dipyridamole and excipients in the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with a flame ionization detector. The method validation was carried out according to the Q2 guideline on validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All the validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis.
Conesa, Claudia; García-Breijo, Eduardo; Loeff, Edwin; Seguí, Lucía; Fito, Pedro; Laguarda-Miró, Nicolás
2015-01-01
Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. A low-cost, non-destructive system was developed, consisting of a stainless steel double-needle electrode connected to electronic equipment that implements EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Statistical data treatment then enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their respective combinations. The obtained prediction models are robust and reliable, and they are considered statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, in-situ alternative to traditional laboratory methods for enzymatic hydrolysis monitoring.
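As an illustration of the ANN-based prediction step, here is a minimal scikit-learn regressor mapping impedance-spectrum features to sugar concentration. The array shapes, network size, and concentration range are invented, and scikit-learn's MLP stands in for whatever network topology the authors actually trained.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(11)
Z = rng.standard_normal((120, 40))   # |Z| and phase at 20 frequencies (placeholder)
c = rng.uniform(0.0, 50.0, 120)      # known added sugar concentration [g/L]

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
)
model.fit(Z, c)
print("predicted concentrations:", model.predict(Z[:3]).round(2))
```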
Hogue, Aaron; Dauber, Sarah; Henderson, Craig E
2014-01-01
This study introduces a therapist-report measure of evidence-based practices for adolescent conduct and substance use problems. The Inventory of Therapy Techniques-Adolescent Behavior Problems (ITT-ABP) is a post-session measure of 27 techniques representing four approaches: cognitive-behavioral therapy (CBT), family therapy (FT), motivational interviewing (MI), and drug counseling (DC). A total of 822 protocols were collected from 32 therapists treating 71 adolescents in six usual care sites. Factor analyses identified three clinically coherent scales with strong internal consistency across the full sample: FT (8 items; α = .79), MI/CBT (8 items; α = .87), and DC (9 items, α = .90). The scales discriminated between therapists working in a family-oriented site versus other sites and showed moderate convergent validity with therapist reports of allegiance and skill in each approach. The ITT-ABP holds promise as a cost-efficient quality assurance tool for supporting high-fidelity delivery of evidence-based practices in usual care.
Semantics driven approach for knowledge acquisition from EMRs.
Perera, Sujan; Henson, Cory; Thirunarayan, Krishnaprasad; Sheth, Amit; Nair, Suhas
2014-03-01
Semantic computing technologies have matured to be applicable to many critical domains such as national security, life sciences, and health care. However, the key to their success is the availability of a rich domain knowledge base. The creation and refinement of domain knowledge bases pose difficult challenges. The existing knowledge bases in the health care domain are rich in taxonomic relationships, but they lack nontaxonomic (domain) relationships. In this paper, we describe a semiautomatic technique for enriching existing domain knowledge bases with causal relationships gleaned from Electronic Medical Records (EMR) data. We determine missing causal relationships between domain concepts by validating domain knowledge against EMR data sources and leveraging semantic-based techniques to derive plausible relationships that can rectify knowledge gaps. Our evaluation demonstrates that semantic techniques can be employed to improve the efficiency of knowledge acquisition.
Feathering effect detection and artifact agglomeration index-based video deinterlacing technique
NASA Astrophysics Data System (ADS)
Martins, André Luis; Rodrigues, Evandro Luis Linhari; de Paiva, Maria Stela Veludo
2018-03-01
Several video deinterlacing techniques have been developed, and each one performs better under certain conditions. Occasionally, even the most modern deinterlacing techniques create frames of worse quality than primitive deinterlacing processes. This paper shows that final image quality can be improved by combining different types of deinterlacing techniques. The proposed strategy is able to select between two types of deinterlaced frames and, if necessary, make local corrections of the defects. This decision is based on an artifact agglomeration index obtained from a feathering effect detection map. Starting from a deinterlaced frame produced by the "interfield average" method, the defective areas are identified and, if deemed appropriate, replaced by pixels generated through the "edge-based line average" method. Test results show that the proposed technique is able to produce video frames of higher quality than applying a single deinterlacing technique, by taking what is good from intra- and interfield methods.
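A compact sketch of the two ingredients this strategy combines: a comb ("feathering") detector that flags pixels whose upper and lower neighbours both differ strongly in the same direction, and an edge-based line average (ELA) that interpolates a missing line along the direction of least luminance change. The threshold and the three-direction ELA are simplifications of the paper's method, not its exact formulation.

```python
import numpy as np

def feathering_map(frame, thresh=30.0):
    """Flag pixels showing the interlacing comb signature."""
    f = frame.astype(float)
    up = f[1:-1] - f[:-2]
    down = f[1:-1] - f[2:]
    mask = np.zeros(f.shape, dtype=bool)
    # Large, same-signed jumps to both vertical neighbours => feathering
    mask[1:-1] = (np.abs(up) > thresh) & (np.abs(down) > thresh) & (up * down > 0)
    return mask

def ela_row(above, below):
    """Edge-based line average over 45/90/135-degree directions."""
    a = np.pad(above.astype(float), 1, mode="edge")
    b = np.pad(below.astype(float), 1, mode="edge")
    cands = np.stack([(a[:-2] + b[2:]) / 2,      # 135 degrees
                      (a[1:-1] + b[1:-1]) / 2,   # vertical
                      (a[2:] + b[:-2]) / 2])     # 45 degrees
    diffs = np.stack([np.abs(a[:-2] - b[2:]),
                      np.abs(a[1:-1] - b[1:-1]),
                      np.abs(a[2:] - b[:-2])])
    best = np.argmin(diffs, axis=0)
    return cands[best, np.arange(above.size)]

# Usage idea: start from the interfield-average frame, then re-interpolate
# only the rows where feathering_map() flags agglomerated artifacts.
```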
Mélin, Frédéric; Zibordi, Giuseppe
2007-06-20
An optically based technique is presented that produces merged spectra of normalized water-leaving radiances L(WN) by combining spectral data provided by independent satellite ocean color missions. The assessment of the merging technique is based on a four-year field data series collected by an autonomous above-water radiometer located on the Acqua Alta Oceanographic Tower in the Adriatic Sea. The uncertainties associated with the merged L(WN) obtained from the Sea-viewing Wide Field-of-view Sensor and the Moderate Resolution Imaging Spectroradiometer are consistent with the validation statistics of the individual sensor products. The merging including the third mission Medium Resolution Imaging Spectrometer is also addressed for a reduced ensemble of matchups.
Machine intelligence and autonomy for aerospace systems
NASA Technical Reports Server (NTRS)
Heer, Ewald (Editor); Lum, Henry (Editor)
1988-01-01
The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.
NASA Astrophysics Data System (ADS)
Lu, Zenghai; Matcher, Stephen J.
2013-03-01
We report on a new calibration technique that permits the accurate extraction of the sample Jones matrix, and hence the fast-axis orientation, using fiber-based polarization-sensitive optical coherence tomography (PS-OCT) built entirely from non-polarization-maintaining fiber such as SMF-28. In this technique, two quarter waveplates are used to completely specify the parameters of the system fibers in the sample arm so that the Jones matrix of the sample can be determined directly. The technique was validated on measurements of a quarter waveplate and an equine tendon sample using a single-mode-fiber-based swept-source PS-OCT system.
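In Jones-calculus terms (textbook notation, not the authors' exact derivation), the measured matrix is the sample matrix sandwiched between the unknown lead-in and lead-out fibre matrices, and a quarter-wave plate of known orientation provides the reference needed to solve for those fibre terms:

```latex
J_{\mathrm{meas}} = J_{\mathrm{out}}\, J_{\mathrm{sample}}\, J_{\mathrm{in}},
\qquad
J_{\mathrm{QWP}}(\theta) = R(-\theta)
\begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix} R(\theta),
\quad
R(\theta) = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}.
```

Measuring J_meas with two quarter-wave plates at known fast-axis angles yields enough constraints to eliminate J_in and J_out, after which J_sample, and hence the sample fast-axis orientation, follows directly.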
The anatomy of floating shock fitting. [shock waves computation for flow field
NASA Technical Reports Server (NTRS)
Salas, M. D.
1975-01-01
The floating shock fitting technique is examined. Second-order difference formulas are developed for the computation of discontinuities. A procedure is developed to compute mesh points that are crossed by discontinuities. The technique is applied to the calculation of internal two-dimensional flows with arbitrary number of shock waves and contact surfaces. A new procedure, based on the coalescence of characteristics, is developed to detect the formation of shock waves. Results are presented to validate and demonstrate the versatility of the technique.
Fixed gain and adaptive techniques for rotorcraft vibration control
NASA Technical Reports Server (NTRS)
Roy, R. H.; Saberi, H. A.; Walker, R. A.
1985-01-01
The results of an analysis effort performed to demonstrate the feasibility of employing approximate dynamical models and frequency-shaped cost functional control law design techniques for helicopter vibration suppression are presented. Both fixed-gain and adaptive control designs based on linear second-order dynamical models were implemented in a detailed Rotor Systems Research Aircraft (RSRA) simulation to validate these active vibration suppression control laws. Approximate models of fuselage flexibility were included in the RSRA simulation in order to more accurately characterize the structural dynamics. The results for both the fixed-gain and adaptive approaches are promising and provide a foundation for pursuing further validation in more extensive simulation studies and in wind tunnel and/or flight tests.
Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography
NASA Astrophysics Data System (ADS)
Revel, G. M.; Pandarese, G.; Cavuto, A.
2012-06-01
The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies on the order of 20% due to uncertainty effects. This step is important for determining the real applicability of air-coupled probes in complex problems and for simulating the whole inspection procedure, even while the component is still being designed, so as to virtually verify its inspectability.
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
Majumdar, Subhabrata; Basak, Subhash C
2018-04-26
Proper validation is an important aspect of QSAR modelling. External validation is one of the widely used validation methods in QSAR, where the model is built on a subset of the data and validated on the rest of the samples. However, its effectiveness for datasets with a small number of samples but a large number of predictors remains suspect. Calculating hundreds or thousands of molecular descriptors using currently available software has become the norm in QSAR research, owing to computational advances in the past few decades. Thus, for n chemical compounds and p descriptors calculated for each molecule, the typical chemometric dataset today has a high value of p but small n (i.e. n < p). Motivated by evidence in recent literature of the inadequacies of external validation in estimating the true predictive capability of a statistical model, this paper performs an extensive comparative study of this method with several other validation techniques. We compared four validation methods: leave-one-out (LOO), K-fold, external and multi-split validation, using statistical models built with LASSO regression, which simultaneously performs variable selection and modelling. We used 300 simulated datasets and one real dataset of 95 congeneric amine mutagens for this evaluation. External validation metrics show high variation among different random splits of the data, and hence are not recommended for predictive QSAR models. LOO has the overall best performance among all validation methods applied in our scenario. Results from external validation are too unstable for the datasets we analyzed. Based on our findings, we recommend using the LOO procedure for validating QSAR predictive models built on high-dimensional small-sample data.
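The contrast the authors draw can be reproduced in a few lines: leave-one-out gives a single, stable q² figure, while external validation's R² varies widely across random splits of an n < p dataset. Scikit-learn's LASSO stands in for the paper's implementation, and the simulated 95 x 300 matrix mimics the small-n, large-p setting.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.metrics import r2_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict, train_test_split

rng = np.random.default_rng(5)
X = rng.standard_normal((95, 300))      # n=95 compounds, p=300 descriptors
beta = np.zeros(300); beta[:10] = 1.0   # 10 truly informative descriptors
y = X @ beta + rng.standard_normal(95)

model = Lasso(alpha=0.1, max_iter=10_000)

# Leave-one-out: one deterministic estimate
q2 = r2_score(y, cross_val_predict(model, X, y, cv=LeaveOneOut()))
print("LOO q2:", round(q2, 3))

# External validation: spread over 20 random 75/25 splits
ext = []
for seed in range(20):
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=seed)
    ext.append(r2_score(yte, model.fit(Xtr, ytr).predict(Xte)))
print("external R2 range:", round(min(ext), 3), "to", round(max(ext), 3))
```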
Survey of statistical techniques used in validation studies of air pollution prediction models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bornstein, R D; Anderson, S F
1979-03-01
Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
Translating the simulation of procedural drilling techniques for interactive neurosurgical training.
Stredney, Don; Rezai, Ali R; Prevedello, Daniel M; Elder, J Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J
2013-10-01
Through previous efforts we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. These volumetric data drive an interactive multisensory simulation environment, i.e., visual (stereo), aural (stereo), and tactile. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the Congress of Neurological Surgeons simulation initiative. Our goals were to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. We discuss issues of biofidelity and our methods for providing objective, quantitative, and automated assessment for residents. We conclude with a discussion of our experiences, reporting preliminary formative pilot studies and proposed approaches for taking the simulation to the next level through additional validation studies. We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principle and define the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum.
Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment
NASA Technical Reports Server (NTRS)
Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun
2006-01-01
Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.
Simulation of hypersonic rarefied flows with the immersed-boundary method
NASA Astrophysics Data System (ADS)
Bruno, D.; De Palma, P.; de Tullio, M. D.
2011-05-01
This paper provides a validation of an immersed boundary method for computing hypersonic rarefied gas flows. The method is based on the solution of the Navier-Stokes equations and is validated against numerical results obtained by the DSMC approach. The Navier-Stokes solver employs a flexible local grid refinement technique and is implemented on parallel machines using a domain-decomposition approach. Thanks to the efficient grid generation process, based on the ray-tracing technique, and the use of the METIS software, it is possible to obtain the partitioned grids to be assigned to each processor with minimal effort by the user. This allows one to bypass the expensive (in terms of time and human resources) classical generation process of a body-fitted grid. First-order slip-velocity boundary conditions are employed and tested to take rarefied gas effects into account.
NASA Astrophysics Data System (ADS)
Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad
2014-12-01
Millimetre wave (MMW) imaging is gaining tremendous interest among researchers, with potential applications in security checks, standoff personal screening, automotive collision avoidance, and more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from lower resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, hence medically safe. Despite these favourable attributes, MMW imaging faces several challenges: it is still a largely unexplored area and lacks a suitable imaging methodology for extracting complete target information. In view of these challenges, a MMW active imaging radar system at 60 GHz was designed for standoff imaging applications. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology. For identification of regular-shape targets, a mean-standard deviation based segmentation technique was formulated and further validated using a different target shape. For classification, a probability density function based target material discrimination methodology was proposed and further validated on a different dataset. Lastly, a novel artificial neural network based scale- and rotation-invariant image reconstruction methodology has been proposed to counter the distortions in the image caused by noise, rotation or scale variations. The designed neural network, once trained with sample images, automatically takes care of these deformations and successfully reconstructs the corrected image for the test targets. The techniques developed in this paper are tested and validated using four different regular shapes, viz. rectangle, square, triangle and circle.
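A hedged sketch of the mean-standard deviation segmentation named above (the exact rule is not given in the abstract): pixels whose reflectivity deviates from the background mean by more than k standard deviations are labelled as target.

```python
import numpy as np

# Hedged sketch: threshold a C-scan at mean + k * std to isolate target pixels.
def mean_std_segment(image, k=2.0):
    mu, sigma = image.mean(), image.std()
    return image > mu + k * sigma          # boolean target mask

# Example on synthetic data: a bright rectangular target on a noisy background.
rng = np.random.default_rng(0)
scan = rng.normal(0.0, 1.0, (64, 64))
scan[20:40, 25:45] += 6.0                  # embedded "rectangle" target
mask = mean_std_segment(scan, k=2.0)
print("target pixels found:", int(mask.sum()))
```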
Measuring Tropospheric Winds from Space Using a Coherent Doppler Lidar Technique
NASA Technical Reports Server (NTRS)
Miller, Timothy L.; Kavaya, Michael J.; Emmitt, G. David
1999-01-01
The global measurement of tropospheric wind profiles has been cited by the operational meteorological community as the most important missing element in the present and planned observing system. The most practical and economical method for obtaining this measurement is from low earth orbit, utilizing a Doppler lidar (laser radar) technique. Specifically, this paper will describe the coherent Doppler wind lidar (CDWL) technique, the design and progress of a current space flight project to fly such a system on the Space Shuttle, and plans for future flights of similar instruments. The SPARCLE (SPAce Readiness Coherent Lidar Experiment) is a Shuttle-based instrument whose flight is targeted for March, 2001. The objectives of SPARCLE are three-fold: Confirm that the coherent Doppler lidar technique can measure line-of-sight winds to within 1-2 m/s accuracy; Collect data to permit validation and improvement of instrument performance models to enable better design of future missions; and Collect wind and backscatter data for future mission optimization and for atmospheric studies. These objectives reflect the nature of the experiment and its program sponsor, NASA's New Millennium Program. The experiment is a technology validation mission whose primary purpose is to provide a space flight validation of this particular technology. (It should be noted that the CDWL technique has successfully been implemented from ground-based and aircraft-based platforms for a number of years.) Since the conduct of the SPARCLE mission is tied to future decisions on the choice of technology for free-flying, operational missions, the collection of data is intrinsically tied to the validation and improvement of instrument performance models that predict the sensitivity and accuracy of any particular present or future instrument system. The challenges unique to space flight for an instrument such as SPARCLE and follow-ons include: Obtaining the required lidar sensitivity from the long distance of orbit height to the lower atmosphere; Maintaining optical alignments after launch to orbit, and during operations in "microgravity"; Obtaining pointing knowledge of sufficient accuracy to remove the speed of the spacecraft (and the rotating Earth) from the measurements; Providing sufficient power (not a problem on the Shuttle) and cooling to the instrument. The paper will describe the status and challenges of the SPARCLE project, the value of obtaining wind data from orbit, and will present a roadmap to future instruments for scientific research and operational meteorology.
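For context, the measurement relation behind the coherent Doppler technique is the standard monostatic Doppler equation,

$$ v_{\mathrm{LOS}} \;=\; \frac{\lambda\, f_D}{2}, $$

where $f_D$ is the measured Doppler shift of the backscattered light and $\lambda$ the laser wavelength. At the roughly 2 μm operating wavelength typical of coherent wind lidars of this class (an assumption here; the abstract does not state SPARCLE's wavelength), a 1 m/s line-of-sight wind produces a shift of about 1 MHz.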
A Validation Study of the Impression Replica Technique.
Segerström, Sofia; Wiking-Lima de Faria, Johanna; Braian, Michael; Ameri, Arman; Ahlgren, Camilla
2018-04-17
To validate the well-known and often-used impression replica technique for measuring the fit between a preparation and a crown in vitro. The validation consisted of three steps. First, a measuring instrument was validated to elucidate its accuracy. Second, a specimen consisting of male and female counterparts was created and validated by the measuring instrument. Calculations were made for the exact values of three gaps between the male and female parts. Finally, impression replicas were produced of the specimen gaps and sectioned into four pieces. The replicas were then measured with the use of a light microscope. The values received from measuring the specimen were then compared with the values received from the impression replicas, and the technique was thereby validated. The impression replica technique overestimated all measured gaps. Depending on the location of the three measuring sites, the difference between the specimen and the impression replicas varied from 47 to 130 μm. The impression replica technique overestimates gaps within the range of 2% to 11%. The validation of the replica technique enables the method to be used as a reference when testing other methods for evaluating fit in dentistry. © 2018 by the American College of Prosthodontists.
Interpolative modeling of GaAs FET S-parameter data bases for use in Monte Carlo simulations
NASA Technical Reports Server (NTRS)
Campbell, L.; Purviance, J.
1992-01-01
A statistical interpolation technique is presented for modeling GaAs FET S-parameter measurements for use in the statistical analysis and design of circuits. This is accomplished by interpolating among the measurements in a GaAs FET S-parameter data base in a statistically valid manner.
A Foot-Mounted Inertial Measurement Unit (IMU) Positioning Algorithm Based on Magnetic Constraint.
Wang, Yan; Li, Xin; Zou, Jiaheng
2018-03-01
With the development of related applications, indoor positioning techniques are being more and more widely developed. Based on Wi-Fi, Bluetooth low energy (BLE) and geomagnetism, indoor positioning techniques often rely on the physical location of fingerprint information. The focus and difficulty of establishing the fingerprint database lie in obtaining a relatively accurate physical location with as little given information as possible. This paper presents a foot-mounted inertial measurement unit (IMU) positioning algorithm under the loop closure constraint based on magnetic information. It can provide relatively reliable position information without maps and geomagnetic information and provides a relatively accurate coordinate for the collection of a fingerprint database. In the experiment, the features extracted by the multi-level Fourier transform method proposed in this paper are validated and the validity of loop closure matching is tested with a RANSAC-based method. Moreover, the loop closure detection results show that the cumulative error of the trajectory processed by the graph optimization algorithm is significantly suppressed, presenting good accuracy. The average error of the trajectory under the loop closure constraint is kept below 2.15 m.
Post-coronagraphic tip-tilt sensing for vortex phase masks: The QACITS technique
NASA Astrophysics Data System (ADS)
Huby, E.; Baudoz, P.; Mawet, D.; Absil, O.
2015-12-01
Context. Small inner working angle coronagraphs, such as the vortex phase mask, are essential to exploit the full potential of ground-based telescopes in the context of exoplanet detection and characterization. However, the drawback of this attractive feature is a high sensitivity to pointing errors, which degrades the performance of the coronagraph. Aims: We propose a tip-tilt retrieval technique based on the analysis of the final coronagraphic image, hereafter called Quadrant Analysis of Coronagraphic Images for Tip-tilt Sensing (QACITS). Methods: Under the assumption of small phase aberrations, we show that the behavior of the vortex phase mask can be simply described from the entrance pupil to the Lyot stop plane with Zernike polynomials. This convenient formalism is used to establish the theoretical basis of the QACITS technique. We performed simulations to demonstrate the validity and limits of the technique, including the case of a centrally obstructed pupil. Results: The QACITS technique principle is validated with experimental results in the case of an unobstructed circular aperture, as well as simulations in the presence of a central obstruction. The typical configuration of the Keck telescope (24% central obstruction) has been simulated with additional high-order aberrations. In these conditions, our simulations show that the QACITS technique is still adapted to centrally obstructed pupils and performs tip-tilt retrieval with a precision of 5 × 10⁻² λ/D when wavefront errors amount to λ/14 rms, and 10⁻² λ/D for λ/70 rms errors (with λ the wavelength and D the pupil diameter). Conclusions: We have developed and demonstrated a tip-tilt sensing technique for vortex coronagraphs. The implementation of the QACITS technique is based on the analysis of the scientific image and does not require any modification of the original setup. Current facilities equipped with a vortex phase mask can thus directly benefit from this technique to improve the contrast performance close to the axis.
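A simplified, hedged sketch of the QACITS idea: differential intensities between halves of the final coronagraphic image act as a proxy for tip and tilt. The published estimator inverts these asymmetries through a calibrated model of the vortex response (linear and/or cubic terms depending on the pupil); only the asymmetry computation is shown here.

```python
import numpy as np

def quadrant_asymmetry(img):
    cy, cx = img.shape[0] // 2, img.shape[1] // 2
    total = img.sum()
    dx = (img[:, cx:].sum() - img[:, :cx].sum()) / total  # left-right asymmetry
    dy = (img[cy:, :].sum() - img[:cy, :].sum()) / total  # bottom-top asymmetry
    return dx, dy

# Example: a slightly off-centred Gaussian blob standing in for residual starlight.
y, x = np.mgrid[-32:32, -32:32]
img = np.exp(-(((x - 1.5) ** 2 + y ** 2) / 50.0))
print(quadrant_asymmetry(img))   # dx > 0 reveals the shift along +x
```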
NASA Technical Reports Server (NTRS)
1991-01-01
The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.
Elson, D S; Jo, J A
2007-01-01
We report a side-viewing fibre-based endoscope that is compatible with intravascular imaging and fluorescence lifetime imaging microscopy (FLIM). The instrument has been validated through testing with fluorescent dyes and collagen and elastin powders, using the Laguerre expansion deconvolution technique to calculate the fluorescence lifetimes. The instrument has also been tested on freshly excised, unstained animal vascular tissues. PMID:19503759
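As a hedged stand-in for the lifetime retrieval step, the sketch below fits a single-exponential decay directly; the Laguerre expansion deconvolution named above additionally deconvolves the instrument response and does not assume a fixed number of exponentials. The lifetime value and noise level are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, a, tau):
    return a * np.exp(-t / tau)

t = np.linspace(0, 25e-9, 200)                  # 25 ns acquisition window
true_tau = 4e-9                                 # illustrative lifetime, not from the paper
rng = np.random.default_rng(1)
decay = mono_exp(t, 1.0, true_tau) + rng.normal(0, 0.01, t.size)

popt, _ = curve_fit(mono_exp, t, decay, p0=(1.0, 2e-9))
print(f"fitted lifetime: {popt[1] * 1e9:.2f} ns")
```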
Investigation of laser Doppler anemometry in developing a velocity-based measurement technique
NASA Astrophysics Data System (ADS)
Jung, Ki Won
2009-12-01
The acoustic properties of porous materials, such as the characteristic impedance and the complex propagation constant, have traditionally been characterized with pressure-based measurement techniques using microphones. Although microphone techniques have evolved since their introduction, the most general form of the microphone technique employs two microphones to characterize the acoustic field in one continuous medium. The shortcomings of determining the acoustic field based on only two microphones can be overcome by using numerous microphones. However, the use of a number of microphones requires a careful and intricate calibration procedure. This dissertation uses laser Doppler anemometry (LDA) to establish a new measurement technique which resolves the issues that microphone techniques have. First, it is based on a single sensor, so calibration is unnecessary when only the overall ratio of the acoustic field is required for the characterization of a system; this includes measurements of the characteristic impedance and the complex propagation constant. Second, it can handle multiple positional measurements without calibrating the signal at each position. Third, it can measure the three-dimensional components of velocity even in a system with a complex geometry. Fourth, it can be flexibly adapted to almost any type of apparatus, provided the apparatus is transparent. LDA is known to possess several disadvantages, such as the requirement of a transparent apparatus, high cost, and the necessity of seeding particles. The technique, based on LDA combined with a curve-fitting algorithm, is validated through measurements on three systems. First, the complex propagation constant of the air is measured in a rigidly terminated cylindrical pipe which has very low dissipation. Second, the radiation impedance of an open-ended pipe is measured. These two parameters can be characterized by the ratio of the acoustic field measured at multiple locations. Third, the power dissipated in a variable RLC load is measured. The three experiments validate the proposed LDA technique. The utility of the LDA method is then extended to the measurement of the complex propagation constant of the air inside a 100 ppi reticulated vitreous carbon (RVC) sample. Compared to measurements in the available studies, the measurement with the 100 ppi RVC sample supports the LDA technique in that it can achieve a low uncertainty in the determined quantity. This dissertation concludes with using the LDA technique for modal decomposition of the plane wave mode and the (1,1) mode driven simultaneously. This modal decomposition suggests that the LDA technique surpasses microphone-based techniques, because they are unable to determine the acoustic field based on an acoustic model with unconfined propagation constants for each modal component.
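A hedged sketch of the curve-fitting step: complex acoustic amplitudes measured at several axial positions x are fit to a two-wave model u(x) = A exp(-γx) + B exp(+γx), where γ = α + iβ is the complex propagation constant. All values are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def model(p, x):
    A = p[0] + 1j * p[1]
    B = p[2] + 1j * p[3]
    gamma = p[4] + 1j * p[5]          # complex propagation constant
    return A * np.exp(-gamma * x) + B * np.exp(gamma * x)

def residuals(p, x, u_meas):
    r = model(p, x) - u_meas
    return np.concatenate([r.real, r.imag])   # stack real/imag for the solver

x = np.linspace(0.0, 0.5, 12)                              # measurement positions (m)
u_meas = model([1.0, 0.0, 0.3, 0.1, 0.4, 18.0], x)         # synthetic "measurements"
fit = least_squares(residuals, [1.0, 0.0, 0.2, 0.0, 0.2, 17.0], args=(x, u_meas))
print(f"alpha = {fit.x[4]:.3f} Np/m, beta = {fit.x[5]:.2f} rad/m")
```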
SU-D-BRB-01: A Predictive Planning Tool for Stereotactic Radiosurgery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palefsky, S; Roper, J; Elder, E
Purpose: To demonstrate the feasibility of a predictive planning tool which provides SRS planning guidance based on simple patient anatomical properties: PTV size, PTV shape and distance from critical structures. Methods: Ten framed SRS cases treated at the Winship Cancer Institute of Emory University were analyzed to extract data on PTV size, sphericity (shape), and distance from critical structures such as the brainstem and optic chiasm. The cases consisted of five pairs; each pair consisted of two cases with a similar diagnosis (such as pituitary adenoma or arteriovenous malformation) that were treated with different techniques: DCA or IMRS. A Naive Bayes Classifier was trained on this data to establish the conditions under which each treatment modality was used. This model was validated by classifying ten other randomly selected cases into DCA or IMRS classes, calculating the probability of each technique, and comparing results to the treated technique. Results: Of the ten cases used to validate the model, nine had their technique predicted correctly. The three cases treated with IMRS were all identified as such. Their probabilities of being treated with IMRS ranged between 59% and 100%. Six of the seven cases treated with DCA were correctly classified. These probabilities ranged between 51% and 95%. One case treated with DCA was incorrectly predicted to be an IMRS plan. The model's confidence in this case was 91%. Conclusion: These findings indicate that a predictive planning tool based on simple patient anatomical properties can predict the SRS technique used for treatment. The algorithm operated with 90% accuracy. With further validation on larger patient populations, this tool may be used clinically to guide planners in choosing an appropriate treatment technique. The prediction algorithm could also be adapted to guide selection of treatment parameters such as treatment modality and number of fields for radiotherapy across anatomical sites.
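A hedged sketch of the classifier setup: a Gaussian Naive Bayes model trained on [PTV volume, sphericity, distance to nearest critical structure]. The feature values below are synthetic placeholders, not the Emory data.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Features: [PTV volume (cc), sphericity (0-1), distance to critical structure (mm)].
X_train = np.array([
    [1.2, 0.95, 12.0],   # small, round, far from brainstem  -> DCA
    [0.8, 0.90, 15.0],
    [2.0, 0.88, 10.0],
    [4.5, 0.55,  2.0],   # large, irregular, near chiasm     -> IMRS
    [3.8, 0.60,  3.5],
    [5.1, 0.50,  1.5],
])
y_train = ["DCA", "DCA", "DCA", "IMRS", "IMRS", "IMRS"]

clf = GaussianNB().fit(X_train, y_train)
proba = clf.predict_proba([[3.0, 0.58, 2.5]])
print(dict(zip(clf.classes_, proba[0].round(2))))  # per-technique probability
```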
IoT security with one-time pad secure algorithm based on the double memory technique
NASA Astrophysics Data System (ADS)
Wiśniewski, Remigiusz; Grobelny, Michał; Grobelna, Iwona; Bazydło, Grzegorz
2017-11-01
Secure encryption of data in the Internet of Things is especially important, as a great deal of information is exchanged every day and the number of attack vectors on IoT elements continues to increase. In this paper a novel symmetric encryption method is proposed. The idea is based on the one-time pad technique. The proposed solution applies a double memory concept to secure transmitted data. The presented algorithm is considered as a part of a communication protocol and it has been initially validated against known security issues.
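Only the standard one-time pad core is sketched below; the paper's contribution, the double-memory key handling on IoT devices, is not detailed in the abstract and is not shown.

```python
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(plaintext), "pad must be at least message length"
    return bytes(p ^ k for p, k in zip(plaintext, pad))

message = b"sensor reading: 21.7 C"
pad = secrets.token_bytes(len(message))      # must be truly random and never reused
cipher = otp_encrypt(message, pad)
assert otp_encrypt(cipher, pad) == message   # XOR is its own inverse
```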
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1989-01-01
The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
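A hedged sketch of the rule-base-to-graph translation described above: horn clauses become a directed evidence flow graph, and a simple forward propagation (here AND = min over antecedents) simulates how input evidence flows to conclusions. The rules and values are illustrative, not from the cited work.

```python
# Each conclusion maps to its list of antecedents (a horn-clause rule base).
rules = {
    "engine_fault": ["low_pressure", "high_temp"],
    "shutdown": ["engine_fault", "manual_override"],
}

def evaluate(node, evidence):
    if node not in rules:                 # leaf: an input datum
        return evidence.get(node, 0.0)
    return min(evaluate(a, evidence) for a in rules[node])

evidence = {"low_pressure": 0.9, "high_temp": 0.7, "manual_override": 0.0}
print(evaluate("engine_fault", evidence))  # 0.7
print(evaluate("shutdown", evidence))      # 0.0: no override evidence supplied
```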
NASA Astrophysics Data System (ADS)
Lin, Yuan; Choudhury, Kingshuk R.; McAdams, H. Page; Foos, David H.; Samei, Ehsan
2014-03-01
We previously proposed a novel image-based quality assessment technique to assess the perceptual quality of clinical chest radiographs. In this paper, an observer study was designed and conducted to systematically validate this technique. Ten metrics were involved in the observer study, i.e., lung grey level, lung detail, lung noise, rib-lung contrast, rib sharpness, mediastinum detail, mediastinum noise, mediastinum alignment, subdiaphragm-lung contrast, and subdiaphragm area. For each metric, three tasks were successively presented to the observers. In each task, six ROI images were randomly presented in a row and observers were asked to rank the images based only on a designated quality, disregarding the other qualities. A range slider on top of the images allowed observers to indicate the acceptable range based on the corresponding perceptual attribute. Five board-certified radiologists from Duke participated in this observer study, on a DICOM-calibrated diagnostic display workstation and under low ambient lighting conditions. The observer data were analyzed in terms of the correlations between the observer ranking orders and the algorithmic ranking orders. Based on the collected acceptable ranges, quality consistency ranges were statistically derived. The observer study showed that, for each metric, the averaged ranking orders of the participating observers were strongly correlated with the algorithmic orders. For the lung grey level, the observer ranking orders accorded completely with the algorithmic ranking orders. The quality consistency ranges derived from this observer study were close to those derived from our previous study. The observer study indicates that the proposed image-based quality assessment technique provides a robust reflection of the perceptual image quality of clinical chest radiographs. The derived quality consistency ranges can be used to automatically predict the acceptability of a clinical chest radiograph.
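The analysis correlates observer ranking orders with algorithmic ranking orders; a rank correlation such as Spearman's rho is the natural statistic (the abstract does not name the exact correlation measure used).

```python
from scipy.stats import spearmanr

algorithmic_order = [1, 2, 3, 4, 5, 6]   # algorithm's ranking of six ROI images
observer_order = [1, 3, 2, 4, 5, 6]      # one observer's ranking (illustrative)
rho, p = spearmanr(algorithmic_order, observer_order)
print(f"rho = {rho:.2f}, p = {p:.3f}")
```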
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Shiju; Qian, Wei; Guan, Yubao
2016-06-15
Purpose: This study aims to investigate the potential to improve lung cancer recurrence risk prediction performance for stage I NSCLS patients by integrating oversampling, feature selection, and score fusion techniques and to develop an optimal prediction model. Methods: A dataset involving 94 early stage lung cancer patients was retrospectively assembled, which includes CT images, nine clinical and biological (CB) markers, and the outcome of 3-yr disease-free survival (DFS) after surgery. Among the 94 patients, 74 remained DFS and 20 had cancer recurrence. Applying a computer-aided detection scheme, tumors were segmented from the CT images and 35 quantitative image (QI) features were initially computed. Two normalized Gaussian radial basis function network (RBFN) based classifiers were built based on QI features and CB markers separately. To improve prediction performance, the authors applied a synthetic minority oversampling technique (SMOTE) and a BestFirst based feature selection method to optimize the classifiers and also tested fusion methods to combine QI and CB based prediction results. Results: Using a leave-one-case-out cross-validation (K-fold cross-validation) method, the computed areas under a receiver operating characteristic curve (AUCs) were 0.716 ± 0.071 and 0.642 ± 0.061 when using the QI and CB based classifiers, respectively. By fusion of the scores generated by the two classifiers, the AUC significantly increased to 0.859 ± 0.052 (p < 0.05) with an overall prediction accuracy of 89.4%. Conclusions: This study demonstrated the feasibility of improving prediction performance by integrating SMOTE, feature selection, and score fusion techniques. Combining QI features and CB markers and performing SMOTE prior to feature selection in classifier training enabled the RBFN-based classifiers to yield improved prediction accuracy.
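A hedged sketch of the processing chain (oversampling, then feature selection, then score fusion). Stand-ins are used where the paper differs: SelectKBest instead of BestFirst search, logistic regression instead of the Gaussian RBF network, equal fusion weights, and random data instead of the 94-patient dataset; no leave-one-case-out loop is shown.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_qi = rng.normal(size=(94, 35))              # 35 quantitative image features
X_cb = rng.normal(size=(94, 9))               # 9 clinical/biological markers
y = np.array([0] * 74 + [1] * 20)             # 74 DFS, 20 recurrence (imbalanced)

def train_scores(X, y):
    Xs, ys = SMOTE(random_state=0).fit_resample(X, y)         # balance classes
    sel = SelectKBest(f_classif, k=min(8, X.shape[1])).fit(Xs, ys)
    clf = LogisticRegression(max_iter=1000).fit(sel.transform(Xs), ys)
    return clf.predict_proba(sel.transform(X))[:, 1]          # score original cases

fused = 0.5 * train_scores(X_qi, y) + 0.5 * train_scores(X_cb, y)  # score fusion
print("fused scores for first five cases:", fused[:5].round(2))
```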
Integration of design and inspection
NASA Astrophysics Data System (ADS)
Simmonds, William H.
1990-08-01
Developments in advanced computer integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real life advanced computer integrated manufacturing facility for demonstration and evaluation.
Abbas, Ismail; Rovira, Joan; Casanovas, Josep
2006-12-01
To develop and validate a model of a clinical trial that evaluates the changes in cholesterol level as a surrogate marker for lipodystrophy in HIV subjects under alternative antiretroviral regimes, i.e., treatment with Protease Inhibitors vs. a combination of nevirapine and other antiretroviral drugs. Five simulation models were developed based on different assumptions about treatment variability and the pattern of cholesterol reduction over time. The considered endpoints are the last recorded cholesterol level, the difference from baseline, the average difference from baseline, and the level evolution. Specific validation criteria, based on a standardized distance in means and variances of plus or minus 10%, were used to compare the real and the simulated data. The validity criterion was met by all models for the considered endpoints. However, only two models met the validity criterion when all endpoints were considered. The model based on the assumption that within-subjects variability of cholesterol levels changes over time is the one that minimizes the validity criterion, with a standardized distance within plus or minus 1%. Simulation is a useful technique for calibration, estimation, and evaluation of models, which allows us to relax the often overly restrictive assumptions regarding parameters required by analytical approaches. The validity criterion can also be used to select the preferred model for design optimization, until additional data are obtained allowing an external validation of the model.
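A hedged sketch of the validity criterion: simulated data are accepted when standardized distances between real and simulated means and variances both fall within plus or minus 10%. The exact standardization of the variance term is an assumption here; the abstract does not define it.

```python
import numpy as np

def standardized_distance(real, sim):
    d_mean = (sim.mean() - real.mean()) / real.std()            # distance in means
    d_var = (sim.var(ddof=1) - real.var(ddof=1)) / real.var(ddof=1)  # assumed form
    return d_mean, d_var

def valid(real, sim, tol=0.10):
    d_mean, d_var = standardized_distance(real, sim)
    return abs(d_mean) <= tol and abs(d_var) <= tol

rng = np.random.default_rng(2)
real = rng.normal(200, 30, 120)   # observed cholesterol endpoint (illustrative)
sim = rng.normal(202, 31, 120)    # one simulation model's output
print(valid(real, sim))
```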
Baù, Marco; Ferrari, Marco; Ferrari, Vittorio
2017-06-02
A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors.
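A hedged sketch of the detection-phase readout: the transient decaying response is fit to an exponentially damped sinusoid, yielding the series resonant frequency f0 and the quality factor Q = π·f0·τ. Sampling parameters are illustrative; in practice an FFT would supply the initial frequency estimate.

```python
import numpy as np
from scipy.optimize import curve_fit

def ringdown(t, a, tau, f0, phi):
    return a * np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t + phi)

fs = 50e6                                   # 50 MS/s acquisition (illustrative)
t = np.arange(0, 2e-3, 1 / fs)
rng = np.random.default_rng(3)
sig = ringdown(t, 1.0, 4e-4, 4.432e6, 0.3) + rng.normal(0, 0.02, t.size)

popt, _ = curve_fit(ringdown, t, sig, p0=(1.0, 3e-4, 4.432e6, 0.0))
a, tau, f0, phi = popt
print(f"f0 = {f0 / 1e6:.4f} MHz, Q = {np.pi * f0 * tau:.0f}")
```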
Rychlik, Michał; Samborski, Włodzimierz
2015-01-01
The aim of this study was to assess the validity and test-retest reliability of the Thermovision Technique of Dry Needling (TTDN) for the gluteus minimus muscle. TTDN is a new thermography approach used to support trigger point (TrP) diagnostic criteria by the presence of short-term vasomotor reactions occurring in the area where TrPs refer pain. Method. Thirty chronic sciatica patients (n=15 TrP-positive and n=15 TrP-negative) and 15 healthy volunteers were evaluated by TTDN three times during two consecutive days, based on TrPs of the gluteus minimus muscle confirmed additionally by the presence of referred pain. TTDN employs average temperature (Tavr), maximum temperature (Tmax), low/high isothermal area, and the autonomic referred pain phenomenon (AURP), which reflects vasodilatation/vasoconstriction. Validity and test-retest reliability were assessed concurrently. Results. Two components of TTDN validity and reliability, Tavr and AURP, had almost perfect agreement according to κ (e.g., thigh: 0.880 and 0.938; calf: 0.902 and 0.956, resp.). The sensitivity for Tavr, Tmax, AURP, and high isothermal area was 100% in every case, but a specificity of 100% was achieved for Tavr and AURP only. Conclusion. TTDN is a valid and reliable method for Tavr and AURP measurement to support TrP diagnostic criteria for the gluteus minimus muscle when a digitally evoked referred pain pattern is present. PMID:26137486
Demirci, Oguz; Clark, Vincent P; Calhoun, Vince D
2008-02-15
Schizophrenia is diagnosed based largely upon behavioral symptoms. Currently, no quantitative, biologically based diagnostic technique has yet been developed to identify patients with schizophrenia. Classification of individuals into schizophrenia patient and healthy control groups based on quantitative, biologically based data is of great interest to support and refine psychiatric diagnoses. We applied a novel projection pursuit technique to various components obtained with independent component analysis (ICA) of 70 subjects' fMRI activation maps obtained during an auditory oddball task. The validity of the technique was tested with a leave-one-out method and the detection performance varied between 80% and 90%. The findings suggest that the proposed data reduction algorithm is effective in classifying individuals into schizophrenia and healthy control groups and may eventually prove useful as a diagnostic tool.
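A hedged sketch of the leave-one-out validation protocol, with a generic classifier and synthetic features standing in for the projection-pursuit scores derived from the ICA components of the 70 subjects.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
X = rng.normal(size=(70, 5))          # placeholder features, one row per subject
y = rng.integers(0, 2, 70)            # placeholder patient/control labels

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])
print(f"leave-one-out accuracy: {correct / len(y):.2f}")
```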
Space Station UCS antenna pattern computation and measurement. [UHF Communication Subsystem
NASA Technical Reports Server (NTRS)
Hwu, Shian U.; Lu, Ba P.; Johnson, Larry A.; Fournet, Jon S.; Panneton, Robert J.; Ngo, John D.; Eggers, Donald S.; Arndt, G. D.
1993-01-01
The purpose of this paper is to analyze the interference to the Space Station Ultrahigh Frequency (UHF) Communication Subsystem (UCS) antenna radiation pattern due to its environment, the Space Station structure. A hybrid Computational Electromagnetics (CEM) technique was applied in this study. The antenna was modeled using the Method of Moments (MOM) and the radiation patterns were computed using the Uniform Geometrical Theory of Diffraction (GTD), in which the effects of the reflected and diffracted fields from surfaces, edges, and vertices of the Space Station structures were included. In order to validate the CEM techniques, and to provide confidence in the computer-generated results, a comparison with experimental measurements was made for a 1/15 scale Space Station mockup. Based on the results obtained, good agreement between experimental and computed results was found. The computed results using the CEM techniques for the Space Station UCS antenna pattern predictions have thus been validated.
NASA Astrophysics Data System (ADS)
Hirt, Christian; Reußner, Elisabeth; Rexer, Moritz; Kuhn, Michael
2016-09-01
Over the past years, spectral techniques have become a standard to model Earth's global gravity field to 10 km scales, with the EGM2008 geopotential model being a prominent example. For some geophysical applications of EGM2008, particularly Bouguer gravity computation with spectral techniques, a topographic potential model of adequate resolution is required. However, current topographic potential models have not yet been successfully validated to degree 2160, and notable discrepancies between spectral modeling and Newtonian (numerical) integration well beyond the 10 mGal level have been reported. Here we accurately compute and validate gravity implied by a degree 2160 model of Earth's topographic masses. Our experiments are based on two key strategies, both of which require advanced computational resources. First, we construct a spectrally complete model of the gravity field which is generated by the degree 2160 Earth topography model. This involves expansion of the topographic potential to the 15th integer power of the topography and modeling of short-scale gravity signals to an ultrahigh degree of 21,600, translating into unprecedented fine scales of 1 km. Second, we apply Newtonian integration in the space domain with high spatial resolution to reduce discretization errors. Our numerical study demonstrates excellent agreement (8 μGal RMS) between gravity from both forward modeling techniques and provides insight into the convergence process associated with spectral modeling of gravity signals at very short scales (a few km). As a key conclusion, our work successfully validates the spectral domain forward modeling technique for degree 2160 topography and increases the confidence in new high-resolution global Bouguer gravity maps.
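For reference, the spectral forward modelling validated here expands the topographic potential in integer powers of the topography. In a common Rummel-type formulation (hedged; the paper's exact notation is not given in the abstract), the potential coefficients of degree $n$ and order $m$ are

$$ \bar V_{nm} \;=\; \frac{3\,\rho_{t}}{\bar\rho\,(2n+1)} \sum_{k=1}^{k_{\max}} \frac{\binom{n+3}{k}}{n+3}\, \frac{\big(\overline{H^{k}}\big)_{nm}}{R^{k}} , $$

where $(\overline{H^{k}})_{nm}$ are the spherical-harmonic coefficients of the $k$-th integer power of the topographic height $H$, $R$ is the reference radius, $\rho_t$ the topographic mass density, and $\bar\rho$ Earth's mean density; the study carries the sum to $k_{\max}=15$.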
Song, Fang; Zheng, Chuantao; Yan, Wanhong; Ye, Weilin; Wang, Yiding; Tittel, Frank K
2017-12-11
To suppress sensor noise with unknown statistical properties, a novel self-adaptive direct laser absorption spectroscopy (SA-DLAS) technique was proposed by incorporating a recursive least squares (RLS) self-adaptive denoising (SAD) algorithm and a 3291 nm interband cascade laser (ICL) for methane (CH4) detection. Background noise was suppressed by introducing an electrical-domain noise channel and an expectation-known-based RLS SAD algorithm. Numerical simulations and measurements were carried out to validate the function of the SA-DLAS technique by imposing low-frequency, high-frequency, white-Gaussian and hybrid noise on the ICL scan signal. Sensor calibration, a stability test and dynamic response measurement were performed for the SA-DLAS sensor using standard or diluted CH4 samples. With only the intrinsic sensor noise considered, an Allan deviation of ~43.9 ppbv with a ~6 s averaging time was obtained, and it was further decreased to 6.3 ppbv with a ~240 s averaging time through the use of self-adaptive filtering (SAF). The reported SA-DLAS technique shows enhanced sensitivity compared to a DLAS sensor using a traditional sensing architecture and filtering method. Indoor and outdoor atmospheric CH4 measurements were conducted to validate the normal operation of the reported SA-DLAS technique.
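A hedged sketch of an RLS adaptive noise canceller of the general kind described above: a reference noise channel x is filtered to match the noise contaminating the primary signal d, and the a priori error is the cleaned output. Filter order, forgetting factor and data are illustrative, not the paper's settings.

```python
import numpy as np

def rls_denoise(d, x, order=8, lam=0.995, delta=100.0):
    w = np.zeros(order)
    P = np.eye(order) * delta
    e = np.zeros(len(d))
    for n in range(order - 1, len(d)):
        u = x[n - order + 1:n + 1][::-1]     # x[n], x[n-1], ..., x[n-order+1]
        k = P @ u / (lam + u @ P @ u)        # RLS gain vector
        e[n] = d[n] - w @ u                  # cleaned sample (a priori error)
        w = w + k * e[n]
        P = (P - np.outer(k, u @ P)) / lam
    return e

rng = np.random.default_rng(5)
x = rng.normal(size=5000)                                # reference noise channel
signal = np.sin(2 * np.pi * np.arange(5000) / 500.0)     # slow "scan" component
d = signal + np.convolve(x, [0.5, -0.3, 0.1])[:5000]     # primary = signal + filtered noise
clean = rls_denoise(d, x)
print(f"residual RMS after RLS: {np.std(clean[200:] - signal[200:]):.3f}")
```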
Validation and application of Acoustic Mapping Velocimetry
NASA Astrophysics Data System (ADS)
Baranya, Sandor; Muste, Marian
2016-04-01
The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive instruments: acoustic and image-based. The bedform mapping is conducted with acoustic surveys, while the velocity of the bedforms is estimated with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport estimation is done using the Exner equation. A proof-of-concept experiment was performed to validate the AMV-based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results from concurrent direct physical samplings and acceptable agreement was found. As a first field implementation of the AMV, an attempt was made to estimate bedload transport for a section of the Ohio River in the United States, where bed geometry maps obtained from repeated multibeam echo sounder (MBES) surveys served as input data. Cross-sectional distributions of bedload transport rates from the AMV-based method were compared with those obtained from ISSDOTv2, another non-intrusive technique, developed by the US Army Corps of Engineers (direct samplings being unavailable). The good agreement between the results from the two different methods is encouraging and suggests further field tests in varying hydro-morphological situations.
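For context, once bedform height and migration velocity are known, the bedload rate follows from the Exner-based dune-tracking relation (a widely used form; the paper's exact shape factor is not stated in the abstract):

$$ q_b \;=\; (1-\lambda_p)\,\alpha\,\Delta\,V_c , $$

where $\lambda_p$ is the bed porosity, $\Delta$ the bedform height, $V_c$ the streamwise bedform migration velocity, and $\alpha$ a bedform shape factor ($\alpha \approx 0.5$ for triangular dunes).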
Zhang, Zhang; Takarada, Shigeho
2011-01-01
Structural coronary microcirculation abnormalities are important prognostic determinants in clinical settings. However, an assessment of microvascular resistance (MR) requires a velocity wire. A first-pass distribution analysis technique to measure volumetric blood flow has been previously validated. The aim of this study was the in vivo validation of the MR measurement technique using first-pass distribution analysis. Twelve anesthetized swine were instrumented with a transit-time ultrasound flow probe on the proximal segment of the left anterior descending coronary artery (LAD). Microspheres were injected into the LAD to create a model of microvascular dysfunction. Adenosine (400 μg·kg⁻¹·min⁻¹) was used to produce maximum hyperemia. A region of interest in the LAD arterial bed was drawn to generate time-density curves using angiographic images. Volumetric blood flow measurements (Qa) were made using a time-density curve and the assumption that blood was momentarily replaced with contrast agent during the injection. Blood flow from the flow probe (Qp), coronary pressure (Pa), and right atrium pressure (Pv) were continuously recorded. Flow probe-based normalized MR (NMRp) and angiography-based normalized MR (NMRa) were calculated using Qp and Qa, respectively. In 258 measurements, Qa showed a strong correlation with the gold standard Qp (Qa = 0.90 Qp + 6.6 ml/min, r² = 0.91, P < 0.0001). NMRa correlated linearly with NMRp (NMRa = 0.90 NMRp + 0.02 mmHg·ml⁻¹·min⁻¹, r² = 0.91, P < 0.0001). Additionally, the Bland-Altman analysis showed a close agreement between NMRa and NMRp. In conclusion, a technique based on angiographic image data for quantifying NMR was validated using a swine model. This study provides a method to measure NMR without using a velocity wire, which can potentially be used to evaluate microvascular conditions during coronary arteriography. PMID:21398596
41 CFR 60-3.6 - Use of selection procedures which have not been validated.
Code of Federal Regulations, 2011 CFR
2011-07-01
... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...
29 CFR 1607.6 - Use of selection procedures which have not been validated.
Code of Federal Regulations, 2010 CFR
2010-07-01
... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...
41 CFR 60-3.6 - Use of selection procedures which have not been validated.
Code of Federal Regulations, 2014 CFR
2014-07-01
... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...
29 CFR 1607.6 - Use of selection procedures which have not been validated.
Code of Federal Regulations, 2014 CFR
2014-07-01
... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...
29 CFR 1607.6 - Use of selection procedures which have not been validated.
Code of Federal Regulations, 2011 CFR
2011-07-01
... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...
41 CFR 60-3.6 - Use of selection procedures which have not been validated.
Code of Federal Regulations, 2012 CFR
2012-07-01
... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...
29 CFR 1607.6 - Use of selection procedures which have not been validated.
Code of Federal Regulations, 2013 CFR
2013-07-01
... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...
41 CFR 60-3.6 - Use of selection procedures which have not been validated.
Code of Federal Regulations, 2013 CFR
2013-07-01
... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...
29 CFR 1607.6 - Use of selection procedures which have not been validated.
Code of Federal Regulations, 2012 CFR
2012-07-01
... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...
Teaching "Instant Experience" with Graphical Model Validation Techniques
ERIC Educational Resources Information Center
Ekstrøm, Claus Thorn
2014-01-01
Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so it becomes easier to validate if any of the underlying assumptions are violated.
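A hedged sketch of the kind of graphical validation plots discussed above for a linear normal model: residuals versus fitted values, and a normal Q-Q plot. The "instant experience" idea is to compare such plots against versions generated from data simulated under the fitted model; only the basic plots are shown here.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(6)
x = rng.uniform(0, 10, 80)
y = 2.0 + 1.5 * x + rng.normal(0, 1, 80)

slope, intercept = np.polyfit(x, y, 1)       # fit the linear normal model
resid = y - (intercept + slope * x)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.scatter(intercept + slope * x, resid)
ax1.axhline(0.0, linestyle="--")
ax1.set_xlabel("fitted values"); ax1.set_ylabel("residuals")
stats.probplot(resid, dist="norm", plot=ax2)  # normal Q-Q plot
plt.tight_layout(); plt.show()
```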
Space station advanced automation
NASA Technical Reports Server (NTRS)
Woods, Donald
1990-01-01
In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. Space Station Freedom's (SSF) systems are very complex and interdependent. The use of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. To use AA technology for the augmentation of system management functions requires a development model which consists of well defined phases of evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well designed and documented KBS software.
Demonstration of innovative techniques for work zone safety data analysis
DOT National Transportation Integrated Search
2009-07-15
Based upon the results of the simulator data analysis, additional future research can be identified to validate the driving simulator in terms of similarities with Ohio work zones. For instance, the speeds observed in the simulator were greater f...
ERIC Educational Resources Information Center
PILNICK, SAUL; AND OTHERS
The Collegefields Project (CP) was primarily a demonstration of educationally based group rehabilitation for delinquent and predelinquent boys. Secondarily, it was designed to validate the program's effectiveness. Guided group interaction was the major technique in altering educational expectations and reducing delinquent behaviors and feelings of…
NASA Technical Reports Server (NTRS)
1998-01-01
AbTech Corporation used an F-18 HARV (High Alpha Research Vehicle) simulation developed by NASA to create an interactive computer-based prototype of the MQ (Model Quest) SV (System Validator) tool. Dryden Flight Research Center provided support to develop, test, and rapidly reprogram the validation function. AbTech's ModelQuest Enterprise is highly automated and outperforms other modeling techniques in quickly discovering meaningful relationships, patterns, and trends in databases. Its users include technical and business professionals in finance, marketing, business, banking, retail, healthcare, and aerospace.
A novel validation and calibration method for motion capture systems based on micro-triangulation.
Nagymáté, Gergely; Tuchband, Tamás; Kiss, Rita M
2018-06-06
Motion capture systems are widely used to measure human kinematics. Nevertheless, users must consider system errors when evaluating their results. Most validation techniques for these systems are based on relative distance and displacement measurements. In contrast, our study aimed to analyse the absolute volume accuracy of optical motion capture systems by means of an engineering surveying reference measurement of the marker coordinates (uncertainty: 0.75 mm). The method is exemplified on an 18-camera OptiTrack Flex13 motion capture system. The absolute accuracy was defined by the root mean square error (RMSE) between the coordinates measured by the camera system and by engineering surveying (micro-triangulation). The original RMSE of 1.82 mm, due to scaling error, was reduced to 0.77 mm, while the correlation of errors with their distance from the origin decreased from 0.855 to 0.209. A simpler but less accurate absolute accuracy compensation method, using a tape measure over large distances, was also tested; it resulted in scaling compensation similar to that of the surveying method or of direct wand-size compensation by a high-precision 3D scanner. The presented validation methods can be less precise in some respects than previous techniques, but they address an error type which has not been and cannot be studied with the previous validation methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
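A hedged sketch of the accuracy metrics described above: RMSE between motion capture and surveyed reference coordinates, plus a simple global scale compensation estimated by least squares (the paper's full compensation procedure may differ).

```python
import numpy as np

def rmse(a, b):
    return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))

rng = np.random.default_rng(7)
ref = rng.uniform(-2, 2, (30, 3))                       # surveyed marker coordinates (m)
meas = 1.0008 * ref + rng.normal(0, 5e-4, ref.shape)    # mocap with a small scale error

scale = np.sum(meas * ref) / np.sum(ref * ref)          # least-squares scale about origin
print(f"RMSE before: {rmse(meas, ref) * 1e3:.2f} mm, "
      f"after rescale: {rmse(meas / scale, ref) * 1e3:.2f} mm")
```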
Qureshi, Abid; Tandon, Himani; Kumar, Manoj
2015-11-01
Peptide-based antiviral therapeutics have gradually paved their way into mainstream drug discovery research. Experimental determination of peptides' antiviral activity, as expressed by their IC50 values, involves a lot of effort. Therefore, we have developed "AVP-IC50 Pred," a regression-based algorithm to predict antiviral activity in terms of IC50 values (μM). A total of 759 non-redundant peptides from AVPdb and HIPdb were divided into a training/test set having 683 peptides (T(683)) and a validation set with 76 independent peptides (V(76)) for evaluation. We utilized important peptide sequence features like amino-acid composition, the binary profile of N8-C8 residues, physicochemical properties and their hybrids. Four different machine learning techniques (MLTs), namely support vector machine, Random Forest, instance-based classifier, and K-Star, were employed. During 10-fold cross validation, we achieved maximum Pearson correlation coefficients (PCCs) of 0.66, 0.64, 0.56, and 0.55, respectively, for the above MLTs using the best combination of feature sets. All the predictive models also performed well on the independent validation dataset and achieved maximum PCCs of 0.74, 0.68, 0.59, and 0.57, respectively, on the best combination of feature sets. The AVP-IC50 Pred web server is anticipated to assist researchers working on antiviral therapeutics by enabling them to computationally screen many compounds and focus experimental validation on the most promising set of peptides, thus reducing cost and time efforts. The server is available at http://crdd.osdd.net/servers/ic50avp. © 2015 Wiley Periodicals, Inc.
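A hedged sketch of the regression protocol: 10-fold cross-validation of a support vector regressor, scored by the Pearson correlation between observed and predicted activity. Random placeholders stand in for the actual composition and binary-profile features.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.model_selection import KFold
from sklearn.svm import SVR

rng = np.random.default_rng(8)
X = rng.normal(size=(683, 40))                        # T(683) feature matrix stand-in
y = X[:, :5].sum(axis=1) + rng.normal(0, 0.5, 683)    # synthetic activity target

preds = np.empty_like(y)
for tr, te in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    preds[te] = SVR(C=1.0).fit(X[tr], y[tr]).predict(X[te])

pcc, _ = pearsonr(y, preds)
print(f"10-fold CV Pearson correlation: {pcc:.2f}")
```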
Miller, Joshua D
2012-12-01
In this article, the development of Five-Factor Model (FFM) personality disorder (PD) prototypes for the assessment of DSM-IV PDs are reviewed, as well as subsequent procedures for scoring individuals' FFM data with regard to these PD prototypes, including similarity scores and simple additive counts that are based on a quantitative prototype matching methodology. Both techniques, which result in very strongly correlated scores, demonstrate convergent and discriminant validity, and provide clinically useful information with regard to various forms of functioning. The techniques described here for use with FFM data are quite different from the prototype matching methods used elsewhere. © 2012 The Author. Journal of Personality © 2012, Wiley Periodicals, Inc.
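A hedged sketch of the two scoring procedures named above: a similarity score, computed here as the Pearson correlation between an individual's facet profile and the PD prototype, and a simple additive count over the facets the prototype flags as elevated. Facet values and the elevation cutoff are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
prototype = rng.uniform(1, 5, 30)               # expert-rated 30-facet PD prototype
person = prototype + rng.normal(0, 0.8, 30)     # one respondent's FFM facet scores

similarity = np.corrcoef(person, prototype)[0, 1]
count = person[prototype >= 4.0].sum()          # additive count over "high" facets
print(f"similarity = {similarity:.2f}, additive count = {count:.1f}")
```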
NASA Technical Reports Server (NTRS)
Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.
1975-01-01
Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real-time acquisition and formatting of data from an all-up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystem modules, module integration, special test requirements, and reference data formats are also described.
Respiratory motion-resolved, self-gated 4D-MRI using rotating cartesian k-space (ROCK).
Han, Fei; Zhou, Ziwu; Cao, Minsong; Yang, Yingli; Sheng, Ke; Hu, Peng
2017-04-01
To propose and validate a respiratory motion resolved, self-gated (SG) 4D-MRI technique to assess patient-specific breathing motion of abdominal organs for radiation treatment planning. The proposed 4D-MRI technique was based on the balanced steady-state free-precession (bSSFP) technique and 3D k-space encoding. A novel rotating cartesian k-space (ROCK) reordering method was designed which incorporates repeatedly sampled k-space centerline as the SG motion surrogate and allows for retrospective k-space data binning into different respiratory positions based on the amplitude of the surrogate. The multiple respiratory-resolved 3D k-space data were subsequently reconstructed using a joint parallel imaging and compressed sensing method with spatial and temporal regularization. The proposed 4D-MRI technique was validated using a custom-made dynamic motion phantom and was tested in six healthy volunteers, in whom quantitative diaphragm and kidney motion measurements based on 4D-MRI images were compared with those based on 2D-CINE images. The 5-minute 4D-MRI scan offers high-quality volumetric images in 1.2 × 1.2 × 1.6 mm³ and eight respiratory positions, with good soft-tissue contrast. In phantom experiments with triangular motion waveform, the motion amplitude measurements based on 4D-MRI were 11.89% smaller than the ground truth, whereas a -12.5% difference was expected due to data binning effects. In healthy volunteers, the difference between the measurements based on 4D-MRI and the ones based on 2D-CINE were 6.2 ± 4.5% for the diaphragm, 8.2 ± 4.9% and 8.9 ± 5.1% for the right and left kidney. The proposed 4D-MRI technique could provide high-resolution, high-quality, respiratory motion-resolved 4D images with good soft-tissue contrast and are free of the "stitching" artifacts usually seen on 4D-CT and 4D-MRI based on resorting 2D-CINE. It could be used to visualize and quantify abdominal organ motion for MRI-based radiation treatment planning. © 2017 American Association of Physicists in Medicine.
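A hedged sketch of the retrospective binning step: each k-space readout is assigned to one of eight respiratory positions according to the amplitude of its self-gating surrogate, using equal-count (percentile) bins. The surrogate waveform is synthetic.

```python
import numpy as np

def bin_by_amplitude(surrogate, n_bins=8):
    edges = np.percentile(surrogate, np.linspace(0, 100, n_bins + 1))
    # np.digitize maps each amplitude to a bin index in [0, n_bins - 1]
    return np.clip(np.digitize(surrogate, edges[1:-1]), 0, n_bins - 1)

rng = np.random.default_rng(10)
t = np.arange(20000) * 0.005                       # readout timestamps (s)
surrogate = np.sin(2 * np.pi * t / 4.0) + rng.normal(0, 0.05, t.size)  # ~4 s breathing
bins = bin_by_amplitude(surrogate)
print("readouts per respiratory bin:", np.bincount(bins, minlength=8))
```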
Data selection techniques in the interpretation of MAGSAT data over Australia
NASA Technical Reports Server (NTRS)
Johnson, B. D.; Dampney, C. N. G.
1983-01-01
The MAGSAT data require critical selection in order to produce a self-consistent data set suitable for map construction and subsequent interpretation. Interactive data selection techniques are described which involve the use of a special-purpose profile-oriented data base and a colour graphics display. The careful application of these data selection techniques permits validation of every data value and ensures that the best possible self-consistent data set is used to construct the maps of the magnetic field measured at satellite altitudes over Australia.
Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI
ERIC Educational Resources Information Center
Forer, Barry; Zumbo, Bruno D.
2011-01-01
The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…
Mortimer, Duncan; Segal, Leonie
2008-01-01
Algorithms for converting descriptive measures of health status into quality-adjusted life year (QALY) weights are now widely available, and their application in economic evaluation is increasingly commonplace. The objective of this study is to describe and compare existing conversion algorithms and to highlight issues bearing on the derivation and interpretation of the QALY-weights so obtained. Systematic review of algorithms for converting descriptive measures of health status into QALY-weights. The review identified a substantial body of literature comprising 46 derivation studies and 16 studies that provided evidence or commentary on the validity of conversion algorithms. Conversion algorithms were derived using 1 of 4 techniques: 1) transfer to utility regression, 2) response mapping, 3) effect size translation, and 4) "revaluing" outcome measures using preference-based scaling techniques. Although these techniques differ in their methodological/theoretical tradition, data requirements, and ease of derivation and application, the available evidence suggests that the sensitivity and validity of derived QALY-weights may be more dependent on the coverage and sensitivity of measures and the disease area/patient group under evaluation than on the technique used in derivation. Despite the recent proliferation of conversion algorithms, a number of questions bearing on the derivation and interpretation of derived QALY-weights remain unresolved. These unresolved issues suggest directions for future research in this area. In the meantime, analysts seeking guidance in selecting derived QALY-weights should consider the validity and feasibility of each conversion algorithm in the disease area and patient group under evaluation rather than restricting their choice to weights from a particular derivation technique.
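For concreteness, here is a minimal sketch of the first derivation technique named above, transfer-to-utility regression, fitted by ordinary least squares. The scores, utilities, and function name are invented for illustration only.

```python
import numpy as np

def transfer_to_utility(status_scores, utilities):
    """Fit a 'transfer to utility' regression: a linear map from a
    descriptive health-status score to preference-based QALY weights."""
    A = np.column_stack([np.ones_like(status_scores), status_scores])
    coef, *_ = np.linalg.lstsq(A, utilities, rcond=None)
    return coef  # (intercept, slope)

# Invented profile scores (0-100) and elicited utility weights.
scores = np.array([30, 45, 55, 70, 85, 95.0])
utils = np.array([0.35, 0.48, 0.60, 0.71, 0.84, 0.93])
b0, b1 = transfer_to_utility(scores, utils)
print(b0 + b1 * 60.0)  # predicted QALY weight for a score of 60
```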
NASA Astrophysics Data System (ADS)
Jena, S.
2015-12-01
The overexploitation of groundwater has resulted in the abandonment of many shallow tube wells in a river basin in Eastern India. For the sustainability of groundwater resources, basin-scale modelling of groundwater flow is essential for the efficient planning and management of water resources. The main intent of this study is to develop a 3-D groundwater flow model of the study basin using the Visual MODFLOW package and to successfully calibrate and validate it using 17 years of observed data. A sensitivity analysis was carried out to quantify the susceptibility of the aquifer system to river bank seepage, recharge from rainfall and agricultural practices, horizontal and vertical hydraulic conductivities, and specific yield. To quantify the impact of parameter uncertainties, the Sequential Uncertainty Fitting Algorithm (SUFI-2) and Markov chain Monte Carlo (MCMC) techniques were implemented. Results from the two techniques were compared and their advantages and disadvantages analysed. The Nash-Sutcliffe coefficient (NSE) and coefficient of determination (R2) were adopted as the two criteria during calibration and validation of the developed model. NSE and R2 values of the groundwater flow model for the calibration and validation periods were in the acceptable range. Also, the MCMC technique was able to provide more reasonable results than SUFI-2. The calibrated and validated model will be useful for identifying aquifer properties, analysing groundwater flow dynamics and forecasting changes in groundwater levels.
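The two calibration criteria named above have standard closed forms. A minimal Python sketch follows, using hypothetical observed and simulated groundwater heads; the function names and numbers are illustrative only.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2).
    1 is a perfect fit; values <= 0 mean the model is no better
    than predicting the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Coefficient of determination as the squared Pearson correlation."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Hypothetical observed vs. simulated groundwater heads (m).
obs = np.array([12.1, 11.8, 11.5, 11.9, 12.3, 12.0])
sim = np.array([12.0, 11.9, 11.6, 11.7, 12.2, 12.1])
print(nash_sutcliffe(obs, sim), r_squared(obs, sim))
```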
Ultrasonic linear array validation via concrete test blocks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoegh, Kyle, E-mail: hoeg0021@umn.edu; Khazanovich, Lev, E-mail: hoeg0021@umn.edu; Ferraro, Chris
2015-03-31
Oak Ridge National Laboratory (ORNL) comparatively evaluated the ability of a number of NDE techniques to generate an image of the volume of 6.5′ × 5.0′ × 10″ concrete specimens fabricated at the Florida Department of Transportation (FDOT) NDE Validation Facility in Gainesville, Florida. These test blocks were fabricated to test the ability of various NDE methods to characterize various placements and sizes of rebar as well as simulated cracking and non-consolidation flaws. The first version of the ultrasonic linear array device, MIRA [version 1], was one of seven different NDE instruments used to characterize the specimens. This paper deals with the ability of this equipment to determine subsurface characteristics such as relative reinforcing steel size, concrete thickness, irregularities, and inclusions using Kirchhoff-based migration techniques. The ability of individual synthetic aperture focusing technique (SAFT) B-scan cross sections resulting from self-contained scans is compared with various processing, analysis, and interpretation methods, using the various features fabricated in the specimens for validation. The performance is detailed, especially with respect to the limitations and implications for evaluation of thicker, more heavily reinforced concrete structures.
Magnetic resonance imaging measurement of iron overload
Wood, John C.
2010-01-01
Purpose of review To highlight recent advances in magnetic resonance imaging estimation of somatic iron overload. This review will discuss the need and principles of magnetic resonance imaging-based iron measurements, the validation of liver and cardiac iron measurements, and the key institutional requirements for implementation. Recent findings Magnetic resonance imaging assessment of liver and cardiac iron has achieved critical levels of availability, utility, and validity to serve as the primary endpoint of clinical trials. Calibration curves for the magnetic resonance imaging parameters R2 and R2* (or their reciprocals, T2 and T2*) have been developed for the liver and the heart. Interscanner variability for these techniques has proven to be on the order of 5–7%. Summary Magnetic resonance imaging assessment of tissue iron is becoming increasingly important in the management of transfusional iron load because it is noninvasive, relatively widely available and offers a window into presymptomatic organ dysfunction. The techniques are highly reproducible within and across machines and have been chemically validated in the liver and the heart. These techniques will become the standard of care as industry begins to support the acquisition and postprocessing software. PMID:17414205
Borotikar, Bhushan; Lempereur, Mathieu; Lelievre, Mathieu; Burdin, Valérie; Ben Salem, Douraied; Brochard, Sylvain
2017-01-01
To report evidence for the concurrent validity and reliability of dynamic MRI techniques to evaluate in vivo joint and muscle mechanics, and to propose recommendations for their use in the assessment of normal and impaired musculoskeletal function. The search was conducted on articles published in Web of Science, PubMed, Scopus, Academic Search Premier, and Cochrane Library between 1990 and August 2017. Studies that reported the concurrent validity and/or reliability of dynamic MRI techniques for in vivo evaluation of joint or muscle mechanics were included after assessment by two independent reviewers. Selected articles were assessed using an adapted quality assessment tool and a data extraction process. Results for concurrent validity and reliability were categorized as poor, moderate, or excellent. Twenty articles fulfilled the inclusion criteria, with a mean quality assessment score of 66% (±10.4%). Concurrent validity and/or reliability of eight dynamic MRI techniques were reported, with the knee being the most evaluated joint (seven studies). Moderate to excellent concurrent validity and reliability were reported for seven out of eight dynamic MRI techniques. Cine phase contrast and real-time MRI appeared to be the most valid and reliable techniques for evaluating joint motion, and spin tag for muscle motion. Dynamic MRI techniques are promising for the in vivo evaluation of musculoskeletal mechanics; however, results should be evaluated with caution since validity and reliability have not been determined for all joints and muscles, nor for many pathological conditions.
Fecal Indicator Bacteria and Environmental Observations: Validation of Virtual Beach
Contamination of recreational waters by fecal material is often assessed using indicator bacteria such as enterococci. Enumeration based on culturing methods can take up to 48 hours to complete, limiting the accuracy of water quality evaluations. Molecular microbial techniques em...
Validating a biometric authentication system: sample size requirements.
Dass, Sarat C; Zhu, Yongfang; Jain, Anil K
2006-12-01
Authentication systems based on biometric features (e.g., fingerprint impressions, iris scans, human face images, etc.) are increasingly gaining widespread use and popularity. Often, vendors and owners of these commercial biometric systems claim impressive performance that is estimated based on some proprietary data. In such situations, there is a need to independently validate the claimed performance levels. System performance is typically evaluated by collecting biometric templates from n different subjects and, for convenience, acquiring multiple instances of the biometric for each of the n subjects. Very little work has been done in 1) constructing confidence regions based on the ROC curve for validating the claimed performance levels and 2) determining the required number of biometric samples needed to establish confidence regions of prespecified width for the ROC curve. To simplify the analyses that address these two problems, several previous studies have assumed that multiple acquisitions of the biometric entity are statistically independent. This assumption is too restrictive and is generally not valid. We have developed a validation technique based on multivariate copula models for correlated biometric acquisitions. Based on the same model, we also determine the minimum number of samples required to achieve confidence bands of desired width for the ROC curve. We illustrate the estimation of the confidence bands as well as the required number of biometric samples using a fingerprint matching system applied to samples collected from a small population.
Rule groupings in expert systems using nearest neighbour decision rules, and convex hulls
NASA Technical Reports Server (NTRS)
Anastasiadis, Stergios
1991-01-01
Expert system shells are lacking in many areas of software engineering. Large rule-based systems are not semantically comprehensible, difficult to debug, and impossible to modify or validate. Partitioning a set of rules found in CLIPS (C Language Integrated Production System) into groups of rules which reflect the underlying semantic subdomains of the problem adequately addresses the concerns stated above. Techniques are introduced to structure a CLIPS rule base into groups of rules that inherently share common semantic information. The concepts involved are imported from the fields of AI, pattern recognition, and statistical inference. The techniques focus on the areas of feature selection, classification, and a criterion for how 'good' the classification is, based on Bayesian decision theory. A variety of distance metrics for measuring the 'closeness' of CLIPS rules are discussed, and various nearest neighbor classification algorithms based on these metrics are described.
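A minimal sketch of the nearest-neighbour decision rule at the core of this approach, assuming the rules have already been encoded as numeric feature vectors (the encoding, names, and toy data below are invented; the paper leaves the features to a selection step):

```python
import numpy as np

def nearest_neighbour(train_X, train_y, x, metric=None):
    """Classify feature vector x by the label of its nearest neighbour.
    The metric is pluggable and defaults to Euclidean distance."""
    if metric is None:
        metric = lambda a, b: np.linalg.norm(a - b)
    dists = [metric(row, x) for row in train_X]
    return train_y[int(np.argmin(dists))]

# Toy example: two rule groups in a 2-D feature space.
X = np.array([[0.1, 0.2], [0.0, 0.3], [0.9, 0.8], [1.0, 0.7]])
y = np.array(["group-A", "group-A", "group-B", "group-B"])
print(nearest_neighbour(X, y, np.array([0.8, 0.9])))  # -> group-B
```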
DBS-LC-MS/MS assay for caffeine: validation and neonatal application.
Bruschettini, Matteo; Barco, Sebastiano; Romantsik, Olga; Risso, Francesco; Gennai, Iulian; Chinea, Benito; Ramenghi, Luca A; Tripodi, Gino; Cangemi, Giuliana
2016-09-01
DBS might be an appropriate microsampling technique for therapeutic drug monitoring of caffeine in infants. Nevertheless, its application presents several issues that still limit its use. This paper describes a validated DBS-LC-MS/MS method for caffeine. The results of the method validation showed a hematocrit dependence. In the analysis of 96 paired plasma and DBS clinical samples, caffeine levels measured in DBS were statistically significantly lower than in plasma, but the observed differences were independent of hematocrit. These results clearly show the need for extensive validation with real-life samples for DBS-based methods. DBS-LC-MS/MS can be considered a good alternative to traditional methods for therapeutic drug monitoring or PK studies in preterm infants.
Validating Ultrasound-based HIFU Lesion-size Monitoring Technique with MR Thermometry and Histology
NASA Astrophysics Data System (ADS)
Zhou, Shiwei; Petruzzello, John; Anand, Ajay; Sethuraman, Shriram; Azevedo, Jose
2010-03-01
In order to control and monitor HIFU lesions accurately and cost-effectively in real time, we have developed an ultrasound-based therapy monitoring technique using acoustic radiation force to track the change in tissue mechanical properties. We validate our method with concurrent MR thermometry and histology. Comparison studies have been completed on in-vitro bovine liver samples. A single-element 1.1 MHz focused transducer was used to deliver HIFU and produce acoustic radiation force (ARF). A 5 MHz single-element transducer was placed co-axially with the HIFU transducer to acquire the RF data and track the tissue displacement induced by the ARF. During therapy, the monitoring procedure was interleaved with HIFU. MR thermometry (Philips Panorama 1T system) and ultrasound monitoring were performed simultaneously. The tissue temperature and thermal dose (CEM43 = 240 mins) were computed from the MR thermometry data. The tissue displacement induced by the acoustic radiation force was calculated from the ultrasound RF data in real time using a cross-correlation based method. A normalized displacement difference (NDD) parameter was developed and calibrated to estimate the lesion size. The lesion size estimated by the NDD was compared with both the MR thermometry prediction and the histology analysis. For lesions smaller than 8 mm, the NDD-based lesion monitoring technique showed performance very similar to that of MR thermometry. The standard deviation of the lesion size error is 0.66 mm, which is comparable to the MR thermal dose contour prediction (0.42 mm). A phased array is needed for tracking displacement in 2D and monitoring lesions larger than 8 mm. The study demonstrates the potential of our ultrasound-based technique to achieve precise HIFU lesion control through real-time monitoring. The results compare well with histology and an established technique like MR thermometry. This approach provides feedback control in real time to terminate therapy when a pre-determined lesion size is achieved, and can be extended to 2D and implemented on commercial ultrasound scanner systems.
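A minimal sketch of the cross-correlation displacement estimate that underlies the NDD parameter: a simplified 1-D version with invented names, standing in for the published method's processing of interleaved RF data.

```python
import numpy as np

def displacement_samples(rf_ref, rf_disp, max_lag=32):
    """Estimate axial shift (in samples) between two RF windows by
    locating the peak of their normalized cross-correlation."""
    lags = np.arange(-max_lag, max_lag + 1)
    ref = (rf_ref - rf_ref.mean()) / rf_ref.std()
    disp = (rf_disp - rf_disp.mean()) / rf_disp.std()
    # Circular shift is used here as a simple approximation of a
    # sliding-window correlation over a short search range.
    cc = [np.sum(ref * np.roll(disp, -k)) for k in lags]
    return lags[int(np.argmax(cc))]

# Toy example: a synthetic RF speckle window shifted by 5 samples.
rng = np.random.default_rng(0)
rf = rng.standard_normal(512)
print(displacement_samples(rf, np.roll(rf, 5)))  # -> 5 (the applied shift)
```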
NASA Astrophysics Data System (ADS)
Pérez, B.; Brouwer, R.; Beckers, J.; Paradis, D.; Balseiro, C.; Lyons, K.; Cure, M.; Sotillo, M. G.; Hackett, B.; Verlaan, M.; Fanjul, E. A.
2012-03-01
ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that makes use of several storm surge or circulation models and near-real-time tide gauge data in the region, with the following main goals: 1. to provide easy access to existing forecasts, as well as to their performance and model validation, by means of an adequate visualization tool; 2. to generate better forecasts of sea level, including confidence intervals, by means of the Bayesian Model Average (BMA) technique. The BMA technique generates an overall forecast probability density function (PDF) by making a weighted average of the individual forecast PDFs; the weights represent the Bayesian likelihood that a model will give the correct forecast and are continuously updated based on the performance of the models during a recent training period. This implies that the technique needs sea level data from tide gauges in near-real time. The system was implemented for the European Atlantic facade (IBIROOS region) and the Western Mediterranean coast based on the MATROOS visualization tool developed by Deltares. Validation results for the different models and the BMA implementation at the main harbours are presented for these regions, where this kind of activity has been performed for the first time. The system is currently operational at Puertos del Estado and has proved useful in detecting calibration problems in some of the circulation models, in identifying the systematic differences between baroclinic and barotropic models for sea level forecasts, and in demonstrating the feasibility of providing an overall probabilistic forecast based on the BMA method.
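A minimal sketch of the BMA combination step, assuming Gaussian member PDFs and fixed weights. In practice the weights are re-estimated over the training period (commonly via EM); all numbers and names below are illustrative.

```python
import numpy as np
from scipy.stats import norm

def bma_pdf(x, member_means, member_sigmas, weights):
    """Overall BMA forecast PDF: a weighted mixture of the individual
    model PDFs, taken here as Gaussians centred on each forecast."""
    weights = np.asarray(weights) / np.sum(weights)
    return sum(w * norm.pdf(x, m, s)
               for w, m, s in zip(weights, member_means, member_sigmas))

# Three hypothetical surge forecasts (cm) with spreads and weights
# learned over a recent training window (values illustrative only).
x = np.linspace(0, 80, 400)
pdf = bma_pdf(x, member_means=[35, 42, 38], member_sigmas=[6, 8, 5],
              weights=[0.5, 0.2, 0.3])
print(round(np.trapz(x * pdf, x), 1))  # mixture mean of the combined PDF
```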
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1988-01-01
This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
Modified signed-digit trinary addition using synthetic wavelet filter
NASA Astrophysics Data System (ADS)
Iftekharuddin, K. M.; Razzaque, M. A.
2000-09-01
The modified signed-digit (MSD) number system has been a topic of interest as it allows for parallel carry-free addition of two numbers in digital optical computing. In this paper, a harmonic wavelet joint transform (HWJT)-based correlation technique is introduced for the optical implementation of an MSD trinary adder. The realization of carry-propagation-free addition of MSD trinary numerals is demonstrated using a synthetic HWJT correlator model. It is also shown that the proposed synthetic-wavelet-filter-based correlator delivers high performance in logic processing. Simulation results are presented to validate the performance of the proposed technique.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Y.A.; Chapman, D.M.; Hill, D.J.
2000-12-15
The dynamic rod worth measurement (DRWM) technique is a method of quickly validating the predicted bank worth of control rods and shutdown rods. The DRWM analytic method is based on three-dimensional, space-time kinetic simulations of the rapid rod movements. Its measurement data is processed with an advanced digital reactivity computer. DRWM has been used as the method of bank worth validation at numerous plant startups with excellent results. The process and methodology of DRWM are described, and the measurement results of using DRWM are presented.
Wass, Sam V
2014-08-01
Convergent research points to the importance of studying the ontogenesis of sustained attention during the early years of life, but little research hitherto has compared and contrasted different techniques available for measuring sustained attention. Here, we compare methods that have been used to assess one parameter of sustained attention, namely infants' peak look duration to novel stimuli. Our focus was to assess whether individual differences in peak look duration are stable across different measurement techniques. In a single cohort of 42 typically developing 11-month-old infants we assessed peak look duration using six different measurement paradigms (four screen-based, two naturalistic). Zero-order correlations suggested that individual differences in peak look duration were stable across all four screen-based paradigms, but no correlations were found between peak look durations observed on the screen-based and the naturalistic paradigms. A factor analysis conducted on the dependent variable of peak look duration identified two factors. All four screen-based tasks loaded onto the first factor, but the two naturalistic tasks did not relate, and mapped onto a different factor. Our results question how individual differences observed on screen-based tasks manifest in more ecologically valid contexts. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhang, Jie; Nixon, Andrew; Barber, Tom; Budyn, Nicolas; Bevan, Rhodri; Croxford, Anthony; Wilcox, Paul
2018-04-01
In this paper, a methodology for using finite element (FE) models to validate a ray-based model in the simulation of full matrix capture (FMC) ultrasonic array data sets is proposed. The overall aim is to separate the signal contributions from different interactions in the FE results so that each individual component can be compared more easily with the ray-based model results. This is achieved by combining the results from multiple FE models of the system of interest that include progressively more geometrical features while preserving the same mesh structure. It is shown that the proposed techniques allow the interactions from a large number of different ray paths to be isolated in the FE results and compared directly to the results from a ray-based forward model.
Raman sorting and identification of single living micro-organisms with optical tweezers
NASA Astrophysics Data System (ADS)
Xie, Changan; Chen, De; Li, Yong-Qing
2005-07-01
We report on a novel technique for sorting and identification of single biological cells and food-borne bacteria based on laser tweezers and Raman spectroscopy (LTRS). With this technique, biological cells of different physiological states in a sample chamber were identified by their Raman spectral signatures and then they were selectively manipulated into a clean collection chamber with optical tweezers through a microchannel. As an example, we sorted the live and dead yeast cells into the collection chamber and validated this with a standard staining technique. We also demonstrated that bacteria existing in spoiled foods could be discriminated from a variety of food particles based on their characteristic Raman spectra and then isolated with laser manipulation. This label-free LTRS sorting technique may find broad applications in microbiology and rapid examination of food-borne diseases.
Distributed acoustic sensing technique and its field trial in SAGD well
NASA Astrophysics Data System (ADS)
Han, Li; He, Xiangge; Pan, Yong; Liu, Fei; Yi, Duo; Hu, Chengjun; Zhang, Min; Gu, Lijuan
2017-10-01
Steam assisted gravity drainage (SAGD) is a very promising way to develop heavy oil, extra heavy oil and tight oil reservoirs. Proper monitoring of SAGD operations is essential to avoid operational issues and improve efficiency. Among the available monitoring techniques, micro-seismic monitoring and related interpretation methods can give useful information about steam chamber development and have been extensively studied. Distributed acoustic sensing (DAS) based on Rayleigh backscattering is a newly developed technique that can measure acoustic signals at all points along the sensing fiber. In this paper, we demonstrate a DAS system based on a dual-pulse heterodyne demodulation technique and report a field trial in a SAGD well located in the Xinjiang Oilfield, China. The field trial results validated the performance of the DAS system and indicated its applicability in steam-chamber monitoring and hydraulic monitoring.
How accurately can we estimate energetic costs in a marine top predator, the king penguin?
Halsey, Lewis G; Fahlman, Andreas; Handrich, Yves; Schmidt, Alexander; Woakes, Anthony J; Butler, Patrick J
2007-01-01
King penguins (Aptenodytes patagonicus) are among the greatest consumers of marine resources. However, while their influence on the marine ecosystem is likely to be significant, only an accurate knowledge of their energy demands will indicate their true food requirements. Energy consumption has been estimated for many marine species using the heart rate-rate of oxygen consumption (fH-VO2) technique, and the technique has been applied successfully to answer eco-physiological questions. However, previous studies on the energetics of king penguins, based on developing or applying this technique, have raised a number of issues about the degree of validity of the technique for this species. These include the predictive validity of the present fH-VO2 equations across different seasons and individuals and during different modes of locomotion. In many cases, these issues also apply to other species for which the fH-VO2 technique has been applied. In the present study, the accuracy of three prediction equations for king penguins was investigated based on validity studies and on estimates of VO2 from published field fH data. The major conclusions from the present study are: (1) in contrast to that for walking, the fH-VO2 relationship for swimming king penguins is not affected by body mass; (2) prediction equation (1), log(VO2) = -0.279 + 1.24 log(fH) + 0.0237 t - 0.0157 log(fH) t, derived in a previous study, is the most suitable equation presently available for estimating VO2 in king penguins for all locomotory and nutritional states. A number of possible problems associated with producing an fH-VO2 relationship are discussed in the present study. Finally, a statistical method to include easy-to-measure morphometric characteristics, which may improve the accuracy of fH-VO2 prediction equations, is explained.
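Rearranged for VO2, prediction equation (1) can be applied directly. A small sketch follows, assuming base-10 logarithms; the meaning and units of t and VO2 follow the original study and are not given in the abstract, so the numbers below are illustrative only.

```python
import numpy as np

def vo2_from_fh(f_h, t):
    """Prediction equation (1) from the abstract, solved for VO2:
    log(VO2) = -0.279 + 1.24*log(fH) + 0.0237*t - 0.0157*log(fH)*t
    Base-10 logarithms are assumed; inputs are illustrative."""
    log_fh = np.log10(f_h)
    log_vo2 = -0.279 + 1.24 * log_fh + 0.0237 * t - 0.0157 * log_fh * t
    return 10.0 ** log_vo2

print(vo2_from_fh(f_h=120.0, t=2.0))
```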
Empirical performance of interpolation techniques in risk-neutral density (RND) estimation
NASA Astrophysics Data System (ADS)
Bahaludin, H.; Abdullah, M. H.
2017-03-01
The objective of this study is to evaluate the empirical performance of interpolation techniques in risk-neutral density (RND) estimation. Firstly, the empirical performance is evaluated using statistical analysis based on the implied mean and the implied variance of the RND. Secondly, the interpolation performance is measured based on pricing error. We propose using the leave-one-out cross-validation (LOOCV) pricing error for interpolation selection purposes. The statistical analyses indicate that there are statistical differences between the interpolation techniques: second-order polynomial, fourth-order polynomial and smoothing spline. The LOOCV pricing error results show that interpolation using a fourth-order polynomial provides the best fit to option prices, as it has the lowest error.
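A minimal sketch of LOOCV-based interpolation selection, here scoring polynomial fits of different degrees on invented strike/price data (function names and values are illustrative, not the study's data):

```python
import numpy as np

def loocv_error(x, y, degree):
    """Leave-one-out cross-validation RMSE for polynomial interpolation:
    refit without each point in turn and record the prediction error."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        coeffs = np.polyfit(x[mask], y[mask], degree)
        errs.append(y[i] - np.polyval(coeffs, x[i]))
    return np.sqrt(np.mean(np.square(errs)))

# Hypothetical strikes and option prices (illustrative values only).
strikes = np.array([80, 90, 100, 110, 120, 130.0])
prices = np.array([21.5, 13.2, 7.1, 3.4, 1.5, 0.6])
for deg in (2, 4):
    print(deg, loocv_error(strikes, prices, deg))
```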
NASA Technical Reports Server (NTRS)
Yam, Yeung; Johnson, Timothy L.; Lang, Jeffrey H.
1987-01-01
A model reduction technique based on aggregation with respect to sensor and actuator influence functions rather than modes is presented for large systems of coupled second-order differential equations. Perturbation expressions which can predict the effects of spillover on both the reduced-order plant model and the neglected plant model are derived. For the special case of collocated actuators and sensors, these expressions lead to the derivation of constraints on the controller gains that are, given the validity of the perturbation technique, sufficient to guarantee the stability of the closed-loop system. A case study demonstrates the derivation of stabilizing controllers based on the present technique. The use of control and observation synthesis in modifying the dimension of the reduced-order plant model is also discussed. A numerical example is provided for illustration.
Cluster-based analysis improves predictive validity of spike-triggered receptive field estimates
Malone, Brian J.
2017-01-01
Spectrotemporal receptive field (STRF) characterization is a central goal of auditory physiology. STRFs are often approximated by the spike-triggered average (STA), which reflects the average stimulus preceding a spike. In many cases, the raw STA is subjected to a threshold defined by gain values expected by chance. However, such correction methods have not been universally adopted, and the consequences of specific gain-thresholding approaches have not been investigated systematically. Here, we evaluate two classes of statistical correction techniques, using the resulting STRF estimates to predict responses to a novel validation stimulus. The first, more traditional technique eliminated STRF pixels (time-frequency bins) with gain values expected by chance. This correction method yielded significant increases in prediction accuracy, including when the threshold setting was optimized for each unit. The second technique was a two-step thresholding procedure wherein clusters of contiguous pixels surviving an initial gain threshold were then subjected to a cluster mass threshold based on summed pixel values. This approach significantly improved upon even the best gain-thresholding techniques. Additional analyses suggested that allowing threshold settings to vary independently for excitatory and inhibitory subfields of the STRF resulted in only marginal additional gains, at best. In summary, augmenting reverse correlation techniques with principled statistical correction choices increased prediction accuracy by over 80% for multi-unit STRFs and by over 40% for single-unit STRFs, furthering the interpretational relevance of the recovered spectrotemporal filters for auditory systems analysis. PMID:28877194
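A minimal sketch of the two-step cluster-mass thresholding procedure described above, assuming a 2-D STA array; thresholds, names, and the toy STA are illustrative, not the study's parameters.

```python
import numpy as np
from scipy import ndimage

def cluster_mass_threshold(sta, gain_thresh, mass_thresh):
    """Two-step STRF cleanup: keep pixels above an initial gain
    threshold only if they belong to a contiguous cluster whose
    summed |gain| exceeds the cluster-mass threshold."""
    mask = np.abs(sta) > gain_thresh
    labels, n = ndimage.label(mask)  # contiguous pixel clusters
    out = np.zeros_like(sta)
    for k in range(1, n + 1):
        cluster = labels == k
        if np.sum(np.abs(sta[cluster])) > mass_thresh:
            out[cluster] = sta[cluster]
    return out

# Toy STA: weak noise plus one strong localized subfield.
rng = np.random.default_rng(1)
sta = 0.1 * rng.standard_normal((40, 60))
sta[10:14, 20:26] += 1.0
cleaned = cluster_mass_threshold(sta, gain_thresh=0.4, mass_thresh=3.0)
print(np.count_nonzero(cleaned))  # surviving pixels of the true subfield
```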
Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Rey-Abella, Ferran; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam
2016-05-01
People with Down syndrome present skeletal abnormalities in their feet that can be analyzed by commonly used gold standard indices (the Hernández-Corvo index, the Chippaux-Smirak index, the Staheli arch index, and the Clarke angle) based on footprint measurements. The use of Photoshop CS5 software (Adobe Systems Software Ireland Ltd, Dublin, Ireland) to measure footprints has been validated in the general population. The present study aimed to assess the reliability and validity of this footprint assessment technique in the population with Down syndrome. Using optical podography and photography, 44 footprints from 22 patients with Down syndrome (11 men [mean ± SD age, 23.82 ± 3.12 years] and 11 women [mean ± SD age, 24.82 ± 6.81 years]) were recorded in a static bipedal standing position. A blinded observer performed the measurements using a validated manual method three times during the 4-month study, with 2 months between measurements. Test-retest was used to check the reliability of the Photoshop CS5 software measurements. Validity and reliability were obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed very good values for the Photoshop CS5 method (ICC, 0.982-0.995). Validity testing also found no differences between the techniques (ICC, 0.988-0.999). The Photoshop CS5 software method is reliable and valid for the study of footprints in young people with Down syndrome.
NASA Astrophysics Data System (ADS)
Ranaie, Mehrdad; Soffianian, Alireza; Pourmanafi, Saeid; Mirghaffari, Noorollah; Tarkesh, Mostafa
2018-03-01
In the recent decade, analysis of remotely sensed imagery has become one of the most common and widely used procedures in environmental studies. In this context, supervised image classification techniques play a central role. Hence, using a high-resolution WorldView-3 image acquired over a mixed urbanized landscape in Iran, three less commonly applied image classification methods, Bagged CART, the stochastic gradient boosting model and a neural network with feature extraction, were tested and compared with two prevalent methods: random forest and a support vector machine with a linear kernel. To do so, each method was run ten times, and three validation techniques were used to estimate the accuracy statistics, consisting of cross-validation, independent validation and validation with the total training data. Moreover, using ANOVA and Tukey's test, the statistical significance of differences between the classification methods was assessed. In general, the results showed that random forest, by a marginal difference compared to Bagged CART and the stochastic gradient boosting model, is the best performing method, whilst based on independent validation there was no significant difference between the performances of the classification methods. It should finally be noted that the neural network with feature extraction and the linear support vector machine had better processing speed than the others.
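A minimal sketch of the repeated-runs comparison with ANOVA and Tukey's test. The accuracies below are simulated stand-ins for the ten runs per method described above; names and values are illustrative.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Ten simulated accuracy runs per classifier.
rng = np.random.default_rng(7)
acc = {
    "RF": rng.normal(0.91, 0.01, 10),
    "SVM": rng.normal(0.89, 0.01, 10),
    "BaggedCART": rng.normal(0.90, 0.01, 10),
}
print(f_oneway(*acc.values()))            # overall one-way ANOVA

values = np.concatenate(list(acc.values()))
groups = np.repeat(list(acc.keys()), 10)  # group label per run
print(pairwise_tukeyhsd(values, groups))  # pairwise Tukey HSD table
```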
Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel
2017-06-15
Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
Validation of the NASA Dryden X-31 simulation and evaluation of mechanization techniques
NASA Technical Reports Server (NTRS)
Dickes, Edward; Kay, Jacob; Ralston, John
1994-01-01
This paper discusses the evaluation of the original Dryden X-31 aerodynamic math model, the processes involved in the justification and creation of the modified data base, and comparisons of time-history results of the model response with flight test data.
Loop-Extended Symbolic Execution on Binary Programs
2009-03-02
1434. Based on its specification [35], one valid message format contains 2 fields: a header byte of value 4, followed by a string giving a database ...potentially become expensive. For instance, the polyhedron technique [16] requires costly conversion operations on a multi-dimensional abstract representation
Measuring Electrical Current: The Roads Not Taken
ERIC Educational Resources Information Center
Greenslade, Thomas B., Jr.
2011-01-01
Recently I wrote about the standard Weston meter movement that is at the heart of all modern analogue current measurements. Now I will discuss other techniques used to measure electric current that, despite being based on valid physical principles, are largely lost in technological history.
Soto, Marcelo A; Lu, Xin; Martins, Hugo F; Gonzalez-Herraez, Miguel; Thévenaz, Luc
2015-09-21
In this paper a technique to measure the distributed birefringence profile along optical fibers is proposed and experimentally validated. The method is based on the spectral correlation between two sets of orthogonally-polarized measurements acquired using a phase-sensitive optical time-domain reflectometer (ϕOTDR). The correlation between the two measured spectra gives a resonance (correlation) peak at a frequency detuning that is proportional to the local refractive index difference between the two orthogonal polarization axes of the fiber. In this way the method enables local phase birefringence measurements at any position along optical fibers, so that any longitudinal fluctuation can be precisely evaluated with metric spatial resolution. The method has been experimentally validated by measuring fibers with low and high birefringence, such as standard single-mode fibers as well as conventional polarization-maintaining fibers. The technique has potential applications in the characterization of optical fibers for telecommunications as well as in distributed optical fiber sensing.
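A minimal sketch of the spectral-correlation step: cross-correlate the two orthogonally polarized spectra at one fiber position and read off the detuning of the correlation peak. Synthetic Gaussian spectra and invented names stand in for the ϕOTDR measurements.

```python
import numpy as np

def birefringence_detuning(spec_x, spec_y, df):
    """Frequency detuning (in units of df) at which the two
    orthogonally polarized spectra are maximally correlated;
    the detuning is proportional to the local birefringence."""
    sx = spec_x - spec_x.mean()
    sy = spec_y - spec_y.mean()
    cc = np.correlate(sx, sy, mode="full")
    lag = np.argmax(cc) - (len(sx) - 1)
    return lag * df

# Toy spectra: a Gaussian resonance shifted by 40 MHz between the axes.
f = np.arange(-500, 500, 2.0)                 # MHz grid
peak = lambda f0: np.exp(-((f - f0) / 30.0) ** 2)
print(birefringence_detuning(peak(0.0), peak(40.0), df=2.0))  # ~ +/-40 MHz
```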
Three-dimensional head anthropometric analysis
NASA Astrophysics Data System (ADS)
Enciso, Reyes; Shaw, Alex M.; Neumann, Ulrich; Mah, James
2003-05-01
Currently, two-dimensional photographs are most commonly used to facilitate visualization, assessment and treatment of facial abnormalities in craniofacial care, but they are subject to errors because of perspective and projection and lack metric and 3-dimensional information. One can find in the literature a variety of methods to generate 3-dimensional facial images, such as laser scans, stereo-photogrammetry, infrared imaging and even CT; however, each of these methods contains inherent limitations, and as such no systems are in common clinical use. In this paper we will focus on the development of indirect 3-dimensional landmark location and measurement of facial soft tissue with light-based techniques. We will statistically evaluate and validate a current three-dimensional image-based face modeling technique using a plaster head model. We will also develop computer graphics tools for indirect anthropometric measurements in a three-dimensional head model (or polygonal mesh), including linear distances currently used in anthropometry. The measurements will be tested against a validated 3-dimensional digitizer (MicroScribe 3DX).
Measuring adverse events in helicopter emergency medical services: establishing content validity.
Patterson, P Daniel; Lave, Judith R; Martin-Gill, Christian; Weaver, Matthew D; Wadas, Richard J; Arnold, Robert M; Roth, Ronald N; Mosesso, Vincent N; Guyette, Francis X; Rittenberger, Jon C; Yealy, Donald M
2014-01-01
We sought to create a valid framework for detecting adverse events (AEs) in the high-risk setting of helicopter emergency medical services (HEMS). We assembled a panel of 10 expert clinicians (n = 6 emergency medicine physicians and n = 4 prehospital nurses and flight paramedics) affiliated with a large multistate HEMS organization in the Northeast US. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the content validity index (CVI), to quantify the validity of the framework's content. The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: (1) a trigger tool, (2) a method for rating proximal cause, and (3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. We demonstrate a standardized process for the development of a content-valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS.
Expert system verification and validation study: ES V/V Workshop
NASA Technical Reports Server (NTRS)
French, Scott; Hamilton, David
1992-01-01
The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) to expert systems. To achieve this, some background in V&V as applied to conventionally implemented software is required. Part one discusses the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one also overviews some common analysis techniques that are applied when performing V&V of software. All of these materials are presented on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.
NASA Astrophysics Data System (ADS)
Velarde, P.; Valverde, L.; Maestre, J. M.; Ocampo-Martinez, C.; Bordons, C.
2017-03-01
In this paper, a performance comparison among three well-known stochastic model predictive control approaches, namely multi-scenario, tree-based, and chance-constrained model predictive control, is presented. To this end, three predictive controllers were designed and implemented in a real renewable-hydrogen-based microgrid. The experimental set-up includes a PEM electrolyzer, lead-acid batteries, and a PEM fuel cell as the main equipment. The experimental results show significant differences in the behaviour of the plant components, mainly in terms of energy use, for each implemented technique. The effectiveness, performance, advantages, and disadvantages of these techniques are extensively discussed and analyzed to provide valid criteria for selecting an appropriate stochastic predictive controller.
Arduino-based noise robust online heart-rate detection.
Das, Sangita; Pal, Saurabh; Mitra, Madhuchhanda
2017-04-01
This paper introduces a noise-robust real-time heart rate detection system based on electrocardiogram (ECG) data. An online data acquisition system was developed to collect ECG signals from human subjects. Heart rate is detected using a window-based autocorrelation peak localisation technique. A low-cost Arduino UNO board is used to implement the complete automated process. The performance of the system is compared with a PC-based heart rate detection technique. The accuracy of the system is validated with simulated noisy ECG data at various levels of signal-to-noise ratio (SNR). The mean percentage error of the detected heart rate is found to be 0.72% for the noisy database with five different noise levels.
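A minimal sketch of window-based autocorrelation peak localisation for heart-rate estimation. This is not the authors' Arduino code; the function names and the synthetic impulse-train ECG are invented for illustration.

```python
import numpy as np

def heart_rate_bpm(ecg, fs, min_bpm=40, max_bpm=200):
    """Estimate heart rate from one ECG window by locating the dominant
    autocorrelation peak within a physiological lag range."""
    x = ecg - np.mean(ecg)
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # non-negative lags
    lo = int(fs * 60.0 / max_bpm)                      # shortest RR (samples)
    hi = int(fs * 60.0 / min_bpm)                      # longest RR (samples)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return 60.0 * fs / lag

# Toy ECG: a 72-bpm impulse train sampled at fs = 250 Hz.
fs, bpm = 250, 72
t = np.arange(0, 10, 1.0 / fs)
ecg = (np.mod(t, 60.0 / bpm) < 1.0 / fs).astype(float)
print(round(heart_rate_bpm(ecg, fs), 1))  # ~72
```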
Artificial intelligence techniques for monitoring dangerous infections.
Lamma, Evelina; Mello, Paola; Nanetti, Anna; Riguzzi, Fabrizio; Storari, Sergio; Valastro, Gianfranco
2006-01-01
The monitoring and detection of nosocomial infections is a very important problem arising in hospitals. A hospital-acquired or nosocomial infection is a disease that develops after admission into the hospital and is the consequence of a treatment, not necessarily a surgical one, performed by the medical staff. Nosocomial infections are dangerous because they are caused by bacteria with critical resistance to antibiotics. This problem is very serious all over the world. In Italy, approximately 5-8% of the patients admitted into hospitals develop this kind of infection. In order to reduce this figure, policies for controlling infections should be adopted by medical practitioners. In order to support them in this complex task, we have developed a system, called MERCURIO, capable of managing different aspects of the problem. The objectives of this system are the validation of microbiological data and the creation of a real-time epidemiological information system. The system is useful for laboratory physicians, because it supports them in the execution of the microbiological analyses; for clinicians, because it supports them in the definition of prophylaxis and of the most suitable antibiotic therapy and in monitoring patients' infections; and for epidemiologists, because it allows them to identify outbreaks and to study infection dynamics. In order to achieve these objectives, we have adopted expert system and data mining techniques. We have also integrated a statistical module that monitors the diffusion of nosocomial infections over time in the hospital and that strictly interacts with the knowledge-based module. Data mining techniques have been used to improve the system knowledge base. The knowledge discovery process is not antithetical to, but complementary with, the one based on manual knowledge elicitation. In order to verify the reliability of the tasks performed by MERCURIO and the usefulness of the knowledge discovery approach, we performed a test based on a dataset of real infection events. In the validation task MERCURIO achieved an accuracy of 98.5%, a sensitivity of 98.5% and a specificity of 99%. In the therapy suggestion task, MERCURIO achieved very high accuracy and specificity as well. The executed test also provided many insights to the experts (we discovered some of their mistakes). The knowledge discovery approach was very effective in validating part of the MERCURIO knowledge base, and also in extending it with new validation rules, confirmed by the interviewed microbiologists and specific to the hospital laboratory under consideration.
NASA Astrophysics Data System (ADS)
Jiménez-Varona, J.; Ponsin Roca, J.
2015-06-01
Under a contract with AIRBUS MILITARY (AI-M), an exercise to analyze the potential of optimization techniques to improve wing performance at cruise conditions has been carried out using an in-house design code. The original wing was provided by AI-M and several constraints were posed for the redesign. To maximize the aerodynamic efficiency at cruise, optimizations were performed using the design techniques developed internally at INTA under a research program (Programa de Termofluidodinámica). The code is a gradient-based optimization code, which uses a classical finite-difference approach for gradient computations. Several techniques for search direction computation are implemented for unconstrained and constrained problems. Techniques for geometry modification are based on different approaches, including perturbation functions for the thickness and/or mean line distributions and Bézier curve fitting of a certain degree. It is very important to address a real design involving several constraints, which significantly reduce the feasible design space, and an assessment of the code is needed in order to check its capabilities and possible drawbacks. Lessons learnt will help in the development of future enhancements. In addition, the results were validated using the well-known TAU flow solver and a far-field drag method in order to accurately determine the improvement in terms of drag counts.
Sirimanna, Pramudith; Gladman, Marc A
2017-10-01
Proficiency-based virtual reality (VR) training curricula improve intraoperative performance, but have not been developed for laparoscopic appendicectomy (LA). This study aimed to develop an evidence-based training curriculum for LA. A total of 10 experienced (>50 LAs), eight intermediate (10-30 LAs) and 20 inexperienced (<10 LAs) operators performed guided and unguided LA tasks on a high-fidelity VR simulator using internationally relevant techniques. The ability to differentiate levels of experience (construct validity) was measured using simulator-derived metrics. Learning curves were analysed. Proficiency benchmarks were defined by the performance of the experienced group. Intermediate and experienced participants completed a questionnaire to evaluate the realism (face validity) and relevance (content validity). Of 18 surgeons, 16 (89%) considered the VR model to be visually realistic and 17 (95%) believed that it was representative of actual practice. All 'guided' modules demonstrated construct validity (P < 0.05), with learning curves that plateaued between sessions 6 and 9 (P < 0.01). When comparing inexperienced to intermediates to experienced, the 'unguided' LA module demonstrated construct validity for economy of motion (5.00 versus 7.17 versus 7.84, respectively; P < 0.01) and task time (864.5 s versus 477.2 s versus 352.1 s, respectively, P < 0.01). Construct validity was also confirmed for number of movements, path length and idle time. Validated modules were used for curriculum construction, with proficiency benchmarks used as performance goals. A VR LA model was realistic and representative of actual practice and was validated as a training and assessment tool. Consequently, the first evidence-based internationally applicable training curriculum for LA was constructed, which facilitates skill acquisition to proficiency. © 2017 Royal Australasian College of Surgeons.
Ladner, Tobias; Flitsch, David; Schlepütz, Tino; Büchs, Jochen
2015-10-09
During the past years, new high-throughput screening systems with online monitoring capabilities have turned out to be powerful tools for the characterization of microbial cell cultures. These systems are often easy to use, offer economic advantages compared to larger systems and allow many important process parameters to be determined within a short time. Fluorescent protein tags have tremendously simplified the tracking and observation of cellular activity in vivo. Unfortunately, interferences appeared between established fluorescence-based dissolved oxygen tension (DOT) measurement techniques and fluorescence-based protein tags. Therefore, the applicability of new oxygen-sensitive nanoparticles operating within the more suitable infrared wavelength region is introduced and validated for DOT measurement. The biocompatibility of the dispersed oxygen-sensitive nanoparticles was proven via RAMOS cultivations for Hansenula polymorpha, Gluconobacter oxydans, and Escherichia coli. The applicability of the introduced DOT measurement technique for online monitoring of cultivations was demonstrated and successfully validated. The nanoparticles showed no disturbing effect on the online measurement of the fluorescence intensities of the proteins GFP, mCherry and YFP measured by a BioLector prototype. Additionally, the DOT measurement was not influenced by changing concentrations of these proteins. The kLa values for the applied cultivation conditions were successfully determined based on the measured DOT. The introduced technique appears to be both practically and economically advantageous for online DOT measurement in microtiter plates. The disadvantage of the limited availability of microtiter plates with immobilized sensor spots (optodes) does not apply to this technique. Owing to the infrared wavelength range used for the DOT measurement, no interferences with biogenic fluorescence or with expressed fluorescent proteins (e.g. YFP, GFP or mCherry) occur.
Evaluating Remotely-Sensed Soil Moisture Retrievals Using Triple Collocation Techniques
USDA-ARS?s Scientific Manuscript database
The validation of footprint-scale (~40 km) surface soil moisture retrievals from space is complicated by a lack of ground-based soil moisture instrumentation and challenges associated with up-scaling point-scale measurements from such instrumentation. Recent work has demonstrated the potential of e...
Malinauskas, Karolis; Palevicius, Paulius; Ragulskis, Minvydas; Ostasevicius, Vytautas; Dauksevicius, Rolanas
2013-01-01
Examination of wrist radial pulse is a noninvasive diagnostic method, which occupies a very important position in Traditional Chinese Medicine. It is based on manual palpation and therefore relies largely on the practitioner's subjective technical skills and judgment. Consequently, it lacks reliability and consistency, which limits practical applications in clinical medicine. Thus, quantifiable characterization of the wrist pulse diagnosis method is a prerequisite for its further development and widespread use. This paper reports application of a noninvasive CCD sensor-based hybrid measurement system for radial pulse signal analysis. First, artery wall deformations caused by the blood flow are calibrated with a laser triangulation displacement sensor, followed by the measurement of the deformations with the projection moiré method. Different input pressures and fluids of various viscosities are used in the assembled artificial blood flow system in order to test the performance of the laser triangulation technique, with detection sensitivity enhanced through a microfabricated retroreflective optical element placed on a synthetic vascular graft. Subsequently, the applicability of the double-exposure whole-field projection moiré technique for registration of blood flow pulses is considered: a computational model and representative example are provided, followed by an in vitro experiment performed on a vascular graft with artificial skin atop, which validates the suitability of the technique for characterization of skin surface deformations caused by the radial pulsation. PMID:23609803
Gao, Chao; Sun, Hanbo; Wang, Tuo; Tang, Ming; Bohnen, Nicolaas I; Müller, Martijn L T M; Herman, Talia; Giladi, Nir; Kalinin, Alexandr; Spino, Cathie; Dauer, William; Hausdorff, Jeffrey M; Dinov, Ivo D
2018-05-08
In this study, we apply a multidisciplinary approach to investigate falls in Parkinson's disease (PD) patients using clinical, demographic and neuroimaging data from two independent initiatives (University of Michigan and Tel Aviv Sourasky Medical Center). Using machine learning techniques, we construct predictive models to discriminate fallers and non-fallers. Through controlled feature selection, we identified the most salient predictors of patient falls, including gait speed, Hoehn and Yahr stage, and postural instability and gait difficulty-related measurements. The model-based and model-free analytical methods we employed included logistic regression, random forests, support vector machines, and XGBoost. The reliability of the forecasts was assessed by internal statistical (5-fold) cross-validation as well as by external out-of-bag validation. Four specific challenges were addressed in the study: Challenge 1, develop a protocol for harmonizing and aggregating complex, multisource, and multi-site Parkinson's disease data; Challenge 2, identify salient predictive features associated with specific clinical traits, e.g., patient falls; Challenge 3, forecast patient falls and evaluate the classification performance; and Challenge 4, predict tremor dominance (TD) vs. postural instability and gait difficulty (PIGD). Our findings suggest that, compared to other approaches, model-free machine learning based techniques provide more reliable clinical outcome forecasting of falls in Parkinson's patients, for example, with a classification accuracy of about 70-80%.
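The kind of classification pipeline described above can be illustrated with a short scikit-learn sketch. The features, synthetic data, and model settings below are illustrative placeholders of my own, not the study's dataset or tuned models.

```python
# Minimal sketch of a faller vs. non-faller classification pipeline with
# 5-fold cross-validation, in the spirit of the model-free methods above.
# Feature names and the synthetic data are illustrative, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical predictors: gait speed, Hoehn & Yahr stage, a PIGD-related score
X = np.column_stack([
    rng.normal(1.0, 0.3, n),   # gait speed (m/s)
    rng.integers(1, 5, n),     # Hoehn & Yahr stage
    rng.normal(0.5, 0.2, n),   # PIGD-related measure
])
y = (X[:, 0] < 1.0).astype(int) ^ (rng.random(n) < 0.1)  # noisy synthetic label

for name, clf in [("logistic", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean 5-fold accuracy = {acc.mean():.2f}")
```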
Cognitive techniques and language: A return to behavioral origins.
Froján Parga, María X; Núñez de Prado Gordillo, Miguel; de Pascual Verdú, Ricardo
2017-08-01
The main purpose of this study is to offer an alternative explanatory account of the functioning of cognitive techniques that is based on the principles of associative learning and highlights their verbal nature. The traditional accounts are questioned and analyzed in the light of the situation of psychology in the 1970s. Conceptual analysis is employed to revise the concepts of language, cognition and behavior. Several operant- and Pavlovian-based approaches to these phenomena are presented, with particular emphasis on Mowrer's (1954) approach and on Ryle's (1949) and Wittgenstein's (1953) philosophical contributions to the field. Several logical problems are found in regard to the theoretical foundations of cognitive techniques. A combination of both operant and Pavlovian paradigms based on the above-mentioned approaches is offered as an alternative explanatory account of cognitive techniques. This new approach could overcome the conceptual fragilities of the cognitive standpoint and its dependence upon constructs of dubious logical and scientific validity.
NASA Astrophysics Data System (ADS)
Mishra, Anoop; Rafiq, Mohammd
2017-12-01
This is the first attempt to merge highly accurate precipitation estimates from the Global Precipitation Measurement (GPM) mission with gap-free satellite observations from Meteosat to develop a regional rainfall monitoring algorithm for estimating heavy rainfall over India and nearby oceanic regions. A rainfall signature is derived from Meteosat observations and is co-located against rainfall from GPM to establish a relationship between rainfall and signature for various rainy seasons. This relationship can be used to monitor rainfall over India and nearby oceanic regions. The performance of this technique was tested by applying it to monitor heavy precipitation over India. The algorithm is able to detect heavy rainfall, although it overestimates the areal spread of rainfall compared with the rain gauge based rainfall product. This deficiency may arise from various factors, including uncertainty caused by the use of different sensors from different platforms (differences in viewing geometry between MFG and GPM), the poor relationship between warm rain (light rain) and IR brightness temperature, and the weak characterization of orographic rain from the IR signature. We validated hourly rainfall estimated from the present approach with independent observations from GPM. We also validated daily rainfall from this approach with the rain gauge based product from the India Meteorological Department (IMD). The present technique shows a correlation coefficient (CC) of 0.76, a bias of -2.72 mm, a root mean square error (RMSE) of 10.82 mm, a probability of detection (POD) of 0.74, a false alarm ratio (FAR) of 0.34 and a skill score of 0.36 against daily rainfall from the rain gauge based product of IMD at 0.25° resolution. However, FAR reduces to 0.24 for heavy rainfall events. Validation results with rain gauge observations reveal that the present technique outperforms available satellite based rainfall estimates for monitoring heavy rainfall over the Indian region.
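The skill scores quoted above (CC, bias, RMSE, POD, FAR) are standard verification statistics. A minimal sketch of how they are computed follows; the rainfall values and the 2.5 mm rain/no-rain threshold are placeholders of my own.

```python
# Sketch of the validation statistics reported above (CC, bias, RMSE, POD, FAR)
# for a satellite rainfall estimate against gauge data. Arrays are placeholders.
import numpy as np

est = np.array([12.0, 0.0, 5.5, 30.2, 0.8, 14.0])   # estimated daily rain (mm)
obs = np.array([10.0, 0.5, 7.0, 25.0, 0.0, 16.5])   # gauge rainfall (mm)
thresh = 2.5                                         # assumed rain/no-rain threshold (mm)

cc = np.corrcoef(est, obs)[0, 1]                     # correlation coefficient
bias = np.mean(est - obs)                            # mean bias
rmse = np.sqrt(np.mean((est - obs) ** 2))            # root mean square error

hits = np.sum((est >= thresh) & (obs >= thresh))
misses = np.sum((est < thresh) & (obs >= thresh))
false_alarms = np.sum((est >= thresh) & (obs < thresh))
pod = hits / (hits + misses)                         # probability of detection
far = false_alarms / (hits + false_alarms)           # false alarm ratio
print(cc, bias, rmse, pod, far)
```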
Arab-Mazar, Zahra; Fallahi, Shirzad; Koochaki, Ameneh; Haghighi, Ali; Seyyed Tabaei, Seyyed Javad
2016-02-01
Serological assays for the diagnosis of toxoplasmosis mostly rely on the tachyzoite-specific antigens of Toxoplasma gondii, which are difficult to produce by conventional methods. The aim of this study was to clone and express the GRA7 protein of T. gondii, to evaluate its potential for immunodiagnosis of toxoplasmosis in cancer patients, and to validate the results using a new molecular assay, the LAMP technique. The GRA7 gene was successfully cloned, expressed and purified by affinity chromatography, and the product was evaluated by SDS-PAGE, dot blot and western blot analyses. The rGRA7 was used to develop an ELISA using sera from patients with toxoplasmosis and healthy controls. Furthermore, 50 serum samples from leukemic children infected with toxoplasmosis and 50 seronegative controls were included to evaluate the sensitivity and specificity of the rGRA7-based ELISA. Finally, the LAMP technique was used to assess the accuracy and validity of the results obtained by the rGRA7-based ELISA. The consistency of the results of the two tests was determined using the Kappa coefficient of agreement. The rGRA7 showed the highest and optimum immunoreactivity at a 1:100 dilution of serum from Toxoplasma-infected patients. The sensitivity and specificity of the test were calculated as 92 and 94%, respectively. According to the Kappa coefficient of agreement, there was significant agreement between the results obtained by the rGRA7-based ELISA and the results of the LAMP technique (≈96%, P<0.001). The findings of the present study showed that rGRA7 can be used as a potential immunogenic antigen for developing immunodiagnostic tools for toxoplasmosis in patients, including patients with cancer. Copyright © 2015. Published by Elsevier GmbH.
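A minimal sketch of the agreement statistics used above: sensitivity and specificity of a diagnostic test, and Cohen's kappa between two tests. The 2x2 counts are illustrative, not the study's data.

```python
# Sensitivity/specificity of a test against a reference, and Cohen's kappa
# between two tests (e.g., ELISA vs. LAMP). Counts are hypothetical.
tp, fn, fp, tn = 46, 4, 3, 47  # hypothetical true/false positives/negatives

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

def cohen_kappa(a: int, b: int, c: int, d: int) -> float:
    """Kappa for a 2x2 agreement table [[a, b], [c, d]]."""
    n = a + b + c + d
    po = (a + d) / n                                      # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # chance agreement
    return (po - pe) / (1 - pe)

print(sensitivity, specificity, cohen_kappa(46, 4, 3, 47))
```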
NASA Technical Reports Server (NTRS)
Krishnan, G. S.
1997-01-01
A cost-effective model which uses artificial intelligence techniques in the selection and approval of parts is presented. The knowledge acquired from specialists for different part types is represented in a knowledge base in the form of rules and objects. The parts information is stored separately in a database and is isolated from the knowledge base. Validation, verification and performance issues are highlighted.
Agent independent task planning
NASA Technical Reports Server (NTRS)
Davis, William S.
1990-01-01
Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.
Ahmad, Zaki Uddin; Chao, Bing; Konggidinata, Mas Iwan; Lian, Qiyu; Zappi, Mark E; Gang, Daniel Dianchen
2018-04-27
Numerous research efforts have been devoted to the adsorption area using experimental approaches. All these approaches are based on a trial-and-error process and are extremely time consuming. Molecular simulation is a new tool that can be used to design and predict the performance of an adsorbent. This research proposed a simulation technique that can greatly reduce the time needed to design the adsorbent. In this study, a new rhombic ordered mesoporous carbon (OMC) model is proposed and constructed with various pore sizes and oxygen contents using the Materials Visualizer module to optimize the structure of OMC for resorcinol adsorption. The specific surface area, pore volume, small-angle X-ray diffraction pattern, and resorcinol adsorption capacity were calculated by the Forcite and Sorption modules in the Materials Studio package. The simulation results were validated experimentally by synthesizing OMC with different pore sizes and oxygen contents prepared via the hard template method employing an SBA-15 silica scaffold. Boric acid was used as the pore-expanding reagent to synthesize OMC with different pore sizes (from 4.6 to 11.3 nm) and varying oxygen contents (from 11.9% to 17.8%). Based on the simulation and experimental validation, the optimal pore size was found to be 6 nm for maximum adsorption of resorcinol. Copyright © 2018 Elsevier B.V. All rights reserved.
Monitoring of pipelines in nuclear power plants by measuring laser-based mechanical impedance
NASA Astrophysics Data System (ADS)
Lee, Hyeonseok; Sohn, Hoon; Yang, Suyoung; Yang, Jinyeol
2014-06-01
Using laser-based mechanical impedance (LMI) measurement, this study proposes a damage detection technique that enables structural health monitoring of pipelines under the high temperature and radioactive environments of nuclear power plants (NPPs). The applications of conventional electromechanical impedance (EMI) based techniques to NPPs have been limited, mainly due to the contact nature of piezoelectric transducers, which cannot survive under the high temperature and high radiation environments of NPPs. The proposed LMI measurement technique aims to tackle the limitations of the EMI techniques by utilizing noncontact laser beams for both ultrasound generation and sensing. An Nd:YAG pulse laser is used for ultrasound generation, and a laser Doppler vibrometer is employed for the measurement of the corresponding ultrasound responses. For the monitoring of pipes covered by insulation layers, this study utilizes optical fibers to guide the laser beams to specific target locations. Then, an outlier analysis is adopted for autonomous damage diagnosis. Validation of the proposed LMI technique is carried out on a carbon steel pipe elbow under varying temperatures. A corrosion defect chemically engraved in the specimen is successfully detected.
NASA Astrophysics Data System (ADS)
Rocha, Alby D.; Groen, Thomas A.; Skidmore, Andrew K.; Darvishzadeh, Roshanak; Willemen, Louise
2017-11-01
The growing number of narrow spectral bands in hyperspectral remote sensing improves the capacity to describe and predict biological processes in ecosystems. But it also poses a challenge to fit empirical models based on such high-dimensional data, which often contain correlated and noisy predictors. As sample sizes for training and validating empirical models do not seem to be increasing at the same rate, overfitting has become a serious concern. Overly complex models lead to overfitting by capturing more than the underlying relationship, and also by fitting random noise in the data. Many regression techniques claim to overcome these problems by using different strategies to constrain complexity, such as limiting the number of terms in the model, creating latent variables or shrinking parameter coefficients. This paper proposes a new method, named Naïve Overfitting Index Selection (NOIS), which makes use of artificially generated spectra to quantify the relative model overfitting and to select an optimal model complexity supported by the data. The robustness of this new method is assessed by comparing it to traditional model selection based on cross-validation. The optimal model complexity is determined for seven different regression techniques, such as partial least squares regression, support vector machines, artificial neural networks and tree-based regressions, using five hyperspectral datasets. The NOIS method selects less complex models, which present accuracies similar to the cross-validation method. The NOIS method reduces the chance of overfitting, thereby avoiding models that present accurate predictions that are only valid for the data used, and are too complex to make inferences about the underlying process.
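The NOIS procedure itself is not fully specified in this abstract; the sketch below illustrates only the traditional baseline it is compared against, namely selecting the number of PLS components by cross-validation on synthetic correlated spectra. Everything here is a placeholder of my own.

```python
# Baseline against which NOIS is compared: selecting the number of PLS
# components by cross-validation on (synthetic) high-dimensional spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_bands = 60, 200                # few samples, many correlated bands
X = rng.normal(size=(n_samples, n_bands))
X = np.cumsum(X, axis=1)                    # induce band-to-band correlation
y = X[:, 50] + 0.5 * X[:, 120] + rng.normal(scale=0.5, size=n_samples)

scores = []
for k in range(1, 16):                      # candidate model complexities
    pls = PLSRegression(n_components=k)
    r2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
    scores.append((k, r2))

best_k, best_r2 = max(scores, key=lambda s: s[1])
print(f"selected complexity: {best_k} components (CV R^2 = {best_r2:.2f})")
```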
Heo, Gwanghee; Jeon, Joonryong
2017-07-12
In this paper, a data compression technology-based intelligent data acquisition (IDAQ) system was developed for structural health monitoring of civil structures, and its validity was tested using random signals (El-Centro seismic waveform). The IDAQ system was structured to include a high-performance CPU with large dynamic memory for multi-input and output in a radio frequency (RF) manner. In addition, the embedded software technology (EST) has been applied to it to implement diverse logics needed in the process of acquiring, processing and transmitting data. In order to utilize IDAQ system for the structural health monitoring of civil structures, this study developed an artificial filter bank by which structural dynamic responses (acceleration) were efficiently acquired, and also optimized it on the random El-Centro seismic waveform. All techniques developed in this study have been embedded to our system. The data compression technology-based IDAQ system was proven valid in acquiring valid signals in a compressed size.
NASA Astrophysics Data System (ADS)
Prayogi, S.; Yuanita, L.; Wasis
2018-01-01
This study aimed to develop a Critical-Inquiry-Based-Learning (CIBL) model to promote the critical thinking (CT) ability of preservice teachers. The CIBL learning model was developed to meet the criteria of validity, practicality, and effectiveness. Validation of the model involved 4 expert validators through a focus group discussion (FGD). The CIBL learning model was declared valid to promote CT ability, with a validity level (Va) of 4.20 and reliability (r) of 90.1% (very reliable). The practicality of the model was evaluated when it was implemented with 17 preservice teachers. The CIBL learning model was declared practical, based on learning feasibility (LF) with very good criteria (LF score = 4.75). The effectiveness of the model was evaluated from the improvement in CT ability after the implementation of the model. CT ability was evaluated using a scoring technique adapted from the Ennis-Weir Critical Thinking Essay Test. The average CT score on the pretest is -1.53 (uncritical criteria), whereas on the posttest it is 8.76 (critical criteria), with an N-gain score of 0.76 (high criteria). Based on the results of this study, it can be concluded that the developed CIBL learning model is feasible for promoting the CT ability of preservice teachers.
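The normalized gain (N-gain) reported above is commonly computed with Hake's formula, g = (post - pre) / (max - pre). The maximum score used below is an assumption chosen only to reproduce the quoted 0.76; the paper's scoring scale may differ.

```python
# Hake's normalized gain. The max_score here is an illustrative assumption,
# not taken from the paper.
def n_gain(pre: float, post: float, max_score: float) -> float:
    return (post - pre) / (max_score - pre)

print(n_gain(pre=-1.53, post=8.76, max_score=12.0))  # ~0.76 with this assumed max
```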
NASA Astrophysics Data System (ADS)
Petropavlovskikh, I.; Weatherhead, E.; Cede, A.; Oltmans, S. J.; Kireev, S.; Maillard, E.; Bhartia, P. K.; Flynn, L. E.
2005-12-01
The first NPOESS satellite is scheduled to be launched in 2010 and will carry the Ozone Mapping and Profiler Suite (OMPS) instruments for ozone monitoring. Prior to this, the OMPS instruments and algorithms will be tested by flight on the NPOESS/NPP satellite, scheduled for launch in 2008. Pre-launch planning for validation, post-launch data validation and verification of the nadir and limb profile algorithms are key components for ensuring that the NPOESS will produce a high-quality, reliable ozone profile data set. The heritage of satellite instrument validation (TOMS, SBUV, GOME, SCIAMACHY, SAGE, HALOE, ATMOS, etc.) has always relied upon surface-based observations. While the global coverage of satellite observations is appealing for validating another satellite, there is no substitute for the hard reference point of a ground-based system such as the Dobson or Brewer network, whose instruments are routinely calibrated and intercompared to standard references. The standard solar occultation instruments, SAGE II and HALOE, are well beyond their planned lifetimes and might be inoperative during the OMPS period. The Umkehr network has been one of the key data sets for stratospheric ozone trend calculations and has earned its place as a benchmark network for stratospheric ozone profile observations. The normalization of measurements at different solar zenith angles (SZAs) to the measurement at the smallest SZA cancels out many calibration parameters, including the extra-terrestrial solar flux and the instrumental constant, thus providing a "self-calibrating" technique in the same manner relied upon by the occultation sensors on satellites. Moreover, the ground-based Umkehr measurement is the only technique that provides data with the same altitude resolution and in the same units (DU) as the UV-nadir instruments (SBUV-2, GOME-2, OMPS-nadir), i.e., as ozone amount in pressure layers, whereas occultation instruments measure ozone density with height. A new Umkehr algorithm will enhance the information content of the retrieved profiles and extend the applicability of the technique. Automated Dobson and Brewer instruments offer the potential for a greatly expanded network of Umkehr observations once the new algorithm is applied. We will discuss the new algorithm development and present results of its performance in comparisons of retrievals between co-located Brewer and Dobson ozone profiles measured at the Arosa station in Switzerland.
NASA Technical Reports Server (NTRS)
Hilsenrath, E.; Bojkov, B. R.; Labow, G.; Weber, M.; Burrows, J.
2004-01-01
Validation of satellite data remains a high priority for the construction of climate data sets. Traditionally, ground-based measurements have provided the primary comparison data for validation. For some atmospheric parameters such as ozone, a thoroughly validated satellite data record can be used to validate a new instrument's data product in addition to using ground-based data. Comparing validated data with new satellite data has several advantages: availability of much more data, which will improve precision; larger geographical coverage; and footprints that are closer in size, which removes uncertainty due to different observed atmospheric volumes. To demonstrate the applicability and some limitations of this technique, observations from the newly launched SCIAMACHY instrument were compared with the NOAA-16 SBUV/2 and ERS-2 GOME instruments. The SBUV/2 data had already undergone validation by comparison to the total ozone ground network. Overall, the SCIAMACHY data were found to be low by 3% with respect to the satellite data and 1% low with respect to ground station data. There appear to be seasonal and/or solar zenith angle dependences in the comparisons with SBUV/2, where differences increase at higher solar zenith angles. It is known that accuracies in both satellite and ground-based total ozone algorithms decrease at high solar zenith angles. There is a strong need for more accurate measurements from space and from the ground under these conditions. At the present time SCIAMACHY data are limited, and a longer data set with more coverage in both hemispheres is needed to unravel the cause of these differences.
Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model
2010-03-01
Thesis, AFIT/GAP/ENP/10-M07, Air Force Institute of Technology; John Haiducek, BS, Physics, 1st Lt, USAF; March 2010. Approved for public release; distribution unlimited. The views expressed are those of the author and do not reflect the official policy of the Department of Defense or the United States Government. Abstract: The High Energy Laser End-to-End
An evaluation of NASA's program in human factors research: Aircrew-vehicle system interaction
NASA Technical Reports Server (NTRS)
1982-01-01
Research in human factors in the aircraft cockpit and a proposed program augmentation were reviewed. The dramatic growth of microprocessor technology makes it entirely feasible to automate increasingly more functions in the aircraft cockpit; the promise of improved vehicle performance, efficiency, and safety through automation makes highly automated flight inevitable. However, an organized data base and a validated methodology for predicting the effects of automation on human performance, and thus on safety, are lacking; without them, increased automation may introduce new risks. Efforts should be concentrated on developing methods and techniques for analyzing man-machine interactions, including human workload and the prediction of performance.
NASA Astrophysics Data System (ADS)
Tavakoli, Vahid; Stoddard, Marcus F.; Amini, Amir A.
2013-03-01
Quantitative motion analysis of echocardiographic images helps clinicians with the diagnosis and therapy of patients suffering from cardiac disease. Quantitative analysis is usually based on TDI (Tissue Doppler Imaging) or speckle tracking. These methods rest on two independent techniques: the Doppler effect and image registration, respectively. In order to increase the accuracy of the speckle tracking technique and cope with the angle dependency of TDI, a combined approach dubbed TDIOF (Tissue Doppler Imaging Optical Flow) is proposed herein. TDIOF is formulated as a combination of B-mode and Doppler energy terms in an optical flow framework and minimized using algebraic equations. In this paper, we report on validations with simulated data, physical cardiac phantom data, and in-vivo patient data. It is shown that the additional Doppler term is able to increase the accuracy of speckle tracking, the basis for several commercially available echocardiography analysis techniques.
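The abstract does not give the exact functional, but a combined B-mode/Doppler optical flow energy plausibly takes a form like the following, where (u, v) is the velocity field, d is the unit vector along the ultrasound beam, v_D is the TDI velocity, and λ, α are weighting parameters. This is an illustrative assumption, not the paper's formulation.

```latex
E(u,v) = \int_{\Omega} \Big[ \underbrace{(I_x u + I_y v + I_t)^2}_{\text{B-mode brightness constancy}}
       + \lambda \underbrace{\big((u,v)\cdot\mathbf{d} - v_D\big)^2}_{\text{Doppler consistency}}
       + \alpha \underbrace{\big(|\nabla u|^2 + |\nabla v|^2\big)}_{\text{smoothness}} \Big] \, d\mathbf{x}
```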
Zhang, Junwen; Wang, Jing; Xu, Yuming; Xu, Mu; Lu, Feng; Cheng, Lin; Yu, Jianjun; Chang, Gee-Kung
2016-05-01
We propose and experimentally demonstrate a novel fiber-wireless integrated mobile backhaul network based on a hybrid millimeter-wave (MMW) and free-space-optics (FSO) architecture using an adaptive combining technique. Both 60 GHz MMW and FSO links are demonstrated and fully integrated with optical fibers in a scalable and cost-effective backhaul system setup. Joint signal processing with an adaptive diversity combining technique (ADCT) is utilized at the receiver side based on a maximum ratio combining algorithm. Mobile backhaul transportation of 4-Gb/s 16-quadrature-amplitude-modulation orthogonal frequency-division multiplexing (16-QAM-OFDM) data is experimentally demonstrated and tested under various weather conditions synthesized in the lab. Performance improvements in terms of reduced error vector magnitude (EVM) and enhanced link reliability are validated under fog, rain, and turbulence conditions.
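Maximum ratio combining, at the core of the ADCT described above, weights each received branch by its conjugate channel gain divided by its noise power before summing. A small self-contained sketch follows; the channel gains, noise levels, and QPSK stand-in signal are illustrative, not the experimental parameters.

```python
# Sketch of maximum ratio combining (MRC) across two received copies of the
# same signal (e.g., MMW and FSO branches), weighting by branch SNR.
import numpy as np

rng = np.random.default_rng(2)
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=1000)  # QPSK stand-in

h = np.array([0.9, 0.4])        # per-branch channel gains (real-valued here)
sigma = np.array([0.1, 0.3])    # per-branch noise standard deviations
rx = np.stack([h[i] * symbols + sigma[i] * (rng.normal(size=1000)
               + 1j * rng.normal(size=1000)) for i in range(2)])

# MRC: weight each branch by conj(h) / noise power, sum, then normalize gain
w = np.conj(h) / sigma**2
combined = (w[:, None] * rx).sum(axis=0) / (w * h).sum()
evm = np.sqrt(np.mean(np.abs(combined - symbols) ** 2)) / np.sqrt(2)
print(f"EVM after combining: {evm:.3f}")
```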
NASA Astrophysics Data System (ADS)
Feeley, J.; Zajic, J.; Metcalf, A.; Baucom, T.
2009-12-01
The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP) Calibration and Validation (Cal/Val) team is planning post-launch activities to calibrate the NPP sensors and validate Sensor Data Records (SDRs). The IPO has developed a web-based data collection and visualization tool in order to effectively collect, coordinate, and manage the calibration and validation tasks for the OMPS, ATMS, CrIS, and VIIRS instruments. This tool is accessible to the multi-institutional Cal/Val teams consisting of the Prime Contractor and Government Cal/Val leads along with the NASA NPP Mission team, and is used for mission planning and identification/resolution of conflicts between sensor activities. Visualization techniques aid in displaying task dependencies, including prerequisites and exit criteria, allowing for the identification of a critical path. This presentation will highlight how the information is collected, displayed, and used to coordinate the diverse instrument calibration/validation teams.
Multiple-Group Analysis Using the sem Package in the R System
ERIC Educational Resources Information Center
Evermann, Joerg
2010-01-01
Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…
Prediction of Recidivism in Juvenile Offenders Based on Discriminant Analysis.
ERIC Educational Resources Information Center
Proefrock, David W.
The recent development of strong statistical techniques has made accurate predictions of recidivism possible. To investigate the utility of discriminant analysis methodology in making predictions of recidivism in juvenile offenders, the court records of 271 male and female juvenile offenders, aged 12-16, were reviewed. A cross validation group…
Methodological Approaches to Online Scoring of Essays.
ERIC Educational Resources Information Center
Chung, Gregory K. W. K.; O'Neil, Harold F., Jr.
This report examines the feasibility of scoring essays using computer-based techniques. Essays have been incorporated into many of the standardized testing programs. Issues of validity and reliability must be addressed to deploy automated approaches to scoring fully. Two approaches that have been used to classify documents, surface- and word-based…
Systematic, Cooperative Evaluation.
ERIC Educational Resources Information Center
Nassif, Paula M.
Evaluation procedures based on a systematic evaluation methodology, decision-maker validity, new measurement and design techniques, low cost, and a high level of cooperation on the part of the school staff were used in the assessment of a public school mathematics program for grades 3-8. The mathematics curriculum was organized into Spirals which…
Proceedings of Tenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1985-01-01
Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.
Head movement compensation in real-time magnetoencephalographic recordings.
Little, Graham; Boe, Shaun; Bardouille, Timothy
2014-01-01
Neurofeedback- and brain-computer interface (BCI)-based interventions can be implemented using real-time analysis of magnetoencephalographic (MEG) recordings. Head movement during MEG recordings, however, can lead to inaccurate estimates of brain activity, reducing the efficacy of the intervention. Most real-time applications in MEG have utilized analyses that do not correct for head movement. Effective means of correcting for head movement are needed to optimize the use of MEG in such applications. Here we provide preliminary validation of a novel analysis technique, real-time source estimation (rtSE), that measures head movement and generates corrected current source time course estimates in real-time. rtSE was applied while recording a calibrated phantom to determine phantom position localization accuracy and source amplitude estimation accuracy under stationary and moving conditions. Results were compared to off-line analysis methods to assess the validity of the rtSE technique. The rtSE method allowed for accurate estimation of current source activity at the source level in real-time, and accounted for movement of the source due to changes in phantom position. The rtSE technique requires modifications and specialized analysis of the following MEG workflow steps: data acquisition, head position estimation, source localization, and real-time source estimation. This work explains the technical details and validates each of these steps.
Calculation of Shuttle Base Heating Environments and Comparison with Flight Data
NASA Technical Reports Server (NTRS)
Greenwood, T. F.; Lee, Y. C.; Bender, R. L.; Carter, R. E.
1983-01-01
The techniques, analytical tools, and experimental programs used initially to generate and later to improve and validate the Shuttle base heating design environments are discussed. In general, the measured base heating environments for STS-1 through STS-5 were in good agreement with the preflight predictions. However, some changes were made in the methodology after reviewing the flight data. The flight data is described, preflight predictions are compared with the flight data, and improvements in the prediction methodology based on the data are discussed.
Application of stepwise multiple regression techniques to inversion of Nimbus 'IRIS' observations.
NASA Technical Reports Server (NTRS)
Ohring, G.
1972-01-01
Exploratory studies with Nimbus-3 infrared interferometer-spectrometer (IRIS) data indicate that, in addition to temperature, such meteorological parameters as geopotential heights of pressure surfaces, tropopause pressure, and tropopause temperature can be inferred from the observed spectra with the use of simple regression equations. The technique of screening the IRIS spectral data by means of stepwise regression to obtain the best radiation predictors of meteorological parameters is validated. The simplicity of application of the technique and the simplicity of the derived linear regression equations, which contain only a few terms, suggest the usefulness of this approach. Based upon the results obtained, suggestions are made for further development and exploitation of the stepwise regression analysis technique.
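A minimal sketch of forward stepwise screening in this spirit: spectral predictors are added one at a time as long as the cross-validated R² keeps improving. The data are synthetic and the stopping rule is an assumption of my own.

```python
# Forward stepwise selection of spectral channels for a regression target,
# in the spirit of screening IRIS radiances. Data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 30))                 # 30 spectral channels
y = 2.0 * X[:, 4] - 1.0 * X[:, 17] + rng.normal(scale=0.3, size=120)

selected, remaining = [], list(range(X.shape[1]))
best_score = -np.inf
while remaining:
    trial = [(cross_val_score(LinearRegression(), X[:, selected + [j]], y,
                              cv=5, scoring="r2").mean(), j) for j in remaining]
    score, j = max(trial)
    if score <= best_score + 1e-4:             # stop when no real improvement
        break
    best_score = score
    selected.append(j)
    remaining.remove(j)

print("selected channels:", selected, "CV R^2 =", round(best_score, 3))
```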
Damage Evaluation Based on a Wave Energy Flow Map Using Multiple PZT Sensors
Liu, Yaolu; Hu, Ning; Xu, Hong; Yuan, Weifeng; Yan, Cheng; Li, Yuan; Goda, Riu; Alamusi; Qiu, Jinhao; Ning, Huiming; Wu, Liangke
2014-01-01
A new wave energy flow (WEF) map concept was proposed in this work. Based on it, an improved technique incorporating the laser scanning method and Betti's reciprocal theorem was developed to evaluate the shape and size of damage as well as to realize visualization of wave propagation. In this technique, a simple signal processing algorithm was proposed to construct the WEF map when waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors were employed to improve inspection reliability. Various damages in aluminum and carbon fiber reinforced plastic laminated plates were experimentally and numerically evaluated to validate this technique. The results show that it can effectively evaluate the shape and size of damage from wave field variations around the damage in the WEF map. PMID:24463430
Internet-Based Delphi Research: Case Based Discussion
Donohoe, Holly M.; Stellefson, Michael L.
2013-01-01
The interactive capacity of the Internet offers benefits that are intimately linked with contemporary research innovation in the natural resource and environmental studies domains. However, e-research methodologies, such as the e-Delphi technique, have yet to undergo critical review. This study advances methodological discourse on the e-Delphi technique by critically assessing an e-Delphi case study. The analysis suggests that the benefits of using e-Delphi are noteworthy but the authors acknowledge that researchers are likely to face challenges that could potentially compromise research validity and reliability. To ensure that these issues are sufficiently considered when planning and designing an e-Delphi, important facets of the technique are discussed and recommendations are offered to help the environmental researcher avoid potential pitfalls associated with coordinating e-Delphi research. PMID:23288149
Multispectral Wavefronts Retrieval in Digital Holographic Three-Dimensional Imaging Spectrometry
NASA Astrophysics Data System (ADS)
Yoshimori, Kyu
2010-04-01
This paper deals with a recently developed passive interferometric technique for retrieving a set of spectral components of wavefronts propagating from a spatially incoherent, polychromatic object. The technique is based on measurement of the 5-D spatial coherence function using a suitably designed interferometer. By applying signal processing, including aperture synthesis and spectral decomposition, one may obtain a set of wavefronts for different spectral bands. Since each wavefront is equivalent to the complex Fresnel hologram at a particular spectral component of the polychromatic object, application of the conventional Fresnel transform yields a 3-D image for each spectral band. Thus, this technique of multispectral wavefront retrieval provides a new type of 3-D imaging spectrometry based on fully passive interferometry. Experimental results are also shown to demonstrate the validity of the method.
SAMICS Validation. SAMICS Support Study, Phase 3
NASA Technical Reports Server (NTRS)
1979-01-01
SAMICS provides a consistent basis for estimating array costs and compares production technology costs. A review and a validation of the SAMICS model are reported. The review had the following purposes: (1) to test the computational validity of the computer model by comparison with preliminary hand calculations based on conventional cost estimating techniques; (2) to review and improve the accuracy of the cost relationships being used by the model; and (3) to provide an independent verification to users of the model's value in decision making for allocation of research and development funds and for investment in manufacturing capacity. It is concluded that the SAMICS model is a flexible, accurate, and useful tool for managerial decision making.
NASA Technical Reports Server (NTRS)
Piazza, Anthony; Hudson, Larry D.; Richards, W. Lance
2005-01-01
Fiber optic strain measurements: a) successfully attached silica fiber optic sensors to both metallics and composites; b) accomplished valid EFPI strain measurements to 1850 F; c) successfully attached EFPI sensors to large-scale hot structures; and d) attached and thermally validated FBG bond and apparent strain (ε_app). Future development: a) improve characterization of sensors on C-C and C-SiC substrates; b) extend the application to other composites such as SiC-SiC; c) assist development of the interferometer-based sapphire sensor currently being conducted under a Phase II SBIR; and d) complete combined thermal/mechanical testing of FBG on composite substrates in a controlled laboratory environment.
Assessing and validating RST-FIRES on MSG-SEVIRI data by means of a Total Validation Approach (TVA).
NASA Astrophysics Data System (ADS)
Filizzola, Carolina; Corrado, Rosita; Marchese, Francesco; Mazzeo, Giuseppe; Paciello, Rossana; Pergola, Nicola; Tramutoli, Valerio
2015-04-01
Several fire detection methods have been developed through the years for detecting forest fires from space. These algorithms (which may be grouped into single-channel, multichannel and contextual algorithms) are generally based on the use of fixed thresholds that, being intrinsically exposed to false alarm proliferation, are often used in a conservative way. As a consequence, most satellite-based algorithms for fire detection show low sensitivity, making them unsuitable in operational contexts. In this work, the RST-FIRES algorithm, which is based on an original multi-temporal scheme of satellite data analysis (RST, Robust Satellite Techniques), is presented. The implementation of RST-FIRES on data provided by the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) onboard Meteosat Second Generation (MSG), which offers the best revisit time (i.e. 15 minutes) and can therefore be used for detecting fires at an early stage, is described here. Moreover, results of a Total Validation Approach (TVA), experimented with in both Northern and Southern Italy in collaboration with local and regional civil protection agencies, are also reported. In particular, TVA allowed us to assess RST-FIRES detections by means of ground checks and aerial surveys, demonstrating the good performance offered by RST-FIRES using MSG-SEVIRI data. Indeed, this algorithm was capable of detecting several fires that, owing to their features (e.g., small size, short duration), would not have appeared in the official reports, highlighting a significant improvement in sensitivity in comparison with other established satellite-based fire detection techniques while still preserving a high confidence level of detection.
Confocal laser feedback tomography for skin cancer detection
Mowla, Alireza; Du, Benjamin Wensheng; Taimre, Thomas; Bertling, Karl; Wilson, Stephen; Soyer, H. Peter; Rakić, Aleksandar D.
2017-01-01
Tomographic imaging of soft tissue such as skin has a potential role in cancer detection. The penetration of infrared wavelengths makes a confocal approach based on laser feedback interferometry feasible. We present a compact system using a semiconductor laser as both transmitter and receiver. Numerical and physical models based on the known optical properties of keratinocyte cancers were developed. We validated the technique on three phantoms containing macro-structural changes in optical properties. Experimental results were in agreement with numerical simulations and structural changes were evident which would permit discrimination of healthy tissue and tumour. Furthermore, cancer type discrimination was also able to be visualized using this imaging technique. PMID:28966845
NASA Astrophysics Data System (ADS)
Saito, Terubumi; Tatsuta, Muneaki; Abe, Yamato; Takesawa, Minato
2018-02-01
We have succeeded in the direct measurement of solar cell/module internal conversion efficiency based on a calorimetric method, or electrical substitution method, by which the absorbed radiant power is determined by replacing the heat absorbed in the cell/module with electrical power. The technique is advantageous in that the reflectance and transmittance measurements required in conventional methods are not necessary. Also, the internal quantum efficiency can be derived from the conversion efficiencies by using the average photon energy. Agreement of the measured data with values estimated from the nominal values supports the validity of this technique.
Zhang, Yunlong; Li, Ruoming; Shi, Yuechun; Zhang, Jintao; Chen, Xiangfei; Liu, Shengchun
2015-06-01
A novel fiber Bragg grating aided fiber loop ringdown (FLRD) sensor array and a wavelength-time multiplexing based interrogation technique for the FLRD sensor array are proposed. The interrogation frequency of the system is formulated and the interrelationships among the parameters of the system are analyzed. To validate the performance of the proposed system, a five-element array is experimentally demonstrated, and the system shows the capability of monitoring every FLRD element in real time with an interrogation frequency of 125.5 Hz.
Crews, Colin
2015-01-01
The principles and application of established and newer methods for the quantitative and semi-quantitative determination of ergot alkaloids in food, feed, plant materials and animal tissues are reviewed. The techniques of sampling, extraction, clean-up, detection, quantification and validation are described. The major procedures for ergot alkaloid analysis comprise liquid chromatography with tandem mass spectrometry (LC-MS/MS) and liquid chromatography with fluorescence detection (LC-FLD). Other methods based on immunoassays are under development and variations of these and minor techniques are available for specific purposes. PMID:26046699
Yuan, XiaoDong; Tang, Wei; Shi, WenWei; Yu, Libao; Zhang, Jing; Yuan, Qing; You, Shan; Wu, Ning; Ao, Guokun; Ma, Tingting
2018-07-01
To develop a convenient and rapid single-kidney CT-GFR technique. One hundred and twelve patients referred for multiphasic renal CT and 99mTc-DTPA renal dynamic imaging Gates-GFR measurement were prospectively included and randomly divided into two groups of 56 patients each: the training group and the validation group. On the basis of the nephrographic phase images, the fractional renal accumulation (FRA) was calculated and correlated with the Gates-GFR in the training group. From this correlation a formula was derived for single-kidney CT-GFR calculation, which was validated by a paired t test and linear regression analysis with the single-kidney Gates-GFR in the validation group. In the training group, the FRA (x-axis) correlated well (r = 0.95, p < 0.001) with single-kidney Gates-GFR (y-axis), producing a regression equation of y = 1665x + 1.5 for single-kidney CT-GFR calculation. In the validation group, the difference between the methods of single-kidney GFR measurements was 0.38 ± 5.57 mL/min (p = 0.471); the regression line is identical to the diagonal (intercept = 0 and slope = 1) (p = 0.727 and p = 0.473, respectively), with a standard deviation of residuals of 5.56 mL/min. A convenient and rapid single-kidney CT-GFR technique was presented and validated in this investigation. • The new CT-GFR method takes about 2.5 min of patient time. • The CT-GFR method demonstrated identical results to the Gates-GFR method. • The CT-GFR method is based on the fractional renal accumulation of iodinated CM. • The CT-GFR method is achieved without additional radiation dose to the patient.
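Applying the reported regression is straightforward; the sketch below encodes y = 1665x + 1.5 directly from the abstract, with placeholder FRA values for illustration.

```python
# Single-kidney CT-GFR from the fractional renal accumulation (FRA) of
# contrast medium, using the regression reported in the abstract above.
# The FRA inputs are placeholders, not patient data.
def single_kidney_ct_gfr(fra: float) -> float:
    """Single-kidney GFR (mL/min) from fractional renal accumulation."""
    return 1665.0 * fra + 1.5

for fra in (0.01, 0.02, 0.03):
    print(f"FRA = {fra:.2f} -> CT-GFR = {single_kidney_ct_gfr(fra):.1f} mL/min")
```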
NASA Astrophysics Data System (ADS)
Avianti, R.; Suyatno; Sugiarto, B.
2018-04-01
This study aims to create appropriate learning material based on the CORE (Connecting, Organizing, Reflecting, Extending) model to improve students' learning achievement on the Chemical Bonding topic. The study used the 4-D model as its research design and a one-group pretest-posttest design for the material treatment. The subject of the study was teaching material based on the CORE model, tested on 30 grade-10 science students. Data were collected through validation, observation, testing, and questionnaires. The findings were that: (1) all the contents were valid, and (2) the practicality and the effectiveness of all the contents were good. The conclusion of this research was that learning material based on the CORE model is appropriate for improving students' learning outcomes in studying chemical bonding.
NASA Technical Reports Server (NTRS)
Sundstrom, J. L.
1980-01-01
The techniques required to produce and validate six detailed task timeline scenarios for crew workload studies are described. Specific emphasis is given to: general aviation single pilot instrument flight rules operations in a high density traffic area; fixed path metering and spacing operations; and comparative workload operation between the forward and aft-flight decks of the NASA terminal control vehicle. The validation efforts also provide a cursory examination of the resultant demand workload based on the operating procedures depicted in the detailed task scenarios.
Multivariate Bias Correction Procedures for Improving Water Quality Predictions from the SWAT Model
NASA Astrophysics Data System (ADS)
Arumugam, S.; Libera, D.
2017-12-01
Water quality observations are usually not available on a continuous basis for longer than 1-2 years at a time over a decadal period, given the labor requirements, which makes calibrating and validating mechanistic models difficult. Further, any physical model's predictions inherently have bias (i.e., under/over-estimation) and require post-simulation techniques to preserve the long-term mean monthly attributes. This study suggests a multivariate bias-correction technique and compares it to a common technique for improving the performance of the SWAT model in predicting daily streamflow and TN loads across the southeast, based on split-sample validation. The approach is a dimension-reduction technique, canonical correlation analysis (CCA), that regresses the observed multivariate attributes on the SWAT-simulated values. The common approach is a regression-based technique that uses ordinary least squares regression to adjust model values. The observed cross-correlation between loadings and streamflow is better preserved when using canonical correlation while simultaneously reducing individual biases. Additionally, canonical correlation analysis does a better job of preserving the observed joint likelihood of streamflow and loadings. These procedures were applied to three watersheds chosen from the Water Quality Network in the Southeast Region, specifically watersheds with sufficiently large drainage areas and numbers of observed data points. The performance of the two approaches is compared for the observed period and over a multi-decadal period using loading estimates from the USGS LOADEST model. Lastly, the CCA technique is applied in a forecasting sense by using 1-month-ahead forecasts of P and T from ECHAM4.5 as forcings in the SWAT model. Skill in using the SWAT model for forecasting loadings and streamflow at the monthly and seasonal timescales is also discussed.
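A minimal sketch of the CCA idea, regressing observed (streamflow, load) pairs on the corresponding simulated pairs with scikit-learn's CCA. The synthetic data and bias factors are placeholders of my own, not SWAT output.

```python
# CCA-based multivariate bias correction: map biased model output toward
# observations in canonical space. Data are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(4)
n = 300
obs = rng.lognormal(mean=[2.0, 0.5], sigma=0.4, size=(n, 2))   # flow, TN load
sim = obs * [1.3, 0.7] + rng.normal(scale=0.2, size=(n, 2))    # biased "model"

cca = CCA(n_components=2)
cca.fit(sim, obs)
corrected = cca.predict(sim)   # simulated values mapped toward observations

print("raw bias:      ", (sim - obs).mean(axis=0))
print("corrected bias:", (corrected - obs).mean(axis=0))
```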
NASA Astrophysics Data System (ADS)
Bi, Chuan-Xing; Hu, Ding-Yu; Zhang, Yong-Bin; Jing, Wen-Qian
2015-06-01
In previous studies, an equivalent source method (ESM)-based technique for recovering the free sound field in a noisy environment has been successfully applied to exterior problems. In order to evaluate its performance when applied to a more general noisy environment, that technique is used to identify active sources inside cavities where the sound field is composed of the field radiated by active sources and that reflected by walls. A patch approach with two semi-closed surfaces covering the target active sources is presented to perform the measurements, and the field that would be radiated by these target active sources into free space is extracted from the mixed field by using the proposed technique, which will be further used as the input of nearfield acoustic holography for source identification. Simulation and experimental results validate the effectiveness of the proposed technique for source identification in cavities, and show the feasibility of performing the measurements with a double layer planar array.
Uniting statistical and individual-based approaches for animal movement modelling.
Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel
2014-01-01
The dynamic nature of their internal states and the environment directly shapes animals' spatial behaviours and gives rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states through field work and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method which combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the individual-based model (IBM) has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using step selection functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections of future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.
NASA Astrophysics Data System (ADS)
Riandry, M. A.; Ismet, I.; Akhsan, H.
2017-09-01
This study aims to produce a valid and practical statistical physics course handout on distribution function materials based on STEM. The Rowntree development model is used to produce this handout; it consists of three stages: planning, development and evaluation. In this study, the evaluation stage used Tessmer formative evaluation, which consists of 5 stages: self-evaluation, expert review, one-to-one evaluation, small group evaluation and field test. However, the handout was tested only on validity and practicality aspects, so the field test stage was not implemented. Data were collected using walkthroughs and questionnaires. The subjects of this study are students of the 6th and 8th semesters of the 2016/2017 academic year in the Physics Education Study Program of Sriwijaya University. The average result of the expert review is 87.31% (very valid category). One-to-one evaluation yielded an average result of 89.42%, and small group evaluation 85.92%. From the one-to-one and small group evaluation stages, the average student response to this handout is 87.67% (very practical category). Based on the results of the study, it can be concluded that the handout is valid and practical.
Utilization of advanced calibration techniques in stochastic rock fall analysis of quarry slopes
NASA Astrophysics Data System (ADS)
Preh, Alexander; Ahmadabadi, Morteza; Kolenprat, Bernd
2016-04-01
In order to study rock fall dynamics, a research project was conducted by the Vienna University of Technology and the Austrian Central Labour Inspectorate (Federal Ministry of Labour, Social Affairs and Consumer Protection). Part of this project comprised 277 full-scale drop tests at three different quarries in Austria, recording key parameters of the rock fall trajectories. The tests involved a total of 277 boulders ranging from 0.18 to 1.8 m in diameter and from 0.009 to 8.1 Mg in mass. The geology of these sites included strong rock belonging to igneous, metamorphic and volcanic types. In this paper the results of the tests are used for calibration and validation of a new stochastic computer model. It is demonstrated that the error of the model (i.e. the difference between observed and simulated results) has a lognormal distribution. Selecting two parameters, advanced calibration techniques including the Markov chain Monte Carlo technique, maximum likelihood and root mean square error (RMSE) minimization are utilized to minimize the error. Validation of the model based on the cross-validation technique reveals that, in general, reasonable stochastic approximations of the rock fall trajectories are obtained in all dimensions, including runout, bounce heights and velocities. The approximations are compared to the measured data in terms of median, 95% and maximum values. The results of the comparisons indicate that approximate first-order predictions, using a single set of input parameters, are possible and can be used to aid practical hazard and risk assessment.
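Of the calibration techniques named above, RMSE minimization is the simplest to illustrate. The sketch below grid-searches two parameters of a toy stand-in for the rock fall simulator and scores them by RMSE in log space, consistent with a lognormal error model. The surrogate function, parameter names, and data are illustrative assumptions, not the study's model.

```python
# Two-parameter calibration by RMSE minimization against observed runouts,
# with the error measured in log space (lognormal error assumption).
import numpy as np

rng = np.random.default_rng(7)
observed = rng.lognormal(mean=3.0, sigma=0.3, size=50)   # observed runouts (m)

def simulate_runout(restitution: float, friction: float, n: int) -> np.ndarray:
    # Toy surrogate standing in for one stochastic rock fall model run
    return np.exp(3.5 * restitution - 0.8 * friction
                  + rng.normal(scale=0.3, size=n))

best = None
for rn in np.linspace(0.5, 1.0, 11):          # grid over two parameters
    for mu in np.linspace(0.2, 0.8, 13):
        sim = simulate_runout(rn, mu, observed.size)
        rmse = np.sqrt(np.mean((np.log(sim) - np.log(observed)) ** 2))
        if best is None or rmse < best[0]:
            best = (rmse, rn, mu)

print("best (RMSE, restitution, friction):", best)
```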
1982-02-28
DNA-TR-81-81: Validation of the Contrast Attenuation Technique (CAT) for Deducing Dust... (Technical Report). Appendices cover scattering and extinction considerations and data on films used for the MILL RACE CAT test.
Huang, Hui; Liu, Li; Ngadi, Michael O; Gariépy, Claude; Prasher, Shiv O
2014-01-01
Marbling is an important quality attribute of pork. Detection of pork marbling usually involves subjective scoring, which raises efficiency costs for the processor. In this study, the ability to predict pork marbling using near-infrared (NIR) hyperspectral imaging (900-1700 nm) and appropriate image processing techniques was studied. Near-infrared images were collected from pork after marbling evaluation according to the current standard chart of the National Pork Producers Council. Image analysis techniques (Gabor filter, wide line detector, and spectral averaging) were applied to extract texture, line, and spectral features, respectively, from the NIR images of pork. Samples were grouped into calibration and validation sets. Wavelength selection was performed on the calibration set by a stepwise regression procedure. Prediction models of pork marbling scores were built using multiple linear regression based on derivatives of mean spectra and line features at key wavelengths. The results showed that the derivatives of both texture and spectral features produced good results, with validation correlation coefficients of 0.90 and 0.86, respectively, using wavelengths of 961, 1186, and 1220 nm. The results revealed the great potential of the Gabor filter for analyzing NIR images of pork for effective and efficient objective evaluation of pork marbling.
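A minimal sketch of the texture-feature step, with scikit-image's Gabor filter feeding a linear model. The images, scores, and filter frequency are placeholders, not the study's calibrated pipeline.

```python
# Gabor texture features from a (synthetic) NIR band image, summarized and
# fed to a linear model for a marbling score. All data are placeholders.
import numpy as np
from skimage.filters import gabor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
images = rng.random((20, 64, 64))      # 20 samples, one band (e.g., 1220 nm)
scores = rng.uniform(1, 10, 20)        # hypothetical marbling scores

feats = []
for img in images:
    real, imag = gabor(img, frequency=0.2)      # Gabor filter response
    mag = np.hypot(real, imag)
    feats.append([mag.mean(), mag.std()])        # simple texture summary
feats = np.array(feats)

model = LinearRegression().fit(feats, scores)
print("R^2 on training data:", model.score(feats, scores))
```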
Non-Invasive Transcranial Brain Therapy Guided by CT Scans: an In Vivo Monkey Study
NASA Astrophysics Data System (ADS)
Marquet, F.; Pernot, M.; Aubry, J.-F.; Montaldo, G.; Tanter, M.; Boch, A.-L.; Kujas, M.; Seilhean, D.; Fink, M.
2007-05-01
Brain therapy using focused ultrasound remains very limited due to the strong aberrations induced by the skull. A minimally invasive technique using time reversal was recently validated in vivo on 20 sheep, but it requires a hydrophone at the focal point for the first step of the time-reversal procedure. A completely noninvasive therapy requires a reliable model of the acoustic properties of the skull in order to simulate this first step. 3-D simulations based on high-resolution CT images of a skull have been successfully performed with a finite-difference code developed in our laboratory. From the skull porosity, directly extracted from the CT images, we reconstructed acoustic speed, density and absorption maps and performed the computation. Computed wavefronts are in good agreement with experimental wavefronts acquired through the same part of the skull, and the technique was validated in vitro in the laboratory. A stereotactic frame has been designed and built in order to perform noninvasive transcranial focusing in vivo. Here we describe all the steps of our new protocol, from the CT scans to the therapy treatment, and present the first in vivo results on a monkey. The protocol is modeled on protocols already established in radiotherapy.
Translating the Simulation of Procedural Drilling Techniques for Interactive Neurosurgical Training
Stredney, Don; Rezai, Ali R.; Prevedello, Daniel M.; Elder, J. Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J.
2014-01-01
Background Through previous and concurrent efforts, we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. This volumetric data helps drive an interactive multi-sensory, i.e., visual (stereo), aural (stereo), and tactile simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the CNS simulation initiative. Objective The goal of this multi-level development is to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. Methods We discuss issues of biofidelity as well as our methods to provide objective, quantitative automated assessment for the residents. Results We conclude with a discussion of our experiences by reporting on preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. Conclusion We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principles and define the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum. PMID:24051887
NASA Astrophysics Data System (ADS)
Schulz, Hans Martin; Thies, Boris; Chang, Shih-Chieh; Bendix, Jörg
2016-03-01
The mountain cloud forest of Taiwan can be delimited from other forest types using a map of ground fog frequency. In order to create such a frequency map from remotely sensed data, an algorithm able to detect ground fog is necessary. Common techniques for ground fog detection based on weather satellite data cannot be applied to fog occurrences in Taiwan because they rely on several assumptions regarding cloud properties. Therefore a new statistical method for the detection of ground fog in mountainous terrain from MODIS Collection 051 data is presented. Owing to the sharpening of input data using MODIS bands 1 and 2, the method provides fog masks at a resolution of 250 m per pixel. The new technique is based on negative correlations between optical thickness and terrain height that can be observed when a cloud that is relatively plane-parallel is truncated by the terrain. A validation of the new technique using camera data has shown that the quality of fog detection is comparable to that of another modern fog detection scheme developed and validated for the temperate zones. The method is particularly applicable to optically thinner water clouds; beyond a cloud optical thickness of ≈ 40, classification errors increase significantly.
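The abstract gives only the core statistical idea; the fragment below sketches that idea under stated assumptions (the correlation threshold, per-patch processing, and synthetic profile are illustrative, not the paper's calibrated procedure):

```python
import numpy as np

def is_ground_fog(optical_thickness, terrain_height, r_threshold=-0.5):
    """Flag a cloud patch as terrain-truncated (ground fog candidate) when
    cloud optical thickness decreases where the terrain rises, i.e. the two
    fields are negatively correlated across the patch's pixels."""
    r = np.corrcoef(optical_thickness.ravel(), terrain_height.ravel())[0, 1]
    return r < r_threshold

# Synthetic plane-parallel cloud truncated by rising terrain
height = np.linspace(500.0, 1500.0, 100)        # terrain profile in metres
tau = np.clip(30.0 - 0.02 * height, 0, None)    # thinner cloud over high ground
tau += np.random.normal(scale=1.0, size=tau.shape)
print(is_ground_fog(tau, height))               # True for this synthetic patch
```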
IACOANGELI, Maurizio; NOCCHI, Niccolò; NASI, Davide; DI RIENZO, Alessandro; DOBRAN, Mauro; GLADI, Maurizio; COLASANTI, Roberto; ALVARO, Lorenzo; POLONARA, Gabriele; SCERRATI, Massimo
2016-01-01
The most important goal of minimally invasive surgery is to obtain the best therapeutic effect with the least iatrogenic injury. Against this background, a pivotal role in contemporary neurosurgery is played by the supraorbital key-hole approach proposed by Perneczky for anterior cranial base surgery. In this article, it is presented as a possible valid alternative to traditional craniotomies for the removal of anterior cranial fossa meningiomas. From January 2008 to January 2012, 56 patients underwent anterior cranial base meningioma removal at our department. Thirty-three patients were treated through traditional approaches and 23 through the supraorbital key-hole technique. Clinical and neuroradiological pre- and postoperative evaluations were performed, with attention to eventual complications, length of surgical procedure, and hospitalization. Compared to traditional approaches, the supraorbital key-hole approach was associated neither with a greater range of postoperative complications nor with a longer surgical procedure or hospitalization, while permitting the same lesion control. With this technique, minimization of brain exposure and manipulation with reduction of unwanted iatrogenic injuries, preservation of neurovascular structures, and a better aesthetic result are possible. The supraorbital key-hole approach according to Perneczky could represent a valid alternative to traditional approaches in anterior cranial base meningioma surgery. PMID:26804334
Experiment T002: Manual navigation sightings
NASA Technical Reports Server (NTRS)
Smith, D.
1971-01-01
Navigation-type measurements through the window of the stabilized Gemini 12 spacecraft by the use of a hand-held sextant are reported. The major objectives were as follows: (1) to evaluate the ability of the crewmen to make accurate navigational measurements by the use of simple instruments in an authentic space flight environment; (2) to evaluate the operational feasibility of the measurement techniques by the use of the pressure suit with the helmet off and with the helmet on and the visor closed; (3) to evaluate operational problems associated with the spacecraft environment; and (4) to validate ground based simulation techniques by comparison of the inflight results with base line data obtained by the pilot by the use of simulators and celestial targets from ground based observatories.
Verification and Validation of KBS with Neural Network Components
NASA Technical Reports Server (NTRS)
Wen, Wu; Callahan, John
1996-01-01
Artificial Neural Networks (ANNs) play an important role in developing robust Knowledge Based Systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS, which depends on a rule database and a production engine, an ANN-based system mimics the decisions of an expert without explicitly formulating if-then rules. In fact, ANNs demonstrate their superiority when such if-then rules are hard for a human expert to generate. Verification of a traditional knowledge based system rests on proving the consistency and completeness of the rule knowledge base and the correctness of the production engine. These techniques, however, cannot be directly applied to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.
NASA Astrophysics Data System (ADS)
Pahlavani, P.; Gholami, A.; Azimi, S.
2017-09-01
This paper presents an indoor positioning technique based on a multi-layer feed-forward (MLFF) artificial neural network (ANN). Most indoor received signal strength (RSS)-based WLAN positioning systems use the fingerprinting technique, which can be divided into two phases: the offline (calibration) phase and the online (estimation) phase. In this paper, RSSs were collected at all reference points in four directions and in two periods of time (morning and evening); RSS readings were thus sampled at a regular time interval and a specific orientation at each reference point. The proposed ANN-based model used the Levenberg-Marquardt algorithm for learning and fitting the network to the training data. The RSS readings at all reference points, together with the known positions of these reference points, were prepared for the training phase of the proposed MLFF neural network. Eventually, the average positioning error of this network, using 30% check and validation data, was approximately 2.20 m.
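A minimal sketch of the fingerprinting regression step is given below. The data are invented, and since scikit-learn offers no Levenberg-Marquardt solver, L-BFGS stands in for it here; everything else (access-point count, layer sizes) is likewise an assumption:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Hypothetical fingerprint database: rows are RSS readings from 6 access
# points, targets are the known 2-D reference-point coordinates.
rng = np.random.default_rng(42)
X = rng.uniform(-90, -30, size=(500, 6))   # RSS in dBm (illustrative)
y = rng.uniform(0, 30, size=(500, 2))      # x, y position in metres

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3)

# L-BFGS used in place of Levenberg-Marquardt (not available in scikit-learn)
model = MLPRegressor(hidden_layer_sizes=(20,), solver="lbfgs", max_iter=2000)
model.fit(X_train, y_train)

errors = np.linalg.norm(model.predict(X_val) - y_val, axis=1)
print(f"mean positioning error: {errors.mean():.2f} m")
```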
Piovesan, Davide; Pierobon, Alberto; DiZio, Paul; Lackner, James R
2012-01-01
This study presents and validates a time-frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of non-linear and higher than second-order systems with a non-parametric approach. The technique proposed here is highly robust to noise and can be used easily for both postural and movement tasks. Estimation of stiffness profiles is possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases.
Cutti, Andrea Giovanni; Cappello, Angelo; Davalli, Angelo
2006-01-01
Soft tissue artefact is the dominant error source for upper extremity motion analyses that use skin-mounted markers, especially in humeral axial rotation. A new in vivo technique is presented that is based on the definition of a humerus bone-embedded frame that is almost "artefact free" but influenced by the elbow orientation in the measurement of humeral axial rotation, and on an algorithm designed to solve this kinematic coupling. The technique was validated in vivo in a study of six healthy subjects who performed five arm-movement tasks. For each task the similarity between a gold standard pattern and the axial rotation pattern before and after the application of the compensation algorithm was evaluated in terms of explained variance, gain, phase and offset. In addition, the root mean square error between the patterns was used as a global similarity estimator. After the compensation, for four out of five tasks, patterns were highly correlated and in phase, with almost equal gain and limited offset; the root mean square error decreased from the original 9 degrees to 3 degrees. The proposed technique appears to help compensate for the soft tissue artefact affecting axial rotation. A further development is also proposed to make the technique effective for the pure prono-supination task as well.
Survey of Verification and Validation Techniques for Small Satellite Software Development
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
2015-01-01
The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
Measuring Adverse Events in Helicopter Emergency Medical Services: Establishing Content Validity
Patterson, P. Daniel; Lave, Judith R.; Martin-Gill, Christian; Weaver, Matthew D.; Wadas, Richard J.; Arnold, Robert M.; Roth, Ronald N.; Mosesso, Vincent N.; Guyette, Francis X.; Rittenberger, Jon C.; Yealy, Donald M.
2015-01-01
Introduction We sought to create a valid framework for detecting Adverse Events (AEs) in the high-risk setting of Helicopter Emergency Medical Services (HEMS). Methods We assembled a panel of 10 expert clinicians (n=6 emergency medicine physicians and n=4 prehospital nurses and flight paramedics) affiliated with a large multi-state HEMS organization in the Northeast U.S. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the Content Validity Index (CVI), to quantify the validity of the framework’s content. Results The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: 1) a trigger tool, 2) a method for rating proximal cause, and 3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. Conclusions We demonstrate a standardized process for the development of a content valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS. PMID:24003951
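Since the abstract quantifies validity with the CVI but does not define it, here is a minimal sketch of the standard item-level CVI computation on a 4-point relevance scale (the ratings below are invented for illustration):

```python
import numpy as np

def item_cvi(ratings, relevant_levels=(3, 4)):
    """Item-level Content Validity Index: the share of experts rating the
    item relevant (3 or 4 on the usual 4-point relevance scale)."""
    ratings = np.asarray(ratings)
    return np.isin(ratings, relevant_levels).mean()

# Hypothetical ratings from a 10-expert panel for one trigger-tool item
print(item_cvi([4, 4, 3, 4, 3, 4, 2, 4, 3, 4]))  # 0.9
```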
NASA Astrophysics Data System (ADS)
Gerzen, Tatjana; Wilken, Volker; Hoque, Mainul; Minkwitz, David; Schlueter, Stefan
2016-04-01
The ionosphere is the upper part of the Earth's atmosphere, where sufficient free electrons exist to affect the propagation of radio waves. Therefore, the treatment of the ionosphere is a critical issue for many applications dealing with trans-ionospheric signals such as GNSS positioning, GNSS related augmentation systems (e.g. EGNOS and WAAS) and remote sensing. The European Geostationary Navigation Overlay Service (EGNOS) is the European Satellite Based Augmentation Service (SBAS) that provides value added services, in particular to safety critical GNSS applications, e.g. aviation and maritime traffic. In the frame of the European GNSS Evolution Programme (EGEP), ESA has launched several activities supporting the design, development and qualification of the operational EGNOS infrastructure and associated services. Ionospheric Reference Scenarios (IRSs) are used by ESA in order to conduct the EGNOS performance simulations and to assure the capability for maintaining accuracy, integrity and availability of the EGNOS system, especially during ionospheric storm conditions. The project Data Assimilation Techniques for Ionospheric Reference Scenarios (DAIS) aims to provide improved EGNOS IRSs. The main tasks are the calculation and validation of time series of IRSs by a 3D assimilation approach that combines space borne and ground based GNSS observations as well as ionosonde measurements with an ionospheric background model. The special focus is to demonstrate that space-based measurements can significantly contribute to filling data gaps in GNSS ground networks (particularly in Africa and over the oceans) when generating the IRSs. In this project we selected test periods of perturbed and nominal ionospheric conditions and filtered the collected data for outliers. We defined and developed an applicable technique for the 3D assimilation and applied it to the generation of IRSs covering the EGNOS V3 extended service area. Afterwards, the generated 3D ionosphere reconstructions as well as the final IRSs were validated against independent GNSS slant TEC (Total Electron Content) data, vertical sounding observations, and JASON-1 and -2 derived vertical TEC. This presentation gives an overview of the DAIS project and the achieved results. We outline the assimilation approach, show the reconstruction and validation results, and finally address open questions.
Techniques for Down-Sampling a Measured Surface Height Map for Model Validation
NASA Technical Reports Server (NTRS)
Sidick, Erkin
2012-01-01
This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while suppressing the existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.
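The abstract does not disclose the two techniques themselves; the sketch below shows one common way to down-sample without interpolation error, integer-factor block averaging, purely as an illustration of the stated goals (all names and sizes are hypothetical):

```python
import numpy as np

def downsample_exact(surface, factor):
    """Down-sample a measured surface map by an integer factor using block
    averaging: no interpolation (re-sampling) error is introduced, and
    uncorrelated measurement noise is reduced by roughly 1/factor."""
    rows, cols = surface.shape
    surface = surface[:rows - rows % factor, :cols - cols % factor]
    blocks = surface.reshape(surface.shape[0] // factor, factor,
                             surface.shape[1] // factor, factor)
    return blocks.mean(axis=(1, 3))

height_map = np.random.rand(1024, 1024)       # placeholder measured map
print(downsample_exact(height_map, 4).shape)  # (256, 256)
```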
A New Computational Technique for the Generation of Optimised Aircraft Trajectories
NASA Astrophysics Data System (ADS)
Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto
2017-12-01
A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented, allowing an efficient solution of problems in which two or more performance indices are to be minimized simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two simultaneous objectives. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
NASA Astrophysics Data System (ADS)
Salsone, Silvia; Taylor, Andrew; Gomez, Juliana; Pretty, Iain; Ellwood, Roger; Dickinson, Mark; Lombardo, Giuseppe; Zakian, Christian
2012-07-01
Near infrared (NIR) multispectral imaging is a novel noninvasive technique that maps and quantifies dental caries. The technique has the ability to reduce the confounding effect of stain present on teeth. The aim of this study was to develop and validate a quantitative NIR multispectral imaging system for caries detection and assessment against a histological reference standard. The proposed technique is based on spectral imaging at specific wavelengths in the range from 1000 to 1700 nm. A total of 112 extracted teeth (molars and premolars) were used and images of occlusal surfaces at different wavelengths were acquired. Three spectral reflectance images were combined to generate a quantitative lesion map of the tooth. The maximum value of the map at the corresponding histological section was used as the NIR caries score. The NIR caries score significantly correlated with the histological reference standard (Spearman's Coefficient=0.774, p<0.01). Caries detection sensitivities and specificities of 72% and 91% for sound areas, 36% and 79% for lesions on the enamel, and 82% and 69% for lesions in dentin were found. These results suggest that NIR spectral imaging is a novel and promising method for the detection, quantification, and mapping of dental caries.
Benchmarking the ATLAS software through the Kit Validation engine
NASA Astrophysics Data System (ADS)
De Salvo, Alessandro; Brasolin, Franco
2010-04-01
The measurement of the experiment software performance is a very important metric in order to choose the most effective resources to be used and to discover the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, the online analysis and display of the results will be presented. The results of the measurement on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization and reconstruction of the most CPU-intensive channels. The impact of the multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help defining the performance metrics for the High Energy Physics applications, based on the real experiment software.
NASA Astrophysics Data System (ADS)
Higuita Cano, Mauricio; Mousli, Mohamed Islam Aniss; Kelouwani, Sousso; Agbossou, Kodjo; Hammoudi, Mhamed; Dubé, Yves
2017-03-01
This work investigates the design and validation of a fuel cell management system (FCMS) that can perform when the fuel cell is at water-freezing temperature. The FCMS is based on a new tracking technique with intelligent prediction, which combines Maximum Efficiency Point Tracking with a variable perturbation-current step and the fuzzy logic technique (MEPT-FL). Unlike conventional fuel cell control systems, our proposed FCMS considers cold-weather conditions and the reduction of fuel cell set-point oscillations. In addition, the FCMS is built to respond quickly and effectively to variations of the electric load. A temperature controller stage is designed in conjunction with the MEPT-FL in order to operate the FC at low temperatures while tracking the maximum efficiency point at the same time. The simulation results, as well as the experimental validation, suggest that the proposed approach is effective and can achieve an average efficiency improvement of up to 8%. The MEPT-FL is validated using a 500 W Proton Exchange Membrane Fuel Cell (PEMFC).
Testing and Validating Machine Learning Classifiers by Metamorphic Testing☆
Xie, Xiaoyuan; Ho, Joshua W. K.; Murphy, Christian; Kaiser, Gail; Xu, Baowen; Chen, Tsong Yueh
2011-01-01
Machine learning algorithms provide core functionality to many application domains, such as bioinformatics and computational linguistics. However, it is difficult to detect faults in such applications because often there is no "test oracle" to verify the correctness of the computed outputs. To help address software quality, in this paper we present a technique for testing the implementations of machine learning classification algorithms which support such applications. Our approach is based on the technique of "metamorphic testing", which has been shown to be effective in alleviating the oracle problem. Also presented are a case study on a real-world machine learning application framework and a discussion of how programmers implementing machine learning algorithms can avoid the common pitfalls discovered in our study. We also conduct mutation analysis and cross-validation, which reveal that our method is highly effective at killing mutants, and that observing the expected cross-validation result alone is not sufficiently effective to detect faults in a supervised classification program. The effectiveness of metamorphic testing is further confirmed by the detection of real faults in a popular open-source classification program. PMID:21532969
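To make the idea concrete, here is a minimal metamorphic test in the spirit of the paper: permuting the order of training samples should leave a kNN classifier's predictions unchanged, so any difference signals a fault. The data and classifier choice are illustrative, not the paper's case study:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Metamorphic relation: permuting the training set order must not change
# the predictions of a kNN classifier. A violation signals a fault.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X_test = rng.normal(size=(50, 4))

baseline = KNeighborsClassifier(n_neighbors=3).fit(X, y).predict(X_test)

perm = rng.permutation(len(y))
followup = KNeighborsClassifier(n_neighbors=3).fit(X[perm], y[perm]).predict(X_test)

assert np.array_equal(baseline, followup), "metamorphic relation violated"
print("permutation relation holds")
```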
Validation assessment of shoreline extraction on medium resolution satellite image
NASA Astrophysics Data System (ADS)
Manaf, Syaifulnizam Abd; Mustapha, Norwati; Sulaiman, Md Nasir; Husin, Nor Azura; Shafri, Helmi Zulhaidi Mohd
2017-10-01
Monitoring coastal zones provides information about their condition, such as erosion or accretion, and monitoring shorelines helps measure the severity of such conditions. This measurement can be performed more accurately using Earth observation satellite images than by traditional ground survey. To date, shorelines can be extracted from satellite images with a high degree of accuracy using satellite image classification techniques based on machine learning to identify the land and water classes along the shorelines. In this study, the researchers validated the shorelines extracted by 11 classifiers against a reference shoreline provided by the local authority. Specifically, the validation assessment examined the difference between the extracted shorelines and the reference shoreline. The findings showed that SVM Linear was the most effective image classification technique, as evidenced by the lowest mean distance between the extracted shoreline and the reference shoreline. Furthermore, the findings showed that the accuracy of the extracted shoreline was not directly proportional to the accuracy of the image classification.
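The paper's exact distance protocol is not given in the abstract; one common proxy, sketched below under that assumption, is the mean nearest-vertex distance from the extracted shoreline to the reference shoreline (coordinates are invented):

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_shoreline_distance(extracted, reference):
    """Mean distance from each extracted shoreline vertex to the nearest
    reference vertex (both given as (n, 2) arrays in projected metres).
    Note this measure is asymmetric in its two arguments."""
    tree = cKDTree(reference)
    distances, _ = tree.query(extracted)
    return distances.mean()

extracted = np.random.rand(300, 2) * 1000
reference = extracted + np.random.normal(scale=15, size=extracted.shape)
print(f"mean offset: {mean_shoreline_distance(extracted, reference):.1f} m")
```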
Zwahlen, Marcel; Wells, Jonathan C.; Bender, Nicole; Henneberg, Maciej
2017-01-01
Background Manual anthropometric measurements are time-consuming and challenging to perform within acceptable intra- and inter-individual error margins in large studies. Three-dimensional (3D) laser body scanners provide a fast and precise alternative: within a few seconds the system produces a 3D image of the body topography and calculates some 150 standardised body size measurements. Objective The aim was to enhance the small number of existing validation studies and compare scan and manual techniques based on five selected measurements. We assessed the agreement between two repeated measurements within the two methods, analysed the direct agreement between the two methods, and explored the differences between the techniques when used in regressions assessing the effect of health related determinants on body shape indices. Methods We performed two repeated body scans on 123 volunteering young men using a Vitus Smart XXL body scanner. We manually measured height, waist, hip, buttock, and chest circumferences twice for each participant according to the WHO guidelines. The participants also filled in a basic questionnaire. Results Mean differences between the two scan measurements were smaller than between the two manual measurements, and precision as well as intra-class correlation coefficients were higher. Both techniques were strongly correlated. When comparing means between both techniques we found significant differences: Height was systematically shorter by 2.1 cm, whereas waist, hip and bust circumference measurements were larger in the scans by 1.17–4.37 cm. In consequence, body shape indices also became larger and the prevalence of overweight was greater when calculated from the scans. Between 4.1% and 7.3% of the probands changed risk category from normal to overweight when classified based on the scans. However, when employing regression analyses the two measurement techniques resulted in very similar coefficients, confidence intervals, and p-values. Conclusion For performing a large number of measurements in a large group of probands in a short time, body scans generally showed good feasibility, reliability, and validity in comparison to manual measurements. The systematic differences between the methods may result from their technical nature (contact vs. non-contact). PMID:28289559
Roland, Michelle; Hull, M L; Howell, S M
2011-05-01
In a previous paper, we reported the virtual axis finder, which is a new method for finding the rotational axes of the knee. The virtual axis finder was validated through simulations that were subject to limitations. Hence, the objective of the present study was to perform a mechanical validation with two measurement modalities: 3D video-based motion analysis and marker-based roentgen stereophotogrammetric analysis (RSA). A two rotational axis mechanism was developed, which simulated internal-external (or longitudinal) and flexion-extension (FE) rotations. The actual axes of rotation were known with respect to motion analysis and RSA markers within ± 0.0006 deg and ± 0.036 mm and ± 0.0001 deg and ± 0.016 mm, respectively. The orientation and position root mean squared errors for identifying the longitudinal rotation (LR) and FE axes with video-based motion analysis (0.26 deg, 0.28 mm, 0.36 deg, and 0.25 mm, respectively) were smaller than with RSA (1.04 deg, 0.84 mm, 0.82 deg, and 0.32 mm, respectively). The random error or precision in the orientation and position was significantly better (p=0.01 and p=0.02, respectively) in identifying the LR axis with video-based motion analysis (0.23 deg and 0.24 mm) than with RSA (0.95 deg and 0.76 mm). There was no significant difference in the bias errors between measurement modalities. In comparing the mechanical validations to virtual validations, the virtual validations produced errors comparable to those of the mechanical validation. The only significant difference between the errors of the mechanical and virtual validations was the precision in the position of the LR axis while simulating video-based motion analysis (0.24 mm and 0.78 mm, p=0.019). These results indicate that video-based motion analysis with the equipment used in this study is the superior measurement modality for use with the virtual axis finder, but both measurement modalities produce satisfactory results. The lack of significant differences between validation techniques suggests that the virtual sensitivity analysis previously performed was appropriately modeled. Thus, the virtual axis finder can be applied, with a thorough understanding of its errors, in a variety of test conditions.
Ciceri, E; Recchia, S; Dossi, C; Yang, L; Sturgeon, R E
2008-01-15
The development and validation of a method for the determination of mercury in sediments using a sector field inductively coupled plasma mass spectrometer (SF-ICP-MS) for detection is described. The utilization of isotope dilution (ID) calibration is shown to solve analytical problems related to matrix composition. Mass bias is corrected using an internal mass bias correction technique, validated against the traditional standard bracketing method. The overall analytical protocol is validated against the NRCC PACS-2 marine sediment CRM. The estimated limit of detection is 12 ng/g. The proposed procedure was applied to the analysis of a real sediment core sampled to a depth of 160 m in Lake Como, where Hg concentrations ranged from 66 to 750 ng/g.
The Effectiveness of Guided Inquiry-based Learning Material on Students’ Science Literacy Skills
NASA Astrophysics Data System (ADS)
Aulia, E. V.; Poedjiastoeti, S.; Agustini, R.
2018-01-01
The purpose of this research is to describe the effectiveness of guided inquiry-based learning material in improving students' science literacy skills on solubility and solubility product concepts. This study used a Research and Development (R&D) design and was implemented with the 11th graders of Muhammadiyah 4 Senior High School Surabaya in the 2016/2017 academic year with a one-group pre-test and post-test design. The data collection techniques used were validation, observation, testing, and questionnaires. The results showed that the students' science literacy skills differed after implementation of the guided inquiry-based learning material. The guided inquiry-based learning material is effective in improving students' science literacy skills on solubility and solubility product concepts, as shown by N-gain scores in the medium and high categories. This improvement is attributable to the developed learning materials (lesson plan, student worksheet, and science literacy skill tests) being rated valid or very valid, and to each learning phase in the lesson plan being well implemented. Therefore, it can be concluded that the guided inquiry-based learning material is effective in improving students' science literacy skills on solubility and solubility product concepts in senior high school.
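For readers unfamiliar with the N-gain score the abstract relies on, this is the standard Hake normalized gain, sketched below; the category cut-offs follow the common convention and the example numbers are invented:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: g = (post - pre) / (max - pre).
    Common cut-offs: g >= 0.7 high, 0.3 <= g < 0.7 medium, g < 0.3 low."""
    return (post - pre) / (max_score - pre)

print(normalized_gain(40, 75))  # ~0.58, i.e. the medium category
```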
Synchrophasor-Assisted Prediction of Stability/Instability of a Power System
NASA Astrophysics Data System (ADS)
Saha Roy, Biman Kumar; Sinha, Avinash Kumar; Pradhan, Ashok Kumar
2013-05-01
This paper presents a technique for real-time prediction of the stability/instability of a power system based on synchrophasor measurements obtained from phasor measurement units (PMUs) at generator buses. For stability assessment the technique makes use of system severity indices developed from bus voltage magnitudes obtained from PMUs and generator electrical power. Generator power is computed using system information together with PMU information such as voltage and current phasors. System stability/instability is predicted when the indices exceed a threshold value. A case study is carried out on the New England 10-generator, 39-bus system to validate the performance of the technique.
Discrete Wavelet Transform for Fault Locations in Underground Distribution System
NASA Astrophysics Data System (ADS)
Apisit, C.; Ngaopitakkul, A.
2010-10-01
In this paper, a technique for detecting faults in an underground distribution system is presented. The Discrete Wavelet Transform (DWT), based on the traveling wave concept, is employed in order to detect the high-frequency components and to identify fault locations in the underground distribution system. The first peak time obtained from the faulty bus is employed for calculating the distance of the fault from the sending end. The validity of the proposed technique is tested with various fault inception angles, fault locations and faulty phases. The results show that the proposed technique performs satisfactorily and will be very useful in the development of power system protection schemes.
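A minimal single-ended sketch of the idea using PyWavelets follows; the mother wavelet, sampling rate, wave speed, and the simplifying assumption that the record starts at fault inception are all illustrative choices, not the paper's settings:

```python
import numpy as np
import pywt

def fault_distance(record, fs, v_wave):
    """Simplified single-ended traveling-wave location: take the first peak
    of the level-1 DWT detail coefficients as the wavefront arrival and,
    assuming the record starts at fault inception, convert the time of
    flight to a distance."""
    _, detail = pywt.dwt(record, "db4")             # high-frequency components
    t_arrival = np.argmax(np.abs(detail)) * 2 / fs  # details are decimated by 2
    return v_wave * t_arrival

fs = 1e6                  # 1 MHz sampling (illustrative)
sig = np.zeros(4000)
sig[20] = 1.0             # synthetic wavefront arrival
# distance implied by the first detail-coefficient peak, for v = 1.8e8 m/s
print(f"{fault_distance(sig, fs, v_wave=1.8e8):.0f} m")
```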
Report of the panel on international programs
NASA Technical Reports Server (NTRS)
Anderson, Allen Joel; Fuchs, Karl W.; Ganeka, Yasuhiro; Gaur, Vinod; Green, Andrew A.; Siegfried, W.; Lambert, Anthony; Rais, Jacub; Reighber, Christopher; Seeger, Herman
1991-01-01
The panel recommends that NASA participate and take an active role in the continuous monitoring of existing regional networks, the realization of high resolution geopotential and topographic missions, the establishment of interconnection of the reference frames as defined by different space techniques, the development and implementation of automation for all ground-to-space observing systems, calibration and validation experiments for measuring techniques and data, the establishment of international space-based networks for real-time transmission of high density space data in standardized formats, tracking and support for non-NASA missions, and the extension of state-of-the art observing and analysis techniques to developing nations.
NASA Astrophysics Data System (ADS)
Cui, Yi-an; Liu, Lanbo; Zhu, Xiaoxiong
2017-08-01
Monitoring the extent and evolution of contaminant plumes in local and regional groundwater systems around existing landfills is critical for contamination control and remediation. The self-potential survey is an efficient and economical nondestructive geophysical technique that can be used to investigate underground contaminant plumes. Based on the unscented transform, we built a Kalman filtering cycle to conduct time-lapse data assimilation for monitoring solute transport, using a solute transport experiment on a bench-scale physical model. The data assimilation couples an evolution model based on a random walk with an observation correction based on the self-potential forward model, so that monitored self-potential data can be inverted within the assimilation cycle. As a result, we can reconstruct the dynamic process of the contaminant plume instead of using traditional frame-by-frame static inversion, which may cause inversion artifacts. The data assimilation inversion algorithm was evaluated with noise-added synthetic time-lapse self-potential data. The numerical experiment demonstrates the validity, accuracy and noise tolerance of the dynamic inversion. To validate the proposed algorithm, we conducted a scaled-down sandbox self-potential observation experiment to generate time-lapse data that closely mimics a real-world contaminant monitoring setup. The results of the physical experiments support the idea that data assimilation with the unscented Kalman filter (UKF) applied to field time-lapse self-potential data is a potentially useful approach for characterizing the transport of contaminant plumes.
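A toy sketch of the UKF assimilation cycle described above is given below, using the filterpy package (assumed available). The random-walk evolution follows the abstract; the point-source "forward model", electrode layout, and all noise levels are stand-ins for the real self-potential physics:

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

# State: plume centre (x, z) in metres. Evolution is a random walk: identity
# transition, with the process noise Q supplying the motion.
def fx(state, dt):
    return state

# Toy self-potential forward model: potential at surface electrodes from a
# point source at the plume centre (a stand-in for the real forward model).
electrodes = np.linspace(0.0, 1.0, 8)
def hx(state):
    x, z = state
    r = np.sqrt((electrodes - x) ** 2 + z ** 2)
    return 1.0 / np.maximum(r, 1e-3)

points = MerweScaledSigmaPoints(n=2, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=8, dt=1.0, fx=fx, hx=hx, points=points)
ukf.x = np.array([0.2, 0.3])   # initial guess of the plume centre
ukf.P *= 0.05
ukf.Q = np.eye(2) * 1e-3       # random-walk step size
ukf.R = np.eye(8) * 1e-2       # self-potential measurement noise

truth = np.array([0.5, 0.25])
for _ in range(30):            # assimilate 30 time-lapse snapshots
    ukf.predict()
    ukf.update(hx(truth) + np.random.normal(scale=0.1, size=8))
print(ukf.x)                   # estimate moves toward the true plume centre
```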
NASA Astrophysics Data System (ADS)
Sokkar, T. Z. N.; El-Farahaty, K. A.; El-Bakary, M. A.; Raslan, M. I.; Omar, E. Z.; Hamza, A. A.
2018-03-01
The optical setup of the transport of intensity equation (TIE) technique is developed to be valid for measuring the optical properties of highly oriented anisotropic fibres. This development is based on microstructure models of highly oriented anisotropic fibres and the principle of anisotropy. The TIE setup is equipped with a polarizer controlled via a stepper motor. The developed technique is used to investigate the refractive indices of highly oriented poly(ethylene terephthalate) (PET) fibres for light polarized parallel and perpendicular to the fibre axis, and hence their birefringence. The results obtained with the developed TIE technique for PET fibre are compared with those determined experimentally using a Mach-Zehnder interferometer under the same conditions. The comparison shows good agreement between the results of the developed technique and those of the Mach-Zehnder interferometer.
Cross-validation and Peeling Strategies for Survival Bump Hunting using Recursive Peeling Methods
Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil
2015-01-01
We introduce a framework to build a survival/risk bump hunting model with a censored time-to-event response. Our Survival Bump Hunting (SBH) method is based on a recursive peeling procedure that uses a specific survival peeling criterion derived from non/semi-parametric statistics such as the hazard ratio, the log-rank test or the Nelson-Aalen estimator. To optimize the tuning parameter of the model and validate it, we introduce an objective function based on survival or prediction-error statistics, such as the log-rank test and the concordance error rate. We also describe two alternative cross-validation techniques adapted to the joint task of decision-rule making by recursive peeling and survival estimation. Numerical analyses show the importance of replicated cross-validation and the differences between criteria and techniques in both low and high-dimensional settings. Although several non-parametric survival models exist, none addresses the problem of directly identifying local extrema. We show how SBH efficiently estimates extreme survival/risk subgroups unlike other models. This provides an insight into the behavior of commonly used models and suggests alternatives to be adopted in practice. Finally, our SBH framework was applied to a clinical dataset. In it, we identified subsets of patients characterized by clinical and demographic covariates with a distinct extreme survival outcome, for which tailored medical interventions could be made. An R package PRIMsrc (Patient Rule Induction Method in Survival, Regression and Classification settings) is available on CRAN (Comprehensive R Archive Network) and GitHub. PMID:27034730
ERIC Educational Resources Information Center
Myers, Greeley; Siera, Steven
1980-01-01
Default on guaranteed student loans has been increasing. The use of discriminant analysis as a technique to identify "good" versus "bad" student loans based on information available from the loan application is discussed. Research testing the ability of models to make such predictions is reported. (Author/MLW)
ERIC Educational Resources Information Center
Pallini, Susanna; Bove, Giuseppe; Laghi, Fiorenzo
2011-01-01
This study applies a multidimensional scaling (MDS) technique to investigate the structural validity of the Work Values Inventory for Adolescents with a sample of Italian students. The MDS results indicated the presence of two underlying orthogonal dimensions: individuality versus sociality and conservation versus exploration. Implications for…
Interference detection and correction applied to incoherent-scatter radar power spectrum measurement
NASA Technical Reports Server (NTRS)
Ying, W. P.; Mathews, J. D.; Rastogi, P. K.
1986-01-01
A median-filter-based interference detection and correction technique is evaluated, and its application to Arecibo incoherent scatter radar D-region ionospheric power spectra is discussed. The method can be extended to other kinds of data provided the statistical assumptions involved in the process remain valid.
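The abstract does not spell the filter out; the following is a generic median-filter despiking sketch consistent with the described approach (kernel size, threshold, and the synthetic spectrum are illustrative):

```python
import numpy as np
from scipy.signal import medfilt

def despike(spectrum, kernel=5, n_sigma=4.0):
    """Flag points that deviate strongly from a running median as
    interference and replace them with the median estimate."""
    baseline = medfilt(spectrum, kernel_size=kernel)
    residual = spectrum - baseline
    spikes = np.abs(residual) > n_sigma * np.std(residual)
    cleaned = spectrum.copy()
    cleaned[spikes] = baseline[spikes]
    return cleaned, spikes

# Synthetic power spectrum with one interference spike
spec = np.random.normal(10.0, 1.0, 256)
spec[100] += 25.0
cleaned, flags = despike(spec)
print(flags.sum(), "point(s) corrected")
```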
Parental Representations and Attachment Security in Young Israeli Mothers' Bird's Nest Drawings
ERIC Educational Resources Information Center
Goldner, Limor; Golan, Yifat
2016-01-01
The Bird's Nest Drawing (BND; Kaiser, 1996) is an art-based technique developed to assess attachment security. In an attempt to expand the BND's validity, the authors explored the possible associations between parental representations and the BND's dimensions and attachment classifications in a sample of 80 young Israeli mothers. Positive…
Application of the Combination Approach for Estimating Evapotranspiration in Puerto Rico
NASA Technical Reports Server (NTRS)
Harmsen, Eric; Luvall, Jeffrey; Gonzalez, Jorge
2005-01-01
The ability to estimate short-term fluxes of water vapor from the land surface is important for validating latent heat flux estimates from high resolution remote sensing techniques. A new, relatively inexpensive method is presented for estimating the ground-based values of the surface latent heat flux or evapotranspiration.
ERIC Educational Resources Information Center
Shotsberger, Paul G.
The National Council of Teachers of Mathematics (1991) has identified the use of computers as a necessary teaching tool for enhancing mathematical discourse in schools. One possible vehicle of technological change in mathematics classrooms is the Intelligent Tutoring System (ITS), an artificially intelligent computer-based tutor. This paper…
Modeling and controlling a robotic convoy using guidance laws strategies.
Belkhouche, Fethi; Belkhouche, Boumediene
2005-08-01
This paper deals with the problem of modeling and controlling a robotic convoy. Guidance law techniques are used to provide a mathematical formulation of the problem. The guidance laws used for this purpose are the velocity pursuit, the deviated pursuit, and proportional navigation. The velocity pursuit equations model the robot's path under various sensor-based control laws. A systematic study of the tracking problem based on this technique is undertaken. These guidance laws are applied to derive decentralized control laws for the angular and linear velocities. For the angular velocity, the control law is directly derived from the guidance laws after considering the relative kinematics equations between successive robots. The second control law maintains the distance between successive robots constant by controlling the linear velocity; it is derived by considering the kinematics equations between successive robots under the considered guidance law. Properties of the method are discussed and proven. Simulation results confirm the validity of our approach, as well as the properties of the method. Index terms: guidance laws, relative kinematics equations, robotic convoy, tracking.
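As a minimal sketch of the first of the three guidance laws named above, the fragment below implements a discrete velocity-pursuit step, where the follower steers its velocity vector directly at the leader's current position; speeds, time step, and geometry are invented, and deviated pursuit and proportional navigation differ in how the commanded heading relates to the line of sight:

```python
import numpy as np

def pursuit_step(follower, leader, speed, dt):
    """Velocity pursuit: point the follower's velocity vector at the
    leader's current position and advance one time step."""
    los = leader - follower               # line-of-sight vector
    heading = np.arctan2(los[1], los[0])  # commanded heading
    return follower + speed * dt * np.array([np.cos(heading), np.sin(heading)])

follower = np.array([0.0, 0.0])
leader = np.array([10.0, 5.0])
for _ in range(100):
    leader = leader + np.array([0.5, 0.0]) * 0.1  # leader moves along x
    follower = pursuit_step(follower, leader, speed=1.0, dt=0.1)
print(np.linalg.norm(leader - follower))  # separation shrinks as pursuit proceeds
```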
DelGiudice, Nancy J; Street, Nancy; Torchia, Ronald J; Sawyer, Susan S; Bernard, Sylvia Allison; Holick, Michael F
2018-05-24
Vitamin D deficiency and insufficiency is a pandemic problem in children and adolescents in the United States. The problem may be aggravated by inconsistent implementation of current clinical practice guidelines for vitamin D management by pediatric primary care providers. This study examines the relationship between primary care providers' prescribing of vitamin D to children ages 1 through 18 years and their practice actions and knowledge. A descriptive correlational design was used. Participants were recruited from a purposive sample of pediatricians and pediatric nurse practitioners through an online invitation to participate in a survey. Reliability and validity were established for the survey developed by the principal investigator using a web-based Delphi technique. Results from this study indicate that although most providers are aware that vitamin D insufficiency and deficiency are problems, fewer than half currently recommend 600- to 1,000-IU supplementation to their patients ages 1 through 18 years.
Genetically Validated Drug Targets in Leishmania: Current Knowledge and Future Prospects.
Jones, Nathaniel G; Catta-Preta, Carolina M C; Lima, Ana Paula C A; Mottram, Jeremy C
2018-04-13
There has been a very limited number of high-throughput screening campaigns carried out with Leishmania drug targets. In part, this is due to the small number of suitable target genes that have been shown by genetic or chemical methods to be essential for the parasite. In this perspective, we discuss the state of genetic target validation in the field of Leishmania research and review the 200 Leishmania genes and 36 Trypanosoma cruzi genes for which gene deletion attempts have been made since the first published case in 1990. We define a quality score for the different genetic deletion techniques that can be used to identify potential drug targets. We also discuss how the advances in genome-scale gene disruption techniques have been used to assist target-based and phenotypic-based drug development in other parasitic protozoa and why Leishmania has lacked a similar approach so far. The prospects for this scale of work are considered in the context of the application of CRISPR/Cas9 gene editing as a useful tool in Leishmania.
Clustering molecular dynamics trajectories for optimizing docking experiments.
De Paris, Renata; Quevedo, Christian V; Ruiz, Duncan D; Norberto de Souza, Osmar; Barros, Rodrigo C
2015-01-01
Molecular dynamics simulations of protein receptors have become an attractive tool for rational drug discovery. However, the high computational cost of employing molecular dynamics trajectories in the virtual screening of large repositories threatens the feasibility of this task. Computational intelligence techniques have been applied in this context with the ultimate goal of reducing the overall computational cost so the task can become feasible. In particular, clustering algorithms have been widely used as a means to reduce the dimensionality of molecular dynamics trajectories. In this paper, we develop a novel methodology for clustering entire trajectories using structural features from the substrate-binding cavity of the receptor in order to optimize docking experiments in a cloud-based environment. The resulting partition was selected based on three clustering validity criteria, and it was further validated by analyzing the interactions between 20 ligands and a fully flexible receptor (FFR) model containing a 20 ns molecular dynamics simulation trajectory. Our proposed methodology shows that taking features of the substrate-binding cavity as input for the k-means algorithm is a promising technique for accurately selecting ensembles of representative structures tailored to a specific ligand.
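A minimal sketch of the clustering step follows. The cavity features are invented, and a single silhouette criterion stands in for the paper's three validity criteria:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Hypothetical per-frame features of the substrate-binding cavity
# (e.g. cavity volume and two aperture widths), one row per MD snapshot.
rng = np.random.default_rng(7)
features = np.vstack([rng.normal(loc, 0.3, size=(400, 3))
                      for loc in ([0, 0, 0], [2, 2, 0], [0, 2, 2])])

best_k, best_score = None, -1.0
for k in range(2, 8):  # pick k with a clustering validity index (silhouette)
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
    score = silhouette_score(features, labels)
    if score > best_score:
        best_k, best_score = k, score

labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(features)
# One representative snapshot per cluster yields a reduced ensemble for docking
print(best_k, np.bincount(labels))
```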
NASA Technical Reports Server (NTRS)
Khaiyer, M. M.; Doelling, D. R.; Palikonda, R.; Mordeen, M. L.; Minnis, P.
2007-01-01
This poster presentation reviews the process used to validate the GOES-10 satellite-derived cloud and radiative properties. The ARM Mobile Facility (AMF) deployment at Pt. Reyes, CA, as part of the Marine Stratus Radiation Aerosol and Drizzle experiment (MASRAD), 14 March - 14 September 2005, provided an excellent chance to validate satellite cloud-property retrievals with the AMF's flexible suite of ground-based remote sensing instruments. For this comparison, NASA LaRC GOES-10 satellite retrievals covering this region and period were re-processed using an updated version of the Visible Infrared Solar-Infrared Split-Window Technique (VISST), which uses data taken at 4 wavelengths (0.65, 3.9, 11, and 12 µm) and computes broadband fluxes using improved CERES (Clouds and the Earth's Radiant Energy System) GOES-10 narrowband-to-broadband flux conversion coefficients. To validate the MASRAD GOES-10 satellite-derived cloud property data, VISST-derived cloud amounts, heights, and liquid water paths are compared with similar quantities derived from available ARM ground-based instrumentation and with CERES fluxes from Terra.
Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil
2015-01-01
PRIMsrc is a novel implementation of a non-parametric bump hunting procedure, based on the Patient Rule Induction Method (PRIM), offering a unified treatment of outcome variables, including censored time-to-event (Survival), continuous (Regression) and discrete (Classification) responses. To fit the model, it uses a recursive peeling procedure with specific peeling criteria and stopping rules depending on the response. To validate the model, it provides an objective function based on prediction-error or other specific statistic, as well as two alternative cross-validation techniques, adapted to the task of decision-rule making and estimation in the three types of settings. PRIMsrc comes as an open source R package, including at this point: (i) a main function for fitting a Survival Bump Hunting model with various options allowing cross-validated model selection to control model size (#covariates) and model complexity (#peeling steps) and generation of cross-validated end-point estimates; (ii) parallel computing; (iii) various S3-generic and specific plotting functions for data visualization, diagnostic, prediction, summary and display of results. It is available on CRAN and GitHub. PMID:26798326
NASA Astrophysics Data System (ADS)
Asal Kzar, Ahmed; Mat Jafri, M. Z.; Hwee San, Lim; Al-Zuky, Ali A.; Mutter, Kussay N.; Hassan Al-Saleh, Anwar
2016-06-01
Many techniques have been proposed for the water quality problem, but remote sensing techniques have proven successful, especially when artificial neural networks are used as mathematical models with these techniques. The Hopfield neural network is a common, fast, simple, and efficient type of artificial neural network, but it runs into difficulty when it deals with images that have more than two colours, such as remote sensing images. This work attempts to solve this problem by modifying the network to handle colour remote sensing images for water quality mapping. A Feed-forward Hopfield Neural Network Algorithm (FHNNA) was developed and used with a colour satellite image from the Thailand Earth Observation System (THEOS) for TSS mapping in the Penang Strait, Malaysia, through the classification of TSS concentrations. The new algorithm is based essentially on three modifications: using the HNN as a feed-forward network, considering the weights of bitplanes, and a non-self architecture (zero diagonal of the weight matrix); in addition, it depends on validation data. The resulting map was colour-coded for visual interpretation. The efficiency of the new algorithm was established by the high correlation coefficient (R=0.979) and the low root mean square error (RMSE=4.301) between the two groups into which the validation data were divided, one used for the algorithm and the other for validating the results. The comparison was with the minimum distance classifier. Therefore, TSS mapping of polluted water in the Penang Strait, Malaysia, can be performed using FHNNA with a remote sensing technique (THEOS). This is a new and useful application of the HNN, and hence a new model for use with remote sensing techniques in water quality mapping, an important environmental problem.
NASA Astrophysics Data System (ADS)
Bayharti; Iswendi, I.; Arifin, M. N.
2018-04-01
The purpose of this research was to produce a chemistry game card as an instructional medium for the topic of naming chemical compounds and to determine the degree of validity and practicality of the instructional media produced. This research was of the Research and Development (R&D) type, producing a product. The development model used was the 4-D model, which comprises four stages: (1) define, (2) design, (3) develop, and (4) disseminate. This research was restricted to the development stage. The chemistry game card developed was validated by seven validators, and its practicality was tested with class X6 students of SMAN 5 Padang. The research instrument was a questionnaire consisting of a validity sheet and a practicality sheet. Data were collected by distributing the questionnaire to the validators, chemistry teachers, and students, and were analyzed using Cohen's Kappa. Based on the data analysis, the validity of the chemistry game card was 0.87 (highly valid category) and its practicality was 0.91 (highly practical category).
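The abstract cites Cohen's Kappa without detail; a standard two-rater kappa computation is sketched below (the ratings are invented, and the study's exact aggregation across seven validators may differ):

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e), where p_o is
    the observed agreement and p_e the agreement expected by chance."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    categories = np.union1d(r1, r2)
    p_o = np.mean(r1 == r2)
    p_e = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings by two validators on ten questionnaire items (1-4 scale)
print(cohens_kappa([4, 4, 3, 4, 4, 3, 4, 4, 3, 4],
                   [4, 4, 3, 4, 3, 3, 4, 4, 3, 4]))  # ~0.78
```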
Xu, Yisheng; Tong, Yunxia; Liu, Siyuan; Chow, Ho Ming; AbdulSabur, Nuria Y.; Mattay, Govind S.; Braun, Allen R.
2014-01-01
A comprehensive set of methods based on spatial independent component analysis (sICA) is presented as a robust technique for artifact removal, applicable to a broad range of functional magnetic resonance imaging (fMRI) experiments that have been plagued by motion-related artifacts. Although the applications of sICA for fMRI denoising have been studied previously, three fundamental elements of this approach have yet to be established: 1) a mechanistically based ground truth for component classification; 2) a general framework for evaluating the performance and generalizability of automated classifiers; and 3) a reliable method for validating the effectiveness of denoising. Here we perform a thorough investigation of these issues and demonstrate the power of our technique by resolving the problem of severe imaging artifacts associated with continuous overt speech production. As a key methodological feature, a dual-mask sICA method is proposed to isolate a variety of imaging artifacts by directly revealing their extracerebral spatial origins. It also plays an important role in understanding the mechanistic properties of noise components in conjunction with temporal measures of physical or physiological motion. The potential of a spatially based machine learning classifier and the general criteria for feature selection have both been examined, in order to maximize the performance and generalizability of automated component classification. The effectiveness of denoising is quantitatively validated by comparing the activation maps of fMRI with those of positron emission tomography acquired under the same task conditions. The general applicability of this technique is further demonstrated by the successful reduction of the distance-dependent effect of head motion on resting-state functional connectivity. PMID:25225001
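The generic machinery behind sICA denoising (decompose, flag noise components, reconstruct without them) can be sketched with scikit-learn's FastICA; this is not the paper's dual-mask method, and the dummy flagging rule below merely stands in for the classification step the authors describe:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
data = rng.normal(size=(2000, 120))   # stand-in for a masked fMRI run: voxels x timepoints

ica = FastICA(n_components=20, random_state=0)
maps = ica.fit_transform(data)        # (voxels, components): spatial maps
timecourses = ica.mixing_             # (timepoints, components)

# Flag artifact components; a dummy rule replaces the paper's dual-mask /
# machine-learning classification step in this sketch.
artifact = np.zeros(20, dtype=bool)
artifact[[3, 7]] = True

maps_clean = maps.copy()
maps_clean[:, artifact] = 0.0                 # zero out flagged spatial maps
denoised = ica.inverse_transform(maps_clean)  # remix and re-add the mean
print(denoised.shape)                         # (2000, 120), flagged variance removed
```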
Autonomous formation flying based on GPS — PRISMA flight results
NASA Astrophysics Data System (ADS)
D'Amico, Simone; Ardaens, Jean-Sebastien; De Florio, Sergio
2013-01-01
This paper presents flight results from the early harvest of the Spaceborne Autonomous Formation Flying Experiment (SAFE) conducted in the frame of the Swedish PRISMA technology demonstration mission. SAFE represents one of the first demonstrations in low Earth orbit of an advanced guidance, navigation and control system for dual-spacecraft formations. Innovative techniques based on differential GPS-based navigation and relative orbital elements control are validated and tuned in orbit to fulfill the typical requirements of future distributed scientific instruments for remote sensing.
NASA Astrophysics Data System (ADS)
Barone, Fabrizio; Giordano, Gerardo
2018-02-01
We present the Extended Folded Pendulum Model (EFPM), a model developed for a quantitative description of the dynamical behavior of a folded pendulum generically oriented in space. This model, based on the Tait-Bryan angular reference system, highlights the relationship between the folded pendulum's orientation in the gravitational field and its natural resonance frequency. The model, validated by tests performed with a monolithic UNISA Folded Pendulum, points to a new technique for implementing folded-pendulum-based tiltmeters.
NASA Astrophysics Data System (ADS)
Gao, Xiatian; Wang, Xiaogang; Jiang, Binhao
2017-10-01
UPSF (Universal Plasma Simulation Framework) is a new plasma simulation code designed for maximum flexibility by using cutting-edge techniques supported by the C++17 standard. Through the use of metaprogramming techniques, UPSF provides arbitrary-dimensional data structures and methods to support various kinds of plasma simulation models, such as Vlasov, particle in cell (PIC), fluid, and Fokker-Planck models, together with their variants and hybrid methods. Through C++ metaprogramming, a single code can be applied to systems of arbitrary dimension with no loss of performance. UPSF can also automatically parallelize the distributed data structure and accelerate matrix and tensor operations with BLAS. A three-dimensional particle-in-cell code has been developed based on UPSF. Two test cases, Landau damping and the Weibel instability for the electrostatic and electromagnetic situations respectively, are presented to demonstrate the validity and performance of the UPSF code.
A diagnostic technique used to obtain cross range radiation centers from antenna patterns
NASA Technical Reports Server (NTRS)
Lee, T. H.; Burnside, W. D.
1988-01-01
A diagnostic technique to obtain cross range radiation centers from antenna radiation patterns is presented. This method is similar to the synthetic aperture processing of scattered fields in radar applications. Coherent processing of the radiated fields is used to determine the various radiation centers associated with the far-zone pattern of an antenna for a given radiation direction. This technique can be used to identify an unexpected radiation center that creates an undesired effect in a pattern; on the other hand, it can improve a numerical simulation of the pattern by identifying other significant mechanisms. Cross range results for two 8' reflector antennas are presented to illustrate and validate the technique.
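The coherent processing described here is, in essence, a Fourier relationship between the complex far-field pattern sampled versus angle and the cross-range distribution of radiation centers. The sketch below is my own illustration, not the authors' code: the wavelength, angular sector, synthetic two-center pattern, and peak-picking threshold are all assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

wavelength = 0.03                        # 10 GHz, for the synthetic example
k = 2 * np.pi / wavelength
u = np.linspace(-0.1, 0.1, 512)          # u = sin(theta) over a small sector
centers, amps = [-0.5, 1.0], [1.0, 0.4]  # two point radiation centers (m)
pattern = sum(a * np.exp(1j * k * x * u) for a, x in zip(amps, centers))

# Coherent processing: a Fourier transform over u maps the complex pattern
# to cross range, with resolution ~ wavelength / (span of u).
img = np.fft.fftshift(np.fft.fft(pattern * np.hanning(len(u))))
xr = np.fft.fftshift(np.fft.fftfreq(len(u), d=u[1] - u[0])) * wavelength

idx, _ = find_peaks(np.abs(img), height=0.2 * np.abs(img).max())
print("recovered radiation centers near:", np.round(xr[idx], 2))
```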
Validation of helicopter noise prediction techniques
NASA Technical Reports Server (NTRS)
Succi, G. P.
1981-01-01
The current techniques of helicopter rotor noise prediction attempt to describe the details of the noise field precisely and remove the empiricisms and restrictions inherent in previous methods. These techniques require detailed inputs of the rotor geometry, operating conditions, and blade surface pressure distribution. The purpose of this paper is to review those techniques in general and the Farassat/Nystrom analysis in particular. The predictions of the Farassat/Nystrom noise computer program, using both measured and calculated blade surface pressure data, are compared to measured noise level data. This study is based on a contract from NASA to Bolt Beranek and Newman Inc. with measured data from the AH-1G Helicopter Operational Loads Survey flight test program supplied by Bell Helicopter Textron.
NASA Astrophysics Data System (ADS)
Mohanty, B.; Jena, S.; Panda, R. K.
2016-12-01
The overexploitation of groundwater has resulted in the abandonment of several shallow tube wells in the study basin in Eastern India. For the sustainability of groundwater resources, basin-scale modelling of groundwater flow is indispensable for the effective planning and management of water resources. The basic intent of this study is to develop a 3-D groundwater flow model of the study basin using the Visual MODFLOW Flex 2014.2 package and to calibrate and validate the model using 17 years of observed data. A sensitivity analysis was carried out to quantify the susceptibility of the aquifer system to river bank seepage, recharge from rainfall and agricultural practices, horizontal and vertical hydraulic conductivities, and specific yield. To quantify the impact of parameter uncertainties, the Sequential Uncertainty Fitting Algorithm (SUFI-2) and Markov chain Monte Carlo (McMC) techniques were implemented; results from the two techniques were compared and their advantages and disadvantages analysed. The Nash-Sutcliffe coefficient (NSE), coefficient of determination (R2), mean absolute error (MAE), mean percent deviation (Dv) and root mean squared error (RMSE) were adopted as model evaluation criteria during calibration and validation of the developed model. The NSE, R2, MAE, Dv and RMSE values for the groundwater flow model during calibration and validation were in the acceptable range, and the McMC technique provided more reasonable results than SUFI-2. The calibrated and validated model will be useful for identifying the aquifer properties and analysing the groundwater flow dynamics and changes in groundwater levels in future forecasts.
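The five goodness-of-fit measures named above are standard and easy to compute once observed and simulated heads are in hand. A minimal sketch (the head values are invented, and the sign convention for Dv varies between authors):

```python
import numpy as np

def eval_metrics(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    nse = 1 - np.sum(err**2) / np.sum((obs - obs.mean())**2)  # Nash-Sutcliffe
    r2 = np.corrcoef(obs, sim)[0, 1]**2
    mae = np.abs(err).mean()
    dv = 100 * err.sum() / obs.sum()        # mean percent deviation (one convention)
    rmse = np.sqrt((err**2).mean())
    return dict(NSE=nse, R2=r2, MAE=mae, Dv=dv, RMSE=rmse)

heads_obs = [12.1, 11.8, 11.5, 11.9, 12.4]  # illustrative groundwater heads (m)
heads_sim = [12.0, 11.9, 11.3, 12.0, 12.3]
print(eval_metrics(heads_obs, heads_sim))
```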
NASA Astrophysics Data System (ADS)
Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven
2016-04-01
VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors (aligned with the EURO-CORDEX experiment) and 3) pseudo-reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a European-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contributions to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including both data (downscaled values) and metadata (characterizing different aspects of the downscaling methods). This constitutes the largest and most comprehensive intercomparison of statistical downscaling methods to date. Here, we present an overall validation, analyzing marginal and temporal aspects to assess the intrinsic performance and added value of statistical downscaling methods at both annual and seasonal levels. This validation takes into account the different properties/limitations of different approaches and techniques (as reported in the provided metadata) in order to perform a fair comparison. It is pointed out that this experiment alone is not sufficient to evaluate the limitations of (MOS) bias correction techniques. Moreover, it also does not fully validate PP, since we do not learn whether we have the right predictors and whether the PP assumption is valid. These problems will be analyzed in the subsequent community-open VALUE experiments 2) and 3), which will be open for participation throughout the present year.
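The fold bookkeeping for Experiment 1 (5-fold cross-validation over consecutive 6-year blocks of 1979-2008) is simple to reproduce; in this schematic sketch the downscaling fit itself is left as a placeholder:

```python
import numpy as np

years = np.arange(1979, 2009)            # 30 years -> five consecutive 6-year folds
folds = years.reshape(5, 6)

for k, test_years in enumerate(folds):
    train_years = np.setdiff1d(years, test_years)
    # fit the downscaling method on train_years, predict on test_years here
    print(f"fold {k}: test {test_years[0]}-{test_years[-1]}, "
          f"train on the remaining {len(train_years)} years")
```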
METAPHOR: Probability density estimation for machine learning based photometric redshifts
NASA Astrophysics Data System (ADS)
Amaro, V.; Cavuoti, S.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.
2017-06-01
We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method able to provide a reliable PDF for photometric galaxy redshifts estimated through empirical techniques. METAPHOR is a modular workflow, mainly based on the MLPQNA neural network as the internal engine to derive photometric galaxy redshifts, but offering the possibility to easily replace MLPQNA with any other method to predict photo-z's and their PDF. We present here the results of a validation test of the workflow on galaxies from SDSS-DR9, also showing the universality of the method by replacing MLPQNA with KNN and Random Forest models. The validation test also includes a comparison with the PDFs derived from a traditional SED template-fitting method (Le Phare).
Developing evaluation instrument based on CIPP models on the implementation of portfolio assessment
NASA Astrophysics Data System (ADS)
Kurnia, Feni; Rosana, Dadan; Supahar
2017-08-01
This study aimed to develop an evaluation instrument constructed using the CIPP model for the implementation of portfolio assessment in science learning. This study used the research and development (R&D) method, adapting the 4-D model to the development of a non-test instrument, with the evaluation instrument constructed according to the CIPP model. CIPP is the abbreviation of Context, Input, Process, and Product. The techniques of data collection were interviews, questionnaires, and observations. The data collection instruments were: 1) interview guidelines for the analysis of the problems and needs, 2) a questionnaire to see the level of accomplishment of the portfolio assessment instrument, and 3) observation sheets for teachers and students to gather responses to the portfolio assessment instrument. The data obtained were quantitative, collected from several validators. The validators consisted of two lecturers as evaluation experts, two practitioners (science teachers), and three colleagues. This paper shows the content validity results obtained from the validators and the analysis of those data using Aiken's V formula. The results of this study show that the evaluation instrument based on the CIPP model is suitable for evaluating the implementation of portfolio assessment instruments. Based on the judgments of the experts, practitioners, and colleagues, the Aiken's V coefficient was between 0.86-1.00, which means that the instrument is valid and can be used in the limited trial and the operational field trial.
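Aiken's V for a single item is V = Σ(r_i − lo) / (n·(hi − lo)), where the r_i are the n raters' scores on a lo-to-hi scale. A tiny sketch (the seven ratings are invented, and a 1-5 scale is assumed since the paper's scale width is not stated here):

```python
def aikens_v(ratings, lo=1, hi=5):
    # V = sum(r_i - lo) / (n * (hi - lo)); ranges from 0 to 1
    s = sum(r - lo for r in ratings)
    return s / (len(ratings) * (hi - lo))

# seven validators rating one item on a 1-5 scale (illustrative values)
print(round(aikens_v([5, 4, 5, 5, 4, 5, 5]), 2))   # -> 0.93
```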
Static and Dynamic Verification of Critical Software for Space Applications
NASA Astrophysics Data System (ADS)
Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.
Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field as regards its application to the growing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The methodology proposed is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault-injection. This work is being performed under the project STADY (Applied Static And Dynamic Verification Of Critical Software), ESA/ESTEC Contract Nr. 15751/02/NL/LvH.
The Scientific Status of Projective Techniques.
Lilienfeld, S O; Wood, J M; Garb, H N
2000-11-01
Although projective techniques continue to be widely used in clinical and forensic settings, their scientific status remains highly controversial. In this monograph, we review the current state of the literature concerning the psychometric properties (norms, reliability, validity, incremental validity, treatment utility) of three major projective instruments: Rorschach Inkblot Test, Thematic Apperception Test (TAT), and human figure drawings. We conclude that there is empirical support for the validity of a small number of indexes derived from the Rorschach and TAT. However, the substantial majority of Rorschach and TAT indexes are not empirically supported. The validity evidence for human figure drawings is even more limited. With a few exceptions, projective indexes have not consistently demonstrated incremental validity above and beyond other psychometric data. In addition, we summarize the results of a new meta-analysis intended to examine the capacity of these three instruments to detect child sexual abuse. Although some projective instruments were better than chance at detecting child sexual abuse, there were virtually no replicated findings across independent investigative teams. This meta-analysis also provides the first clear evidence of substantial file drawer effects in the projectives literature, as the effect sizes from published studies markedly exceeded those from unpublished studies. We conclude with recommendations regarding the (a) construction of projective techniques with adequate validity, (b) forensic and clinical use of projective techniques, and (c) education and training of future psychologists regarding projective techniques. © 2000 Association for Psychological Science.
GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)
NASA Astrophysics Data System (ADS)
Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza
2017-12-01
Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potential of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor zone of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. The validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, which shows that the SI method has slightly better performance than the DST technique. The SI and DST methods are therefore advantageous for analyzing groundwater capacity and for scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, permitting investigation of both systemic and stochastic uncertainty. These techniques are thus very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
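In its usual formulation, the statistical index weights each class of a conditioning factor by the log-ratio of the spring density within the class to the average density over the whole area; this is my reading of the standard SI method, not code from the paper, and the counts below are invented:

```python
import numpy as np

def statistical_index(spring_counts, pixel_counts):
    # SI_j = ln(spring density in class j / spring density over the whole area)
    spring_counts = np.asarray(spring_counts, float)
    pixel_counts = np.asarray(pixel_counts, float)
    class_density = spring_counts / pixel_counts
    overall_density = spring_counts.sum() / pixel_counts.sum()
    return np.log(class_density / overall_density)

# illustrative: springs and pixels per class of one conditioning factor
print(statistical_index([120, 200, 150, 26], [40000, 90000, 110000, 60000]))
```

Summing the SI weights of all conditioning factors per pixel then yields the groundwater potential map to be validated against the AUC.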
NASA Astrophysics Data System (ADS)
Pérez, B.; Brower, R.; Beckers, J.; Paradis, D.; Balseiro, C.; Lyons, K.; Cure, M.; Sotillo, M. G.; Hacket, B.; Verlaan, M.; Alvarez Fanjul, E.
2011-04-01
ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that makes use of existing storm surge and circulation models operational in Europe today, as well as near-real-time tide gauge data in the region, with the following main goals: providing easy access to existing forecasts, as well as to their performance and model validation, by means of an adequate visualization tool; and generating better forecasts of sea level, including confidence intervals, by means of the Bayesian Model Averaging (BMA) technique. The system was developed and implemented within the ECOOP (C.No. 036355) European Project for the NOOS and IBIROOS regions, based on the MATROOS visualization tool developed by Deltares. Both systems are today operational at Deltares and Puertos del Estado, respectively. The Bayesian Model Averaging technique generates an overall forecast probability density function (PDF) by making a weighted average of the individual forecasts' PDFs; the weights represent the probability that a model will give the correct forecast PDF and are determined and updated operationally based on the performance of the models during a recent training period. This implies that the technique needs sea level data from tide gauges in near-real time. Validation results for the different models and the BMA implementation for the main harbours will be presented for the IBIROOS and Western Mediterranean regions, where this kind of activity has been performed for the first time. The work has proved useful for detecting problems in some of the circulation models not previously well calibrated with sea level data, for identifying the differences between baroclinic and barotropic models for sea level applications, and for confirming the general improvement of the BMA forecasts.
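The BMA forecast PDF described above is a weighted mixture of per-model predictive densities. A minimal sketch under the common assumption of Gaussian component PDFs; the forecasts, weights, and spreads below are invented, and in practice the weights and variances come from a fit (typically by EM) over the recent tide-gauge training window:

```python
import numpy as np

def bma_pdf(x, forecasts, weights, sigma):
    # p(x) = sum_k w_k * N(x; f_k, sigma_k^2); weights sum to one and encode
    # each model's recent skill against the tide gauges
    forecasts, weights, sigma = map(np.asarray, (forecasts, weights, sigma))
    comps = np.exp(-0.5 * ((x - forecasts) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return float(np.sum(weights * comps))

models = [0.42, 0.35, 0.50]   # surge forecasts from three models (m)
w = [0.5, 0.2, 0.3]           # BMA weights from the training period
s = [0.05, 0.08, 0.06]        # per-model predictive spreads (m)
print(bma_pdf(0.45, models, w, s))
```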
Bairy, Santhosh Kumar; Suneel Kumar, B V S; Bhalla, Joseph Uday Tej; Pramod, A B; Ravikumar, Muttineni
2009-04-01
c-Src kinase plays an important role in cell growth and differentiation, and its inhibitors can be useful for the treatment of various diseases, including cancer, osteoporosis, and metastatic bone disease. Three-dimensional quantitative structure-activity relationship (3D-QSAR) studies were carried out on quinazoline derivatives inhibiting c-Src kinase. Molecular field analysis (MFA) models with four different alignment techniques, namely GLIDE, GOLD, LIGANDFIT and least-squares-based methods, were developed. The GLIDE-based MFA model showed the best results (leave-one-out cross-validated correlation coefficient r²cv = 0.923 and non-cross-validated correlation coefficient r² = 0.958) when compared with the other models. These results help us to understand the nature of the descriptors required for the activity of these compounds and thereby provide guidelines for designing novel and potent c-Src kinase inhibitors.
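Leave-one-out cross-validation of a QSAR model predicts each compound from a model fit on the rest and summarizes the squared prediction errors as q² = 1 − PRESS/SS. A sketch with synthetic stand-ins for the descriptors and activities (a plain linear model substitutes for the MFA engine):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 5))                         # stand-in field descriptors
y = X @ rng.normal(size=5) + rng.normal(0, 0.3, 40)  # stand-in activity values

press = 0.0                                          # predictive residual sum of squares
for train, test in LeaveOneOut().split(X):
    model = LinearRegression().fit(X[train], y[train])
    press += ((y[test] - model.predict(X[test])) ** 2).item()

q2 = 1 - press / np.sum((y - y.mean()) ** 2)         # q^2 = 1 - PRESS/SS
print(f"LOO cross-validated q^2 = {q2:.3f}")
```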
Accelerated Aging in Electrolytic Capacitors for Prognostics
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Kulkarni, Chetan; Saha, Sankalita; Biswas, Gautam; Goebel, Kai Frank
2012-01-01
The focus of this work is the analysis of different degradation phenomena based on thermal overstress and electrical overstress accelerated aging systems and the use of accelerated aging techniques for prognostics algorithm development. Results of thermal overstress and electrical overstress experiments are presented. In addition, preliminary results toward the development of physics-based degradation models are presented, focusing on the electrolyte evaporation failure mechanism. An empirical degradation model based on percentage capacitance loss under electrical overstress is presented and used in: (i) a Bayesian-based implementation of model-based prognostics using a discrete Kalman filter for health state estimation, and (ii) a dynamic system representation of the degradation model for forecasting and remaining useful life (RUL) estimation. A leave-one-out validation methodology is used to assess the validity of the methodology under the small sample size constraint. The RUL estimation results are consistent across the validation tests when comparing relative accuracy and prediction error. It has been observed that the inaccuracy of the model in representing the change in degradation behavior observed at the end of the test data is consistent throughout the validation tests, indicating the need for a more detailed degradation model or the use of an algorithm that could estimate model parameters on-line. Based on the observed degradation process under different stress intensities with rest periods, the need for more sophisticated degradation models is further supported. The current degradation model does not represent the capacitance recovery over rest periods following an accelerated aging stress period.
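As an illustration of the health-state tracking step, here is a minimal discrete Kalman filter over a two-state degradation model (percentage capacitance loss and its rate), extrapolated to a failure threshold for RUL; the transition model, noise levels, synthetic measurements, and 20% end-of-life threshold are all assumptions for the sketch, not the paper's values:

```python
import numpy as np

# state x = [loss, loss_rate]; loss grows until it crosses a failure threshold
F = np.array([[1.0, 1.0], [0.0, 1.0]])      # per aging-cycle transition
H = np.array([[1.0, 0.0]])                  # only % capacitance loss is measured
Q = np.diag([1e-4, 1e-5]); R = np.array([[0.05]])

x = np.array([0.0, 0.1]); P = np.eye(2)
rng = np.random.default_rng(3)
meas = 0.12 * np.arange(60) + rng.normal(0, 0.2, 60)  # synthetic degradation data

for z in meas:                               # discrete Kalman filter
    x, P = F @ x, F @ P @ F.T + Q            # predict
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)      # update state with measurement
    P = (np.eye(2) - K @ H) @ P

threshold = 20.0                             # assumed end-of-life loss (%)
rul = max(0.0, (threshold - x[0]) / x[1])    # cycles left at estimated rate
print(f"estimated loss {x[0]:.2f} %, RUL ~ {rul:.0f} aging cycles")
```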
Validation of Land Surface Temperature from Sentinel-3
NASA Astrophysics Data System (ADS)
Ghent, D.
2017-12-01
One of the main objectives of the Sentinel-3 mission is to measure sea- and land-surface temperature with high-end accuracy and reliability in support of environmental and climate monitoring in an operational context. Calibration and validation are thus key criteria for operationalization within the framework of the Sentinel-3 Mission Performance Centre (S3MPC). Land surface temperature (LST) has a long heritage of satellite observations which have facilitated our understanding of land surface and climate change processes, such as desertification, urbanization, deforestation and land/atmosphere coupling. These observations have been acquired from a variety of satellite instruments on platforms in both low-earth orbit and geostationary orbit. Retrieval accuracy can be a challenge, though; surface emissivities can be highly variable owing to the heterogeneity of the land, and atmospheric effects caused by the presence of aerosols and by water vapour absorption can bias the underlying LST. As such, rigorous validation is critical in order to assess the quality of the data and the associated uncertainties. Validation of the level-2 SL_2_LST product, which became freely available on an operational basis on 5th July 2017, builds on an established validation protocol for satellite-based LST. This set of guidelines provides a standardized framework for structuring LST validation activities. The protocol introduces a four-pronged approach which can be summarised thus: i) in situ validation where ground-based observations are available; ii) radiance-based validation over sites that are homogeneous in emissivity; iii) intercomparison with retrievals from other satellite sensors; iv) time-series analysis to identify artefacts on an interannual time-scale. This multi-dimensional approach is a necessary requirement for assessing the performance of the LST algorithm for the Sea and Land Surface Temperature Radiometer (SLSTR), which is designed around biome-based coefficients, thus emphasizing the importance of non-traditional forms of validation such as radiance-based techniques. Here we present examples of the ongoing routine application of the protocol to operational Sentinel-3 LST data.
NASA Technical Reports Server (NTRS)
Kwon, Youngwoo; Pavlidis, Dimitris; Tutt, Marcel N.
1991-01-01
A large-signal analysis method based on a harmonic balance technique and a 2-D cubic spline interpolation function has been developed and applied to the prediction of InP-based HEMT oscillator performance for frequencies extending up to the submillimeter-wave range. The large-signal analysis method uses a limited number of DC and small-signal S-parameter data and allows the accurate characterization of HEMT large-signal behavior. The method has been validated experimentally using load-pull measurements. Oscillation frequency, power performance, and load requirements are discussed, with an operation capability of 300 GHz predicted using state-of-the-art devices (fmax ≈ 450 GHz).
NASA Astrophysics Data System (ADS)
Arai, Hiroyuki; Miyagawa, Isao; Koike, Hideki; Haseyama, Miki
We propose a novel technique for estimating the number of people in a video sequence; it has the advantages of being stable even in crowded situations and needing no ground-truth data. By quantitatively analyzing the geometrical relationships between image pixels and their intersection volumes in the real world, the method lets a foreground image directly indicate the number of people. Because foreground detection is possible even in crowded situations, the proposed method can be applied in such situations. Moreover, it estimates the number of people in an a priori manner, so it needs no ground-truth data, unlike existing feature-based estimation techniques. Experiments show the validity of the proposed method.
NASA Astrophysics Data System (ADS)
Mohd Salleh, Khairul Anuar; Rahman, Mohd Fitri Abdul; Lee, Hyoung Koo; Al Dahhan, Muthanna H.
2014-06-01
Local liquid velocity measurements in Trickle Bed Reactors (TBRs) are one of the essential components of their hydrodynamic studies. These measurements are used to effectively determine a reactor's operating condition. This study was conducted to validate a newly developed technique that combines Digital Industrial Radiography (DIR) with Particle Tracking Velocimetry (PTV) to measure the Local Liquid Velocity (VLL) inside TBRs. Three-millimeter Expanded Polystyrene (EPS) beads were used as packing material. Three validation procedures were designed to test the newly developed technique. All procedures and statistical approaches provided strong evidence that the technique can be used to measure the VLL within TBRs.
Le Bihan, Nicolas; Margerin, Ludovic
2009-07-01
In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity of waves transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using compound Poisson processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.
ERIC Educational Resources Information Center
Sayre, Scott Alan
The purpose of this study was to develop and validate a computer-based system that would allow interactive video developers to integrate and manage the design components prior to production. These components of an interactive video (IVD) program include visual information in a variety of formats, audio information, and instructional techniques,…
NASA Technical Reports Server (NTRS)
Stankovic, Ana V.
2003-01-01
Professor Stankovic will be developing and refining Simulink-based models of the PM alternator and comparing the simulation results with experimental measurements taken from the unit. Her first task is to validate the models using the experimental data. Her next task is to develop alternative control techniques for the application of the Brayton Cycle PM Alternator in a nuclear electric propulsion vehicle. The control techniques will first be simulated using the validated models and then tried experimentally with hardware available at NASA. Testing and simulation of a 2 kW PM synchronous generator with diode bridge output is described. The parameters of a synchronous PM generator have been measured and used in simulation. Test procedures have been developed to verify the PM generator model with diode bridge output. Experimental and simulation results are in excellent agreement.
A campaign to end animal testing: introducing the PETA International Science Consortium Ltd.
Stoddart, Gilly; Brown, Jeffrey
2014-12-01
The successful development and validation of non-animal techniques, or the analysis of existing data to satisfy regulatory requirements, provide no guarantee that this information will be used in place of animal experiments. In order to advocate for the replacement of animal-based testing requirements, the PETA International Science Consortium Ltd (PISC) liaises with industry, regulatory and research agencies to establish and promote clear paths to validation and regulatory use of non-animal techniques. PISC and its members use an approach that identifies, promotes and verifies the implementation of good scientific practices in place of testing on animals. Examples of how PISC and its members have applied this approach to minimise the use of animals for the Registration, Evaluation, Authorisation and Restriction of Chemicals regulation in the EU and testing of cosmetics on animals in India, are described. 2014 FRAME.
Developing material for promoting problem-solving ability through bar modeling technique
NASA Astrophysics Data System (ADS)
Widyasari, N.; Rosiyanti, H.
2018-01-01
This study aimed at developing material for enhancing problem-solving ability through the bar modeling technique with thematic learning. Polya's steps of problem-solving were chosen as the basis of the study. The method of the study was research and development. The subjects of this study were fifteen students of the fifth grade of the Lab-school FIP UMJ elementary school. Expert review and analysis of students' responses were used to collect the data. Furthermore, the data were analyzed qualitatively (descriptively) and quantitatively. The findings showed that the material in the theme "Selalu Berhemat Energi" was categorized as valid and practical. Validity was measured using the aspects of language, contents, and graphics. Based on the expert comments, the materials were easy to implement in the teaching-learning process. In addition, the students' responses showed that the material was both interesting and easy to understand. Thus, students gained more understanding in learning problem-solving.
Dry Socket Etiology, Diagnosis, and Clinical Treatment Techniques.
Mamoun, John
2018-04-01
Dry socket, also termed fibrinolytic osteitis or alveolar osteitis, is a complication of tooth exodontia. A dry socket lesion is a post-extraction socket that exhibits exposed bone that is not covered by a blood clot or healing epithelium and exists inside or around the perimeter of the socket or alveolus for days after the extraction procedure. This article describes dry socket lesions; reviews the basic clinical techniques of treating different manifestations of dry socket lesions; and shows how microscope level loupe magnification of 6× to 8× or greater, combined with co-axial illumination or a dental operating microscope, facilitate more precise treatment of dry socket lesions. The author examines the scientific validity of the proposed causes of dry socket lesions (such as bacteria, inflammation, fibrinolysis, or traumatic extractions) and the scientific validity of different terminologies used to describe dry socket lesions. This article also presents an alternative model of what causes dry socket lesions, based on evidence from dental literature. Although the clinical techniques for treating dry socket lesions seem empirically correct, more evidence is required to determine the causes of dry socket lesions.
PMID:29732309
Development and Validation of the Negative Attitudes towards CBT Scale.
Parker, Zachary J; Waller, Glenn
2017-11-01
Clinicians commonly fail to use cognitive behavioural therapy (CBT) adequately, but the reasons for such omissions are not well understood. The objective of this study was to create and validate a measure to assess clinicians' attitudes towards CBT - the Negative Attitudes towards CBT Scale (NACS). The participants were 204 clinicians from various mental healthcare fields. Each completed the NACS, measures of anxiety and self-esteem, and a measure of therapists' use of CBT and non-CBT techniques and their confidence in using those techniques. Exploratory factor analysis was used to determine the factor structure of the NACS, and scale internal consistency was tested. A single, 16-item scale emerged from the factor analysis of the NACS, and that scale had good internal consistency. Clinicians' negative attitudes and their anxiety had different patterns of association with the use of CBT and other therapeutic techniques. The findings suggest that clinicians' attitudes and emotions each need to be considered when understanding why many clinicians fail to deliver the optimum version of evidence-based CBT. They also suggest that training effective CBT clinicians might depend on understanding and targeting such internal states.
NASA Astrophysics Data System (ADS)
Ng, Theam Foo; Pham, Tuan D.; Zhou, Xiaobo
2010-01-01
With the fast development of multi-dimensional data compression and pattern classification techniques, vector quantization (VQ) has become a technique that allows large reductions in data storage and computational effort. One of the most recent VQ techniques that handles the poor estimation of vector centroids due to biased data from undersampling is fuzzy declustering-based vector quantization (FDVQ). In this paper, we are therefore motivated to propose a justification of an FDVQ-based hidden Markov model (HMM) by investigating its effectiveness and efficiency in the classification of genotype-image phenotypes. A performance evaluation and comparison of the recognition accuracy between the proposed FDVQ-based HMM (FDVQ-HMM) and the well-known LBG (Linde, Buzo, Gray) vector quantization based HMM (LBG-HMM) is carried out. The experimental results show that the performances of FDVQ-HMM and LBG-HMM are similar. Finally, we justify the competitiveness of FDVQ-HMM in the classification of a cellular phenotype image database by using hypothesis t-tests. As a result, we validate that the FDVQ algorithm is a robust and efficient classification technique in the application of RNAi genome-wide screening image data.
Fusion and Gaussian mixture based classifiers for SONAR data
NASA Astrophysics Data System (ADS)
Kotari, Vikas; Chang, KC
2011-06-01
Underwater mines are inexpensive and highly effective weapons. They are difficult to detect and classify. Hence, detection and classification of underwater mines is essential for the safety of naval vessels, which necessitates the formulation of highly efficient classifiers and detection techniques. Current techniques primarily focus on signals from one source, whereas data fusion is known to increase the accuracy of detection and classification. In this paper, we formulate a fusion-based classifier and a Gaussian mixture model (GMM) based classifier for the classification of underwater mines. The emphasis is on sound navigation and ranging (SONAR) signals due to their extensive use in current naval operations. The classifiers have been tested on real SONAR data obtained from the University of California Irvine (UCI) repository. The performance of both the GMM-based and fusion-based classifiers clearly demonstrates their superior classification accuracy over conventional single-source cases and validates our approach.
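A GMM-based classifier of this kind can be built by fitting one mixture per class and assigning a test signal to the class with the highest posterior log-likelihood. A sketch with synthetic 60-dimensional features standing in for the UCI sonar data; the component count and diagonal covariances are assumptions, not the paper's settings:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_gmm_classifier(X, y, n_components=2):
    # one Gaussian mixture per class, plus empirical class priors
    models, priors = {}, {}
    for c in np.unique(y):
        models[c] = GaussianMixture(n_components, covariance_type="diag",
                                    random_state=0).fit(X[y == c])
        priors[c] = np.mean(y == c)
    return models, priors

def predict(models, priors, X):
    # classify by per-class log-likelihood plus log prior
    classes = sorted(models)
    scores = np.column_stack([models[c].score_samples(X) + np.log(priors[c])
                              for c in classes])
    return np.array(classes)[scores.argmax(axis=1)]

rng = np.random.default_rng(4)               # synthetic stand-in for UCI sonar
X = np.vstack([rng.normal(0, 1, (100, 60)), rng.normal(0.5, 1, (100, 60))])
y = np.array([0] * 100 + [1] * 100)          # 0 = rock, 1 = mine (illustrative)
models, priors = fit_gmm_classifier(X, y)
print("train accuracy:", np.mean(predict(models, priors, X) == y))
```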
NASA Astrophysics Data System (ADS)
Prakash, Satya; Mahesh, C.; Gairola, Rakesh M.
2011-12-01
Large-scale precipitation estimation is very important for climate science because precipitation is a major component of the earth's water and energy cycles. In the present study, the GOES precipitation index technique has been applied to three-hourly Kalpana-1 satellite infrared (IR) images (0000, 0300, 0600, ..., 2100 UTC) for rainfall estimation, as a precursor to INSAT-3D. Once the temperatures of all the pixels in a grid are known, they are binned to generate a three-hourly 24-class histogram of brightness temperatures from the IR (10.5-12.5 μm) images for each 1.0° × 1.0° latitude/longitude box. Daily, monthly, and seasonal rainfall have been estimated from these three-hourly rain estimates for the entire south-west monsoon period of 2009. To investigate the potential of these rainfall estimates, validation of the monthly and seasonal estimates has been carried out using the Global Precipitation Climatology Project and Global Precipitation Climatology Centre data. The validation results show that the present technique works very well for large-scale precipitation estimation, qualitatively as well as quantitatively. The results also suggest that this simple IR-based estimation technique can be used to estimate rainfall over tropical areas at larger temporal scales for climatological applications.
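In its classic Arkin-style form, the GOES precipitation index assigns a fixed conditional rain rate (about 3 mm/h) to the fraction of IR pixels colder than a threshold (typically 235 K) in each grid box and accumulates over the period; the paper's exact coefficients are not reproduced above, so the constants in this sketch are assumptions:

```python
import numpy as np

def gpi_rain(tb_grid, hours=3.0, threshold_k=235.0, rate_mm_per_h=3.0):
    # GPI: rain (mm) = rate * fraction of pixels colder than threshold * hours
    frac_cold = np.mean(np.asarray(tb_grid) < threshold_k)
    return rate_mm_per_h * frac_cold * hours

rng = np.random.default_rng(5)
tb = rng.normal(260, 20, (50, 50))   # synthetic 10.5-12.5 um Tb field (K) for one box
print(f"3-hourly GPI estimate: {gpi_rain(tb):.2f} mm")
```

Daily, monthly, and seasonal totals then follow by summing the three-hourly estimates.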
NASA Technical Reports Server (NTRS)
Cramer, J. M.; Pal, S.; Marshall, W. M.; Santoro, R. J.
2003-01-01
Contents include the following: 1. Motivation. Support NASA's 3rd-generation launch vehicle technology program; RBCC is a promising candidate for a 3rd-generation propulsion system. 2. Approach. Focus on ejector mode performance (Mach 0-3); perform testing on an established flowpath geometry; use conventional propulsion measurement techniques; use advanced optical diagnostic techniques to measure local combustion gas properties. 3. Objectives. Gain physical understanding of detailed mixing and combustion phenomena; establish an experimental data set for CFD code development and validation.
Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro
2015-06-15
Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.
Nallasivam, Ulaganathan; Shah, Vishesh H.; Shenvi, Anirudh A.; ...
2016-02-10
We present a general Global Minimization Algorithm (GMA) to identify basic or thermally coupled distillation configurations that require the least vapor duty under minimum reflux conditions for separating any ideal or near-ideal multicomponent mixture into a desired number of product streams. In this algorithm, global optimality is guaranteed by modeling the system using Underwood equations and reformulating the resulting constraints to bilinear inequalities. The speed of convergence to the globally optimal solution is increased by using appropriate feasibility- and optimality-based variable-range reduction techniques and by developing valid inequalities. As a result, the GMA can be coupled with already developed techniques that enumerate basic and thermally coupled distillation configurations, to provide for the first time a global optimization based rank-list of distillation configurations.
Khoury, Joseph D; Wang, Wei-Lien; Prieto, Victor G; Medeiros, L Jeffrey; Kalhor, Neda; Hameed, Meera; Broaddus, Russell; Hamilton, Stanley R
2018-02-01
Biomarkers that guide therapy selection are gaining unprecedented importance as targeted therapy options increase in scope and complexity. In conjunction with high-throughput molecular techniques, therapy-guiding biomarker assays based upon immunohistochemistry (IHC) have a critical role in cancer care in that they inform about the expression status of a protein target. Here, we describe the validation procedures for four clinical IHC biomarker assays (PTEN, RB, MLH1, and MSH2) for use as integral biomarkers in the nationwide NCI-Molecular Analysis for Therapy Choice (NCI-MATCH) EAY131 clinical trial. Validation procedures were developed through an iterative process based on collective experience and adaptation of broad guidelines from the FDA. The steps included primary antibody selection; assay optimization; development of assay interpretation criteria incorporating biological considerations and expected staining patterns, including indeterminate results; orthogonal validation; and tissue validation. Following assay lockdown, patient samples and cell lines were used for analytic and clinical validation. The assays were then approved as laboratory-developed tests and used for clinical trial decisions for treatment selection. Calculations of sensitivity and specificity were undertaken using various definitions of gold-standard references, and external validation was required for the PTEN IHC assay. In conclusion, validation of IHC biomarker assays critical for guiding therapy in clinical trials is feasible using comprehensive preanalytic, analytic, and postanalytic steps. Implementation of standardized guidelines provides a useful framework for validating IHC biomarker assays that allow for reproducibility across institutions for routine clinical use. Clin Cancer Res; 24(3); 521-31. ©2017 AACR.
Expert system verification and validation survey, delivery 4
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
Current progress in patient-specific modeling
2010-01-01
We present a survey of recent advancements in the emerging field of patient-specific modeling (PSM). Researchers in this field are currently simulating a wide variety of tissue and organ dynamics to address challenges in various clinical domains. The majority of this research employs three-dimensional, image-based modeling techniques. Recent PSM publications mostly represent feasibility or preliminary validation studies on modeling technologies, and these systems will require further clinical validation and usability testing before they can become a standard of care. We anticipate that with further testing and research, PSM-derived technologies will eventually become valuable, versatile clinical tools. PMID:19955236
Expert system verification and validation survey. Delivery 2: Survey results
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and industry applications. This is the first task of the series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
Flight test techniques for validating simulated nuclear electromagnetic pulse aircraft responses
NASA Technical Reports Server (NTRS)
Winebarger, R. M.; Neely, W. R., Jr.
1984-01-01
An attempt has been made to determine the effects of nuclear EM pulses (NEMPs) on aircraft systems, using a highly instrumented NASA F-106B to document the simulated NEMP environment at the Kirtland Air Force Base's Vertically Polarized Dipole test facility. Several test positions were selected so that aircraft orientation relative to the test facility would be the same in flight as when on the stationary dielectric stand, in order to validate the dielectric stand's use in flight configuration simulations. Attention is given to the flight test portions of the documentation program.
Expert system verification and validation survey. Delivery 5: Revised
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
Expert system verification and validation survey. Delivery 3: Recommendations
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
ERIC Educational Resources Information Center
Long, Haiying
2012-01-01
As one of the most widely used creativity assessment tools, the Consensual Assessment Technique (CAT) has been praised as a valid tool to assess creativity. In Amabile's (1982) seminal work, the inter-rater reliability was defined as construct validity of the CAT. During the past three decades, researchers followed this definition and…
Curtis, Andrew R; Palin, William M; Fleming, Garry J P; Shortall, Adrian C C; Marquis, Peter M
2009-02-01
To assess the mechanical properties of discrete filler particles representative of several inorganic fillers in modern dental resin-based composites (RBCs) and to assess the validity of a novel micromanipulation technique. RBCs with microhybrid (Filtek Z250), 'nanohybrid' (Grandio) and 'nanofilled' (Filtek Supreme) filler particle morphologies were investigated. Filler particles were provided by the manufacturer or separated from the unpolymerized resin using a dissolution technique. Filler particles (n=30) were subjected to compression between a descending glass probe and a glass slide using a micromanipulation technique. The number of distinct fractures the particles underwent was determined from force/displacement and stress/deformation curves, and the force at fracture and a pseudo-modulus of stress were calculated. Agglomerated fillers ('nanoclusters') exhibited up to four distinct fractures, while spheroidal and irregular particles underwent either a single fracture or did not fracture following micromanipulation. Z-tests showed that this failure behaviour of the nanoclusters was significantly different from that of the spheroidal and irregular particles (P<0.05). The mean force at first fracture of the nanoclusters was greater (1702 ± 909 μN) than that of the spheroidal and irregular particles (1389 ± 1342 and 1356 ± 1093 μN, respectively). Likewise, the initial pseudo-modulus of stress of the nanoclusters (797 ± 555 MPa) was also greater than that of the spheroidal (587 ± 439 MPa) or irregular (552 ± 275 MPa) fillers. The validity of employing the micromanipulation technique to determine the mechanical properties of filler particulates was established. The 'nanoclusters' exhibited a greater tendency to multiple fractures compared with conventional fillers and possessed comparatively higher variability in pseudo-modulus and in load prior to and at fracture, which may modify the damage tolerance of the overall RBC system.
Kumar, Y Kiran; Mehta, Shashi Bhushan; Ramachandra, Manjunath
2017-01-01
The purpose of this work is to provide validation methods for evaluating the hemodynamic assessment of Cerebral Arteriovenous Malformation (CAVM). This article emphasizes the importance of validating noninvasive measurements for CAVM patients, which are designed using lumped models for the complex vessel structure. The validation of the hemodynamic assessment is based on invasive clinical measurements and cross-validation techniques with the Philips proprietary validated software Qflow and 2D Perfusion. The modeling results are validated for 30 CAVM patients at 150 vessel locations. Mean flow, diameter, and pressure were compared between the modeling results and the clinical/cross-validation measurements using an independent two-tailed Student t test. Exponential regression analysis was used to assess the relationships among blood flow, vessel diameter, and pressure. Univariate analyses of the relationships between vessel diameter, vessel cross-sectional area, AVM volume, AVM pressure, and AVM flow were performed with linear or exponential regression. Modeling results were compared with clinical measurements from vessel locations of cerebral regions, and the model was cross-validated with the Philips proprietary validated software Qflow and 2D Perfusion. Our results show that the modeling and clinical results match closely, with only small deviations. In this article, we have validated our modeling results with clinical measurements, and a new approach for cross-validation is proposed by demonstrating the accuracy of our results against a validated product in a clinical environment.
Evidence-based dentistry: analysis of dental anxiety scales for children.
Al-Namankany, A; de Souza, M; Ashley, P
2012-03-09
To review paediatric dental anxiety measures (DAMs) and assess the statistical methods used for their validation, together with the clinical implications. Four computerised databases were searched between 1960 and January 2011 using pre-specified search terms to assess the method of validation, including reliability as intra-observer agreement ('repeatability' or 'stability') and inter-observer agreement ('reproducibility'), and all types of validity. Fourteen paediatric DAMs were validated predominantly in schools rather than in the clinical setting, while five of the DAMs were not validated at all. The DAMs that were validated were assessed against other paediatric DAMs that may not themselves have been validated previously. Reliability was not assessed for four of the DAMs; where it was assessed, it was usually 'good' or 'acceptable'. None of the current DAMs used a formal sample size technique. Diversity was seen between the studies, ranging from a few simple pictograms to lists of questions reported by either the individual or an observer. To date there is no scale that can be considered a gold standard, and there is a need to develop an anxiety scale with a cognitive component for children and adolescents.
Predicting implementation from organizational readiness for change: a study protocol
2011-01-01
Background There is widespread interest in measuring organizational readiness to implement evidence-based practices in clinical care. However, there are a number of challenges to validating organizational measures, including inferential bias arising from the halo effect and method bias - two threats to validity that, while well-documented by organizational scholars, are often ignored in health services research. We describe a protocol to comprehensively assess the psychometric properties of a previously developed survey, the Organizational Readiness to Change Assessment. Objectives Our objective is to conduct a comprehensive assessment of the psychometric properties of the Organizational Readiness to Change Assessment, incorporating methods specifically designed to address threats from the halo effect and method bias. Methods and Design We will conduct three sets of analyses using longitudinal, secondary data from four partner projects, each testing interventions to improve the implementation of an evidence-based clinical practice. Partner projects field the Organizational Readiness to Change Assessment at baseline (n = 208 respondents; 53 facilities) and prospectively assess the degree to which the evidence-based practice is implemented. We will assess predictive and concurrent validity using hierarchical linear modeling and multivariate regression, respectively. For predictive validity, the outcome is the change from baseline to follow-up in the use of the evidence-based practice. We will use intra-class correlations derived from hierarchical linear models to assess inter-rater reliability. Two partner projects will also field measures of job satisfaction for convergent and discriminant validity analyses, and will field Organizational Readiness to Change Assessment measures at follow-up for concurrent validity (n = 158 respondents; 33 facilities). Convergent and discriminant validity will be tested through associations between organizational readiness and different aspects of job satisfaction: satisfaction with leadership, which should be highly correlated with readiness, versus satisfaction with salary, which should be less correlated with readiness. Content validity will be assessed using an expert panel and a modified Delphi technique. Discussion We propose a comprehensive protocol for validating a survey instrument for assessing organizational readiness to change that specifically addresses key threats of bias related to the halo effect, method bias, and questions of construct validity that often go unexplored in research using measures of organizational constructs. PMID:21777479
NASA Astrophysics Data System (ADS)
San-Blas, A. A.; Roca, J. M.; Cogollos, S.; Morro, J. V.; Boria, V. E.; Gimeno, B.
2016-06-01
In this work, a full-wave tool for the accurate analysis and design of compensated E-plane multiport junctions is proposed. The implemented tool is capable of evaluating the undesired effects related to the use of low-cost manufacturing techniques, which are mostly due to the introduction of rounded corners in the cross section of the rectangular waveguides of the device. The obtained results show that, even when stringent mechanical constraints are imposed, it is possible to compensate for the impact of these low-cost manufacturing techniques by redesigning the matching elements of the original device. Several new designs for a great variety of E-plane components (such as right-angled bends, T-junctions and magic-Ts) are presented, and useful design guidelines are provided. The implemented tool, which is mainly based on the boundary integral-resonant mode expansion technique, has been successfully validated by comparing its results with simulated data from commercial software based on the finite element method.
Parameterizing unresolved obstacles with source terms in wave modeling: A real-world application
NASA Astrophysics Data System (ADS)
Mentaschi, Lorenzo; Kakoulaki, Georgia; Vousdoukas, Michalis; Voukouvalas, Evangelos; Feyen, Luc; Besio, Giovanni
2018-06-01
Parameterizing the dissipative effects of small, unresolved coastal features is fundamental to improving the skill of wave models. The established technique for dealing with this problem consists of reducing the amount of energy advected within the propagation scheme, and is currently available only for regular grids. To find a more general approach, Mentaschi et al. (2015b) formulated a technique based on source terms and validated it on synthetic case studies. This technique separates the parameterization of the unresolved features from the energy advection, and can therefore be applied to any numerical scheme and any type of mesh. Here we developed an open-source library for estimating the transparency coefficients needed by this approach from bathymetric data, for any type of mesh. The spectral wave model WAVEWATCH III was used to show that in a real-world domain, such as the Caribbean Sea, the proposed approach has skill comparable to, and sometimes better than, that of the established propagation-based technique.
Towards a balanced software team formation based on Belbin team role using fuzzy technique
NASA Astrophysics Data System (ADS)
Omar, Mazni; Hasan, Bikhtiyar; Ahmad, Mazida; Yasin, Azman; Baharom, Fauziah; Mohd, Haslina; Darus, Norida Muhd
2016-08-01
In software engineering (SE), team roles have a significant impact on project success. To ensure the optimal outcome of a project, it is essential that team members are assigned to the right roles with the right characteristics. One of the prevalent team role models is the Belbin team role model, and a successful team must have a balance of team roles. This study therefore demonstrates the steps taken to determine the balance of a software team formation based on Belbin team roles using a fuzzy technique. The fuzzy technique was chosen because it allows the analysis of imprecise data and the classification of selected criteria. In this study, two Belbin team roles, Shaper (Sh) and Plant (Pl), were chosen for assigning specific roles in the software team. Results show that the technique can be used to determine the balance of team roles. Future work will focus on validating the proposed method using empirical data in an industrial setting.
Mozer, M C; Wolniewicz, R; Grimes, D B; Johnson, E; Kaushansky, H
2000-01-01
Competition in the wireless telecommunications industry is fierce. To maintain profitability, wireless carriers must control churn, the loss of subscribers who switch from one carrier to another. We explore techniques from statistical machine learning to predict churn and, based on these predictions, to determine what incentives should be offered to subscribers to improve retention and maximize profitability for the carrier. The techniques include logit regression, decision trees, neural networks, and boosting. Our experiments are based on a database of nearly 47,000 U.S. domestic subscribers and include information about their usage, billing, credit, application, and complaint history. The experiments show that under a wide variety of assumptions concerning the cost of intervention and the retention rate resulting from intervention, using predictive techniques to identify potential churners and offering incentives can yield significant savings to a carrier. We also show the importance of a data representation crafted by domain experts. Finally, we report on a real-world test of the techniques that validates our simulation experiments.
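A hedged sketch of two of the model families named above (logit regression and boosting), using scikit-learn; the feature set and the synthetic churn-generating process are assumptions for illustration, not the paper's 47,000-subscriber dataset.

```python
# Illustrative churn prediction with logistic regression and boosting.
# Features and data are synthetic stand-ins for usage/billing/complaints.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 5000
X = np.column_stack([
    rng.gamma(2.0, 150.0, n),   # monthly usage minutes
    rng.normal(55.0, 20.0, n),  # monthly bill ($)
    rng.poisson(0.3, n),        # complaints filed
])
logit = 0.8 * X[:, 2] + 0.02 * (X[:, 1] - 55.0) - 1.5  # assumed churn drivers
churn = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, churn, test_size=0.3, random_state=0)
for model in (LogisticRegression(max_iter=1000), GradientBoostingClassifier()):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(type(model).__name__, f"AUC = {auc:.3f}")
# Subscribers with the highest predicted churn probability would be the
# ones targeted with retention incentives in the paper's cost analysis.
```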
Content validation of the 'Mosaic of Opinions About Abortion' (Mosai).
Cacique, Denis Barbosa; Passini Junior, Renato; Osis, Maria José Martins Duarte
2013-01-01
This study aimed to develop and validate the contents of the Mosaico de Opiniões Sobre o Aborto Induzido (Mosai), a structured questionnaire intended to be used as a tool to collect information about the views of health professionals on the morality of abortion. The contents of the first version of the questionnaire were developed based on thematic content analysis of books, articles, films, websites and newspaper reports of abortion cases and arguments about its practice. The Mosai was composed of 6 moral dilemmas (vignettes) related to induced abortion, whose outcomes were to be chosen by respondents and could be justified by choosing among 15 patterns of arguments about the morality of abortion. To validate its contents, the questionnaire was submitted to the scrutiny of a panel of 12 experts, an intentional sample consisting of doctors, lawyers, ethicists, sociologists, nurses and statisticians, who evaluated the criteria of clarity of writing, relevance, appropriateness to the sample and suitability to the fields. These scores were analyzed by the concordance rate method, while the free comments were analyzed using the content analysis technique. All the moral dilemmas and arguments were considered valid according to the rate of agreement; however, some comments led to the exclusion of a dilemma about emergency contraception, among other changes. The content of Mosai was considered valid to serve as a tool to collect the opinions of healthcare professionals regarding the morality of abortion. Copyright © 2013 Elsevier Editora Ltda. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Canhai; Xu, Zhijie; Pan, Wenxiao
2016-01-01
To quantify the predictive confidence of a solid sorbent-based carbon capture design, a hierarchical validation methodology, consisting of basic unit problems of increasing physical complexity coupled with filtered model-based geometric upscaling, has been developed and implemented. This paper describes the computational fluid dynamics (CFD) multi-phase reactive flow simulations and the associated data flows among the different unit problems performed within this hierarchical validation approach. The bench-top experiments used in this calibration and validation effort were carefully designed to follow the desired simple-to-complex unit problem hierarchy, with corresponding data acquisition to support model parameter calibration at each unit problem level. A Bayesian calibration procedure is employed, and the posterior model parameter distributions obtained at one unit-problem level are used as prior distributions for the same parameters in the next-tier simulations. Overall, the results have demonstrated that the multiphase reactive flow models within MFIX can be used to capture the bed pressure, temperature, CO2 capture capacity, and kinetics with quantitative accuracy. The CFD modeling methodology and associated uncertainty quantification techniques presented herein offer a solid framework for estimating the predictive confidence in the virtual scale-up of a larger carbon capture device.
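The posterior-becomes-prior idea at the heart of this hierarchy can be shown in a few lines. The following numpy sketch is a deliberately simplified, one-parameter, grid-based caricature under assumed Gaussian likelihoods; the "observations" and the parameter grid are illustrative, not the MFIX calibration.

```python
# Sequential Bayesian updating: the posterior from one unit-problem
# level becomes the prior at the next level. Purely illustrative.
import numpy as np

theta = np.linspace(0.0, 2.0, 401)   # grid over a hypothetical kinetic parameter
dtheta = theta[1] - theta[0]
prior = np.ones_like(theta)          # flat prior at the simplest unit problem
prior /= prior.sum() * dtheta

def update(prior, obs, sigma):
    """One Bayes update on the grid with an assumed Gaussian likelihood."""
    like = np.exp(-0.5 * ((obs - theta) / sigma) ** 2)
    post = prior * like
    return post / (post.sum() * dtheta)

# Each tier contributes data; uncertainty shrinks as complexity grows.
for obs, sigma in [(1.10, 0.30), (0.95, 0.20), (1.02, 0.10)]:
    prior = update(prior, obs, sigma)

print("posterior mean:", (theta * prior).sum() * dtheta)
```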
Development of a Three-Tier Test to Assess Misconceptions about Simple Electric Circuits
ERIC Educational Resources Information Center
Pesman, Haki; Eryilmaz, Ali
2010-01-01
The authors aimed to propose a valid and reliable diagnostic instrument by developing a three-tier test on simple electric circuits. Based on findings from the interviews, open-ended questions, and the related literature, the test was developed and administered to 124 high school students. In addition to some qualitative techniques for…
A Comparison of Laser and Video Techniques for Determining Displacement and Velocity during Running
ERIC Educational Resources Information Center
Harrison, Andrew J.; Jensen, Randall L.; Donoghue, Orna
2005-01-01
The reliability of a laser system was compared with the reliability of a video-based kinematic analysis in measuring displacement and velocity during running. Validity and reliability of the laser on static measures was also assessed at distances between 10 m and 70 m by evaluating the coefficient of variation and intraclass correlation…
ERIC Educational Resources Information Center
Révész, Andrea; Sachs, Rebecca; Hama, Mika
2014-01-01
This investigation examined two techniques that may help learners focus on second language (L2) constructions when recasts are provided during meaning-based communicative activities: altering the cognitive complexity of tasks and manipulating the input frequency distributions of target constructions. We first independently assessed the validity of…
NASA Astrophysics Data System (ADS)
Petsev, Nikolai D.; Leal, L. Gary; Shell, M. Scott
2017-12-01
Hybrid molecular-continuum simulation techniques afford a number of advantages for problems in the rapidly burgeoning area of nanoscale engineering and technology, though they are typically quite complex to implement and limited to single-component fluid systems. We describe an approach for modeling multicomponent hydrodynamic problems spanning multiple length scales when using particle-based descriptions for both the finely resolved (e.g., molecular dynamics) and coarse-grained (e.g., continuum) subregions within an overall simulation domain. This technique is based on the multiscale methodology previously developed for mesoscale binary fluids [N. D. Petsev, L. G. Leal, and M. S. Shell, J. Chem. Phys. 144, 084115 (2016)], simulated using a particle-based continuum method known as smoothed dissipative particle dynamics. An important application of this approach is the ability to perform coupled molecular dynamics (MD) and continuum modeling of molecularly miscible binary mixtures. In order to validate this technique, we investigate multicomponent hybrid MD-continuum simulations at equilibrium, as well as non-equilibrium cases featuring concentration gradients.
Stripe-PZT Sensor-Based Baseline-Free Crack Diagnosis in a Structure with a Welded Stiffener.
An, Yun-Kyu; Shen, Zhiqi; Wu, Zhishen
2016-09-16
This paper proposes a stripe-PZT sensor-based baseline-free crack diagnosis technique for the heat affected zone (HAZ) of a structure with a welded stiffener. The proposed technique enables one to identify and localize a crack in the HAZ using only current data measured by a stripe-PZT sensor. The use of the stripe-PZT sensor makes it possible to significantly improve applicability to real structures and to minimize man-made errors associated with the installation process by embedding multiple piezoelectric sensors onto a printed circuit board. Moreover, a new frequency-wavenumber analysis-based baseline-free crack diagnosis algorithm minimizes false alarms caused by environmental variations by avoiding simple comparison with baseline data accumulated from the pristine condition of a target structure. The proposed technique is numerically as well as experimentally validated using a plate-like structure with a welded stiffener, revealing that it successfully identifies and localizes a crack in the HAZ.
Training and certification in endobronchial ultrasound-guided transbronchial needle aspiration
Konge, Lars; Nayahangan, Leizl Joy; Clementsen, Paul Frost
2017-01-01
Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) plays a key role in the staging of lung cancer, which is crucial for allocation to surgical treatment. EBUS-TBNA is a complicated procedure, and simulation-based training is helpful in the first part of the long learning curve prior to performing the procedure on actual patients. New trainees should follow a structured training programme consisting of training on simulators to proficiency, as assessed with a validated test, followed by supervised practice on patients. Simulation-based training is superior to the traditional apprenticeship model and is recommended in the newest guidelines. EBUS-TBNA and oesophageal ultrasound-guided fine needle aspiration (EUS-FNA or EUS-B-FNA) are complementary, and the combined techniques are superior to either technique alone. It is logical to learn and perform the two techniques in combination; however, for lung cancer staging only EBUS-TBNA simulators currently exist, though hopefully simulation-based training in EUS will become possible in the future. PMID:28840013
CFD Techniques for Propulsion Applications
NASA Technical Reports Server (NTRS)
1992-01-01
The symposium was composed of the following sessions: turbomachinery computations and validations; flow in ducts, intakes, and nozzles; and reacting flows. Forty papers were presented, and they covered full 3-D code validation and numerical techniques; multidimensional reacting flow; and unsteady viscous flow for the entire spectrum of propulsion system components. The capabilities of the various numerical techniques were assessed and significant new developments were identified. The technical evaluation spells out where progress has been made and concludes that the present state of the art has almost reached the level necessary to tackle the comprehensive topic of computational fluid dynamics (CFD) validation for propulsion.
Nurses' knowledge of inhaler technique in the inpatient hospital setting.
De Tratto, Katie; Gomez, Christy; Ryan, Catherine J; Bracken, Nina; Steffen, Alana; Corbridge, Susan J
2014-01-01
High rates of inhaler misuse in patients with chronic obstructive pulmonary disease and asthma contribute to hospital readmissions and increased healthcare cost. The purpose of this study was to examine inpatient staff nurses' self-perception of their knowledge of proper inhaler technique compared with demonstrated technique and frequency of providing patients with inhaler technique teaching during hospitalization and at discharge. A prospective, descriptive study. A 495-bed urban academic medical center in the Midwest United States. A convenience sample of 100 nurses working on inpatient medical units. Participants completed a 5-item, 4-point Likert-scale survey evaluating self-perception of inhaler technique knowledge, frequency of providing patient education, and responsibility for providing education. Participants demonstrated inhaler technique to the investigators using both a metered dose inhaler (MDI) and Diskus device inhaler, and performance was measured via a validated checklist. Overall misuse rates were high for both MDI and Diskus devices. There was poor correlation between perceived ability and investigator-measured performance of inhaler technique. Frequency of education during hospitalization and at discharge was related to measured level of performance for the Diskus device but not for the MDI. Nurses are a key component of patient education in the hospital; however, nursing staff lack adequate knowledge of inhaler technique. Identifying gaps in nursing knowledge regarding proper inhaler technique and patient education about proper inhaler technique is important to design interventions that may positively impact patient outcomes. Interventions could include one-on-one education, Web-based education, unit-based education, or hospital-wide competency-based education. All should include return demonstration of appropriate technique.
Iglesias-Parra, Maria Rosa; García-Guerrero, Alfonso; García-Mayor, Silvia; Kaknani-Uttumchandani, Shakira; León-Campos, Álvaro; Morales-Asencio, José Miguel
2015-07-01
To develop an evaluation system of clinical competencies for the practicum of nursing students based on the Nursing Interventions Classification (NIC). Psychometric validation study: the first two phases addressed definition and content validation, and the third phase consisted of a cross-sectional study analyzing reliability. The study population was undergraduate nursing students and clinical tutors. Through the Delphi technique, 26 competencies and 91 interventions were isolated. Cronbach's α was 0.96. Factor analysis yielded 18 factors that explained 68.82% of the variance. The overall inter-item correlation was 0.26, and item-total correlations ranged between 0.19 and 0.66. A competency system for the nursing practicum, structured on the NIC, is a reliable method for assessing and evaluating clinical competencies. Further evaluations in other contexts are needed. The availability of standardized language systems in the nursing discipline provides an ideal framework for developing nursing curricula. © 2015 Sigma Theta Tau International.
Fractal Clustering and Knowledge-driven Validation Assessment for Gene Expression Profiling.
Wang, Lu-Yong; Balasubramanian, Ammaiappan; Chakraborty, Amit; Comaniciu, Dorin
2005-01-01
DNA microarray experiments generate a substantial amount of information about global gene expression. Gene expression profiles can be represented as points in multi-dimensional space, and it is essential to identify relevant groups of genes in biomedical research. Clustering is helpful for pattern recognition in gene expression profiles, and a number of clustering techniques have been introduced. However, these traditional methods mainly rely on shape-based assumptions or a distance metric to cluster points in multi-dimensional linear Euclidean space, and their results show poor consistency with the functional annotation of genes in previous validation studies. From a different perspective, we propose a fractal clustering method that clusters genes using the intrinsic (fractal) dimension from modern geometry. This method clusters points in such a way that points in the same cluster are more self-affine among themselves than they are to points in other clusters. We assess this method using annotation-based validation assessment for gene clusters, which shows that it is superior to traditional methods in identifying functionally related gene groups.
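To make the notion of intrinsic (fractal) dimension concrete, the following sketch estimates it with the classic Grassberger-Procaccia correlation sum; this is an illustrative stand-in under assumed synthetic data, not the authors' clustering algorithm itself.

```python
# Estimate the intrinsic (fractal) dimension of a point cloud via the
# correlation sum C(r): its log-log slope over the scaling region
# approximates the correlation dimension. Data are synthetic: 10-D
# points that actually live on a ~3-D linear subspace.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
points = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 10))

d = pdist(points)                                   # all pairwise distances
radii = np.logspace(np.log10(d.min() + 1e-9), np.log10(d.max()), 20)
corr = np.array([(d < r).mean() for r in radii])    # correlation sum C(r)

mask = (corr > 0.01) & (corr < 0.5)                 # crude scaling region
slope = np.polyfit(np.log(radii[mask]), np.log(corr[mask]), 1)[0]
print(f"estimated intrinsic dimension ~ {slope:.2f}")  # close to 3, not 10
```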
Eticha, Tadele; Kahsay, Getu; Hailu, Teklebrhan; Gebretsadikan, Tesfamichael; Asefa, Fitsum; Gebretsadik, Hailekiros; Thangabalan, Boovizhikannan
2018-01-01
A simple extractive spectrophotometric technique has been developed and validated for the determination of miconazole nitrate in pure and pharmaceutical formulations. The method is based on the formation of a chloroform-soluble ion-pair complex between the drug and bromocresol green (BCG) dye in an acidic medium. The complex shows an absorption maximum at 422 nm, and the system obeys Beer's law in the concentration range of 1-30 µg/mL with a molar absorptivity of 2.285 × 10^4 L/mol/cm. The composition of the complex was studied by Job's method of continuous variation, and the results revealed that the mole ratio of drug:BCG is 1:1. A full factorial design was used to optimize the effects of variable factors, and the method was validated based on the ICH guidelines. The method was applied to the determination of miconazole nitrate in real samples.
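A worked Beer-Lambert calculation using the figures reported above; the absorbance reading and the 1 cm path length are assumptions for illustration, and the molar mass of miconazole nitrate is quoted approximately.

```python
# Beer's law (A = eps * l * c) applied with the abstract's molar
# absorptivity. The absorbance value is a hypothetical reading.
MW_MICONAZOLE_NITRATE = 479.1   # g/mol (approximate)
EPSILON = 2.285e4               # L/mol/cm at 422 nm, from the abstract
PATH_CM = 1.0                   # assumed standard cuvette

absorbance = 0.48               # hypothetical reading at 422 nm
conc_molar = absorbance / (EPSILON * PATH_CM)              # mol/L
conc_ug_per_ml = conc_molar * MW_MICONAZOLE_NITRATE * 1e3  # g/L -> ug/mL

print(f"{conc_ug_per_ml:.1f} ug/mL")  # ~10 ug/mL, inside the 1-30 ug/mL linear range
```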
NASA Astrophysics Data System (ADS)
Sugiarti, A. C.; Suyatno, S.; Sanjaya, I. G. M.
2018-04-01
The objective of this study is to describe the feasibility of Learning Cycle 5E STEM (Science, Technology, Engineering, and Mathematics) based learning materials for improving students' learning achievement in Thermochemistry. The study used the 4-D model and a one-group pretest-posttest design to obtain information about the improvement of students' learning outcomes. The subject was the Learning Cycle 5E based STEM learning materials; data were collected from 30 students of an 11th-grade science class. The techniques used in this study were validation, observation, testing, and questionnaires. The main results were: (1) all the learning material contents were valid, and (2) the practicality and the effectiveness of all the learning material contents were classified as good. Based on these conditions, the study concludes that the Learning Cycle 5E based STEM learning materials are appropriate for improving students' learning outcomes in Thermochemistry.
de Boer, Pieter T; Frederix, Geert W J; Feenstra, Talitha L; Vemer, Pepijn
2016-09-01
Transparent reporting of validation efforts for health economic models gives stakeholders better insight into the credibility of model outcomes. In this study we reviewed recently published studies on seasonal influenza and early breast cancer to gain insight into the reporting of model validation efforts in the wider health economic literature. A literature search was performed in PubMed and Embase to retrieve health economic modelling studies published between 2008 and 2014. Reporting on model validation was evaluated by checking for the word 'validation' and by using AdViSHE (Assessment of the Validation Status of Health Economic decision models), a tool containing a structured list of relevant items for validation. Additionally, we contacted corresponding authors to ask whether validation efforts had been performed beyond those reported in the manuscripts. A total of 53 studies on seasonal influenza and 41 studies on early breast cancer were included in our review. The word 'validation' was used in 16 studies (30%) on seasonal influenza and 23 studies (56%) on early breast cancer; however, in a minority of these studies it referred to a model validation technique. Fifty-seven percent of seasonal influenza studies and 71% of early breast cancer studies reported one or more validation techniques, with cross-validation of study outcomes found most often. A limited number of studies reported on model validation efforts, although good examples were identified. Author responses indicated that more validation techniques were performed than were reported in the manuscripts. Although validation is deemed important by many researchers, this is not reflected in the reporting habits of health economic modelling studies. Systematic reporting of validation efforts would be desirable to further enhance decision makers' confidence in health economic models and their outcomes.
Piovesan, Davide; Pierobon, Alberto; DiZio, Paul; Lackner, James R.
2012-01-01
This study presents and validates a time-frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of non-linear and higher than second order systems with a non-parametric approach. The technique proposed here is highly robust to noise and can be used easily for both postural and movement tasks. Estimation of stiffness profiles is possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases. PMID:22448233
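The core idea, tracking a time-varying resonant frequency in a time-frequency decomposition and mapping it to stiffness, can be sketched in a 1-DOF caricature. The following Python snippet uses a plain spectrogram rather than the paper's reassigned spectrogram, and the signal, decay rate, and effective mass are assumptions.

```python
# 1-DOF sketch: track the spectrogram ridge of an impulse response,
# then map instantaneous frequency to stiffness via k = m * (2*pi*f)^2.
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
f_inst = 40.0 + 20.0 * t                       # stiffness rising during the trial
x = np.sin(2 * np.pi * np.cumsum(f_inst) / fs) * np.exp(-1.0 * t)

f, tt, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=240)
peak_freq = f[np.argmax(Sxx, axis=0)]          # ridge of the spectrogram

m_eff = 1.5                                    # assumed effective inertia (kg)
stiffness = m_eff * (2 * np.pi * peak_freq) ** 2
print(stiffness[:5])                           # stiffness profile vs. time
```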
Scalability and Validation of Big Data Bioinformatics Software.
Yang, Andrian; Troup, Michael; Ho, Joshua W K
2017-01-01
This review examines two important aspects that are central to modern big data bioinformatics analysis - software scalability and validity. We argue that not only are the issues of scalability and validation common to all big data bioinformatics analyses, they can be tackled by conceptually related methodological approaches, namely divide-and-conquer (scalability) and multiple executions (validation). Scalability is defined as the ability for a program to scale based on workload. It has always been an important consideration when developing bioinformatics algorithms and programs. Nonetheless the surge of volume and variety of biological and biomedical data has posed new challenges. We discuss how modern cloud computing and big data programming frameworks such as MapReduce and Spark are being used to effectively implement divide-and-conquer in a distributed computing environment. Validation of software is another important issue in big data bioinformatics that is often ignored. Software validation is the process of determining whether the program under test fulfils the task for which it was designed. Determining the correctness of the computational output of big data bioinformatics software is especially difficult due to the large input space and complex algorithms involved. We discuss how state-of-the-art software testing techniques that are based on the idea of multiple executions, such as metamorphic testing, can be used to implement an effective bioinformatics quality assurance strategy. We hope this review will raise awareness of these critical issues in bioinformatics.
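A toy example of the metamorphic testing idea discussed above: even without an oracle for the "correct" output, relations that must hold across multiple executions can be checked. The program under test and the relations here are assumptions chosen for brevity, not from the review.

```python
# Metamorphic testing: check input/output relations rather than exact
# expected outputs. gc_content() stands in for a bioinformatics program.
def gc_content(seq: str) -> float:
    """Fraction of G/C bases in a DNA sequence."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def test_metamorphic_relations():
    seq = "ACGTGGCATCGA"
    # MR1: duplicating the input must leave the result unchanged.
    assert abs(gc_content(seq * 2) - gc_content(seq)) < 1e-12
    # MR2: complementing the sequence (A<->T, G<->C) must also preserve GC%.
    comp = seq.translate(str.maketrans("ACGT", "TGCA"))
    assert abs(gc_content(comp) - gc_content(seq)) < 1e-12

test_metamorphic_relations()
print("metamorphic relations hold")
```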
Self-Alignment MEMS IMU Method Based on the Rotation Modulation Technique on a Swing Base
Chen, Zhiyong; Yang, Haotian; Wang, Chengbin; Lin, Zhihui; Guo, Meifeng
2018-01-01
The micro-electro-mechanical-system (MEMS) inertial measurement unit (IMU) has been widely used in the field of inertial navigation due to its small size, low cost, and light weight, but aligning MEMS IMUs remains a challenge for researchers. MEMS IMUs have been conventionally aligned on a static base, requiring other sensors, such as magnetometers or satellites, to provide auxiliary information, which limits its application range to some extent. Therefore, improving the alignment accuracy of MEMS IMU as much as possible under swing conditions is of considerable value. This paper proposes an alignment method based on the rotation modulation technique (RMT), which is completely self-aligned, unlike the existing alignment techniques. The effect of the inertial sensor errors is mitigated by rotating the IMU. Then, inertial frame-based alignment using the rotation modulation technique (RMT-IFBA) achieved coarse alignment on the swing base. The strong tracking filter (STF) further improved the alignment accuracy. The performance of the proposed method was validated with a physical experiment, and the results of the alignment showed that the standard deviations of pitch, roll, and heading angle were 0.0140°, 0.0097°, and 0.91°, respectively, which verified the practicality and efficacy of the proposed method for the self-alignment of the MEMS IMU on a swing base. PMID:29649150
Pageler, Natalie M; Grazier G'Sell, Max Jacob; Chandler, Warren; Mailes, Emily; Yang, Christine; Longhurst, Christopher A
2016-09-01
The objective of this project was to use statistical techniques to determine the completeness and accuracy of data migrated during an electronic health record conversion. Data validation during migration consists of mapped record testing and validation of a sample of the data for completeness and accuracy. We statistically determined a randomized sample size for each data type based on the desired confidence level and error limits. The only error identified in the post go-live period was a failure to migrate some clinical notes, which was unrelated to the validation process. No errors in the migrated data were found during the 12-month post-implementation period. Compared to the typical industry approach, we have demonstrated that a statistical approach to sample size for data validation can ensure consistent confidence levels while maximizing the efficiency of the validation process during a major electronic health record conversion. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.
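The paper does not publish its exact formula, but one standard way to size a validation sample for a target confidence level and error limit is the normal approximation for a proportion; the sketch below is that textbook calculation, not the authors' code.

```python
# Sample size for validating a proportion: n = z^2 * p * (1 - p) / e^2,
# rounded up. Using p = 0.5 is the conservative (worst-case) choice.
import math
from statistics import NormalDist

def validation_sample_size(confidence: float, error_limit: float,
                           expected_error_rate: float = 0.5) -> int:
    z = NormalDist().inv_cdf((1 + confidence) / 2)   # two-sided critical value
    p = expected_error_rate
    return math.ceil(z ** 2 * p * (1 - p) / error_limit ** 2)

print(validation_sample_size(0.95, 0.03))  # -> 1068 records per data type
```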
Online measurement of bead geometry in GMAW-based additive manufacturing using passive vision
NASA Astrophysics Data System (ADS)
Xiong, Jun; Zhang, Guangjun
2013-11-01
Additive manufacturing based on gas metal arc welding is an advanced technique for depositing fully dense components at low cost. Nevertheless, techniques for accurate control and automation of the process are not yet fully developed, and online measurement of the deposited bead geometry is a key problem for reliable control. In this work a passive vision-sensing system, comprising two cameras and composite filtering techniques, was proposed for real-time detection of bead height and width during the deposition of thin walls. The nozzle-to-top-surface distance was monitored to eliminate accumulated height errors during the multi-layer deposition process. Various image processing algorithms for extracting the feature parameters were applied and discussed, and a calibration procedure for the monitoring system was presented. Validation experiments confirmed the effectiveness of the online measurement system for bead geometry in layered additive manufacturing.
Tumor response estimation in radar-based microwave breast cancer detection.
Kurrant, Douglas J; Fear, Elise C; Westwick, David T
2008-12-01
Radar-based microwave imaging techniques have been proposed for early-stage breast cancer detection. A considerable challenge for the successful implementation of these techniques is the reduction of clutter, i.e., components of the signal originating from objects other than the tumor. In particular, clutter must be reduced in the late-time scattered fields in order to detect small (subcentimeter diameter) tumors. In this paper, a method to estimate the tumor response contained in the late-time scattered fields is presented. The method uses a parametric function to model the tumor response and a maximum a posteriori estimation approach to evaluate the optimal parameter values. A pattern classification technique is then used to validate the estimation. The ability of the algorithm to estimate a tumor response is demonstrated using both experimental and simulated data obtained with a tissue sensing adaptive radar system.
Setting up a Rayleigh Scattering Based Flow Measuring System in a Large Nozzle Testing Facility
NASA Technical Reports Server (NTRS)
Panda, Jayanta; Gomez, Carlos R.
2002-01-01
A molecular Rayleigh scattering based air density measurement system has been built in a large nozzle testing facility at NASA Glenn Research Center. The technique depends on the light scattering by gas molecules present in air; no artificial seeding is required. Light from a single mode, continuous wave laser was transmitted to the nozzle facility by optical fiber, and light scattered by gas molecules, at various points along the laser beam, is collected and measured by photon-counting electronics. By placing the laser beam and collection optics on synchronized traversing units, the point measurement technique is made effective for surveying density variation over a cross-section of the nozzle plume. Various difficulties associated with dust particles, stray light, high noise level and vibration are discussed. Finally, a limited amount of data from an underexpanded jet are presented and compared with expected variations to validate the technique.
EOS-Aura's Ozone Monitoring Instrument (OMI): Validation Requirements
NASA Technical Reports Server (NTRS)
Brinksma, E. J.; McPeters, R.; deHaan, J. F.; Levelt, P. F.; Hilsenrath, E.; Bhartia, P. K.
2003-01-01
OMI is an advanced hyperspectral instrument that measures backscattered radiation in the UV and visible. It will be flown as part of the EOS Aura mission and provide data on atmospheric chemistry that is highly synergistic with the other Aura instruments HIRDLS, MLS, and TES. OMI is designed to measure total ozone, aerosols, cloud information, and UV irradiances, continuing the TOMS series of global mapped products but with higher spatial resolution. In addition, its hyperspectral capability enables measurements of trace gases such as SO2, NO2, HCHO, BrO, and OClO. A plan for validation of the various OMI products is now being formulated. Validation of the total column and UVB products will rely heavily on existing networks of instruments, such as the NDSC. NASA and its European partners are planning aircraft missions for the validation of Aura instruments. New instruments and techniques (DOAS systems, for example) will need to be developed, both ground- and aircraft-based. Lidar systems are needed for validation of the vertical distributions of ozone, aerosols, NO2, and possibly SO2. The validation emphasis will be on the retrieval of these products under polluted conditions. This is challenging because the retrievals often depend on the tropospheric profiles of the product in question, and because of large spatial variations in the troposphere. Most existing ground stations are located in, and equipped for, pristine environments; this is also true for almost all NDSC stations. OMI validation will therefore need ground-based sites in polluted environments and specially developed instruments, complementing the existing instrumentation.
MRPrimer: a MapReduce-based method for the thorough design of valid and ranked primers for PCR
Kim, Hyerin; Kang, NaNa; Chon, Kang-Wook; Kim, Seonho; Lee, NaHye; Koo, JaeHyung; Kim, Min-Soo
2015-01-01
Primer design is a fundamental technique that is widely used for polymerase chain reaction (PCR). Although many methods have been proposed for primer design, they require a great deal of manual effort to generate feasible and valid primers, including homology tests on off-target sequences using BLAST-like tools. That approach is inconvenient for many target sequences of quantitative PCR (qPCR) due to considering the same stringent and allele-invariant constraints. To address this issue, we propose an entirely new method called MRPrimer that can design all feasible and valid primer pairs existing in a DNA database at once, while simultaneously checking a multitude of filtering constraints and validating primer specificity. Furthermore, MRPrimer suggests the best primer pair for each target sequence, based on a ranking method. Through qPCR analysis using 343 primer pairs and the corresponding sequencing and comparative analyses, we showed that the primer pairs designed by MRPrimer are very stable and effective for qPCR. In addition, MRPrimer is computationally efficient and scalable and therefore useful for quickly constructing an entire collection of feasible and valid primers for frequently updated databases like RefSeq. Furthermore, we suggest that MRPrimer can be utilized conveniently for experiments requiring primer design, especially real-time qPCR. PMID:26109350
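To give a flavor of the bulk filtering constraints such a pipeline checks per primer, here is a toy single-primer filter; the specific length, GC, and melting-temperature windows are common rules of thumb assumed for illustration, not MRPrimer's actual constraint set.

```python
# Toy single-primer filter: length, GC content, and a Wallace-rule
# melting temperature window. Thresholds are illustrative defaults.
def passes_single_filters(primer: str,
                          gc_range=(0.4, 0.6),
                          tm_range=(57.0, 63.0),
                          len_range=(18, 25)) -> bool:
    n = len(primer)
    if not len_range[0] <= n <= len_range[1]:
        return False
    gc = sum(primer.count(b) for b in "GC") / n
    if not gc_range[0] <= gc <= gc_range[1]:
        return False
    # Wallace rule: Tm ~ 2(A+T) + 4(G+C), reasonable for short oligos
    tm = 2 * sum(primer.count(b) for b in "AT") + 4 * sum(primer.count(b) for b in "GC")
    return tm_range[0] <= tm <= tm_range[1]

print(passes_single_filters("ATGCGTACGTTAGCCGTACG"))  # 20-mer, GC 55%, Tm 62 -> True
```

In a MapReduce setting, a filter like this would run in the map phase over every candidate subsequence, with specificity checks handled by grouping candidates in the reduce phase.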
Validation of Model Forecasts of the Ambient Solar Wind
NASA Technical Reports Server (NTRS)
Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.
2009-01-01
Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.
Semi-supervised clustering for parcellating brain regions based on resting state fMRI data
NASA Astrophysics Data System (ADS)
Cheng, Hewei; Fan, Yong
2014-03-01
Many unsupervised clustering techniques have been adopted for parcellating brain regions of interest into functionally homogeneous subregions based on resting state fMRI data. However, unsupervised clustering techniques are not able to take advantage of existing knowledge of the functional neuroanatomy readily available from studies of cytoarchitectonic parcellation or meta-analyses of the literature. In this study, we propose a semi-supervised clustering method for parcellating the amygdala into functionally homogeneous subregions based on resting state fMRI data. In particular, the semi-supervised clustering is implemented within a graph partitioning framework and adopts prior information and spatial consistency constraints to obtain a spatially contiguous parcellation. The graph partitioning problem is solved using an efficient algorithm similar to the well-known weighted kernel k-means algorithm. Our method has been validated for parcellating the amygdala into 3 subregions based on resting state fMRI data of 28 subjects. The experimental results demonstrate that the proposed method is more robust than unsupervised clustering and able to parcellate the amygdala into centromedial, laterobasal, and superficial parts with improved functional homogeneity compared with the cytoarchitectonic parcellation. The validity of the parcellation is also supported by distinctive functional and structural connectivity patterns of the subregions and by high consistency between coactivation patterns derived from a meta-analysis and the functional connectivity patterns of the corresponding subregions.
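The semi-supervised idea, letting a few points with known labels steer an otherwise unsupervised clustering, can be illustrated with a seeded k-means variant; this is a simplification under synthetic data, not the authors' graph-partitioning/weighted-kernel-k-means implementation.

```python
# Seeded k-means: seed points keep their prior labels; the rest of the
# data clusters around them. A caricature of semi-supervised parcellation.
import numpy as np

def seeded_kmeans(X, seeds, n_clusters, n_iter=50):
    """X: (n, d) data; seeds: dict {index: label} encoding prior knowledge."""
    centroids = np.array([X[[i for i, l in seeds.items() if l == k]].mean(axis=0)
                          for k in range(n_clusters)])
    for _ in range(n_iter):
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        for i, l in seeds.items():           # hard constraint on the seeds
            labels[i] = l
        centroids = np.array([X[labels == k].mean(axis=0) for k in range(n_clusters)])
    return labels

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(c, 0.5, (50, 2)) for c in ((0, 0), (3, 0), (0, 3))])
seeds = {0: 0, 50: 1, 100: 2}                # one "known" point per subregion
print(np.bincount(seeded_kmeans(X, seeds, 3)))
```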
Shareef, Hussain; Mutlag, Ammar Hussein; Mohamed, Azah
2017-01-01
Many maximum power point tracking (MPPT) algorithms have been developed in recent years to maximize the energy produced by PV systems. These algorithms are not sufficiently robust because of fast-changing environmental conditions and limitations in efficiency, steady-state accuracy, and the dynamics of the tracking algorithm. Thus, this paper proposes a new random forest (RF) model to improve MPPT performance. The RF model has the ability to capture the nonlinear association of patterns between predictors, such as irradiance and temperature, to determine the maximum power point accurately. An RF-based tracker is designed for 25 SolarTIFSTF-120P6 PV modules, with a capacity of 3 kW peak, using two high-speed sensors. For this purpose, a complete PV system is modeled using 300,000 data samples and simulated using the MATLAB/SIMULINK package. The proposed RF-based MPPT is then tested under actual environmental conditions for 24 days to validate its accuracy and dynamic response. The response of the RF-based MPPT model is also compared with that of artificial neural network and adaptive neurofuzzy inference system algorithms for further validation. The results show that the proposed MPPT technique gives a significant improvement over the other techniques. In addition, the RF model passes the Bland-Altman test with more than 95 percent acceptability.
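A hedged sketch of a random-forest MPPT regressor in the spirit described above: predict the maximum-power-point operating voltage from irradiance and module temperature. The synthetic PV relation and all parameter values below are assumptions, not the authors' 3 kW SolarTIF array model.

```python
# Random-forest MPPT: learn MPP voltage as a function of irradiance and
# temperature, then query it with live sensor readings.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n = 5000
irradiance = rng.uniform(100, 1000, n)          # W/m^2
temperature = rng.uniform(15, 65, n)            # module temperature, deg C
# crude stand-in for the true MPP-voltage surface of a PV array
v_mpp = 30.0 - 0.12 * (temperature - 25.0) + 1.5 * np.log(irradiance / 1000.0)
v_mpp += rng.normal(0, 0.1, n)                  # sensor noise

X = np.column_stack([irradiance, temperature])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, v_mpp)
print(model.predict([[800.0, 40.0]]))           # commanded operating voltage
```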
NASA Astrophysics Data System (ADS)
Sellami, Takwa; Jelassi, Sana; Darcherif, Abdel Moumen; Berriri, Hanen; Mimouni, Med Faouzi
2018-04-01
With the advancement of wind turbines towards complex structures, the need for trustworthy structural models has become more apparent. Hence, the vibration characteristics of wind turbine components, such as the blades and the tower, have to be extracted under vibration constraints. Although extracting the modal properties of blades is a simple task, calculating precise modal data for a whole wind turbine coupled to its tower/foundation is still a perplexing task. In this framework, this paper focuses on the structural modeling approach for modern commercial micro-turbines. The structural model of a wind turbine with a complex design, the Rutland 504, is established based on both experimental and numerical methods. A three-dimensional (3-D) numerical model of the structure was set up based on the finite volume method (FVM) using the academic finite element analysis software ANSYS. To validate the created model, experimental vibration tests were carried out using the vibration test system of the TREVISE platform at ECAM-EPMI. The tests were based on the experimental modal analysis (EMA) technique, one of the most efficient techniques for identifying structural parameters. The poles and residues of the frequency response functions (FRFs) between the input and output spectra were calculated to extract the mode shapes and natural frequencies of the structure. Based on the obtained modal parameters, the numerical model was updated.
A TRMM-Calibrated Infrared Technique for Global Rainfall Estimation
NASA Technical Reports Server (NTRS)
Negri, Andrew J.; Adler, Robert F.
2002-01-01
The development of a satellite infrared (IR) technique for estimating convective and stratiform rainfall and its application in studying the diurnal variability of rainfall on a global scale is presented. The Convective-Stratiform Technique (CST), calibrated by coincident, physically retrieved rain rates from the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR), is applied over the global tropics during 2001. The technique is calibrated separately over land and ocean, making ingenious use of the IR data from the TRMM Visible/Infrared Scanner (VIRS) before application to global geosynchronous satellite data. The low sampling rate of TRMM PR imposes limitations on calibrating IR-based techniques; however, our research shows that PR observations can be applied to improve IR-based techniques significantly by selecting adequate calibration areas and calibration length. The diurnal cycle of rainfall, as well as the division between convective and stratiform rainfall will be presented. The technique is validated using available data sets and compared to other global rainfall products such as Global Precipitation Climatology Project (GPCP) IR product, calibrated with TRMM Microwave Imager (TMI) data. The calibrated CST technique has the advantages of high spatial resolution (4 km), filtering of non-raining cirrus clouds, and the stratification of the rainfall into its convective and stratiform components, the latter being important for the calculation of vertical profiles of latent heating.
A new technique for the characterization of chaff elements
NASA Astrophysics Data System (ADS)
Scholfield, David; Myat, Maung; Dauby, Jason; Fesler, Jonathon; Bright, Jonathan
2011-07-01
A new technique for the experimental characterization of electromagnetic chaff based on Inverse Synthetic Aperture Radar is presented. This technique allows for the characterization of as few as one filament of chaff in a controlled anechoic environment, providing stability and repeatability of experimental results, and enables a deeper understanding of the fundamental phenomena of electromagnetic scattering from chaff through an incremental analysis approach. Chaff analysis can now begin with a single element and progress through the build-up of particles into pseudo-cloud structures. This controlled incremental approach is supported by an identical incremental modeling and validation process. Additionally, the technique has the potential to produce considerable savings in financial and schedule cost, and provides a stable and repeatable experiment to aid model validation.
Health diagnosis of arch bridge suspender by acoustic emission technique
NASA Astrophysics Data System (ADS)
Li, Dongsheng; Ou, Jinping
2007-01-01
Conventional non-destructive methods cannot dynamically monitor the damage levels and damage types of suspenders, so the acoustic emission (AE) technique is proposed to monitor their activity. Valid signals are identified from the relationship between rise time and duration. Ambient noise is eliminated using a floating threshold value and a guard sensor. The damage level of the cement mortar and steel strand is analyzed by the AE parameter method, and damage types are judged by a waveform analysis technique. Based on these methods, all the suspenders of the Sichuan Ebian Dadu River arch bridge have been monitored using AE techniques. The monitoring results show that AE signal amplitude, energy, and counts can visually display the suspenders' damage levels, and that differences in waveform and frequency range indicate different damage types. The testing results coincide well with the actual condition of the bridge.
Control Performance, Aerodynamic Modeling, and Validation of Coupled Simulation Techniques for Guided Projectile Roll Dynamics
Sahu, Jubaraj; Fresconi, Frank; Heavey, Karen R. (Weapons and Materials Research)
2014-11-01
Of particular interest for this study are investigations into roll control of guided projectiles.
A protocol for validating Land Surface Temperature from Sentinel-3
NASA Astrophysics Data System (ADS)
Ghent, D.
2015-12-01
One of the main objectives of the Sentinel-3 mission is to measure sea- and land-surface temperature with high-end accuracy and reliability in support of environmental and climate monitoring in an operational context. Calibration and validation are thus key criteria for operationalization within the framework of the Sentinel-3 Mission Performance Centre (S3MPC). Land surface temperature (LST) has a long heritage of satellite observations which have facilitated our understanding of land surface and climate change processes, such as desertification, urbanization, deforestation and land/atmosphere coupling. These observations have been acquired from a variety of satellite instruments on platforms in both low-earth orbit and geostationary orbit. Retrieval accuracy can be a challenge, though: surface emissivities can be highly variable owing to the heterogeneity of the land, and atmospheric effects caused by the presence of aerosols and by water vapour absorption can bias the underlying LST. As such, a rigorous validation is critical in order to assess the quality of the data and the associated uncertainties. The Sentinel-3 Cal-Val Plan for evaluating the level-2 SL_2_LST product builds on an established validation protocol for satellite-based LST. This set of guidelines provides a standardized framework for structuring LST validation activities and is rapidly gaining international recognition. The protocol introduces a four-pronged approach which can be summarised thus: i) in situ validation where ground-based observations are available; ii) radiance-based validation over sites that are homogeneous in emissivity; iii) intercomparison with retrievals from other satellite sensors; iv) time-series analysis to identify artefacts on an interannual time-scale. This multi-dimensional approach is a necessary requirement for assessing the performance of the LST algorithm for SLSTR, which is designed around biome-based coefficients, emphasizing the importance of non-traditional forms of validation such as radiance-based techniques. Here we present examples of the application of the protocol to data produced within the ESA DUE GlobTemperature project. The lessons learnt are helping to fine-tune the methodology in preparation for Sentinel-3 commissioning.
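For the in situ leg of such a protocol, matched satellite/ground pairs are typically reduced to a handful of standard accuracy and precision statistics. The sketch below computes the usual bias, standard deviation, and RMSE; the numbers are illustrative, not GlobTemperature results.

```python
# Standard in situ validation statistics for matched LST pairs (in K):
# mean bias (accuracy), standard deviation (precision), and RMSE.
import numpy as np

lst_satellite = np.array([295.2, 301.7, 288.4, 310.1, 299.0])  # K
lst_in_situ   = np.array([294.6, 302.5, 287.9, 308.8, 299.4])  # K

diff = lst_satellite - lst_in_situ
print(f"bias  = {diff.mean():+.2f} K")
print(f"sigma = {diff.std(ddof=1):.2f} K")
print(f"RMSE  = {np.sqrt((diff ** 2).mean()):.2f} K")
```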
Barnett, David; Louzao, Raaul; Gambell, Peter; De, Jitakshi; Oldaker, Teri; Hanson, Curtis A
2013-01-01
Flow cytometry and other cell-based fluorescence assay technologies are, as a matter of good laboratory practice, required to validate all assays, which in clinical practice may pass through regulatory review processes using criteria often defined with a soluble analyte in plasma or serum samples in mind. Recently the U.S. Food and Drug Administration (FDA) has entered into a public dialogue regarding its regulatory interest in laboratory developed tests (LDTs), or so-called home brew assays, performed in clinical laboratories. The absence of well-defined guidelines for the validation of cell-based assays using fluorescence detection has thus become a subject of concern for the International Council for Standardization of Haematology (ICSH) and the International Clinical Cytometry Society (ICCS). Accordingly, a group of over 40 international experts in the areas of test development, test validation, and clinical practice of a variety of assay types using flow cytometry and/or morphologic image analysis were invited to develop a set of practical guidelines useful to in vitro diagnostic (IVD) innovators, clinical laboratories, regulatory scientists, and laboratory inspectors. The focus of the group was restricted to fluorescence reporter reagents, although some common principles are shared by immunohistochemistry or immunocytochemistry techniques and noted where appropriate. The work product of this two-year effort is the content of this special issue of this journal, which is published as 5 separate articles, this being Validation of Cell-based Fluorescence Assays: Practice Guidelines from the ICSH and ICCS - Part IV - Postanalytic considerations. © 2013 International Clinical Cytometry Society.
Davis, Bruce H; Dasgupta, Amar; Kussick, Steven; Han, Jin-Yeong; Estrellado, Annalee
2013-01-01
As a matter of good laboratory practice, laboratories using flow cytometry and other cell-based fluorescence assay technologies are required to validate all assays, which in clinical practice may pass through regulatory review processes using criteria often defined with a soluble analyte in plasma or serum samples in mind. Recently the U.S. Food and Drug Administration (FDA) has entered into a public dialogue in the U.S. regarding their regulatory interest in laboratory developed tests (LDTs) or so-called "home brew" assays performed in clinical laboratories. The absence of well-defined guidelines for validation of cell-based assays using fluorescence detection has thus become a subject of concern for the International Council for Standardization of Haematology (ICSH) and the International Clinical Cytometry Society (ICCS). Accordingly, a group of over 40 international experts in the areas of test development, test validation, and clinical practice of a variety of assay types using flow cytometry and/or morphologic image analysis were invited to develop a set of practical guidelines useful to in vitro diagnostic (IVD) innovators, clinical laboratories, regulatory scientists, and laboratory inspectors. The focus of the group was restricted to fluorescence reporter reagents, although some common principles are shared by immunohistochemistry or immunocytochemistry techniques and are noted where appropriate. The work product of this two-year effort is the content of this special issue of this journal, which is published as 5 separate articles, this being Validation of Cell-based Fluorescence Assays: Practice Guidelines from the ICSH and ICCS - Part II - Preanalytical issues. © 2013 International Clinical Cytometry Society.
Measurement of fracture toughness by nanoindentation methods: Recent advances and future challenges
Sebastiani, Marco; Johanns, K. E.; Herbert, Erik G.; ...
2015-04-30
In this study, we describe recent advances and developments for the measurement of fracture toughness at small scales by the use of nanoindentation-based methods including techniques based on micro-cantilever beam bending and micro-pillar splitting. A critical comparison of the techniques is made by testing a selected group of bulk and thin film materials. For pillar splitting, cohesive zone finite element simulations are used to validate a simple relationship between the critical load at failure, the pillar radius, and the fracture toughness for a range of material properties and coating/substrate combinations. The minimum pillar diameter required for nucleation and growth of a crack during indentation is also estimated. An analysis of pillar splitting for a film on a dissimilar substrate material shows that the critical load for splitting is relatively insensitive to the substrate compliance for a large range of material properties. Experimental results from a selected group of materials show good agreement between single cantilever and pillar splitting methods, while a discrepancy of ~25% is found between the pillar splitting technique and double-cantilever testing. It is concluded that both the micro-cantilever and pillar splitting techniques are valuable methods for micro-scale assessment of fracture toughness of brittle ceramics, provided the underlying assumptions can be validated. Although the pillar splitting method has some advantages because of the simplicity of sample preparation and testing, it is not applicable to most metals because their higher toughness prevents splitting, and in this case, micro-cantilever bend testing is preferred.
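The pillar-splitting relation validated by the cohesive-zone simulations takes the simple form K_c = γ·P_c/R^(3/2). A minimal sketch of its use, where the dimensionless coefficient γ and the load/radius values are purely illustrative assumptions (in practice γ is calibrated per material and coating/substrate system):

```python
import math

def pillar_splitting_toughness(critical_load_N, pillar_radius_m, gamma):
    """Fracture toughness from pillar splitting: K_c = gamma * P_c / R^(3/2).

    gamma is a dimensionless coefficient obtained from cohesive-zone FE
    calibration for the material system at hand (assumed as an input here).
    """
    return gamma * critical_load_N / pillar_radius_m**1.5

# Illustrative numbers only: 10 mN splitting load, 1 um pillar radius, gamma ~ 0.4
Kc = pillar_splitting_toughness(10e-3, 1e-6, 0.4)
print(f"K_c ~ {Kc / 1e6:.2f} MPa*sqrt(m)")   # ~4 MPa*sqrt(m), ceramic-like
```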
Automatic brain tumor detection in MRI: methodology and statistical validation
NASA Astrophysics Data System (ADS)
Iftekharuddin, Khan M.; Islam, Mohammad A.; Shaik, Jahangheer; Parra, Carlos; Ogg, Robert
2005-04-01
Automated brain tumor segmentation and detection are immensely important in medical diagnostics because they provide information about anatomical structures as well as potential abnormal tissue, which is necessary for appropriate surgical planning. In this work, we propose a novel automated brain tumor segmentation technique based on multiresolution texture information that combines fractional Brownian motion (fBm) and wavelet multiresolution analysis. Our wavelet-fractal technique combines the excellent multiresolution localization property of wavelets with the texture-extraction capability of fractals. We demonstrate the efficacy of our technique by successfully segmenting pediatric brain MR images (MRIs) from St. Jude Children's Research Hospital. We use a self-organizing map (SOM) as our clustering tool, wherein we exploit both pixel intensity and multiresolution texture features to obtain the segmented tumor. Our test results show that our technique successfully segments abnormal brain tissues in a set of T1 images. In the next step, we design a classifier using a feed-forward (FF) neural network to statistically validate the presence of tumor in MRI using both the multiresolution texture and the pixel intensity features. We estimate the corresponding receiver operating characteristic (ROC) curve based on the true positive fractions and false positive fractions obtained from our classifier at different threshold values. The ROC curve, which can be considered a gold standard for assessing the competence of a classifier, is used to ascertain the sensitivity and specificity of our classifier. We observe that at a threshold of 0.4 we achieve a true positive fraction of 1.0 (100%) at the cost of a false positive fraction of only 0.16 (16%) for the set of 50 T1 MRIs analyzed in this experiment.
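A minimal sketch of the ROC construction described above, using hypothetical classifier scores and scikit-learn (assumed available); the 0.4 threshold mirrors the operating point quoted in the abstract, and all data values are made up:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical ground-truth labels (1 = tumor) and classifier scores in [0, 1]
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1, 1, 0])
y_score = np.array([0.1, 0.35, 0.8, 0.45, 0.2, 0.9, 0.5, 0.7, 0.65, 0.3])

# Full ROC curve: true/false positive fractions at every threshold
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print("AUC:", roc_auc_score(y_true, y_score))

# Sensitivity/specificity at a fixed operating threshold (e.g. 0.4)
pred = (y_score >= 0.4).astype(int)
tp = np.sum((pred == 1) & (y_true == 1)); fn = np.sum((pred == 0) & (y_true == 1))
fp = np.sum((pred == 1) & (y_true == 0)); tn = np.sum((pred == 0) & (y_true == 0))
print("TPF:", tp / (tp + fn), "FPF:", fp / (fp + tn))
```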
NASA Astrophysics Data System (ADS)
Erol, Serdar; Serkan Isık, Mustafa; Erol, Bihter
2016-04-01
Data from the recent Earth gravity field satellite missions have led to significant improvements in Global Geopotential Models in terms of both accuracy and resolution. However, the improvement in accuracy is not uniform over the Earth, and quantifying it locally therefore requires independent data. Validation of the level-3 products from the gravity field satellite missions, independently of the estimation procedures of these products, is possible using various independent data sets, such as terrestrial gravity observations, astrogeodetic vertical deflections, GPS/leveling data, and the stationary sea surface topography. Quantifying the quality of the gravity field functionals derived from recent products is important for regional geoid modeling based on optimal fusion of satellite and terrestrial data, in addition to statistical reporting of improvement rates as a function of spatial location. In the validations, the errors, the systematic differences between the data sets, and the varying spectral content of the compared signals should be considered in order to obtain comparable results. Accordingly, this study compares the performance of wavelet decomposition and spectral enhancement techniques in validation of the GOCE/GRACE based Earth gravity field models using GPS/leveling and terrestrial gravity data in Turkey. The terrestrial validation data are filtered using the wavelet decomposition technique, and the numerical results from varying levels of decomposition are compared with results derived using the spectral enhancement approach with the contribution of an ultra-high resolution Earth gravity field model. The tests include the GO-DIR-R5, GO-TIM-R5, GOCO05S, EIGEN-6C4 and EGM2008 global models. The conclusions discuss the strengths and drawbacks of both concepts, as well as reporting the performance of the tested gravity field models with an estimate of their contribution to modeling the geoid in Turkish territory.
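A minimal sketch of the wavelet-decomposition filtering idea: decompose a terrestrial profile into levels and discard the short-wavelength detail, leaving content spectrally closer to a satellite-only model. It assumes the PyWavelets package; the signal, wavelet choice ("db4"), and level count are illustrative, not the paper's configuration.

```python
import numpy as np
import pywt  # PyWavelets (assumed installed)

# Hypothetical 1-D profile of terrestrial gravity residuals (mGal)
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 4 * np.pi, 256)) + 0.3 * rng.standard_normal(256)

# Multi-level decomposition; zeroing all detail coefficients acts as a
# low-pass filter that keeps only the long-wavelength content
coeffs = pywt.wavedec(signal, "db4", level=4)
coeffs_lp = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
low_pass = pywt.waverec(coeffs_lp, "db4")
print(low_pass.shape)   # same length as the input profile
```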
Bhattacharya, S.; Byrnes, A.P.; Watney, W.L.; Doveton, J.H.
2008-01-01
Characterizing the reservoir interval into flow units is an effective way to subdivide the net-pay zone into layers for reservoir simulation. Commonly used flow unit identification techniques require a reliable estimate of permeability in the net pay on a foot-by-foot basis. Most wells do not have cores, and the literature is replete with different kinds of correlations, transforms, and prediction methods for profiling permeability in pay. However, for robust flow unit determination, predicted permeability at noncored wells requires validation and, if necessary, refinement. This study outlines the use of a spreadsheet-based permeability validation technique to characterize flow units in wells from the Norcan East field, Clark County, Kansas, that produce from Atokan-aged fine- to very fine-grained quartzarenite sandstones interpreted to have been deposited in brackish-water, tidally dominated restricted tidal-flat, tidal-channel, tidal-bar, and estuary bay environments within a small incised-valley-fill system. The methodology outlined enables the identification of a fieldwide free-water level and validates and refines predicted permeability at 0.5-ft (0.15-m) intervals by iteratively reconciling differences in water saturation calculated from wire-line logs and from a capillary-pressure formulation that models fine- to very fine-grained sandstone with diagenetic clay and silt or shale laminae. The effectiveness of this methodology was confirmed by successfully matching primary and secondary production histories using a flow unit-based reservoir model of the Norcan East field without permeability modifications. The methodologies discussed should prove useful for robust flow unit characterization of different kinds of reservoirs. Copyright © 2008. The American Association of Petroleum Geologists. All rights reserved.
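A minimal sketch of the reconciliation idea, not the paper's spreadsheet formulation: assume a Brooks-Corey capillary-pressure model whose entry pressure follows a Leverett-type sqrt(phi/k) scaling, then search for the permeability that makes the model water saturation match the log-derived value at a given height above the free-water level. Every constant below is an illustrative assumption.

```python
import numpy as np

def sw_brooks_corey(height_m, k_md, phi, swirr=0.15, lam=1.8,
                    drho=300.0, g=9.81):
    """Water saturation vs. height above free-water level (Brooks-Corey).

    Entry pressure is tied to sqrt(phi/k), a common Leverett-type scaling;
    the prefactor and exponents are illustrative, not the paper's calibration.
    """
    pc = drho * g * height_m                 # capillary pressure, Pa
    pd = 2.0e4 * np.sqrt(phi / k_md)         # entry pressure, Pa (assumed)
    return np.where(pc > pd, swirr + (1 - swirr) * (pd / pc) ** lam, 1.0)

# Reconcile: pick the permeability whose model Sw matches the log-derived Sw
h, phi, sw_log = 12.0, 0.18, 0.42
k_grid = np.logspace(-1, 3, 400)             # 0.1 - 1000 md candidate range
err = np.abs(sw_brooks_corey(h, k_grid, phi) - sw_log)
print(f"reconciled k ~ {k_grid[np.argmin(err)]:.2f} md")
```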
Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT
Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi
2014-01-01
Rationale and Objectives: Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Materials and Methods: Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results: We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions: Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354
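A minimal sketch of the class-imbalance handling mentioned above, combining cost-weighting (via class weights on the base learner) with bagging; it uses scikit-learn (>= 1.2, for the `estimator` keyword) and entirely synthetic stand-in features and labels:

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical voxel features (intensity + texture) with a rare positive class
rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 5))
y = (rng.random(1000) < 0.1).astype(int)     # ~10% "food residue" voxels

# Cost-weighting on the base tree, plus bagging for variance reduction
clf = BaggingClassifier(
    estimator=DecisionTreeClassifier(class_weight="balanced"),
    n_estimators=50,
    random_state=0,
)
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```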
Computer-aided assessment of regional abdominal fat with food residue removal in CT.
Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi
2013-11-01
Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Chandrasekar, Thiravidamani; Raman, Natarajan
2016-07-01
A few novel Schiff base transition metal complexes of general formula [MLCl] (where L = Schiff base, obtained by the condensation reaction of the Knoevenagel condensate of curcumin with L-tryptophan, and M = Cu(II), Ni(II), Co(II), and Zn(II)) were prepared by template synthesis. They were characterized using UV-vis, IR, and EPR spectral techniques, microanalytical techniques, magnetic susceptibility, and molar conductivity measurements. The geometry of the metal complexes was examined and identified as square planar. DNA binding and viscosity studies revealed that the metal(II) complexes bind strongly to calf thymus DNA via an intercalation mechanism. The gel electrophoresis technique was used to investigate the DNA cleavage ability of the complexes, which were found to promote the cleavage of pBR322 DNA in the presence of the oxidant H2O2. These results indicate that the synthesized complexes exhibit good nuclease activity. Moreover, the complexes were screened for antimicrobial activity. The results revealed that the synthesized compounds were active against all the microbes under investigation.
System equivalent model mixing
NASA Astrophysics Data System (ADS)
Klaassen, Steven W. B.; van der Seijs, Maarten V.; de Klerk, Dennis
2018-05-01
This paper introduces SEMM: a method based on Frequency Based Substructuring (FBS) techniques that enables the construction of hybrid dynamic models. With System Equivalent Model Mixing (SEMM), frequency-based models, either of numerical or experimental nature, can be mixed to form a hybrid model. This model follows the dynamic behaviour of a predefined weighted master model. A large variety of applications can be thought of, such as the DoF-space expansion of relatively small experimental models using numerical models, or the blending of different models in the frequency spectrum. SEMM is outlined, both mathematically and conceptually, based on a notation commonly used in FBS. A critical physical interpretation of the theory is provided next, along with a comparison to similar techniques, namely DoF-expansion techniques. SEMM's concept is further illustrated by means of a numerical example. It will become apparent that the basic method of SEMM has some shortcomings which warrant a few extensions to the method. One of the main applications is tested in a practical case, performed on a validated benchmark structure; it emphasizes the practicality of the method.
A Novel Rules Based Approach for Estimating Software Birthmark
Binti Alias, Norma; Anwar, Sajid
2015-01-01
A software birthmark is a unique characteristic of a program that can be used to detect software theft. Comparing the birthmarks of two programs can tell us whether one is a copy of the other. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing software without the permission specified in the license agreement. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, namely credibility and resilience. For this purpose, soft computing concepts such as probabilistic and fuzzy computing are employed, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, indicates how much effort would be required to detect the originality of the software based on its birthmark. PMID:25945363
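A minimal sketch of a fuzzy rule based estimate in the spirit described above: triangular membership functions for credibility and resilience, a toy Mamdani-style rule base, and centroid defuzzification. The rule set and all breakpoints are illustrative assumptions, not the paper's rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a, b, c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def estimate_birthmark(credibility, resilience):
    """Toy rules: birthmark strength is high only if both properties are high."""
    low_c, high_c = tri(credibility, 0, 0, 0.6), tri(credibility, 0.4, 1, 1)
    low_r, high_r = tri(resilience, 0, 0, 0.6), tri(resilience, 0.4, 1, 1)
    strength = np.linspace(0, 1, 101)
    # min for AND, max for OR; aggregate scaled output sets, then take centroid
    agg = np.maximum(min(high_c, high_r) * tri(strength, 0.5, 1, 1),
                     max(low_c, low_r) * tri(strength, 0, 0, 0.5))
    return float(np.sum(agg * strength) / (np.sum(agg) + 1e-12))

print(estimate_birthmark(credibility=0.8, resilience=0.7))
```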
Parametric Model Based On Imputations Techniques for Partly Interval Censored Data
NASA Astrophysics Data System (ADS)
Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah
2017-12-01
The term ‘survival analysis’ has been used in a broad sense to describe a collection of statistical procedures for data analysis in which the outcome variable of interest is the time until an event occurs. The time to failure of a specific experimental unit might be censored, and the censoring can be right, left, interval, or partly interval censored (PIC). In this paper, analysis of this model was conducted based on a parametric Cox model via PIC data. Moreover, several imputation techniques were used: midpoint, left & right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, such as the Turnbull and Cox models, based on clinical trial data (breast cancer data), which showed the validity of the proposed model. Results for the data set indicated that the parametric Cox model was superior in terms of the estimation of survival functions, likelihood ratio tests, and their P-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median imputations showed better results with respect to the estimation of the survival function.
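A minimal sketch of the imputation step for interval-censored times, showing two of the schemes named above (midpoint, and random drawn uniformly within the censoring interval, the latter an assumption stated for illustration); after imputation, any fully parametric fit applies, illustrated with a one-line exponential MLE on made-up data:

```python
import numpy as np

rng = np.random.default_rng(42)

def impute_interval(left, right, how="midpoint"):
    """Turn interval-censored times [left, right] into point values."""
    left, right = np.asarray(left, float), np.asarray(right, float)
    return (left + right) / 2 if how == "midpoint" else rng.uniform(left, right)

# Hypothetical PIC data: some exact event times, some known only to an interval
exact = np.array([2.1, 3.4, 5.0])
lo, hi = np.array([1.0, 4.0]), np.array([2.0, 6.0])
times = np.concatenate([exact, impute_interval(lo, hi, "midpoint")])

# Exponential MLE as the simplest parametric fit (all events observed here)
lam_hat = len(times) / times.sum()
print(f"exponential rate estimate: {lam_hat:.3f}")
```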
Lehotsky, Á; Szilágyi, L; Bánsághi, S; Szerémy, P; Wéber, G; Haidegger, T
2017-09-01
Ultraviolet spectrum markers are widely used for hand hygiene quality assessment, although their microbiological validation has not been established. A microbiology-based assessment of the procedure was conducted. Twenty-five artificial hand models underwent initial full contamination, then disinfection with UV-dyed hand-rub solution, digital imaging under UV-light, microbiological sampling and cultivation, and digital imaging of the cultivated flora were performed. Paired images of each hand model were registered by a software tool, then the UV-marked regions were compared with the pathogen-free sites pixel by pixel. Statistical evaluation revealed that the method indicates correctly disinfected areas with 95.05% sensitivity and 98.01% specificity. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
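A minimal numpy sketch of the pixel-by-pixel comparison described above, assuming the two registered images have already been reduced to binary masks; the mask semantics (UV-marked = disinfected, growth = surviving pathogens) follow the abstract, but the helper name and the toy data are illustrative:

```python
import numpy as np

def pixelwise_agreement(uv_mask, growth_mask):
    """Sensitivity/specificity of UV-marked coverage vs. culture results.

    uv_mask: True where dyed hand rub was detected under UV light.
    growth_mask: True where cultivation later showed surviving pathogens.
    """
    uv, growth = np.asarray(uv_mask, bool), np.asarray(growth_mask, bool)
    clean = ~growth
    sensitivity = np.mean(uv[clean])     # marked, given pathogen-free
    specificity = np.mean(~uv[growth])   # unmarked, given pathogen growth
    return sensitivity, specificity

# Tiny hypothetical 2x3 registered image pair
uv = np.array([[1, 1, 0], [1, 0, 0]], bool)
growth = np.array([[0, 0, 1], [0, 1, 1]], bool)
print(pixelwise_agreement(uv, growth))
```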
Double-Edge Molecular Measurement of Lidar Wind Profiles in the VALID Campaign
NASA Technical Reports Server (NTRS)
Korb, C. Laurence; Flesia, Cristina; Lolli, Simone; Hirt, Christian
2000-01-01
We have developed a transportable, container-based direct detection Doppler lidar based on the double-edge molecular technique. The pulsed solid state system was built at the University of Geneva. It was used to make range-resolved measurements of the atmospheric wind field as part of the VALID campaign at the Observatoire de Haute Provence in Provence, France, in July 1999. Our lidar wind measurements, which were analyzed without knowledge of the results of rawinsonde measurements made under the supervision of ESA, show good agreement with these rawinsondes. These are the first Doppler lidar field measurements made with an eyesafe direct detection molecular-based system at 355 nm, and they serve as a demonstrator for future spaceborne direct detection wind systems such as the Atmospheric Dynamics Mission. Winds are an important contributor to sea surface temperature measurements made with the Tropical Rainfall Measuring Mission (TRMM) and also affect the TRMM rainfall estimates.
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
NASA Astrophysics Data System (ADS)
Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.
2016-05-01
Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.
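A minimal sketch of the hierarchical propagation idea: push input uncertainty through a micro-scale forward model and then through a system/sensor-level model by Monte Carlo. Both model functions are stand-ins with assumed forms, and the input distributions are illustrative, not the program's actual models.

```python
import numpy as np

rng = np.random.default_rng(7)

def micro_model(roughness, n_index):
    """Stand-in for a micro-scale optical forward model (assumed form)."""
    return 0.6 * np.exp(-roughness) + 0.1 * n_index

def sensor_model(reflectance, gain=1.2, offset=0.02):
    """Stand-in for the system/sensor-level model (assumed form)."""
    return gain * reflectance + offset

# Propagate input uncertainty through both model levels by Monte Carlo
roughness = rng.normal(0.5, 0.05, 100_000)   # illustrative input distributions
n_index = rng.normal(1.5, 0.02, 100_000)
signal = sensor_model(micro_model(roughness, n_index))
print(f"sensor-level mean = {signal.mean():.4f}, std = {signal.std():.4f}")
```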
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
NASA Astrophysics Data System (ADS)
Liu, Yutong; Uberti, Mariano; Dou, Huanyu; Mosley, R. Lee; Gendelman, Howard E.; Boska, Michael D.
2009-02-01
Coregistration of in vivo magnetic resonance imaging (MRI) with histology provides validation of disease biomarker and pathobiology studies. Although thin-plate splines are widely used in such image registration, point landmark selection is error prone and often time-consuming. We present a technique to optimize landmark selection for thin-plate splines and demonstrate its usefulness in warping rodent brain MRI to histological sections. In this technique, contours are drawn on the corresponding MRI slices and images of histological sections. The landmarks are extracted from the contours by equal spacing and then optimized by minimizing a cost function consisting of the landmark displacement and contour curvature. The technique was validated using simulation data and brain MRI-histology coregistration in a murine model of HIV-1 encephalitis. Registration error was quantified by calculating the target registration error (TRE). The TRE of approximately 8 pixels for 20-80 landmarks without optimization was stable across different landmark numbers. The optimized results were more accurate at low landmark numbers (TRE of approximately 2 pixels for 50 landmarks), while the accuracy decreased (TRE of approximately 8 pixels) for larger numbers of landmarks (70-80). These results demonstrate that the accuracy of the optimized registration decreases with increasing landmark number, and that the optimization offers more confidence in MRI-histology registration using thin-plate splines.
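A minimal sketch of a thin-plate-spline landmark warp with a held-out TRE check, using SciPy's RBFInterpolator (available from SciPy 1.7, with a thin-plate-spline kernel); the landmark coordinates are made up and the leave-one-out split is just for illustration:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical matched landmarks (pixels): MRI (source) -> histology (target)
src = np.array([[10, 12], [40, 15], [25, 40], [5, 35], [45, 42]], float)
dst = np.array([[12, 13], [41, 18], [27, 43], [7, 36], [46, 45]], float)

# Fit one thin-plate-spline interpolant per output coordinate on 4 landmarks,
# holding the fifth out (TRE at fitting landmarks would be trivially ~0)
warp = RBFInterpolator(src[:4], dst[:4], kernel="thin_plate_spline")

# Target registration error at the held-out landmark
tre = np.linalg.norm(warp(src[4:]) - dst[4:], axis=1)
print(f"TRE at held-out landmark: {tre[0]:.2f} pixels")
```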
Measuring Brain Connectivity: Diffusion Tensor Imaging Validates Resting State Temporal Correlations
Skudlarski, Pawel; Jagannathan, Kanchana; Calhoun, Vince D.; Hampson, Michelle; Skudlarska, Beata A.; Pearlson, Godfrey
2015-01-01
Diffusion tensor imaging (DTI) and resting state temporal correlations (RSTC) are two leading techniques for investigating the connectivity of the human brain. They have been widely used to investigate the strength of anatomical and functional connections between distant brain regions in healthy subjects, and in clinical populations. Though they are both based on magnetic resonance imaging (MRI) they have not yet been compared directly. In this work both techniques were employed to create global connectivity matrices covering the whole brain gray matter. This allowed for direct comparisons between functional connectivity measured by RSTC with anatomical connectivity quantified using DTI tractography. We found that connectivity matrices obtained using both techniques showed significant agreement. Connectivity maps created for a priori defined anatomical regions showed significant correlation, and furthermore agreement was especially high in regions showing strong overall connectivity, such as those belonging to the default mode network. Direct comparison between functional RSTC and anatomical DTI connectivity, presented here for the first time, links two powerful approaches for investigating brain connectivity and shows their strong agreement. It provides a crucial multi-modal validation for resting state correlations as representing neuronal connectivity. The combination of both techniques presented here allows for further combining them to provide richer representation of brain connectivity both in the healthy brain and in clinical conditions. PMID:18771736
Skudlarski, Pawel; Jagannathan, Kanchana; Calhoun, Vince D; Hampson, Michelle; Skudlarska, Beata A; Pearlson, Godfrey
2008-11-15
Diffusion tensor imaging (DTI) and resting state temporal correlations (RSTC) are two leading techniques for investigating the connectivity of the human brain. They have been widely used to investigate the strength of anatomical and functional connections between distant brain regions in healthy subjects, and in clinical populations. Though they are both based on magnetic resonance imaging (MRI) they have not yet been compared directly. In this work both techniques were employed to create global connectivity matrices covering the whole brain gray matter. This allowed for direct comparisons between functional connectivity measured by RSTC with anatomical connectivity quantified using DTI tractography. We found that connectivity matrices obtained using both techniques showed significant agreement. Connectivity maps created for a priori defined anatomical regions showed significant correlation, and furthermore agreement was especially high in regions showing strong overall connectivity, such as those belonging to the default mode network. Direct comparison between functional RSTC and anatomical DTI connectivity, presented here for the first time, links two powerful approaches for investigating brain connectivity and shows their strong agreement. It provides a crucial multi-modal validation for resting state correlations as representing neuronal connectivity. The combination of both techniques presented here allows for further combining them to provide richer representation of brain connectivity both in the healthy brain and in clinical conditions.
Symptom-based categorization of in-flight passenger medical incidents.
Mahony, Paul H; Myers, Julia A; Larsen, Peter D; Powell, David M C; Griffiths, Robin F
2011-12-01
The majority of in-flight passenger medical events are managed by cabin crew. Our study aimed to evaluate the reliability of cabin crew reports of in-flight medical events and to develop a symptom-based categorization system. All cabin crew in-flight passenger medical incident reports for an airline over a 9-yr period were examined retrospectively. Validation of incident descriptions was undertaken on a sample of 162 cabin crew reports where medically trained persons' reports were available for comparison, using a three-round Delphi technique and testing concordance using Cohen's Kappa. A hierarchical symptom-based categorization system was designed and validated. The rate was 159 incidents per million passengers carried, or 70.4/113.3 incidents per million revenue passenger kilometres/miles, respectively. Concordance between cabin crew and medical reports was 96%, with a high validity rating (mean 4.6 on a 1-5 scale) and high Cohen's Kappa (0.94). The most common in-flight medical events were transient loss of consciousness (41%), nausea/vomiting/diarrhea (19.5%), and breathing difficulty (16%). Cabin crew records provide reliable data regarding in-flight passenger medical incidents, complementary to diagnosis-based systems, and allow the use of currently underutilized data. The categorization system provides a means for tracking passenger medical incidents internationally and an evidence base for cabin crew first aid training.
Correction techniques for depth errors with stereo three-dimensional graphic displays
NASA Technical Reports Server (NTRS)
Parrish, Russell V.; Holden, Anthony; Williams, Steven P.
1992-01-01
Three-dimensional (3-D), 'real-world' pictorial displays that incorporate 'true' depth cues via stereopsis techniques have proved effective for displaying complex information in a natural way to enhance situational awareness and to improve pilot/vehicle performance. In such displays, the display designer must map the depths in the real world to the depths available with the stereo display system. However, empirical data have shown that the human subject does not perceive the information at exactly the depth at which it is mathematically placed. Head movements can also seriously distort the depth information that is embedded in stereo 3-D displays because the transformations used in mapping the visual scene to the depth-viewing volume (DVV) depend intrinsically on the viewer location. The goal of this research was to provide two correction techniques: the first corrects the original visual-scene-to-DVV mapping based on human perception errors, and the second (which is based on head-positioning sensor input data) corrects for errors induced by head movements. Empirical data are presented to validate both correction techniques. A combination of the two correction techniques effectively eliminates the distortions of depth information embedded in stereo 3-D displays.
The Myotonometer: Not a Valid Measurement Tool for Active Hamstring Musculotendinous Stiffness.
Pamukoff, Derek N; Bell, Sarah E; Ryan, Eric D; Blackburn, J Troy
2016-05-01
Hamstring musculotendinous stiffness (MTS) is associated with lower-extremity injury risk (ie, hamstring strain, anterior cruciate ligament injury) and is commonly assessed using the damped oscillatory technique. However, despite a preponderance of studies that measure MTS reliably in laboratory settings, there are no valid clinical measurement tools. A valid clinical measurement technique is needed to assess MTS and permit identification of individuals at heightened risk of injury and track rehabilitation progress. Objective: To determine the validity and reliability of the Myotonometer for measuring active hamstring MTS. Design: Descriptive laboratory study. Setting: Laboratory. Participants: 33 healthy participants (15 men, age 21.33 ± 2.94 y, height 172.03 ± 16.36 cm, mass 74.21 ± 16.36 kg). Methods: Hamstring MTS was assessed using the damped oscillatory technique and the Myotonometer. Intraclass correlations were used to determine the intrasession, intersession, and interrater reliability of the Myotonometer. Criterion validity was assessed via Pearson product-moment correlation between MTS measures obtained from the Myotonometer and from the damped oscillatory technique. Results: The Myotonometer demonstrated good intrasession (ICC(3,1) = .807) and interrater reliability (ICC(2,k) = .830) and moderate intersession reliability (ICC(2,k) = .693). However, it did not provide a valid measurement of MTS compared with the damped oscillatory technique (r = .346, P = .061). Conclusions: The Myotonometer does not provide a valid measure of active hamstring MTS. Although the Myotonometer does not measure active MTS, it possesses good reliability and portability and could be used clinically to measure tissue compliance, muscle tone, or spasticity associated with multiple musculoskeletal disorders. Future research should focus on portable and clinically applicable tools to measure active hamstring MTS in efforts to prevent and monitor injuries.
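A minimal sketch of computing ICC variants like ICC(3,1) and ICC(2,k) from repeated ratings, using the pingouin package (assumed installed); the participant/rater data are made up:

```python
import pandas as pd
import pingouin as pg  # assumes the pingouin package is installed

# Hypothetical repeated stiffness readings: 4 participants x 3 raters
df = pd.DataFrame({
    "subject": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "rater":   ["A", "B", "C"] * 4,
    "stiffness": [21.1, 20.8, 21.5, 18.2, 18.6, 18.0,
                  25.3, 24.9, 25.8, 19.7, 20.1, 19.5],
})
icc = pg.intraclass_corr(data=df, targets="subject",
                         raters="rater", ratings="stiffness")
# ICC(3,1) corresponds to the 'ICC3' row; ICC(2,k) to the 'ICC2k' row
print(icc[["Type", "ICC"]])
```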
Takada, M; Sugimoto, M; Ohno, S; Kuroi, K; Sato, N; Bando, H; Masuda, N; Iwata, H; Kondo, M; Sasano, H; Chow, L W C; Inamoto, T; Naito, Y; Tomita, M; Toi, M
2012-07-01
A nomogram, a standard technique that uses multiple characteristics to predict treatment efficacy and the likelihood of a specific status for an individual patient, has been used for prediction of response to neoadjuvant chemotherapy (NAC) in breast cancer patients. The aim of this study was to develop a novel computational technique to predict the pathological complete response (pCR) to NAC in primary breast cancer patients. A mathematical model using alternating decision trees, a variant of the decision tree, was developed using 28 clinicopathological variables that were retrospectively collected from patients treated with NAC (n = 150), and validated using an independent dataset from a randomized controlled trial (n = 173). The model selected 15 variables to predict the pCR, yielding area under the receiver operating characteristic curve (AUC) values of 0.766 [95% confidence interval (CI) 0.671-0.861, P < 0.0001] in cross-validation using the training dataset and 0.787 (95% CI 0.716-0.858, P < 0.0001) in the validation dataset. Among the three subtypes of breast cancer, the luminal subgroup showed the best discrimination (AUC = 0.779, 95% CI 0.641-0.917, P = 0.0059). The developed model (AUC = 0.805, 95% CI 0.716-0.894, P < 0.0001) outperformed multivariate logistic regression (AUC = 0.754, 95% CI 0.651-0.858, P = 0.00019) on the validation dataset without missing values (n = 127). Several analyses, e.g. bootstrap analysis, revealed that the developed model was insensitive to missing values and also tolerant of distribution bias among the datasets. Our model based on clinicopathological variables showed high predictive ability for pCR. This model might improve the prediction of the response to NAC in primary breast cancer patients.
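A minimal sketch of reporting an AUC with a confidence interval of the kind quoted above, via a percentile bootstrap (one common approach; the paper does not specify its CI method, and the labels/scores below are synthetic):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def auc_with_ci(y_true, y_score, n_boot=2000, alpha=0.05):
    """Point AUC plus a percentile bootstrap confidence interval."""
    y_true, y_score = np.asarray(y_true), np.asarray(y_score)
    aucs = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y_true), len(y_true))
        if len(np.unique(y_true[idx])) < 2:   # resample must keep both classes
            continue
        aucs.append(roc_auc_score(y_true[idx], y_score[idx]))
    lo, hi = np.percentile(aucs, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return roc_auc_score(y_true, y_score), (lo, hi)

# Hypothetical pCR labels and model scores
y = rng.integers(0, 2, 150)
s = np.clip(y * 0.3 + rng.random(150) * 0.7, 0, 1)
print(auc_with_ci(y, s))
```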
Alahmad, Shoeb; Elfatatry, Hamed M; Mabrouk, Mokhtar M; Hammad, Sherin F; Mansour, Fotouh R
2018-01-01
The development and introduction of combination therapies represent a challenge for analysis, due to severe overlap of the components' UV spectra in the case of spectroscopy, or the requirement of a long, tedious, and high-cost separation technique in the case of chromatography. Quality control laboratories have to develop and validate suitable analytical procedures in order to assay such multi-component preparations. New spectrophotometric methods for the simultaneous determination of simvastatin (SIM) and nicotinic acid (NIA) in binary combinations were developed. These methods are based on chemometric treatment of data; the applied chemometric techniques are multivariate methods including classical least squares (CLS), principal component regression (PCR), and partial least squares (PLS). In these techniques, the concentration data matrix was prepared using synthetic mixtures containing SIM and NIA dissolved in ethanol. The absorbance data matrix corresponding to the concentration data matrix was obtained by measuring the absorbance at 12 wavelengths in the range 216-240 nm at 2 nm intervals in the zero-order spectrum. The spectrophotometric procedures do not require any separation step. The accuracy, precision, and linearity ranges of the methods have been determined and validated by analyzing synthetic mixtures containing the studied drugs. Chemometric spectrophotometric methods have thus been developed for the simultaneous determination of simvastatin and nicotinic acid in their synthetic binary mixtures and in their mixtures with possible excipients present in the tablet dosage form. The validation was performed successfully. The developed methods have been shown to be accurate, linear, precise, and simple, and can be used routinely for the determination of the dosage form. Copyright © Bentham Science Publishers.
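A minimal sketch of the PLS variant of such a calibration: simulate Beer-Lambert mixture spectra at 12 wavelengths, fit a PLS model on spectra vs. concentrations, and predict both analytes in an "unknown" mixture. The pure-component spectra and concentration ranges are made up; only the 12-wavelength, two-analyte setup follows the abstract.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)

# Hypothetical training set: absorbances at 12 wavelengths (216-240 nm, 2 nm
# steps) for synthetic SIM/NIA mixtures, with invented pure-component spectra
n_wl = 12
pure = np.vstack([rng.random(n_wl), rng.random(n_wl)])   # rows: SIM, NIA
conc = rng.uniform(2, 20, size=(30, 2))                  # e.g. ug/mL
A = conc @ pure + 0.01 * rng.standard_normal((30, n_wl)) # Beer-Lambert + noise

pls = PLSRegression(n_components=2).fit(A, conc)

# Predict both analytes in an "unknown" mixture from its spectrum alone
unknown = np.array([[10.0, 5.0]]) @ pure
print(pls.predict(unknown))   # ~ [10, 5]
```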
Shape Optimization by Bayesian-Validated Computer-Simulation Surrogates
NASA Technical Reports Server (NTRS)
Patera, Anthony T.
1997-01-01
A nonparametric-validated, surrogate approach to optimization has been applied to the computational optimization of eddy-promoter heat exchangers and to the experimental optimization of a multielement airfoil. In addition to the baseline surrogate framework, a surrogate-Pareto framework has been applied to the two-criteria, eddy-promoter design problem. The Pareto analysis improves the predictability of the surrogate results, preserves generality, and provides a means to rapidly determine design trade-offs. Significant contributions have been made in the geometric description used for the eddy-promoter inclusions as well as to the surrogate framework itself. A level-set based, geometric description has been developed to define the shape of the eddy-promoter inclusions. The level-set technique allows for topology changes (from single-body, eddy-promoter configurations to two-body configurations) without requiring any additional logic. The continuity of the output responses for input variations that cross the boundary between topologies has been demonstrated. Input-output continuity is required for the straightforward application of surrogate techniques in which simplified, interpolative models are fitted through a construction set of data. The surrogate framework developed previously has been extended in a number of ways. First, the formulation for a general, two-output, two-performance metric problem is presented. Surrogates are constructed and validated for the outputs. The performance metrics can be functions of both outputs, as well as explicitly of the inputs, and serve to characterize the design preferences. By segregating the outputs and the performance metrics, an additional level of flexibility is provided to the designer. The validated outputs can be used in future design studies and the error estimates provided by the output validation step still apply, and require no additional appeals to the expensive analysis. Second, a candidate-based a posteriori error analysis capability has been developed which provides probabilistic error estimates on the true performance for a design randomly selected near the surrogate-predicted optimal design.
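A minimal sketch of the construct-then-validate surrogate pattern: fit an interpolative model through a construction set of expensive evaluations, then estimate its error on an independent validation set. The "expensive analysis" is a stand-in function with an assumed form, and the RBF interpolant is one generic choice, not the framework's specific surrogate.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(5)

def expensive_analysis(x):
    """Stand-in for the costly simulation/experiment (assumed form)."""
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# Construction set: fit a cheap interpolative surrogate through it
X_con = rng.uniform(0, 1, size=(40, 2))
surrogate = RBFInterpolator(X_con, expensive_analysis(X_con))

# Validation set: fresh points yield an error estimate, so later design
# studies can reuse the surrogate without new expensive evaluations
X_val = rng.uniform(0, 1, size=(15, 2))
err = np.abs(surrogate(X_val) - expensive_analysis(X_val))
print(f"max validation error: {err.max():.4f}")
```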
A comparison of computer-assisted and manual wound size measurement.
Thawer, Habiba A; Houghton, Pamela E; Woodbury, M Gail; Keast, David; Campbell, Karen
2002-10-01
Accurate and precise wound measurements are a critical component of every wound assessment. To examine the reliability and validity of a new computerized technique for measuring human and animal wounds, chronic human wounds (N = 45) and surgical animal wounds (N = 38) were assessed using manual and computerized techniques. Using intraclass correlation coefficients, intrarater and interrater reliability of surface area measurements obtained using the computerized technique were compared to those obtained using acetate tracings and planimetry. A single measurement of surface area using either technique produced excellent intrarater and interrater reliability for both human and animal wounds, but the computerized technique was more precise than the manual technique for measuring the surface area of animal wounds. For both types of wounds and measurement techniques, intrarater and interrater reliability improved when the average of three repeated measurements was obtained. The precision of each technique with human wounds and the precision of the manual technique with animal wounds also improved when three repeated measurement results were averaged. Concurrent validity between the two techniques was excellent for human wounds but poor for the smaller animal wounds, regardless of whether single or the average of three repeated surface area measurements was used. The computerized technique permits reliable and valid assessment of the surface area of both human and animal wounds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nallasivam, Ulaganathan; Shah, Vishesh H.; Shenvi, Anirudh A.
We present a general Global Minimization Algorithm (GMA) to identify basic or thermally coupled distillation configurations that require the least vapor duty under minimum reflux conditions for separating any ideal or near-ideal multicomponent mixture into a desired number of product streams. In this algorithm, global optimality is guaranteed by modeling the system using Underwood equations and reformulating the resulting constraints to bilinear inequalities. The speed of convergence to the globally optimal solution is increased by using appropriate feasibility and optimality based variable-range reduction techniques and by developing valid inequalities. As a result, the GMA can be coupled with already developed techniques that enumerate basic and thermally coupled distillation configurations, to provide for the first time a global optimization based rank-list of distillation configurations.
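The abstract does not spell out how the bilinear inequalities are relaxed; a standard device in such global optimization schemes is the McCormick envelope of a product term, sketched here under that assumption. Tighter variable ranges, of the kind produced by feasibility/optimality based range reduction, directly shrink the envelope gap, which is why range reduction speeds global convergence.

```python
def mccormick_bounds(x, y, xL, xU, yL, yU):
    """McCormick envelope values for the bilinear term w = x*y at (x, y),
    given variable ranges x in [xL, xU] and y in [yL, yU]."""
    lower = max(xL * y + x * yL - xL * yL,
                xU * y + x * yU - xU * yU)
    upper = min(xU * y + x * yL - xU * yL,
                xL * y + x * yU - xL * yU)
    return lower, upper

# Example: at x = y = 0.5 on unit ranges, w = 0.25 lies within (0.0, 0.5)
print(mccormick_bounds(0.5, 0.5, 0, 1, 0, 1))
```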
Compound synchronization of four memristor chaotic oscillator systems and secure communication.
Sun, Junwei; Shen, Yi; Yin, Quan; Xu, Chengjie
2013-03-01
In this paper, a novel kind of compound synchronization among four chaotic systems is investigated, where the drive systems have been conceptually divided into two categories: scaling drive systems and base drive systems. Firstly, a sufficient condition is obtained to ensure compound synchronization among four memristor chaotic oscillator systems based on the adaptive technique. Secondly, a secure communication scheme via adaptive compound synchronization of four memristor chaotic oscillator systems is presented. The corresponding theoretical proofs and numerical simulations are given to demonstrate the validity and feasibility of the proposed control technique. The unpredictability of scaling drive systems can additionally enhance the security of communication. The transmitted signals can be split into several parts loaded in the drive systems to improve the reliability of communication.
NASA Astrophysics Data System (ADS)
Pavlov, Al. A.; Shevchenko, A. M.; Khotyanovsky, D. V.; Pavlov, A. A.; Shmakov, A. S.; Golubev, M. P.
2017-10-01
We present a method for and results of determination of the field of integral density in the structure of flow corresponding to the Mach interaction of shock waves at Mach number M = 3. The optical diagnostics of flow was performed using an interference technique based on self-adjusting Zernike filters (SA-AVT method). Numerical simulations were carried out using the CFS3D program package for solving the Euler and Navier-Stokes equations. Quantitative data on the distribution of integral density on the path of probing radiation in one direction of 3D flow transillumination in the region of Mach interaction of shock waves were obtained for the first time.
Development of Modal Test Techniques for Validation of a Solar Sail Design
NASA Technical Reports Server (NTRS)
Gaspar, James L.; Mann, Troy; Behun, Vaughn; Wilkie, W. Keats; Pappa, Richard
2004-01-01
This paper focuses on the development of modal test techniques for validation of a solar sail gossamer space structure design. The major focus is on validating and comparing the capabilities of various excitation techniques for modal testing solar sail components. One triangular shaped quadrant of a solar sail membrane was tested in a 1 Torr vacuum environment using various excitation techniques including, magnetic excitation, and surface-bonded piezoelectric patch actuators. Results from modal tests performed on the sail using piezoelectric patches at different positions are discussed. The excitation methods were evaluated for their applicability to in-vacuum ground testing and to the development of on orbit flight test techniques. The solar sail membrane was tested in the horizontal configuration at various tension levels to assess the variation in frequency with tension in a vacuum environment. A segment of a solar sail mast prototype was also tested in ambient atmospheric conditions using various excitation techniques, and these methods are also assessed for their ground test capabilities and on-orbit flight testing.
NASA Astrophysics Data System (ADS)
Sehad, Mounir; Lazri, Mourad; Ameur, Soltane
2017-03-01
In this work, a new rainfall estimation technique based on the high spatial and temporal resolution of the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) aboard Meteosat Second Generation (MSG) is presented. This work proposes an efficient rainfall estimation scheme based on two multiclass support vector machine (SVM) algorithms: SVM_D for daytime and SVM_N for nighttime rainfall estimation. Both SVM models are trained using relevant rainfall parameters based on optical, microphysical, and textural cloud properties. The cloud parameters are derived from the spectral channels of the SEVIRI MSG radiometer. The 3-hourly and daily accumulated rainfall are derived from the 15-min rainfall estimates given by the SVM classifiers for each MSG image pixel. The SVMs were trained with ground meteorological radar precipitation scenes recorded from November 2006 to March 2007 over the north of Algeria, located in the Mediterranean region. Further, the SVM_D and SVM_N models were used to estimate 3-hourly and daily rainfall using a data set gathered from November 2010 to March 2011 over north Algeria. The results were validated against collocated rainfall observed by a rain gauge network. The statistical scores (correlation coefficient, bias, root mean square error, and mean absolute error) showed good accuracy of the rainfall estimates from the present technique. Moreover, the rainfall estimates of our technique were compared with two high-accuracy rainfall estimation methods based on MSG SEVIRI imagery, namely a random forest (RF) based approach and an artificial neural network (ANN) based technique. The present technique yields a higher correlation coefficient (3-hourly: 0.78; daily: 0.94) and lower mean absolute error and root mean square error values. The results show that the new technique estimates 3-hourly and daily rainfall with better accuracy than the ANN technique and the RF model.
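A minimal sketch of a multiclass SVM classifier of the kind described, using scikit-learn; the per-pixel features, class definitions, and hyperparameters are illustrative stand-ins for the SEVIRI-derived predictors and radar-derived rain classes:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(9)

# Hypothetical per-pixel SEVIRI predictors (e.g. brightness temperatures,
# channel differences, texture measures) and radar-derived rain classes
X = rng.standard_normal((500, 6))
y = rng.integers(0, 3, 500)          # 0: no rain, 1: light, 2: convective

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X, y)                        # SVC handles multiclass via one-vs-one

# Per-pixel 15-min class maps would then be accumulated to 3-hourly/daily sums
print(clf.predict(X[:5]))
```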
Maier, Jürgen; Hampe, J Felix; Jahn, Nico
2016-01-01
Real-time response (RTR) measurement is an important technique for analyzing human processing of electronic media stimuli. Although it has been demonstrated that RTR data are reliable and internally valid, some argue that they lack external validity. The reason for this is that RTR measurement is restricted to a laboratory environment due to its technical requirements. This paper introduces a smartphone app that 1) captures real-time responses using the dial technique and 2) provides a solution for one of the most important problems in RTR measurement, the (automatic) synchronization of RTR data. In addition, it explores the reliability and validity of mobile RTR measurement by comparing the real-time reactions of two samples of young and well-educated voters to the 2013 German televised debate. Whereas the first sample participated in a classical laboratory study, the second sample was equipped with our mobile RTR system and watched the debate at home. Results indicate that the mobile RTR system yields similar results to the lab-based RTR measurement, providing evidence that laboratory studies using RTR are externally valid. In particular, the argument that the artificial reception situation creates artificial results has to be questioned. In addition, we conclude that RTR measurement outside the lab is possible. Hence, mobile RTR opens the door for large-scale studies to better understand the processing and impact of electronic media content.
Maier, Jürgen; Hampe, J. Felix; Jahn, Nico
2016-01-01
Real-time response (RTR) measurement is an important technique for analyzing human processing of electronic media stimuli. Although it has been demonstrated that RTR data are reliable and internally valid, some argue that they lack external validity. The reason for this is that RTR measurement is restricted to a laboratory environment due to its technical requirements. This paper introduces a smartphone app that 1) captures real-time responses using the dial technique and 2) provides a solution for one of the most important problems in RTR measurement, the (automatic) synchronization of RTR data. In addition, it explores the reliability and validity of mobile RTR measurement by comparing the real-time reactions of two samples of young and well-educated voters to the 2013 German televised debate. Whereas the first sample participated in a classical laboratory study, the second sample was equipped with our mobile RTR system and watched the debate at home. Results indicate that the mobile RTR system yields similar results to the lab-based RTR measurement, providing evidence that laboratory studies using RTR are externally valid. In particular, the argument that the artificial reception situation creates artificial results has to be questioned. In addition, we conclude that RTR measurement outside the lab is possible. Hence, mobile RTR opens the door for large-scale studies to better understand the processing and impact of electronic media content. PMID:27274577
Flight control system design factors for applying automated testing techniques
NASA Technical Reports Server (NTRS)
Sitz, Joel R.; Vernon, Todd H.
1990-01-01
Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.
Simulation verification techniques study. Subsystem simulation validation techniques
NASA Technical Reports Server (NTRS)
Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.
1974-01-01
Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters are presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions and recommendations are also given.
Evidence-based hypnotherapy for depression.
Alladin, Assen
2010-04-01
Cognitive hypnotherapy (CH) is a comprehensive evidence-based hypnotherapy for clinical depression. This article describes the major components of CH, which integrate hypnosis with cognitive-behavior therapy as the latter provides an effective host theory for the assimilation of empirically supported treatment techniques derived from various theoretical models of psychotherapy and psychopathology. CH meets criteria for an assimilative model of psychotherapy, which is considered to be an efficacious model of psychotherapy integration. The major components of CH for depression are described in sufficient detail to allow replication, verification, and validation of the techniques delineated. CH for depression provides a template that clinicians and investigators can utilize to study the additive effects of hypnosis in the management of other psychological or medical disorders. Evidence-based hypnotherapy and research are encouraged; such a movement is necessary if clinical hypnosis is to integrate into mainstream psychotherapy.
The VALiDATe29 MRI Based Multi-Channel Atlas of the Squirrel Monkey Brain.
Schilling, Kurt G; Gao, Yurui; Stepniewska, Iwona; Wu, Tung-Lin; Wang, Feng; Landman, Bennett A; Gore, John C; Chen, Li Min; Anderson, Adam W
2017-10-01
We describe the development of the first digital atlas of the normal squirrel monkey brain and present the resulting product, VALiDATe29. The VALiDATe29 atlas is based on multiple types of magnetic resonance imaging (MRI) contrast acquired on 29 squirrel monkeys, and is created using unbiased, nonlinear registration techniques, resulting in a population-averaged stereotaxic coordinate system. The atlas consists of multiple anatomical templates (proton density, T1, and T2* weighted), diffusion MRI templates (fractional anisotropy and mean diffusivity), and ex vivo templates (fractional anisotropy and a structural MRI). In addition, the templates are combined with histologically defined cortical labels, and diffusion tractography defined white matter labels. The combination of intensity templates and image segmentations make this atlas suitable for the fundamental atlas applications of spatial normalization and label propagation. Together, this atlas facilitates 3D anatomical localization and region of interest delineation, and enables comparisons of experimental data across different subjects or across different experimental conditions. This article describes the atlas creation and its contents, and demonstrates the use of the VALiDATe29 atlas in typical applications. The atlas is freely available to the scientific community.
Application of Petri net based analysis techniques to signal transduction pathways.
Sackmann, Andrea; Heiner, Monika; Koch, Ina
2006-11-02
Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODE-based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the system's behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context, special properties have to be considered and new dedicated techniques have to be designed. We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to systematically build a discrete model that provably reflects the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as a case study. We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units. The paper demonstrates how Petri net analysis techniques can promote a deeper understanding of signal transduction pathways. The new concepts of feasible t-invariants and MCT-sets have been proven to be useful for model validation and the interpretation of the biological system behaviour. Whereas MCT-sets provide a decomposition of the net into disjunctive subnets, feasible t-invariants describe subnets, which generally overlap. This work contributes to qualitative modelling and to the analysis of large biological networks by their fully automatic decomposition into biologically meaningful modules.
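A minimal sketch of the t-invariant computation underlying this analysis, on a toy net, using sympy (assumed available): t-invariants are nonnegative integer vectors x with C·x = 0, where C is the place-by-transition incidence matrix. The rational nullspace is only a starting point; extracting the minimal nonnegative integer invariants requires dedicated algorithms (e.g. Fourier-Motzkin elimination), which this sketch does not implement.

```python
from sympy import Matrix

# Incidence matrix C (rows = places, columns = transitions) of a tiny net:
# t1 consumes p1 and produces p2; t2 consumes p2 and produces p1
C = Matrix([[-1,  1],
            [ 1, -1]])

# Basis of solutions of C*x = 0; nonnegative integer combinations of these
# are candidate t-invariants
for vec in C.nullspace():
    print(vec.T)   # -> [1, 1]: firing t1 then t2 reproduces the marking
```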
Gupta, Sarthak; Chan, Diana W; Zaal, Kristien J; Kaplan, Mariana J
2018-01-15
Neutrophils play a key role in host defenses and have recently been implicated in the pathogenesis of autoimmune diseases by various mechanisms, including formation of neutrophil extracellular traps through a recently described distinct form of programmed cell death called NETosis. Techniques to assess and quantitate NETosis in an unbiased, reproducible, and efficient way are lacking, considerably limiting the advancement of research in this field. We optimized and validated a new method to automatically quantify the percentage of neutrophils undergoing NETosis in real time using the IncuCyte ZOOM imaging platform and the membrane-permeability properties of two DNA dyes. Neutrophils undergoing NETosis induced by various physiological stimuli showed distinct changes, with a loss of multilobulated nuclei, as well as nuclear decondensation followed by membrane compromise, and were accurately counted by applying filters based on fluorescence intensity and nuclear size. Findings were confirmed and validated with the established method of immunofluorescence microscopy. The platform was also validated to rapidly assess and quantify the dose-dependent effect of inhibitors of NETosis. In addition, this method was able to distinguish among neutrophils undergoing NETosis, apoptosis, or necrosis based on distinct changes in nuclear morphology and membrane integrity. The IncuCyte ZOOM platform is a novel real-time assay that quantifies NETosis in a rapid, automated, and reproducible way, significantly optimizing the study of neutrophils. This platform is a powerful tool to assess neutrophil physiology and NETosis, as well as to swiftly develop and test novel neutrophil targets.
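The counting logic described (intensity filter on a membrane-impermeable dye plus a nuclear-size cutoff) can be sketched generically; this is an illustrative stand-in, not the IncuCyte ZOOM software, and all thresholds and channel data below are invented.

```python
# Illustrative NET counting: objects positive for a membrane-impermeable DNA dye
# AND larger than a size cutoff (decondensed nuclei are bigger) count as NETs.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
permeable = rng.random((256, 256))      # stand-in "all nuclei" channel
impermeable = rng.random((256, 256))    # stand-in "compromised membrane" channel

nuclei, n_total = ndimage.label(permeable > 0.995)         # hypothetical threshold
netotic_mask = (impermeable > 0.995) & (nuclei > 0)
netotic, _ = ndimage.label(netotic_mask)
sizes = ndimage.sum_labels(netotic_mask, netotic, index=np.arange(1, netotic.max() + 1))
n_nets = int(np.sum(sizes >= 4))        # hypothetical minimum area, in pixels
print(f"%NETosis ~ {100.0 * n_nets / max(n_total, 1):.1f}")
```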
Guidance and Control Systems Simulation and Validation Techniques
1988-07-01
AGARDograph No. 273: Guidance and Control Systems Simulation and Validation Techniques, edited by Dr William P. Albritton, Jr, AMTEC Corporation, 213 Ridgelawn Drive, Athens, AL 35611, USA. Includes the chapter "…and Development Process for Tactical Guided Weapons" by Dr W. P. Albritton, Jr. Summary: A brief…
Classification and Validation of Behavioral Subtypes of Learning-Disabled Children.
ERIC Educational Resources Information Center
Speece, Deborah L.; And Others
1985-01-01
Using the Classroom Behavior Inventory, teachers rated the behaviors of 63 school-identified, learning-disabled first and second graders. Hierarchical cluster analysis techniques identified seven distinct behavioral subtypes. Internal validation techniques indicated that the subtypes were replicable and had profile patterns different from a sample…
Design and validation of diffusion MRI models of white matter
NASA Astrophysics Data System (ADS)
Jelescu, Ileana O.; Budde, Matthew D.
2017-11-01
Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open questions and converge towards consensus.
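For reference, the two scalar maps named above follow directly from the eigenvalues of the fitted diffusion tensor; the sketch below evaluates the standard definitions on illustrative, white-matter-like eigenvalues.

```python
# Standard diffusion-tensor scalar maps: mean diffusivity (MD) and fractional
# anisotropy (FA) from the three tensor eigenvalues. Eigenvalues are
# illustrative values (units of 1e-3 mm^2/s) typical of healthy white matter.
import numpy as np

lam = np.array([1.6, 0.35, 0.35])                 # tensor eigenvalues (assumed)
md = lam.mean()
fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
print(f"MD = {md:.3f} (1e-3 mm^2/s), FA = {fa:.3f}")
```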
ERIC Educational Resources Information Center
Lou, Yu-Chiung; Lin, Hsiao-Fang; Lin, Chin-Wen
2013-01-01
The aims of the study were (a) to develop a scale to measure university students' task value and (b) to use confirmatory factor analytic techniques to investigate the construct validity of the scale. The questionnaire items were developed based on theoretical considerations and the final version contained 38 items divided into 4 subscales.…
Exploring Dreamspace through Video Art with At-Risk Youth
ERIC Educational Resources Information Center
Ehinger, Jon
2009-01-01
This thesis is an art-based research video demonstration of an alternate medium for art therapy. It postulates the value and validity of media arts as a therapeutic modality by way of adopting the major motion picture green screening technique for therapy with an at-risk youth population. Four male participants, ranging from 16 to 19 years of age,…
Developing an ICT-Literacy Task-Based Assessment Instrument: The Findings on the Final Testing Phase
ERIC Educational Resources Information Center
Mat-jizat, Jessnor Elmy
2013-01-01
This paper reports the findings of a study which seeks to identify the information and communications technology (ICT) literacy levels of trainee teachers, by investigating their ICT proficiency using a task-based assessment instrument. The Delphi technique was used as a primary validation method for the new assessment tool and the ICT literacy…
Applying Formal Verification Techniques to Ambient Assisted Living Systems
NASA Astrophysics Data System (ADS)
Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel
This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and to assure the correct functioning of Ambient Assisted Living (AAL) systems. We validate this approach by applying it to an Emergency Assistance System for monitoring people suffering from cardiac alterations with syncope.
Validation of oxygen extraction fraction measurement by qBOLD technique.
He, Xiang; Zhu, Mingming; Yablonskiy, Dmitriy A
2008-10-01
Measurement of brain tissue oxygen extraction fraction (OEF) in both baseline and functionally activated states can provide important information on brain functioning in health and disease. The recently proposed quantitative BOLD (qBOLD) technique is MRI-based and provides a regional in vivo OEF measurement (He and Yablonskiy, MRM 2007, 57:115-126). It is based on a previously developed analytical BOLD model and incorporates prior knowledge about the brain tissue composition, including the contributions from grey matter, white matter, cerebrospinal fluid, interstitial fluid and intravascular blood. The qBOLD model also allows for the separation of contributions to the BOLD signal from OEF and the deoxyhemoglobin-containing blood volume (DBV). The objective of this study is to validate OEF measurements provided by the qBOLD approach. To this end we use a rat model and compare qBOLD OEF measurements against direct measurements of the blood oxygenation level obtained from venous blood drawn directly from the superior sagittal sinus. The cerebral venous oxygenation level of the rat was manipulated by utilizing different anesthesia methods. The study demonstrates a very good agreement between the qBOLD approach and the direct measurements. (c) 2008 Wiley-Liss, Inc.
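A hedged sketch of the static-dephasing relation at the heart of qBOLD: R2' = DBV · δω, with δω proportional to OEF, so OEF can be read off once R2' and DBV are estimated from the signal model. The constants below (field strength, susceptibility difference, hematocrit) are illustrative assumptions, not values fitted in the paper.

```python
# Static dephasing regime: R2' = DBV * dw, dw = (4/3)*pi*gamma*B0*Dchi0*Hct*OEF.
# All numeric values are illustrative assumptions.
import numpy as np

gamma = 2.675e8     # proton gyromagnetic ratio, rad/s/T
B0 = 4.7            # field strength, T (typical small-animal scanner; assumption)
dchi0 = 0.264e-6    # susceptibility difference of fully deoxygenated blood (SI; assumption)
hct = 0.40          # hematocrit (assumption)

r2p = 7.0           # reversible relaxation rate R2', 1/s (illustrative)
dbv = 0.03          # deoxygenated blood volume fraction (illustrative)

dw_per_oef = (4.0 / 3.0) * np.pi * gamma * B0 * dchi0 * hct
oef = r2p / (dbv * dw_per_oef)
print(f"OEF ~ {oef:.2f}")    # ~0.4 with these numbers, a physiological value
```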
NASA Astrophysics Data System (ADS)
Maierhofer, Christiane; Röllig, Mathias; Gower, Michael; Lodeiro, Maria; Baker, Graham; Monte, Christian; Adibekyan, Albert; Gutschwager, Berndt; Knazowicka, Lenka; Blahut, Ales
2018-05-01
To assure the safety and reliability of components and constructions made of fiber-reinforced polymers in energy applications (e.g., blades of wind turbines and tidal power plants, engine chassis, flexible oil and gas pipelines), innovative non-destructive testing methods are required. Within the European project VITCEA, complementary methods (shearography, microwave, ultrasonics and thermography) have been further developed and validated. Together with industry partners, test specimens containing different artificial and natural defect artefacts have been constructed and selected on-site. As base materials, carbon and glass fibers in different orientations and layering, embedded in different matrix materials (epoxy, polyamide), have been considered. In this contribution, the validation of flash and lock-in thermography for these testing problems is presented. Data analysis is based on thermal contrasts and phase evaluation techniques. Experimental data are compared to analytical and numerical models. Among others, the influence of two different types of artificial defects (flat bottom holes and delaminations) with varying diameters and depths and of two different materials (CFRP and GFRP) with unidirectional and quasi-isotropic fiber alignment is discussed.
Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y
2003-01-01
Assessing the functioning and the performance of urban drainage systems on both rainfall-event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques in order to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, and iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
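The core of step iii) is the law of propagation of uncertainties; a minimal sketch for an event pollutant load, with illustrative numbers in the range quoted above:

```python
# Propagation of uncertainty for an event load L = V * C (volume times mean TSS
# concentration), assuming independent errors. Numbers are illustrative only.
import numpy as np

V, u_V = 1200.0, 0.08 * 1200.0      # event volume (m3) and standard uncertainty (~8%)
C, u_C = 250.0, 0.30 * 250.0        # TSS concentration (g/m3) and uncertainty (~30%)

L = V * C
u_L = L * np.sqrt((u_V / V) ** 2 + (u_C / C) ** 2)   # relative errors add in quadrature
print(f"L = {L:.0f} g +/- {u_L:.0f} g ({100 * u_L / L:.0f}%)")
```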
Clustering Molecular Dynamics Trajectories for Optimizing Docking Experiments
De Paris, Renata; Quevedo, Christian V.; Ruiz, Duncan D.; Norberto de Souza, Osmar; Barros, Rodrigo C.
2015-01-01
Molecular dynamics simulations of protein receptors have become an attractive tool for rational drug discovery. However, the high computational cost of employing molecular dynamics trajectories in virtual screening of large repositories threatens the feasibility of this task. Computational intelligence techniques have been applied in this context, with the ultimate goal of reducing the overall computational cost so the task can become feasible. Particularly, clustering algorithms have been widely used as a means to reduce the dimensionality of molecular dynamics trajectories. In this paper, we develop a novel methodology for clustering entire trajectories using structural features from the substrate-binding cavity of the receptor in order to optimize docking experiments on a cloud-based environment. The resulting partition was selected based on three clustering validity criteria, and it was further validated by analyzing the interactions between 20 ligands and a fully flexible receptor (FFR) model containing a 20 ns molecular dynamics simulation trajectory. Our proposed methodology shows that taking into account features of the substrate-binding cavity as input for the k-means algorithm is a promising technique for accurately selecting ensembles of representative structures tailored to a specific ligand. PMID:25873944
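A hedged sketch of the general idea: cluster MD snapshots by binding-cavity features with k-means and pick the number of clusters by a validity criterion (silhouette here; the paper combines three criteria). The feature matrix is random stand-in data.

```python
# k-means over snapshot feature vectors, with silhouette-based model selection.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 12))    # 2000 snapshots x 12 cavity descriptors (fake)

best = max(
    (silhouette_score(X, KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)), k)
    for k in range(2, 8)
)
print("best k by silhouette:", best[1])
# one representative frame per cluster (e.g. closest to the centroid) would then
# form the reduced ensemble actually used for docking
```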
Nonlinear ultrasonic fatigue crack detection using a single piezoelectric transducer
NASA Astrophysics Data System (ADS)
An, Yun-Kyu; Lee, Dong Jun
2016-04-01
This paper proposes a new nonlinear ultrasonic technique for fatigue crack detection using a single piezoelectric transducer (PZT). The proposed technique identifies a fatigue crack using linear (α) and nonlinear (β) parameters obtained from only a single PZT mounted on a target structure. Based on the different physical characteristics of α and β, a fatigue crack-induced feature can be effectively isolated from the inherent nonlinearity of the target structure and data acquisition system. The proposed technique requires a much simpler test setup and lower processing costs than existing nonlinear ultrasonic techniques, yet remains fast and powerful. To validate the proposed technique, a real fatigue crack is created in an aluminum plate, and then false positive and false negative tests are carried out under varying temperature conditions. The experimental results reveal that the fatigue crack is successfully detected, and no false-positive alarm is indicated.
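For orientation, the relative nonlinear parameter is commonly taken as proportional to A2 / A1², with A1 and A2 the spectral amplitudes at the drive frequency and its second harmonic; the sketch below computes it on a synthetic signal with an assumed weak quadratic distortion mimicking a breathing crack, not the paper's measured data.

```python
# Relative nonlinear parameter beta ~ A2 / A1**2 from an FFT of the response.
import numpy as np

fs, f0, n = 1.0e6, 50.0e3, 4096          # sampling rate, drive frequency, samples
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)
y = x + 0.02 * x**2                      # assumed weak quadratic nonlinearity

spec = np.abs(np.fft.rfft(y * np.hanning(n)))    # Hann window limits leakage
freqs = np.fft.rfftfreq(n, 1 / fs)
a1 = spec[np.argmin(np.abs(freqs - f0))]         # fundamental amplitude
a2 = spec[np.argmin(np.abs(freqs - 2 * f0))]     # second-harmonic amplitude
print(f"relative beta ~ {a2 / a1**2:.2e}")       # tracked against the linear parameter
```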
Cicchetti, Esmeralda; Chaintreau, Alain
2009-06-01
Accelerated solvent extraction (ASE) of vanilla beans has been optimized using ethanol as a solvent. A theoretical model is proposed to account for this multistep extraction. This allows the determination, for the first time, of the total amount of analytes initially present in the beans and thus the calculation of recoveries using ASE or any other extraction technique. As a result, ASE and Soxhlet extractions have been determined to be efficient methods, whereas recoveries are modest for maceration techniques and depend on the solvent used. Because industrial extracts are obtained by many different procedures, including maceration in various solvents, authenticating vanilla extracts using quantitative ratios between the amounts of vanilla flavor constituents appears to be unreliable. When authentication techniques based on isotopic ratios are used, ASE is a valid sample preparation technique because it does not induce isotopic fractionation.
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to report the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of Expert Systems.
Improved modeling of GaN HEMTs for predicting thermal and trapping-induced-kink effects
NASA Astrophysics Data System (ADS)
Jarndal, Anwar; Ghannouchi, Fadhel M.
2016-09-01
In this paper, an improved modeling approach has been developed and validated for GaN high electron mobility transistors (HEMTs). The proposed analytical model accurately simulates the drain current and its inherent trapping and thermal effects. A genetic-algorithm-based procedure is developed to automatically find the fitting parameters of the model. The developed modeling technique is implemented on a packaged GaN-on-Si HEMT and validated by DC and small-/large-signal RF measurements. The model is also employed for designing and realizing a switch-mode inverse class-F power amplifier. The amplifier simulations show very good agreement with RF large-signal measurements.
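The extraction step can be sketched generically: a simple Angelov-type I-V expression stands in for the paper's drain-current model, and SciPy's differential evolution stands in for their genetic algorithm; the "measured" data are synthesized from known parameters plus noise.

```python
# Evolutionary extraction of I-V model parameters (illustrative stand-in).
import numpy as np
from scipy.optimize import differential_evolution

def ids(p, vgs, vds):
    # Angelov-like drain current; parameters and the Vpk offset are assumptions
    ipk, psi1, alpha, lam = p
    return ipk * (1 + np.tanh(psi1 * (vgs + 1.5))) * np.tanh(alpha * vds) * (1 + lam * vds)

rng = np.random.default_rng(2)
vgs, vds = np.meshgrid(np.linspace(-3, 0, 13), np.linspace(0, 20, 21))
true_p = (0.4, 1.2, 0.8, 0.01)
meas = ids(true_p, vgs, vds) + rng.normal(0, 2e-3, vgs.shape)   # fake measurements

cost = lambda p: np.sum((ids(p, vgs, vds) - meas) ** 2)
res = differential_evolution(cost, bounds=[(0.05, 1), (0.1, 5), (0.1, 5), (0, 0.1)], seed=0)
print(res.x)    # extracted (ipk, psi1, alpha, lam), close to true_p
```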
Validation of SIV measurements of turbulent characteristics in the separation region
NASA Astrophysics Data System (ADS)
Dushin, N. S.; Mikheev, N. I.; Dushina, O. A.; Zaripov, D. I.; Aslaev, A. K.
2017-11-01
Temporally and spatially resolved 2D measurements are important for the studies of complex turbulent flows. The recently developed SIV technique (Smoke Image Velocimetry), which is superior to PIV in some cases, can be used for this purpose. SIV validation results are presented for the steady turbulent backward-facing step flow measurements. Velocity profiles and Reynolds stress profiles are given for the regions of oncoming flow, reverse flow, flow reattachment and relaxation. The Reynolds number based on the step height and oncoming flow velocity at the boundary layer edge was Reh = 4834. The obtained data have been compared to LDA measurements and DNS.
Theoretical modelling of AFM for bimetallic tip-substrate interactions
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Ferrante, John
1991-01-01
Recently, a new technique for calculating the defect energetics of alloys based on Equivalent Crystal Theory was developed. This new technique successfully predicts the bulk properties for binary alloys as well as segregation energies in the dilute limit. The authors apply this technique, in the dilute limit, to the calculation of the energy and force as a function of the separation between an atomic force microscope (AFM) tip and a substrate. The study was done for different combinations of tip and sample materials. The validity of the universality discovered for same-metal interfaces is examined for the case of different-metal interactions.
Comparing interpolation techniques for annual temperature mapping across Xinjiang region
NASA Astrophysics Data System (ADS)
Ren-ping, Zhang; Jing, Guo; Tian-gang, Liang; Qi-sheng, Feng; Aimaiti, Yusupujiang
2016-11-01
Interpolating climatic variables such as temperature is challenging due to the highly variable nature of meteorological processes and the difficulty in establishing a representative network of stations. In this paper, based on the monthly temperature data obtained from 154 official meteorological stations in the Xinjiang region and surrounding areas, we compared five spatial interpolation techniques: inverse distance weighting (IDW), ordinary kriging, cokriging, thin-plate smoothing splines (ANUSPLIN) and empirical Bayesian kriging (EBK). Error metrics were used to validate interpolations against independent data. Results indicated that ANUSPLIN performed better than the other four interpolation methods.
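Of the five methods, IDW is the simplest to sketch, together with the leave-one-out style of error check used to rank methods; station coordinates and temperatures below are random stand-ins, not the Xinjiang data.

```python
# Inverse-distance-weighting interpolation with leave-one-out validation.
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    d = np.linalg.norm(xy_known[None, :, :] - xy_query[:, None, :], axis=2)
    d = np.maximum(d, 1e-12)                 # avoid division by zero at a station
    w = 1.0 / d**power
    return (w @ z_known) / w.sum(axis=1)

rng = np.random.default_rng(3)
xy = rng.uniform(0, 100, size=(154, 2))             # 154 fake "stations"
z = 20 - 0.1 * xy[:, 1] + rng.normal(0, 0.5, 154)   # fake temperature field

errors = [z[i] - idw(np.delete(xy, i, 0), np.delete(z, i), xy[i:i + 1])[0]
          for i in range(len(z))]
print(f"LOO RMSE = {np.sqrt(np.mean(np.square(errors))):.2f} degC")
```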
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.
Photoacoustic imaging of fluorophores using pump-probe excitation
Märk, Julia; Schmitt, Franz-Josef; Theiss, Christoph; Dortay, Hakan; Friedrich, Thomas; Laufer, Jan
2015-01-01
A pump-probe technique for the detection of fluorophores in tomographic PA images is introduced. It is based on inducing stimulated emission in fluorescent molecules, which in turn modulates the amount of thermalized energy, and hence the PA signal amplitude. A theoretical model of the PA signal generation in fluorophores is presented and experimentally validated on cuvette measurements made in solutions of Rhodamine 6G, a fluorophore of known optical and molecular properties. The application of this technique to deep tissue tomographic PA imaging is demonstrated by determining the spatial distribution of a near-infrared fluorophore in a tissue phantom. PMID:26203378
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.
Validating an Air Traffic Management Concept of Operation Using Statistical Modeling
NASA Technical Reports Server (NTRS)
He, Yuning; Davies, Misty Dawn
2013-01-01
Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical value of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and ultimately a reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis.
SSME fault monitoring and diagnosis expert system
NASA Technical Reports Server (NTRS)
Ali, Moonis; Norman, Arnold M.; Gupta, U. K.
1989-01-01
An expert system, called LEADER, has been designed and implemented for automatic learning, detection, identification, verification, and correction of anomalous propulsion system operations in real time. LEADER employs a set of sensors to monitor engine component performance and to detect, identify, and validate abnormalities with respect to varying engine dynamics and behavior. Two diagnostic approaches are adopted in the architecture of LEADER. In the first approach fault diagnosis is performed through learning and identifying engine behavior patterns. LEADER, utilizing this approach, generates few hypotheses about the possible abnormalities. These hypotheses are then validated based on the SSME design and functional knowledge. The second approach directs the processing of engine sensory data and performs reasoning based on the SSME design, functional knowledge, and the deep-level knowledge, i.e., the first principles (physics and mechanics) of SSME subsystems and components. This paper describes LEADER's architecture which integrates a design based reasoning approach with neural network-based fault pattern matching techniques. The fault diagnosis results obtained through the analyses of SSME ground test data are presented and discussed.
NASA Astrophysics Data System (ADS)
Miner, Nadine Elizabeth
1998-09-01
This dissertation presents a new wavelet-based method for synthesizing perceptually convincing, dynamic sounds using parameterized sound models. The sound synthesis method is applicable to a variety of applications including Virtual Reality (VR), multi-media, entertainment, and the World Wide Web (WWW). A unique contribution of this research is the modeling of the stochastic, or non-pitched, sound components. This stochastic-based modeling approach leads to perceptually compelling sound synthesis. Two preliminary studies conducted provide data on multi-sensory interaction and audio-visual synchronization timing. These results contributed to the design of the new sound synthesis method. The method uses a four-phase development process, including analysis, parameterization, synthesis and validation, to create the wavelet-based sound models. A patent is pending for this dynamic sound synthesis method, which provides perceptually-realistic, real-time sound generation. This dissertation also presents a battery of perceptual experiments developed to verify the sound synthesis results. These experiments are applicable for validation of any sound synthesis technique.
Voxel based morphometry in optical coherence tomography: validation and core findings
NASA Astrophysics Data System (ADS)
Antony, Bhavna J.; Chen, Min; Carass, Aaron; Jedynak, Bruno M.; Al-Louzi, Omar; Solomon, Sharon D.; Saidha, Shiv; Calabresi, Peter A.; Prince, Jerry L.
2016-03-01
Optical coherence tomography (OCT) of the human retina is now becoming established as an important modality for the detection and tracking of various ocular diseases. Voxel-based morphometry (VBM) is a long-standing neuroimaging analysis technique that allows for the exploration of regional differences in the brain. There has been limited work done in developing registration-based methods for OCT, which has hampered the advancement of VBM analyses in OCT-based population studies. Following on from our recent development of an OCT registration method, we explore the potential benefits of VBM analysis in cohorts of healthy controls (HCs) and multiple sclerosis (MS) patients. Specifically, we validate the stability of VBM analysis in two pools of HCs, showing no significant difference between the two populations. Additionally, we also present a retrospective study of age- and sex-matched HCs and relapsing-remitting MS patients, demonstrating results consistent with the reported literature while providing insight into the retinal changes associated with this MS subtype.
Okamoto, Takuma; Sakaguchi, Atsushi
2017-03-01
Generating acoustically bright and dark zones using loudspeakers is gaining attention as one of the most important acoustic communication techniques for such uses as personal sound systems and multilingual guide services. Although most conventional methods are based on numerical solutions, an analytical approach based on the spatial Fourier transform with a linear loudspeaker array has been proposed, and its effectiveness over conventional acoustic energy difference maximization has been demonstrated in computer simulations. To establish the effectiveness of the proposal in actual environments, this paper investigates the experimental validation of the proposed approach with rectangular and Hann windows and compares it with three conventional methods: simple delay-and-sum beamforming, contrast maximization, and least squares-based pressure matching, using an actually implemented linear array of 64 loudspeakers in an anechoic chamber. The results of both the computer simulations and the actual experiments show that the proposed approach with a Hann window controls the bright and dark zones more accurately than the conventional methods.
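Of the methods compared, least squares-based pressure matching is the simplest to sketch: given transfer matrices from the loudspeakers to control points in both zones, solve a regularized least-squares problem for driving weights that approximate a target pressure in the bright zone and zero in the dark zone. Transfer functions and the regularization value below are stand-in assumptions.

```python
# Tikhonov-regularized pressure matching for bright/dark zone control.
import numpy as np

rng = np.random.default_rng(4)
N, Mb, Md = 64, 30, 30                   # loudspeakers, bright and dark control points
Gb = rng.normal(size=(Mb, N)) + 1j * rng.normal(size=(Mb, N))   # stand-in transfer matrix
Gd = rng.normal(size=(Md, N)) + 1j * rng.normal(size=(Md, N))

G = np.vstack([Gb, Gd])
p = np.concatenate([np.ones(Mb), np.zeros(Md)])   # desired pressures in both zones
lam = 1e-2                                        # regularization weight (assumed)
w = np.linalg.solve(G.conj().T @ G + lam * np.eye(N), G.conj().T @ p)

contrast = 10 * np.log10(np.mean(np.abs(Gb @ w)**2) / np.mean(np.abs(Gd @ w)**2))
print(f"acoustic contrast ~ {contrast:.1f} dB")
```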
NASA Astrophysics Data System (ADS)
Nouizi, F.; Erkol, H.; Luk, A.; Marks, M.; Unlu, M. B.; Gulsen, G.
2016-10-01
We previously introduced photo-magnetic imaging (PMI), an imaging technique that illuminates the medium under investigation with near-infrared light and measures the induced temperature increase using magnetic resonance thermometry (MRT). Using a multiphysics solver combining photon migration and heat diffusion, PMI models the spatiotemporal distribution of temperature variation and recovers high resolution optical absorption images using these temperature maps. In this paper, we present a new fast non-iterative reconstruction algorithm for PMI. This new algorithm uses analytic methods during the resolution of the forward problem and the assembly of the sensitivity matrix. We validate our new analytic-based algorithm against the first-generation finite element method (FEM) based reconstruction algorithm previously developed by our team. The validation is performed using first synthetic data and afterwards real MRT-measured temperature maps. Our new method accelerates the reconstruction process 30-fold when compared to a single iteration of the FEM-based algorithm.
Keresztury, László; Rajczy, Katalin; Lászik, András; Gyódi, Eva; Pénzes, Mária; Falus, András; Petrányi, Győző G
2002-03-01
In cases of disputed paternity, the scientific goal is to promote either the exclusion of a falsely accused man or the affiliation of the alleged father. Until now, in addition to anthropologic characteristics, the determination of genetic markers included human leukocyte antigen gene variants, erythrocyte antigens and serum proteins. Recombinant DNA techniques provided a new set of highly variable genetic markers based on DNA nucleotide sequence polymorphism. From the practical standpoint, the application of these techniques to paternity testing provides greater versatility than do conventional genetic marker systems. The use of methods to detect the polymorphism of human leukocyte antigen loci significantly increases the chance of validation of ambiguous results in paternity testing. The outcome of 2384 paternity cases investigated by serologic and/or DNA-based human leukocyte antigen typing was statistically analyzed. Different cases solved by DNA typing are presented, involving cases with one or two accused men, exclusions and nonexclusions, and tests of the paternity of a deceased man. The results provide evidence for the advantage of the combined application of various techniques in forensic diagnostics and emphasize the outstanding possibilities of DNA-based assays. Representative examples demonstrate the strength of combined techniques in paternity testing.
NASA Astrophysics Data System (ADS)
Zhong, XiaoXu; Liao, ShiJun
2018-01-01
Analytic approximations of von Kármán's plate equations in integral form for a circular plate under external uniform pressure to arbitrary magnitude are successfully obtained by means of the homotopy analysis method (HAM), an analytic approximation technique for highly nonlinear problems. Two HAM-based approaches are proposed for either a given external uniform pressure Q or a given central deflection, respectively. Both of them are valid for uniform pressure to arbitrary magnitude by choosing proper values of the so-called convergence-control parameters c1 and c2 in the frame of the HAM. Besides, it is found that the HAM-based iteration approaches generally converge much faster than the interpolation iterative method. Furthermore, we prove that the interpolation iterative method is a special case of the first-order HAM iteration approach for a given external uniform pressure Q when c1 = -θ and c2 = -1, where θ denotes the interpolation iterative parameter. Therefore, according to the convergence theorem of Zheng and Zhou about the interpolation iterative method, the HAM-based approaches are valid for uniform pressure to arbitrary magnitude at least in the special case c1 = -θ and c2 = -1. In addition, we prove that the HAM approach for von Kármán's plate equations in differential form is just a special case of the HAM for von Kármán's plate equations in integral form mentioned in this paper. All of these illustrate the validity and great potential of the HAM for highly nonlinear problems, and its superiority over perturbation techniques.
Rakotonarivo, O Sarobidy; Schaafsma, Marije; Hockley, Neal
2016-12-01
While discrete choice experiments (DCEs) are increasingly used in the field of environmental valuation, they remain controversial because of their hypothetical nature and the contested reliability and validity of their results. We systematically reviewed evidence on the validity and reliability of environmental DCEs from the past thirteen years (Jan 2003-February 2016). 107 articles met our inclusion criteria. These studies provide limited and mixed evidence of the reliability and validity of DCE. Valuation results were susceptible to small changes in survey design in 45% of outcomes reporting reliability measures. DCE results were generally consistent with those of other stated preference techniques (convergent validity), but hypothetical bias was common. Evidence supporting theoretical validity (consistency with assumptions of rational choice theory) was limited. In content validity tests, 2-90% of respondents protested against a feature of the survey, and a considerable proportion found DCEs to be incomprehensible or inconsequential (17-40% and 10-62% respectively). DCE remains useful for non-market valuation, but its results should be used with caution. Given the sparse and inconclusive evidence base, we recommend that tests of reliability and validity are more routinely integrated into DCE studies and suggest how this might be achieved. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
van der Ploeg, Tjeerd; Austin, Peter C; Steyerberg, Ewout W
2014-12-22
Modern modelling techniques may potentially provide more accurate predictions of binary outcomes than classical techniques. We aimed to study the predictive performance of different modelling techniques in relation to the effective sample size ("data hungriness"). We performed simulation studies based on three clinical cohorts: 1282 patients with head and neck cancer (with 46.9% 5 year survival), 1731 patients with traumatic brain injury (22.3% 6 month mortality) and 3181 patients with minor head injury (7.6% with CT scan abnormalities). We compared three relatively modern modelling techniques: support vector machines (SVM), neural nets (NN), and random forests (RF), and two classical techniques: logistic regression (LR) and classification and regression trees (CART). We created three large artificial databases with 20-fold, 10-fold and 6-fold replication of subjects, in which we generated dichotomous outcomes according to different underlying models. We applied each modelling technique to increasingly larger development parts (100 repetitions). The area under the ROC curve (AUC) indicated the performance of each model in the development part and in an independent validation part. Data hungriness was defined by plateauing of the AUC and small optimism (difference between the mean apparent AUC and the mean validated AUC <0.01). We found that a stable AUC was reached by LR at approximately 20 to 50 events per variable, followed by CART, SVM, NN and RF models. Optimism decreased with increasing sample sizes and the same ranking of techniques. The RF, SVM and NN models showed instability and a high optimism even with >200 events per variable. Modern modelling techniques such as SVM, NN and RF may need over 10 times as many events per variable to achieve a stable AUC and a small optimism as classical modelling techniques such as LR. This implies that such modern techniques should only be used in medical prediction problems if very large data sets are available.
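The optimism measurement can be re-created compactly on synthetic data: optimism is the apparent AUC minus the validated AUC, tracked as the training set grows. The data-generating model and cohort sizes below are invented for illustration.

```python
# Optimism (apparent AUC - validated AUC) versus training size, LR vs RF.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=20000, n_features=10, random_state=0)
X_val, y_val = X[10000:], y[10000:]          # held-out validation part

for n in (200, 1000, 5000):                  # events per variable grows with n
    for name, model in (("LR", LogisticRegression(max_iter=1000)),
                        ("RF", RandomForestClassifier(n_estimators=200, random_state=0))):
        model.fit(X[:n], y[:n])
        apparent = roc_auc_score(y[:n], model.predict_proba(X[:n])[:, 1])
        validated = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
        print(f"n={n:5d} {name}: optimism = {apparent - validated:+.3f}")
```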
Li, Zhao-Liang
2018-01-01
Few studies have examined hyperspectral remote-sensing image classification with type-II fuzzy sets. This paper addresses image classification based on a hyperspectral remote-sensing technique using an improved interval type-II fuzzy c-means (IT2FCM*) approach. In this study, in contrast to other traditional fuzzy c-means-based approaches, the IT2FCM* algorithm considers the ranking of interval numbers and the spectral uncertainty. The classification results on a hyperspectral dataset using the FCM, IT2FCM, and the proposed improved IT2FCM* algorithms show that the IT2FCM* method gives the best performance according to the clustering accuracy. In this paper, in order to validate and demonstrate the separability of the IT2FCM*, four type-I fuzzy validity indexes are employed, and a comparative analysis of these fuzzy validity indexes as applied to the FCM and IT2FCM methods is made. These four indexes are also applied to datasets of different spatial and spectral resolution to analyze the effects of spectral and spatial scaling factors on the separability of the FCM, IT2FCM, and IT2FCM* methods. The results of these validity indexes on the hyperspectral datasets show that the improved IT2FCM* algorithm has the best values among the three algorithms in general. The results demonstrate that the IT2FCM* exhibits good performance in hyperspectral remote-sensing image classification because of its ability to handle hyperspectral uncertainty. PMID:29373548
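For readers unfamiliar with the baseline, a minimal type-I fuzzy c-means is sketched below; the interval type-II variants extend this by carrying a pair of fuzzifiers and interval-valued memberships. The toy data are three Gaussian blobs, not hyperspectral pixels.

```python
# Minimal type-I fuzzy c-means (the FCM baseline referenced above).
import numpy as np

def fcm(X, c=3, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))        # memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
    return centers, U

X = np.vstack([np.random.default_rng(i).normal(i * 4, 1, size=(100, 2)) for i in range(3)])
centers, U = fcm(X)
print(np.round(centers, 2))     # should recover blobs near 0, 4, and 8
```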
Fast and Accurate Simulation Technique for Large Irregular Arrays
NASA Astrophysics Data System (ADS)
Bui-Van, Ha; Abraham, Jens; Arts, Michel; Gueuning, Quentin; Raucy, Christopher; Gonzalez-Ovejero, David; de Lera Acedo, Eloy; Craeye, Christophe
2018-04-01
A fast full-wave simulation technique is presented for the analysis of large irregular planar arrays of identical 3-D metallic antennas. The solution method relies on the Macro Basis Functions (MBF) approach and an interpolatory technique to compute the interactions between MBFs. The Harmonic-polynomial (HARP) model is established for the near-field interactions in a modified system of coordinates. For extremely large arrays made of complex antennas, two approaches assuming a limited radius of influence for mutual coupling are considered: one is based on a sparse-matrix LU decomposition and the other one on a tessellation of the array in the form of overlapping sub-arrays. The computation of all embedded element patterns is sped up with the help of the non-uniform FFT algorithm. Extensive validations are shown for arrays of log-periodic antennas envisaged for the low-frequency SKA (Square Kilometer Array) radio-telescope. The analysis of SKA stations with such a large number of elements has not been treated yet in the literature. Validations include comparison with results obtained with commercial software and with experiments. The proposed method is particularly well suited to array synthesis, in which several orders of magnitude can be saved in terms of computation time.
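For scale, the quantity the non-uniform FFT accelerates is essentially a sum of complex exponentials over irregular element positions; the O(N·M) brute-force baseline looks like the sketch below. Isotropic elements and no mutual coupling are assumed here, unlike the MBF analysis above, and the element layout is a random stand-in.

```python
# Brute-force far-field array factor of an irregular planar array (phi = 0 cut).
import numpy as np

rng = np.random.default_rng(5)
xy = rng.uniform(-10, 10, size=(256, 2))   # element positions, in wavelengths
u = np.linspace(-1, 1, 181)                # direction cosines along x

# AF(u) = sum_n exp(j * 2*pi * x_n * u); direct summation over all elements
af = np.exp(2j * np.pi * xy[:, 0][None, :] * u[:, None]).sum(axis=1)
print(f"|AF| at broadside = {np.abs(af[90]):.0f} (= number of elements)")
```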
Lauerman, Lloyd H
2004-12-01
Since the discovery of the polymerase chain reaction (PCR) 20 years ago, an avalanche of scientific publications has reported major developments and changes in specialized equipment, reagents, sample preparation, computer programs and techniques, generated through business, government and university research. The requirement for genetic sequences for primer selection and validation has been greatly facilitated by the development of new sequencing techniques, machines and computer programs. Genetic libraries, such as GenBank, EMBL and DDBJ, continue to accumulate a wealth of genetic sequence information for the development and validation of molecular-based diagnostic procedures concerning human and veterinary disease agents. The mechanization of various aspects of the PCR assay, such as robotics, microfluidics and nanotechnology, has made possible the rapid development of new procedures. Real-time PCR, DNA microarrays and DNA chips utilize these newer techniques in conjunction with computers and computer programs. Instruments for hand-held PCR assays are being developed. The PCR and reverse transcription-PCR (RT-PCR) assays have greatly accelerated the speed and accuracy of diagnoses of human and animal disease, especially of the infectious agents that are difficult to isolate or demonstrate.
Weinstock, Peter; Rehder, Roberta; Prabhu, Sanjay P; Forbes, Peter W; Roussin, Christopher J; Cohen, Alan R
2017-07-01
OBJECTIVE Recent advances in optics and miniaturization have enabled the development of a growing number of minimally invasive procedures, yet innovative training methods for the use of these techniques remain lacking. Conventional teaching models, including cadavers and physical trainers as well as virtual reality platforms, are often expensive and ineffective. Newly developed 3D printing technologies can recreate patient-specific anatomy, but the stiffness of the materials limits fidelity to real-life surgical situations. Hollywood special effects techniques can create ultrarealistic features, including lifelike tactile properties, to enhance accuracy and effectiveness of the surgical models. The authors created a highly realistic model of a pediatric patient with hydrocephalus via a unique combination of 3D printing and special effects techniques and validated the use of this model in training neurosurgery fellows and residents to perform endoscopic third ventriculostomy (ETV), an effective minimally invasive method increasingly used in treating hydrocephalus. METHODS A full-scale reproduction of the head of a 14-year-old adolescent patient with hydrocephalus, including external physical details and internal neuroanatomy, was developed via a unique collaboration of neurosurgeons, simulation engineers, and a group of special effects experts. The model contains "plug-and-play" replaceable components for repetitive practice. The appearance of the training model (face validity) and the reproducibility of the ETV training procedure (content validity) were assessed by neurosurgery fellows and residents of different experience levels based on a 14-item Likert-like questionnaire. The usefulness of the training model for evaluating the performance of the trainees at different levels of experience (construct validity) was measured by blinded observers using the Objective Structured Assessment of Technical Skills (OSATS) scale for the performance of ETV. RESULTS A combination of 3D printing technology and casting processes led to the creation of realistic surgical models that include high-fidelity reproductions of the anatomical features of hydrocephalus and allow for the performance of ETV for training purposes. The models reproduced the pulsations of the basilar artery, ventricles, and cerebrospinal fluid (CSF), thus simulating the experience of performing ETV on an actual patient. The results of the 14-item questionnaire showed limited variability among participants' scores, and the neurosurgery fellows and residents gave the models consistently high ratings for face and content validity. The mean score for the content validity questions (4.88) was higher than the mean score for face validity (4.69) (p = 0.03). On construct validity scores, the blinded observers rated performance of fellows significantly higher than that of residents, indicating that the model provided a means to distinguish between novice and expert surgical skills. CONCLUSIONS A plug-and-play lifelike ETV training model was developed through a combination of 3D printing and special effects techniques, providing both anatomical and haptic accuracy. Such simulators offer opportunities to accelerate the development of expertise with respect to new and novel procedures as well as iterate new surgical approaches and innovations, thus allowing novice neurosurgeons to gain valuable experience in surgical techniques without exposing patients to risk of harm.
Validation d'un nouveau calcul de référence en évolution pour les réacteurs thermiques [Validation of a new reference depletion calculation for thermal reactors]
NASA Astrophysics Data System (ADS)
Canbakan, Axel
Resonance self-shielding calculations are an essential component of a deterministic lattice code calculation. Even if their aim is to correct the cross-section deviation, they introduce a non-negligible error in evaluated parameters such as the flux. Until now, French studies for light water reactors have been based on effective reaction rates obtained using an equivalence-in-dilution technique. With the increase of computing capacities, this method starts to show its limits in precision and can be replaced by a subgroup method. Originally used for fast neutron reactor calculations, the subgroup method has many advantages, such as using an exact slowing-down equation. The aim of this thesis is to provide a validation as precise as possible, first without burnup and then with an isotopic depletion study, for the subgroup method. In the end, users interested in implementing a subgroup method in their scheme for Pressurized Water Reactors can rely on this thesis to justify their modelling choices. Moreover, other parameters are validated to suggest a new reference scheme with fast execution and precise results. These new techniques are implemented in the French lattice scheme SHEM-MOC, composed of a Method Of Characteristics flux calculation and a SHEM-like 281-energy-group mesh. First, the libraries processed by the CEA are compared. Then, this thesis suggests the most suitable energy discretization for a subgroup method. Finally, other techniques, such as the representation of the anisotropy of the scattering sources and the spatial representation of the source in the MOC calculation, are studied. A DRAGON5 scheme is also validated, as it offers interesting elements: the DRAGON5 subgroup method is run with a 295-energy-group mesh (compared to 361 groups for APOLLO2). There are two reasons to use this code. The first involves offering a new reference lattice scheme for Pressurized Water Reactors to DRAGON5 users. The second is to study parameters that are not available in APOLLO2, such as self-shielding in a temperature gradient, and using a flux calculation based on MOC in the self-shielding part of the simulation. This thesis concludes that: (1) The subgroup method is at least as precise as a technique based on effective reaction rates, but only if a 361-energy-group mesh is used; (2) MOC with a linear source in a geometrical region gives better results than MOC with a constant source model, and a moderator discretization is compulsory; (3) A P3 collision (choc) law is satisfactory, ensuring coherence with 2D full-core calculations; (4) SHEM295 is viable with a Subgroup Projection Method for DRAGON5.
Neufeld, E; Chavannes, N; Samaras, T; Kuster, N
2007-08-07
The modeling of thermal effects, often based on the Pennes Bioheat Equation, is becoming increasingly popular. The FDTD technique commonly used in this context suffers considerably from staircasing errors at boundaries. A new conformal technique is proposed that can easily be integrated into existing implementations without requiring a special update scheme. It scales fluxes at interfaces with factors derived from the local surface normal. The new scheme is validated using an analytical solution, and an error analysis is performed to understand its behavior. The new scheme behaves considerably better than the standard scheme. Furthermore, in contrast to the standard scheme, it is possible to obtain with it more accurate solutions by increasing the grid resolution.
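For context, a minimal explicit finite-difference update of the 1-D Pennes bioheat equation is sketched below with typical textbook tissue constants (assumptions, not the paper's values); the paper's contribution is precisely a conformal correction that rescales the flux terms of such a scheme at tissue interfaces, which this plain version lacks.

```python
# Plain explicit FDTD-style update of the 1-D Pennes bioheat equation:
# rho*c*dT/dt = k*d2T/dx2 + rho_b*w_b*c_b*(Ta - T) + Q. Constants are typical
# soft-tissue textbook values (assumptions); blood density ~ tissue density.
import numpy as np

k, rho, c = 0.5, 1050.0, 3600.0     # W/m/K, kg/m3, J/kg/K
wb, cb, Ta = 8e-3, 3600.0, 37.0     # perfusion rate (1/s), blood heat capacity, arterial T
q = 5e4                             # deposited power density, W/m3 (e.g. SAR * rho)

dx, dt, nx = 1e-3, 0.05, 200        # dt well under the ~3.8 s stability limit
T = np.full(nx, 37.0)
for _ in range(2000):               # 100 s of heating
    lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
    T += dt / (rho * c) * (k * lap + rho * wb * cb * (Ta - T) + q)
    T[0] = T[-1] = 37.0             # fixed-temperature boundaries
print(f"peak temperature rise: {T.max() - 37.0:.2f} K")
```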
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravelo Arias, S. I.; Ramírez Muñoz, D.; Cardoso, S.
2015-06-15
This work presents a measurement technique to obtain the correct value of the four elements in a resistive Wheatstone bridge without the need to separate the physical connections existing between them. Two electronic solutions are presented, one based on a source-and-measure unit and one using discrete electronic components. The proposed technique brings the possibility of knowing the mismatch or the tolerance between the bridge resistive elements and then passing or rejecting the bridge in terms of its related common-mode rejection. Experimental results were taken on various Wheatstone resistive bridges (discrete and magnetoresistive integrated bridges), validating the proposed measurement technique, especially when the bridge is micro-fabricated and there is no physical way to separate one resistive element from the others.
AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*
Bruch, Elizabeth; Atwell, Jon
2014-01-01
Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351
Web-based Food Behaviour Questionnaire: validation with grades six to eight students.
Hanning, Rhona M; Royall, Dawna; Toews, Jenn E; Blashill, Lindsay; Wegener, Jessica; Driezen, Pete
2009-01-01
The web-based Food Behaviour Questionnaire (FBQ) includes a 24-hour diet recall, a food frequency questionnaire, and questions addressing knowledge, attitudes, intentions, and food-related behaviours. The survey has been revised since it was developed and initially validated. The current study was designed to obtain qualitative feedback and to validate the FBQ diet recall. "Think aloud" techniques were used in cognitive interviews with dietitian experts (n=11) and grade six students (n=21). Multi-ethnic students (n=201) in grades six to eight at urban southern Ontario schools completed the FBQ and, subsequently, one-on-one diet recall interviews with trained dietitians. Food group and nutrient intakes were compared. Users provided positive feedback on the FBQ. Suggestions included adding more foods, more photos for portion estimation, and online student feedback. Energy and nutrient intakes were positively correlated between FBQ and dietitian interviews, overall and by gender and grade (all p<0.001). Intraclass correlation coefficients were ≥0.5 for energy and macro-nutrients, although the web-based survey underestimated energy (10.5%) and carbohydrate (-15.6%) intakes (p<0.05). Under-estimation of rice and pasta portions on the web accounted for 50% of this discrepancy. The FBQ is valid, relative to 24-hour recall interviews, for dietary assessment in diverse populations of Ontario children in grades six to eight.
Berger, Steve; Hasler, Carol-Claudius; Grant, Caroline A; Zheng, Guoyan; Schumann, Steffen; Büchler, Philippe
2017-01-01
The aim of this study was to validate a new program for measuring the three-dimensional length of the spine's midline based on two calibrated orthogonal radiographic images. The traditional uniplanar T1-S1 measurement method does not reflect the actual three-dimensional curvature of a scoliotic spine and is therefore not accurate. The Spinal Measurement Software (SMS) is an alternative that conveniently measures the spine's true length. The validity, inter- and intra-observer variability and usability of the program were evaluated. The usability was quantified based on a subjective questionnaire filled in by eight participants using the program for the first time. The validity and variability were assessed by comparing the lengths of five phantom spines measured from CT-scan data and from radiographic images with the SMS. The lengths were measured independently by each participant using both techniques. The SMS is easy and intuitive to use, even for non-clinicians. The SMS measured spinal length with an error below 2 millimeters compared to lengths obtained using CT scan datasets. The inter- and intra-observer variability of the SMS measurements was below 5 millimeters. The SMS provides accurate measurement of the spinal length based on orthogonal radiographic images. The software is easy to use and could easily be integrated into the clinical workflow, replacing current approximations of the spinal length based on a single radiographic image such as the traditional T1-S1 measurement. Crown Copyright © 2016. Published by Elsevier Ireland Ltd. All rights reserved.
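The geometric core of such a measurement can be sketched simply: with calibrated orthogonal views, the frontal radiograph supplies (x, z) and the lateral one (y, z) for each vertebral landmark, so the 3D midline is a polyline whose segment lengths sum to the true spinal length. Landmark coordinates below are invented.

```python
# 3-D spinal length from biplanar landmarks versus the uniplanar approximation.
import numpy as np

z = np.linspace(0, 450, 18)               # vertebral levels, mm (fake landmarks)
x = 25 * np.sin(z / 450 * np.pi)          # fake frontal-plane scoliotic curve
y = 15 * np.sin(z / 450 * 2 * np.pi)      # fake sagittal profile

pts = np.column_stack([x, y, z])
length_3d = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
length_t1s1 = z[-1] - z[0]                # single-plane vertical approximation
print(f"3-D length {length_3d:.1f} mm vs T1-S1 projection {length_t1s1:.1f} mm")
```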
NASA Astrophysics Data System (ADS)
Cánovas-García, Fulgencio; Alonso-Sarría, Francisco; Gomariz-Castillo, Francisco; Oñate-Valdivieso, Fernando
2017-06-01
Random forest is a classification technique widely used in remote sensing. One of its advantages is that it produces an estimate of classification accuracy through the so-called out-of-bag (OOB) cross-validation method. It is usually assumed that this estimate is unbiased and may be used in place of validation against an external data-set or a cross-validation external to the algorithm. In this paper we show that this is not necessarily the case when classifying remote sensing imagery using training areas comprising several pixels or objects. According to our results, OOB cross-validation clearly overestimates accuracy, both overall and per class. The reason is that the pixels or objects within a training patch are not statistically independent of one another, yet bootstrapping splits them into in-bag and out-of-bag sets as if they were. We believe that assigning whole patches, rather than individual pixels or objects, to one set or the other would yield a less biased OOB estimate. To address the problem, we propose a modification of the random forest algorithm that splits training patches instead of the pixels (or objects) that compose them. This modified algorithm does not overestimate accuracy, and its predictive capability is no lower than that of the original: when its results are validated against an external data-set, its accuracy does not differ from that of the original algorithm. We analysed three remote sensing images with different classification approaches (pixel- and object-based); in all three cases, the proposed modification produces a less biased accuracy estimate.
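The authors' modification operates inside the forest's bootstrap itself; as a rough off-the-shelf analogue rather than their algorithm, the sketch below contrasts the pixel-level OOB estimate with a patch-level GroupKFold cross-validation on synthetic, patch-correlated data. All sizes and signal strengths are invented.

```python
# Contrast the optimistic pixel-level OOB estimate with a group-aware CV
# that keeps each training patch intact. Synthetic data: pixels within a
# patch share a patch-level offset, so they are correlated.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n_patches, pixels_per_patch = 40, 25
patch_class = rng.integers(0, 2, n_patches)
X, y, groups = [], [], []
for g, c in enumerate(patch_class):
    centre = rng.normal(c * 1.5, 1.0, size=5)  # patch-level signal
    X.append(centre + rng.normal(0, 0.3, size=(pixels_per_patch, 5)))
    y += [c] * pixels_per_patch
    groups += [g] * pixels_per_patch
X, y, groups = np.vstack(X), np.array(y), np.array(groups)

rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)
print(f"OOB accuracy (pixel-level bootstrap): {rf.oob_score_:.3f}")

# Group-aware CV: whole patches go to either the training or the test side.
cv_scores = cross_val_score(rf, X, y, groups=groups, cv=GroupKFold(n_splits=5))
print(f"Patch-level GroupKFold accuracy:      {cv_scores.mean():.3f}")
```

On data like these, the pixel-level OOB score typically comes out higher than the patch-level score, which is exactly the bias the abstract describes.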
Chen, Yinsheng; Li, Zeju; Wu, Guoqing; Yu, Jinhua; Wang, Yuanyuan; Lv, Xiaofei; Ju, Xue; Chen, Zhongping
2018-07-01
Due to the totally different therapeutic regimens needed for primary central nervous system lymphoma (PCNSL) and glioblastoma (GBM), accurate differentiation of the two diseases by noninvasive imaging techniques is important for clinical decision-making. Thirty cases of PCNSL and 66 cases of GBM with conventional T1-contrast magnetic resonance imaging (MRI) were analyzed in this study. A convolutional neural network was used to segment the tumors automatically. A modified scale-invariant feature transform (SIFT) method was utilized to extract three-dimensional local voxel-arrangement information from the segmented tumors, and Fisher vector encoding was used to normalize the dimensionality of the SIFT features. An improved genetic algorithm (GA) was used to select the SIFT features with PCNSL/GBM discrimination ability. The data-set was divided into a cross-validation cohort and an independent validation cohort in a 2:1 ratio. A support vector machine with leave-one-out cross-validation, based on the 20 cases of PCNSL and 44 cases of GBM in the cross-validation cohort, was employed to build and validate the differentiation model. Among 16,384 high-throughput features, 1356 showed significant differences between PCNSL and GBM at p < 0.05 and 420 at p < 0.001; 496 features were finally chosen by the improved GA. The proposed method differentiates PCNSL from GBM with an area under the curve (AUC) of 99.1% (98.2%), an accuracy of 95.3% (90.6%), a sensitivity of 85.0% (80.0%), and a specificity of 100% (95.5%) on the cross-validation cohort (independent validation cohort in parentheses). Owing to the local voxel-arrangement characterization provided by the SIFT features, the proposed method achieves more competitive PCNSL/GBM differentiation performance using conventional MRI than methods based on advanced MRI.
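As a sketch of the final classification stage only (segmentation, SIFT extraction, Fisher-vector encoding, and GA selection are omitted), the following evaluates a support vector machine with leave-one-out cross-validation on synthetic features shaped like the abstract's cross-validation cohort; the feature values and kernel choice are assumptions.

```python
# SVM with leave-one-out cross-validation on synthetic stand-in features.
# Cohort sizes (20 PCNSL, 44 GBM) and feature count (496) follow the abstract;
# everything else is invented for illustration.
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_pcnsl, n_gbm, n_features = 20, 44, 496
X = np.vstack([rng.normal(0.5, 1.0, (n_pcnsl, n_features)),
               rng.normal(0.0, 1.0, (n_gbm, n_features))])
y = np.array([1] * n_pcnsl + [0] * n_gbm)  # 1 = PCNSL, 0 = GBM

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", probability=True))
# Leave-one-out: each case is held out once; collect its predicted probability.
proba = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="predict_proba")[:, 1]
print(f"LOOCV AUC:      {roc_auc_score(y, proba):.3f}")
print(f"LOOCV accuracy: {accuracy_score(y, proba > 0.5):.3f}")
```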