Airport Landside - Volume III: ALSIM Calibration and Validation.
DOT National Transportation Integrated Search
1982-06-01
This volume discusses calibration and validation procedures applied to the Airport Landside Simulation Model (ALSIM), using data obtained at Miami, Denver and LaGuardia Airports. Criteria for the selection of a validation methodology are described. T...
DOT National Transportation Integrated Search
1995-01-01
This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...
DOT National Transportation Integrated Search
1995-09-01
This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...
Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2
NASA Technical Reports Server (NTRS)
Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)
1998-01-01
The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.
Demougeot-Renard, Helene; De Fouquet, Chantal
2004-10-01
Assessing the volume of soil requiring remediation and the accuracy of this assessment constitutes an essential step in polluted site management. If this remediation volume is not properly assessed, misclassification may lead both to environmental risks (polluted soils may not be remediated) and financial risks (unexpected discovery of polluted soils may generate additional remediation costs). To minimize such risks, this paper proposes a geostatistical methodology based on stochastic simulations that allows the remediation volume and the uncertainty to be assessed using investigation data. The methodology thoroughly reproduces the conditions in which the soils are classified and extracted at the remediation stage. The validity of the approach is tested by applying it on the data collected during the investigation phase of a former lead smelting works and by comparing the results with the volume that has actually been remediated. This real remediated volume was composed of all the remediation units that were classified as polluted after systematic sampling and analysis during clean-up stage. The volume estimated from the 75 samples collected during site investigation slightly overestimates (5.3% relative error) the remediated volume deduced from 212 remediation units. Furthermore, the real volume falls within the range of uncertainty predicted using the proposed methodology.
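The core of the approach described above, classifying remediation units across an ensemble of conditional simulations and reading volume uncertainty off that ensemble, can be sketched in a few lines of Python. Everything below (unit count, unit volume, clean-up threshold, and the surrogate lognormal field standing in for true conditional simulations) is an illustrative assumption, not a value from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for conditional geostatistical simulations:
# n_sim realizations of lead concentration over the remediation units.
n_sim, n_units = 500, 212          # assumed counts, not the study's design
unit_volume = 10.0                 # m^3 per remediation unit (assumed)
threshold = 300.0                  # clean-up threshold, mg/kg (assumed)

# Lognormal draws as a crude surrogate for simulated concentrations.
conc = rng.lognormal(mean=5.0, sigma=0.8, size=(n_sim, n_units))

# Each realization classifies units exactly as the clean-up stage would:
# a unit is remediated if its simulated concentration exceeds the threshold.
polluted = conc > threshold
volumes = polluted.sum(axis=1) * unit_volume   # remediation volume per realization

print(f"expected remediation volume: {volumes.mean():.0f} m^3")
print(f"90% uncertainty interval: "
      f"[{np.percentile(volumes, 5):.0f}, {np.percentile(volumes, 95):.0f}] m^3")
```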
NASA Technical Reports Server (NTRS)
Coppolino, Robert N.
2018-01-01
Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subsystems and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
Assessing the quality of the volume-outcome relationship in uro-oncology.
Mayer, Erik K; Purkayastha, Sanjay; Athanasiou, Thanos; Darzi, Ara; Vale, Justin A
2009-02-01
To systematically assess the quality of evidence for the volume-outcome relationship in uro-oncology and thus facilitate the formulation of health policy within this speciality: 'Implementation of Improving Outcome Guidance' has led to centralization of uro-oncology based on published studies supporting a 'higher volume-better outcome' relationship, but improved awareness of methodological drawbacks in health service research has called the strength of this proposed relationship into question. We systematically searched previous relevant reports and extracted all articles from 1980 onwards assessing the volume-outcome relationship for cystectomy, prostatectomy and nephrectomy at the institution and/or surgeon level. Studies were assessed for their methodological quality using a previously validated rating system. Where possible, meta-analytical methods were used to calculate overall differences in outcome measures between low and high volume healthcare providers. In all, 22 studies were included in the final analysis; 19 of these were published in the last 5 years. Only four studies appropriately explored the effect of both the institution and surgeon volume on outcome measures. Mortality and length of stay were the most frequently measured outcomes. The median total quality scores within each of the operation types were 8.5, 9 and 8 for cystectomy, prostatectomy and nephrectomy, respectively (possible maximum score 18). Random-effects modelling showed a higher risk of mortality in low-volume institutions than in higher-volume institutions for both cystectomy and nephrectomy (odds ratio 1.88, 95% confidence interval 1.54-2.29, and 1.28, 1.10-1.49, respectively). The methodological quality of volume-outcome research as applied to cystectomy, prostatectomy and nephrectomy is only modest at best. Accepting several limitations, pooled analysis confirms a higher-volume, lower-mortality relationship for cystectomy and nephrectomy. Future research should focus on the development of a quality framework with a validated scoring system for the bench-marking of data to improve validity and facilitate rational policy-making within the speciality of uro-oncology.
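The random-effects pooling reported above (e.g. odds ratio 1.88 for cystectomy mortality) is conventionally computed with the DerSimonian-Laird estimator. A minimal sketch follows; the 2x2 study counts are invented for illustration and carry no relation to the reviewed series.

```python
import numpy as np

# Invented 2x2 counts per study: (deaths_low, n_low, deaths_high, n_high)
studies = [(30, 400, 18, 420), (22, 350, 15, 390), (41, 600, 25, 580)]

log_or, var = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c                    # survivors in each arm
    log_or.append(np.log((a * d) / (b * c)))
    var.append(1/a + 1/b + 1/c + 1/d)        # Woolf variance of the log OR
log_or, var = np.array(log_or), np.array(var)

# DerSimonian-Laird between-study variance tau^2
w = 1 / var
q = np.sum(w * (log_or - np.average(log_or, weights=w)) ** 2)
tau2 = max(0.0, (q - (len(studies) - 1)) / (w.sum() - (w**2).sum() / w.sum()))

w_re = 1 / (var + tau2)                      # random-effects weights
pooled = np.average(log_or, weights=w_re)
se = np.sqrt(1 / w_re.sum())
print(f"pooled OR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f})")
```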
Working Papers in Dialogue Modeling, Volume 2.
ERIC Educational Resources Information Center
Mann, William C.; And Others
The technical working papers that comprise the two volumes of this document are related to the problem of creating a valid process model of human communication in dialogue. In Volume 2, the first paper concerns study methodology, and raises such issues as the choice between system-building and process-building, and the advantages of studying cases…
NASA Technical Reports Server (NTRS)
Bogomolov, V. V.; Duncan, J. M.; Alferova, I. V.; Dulchavsky, S. A.; Ebert, D.; Hamilton, D. R.; Matveev, V. P.; Sargsyan, A. E.
2008-01-01
Recent advances in remotely guided imaging techniques on the ISS allow the acquisition of high quality ultrasound data by crewmember operators with no medical background and minimal training. However, ongoing efforts are required to develop and validate methodology for complex imaging protocols to ensure their repeatability, efficiency, and suitability for use aboard the ISS. This Station Developmental Test Objective (SDTO) tests a cardiovascular evaluation methodology that takes advantage of the ISS ultrasound capability, the Braslet-M device, and modified respiratory maneuvers (Valsalva and Mueller) to broaden the spectrum of anatomical and functional information on the human cardiovascular system during long-duration space missions. The proposed methodology optimizes and combines new and previously demonstrated methods, and is expected to benefit medically indicated assessments, operational research protocols, and data collections for science. Braslet-M is a current Russian operational countermeasure that compresses the upper thigh to impede venous return from the lower extremities. The goal of the SDTO is to establish and validate a repeatable ultrasound-based methodology for the assessment of a number of cardiovascular criteria in microgravity. The Braslet-M device is used as a means to acutely alter volume distribution while focused ultrasound measurements are performed. Modified respiratory maneuvers are performed during volume manipulations to record commensurate changes in anatomical and functional parameters. The overall cardiovascular effects of the Braslet-M device are not completely understood, and although not a primary objective of this SDTO, this effort will provide pilot data regarding the suitability of Braslet-M for its intended purpose, its effects, and the indications for its use.
Fan, Leimin; Lee, Jacob; Hall, Jeffrey; Tolentino, Edward J; Wu, Huaiqin; El-Shourbagy, Tawakol
2011-06-01
This article describes validation work for the analysis of an Abbott investigational drug (Compound A) in monkey whole blood with dried blood spots (DBS). The impact of DBS spotting volume on analyte concentration was investigated. The quantitation range was between 30.5 and 10,200 ng/ml. Accuracy and precision of quality controls, linearity of calibration curves, matrix effect, selectivity, dilution, recovery and multiple stabilities were evaluated in the validation, and all demonstrated acceptable results. Incurred sample reanalysis was performed, with 57 out of 58 samples having a percentage difference (versus the mean value) of less than 20%. A linear relationship between spotting volume and spot area was observed, and the influence of spotting volume on concentration was discussed. All validation results met good laboratory practice acceptance requirements. Radial spreading of blood on DBS cards can be a factor in DBS concentrations at smaller spotting volumes.
Wetzel, Hermann
2006-01-01
In a large number of mostly retrospective association studies, a statistical relationship between volume and quality of health care has been reported. However, the relevance of these results is frequently limited by methodological shortcomings. In this article, criteria for the evidence and definition of thresholds for volume-outcome relations are proposed, e.g. the specification of relevant outcomes for quality indicators, analysis of volume as a continuous variable with an adequate case-mix and risk adjustment, accounting for cluster effects and considering mathematical models for the derivation of cut-off values. Moreover, volume thresholds are regarded as surrogate parameters for the indirect classification of the quality of care, whose diagnostic validity and effectiveness in improving health care quality need to be evaluated in prospective studies.
Antoniou, Stavros A; Andreou, Alexandros; Antoniou, George A; Koch, Oliver O; Köhler, Gernot; Luketina, Ruzica-R; Bertsias, Antonios; Pointner, Rudolph; Granderath, Frank-Alexander
2015-11-01
Measures have been taken to improve methodological quality of randomized controlled trials (RCTs). This review systematically assessed the trends in volume and methodological quality of RCTs on minimally invasive surgery within a 10-year period. RCTs on minimally invasive surgery were searched in the 10 most cited general surgical journals and the 5 most cited journals of laparoscopic interest for the years 2002 and 2012. Bibliometric and methodological quality components were abstracted using the Scottish Intercollegiate Guidelines Network. The pooled number of RCTs from low-contribution regions demonstrated an increasing proportion of the total published RCTs, compensating for a concomitant decrease of the respective contributions from Europe and North America. International collaborations were more frequent in 2012. Acceptable or high quality RCTs accounted for 37.9% and 54.4% of RCTs published in 2002 and 2012, respectively. Components of external validity were poorly reported. Both the volume and the reporting quality of laparoscopic RCTs have increased from 2002 to 2012, but there seems to be ample room for improvement of methodological quality. Copyright © 2015 Elsevier Inc. All rights reserved.
Okariz, Ana; Guraya, Teresa; Iturrondobeitia, Maider; Ibarretxe, Julen
2017-02-01
The SIRT (Simultaneous Iterative Reconstruction Technique) algorithm is commonly used in Electron Tomography to calculate the original volume of the sample from noisy images, but the results provided by this iterative procedure are strongly dependent on the specific implementation of the algorithm, as well as on the number of iterations employed for the reconstruction. In this work, a methodology for selecting the iteration number of the SIRT reconstruction that provides the most accurate segmentation is proposed. The methodology is based on the statistical analysis of the intensity profiles at the edge of the objects in the reconstructed volume. A phantom which resembles a carbon black aggregate has been created to validate the methodology, and the SIRT implementations of two free software packages (TOMOJ and TOMO3D) have been used. Copyright © 2016 Elsevier B.V. All rights reserved.
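For orientation, the iteration the authors tune is the simultaneous SIRT update sketched below on a toy 1-D problem with an invented projection geometry (not the actual TOMOJ or TOMO3D implementation); the printed edge step is a crude stand-in for the paper's edge intensity-profile criterion.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy projection geometry: m projections of an n-pixel object (invented).
m, n = 80, 50
A = rng.random((m, n))
x_true = np.zeros(n)
x_true[15:35] = 1.0                  # a simple "object" with sharp edges
b = A @ x_true

# SIRT update: x <- x + C A^T R (b - A x), with R, C the inverse row/column sums.
R = 1.0 / A.sum(axis=1)
C = 1.0 / A.sum(axis=0)
x = np.zeros(n)
for it in range(1, 201):
    x += C * (A.T @ (R * (b - A @ x)))
    if it in (10, 50, 200):
        # Edge sharpness proxy: intensity jump across the object boundary,
        # in the spirit of the paper's edge-profile analysis.
        print(f"iter {it:3d}: residual {np.linalg.norm(b - A @ x):.3e}, "
              f"edge step {x[20] - x[10]:.3f}")
```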
Higher order solution of the Euler equations on unstructured grids using quadratic reconstruction
NASA Technical Reports Server (NTRS)
Barth, Timothy J.; Frederickson, Paul O.
1990-01-01
High order accurate finite-volume schemes for solving the Euler equations of gasdynamics are developed. Central to the development of these methods are the construction of a k-exact reconstruction operator given cell-averaged quantities and the use of high order flux quadrature formulas. General polygonal control volumes (with curved boundary edges) are considered. The formulations presented make no explicit assumption as to complexity or convexity of control volumes. Numerical examples are presented for Ringleb flow to validate the methodology.
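The cell-average-based reconstruction central to such schemes can be illustrated in one dimension: a piecewise-linear (1-exact) limited reconstruction from cell averages feeding an upwind flux for linear advection. This is a deliberately simplified stand-in for the quadratic, unstructured-grid operator of the paper.

```python
import numpy as np

# Linear advection u_t + a u_x = 0 on a periodic grid, finite-volume form.
n, a, cfl = 100, 1.0, 0.4
dx = 1.0 / n
dt = cfl * dx / a
x = (np.arange(n) + 0.5) * dx
u = np.exp(-200 * (x - 0.3) ** 2)        # cell averages of a smooth pulse

def minmod(p, q):
    # Limited slope: zero at extrema, the smaller one-sided slope elsewhere.
    return np.where(p * q > 0, np.sign(p) * np.minimum(np.abs(p), np.abs(q)), 0.0)

for _ in range(int(0.4 / dt)):
    # 1-exact reconstruction: a limited slope per cell from neighbor averages.
    s = minmod(np.roll(u, -1) - u, u - np.roll(u, 1)) / dx
    u_face = u + 0.5 * dx * s            # value at each cell's right face
    flux = a * u_face                    # upwind flux for a > 0
    u -= dt / dx * (flux - np.roll(flux, 1))

print(f"peak after advection: {u.max():.3f} (started at 1.0)")
```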
A normative price for a manufactured product: The SAMICS methodology. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.
1979-01-01
A summary for the Solar Array Manufacturing Industry Costing Standards report contains a discussion of capabilities and limitations, a non-technical overview of the methodology, and a description of the input data which must be collected. It also describes the activities that were and are being taken to ensure validity of the results and contains an up-to-date bibliography of related documents.
ERIC Educational Resources Information Center
Kaskowitz, David H.
The booklet provides detailed estimates on handicapping conditions for school aged populations. The figures are intended to help the federal government validate state child count data as required by P.L. 94-142, the Education for All Handicapped Children Act. Section I describes the methodology used to arrive at the estimates, and it identifies the…
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 1
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data was reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. This is Volume 1, an Executive Summary. Volume 2 contains Appendices A (Aerothermal Comparisons) and B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots), and Volume 3 contains Appendix C (Comparison of Interference Factors among OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 3
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data was reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. Volume 2 contains Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots). This is Volume 3, containing Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 2
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data was reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. This is Volume 2, containing Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots). Volume 3 contains Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
NASA Technical Reports Server (NTRS)
Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.
1993-01-01
This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios of from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data is presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.
Methodology for Software Reliability Prediction. Volume 2.
1987-11-01
The overall acquisition program shall include the resources, schedule, management, structure, and controls necessary to ensure that specified ... Independent Verification/Validation; Programming Team Structure; Educational Level of Team Members; Experience Level of Team Members; Methods Used ... Prediction or Estimation Parameter Supported: Software Characteristics. Objectives: Structured programming studies and Government ... procurement
76 FR 36582 - Submission for Review: Standard Form 2809, Health Benefits Election Form
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
..., 2010 at Volume 75 FR 39587 allowing for a 60-day public comment period. We received comments from one... comments that: 1. Evaluate whether the proposed collection of information is necessary for the proper..., including the validity of the methodology and assumptions used; 3. Enhance the quality, utility, and clarity...
NASA Technical Reports Server (NTRS)
Hamilton, Douglas; Sargsyan, Ashot E.; Ebert, Douglas; Duncan, Michael; Bogomolov, Valery V.; Alferova, Irina V.; Matveev, Vladimir P.; Dulchavsky, Scott A.
2010-01-01
The objective of this joint U.S.-Russian project was the development and validation of an in-flight methodology to assess a number of cardiac and vascular parameters associated with circulating volume and its manipulation in long-duration space flight. Responses to modified Valsalva and Mueller maneuvers were measured by cardiac and vascular ultrasound (US) before, during, and after temporary volume reduction by means of Braslet-M thigh occlusion cuffs (Russia). Materials and Methods: The study protocol was conducted in 14 sessions on 9 ISS crewmembers, with an average exposure to microgravity of 122 days. Baseline cardiovascular measurements were taken by echocardiography in multiple modes (including tissue Doppler of both ventricles) and femoral and jugular vein imaging on the International Space Station (ISS). The Braslet devices were then applied and measurements were repeated after >10 minutes. The cuffs were then released and the hemodynamic recovery process was monitored. Modified Valsalva and Mueller maneuvers were used throughout the protocol. All US data were acquired by the HDI-5000 ultrasound system aboard the ISS (ATL/Philips, USA) during remotely guided sessions. The study protocol, including the use of Braslet-M for this purpose, was approved by the ISS Human Research Multilateral Review Board (HRMRB). Results: The effects of fluid sequestration on a number of echocardiographic and vascular parameters were readily detectable by in-flight US, as were responses to respiratory maneuvers. The overall volume status assessment methodology appears to be valid and practical, with a decrease in left heart lateral E (tissue Doppler) as one of the most reliable measures. An increase in the femoral vein cross-sectional areas was consistently observed with Braslet application. Other significant differences and trends within the extensive cardiovascular data were also observed (decreased RV and LV preload indices, cardiac output, LV E in all maneuvers, and LV stroke volume). Conclusions: This study: 1) addressed specific aspects of operational space medicine and space physiology, including assessment of circulating volume disturbances; 2) expanded the applications of diagnostic ultrasound imaging and Doppler techniques in microgravity; 3) used respiratory maneuvers against the background of acute circulating volume manipulations, which appear to enhance the ability to noninvasively detect volume-dependency in a number of cardiac and vascular parameters; 4) determined that the Tei index is not clinically changed, and therefore contractility is not altered, in the face of reduced preload; 5) determined that increased femoral vein area, indicating blood sequestered in the lower extremities, correlates with reduced preload and cardiac output; and 6) suggested that Braslet may be the only feasible means of acutely treating high pressure pulmonary edema in reduced gravity environments.
A Validity-Based Approach to Quality Control and Assurance of Automated Scoring
ERIC Educational Resources Information Center
Bejar, Isaac I.
2011-01-01
Automated scoring of constructed responses is already operational in several testing programmes. However, as the methodology matures and the demand for the utilisation of constructed responses increases, the volume of automated scoring is likely to increase at a fast pace. Quality assurance and control of the scoring process will likely be more…
2006-09-01
... Team DCC CLIP architecture. 2.2.4 Methodology: The test conditions for the DCC team consisted of an instrumented Mercedes-Benz S-class vehicle
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-05
... on November 18, 2010 at Volume 75 FR 70710 allowing for a 60-day public comment period. No comments... that: 1. Evaluate whether the proposed collection of information is necessary for the proper..., including the validity of the methodology and assumptions used; 3. Enhance the quality, utility, and clarity...
Dobbin, Kevin K; Cesano, Alessandra; Alvarez, John; Hawtin, Rachael; Janetzki, Sylvia; Kirsch, Ilan; Masucci, Giuseppe V; Robbins, Paul B; Selvan, Senthamil R; Streicher, Howard Z; Zhang, Jenny; Butterfield, Lisa H; Thurin, Magdalena
2016-01-01
There is growing recognition that immunotherapy is likely to significantly improve health outcomes for cancer patients in the coming years. Currently, while a subset of patients experience substantial clinical benefit in response to different immunotherapeutic approaches, the majority of patients do not, yet are still exposed to significant drug toxicities. Therefore, a growing need for the development and clinical use of predictive biomarkers exists in the field of cancer immunotherapy. Predictive cancer biomarkers can be used to identify the patients who are or who are not likely to derive benefit from specific therapeutic approaches. In order to be applicable in a clinical setting, predictive biomarkers must be carefully shepherded through a step-wise, highly regulated developmental process. Volume I of this two-volume document focused on the pre-analytical and analytical phases of the biomarker development process, by providing background, examples and "good practice" recommendations. In the current Volume II, the focus is on the clinical validation, validation of clinical utility and regulatory considerations for biomarker development. Together, this two-volume series is meant to provide guidance on the entire biomarker development process, with a particular focus on the unique aspects of developing immune-based biomarkers. Specifically, knowledge about the challenges to clinical validation of predictive biomarkers, which has been gained from numerous successes and failures in other contexts, will be reviewed together with statistical methodological issues related to bias and overfitting. The different trial designs used for the clinical validation of biomarkers will also be discussed, as the selection of clinical metrics and endpoints becomes critical to establish the clinical utility of the biomarker during the clinical validation phase of the biomarker development. Finally, the regulatory aspects of submission of biomarker assays to the U.S. Food and Drug Administration as well as regulatory considerations in the European Union will be covered.
Hyperbolic reformulation of a 1D viscoelastic blood flow model and ADER finite volume schemes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montecinos, Gino I.; Müller, Lucas O.; Toro, Eleuterio F.
2014-06-01
The applicability of ADER finite volume methods to solve hyperbolic balance laws with stiff source terms in the context of well-balanced and non-conservative schemes is extended to solve a one-dimensional blood flow model for viscoelastic vessels, reformulated as a hyperbolic system via a relaxation time. A criterion for selecting relaxation times is found and an empirical convergence rate assessment is carried out to support this result. The proposed methodology is validated by applying it to a network of viscoelastic vessels for which experimental and numerical results are available. The agreement between the results obtained in the present paper and those available in the literature is satisfactory. Key features of the present formulation and numerical methodologies, such as accuracy, efficiency and robustness, are fully discussed in the paper.
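The stiff-relaxation structure such schemes must cope with can be seen in the simplest model problem, u_t + a u_x = -(u - u_eq)/tau, split here into an explicit upwind transport step and an exact relaxation step. The ADER machinery of the paper replaces this crude splitting with a high-order one-step method; the sketch only shows why a stiff source (tau much smaller than dt) forces non-explicit treatment. All parameter values are assumptions.

```python
import numpy as np

n, a, tau, u_eq = 200, 1.0, 1e-4, 0.5    # tau << dt: a stiff source (assumed)
dx = 1.0 / n
dt = 0.4 * dx / a
x = (np.arange(n) + 0.5) * dx
u = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)

for _ in range(100):
    u = u - a * dt / dx * (u - np.roll(u, 1))    # explicit upwind transport
    u = u_eq + (u - u_eq) * np.exp(-dt / tau)    # exact relaxation of the source

# With dt >> tau the solution collapses onto u_eq; an explicit source update
# at this dt would instead blow up, which is why stiffness needs special care.
print(f"max deviation from u_eq: {np.abs(u - u_eq).max():.2e}")
```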
A multifractal approach to space-filling recovery for PET quantification.
Willaime, Julien M Y; Aboagye, Eric O; Tsoumpas, Charalampos; Turkheimer, Federico E
2014-11-01
A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUV mean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic (18)F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical (18)F-fluorothymidine PET test-retest dataset. TLA estimates were stable for a range of resolutions typical in PET oncology (4-6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUV mean or TV measurements across imaging protocols. The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
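The space-filling index used above belongs to the family of box-counting fractal dimensions. The sketch below estimates that dimension for a synthetic binary lesion mask; the mask, image size, and box sizes are invented for illustration and this is not the paper's multifractal implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 2-D "lesion" mask: a disc with a ragged boundary (invented).
n = 256
yy, xx = np.mgrid[:n, :n]
r = np.hypot(xx - n / 2, yy - n / 2)
mask = r < 60 + 8 * rng.standard_normal(r.shape)

sizes = [2, 4, 8, 16, 32]
counts = []
for s in sizes:
    # Count boxes of side s that contain any part of the object.
    boxed = mask[:n - n % s, :n - n % s].reshape(n // s, s, n // s, s)
    counts.append(np.count_nonzero(boxed.any(axis=(1, 3))))

# Slope of log N(s) vs log(1/s) estimates the box-counting dimension.
dim = np.polyfit(np.log(1 / np.array(sizes)), np.log(counts), 1)[0]
print(f"estimated box-counting dimension: {dim:.2f}")   # close to 2 for a filled disc
```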
Evaluation and Validation (E&V) Team Public Report. Volume 1.
1984-11-30
[Table-of-contents fragments: Table K-6, Possible Approaches to Integrating Methodology Components Across Life Cycle Phases; 1.2 Background; 2. Scope; 3. E&V Technical Approach.] The report provides a detailed and organized approach to the development of technology which will be used as a basis for the E&V of APSEs.
LLCEDATA and LLCECALC for Windows version 1.0, Volume 3: Software verification and validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFadden, J.G.
1998-09-04
LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of data bases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank from which it originates. LLCECALC reads the EDF and the gamma assay file (AV2) that is produced by the flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, which discusses system limitations and provides recommendations to the LLCE process. Volume 3 documents LLCEDATA and LLCECALC's verification and validation. Two of the three installation test cases, from Volume 1, are independently confirmed. Data bases used in LLCEDATA are verified and referenced. Both phases of LLCECALC processing, gamma and characterization, are extensively tested to verify that the methodology and algorithms used are correct.
Engel, Lisa; Chui, Adora; Beaton, Dorcas E; Green, Robin E; Dawson, Deirdre R
2018-03-07
To critically appraise the measurement property evidence (ie, psychometric) for 8 observation-based financial management assessment instruments. Seven databases were searched in May 2015. Two reviewers used an independent decision-agreement process to select studies of measurement property evidence relevant to populations with adulthood acquired cognitive impairment, appraise the quality of the evidence, and extract data. Twenty-one articles were selected. This review used the COnsensus-based Standards for the selection of health Measurement Instruments review guidelines and 4-point tool to appraise evidence. After appraising the methodologic quality, the adequacy of results and volume of evidence per instrument were synthesized. Measurement property evidence with high risk of bias was excluded from the synthesis. The volume of measurement property evidence per instrument is low; most instruments had 1 to 3 included studies. Many included studies had poor methodologic quality per measurement property evidence area examined. Six of the 8 instruments reviewed had supporting construct validity/hypothesis-testing evidence of fair methodologic quality. There is a dearth of acceptable quality content validity, reliability, and responsiveness evidence for all 8 instruments. Rehabilitation practitioners assess financial management functions in adults with acquired cognitive impairments. However, there is limited published evidence to support using any of the reviewed instruments. Practitioners should exercise caution when interpreting the results of these instruments. This review highlights the importance of appraising the quality of measurement property evidence before examining the adequacy of the results and synthesizing the evidence. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
A multifractal approach to space-filling recovery for PET quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willaime, Julien M. Y., E-mail: julien.willaime@siemens.com; Aboagye, Eric O.; Tsoumpas, Charalampos
2014-11-01
Purpose: A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). Methods: A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUV mean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic (18)F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical (18)F-fluorothymidine PET test-retest dataset. Results: TLA estimates were stable for a range of resolutions typical in PET oncology (4-6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUV mean or TV measurements across imaging protocols. Conclusions: The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
NASA Technical Reports Server (NTRS)
Pieper, Jerry L.; Walker, Richard E.
1993-01-01
During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability of each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.
The influence of voxel size on atom probe tomography data.
Torres, K L; Daniil, M; Willard, M A; Thompson, G B
2011-05-01
A methodology for determining the optimal voxel size for phase thresholding in nanostructured materials was developed using an atom simulator and a model system of a fixed two-phase composition and volume fraction. The voxel size range was banded by the atom count within each voxel. Some voxel edge lengths were found to be too large, resulting in an averaging of compositional fluctuations; others were too small with concomitant decreases in the signal-to-noise ratio for phase identification. The simulated methodology was then applied to the more complex experimentally determined data set collected from a (Co(0.95)Fe(0.05))(88)Zr(6)Hf(1)B(4)Cu(1) two-phase nanocomposite alloy to validate the approach. In this alloy, Zr and Hf segregated to an intergranular amorphous phase while Fe preferentially segregated to a crystalline phase during the isothermal annealing step that promoted primary crystallization. The atom probe data analysis of the volume fraction was compared to transmission electron microscopy (TEM) dark-field imaging analysis and a lever rule analysis of the volume fraction within the amorphous and crystalline phases of the ribbon. Copyright © 2011 Elsevier B.V. All rights reserved.
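The trade-off described above (voxels too large average away compositional fluctuations, voxels too small drown in counting noise) can be reproduced with a toy two-phase atom cloud, binning positions at several edge lengths and watching the per-voxel composition spread. The geometry, atom count, and phase compositions below are invented, not the alloy's values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy two-phase atom cloud in a 40 nm box: a solute-rich slab in a lean matrix.
n_atoms = 200_000
pos = rng.uniform(0, 40, size=(n_atoms, 3))
in_slab = (pos[:, 0] > 15) & (pos[:, 0] < 25)
p_solute = np.where(in_slab, 0.30, 0.05)          # invented phase compositions
is_solute = rng.random(n_atoms) < p_solute

for edge in (1.0, 2.0, 5.0, 10.0):
    bins = [np.arange(0, 40 + edge, edge)] * 3
    total, _ = np.histogramdd(pos, bins=bins)
    solute, _ = np.histogramdd(pos[is_solute], bins=bins)
    conc = solute[total > 0] / total[total > 0]    # per-voxel solute fraction
    print(f"edge {edge:4.1f} nm: atoms/voxel ~{total.mean():6.0f}, "
          f"composition spread (std) {conc.std():.3f}")
```

Small edges give few atoms per voxel and a noise-dominated spread; large edges mix the two phases within single voxels, flattening the very fluctuations one is trying to threshold.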
2012-01-01
Background: Response surface methodology by Box–Behnken design employing the multivariate approach enables substantial improvement in method development using fewer experiments, without wastage of large volumes of organic solvents, which leads to high analysis cost. This methodology has not been employed for development of a method for analysis of atorvastatin calcium (ATR-Ca). Results: The present research study describes the use of response surface methodology in the optimization and validation of a new microwell-based UV-visible spectrophotometric method for the determination of ATR-Ca in its tablets. By the use of quadratic regression analysis, equations were developed to describe the behavior of the response as simultaneous functions of the selected independent variables. Accordingly, the optimum conditions were determined, which included concentration of 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ), time of reaction and temperature. The absorbance of the colored CT complex was measured at 460 nm by a microwell-plate absorbance reader. The method was validated, in accordance with ICH guidelines, for accuracy, precision, selectivity and linearity (r² = 0.9993) over the concentration range of 20-200 μg/ml. The assay was successfully applied to the analysis of ATR-Ca in its pharmaceutical dosage forms with good accuracy and precision. Conclusion: The assay described herein has great practical value in the routine analysis of ATR-Ca in quality control laboratories, as it has high-throughput property, consumes minimum volume of organic solvent (thus offering a reduction in the exposure of analysts to the toxic effects of organic solvents, an environmentally friendly "Green" approach) and reduces the analysis cost by 50-fold. PMID:23146143
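The response-surface step itself, fitting a full second-order polynomial in the coded factors and solving for the stationary point, is sketched below on invented absorbance data for the three factors; the coefficients have no relation to the actual DDQ assay.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented design: coded levels (-1, 0, +1) for concentration, time, temperature.
X = np.array([[a, b, c] for a in (-1, 0, 1) for b in (-1, 0, 1) for c in (-1, 0, 1)],
             dtype=float)
# Invented "true" response with an interior optimum, plus noise.
y = (1.0 + 0.20*X[:, 0] + 0.15*X[:, 1] + 0.10*X[:, 2]
     - 0.30*X[:, 0]**2 - 0.25*X[:, 1]**2 - 0.20*X[:, 2]**2
     + 0.05*X[:, 0]*X[:, 1] + 0.02 * rng.standard_normal(len(X)))

# Full quadratic model: intercept, linear, squared, and two-way interaction terms.
def design(X):
    cols = [np.ones(len(X))] + [X[:, i] for i in range(3)]
    cols += [X[:, i]**2 for i in range(3)]
    cols += [X[:, i]*X[:, j] for i in range(3) for j in range(i + 1, 3)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# Stationary point of the fitted quadratic surface: solve gradient = 0.
b_lin = beta[1:4]
H = np.diag(2 * beta[4:7])
H[0, 1] = H[1, 0] = beta[7]; H[0, 2] = H[2, 0] = beta[8]; H[1, 2] = H[2, 1] = beta[9]
opt = np.linalg.solve(-H, b_lin)
print("fitted optimum (coded units):", np.round(opt, 2))
```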
Catanuto, Giuseppe; Taher, Wafa; Rocco, Nicola; Catalano, Francesca; Allegra, Dario; Milotta, Filippo Luigi Maria; Stanco, Filippo; Gallo, Giovanni; Nava, Maurizio Bruno
2018-03-20
Breast shape is defined utilizing mainly qualitative assessments (full, flat, ptotic) or estimates, such as volume or distances between reference points, that cannot describe it reliably. We quantitatively describe breast shape with two parameters derived from a statistical methodology termed principal component analysis (PCA). We created a heterogeneous dataset of breast shapes acquired with a commercial infrared 3-dimensional scanner on which PCA was performed. We plotted on a Cartesian plane the two highest values of PCA for each breast (principal components 1 and 2). The methodology was tested on a preoperative and postoperative surgical case, and test-retest evaluation was performed by two operators. The first two principal components derived from PCA are able to characterize the shape of the breasts included in the dataset. The test-retest demonstrated that different operators are able to obtain very similar values of PCA. The system is also able to identify major changes between the preoperative and postoperative stages of a two-stage reconstruction. Even minor changes were correctly detected by the system. This methodology can reliably describe the shape of a breast. An expert operator and a newly trained operator can reach similar results in a test/re-testing validation. Once developed and after further validation, this methodology could be employed as a good tool for outcome evaluation, auditing, and benchmarking.
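The two-parameter description amounts to projecting each aligned, vectorized surface onto the first two principal components of the dataset. A minimal numpy version via SVD follows, with a random matrix standing in for the registered 3-dimensional scans.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in data: 40 shapes, each flattened to 300 aligned surface coordinates.
# Real use would put one registered 3-D scan per row.
shapes = rng.standard_normal((40, 300))

# PCA via SVD of the mean-centered data matrix.
centered = shapes - shapes.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Coordinates of every shape on the first two principal components: the two
# numbers the paper plots on a Cartesian plane per breast.
pc_scores = centered @ Vt[:2].T          # shape (40, 2)
explained = S**2 / np.sum(S**2)
print(f"first two PCs explain {100 * explained[:2].sum():.1f}% of variance")
print("shape 0 -> (PC1, PC2) =", np.round(pc_scores[0], 2))
```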
NASA Astrophysics Data System (ADS)
Shpotyuk, Ya; Cebulski, J.; Ingram, A.; Shpotyuk, O.
2017-12-01
Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy applied to nanostructurized substances treated within a three-term fitting procedure are reconsidered to parameterize their atomic-deficient structural arrangement. In contrast to conventional three-term fitting analysis of the detected PAL spectra based on admixed positron trapping and positronium (Ps) decay, the nanostructurization due to guest nanoparticles embedded in a host matrix is considered as producing modified trapping, which involves conversion between these channels. The developed approach, referred to as the x3-x2-coupling decomposition algorithm, allows estimation of the free volumes of interfacial voids responsible for positron trapping and of bulk lifetimes in nanoparticle-embedded substances. This methodology is validated using the experimental data of Chakraverty et al. [Phys. Rev. B 71 (2005) 024115] on a PAL study of composites formed by guest NiFe2O4 nanocrystals grown in a host SiO2 matrix.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lafata, K; Ren, L; Wu, Q
Purpose: To develop a data-mining methodology based on quantum clustering and machine learning to predict expected dosimetric endpoints for lung SBRT applications based on patient-specific anatomic features. Methods: Ninety-three patients who received lung SBRT at our clinic from 2011-2013 were retrospectively identified. Planning information was acquired for each patient, from which various features were extracted using in-house semi-automatic software. Anatomic features included tumor-to-OAR distances, tumor location, total-lung-volume, GTV and ITV. Dosimetric endpoints were adopted from RTOG-0195 recommendations, and consisted of various OAR-specific partial-volume doses and maximum point-doses. First, PCA analysis and unsupervised quantum clustering were used to explore the feature-space to identify potentially strong classifiers. Second, a multi-class logistic regression algorithm was developed and trained to predict dose-volume endpoints based on patient-specific anatomic features. Classes were defined by discretizing the dose-volume data, and the feature-space was zero-mean normalized. Fitting parameters were determined by minimizing a regularized cost function, and optimization was performed via gradient descent. As a pilot study, the model was tested on two esophageal dosimetric planning endpoints (maximum point-dose, dose-to-5cc), and its generalizability was evaluated with leave-one-out cross-validation. Results: Quantum clustering demonstrated a strong separation of the feature-space at 15 Gy across the first and second principal components of the data when the dosimetric endpoints were retrospectively identified. Maximum point dose prediction to the esophagus demonstrated a cross-validation accuracy of 87%, and the maximum dose to 5cc demonstrated a respective value of 79%. The largest optimized weighting factor was placed on GTV-to-esophagus distance (a factor of 10 greater than the second largest weighting factor), indicating an intuitively strong correlation between this feature and both endpoints. Conclusion: This pilot study shows that it is feasible to predict dose-volume endpoints based on patient-specific anatomic features. The developed methodology can potentially help to identify patients at risk for higher OAR doses, thus improving the efficiency of treatment planning. R01-184173.
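The classifier described (softmax regression over discretized dose classes, zero-mean features, a regularized cost minimized by gradient descent) can be sketched directly. The features and labels below are synthetic; column 0 plays the role of the dominant GTV-to-esophagus distance feature.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic anatomy features (n patients x d features); labels are 3
# discretized dose classes driven mainly by the "distance" column 0.
n, d, k = 200, 5, 3
X = rng.standard_normal((n, d))
labels = np.clip((1.5 - X[:, 0] + 0.3 * rng.standard_normal(n)).astype(int), 0, k - 1)

X = (X - X.mean(axis=0)) / X.std(axis=0)     # zero-mean normalization
X = np.hstack([np.ones((n, 1)), X])          # bias column
Y = np.eye(k)[labels]                        # one-hot targets

def softmax(Z):
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    return P / P.sum(axis=1, keepdims=True)

W = np.zeros((d + 1, k))
lam, lr = 1e-2, 0.5
for _ in range(2000):
    P = softmax(X @ W)
    W -= lr * (X.T @ (P - Y) / n + lam * W)  # regularized cross-entropy gradient

pred = softmax(X @ W).argmax(axis=1)
print(f"training accuracy: {(pred == labels).mean():.2f}")
print("per-class weight on the distance feature:", np.round(W[1], 2))
```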
Methodologies for pre-validation of biofilters and wetlands for stormwater treatment.
Zhang, Kefeng; Randelovic, Anja; Aguiar, Larissa M; Page, Declan; McCarthy, David T; Deletic, Ana
2015-01-01
Water Sensitive Urban Design (WSUD) systems are frequently used as part of stormwater harvesting treatment trains (e.g. biofilters (bio-retentions and rain-gardens) and wetlands). However, validation frameworks for such systems do not exist, limiting their adoption for end-uses such as drinking water. The first stage in the validation framework is pre-validation, which prepares information for further validation monitoring. A pre-validation roadmap, consisting of five steps, is suggested in this paper. Detailed methods for investigating target micropollutants in stormwater, and determining challenge conditions for biofilters and wetlands, are provided. A literature review was undertaken to identify and quantify micropollutants in stormwater. MUSIC V5.1 was utilized to simulate the behaviour of the systems based on 30-year rainfall data in three distinct climate zones; outputs were evaluated to identify the thresholds of operational variables, including length of dry periods (LDPs) and volume of water treated per event. The paper highlights that a number of micropollutants were found in stormwater at levels above various worldwide drinking water guidelines (eight pesticides, benzene, benzo(a)pyrene, pentachlorophenol, di-(2-ethylhexyl)-phthalate and total polychlorinated biphenyls). The 95th percentile LDP was exponentially related to system design area, while the 5th percentile LDP remained within short durations (i.e. 2-8 hours). The 95th percentile volume of water treated per event was exponentially related to system design area as a percentage of the impervious catchment area. The outcomes of this study show that pre-validation can be completed through a roadmap consisting of a series of steps; this will help in the validation of stormwater treatment systems.
Methodologies for Pre-Validation of Biofilters and Wetlands for Stormwater Treatment
Zhang, Kefeng; Randelovic, Anja; Aguiar, Larissa M.; Page, Declan; McCarthy, David T.; Deletic, Ana
2015-01-01
Background: Water Sensitive Urban Design (WSUD) systems are frequently used as part of stormwater harvesting treatment trains (e.g. biofilters (bio-retentions and rain-gardens) and wetlands). However, validation frameworks for such systems do not exist, limiting their adoption for end-uses such as drinking water. The first stage in the validation framework is pre-validation, which prepares information for further validation monitoring. Objectives: A pre-validation roadmap, consisting of five steps, is suggested in this paper. Detailed methods for investigating target micropollutants in stormwater, and determining challenge conditions for biofilters and wetlands, are provided. Methods: A literature review was undertaken to identify and quantify micropollutants in stormwater. MUSIC V5.1 was utilized to simulate the behaviour of the systems based on 30-year rainfall data in three distinct climate zones; outputs were evaluated to identify the thresholds of operational variables, including length of dry periods (LDPs) and volume of water treated per event. Results: The paper highlights that a number of micropollutants were found in stormwater at levels above various worldwide drinking water guidelines (eight pesticides, benzene, benzo(a)pyrene, pentachlorophenol, di-(2-ethylhexyl)-phthalate and total polychlorinated biphenyls). The 95th percentile LDP was exponentially related to system design area, while the 5th percentile LDP remained within short durations (i.e. 2-8 hours). The 95th percentile volume of water treated per event was exponentially related to system design area as a percentage of the impervious catchment area. Conclusions: The outcomes of this study show that pre-validation can be completed through a roadmap consisting of a series of steps; this will help in the validation of stormwater treatment systems. PMID:25955688
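The challenge conditions above (5th/95th percentile dry periods, 95th percentile treated volume per event) reduce to order statistics over a simulated event record. A toy computation on synthetic inter-event data follows; the distributions are assumptions, not MUSIC output.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic 30-year event record: exponential dry spells between storm events
# and lognormal treated volumes per event (both distributions assumed).
n_events = 30 * 80                       # roughly 80 events/year, invented
dry_hours = rng.exponential(scale=60.0, size=n_events)
treated_m3 = rng.lognormal(mean=3.0, sigma=0.7, size=n_events)

# Challenge conditions as used to design validation monitoring:
print(f"5th / 95th percentile dry period: "
      f"{np.percentile(dry_hours, 5):.1f} h / {np.percentile(dry_hours, 95):.1f} h")
print(f"95th percentile treated volume per event: "
      f"{np.percentile(treated_m3, 95):.0f} m^3")
```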
Bonnet, Benjamin; Jourdan, Franck; du Cailar, Guilhem; Fesler, Pierre
2017-08-01
End-systolic left ventricular (LV) elastance (Ees) has been previously calculated and validated invasively using LV pressure-volume (P-V) loops. Noninvasive methods have been proposed, but clinical application remains complex. The aims of the present study were to 1) estimate Ees according to modeling of the LV P-V curve during ejection ("ejection P-V curve" method) and validate our method with existing published LV P-V loop data and 2) test the clinical applicability of noninvasively detecting a difference in Ees between normotensive and hypertensive subjects. On the basis of the ejection P-V curve and a linear relationship between elastance and time during ejection, we used a nonlinear least-squares method to fit the pressure waveform. We then computed the slope and intercept of time-varying elastance as well as the volume intercept (V0). As a validation, 22 P-V loops obtained from previous invasive studies were digitized and analyzed using the ejection P-V curve method. To test clinical applicability, ejection P-V curves were obtained from 33 hypertensive subjects and 32 normotensive subjects with carotid tonometry and real-time three-dimensional echocardiography during the same procedure. A good univariate relationship (r² = 0.92, P < 0.005) and good limits of agreement were found between the invasive calculation of Ees and our new proposed ejection P-V curve method. In hypertensive patients, an increase in arterial elastance (Ea) was compensated by a parallel increase in Ees without change in Ea/Ees. In addition, the clinical reproducibility of our method was similar to that of another noninvasive method. In conclusion, Ees and V0 can be estimated noninvasively from modeling of the P-V curve during ejection. This approach was found to be reproducible and sensitive enough to detect an expected increase in LV contractility in hypertensive patients. Because of its noninvasive nature, this methodology may have clinical implications in various disease states. NEW & NOTEWORTHY The use of real-time three-dimensional echocardiography-derived left ventricular volumes in conjunction with carotid tonometry was found to be reproducible and sensitive enough to detect expected differences in left ventricular elastance in arterial hypertension. Because of its noninvasive nature, this methodology may have clinical implications in various disease states. Copyright © 2017 the American Physiological Society.
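The "ejection P-V curve" idea above models ejection-phase pressure as P(t) = (E0 + kt)(V(t) - V0), with elastance linear in time, and fits the three parameters by nonlinear least squares. A sketch with scipy on synthetic waveforms (not patient data) follows.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic ejection-phase data: volume falls from 120 to 50 ml over 0.3 s.
t = np.linspace(0, 0.3, 60)
V = 120 - 70 * (t / 0.3)
E0_true, k_true, V0_true = 1.0, 4.0, 15.0     # mmHg/ml, mmHg/ml/s, ml (assumed)
P = (E0_true + k_true * t) * (V - V0_true)
P += 1.5 * np.random.default_rng(7).standard_normal(len(t))   # measurement noise

def model(tv, E0, k, V0):
    tt, vv = tv
    return (E0 + k * tt) * (vv - V0)          # linear time-varying elastance

(E0, k, V0), _ = curve_fit(model, (t, V), P, p0=(0.5, 1.0, 0.0))
Ees = E0 + k * t[-1]                          # elastance at end-systole
print(f"fit: E0={E0:.2f}, slope={k:.2f}, V0={V0:.1f} ml, Ees={Ees:.2f} mmHg/ml")
```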
Validation of a Three-Dimensional Method for Counting and Sizing Podocytes in Whole Glomeruli
van der Wolde, James W.; Schulze, Keith E.; Short, Kieran M.; Wong, Milagros N.; Bensley, Jonathan G.; Cullen-McEwen, Luise A.; Caruana, Georgina; Hokke, Stacey N.; Li, Jinhua; Firth, Stephen D.; Harper, Ian S.; Nikolic-Paterson, David J.; Bertram, John F.
2016-01-01
Podocyte depletion is sufficient for the development of numerous glomerular diseases and can be absolute (loss of podocytes) or relative (reduced number of podocytes per volume of glomerulus). Commonly used methods to quantify podocyte depletion introduce bias, whereas gold standard stereologic methodologies are time consuming and impractical. We developed a novel approach for assessing podocyte depletion in whole glomeruli that combines immunofluorescence, optical clearing, confocal microscopy, and three-dimensional analysis. We validated this method in a transgenic mouse model of selective podocyte depletion, in which we determined dose-dependent alterations in several quantitative indices of podocyte depletion. This new approach provides a quantitative tool for the comprehensive and time-efficient analysis of podocyte depletion in whole glomeruli. PMID:26975438
Practical tool to assess reliability of web-based medicines information.
Lebanova, Hristina; Getov, Ilko; Grigorov, Evgeni
2014-02-01
Information disseminated by medicines information systems is not always easy to apply. Nowadays the internet provides access to an enormous volume and range of health information that was previously inaccessible both to medical specialists and to consumers. The aim of this study is to assess the internet as a source of drug- and health-related information and to create a test methodology to evaluate the 10 most visited health-related web-sites in Bulgaria. Using existing scientific methodologies for the evaluation of web sources, a new three-step algorithm consisting of score-card validation of the drug-related information in the 10 most visited Bulgarian web-sites was created. In many cases the drug information in the internet sites contained errors and discrepancies. Some of the published materials were not validated; they were out-of-date and could cause confusion for consumers. The quality of online health information is a cause of considerable information noise and a threat to patients' safety and rational drug use. There is a need for monitoring the drug information available online in order to prevent patient misinformation and confusion that could lead to medication errors and abuse.
NASA Astrophysics Data System (ADS)
Nazir, Mohd Yusuf Mohd; Al-Shorgani, Najeeb Kaid Nasser; Kalil, Mohd Sahaid; Hamid, Aidil Abdul
2015-09-01
In this study, three factors (fructose concentration, agitation speed and monosodium glutamate (MSG) concentration) were optimized to enhance DHA production by Schizochytrium SW1 using response surface methodology (RSM). Central composite design was applied as the experimental design and analysis of variance (ANOVA) was used to analyze the data. The experiments were conducted using 500 mL flasks with 100 mL working volume at 30°C for 96 hours. ANOVA revealed that the process was adequately represented by the quadratic model (p<0.0001) and that two of the factors, namely agitation speed and MSG concentration, significantly affect DHA production (p<0.005). The level of influence for each variable and a quadratic polynomial equation were obtained for DHA production by multiple regression analysis. The estimated optimum conditions for maximizing DHA production by SW1 were 70 g/L fructose, 250 rpm agitation speed and 12 g/L MSG. Consequently, the quadratic model was validated by applying the estimated optimum conditions, which confirmed the model validity, and 52.86% DHA was produced.
Validation of bending tests by nanoindentation for micro-contact analysis of MEMS switches
NASA Astrophysics Data System (ADS)
Broue, Adrien; Fourcade, Thibaut; Dhennin, Jérémie; Courtade, Frédéric; Charvet, Pierre–Louis; Pons, Patrick; Lafontan, Xavier; Plana, Robert
2010-08-01
Research on contact characterization for microelectromechanical system (MEMS) switches has been driven by the necessity to reach a high reliability level for micro-switch applications. One of the main failures observed during cycling of the devices is the increase of the electrical contact resistance. The key issue is the electromechanical behaviour of the materials used at the contact interface through which the current flows. Metal contact switches have a large and complex set of failure mechanisms according to the current level. This paper demonstrates the validity of a new methodology using a commercial nanoindenter coupled with electrical measurements on test vehicles specially designed to investigate the micro-scale contact physics. Dedicated validation tests and modelling are performed to assess the introduced methodology by analyzing the gold contact interface with 5 µm² square bumps at various current levels. The contact temperature rise is measured, which affects the mechanical properties of the contact materials and modifies the contact topology. In addition, the data provide a better understanding of micro-contact behaviour related to the impact of current at low- to medium-power levels. This article was originally submitted for the special section 'Selected papers from the 20th Micromechanics Europe Workshop (MME 09) (Toulouse, France, 20-22 September 2009)', Journal of Micromechanics and Microengineering, volume 20, issue 6.
Eye-Tracking as a Tool in Process-Oriented Reading Test Validation
ERIC Educational Resources Information Center
Solheim, Oddny Judith; Uppstad, Per Henning
2011-01-01
The present paper addresses the continuous need for methodological reflection on how to validate inferences made on the basis of test scores. Validation is a process that requires many lines of evidence. In this article we discuss the potential of eye tracking methodology in process-oriented reading test validation. Methodological considerations…
van Eeden, Annelies E; Roach, Rachel E J; Halbesma, Nynke; Dekker, Friedo W
2012-01-01
To determine and compare the foundation of claims in drug-promoting advertisements in a Dutch journal for physicians and a Dutch journal for pharmacists. A cross-sectional study. We included all the drug-promoting advertisements referring to a randomized controlled trial (RCT) we could find on Medline from 2 volumes of the Dutch Journal of Medicine (Nederlands Tijdschrift voor Geneeskunde; NTvG) and the (also Dutch) Pharmaceutical Weekly (Pharmaceutisch Weekblad; PW). The validity of the advertisements (n = 54) and the methodological quality of the referenced RCTs (n = 150) were independently scored by 250 medical students using 2 standardised questionnaires. The advertisements' sources were concealed from the students. Per journal, the percentage of drug-promoting advertisements having a valid claim and the percentage of high-quality RCT references were determined. Average scores on quality and validity were compared between the 2 journals. On a scale of 0-18 points, the mean quality scores of the RCTs differed by 0.3 (95% CI: -0.1 to 0.7) between the NTvG (score: 14.8; SD: 2.2) and the PW (score: 14.5; SD: 2.6). The difference between the validity scores of drug-promoting advertisements in the NTvG (score: 5.8; SD: 3.3) and the PW (score: 5.6; SD: 3.6) was 0.3 (95% CI: -0.3 to 0.9) on a scale of 0-10 points. For both journals, an average of 15% of drug-promoting advertisements were valid (defined as a validity score of >8 points); 35% of the referenced RCTs were of good methodological quality (defined as a quality score of >16 points). The substantiation of many claims in drug-promoting advertisements in the NTvG and the PW was mediocre. There was no difference between the 2 journals.
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
The derivation of the equations is presented, the rate control algorithm described, and simulation methodologies summarized. A set of dynamics equations that can be used recursively to calculate the forces and torques acting at the joints of an n-link manipulator, given the manipulator joint rates, is derived. The equations are valid for any n-link manipulator system with any kind of joints connected in any sequence. The equations of motion for the class of manipulators consisting of n rigid links interconnected by rotary joints are derived. A technique is outlined for reducing the system of equations to eliminate constraint torques. The linearized dynamics equations for an n-link manipulator system are derived. The general n-link linearized equations are then applied to a two-link configuration. The coordinated rate control algorithm used to compute individual joint rates when given end effector rates is described. A short discussion of simulation methodologies is presented.
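The general n-link recursion is lengthy, so as a hedged illustration the sketch below evaluates the standard closed-form inverse dynamics, tau = M(q)qdd + C(q,qd)qd + g(q), for the planar two-link rotary case to which the report applies its linearized equations; all physical parameters are invented.

```python
# Sketch: inverse dynamics for a planar two-link rotary manipulator, the
# configuration named in the abstract. Parameters are illustrative.
import numpy as np

m1, m2 = 1.0, 0.8          # link masses [kg]
l1 = 0.5                   # length of link 1 [m]
lc1, lc2 = 0.25, 0.20      # centre-of-mass distances [m]
I1, I2 = 0.02, 0.015       # link inertias about their COMs [kg m^2]
grav = 9.81

def inverse_dynamics(q, qd, qdd):
    c2, s2 = np.cos(q[1]), np.sin(q[1])
    # Mass matrix M(q)
    M11 = I1 + I2 + m1 * lc1**2 + m2 * (l1**2 + lc2**2 + 2 * l1 * lc2 * c2)
    M12 = I2 + m2 * (lc2**2 + l1 * lc2 * c2)
    M = np.array([[M11, M12], [M12, I2 + m2 * lc2**2]])
    # Coriolis/centrifugal matrix C(q, qd)
    h = -m2 * l1 * lc2 * s2
    C = np.array([[h * qd[1], h * (qd[0] + qd[1])], [-h * qd[0], 0.0]])
    # Gravity vector g(q)
    g = np.array([(m1 * lc1 + m2 * l1) * grav * np.cos(q[0])
                  + m2 * lc2 * grav * np.cos(q[0] + q[1]),
                  m2 * lc2 * grav * np.cos(q[0] + q[1])])
    return M @ qdd + C @ qd + g   # joint torques [N m]

print(inverse_dynamics(np.array([0.3, 0.5]), np.array([0.1, -0.2]), np.zeros(2)))
```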
Sjögren, P; Ordell, S; Halling, A
2003-12-01
The aim was to describe and systematically review the methodology and reporting of validation in publications describing epidemiological registration methods for dental caries. Literature searches were conducted in six scientific databases. All publications fulfilling the predetermined inclusion criteria were assessed for methodology and reporting of validation using a checklist that included previously described items as well as new ones. The frequency of endorsement of the assessed items was analysed, and the type and strength of evidence were evaluated. Reporting of predetermined items relating to the methodology of validation, and the frequency of endorsement of the assessed items, were of primary interest. Initially, 588 publications were located; 74 eligible publications were obtained, 23 of which fulfilled the inclusion criteria and remained throughout the analyses. A majority of the studies reported the methodology of validation, but the reported methodology was generally inadequate according to the recommendations of evidence-based medicine. The frequencies of reporting of the assessed items (frequencies of endorsement) ranged from 4 to 84 per cent. A majority of the publications contributed a low strength of evidence. There seems to be a need to improve the methodology and the reporting of validation in publications describing professionally registered caries epidemiology. Four of the items assessed in this study are potentially discriminative for quality assessments of reported validation.
Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H
2018-07-01
Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.
SeaWiFS Postlaunch Calibration and Validation Analyses
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine (Editor); McClain, Charles R.; Barnes, Robert A.; Eplee, Robert E., Jr.; Franz, Bryan A.; Hsu, N. Christina; Patt, Frederick S.; Pietras, Christophe M.; Robinson, Wayne D.
2000-01-01
The effort to resolve data quality issues and improve on the initial data evaluation methodologies of the SeaWiFS Project was an extensive one. These evaluations have resulted, to date, in three major reprocessings of the entire data set where each reprocessing addressed the data quality issues that could be identified up to the time of the reprocessing. Three volumes of the SeaWiFS Postlaunch Technical Report Series (Volumes 9, 10, and 11) are needed to document the improvements implemented since launch. Volume 10 continues the sequential presentation of postlaunch data analysis and algorithm descriptions begun in Volume 9. Chapter 1 of Volume 10 describes an absorbing aerosol index, similar to that produced by the Total Ozone Mapping Spectrometer (TOMS) Project, which is used to flag pixels contaminated by absorbing aerosols, such as dust and smoke. Chapter 2 discusses the algorithm being used to remove SeaWiFS out-of-band radiance from the water-leaving radiances. Chapter 3 provides an itemization of all significant changes in the processing algorithms for each of the first three reprocessings. Chapter 4 shows the time series of global clear-water and deep-water (depths greater than 1,000 m) bio-optical and atmospheric properties (normalized water-leaving radiances, chlorophyll, atmospheric optical depth, etc.) based on the eight-day composites as a check on the sensor calibration stability. Chapter 5 examines the variation in the derived products with scan angle using high resolution data around Hawaii to test for residual scan modulation effects and atmospheric correction biases. Chapter 6 provides a methodology for evaluating the atmospheric correction algorithm and atmospheric derived products using ground-based observations. Similarly, Chapter 7 presents match-up comparisons of coincident satellite and in situ data to determine the accuracy of the water-leaving radiances, chlorophyll a, and K(490) products.
2016-01-01
Digital single-molecule technologies are expanding diagnostic capabilities, enabling the ultrasensitive quantification of targets, such as viral load in HIV and hepatitis C infections, by directly counting single molecules. Replacing fluorescent readout with a robust visual readout that can be captured by any unmodified cell phone camera will facilitate the global distribution of diagnostic tests, including in limited-resource settings where the need is greatest. This paper describes a methodology for developing a visual readout system for digital single-molecule amplification of RNA and DNA by (i) selecting colorimetric amplification-indicator dyes that are compatible with the spectral sensitivity of standard mobile phones, and (ii) identifying an optimal ratiometric image-processing scheme for a selected dye to achieve a readout that is robust to lighting conditions and camera hardware and provides unambiguous quantitative results, even for colorblind users. We also include an analysis of the limitations of this methodology, and provide a microfluidic approach that can be applied to expand dynamic range and improve reaction performance, allowing ultrasensitive, quantitative measurements at volumes as low as 5 nL. We validate this methodology using SlipChip-based digital single-molecule isothermal amplification with λDNA as a model and hepatitis C viral RNA as a clinically relevant target. The innovative combination of isothermal amplification chemistry in the presence of a judiciously chosen indicator dye and ratiometric image processing with SlipChip technology allowed the sequence-specific visual readout of single nucleic acid molecules in nanoliter volumes with an unmodified cell phone camera. When paired with devices that integrate sample preparation and nucleic acid amplification, this hardware-agnostic approach will increase the affordability and the distribution of quantitative diagnostic and environmental tests. PMID:26900709
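A minimal sketch of the ratiometric idea described above: averaging two colour channels per well and thresholding their ratio, which cancels overall illumination scale. The channel choice and threshold here are assumptions, not the paper's calibrated values.

```python
# Sketch: ratiometric readout of digital amplification wells from a phone
# photo. The green/red ratio is invariant to exposure scaling; the channel
# pair and threshold are illustrative assumptions.
import numpy as np

def classify_wells(rgb_image, well_masks, threshold=1.2):
    """rgb_image: HxWx3 float array; well_masks: list of boolean HxW masks."""
    calls = []
    for mask in well_masks:
        r = float(np.mean(rgb_image[..., 0][mask]))
        g = float(np.mean(rgb_image[..., 1][mask]))
        ratio = g / max(r, 1e-9)         # cancels overall illumination scale
        calls.append(ratio > threshold)  # True: amplification-positive well
    return calls
```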
Lachenmeier, Dirk W; Plato, Leander; Suessmann, Manuela; Di Carmine, Matthew; Krueger, Bjoern; Kukuck, Armin; Kranz, Markus
2015-01-01
The determination of the alcoholic strength in spirits and liqueurs is required to control the labelling of alcoholic beverages. The reference methodology prescribes a distillation step followed by densimetric measurement. The classic distillation, using a Vigreux rectifying column and a West condenser, is time-consuming and error-prone, especially for liqueurs, which may present problems with entrainment and charring. For this reason, this methodology suggests an automated steam distillation device as an alternative. The novel instrument offers increased steam power, a redesigned condenser geometry and a larger cooling coil with controllable flow compared to previously available devices. Method optimization applying D-optimal and central composite designs showed a significant influence of sample volume, distillation time and coolant flow, while other investigated parameters, such as steam power, receiver volume, or the use of pipettes or flasks for sample measurement, did not significantly influence the results. The method validation was conducted using the following settings: steam power 70%, sample volume 25 mL transferred using pipettes, receiver volume 50 mL, coolant flow 7 L/min, and distillation time as long as possible, just below the calibration mark. For four different liqueurs covering the typical range of these products, between 15 and 35% vol, the method showed adequate precision, with relative standard deviations below 0.4% (intraday) and below 0.6% (interday). The absolute standard deviations were between 0.06% vol and 0.08% vol (intraday) and between 0.07% vol and 0.10% vol (interday). The improved automatic steam distillation devices offer an excellent alternative for sample cleanup of volatiles from complex matrices; a major advantage is the low cost of consumables per analysis (only distilled water is needed). For alcoholic strength determination, the method has become more rugged than before, and only few influences would lead to incomplete distillation. Our validation parameters show that the performance of the method corresponds to the data presented for the reference method, and we believe that automated steam distillation can be used for the purpose of labelling control of alcoholic beverages.
NASA Technical Reports Server (NTRS)
Dean, P. D.; Salikuddin, M.; Ahuja, K. K.; Plumblee, H. E.; Mungur, P.
1979-01-01
The efficiency of internal noise radiation through a coannular exhaust nozzle with an inverted velocity profile was studied. A preliminary investigation was first undertaken to: (1) define the test parameters which influence the internal noise radiation; (2) develop a test methodology which could realistically be used to examine the effects of the test parameters; and (3) validate this methodology. The result was the choice of an acoustic impulse as the internal noise source in the jet nozzles. The noise transmission characteristics of a nozzle system were then investigated. In particular, the effects of fan nozzle convergence angle, core extension length to annulus height ratio, and flow Mach number and temperature were studied. The results are presented as normalized directivity plots.
Borri, Marco; Schmidt, Maria A; Powell, Ceri; Koh, Dow-Mu; Riddell, Angela M; Partridge, Mike; Bhide, Shreerang A; Nutting, Christopher M; Harrington, Kevin J; Newbold, Katie L; Leach, Martin O
2015-01-01
To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes. To evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4), determined with cluster validation, produced the best separation between reducing and non-reducing clusters. The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes.
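A minimal sketch of the workflow's clustering and cluster-validation steps, using k-means and the silhouette score on synthetic voxel features in place of the DCE and DW-MRI parameters used in the study.

```python
# Sketch: unsupervised partitioning of multi-parametric voxel data (one row
# per voxel, columns = imaging parameters) with silhouette-based selection
# of k, mirroring the cluster-validation step described above. Synthetic data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
voxels = np.vstack([rng.normal(m, 0.3, size=(200, 3)) for m in (0.0, 1.5, 3.0)])

best_k, best_score = None, -1.0
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(voxels)
    score = silhouette_score(voxels, labels)   # higher = better-separated clusters
    if score > best_score:
        best_k, best_score = k, score
print("optimal k:", best_k, "silhouette:", round(best_score, 3))
```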
NASA Technical Reports Server (NTRS)
Koster, Randal D.; Reichle, Rolf H.; De Lannoy, Gabrielle J. M.; Liu, Qing; Colliander, Andreas; Conaty, Austin; Jackson, Thomas; Kimball, John
2015-01-01
During the post-launch SMAP calibration and validation (Cal/Val) phase there are two objectives for each science data product team: 1) calibrate, verify, and improve the performance of the science algorithm, and 2) validate the accuracy of the science data product as specified in the science requirements and according to the Cal/Val schedule. This report provides an assessment of the SMAP Level 4 Surface and Root Zone Soil Moisture Passive (L4_SM) product specifically for the product's public beta release scheduled for 30 October 2015. The primary objective of the beta release is to allow users to familiarize themselves with the data product before the validated product becomes available. The beta release also allows users to conduct their own assessment of the data and to provide feedback to the L4_SM science data product team. The assessment of the L4_SM data product includes comparisons of SMAP L4_SM soil moisture estimates with in situ soil moisture observations from core validation sites and sparse networks. The assessment further includes a global evaluation of the internal diagnostics from the ensemble-based data assimilation system that is used to generate the L4_SM product. This evaluation focuses on the statistics of the observation-minus-forecast (O-F) residuals and the analysis increments. Together, the core validation site comparisons and the statistics of the assimilation diagnostics are considered primary validation methodologies for the L4_SM product. Comparisons against in situ measurements from regional-scale sparse networks are considered a secondary validation methodology because such in situ measurements are subject to upscaling errors from the point-scale to the grid cell scale of the data product. Based on the limited set of core validation sites, the assessment presented here meets the criteria established by the Committee on Earth Observing Satellites for Stage 1 validation and supports the beta release of the data. The validation against sparse network measurements and the evaluation of the assimilation diagnostics address Stage 2 validation criteria by expanding the assessment to regional and global scales.
A Thermo-Optic Propagation Modeling Capability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schrader, Karl; Akau, Ron
2014-10-01
A new theoretical basis is derived for tracing optical rays within a finite-element (FE) volume. The ray-trajectory equations are cast into the local element coordinate frame and the full finite-element interpolation is used to determine the instantaneous index gradient for the ray-path integral equation. The FE methodology (FEM) is also used to interpolate local surface deformations and the surface normal vector for computing the refraction angle when launching rays into the volume, and again when rays exit the medium. The method is implemented in the Matlab(TM) environment and compared to closed-form gradient index models. A software architecture is also developed for implementing the algorithms in the Zemax(TM) commercial ray-trace application. A controlled thermal environment was constructed in the laboratory, and measured data was collected to validate the structural, thermal, and optical modeling methods.
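A hedged sketch of the ray-path integration the report describes, d/ds(n dr/ds) = grad(n), with a simple analytic linear index standing in for the FE-interpolated one.

```python
# Sketch: tracing a ray through a gradient-index medium by integrating the
# ray equation d/ds(n dr/ds) = grad(n). An analytic linear index replaces
# the finite-element interpolation; constants are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

n0, g = 1.5, 0.05                      # base index and gradient [1/mm]

def index(r):
    return n0 + g * r[1]               # n grows linearly with y

def grad_index(r):
    return np.array([0.0, g, 0.0])

def ray_ode(s, state):
    r, T = state[:3], state[3:]        # T = n * dr/ds (optical ray vector)
    return np.concatenate([T / index(r), grad_index(r)])

r0 = np.zeros(3)
d0 = np.array([1.0, 0.0, 0.0])         # launch along +x
sol = solve_ivp(ray_ode, (0.0, 50.0),
                np.concatenate([r0, index(r0) * d0]), max_step=0.5)
print("ray endpoint:", sol.y[:3, -1])  # the ray bends toward increasing n
```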
When can social media lead financial markets?
Zheludev, Ilya; Smith, Robert; Aste, Tomaso
2014-02-27
Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes.
When Can Social Media Lead Financial Markets?
NASA Astrophysics Data System (ADS)
Zheludev, Ilya; Smith, Robert; Aste, Tomaso
2014-02-01
Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes.
When Can Social Media Lead Financial Markets?
Zheludev, Ilya; Smith, Robert; Aste, Tomaso
2014-01-01
Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes. PMID:24572909
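A minimal sketch of the ex-ante test used in the entries above: estimating the mutual information between current sentiment and the next period's price direction from binned series. The data here are synthetic stand-ins for message sentiment and S&P500 returns.

```python
# Sketch: ex-ante information content of sentiment for the next period's
# price direction, measured with a histogram-based mutual information
# estimator. Series are synthetic placeholders.
import numpy as np

def mutual_information(x, y, bins=8):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
sentiment = rng.normal(size=500)
noise = rng.normal(size=500)
returns = np.empty(500)
returns[0] = noise[0]
returns[1:] = 0.3 * sentiment[:-1] + noise[1:]   # lagged weak dependence

# ex-ante alignment: sentiment at t vs. price direction at t+1
mi = mutual_information(sentiment[:-1], np.sign(returns[1:]))
print("MI (bits):", round(mi, 4))
```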
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
External gear pumps operating with non-Newtonian fluids: Modelling and experimental validation
NASA Astrophysics Data System (ADS)
Rituraj, Fnu; Vacca, Andrea
2018-06-01
External gear pumps are used in various industries to pump non-Newtonian viscoelastic fluids such as plastics, paints and inks. For both design and analysis purposes, it is often of interest to understand the displacing action realized by the meshing of the gears and the behavior of the leakages in this kind of pump. However, very limited work can be found in the literature on methodologies suitable for modelling such phenomena. This article describes a technique for modelling external gear pumps that operate with non-Newtonian fluids. In particular, it explains how the displacing action of the unit can be modelled using a lumped-parameter approach, which involves dividing the fluid domain into several control volumes and internal flow connections. This work builds upon the HYGESim simulation tool, conceived by the authors' research team over the last decade, which is extended for the first time to the simulation of non-Newtonian fluids. The article also describes several comparisons between simulation results and experimental data obtained from numerous experiments performed to validate the presented methodology. Finally, the operation of external gear pumps with fluids having different viscosity characteristics is discussed.
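A hedged sketch of the lumped-parameter idea described above: one control volume whose pressure builds as dp/dt = (B/V)(net inflow - dV/dt), with turbulent orifice connections to its neighbours. Constants are illustrative and not HYGESim's.

```python
# Sketch: lumped-parameter pressure build-up in one tooth-space control
# volume connected to neighbours through orifices. Illustrative constants.
import numpy as np

B = 1.2e9             # bulk modulus [Pa]
rho = 900.0           # fluid density [kg/m^3]
Cd, A = 0.7, 2e-6     # discharge coefficient, connection area [m^2]

def orifice_flow(p_up, p_down):
    dp = p_up - p_down
    return Cd * A * np.sign(dp) * np.sqrt(2.0 * abs(dp) / rho)

def step_pressure(p, p_neighbors, V, dVdt, dt):
    Q = sum(orifice_flow(pn, p) for pn in p_neighbors)   # net inflow [m^3/s]
    return p + dt * (B / V) * (Q - dVdt)                 # explicit Euler step

p = 1e5
for _ in range(1000):   # march toward the flow-balance equilibrium pressure
    p = step_pressure(p, [2e6, 1e5], V=1e-7, dVdt=0.0, dt=1e-7)
print("chamber pressure [bar]:", round(p / 1e5, 2))
```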
Methodological Considerations in Designing and Evaluating Animal-Assisted Interventions.
Stern, Cindy; Chur-Hansen, Anna
2013-02-27
This paper presents a discussion of the literature on animal-assisted interventions and describes limitations surrounding current methodological quality. Benefits to human physical, psychological and social health cannot be empirically confirmed due to the methodological limitations of the existing body of research, and comparisons cannot validly be made across different studies. Without a solid research base, animal-assisted interventions will not receive recognition and acceptance as a credible alternative health care treatment. The paper draws on the work of four systematic reviews conducted over April-May 2009, with no date restrictions, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. The reviews revealed a lack of good quality studies. Although the literature base has grown in volume since its inception, it predominantly consists of anecdotal accounts and reports. Experimental studies undertaken are often flawed in aspects of design, conduct and reporting. There are few qualitative studies available, leading to the inability to draw definitive conclusions. It is clear that due to the complexities associated with these interventions not all weaknesses can be eliminated. However, there are basic methodological weaknesses that can be addressed in future studies in the area. Checklists for quantitative and qualitative research designs to guide future research are offered to help address methodological rigour.
Duff, Kevin; Suhrie, Kayla R; Dalley, Bonnie C A; Anderson, Jeffrey S; Hoffman, John M
2018-06-08
Within neuropsychology, a number of mathematical formulae (e.g. reliable change index, standardized regression based) have been used to determine if change across time has reliably occurred. When these formulae have been compared, they often produce different results, but 'different' results do not necessarily indicate which formulae are 'best.' The current study sought to further our understanding of change formulae by comparing them to clinically relevant external criteria (amyloid deposition and hippocampal volume). In a sample of 25 older adults with varying levels of cognitive intactness, participants were tested twice across one week with a brief cognitive battery. Seven different change scores were calculated for each participant. An amyloid PET scan (to get a composite of amyloid deposition) and an MRI (to get hippocampal volume) were also obtained. Deviation-based change formulae (e.g. simple discrepancy score, reliable change index with or without correction for practice effects) were all identical in their relationship to the two neuroimaging biomarkers, and all were non-significant. Conversely, regression-based change formulae (e.g. simple and complex indices) showed stronger relationships to amyloid deposition and hippocampal volume. These results highlight the need for external validation of the various change formulae used by neuropsychologists in clinical settings and research projects. The findings also preliminarily suggest that regression-based change formulae may be more relevant than deviation-based change formulae in this context.
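For reference, a short sketch of the two families of change scores compared in this study: the deviation-based reliable change index (Jacobson-Truax form, optionally practice-corrected) and a simple standardized regression-based score. The normative numbers are placeholders.

```python
# Sketch: deviation-based vs. regression-based change scores. Formulas are
# the standard ones; normative values are illustrative placeholders.
import numpy as np

def rci(x1, x2, sd_baseline, r_xx, practice=0.0):
    """Reliable change index, optionally correcting for mean practice effect."""
    sem = sd_baseline * np.sqrt(1.0 - r_xx)      # standard error of measurement
    se_diff = np.sqrt(2.0) * sem                 # standard error of the difference
    return (x2 - x1 - practice) / se_diff

def srb_z(x1, x2, intercept, slope, se_estimate):
    """Standardized regression-based change: observed vs. predicted retest."""
    predicted = intercept + slope * x1
    return (x2 - predicted) / se_estimate

print(round(rci(45, 52, sd_baseline=10.0, r_xx=0.80, practice=3.0), 2))
print(round(srb_z(45, 52, intercept=8.0, slope=0.9, se_estimate=5.0), 2))
```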
Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems
2015-12-01
Dissertation by Sang M. Sok, December 2015; distribution is unlimited. [Garbled report-documentation-page residue removed.] … importance of the CoM. The improved conceptual model methodology (ICoMM) is developed in support of improving the structure of the CoM for both face and …
NASA Astrophysics Data System (ADS)
Muñoz, Randy; Paredes, Javier; Huggel, Christian; Drenkhan, Fabian; García, Javier
2017-04-01
The availability and consistency of data is a determining factor for the reliability of any hydrological model and its simulated results. Unfortunately, in many regions worldwide data are not available in the desired quantity and quality. The Santa River basin (SRB), located within a complex topographic and climatic setting in the tropical Andes of Peru, is a clear example of this challenging situation. A monitoring network of in-situ stations in the SRB recorded series of hydro-meteorological variables but ceased to operate in 1999. In the following years, several researchers evaluated and completed many of these series, and the resulting database was used by multiple research and policy-oriented projects in the SRB. However, hydroclimatic information remains limited, making it difficult to perform research, especially when assessing current and future water resources. In this context, we present an evaluation of different methodologies for interpolating monthly temperature and precipitation data, as well as ice volume data, in glacierized basins with limited data. The methodologies were evaluated for the Quillcay River, a tributary of the SRB, where hydro-meteorological data have been available from nearby monitoring stations since 1983. The study period was 1983-1999, with a validation period of 1993-1999. For the temperature series, the aim was to extend the observed data and interpolate them. NCEP reanalysis data were used to extend the observed series: 1) using a simple correlation with multiple field stations, or 2) applying the altitudinal correction proposed in previous studies. The interpolation was then applied as a function of altitude. Both methodologies provide very similar results, so by parsimony the simple correlation is a viable choice. For the precipitation series, the aim was to interpolate observed data. Two methodologies were evaluated: 1) inverse distance weighting, whose results underestimate the amount of precipitation in high-altitude zones, and 2) ordinary kriging (OK), whose variograms were calculated with the multi-annual monthly mean precipitation and applied to the whole study period. OK leads to better results in both low- and high-altitude zones. For ice volume, the aim was to estimate values from historical data: 1) with the GlabTop algorithm, which requires digital elevation models, available at an appropriate scale only since 2009, and 2) with a widely applied but controversially discussed glacier area-volume relation whose parameters were calibrated with results from the GlabTop model. Both methodologies provide reasonable results, but for historical data the area-volume scaling only requires the glacier area, which is easy to calculate from satellite images available since 1986. In conclusion, the simple correlation, OK, and the calibrated area-volume relation proved the best ways to interpolate glacio-climatic information. However, these methods must be applied carefully and revisited for each specific, highly complex situation. This is a first step towards identifying the most appropriate methods for interpolating and extending observed data in glacierized basins with limited information. Future research should evaluate other methodologies and meteorological data in order to improve hydrological models and water management policies.
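A minimal sketch of the first precipitation interpolator evaluated above, inverse distance weighting; station coordinates and values are invented.

```python
# Sketch: inverse-distance-weighted interpolation of monthly precipitation
# at an ungauged point. Station data are illustrative.
import numpy as np

def idw(xy_stations, values, xy_target, power=2.0):
    d = np.linalg.norm(xy_stations - xy_target, axis=1)
    if np.any(d < 1e-12):                 # target coincides with a station
        return float(values[np.argmin(d)])
    w = 1.0 / d**power                    # closer stations weigh more
    return float(np.sum(w * values) / np.sum(w))

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 12.0], [9.0, 9.0]])
precip = np.array([120.0, 80.0, 150.0, 60.0])   # mm/month
print(idw(stations, precip, np.array([4.0, 5.0])))
```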
Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification
NASA Technical Reports Server (NTRS)
Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle
2011-01-01
NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in Air Force Space Command Manual 91-710, Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations using separate simulation software generated data to meet the range requirements and yielded consistent results. This document captures the Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.
Greenfield, Thomas K.; Kerr, William C.; Bond, Jason; Ye, Yu; Stockwell, Tim
2009-01-01
We investigate several types of graduated frequency (GF) instruments for monitoring drinking patterns. Two studies with 12-month GF measures and daily data were used: (i) the Australian 2004 National Drug Strategy Household Survey (n = 24,109 aged 12+; 22,546 with GF and over 8,000 with yesterday data) and (ii) a US methodological study involving a 28-day daily diary plus GF summary measures drawn from the National Alcohol Survey (n = 3,025 screened, 119 eligible study completers). The NDSHS (study i) used "drop and collect" self-completed forms with random sampling methods; the measurement study (study ii) screened for 3+ drinkers by telephone and collected 28-day drinking diaries and pre- and post-diary 28-day GFs. We compared mean values for the GF quantity ranges from yesterday's drinks (study i) and 28-day diaries (study ii), also examining volume influence. Using yesterday's drinking, the Australian results showed GF quantity-range means close to arithmetic midpoints, with volume effects only for the lowest two levels (1-2 and 3-4 drinks; p < .001). US calibration results on the GF using 28-day diaries were similar, with a volume effect only at these low quantity levels (p < .001). Means for the highest quantity thresholds were 23.5 drinks for the 20+ (10 g) drink level (Australia) and 15.5 drinks for the 12+ (14 g) drink level (US). In the US study, summary GF frequency and volume were highly consistent with their diary-based counterparts. One conclusion is that algorithms for computing volume may be refined using validation data. We suggest measurement methods may be improved by taking better account of empirical drink ethanol content. PMID:21197381
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.
Estimating the volume of glaciers in the Himalayan-Karakoram region using different methods
NASA Astrophysics Data System (ADS)
Frey, H.; Machguth, H.; Huss, M.; Huggel, C.; Bajracharya, S.; Bolch, T.; Kulkarni, A.; Linsbauer, A.; Salzmann, N.; Stoffel, M.
2014-12-01
Ice volume estimates are crucial for assessing the water reserves stored in glaciers. Due to its large glacier coverage, such estimates are of particular interest for the Himalayan-Karakoram (HK) region. In this study, different existing methodologies are used to estimate the ice reserves: three area-volume relations, one slope-dependent volume estimation method, and two ice-thickness distribution models are applied to a recent, detailed, and complete glacier inventory of the HK region, spanning the period 2000-2010 and revealing an ice coverage of 40,775 km². An uncertainty and sensitivity assessment is performed to investigate the influence of the observed glacier area and important model parameters on the resulting total ice volume. Results of the two ice-thickness distribution models are validated against local ice-thickness measurements at six glaciers. The resulting ice volumes for the entire HK region range from 2955 to 4737 km³, depending on the approach. This range is lower than most previous estimates. Results from the ice-thickness distribution models and the slope-dependent thickness estimations agree well with measured local ice thicknesses. However, total volume estimates from area-related relations are larger than those from the other approaches. The study provides evidence of the significant effect of the selected method on the results and underlines the importance of a careful and critical evaluation.
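A sketch of the area-volume scaling approach, V = c*A^gamma, using c = 0.034 and gamma = 1.375, values commonly cited in the scaling literature rather than parameters calibrated for the HK region. Applying the relation to the whole regional area at once, as below, badly overestimates volume, which is one reason inventories are evaluated glacier by glacier and then summed.

```python
# Sketch: glacier volume from area via power-law scaling V = c * A**gamma.
# c and gamma are commonly cited literature values, not HK-calibrated ones.
def glacier_volume_km3(area_km2, c=0.034, gamma=1.375):
    return c * area_km2**gamma

# Naive single-glacier application to the whole inventory area:
print(round(glacier_volume_km3(40775.0), 1), "km^3 (gross overestimate)")
# Summing per-glacier volumes over the inventory gives a far smaller total.
```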
Uranga, Jon; Arrizabalaga, Haritz; Boyra, Guillermo; Hernandez, Maria Carmen; Goñi, Nicolas; Arregui, Igor; Fernandes, Jose A; Yurramendi, Yosu; Santiago, Josu
2017-01-01
This study presents a methodology for the automated analysis of commercial medium-range sonar signals for detecting presence/absence of bluefin tuna (Thunnus thynnus) in the Bay of Biscay. The approach uses image processing techniques to analyze sonar screenshots. For each sonar image we extracted measurable regions and analyzed their characteristics. Scientific data were used to classify each region into a class ("tuna" or "no-tuna") and build a dataset to train and evaluate classification models by using supervised learning. The methodology performed well when validated with commercial sonar screenshots, and has the potential to automatically analyze high volumes of data at a low cost. This represents a first milestone towards the development of acoustic, fishery-independent indices of abundance for bluefin tuna in the Bay of Biscay. Future research lines and additional alternatives to inform stock assessments are also discussed.
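A minimal sketch of the supervised-learning step: training a presence/absence classifier on per-region features. The feature set and the random-forest choice are assumptions, and the data are synthetic.

```python
# Sketch: presence/absence classification of sonar-image regions from
# per-region features. Feature names and model choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
# columns: region area, mean echo intensity, elongation, centroid depth
X_tuna = rng.normal([300, 0.8, 2.0, 15], [60, 0.1, 0.4, 4], size=(200, 4))
X_none = rng.normal([120, 0.4, 1.2, 30], [50, 0.1, 0.3, 8], size=(200, 4))
X = np.vstack([X_tuna, X_none])
y = np.array([1] * 200 + [0] * 200)   # 1 = "tuna", 0 = "no-tuna"

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print(classification_report(yte, clf.predict(Xte)))
```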
The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin; Zhang, Yizhuo
This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) with cascade control based on a Smith predictor. A case study of a particleboard glue supplying and dosing system defines the problem and the solution using the proposed methodology. Status differences in the glue dosing process of particleboard production usually make the dosed glue volume inaccurate. To solve this problem, we applied TRIZ technical contradictions and inventive principles to improve this key process of particleboard production. The improvement method mapped the inaccuracy problem to a TRIZ technical contradiction, and the "prior action" principle led to a Smith predictor as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve users' ability to address difficult or recurring problems, and also testifies to the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.
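A hedged sketch of the proposed control structure: a discrete PI loop wrapped in a Smith predictor for a first-order glue-flow process with transport delay. All constants are illustrative, not plant data.

```python
# Sketch: discrete PI control with a Smith predictor for a first-order
# process with transport delay. Constants are illustrative.
from collections import deque

a, b, d = 0.9, 0.1, 10        # plant: y[k+1] = a*y[k] + b*u[k-d]
kp, ki = 2.0, 0.15            # PI gains
r, N = 1.0, 300               # glue-flow setpoint and number of steps

y = ym = integ = 0.0          # plant output, undelayed model, integrator
u_delay = deque([0.0] * d)    # inputs travelling toward the plant
ym_delay = deque([0.0] * d)   # model outputs delayed by the same d

for k in range(N):
    fb = y + ym - ym_delay[0]          # predictor: measured + model - delayed model
    e = r - fb
    integ += ki * e
    u = kp * e + integ

    u_delay.append(u)
    y = a * y + b * u_delay.popleft()  # plant sees u delayed by d steps

    ym_delay.append(ym)
    ym_delay.popleft()
    ym = a * ym + b * u                # internal model sees u immediately

print("final glue flow:", round(y, 3))  # settles at the setpoint
```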
Uranga, Jon; Arrizabalaga, Haritz; Boyra, Guillermo; Hernandez, Maria Carmen; Goñi, Nicolas; Arregui, Igor; Fernandes, Jose A.; Yurramendi, Yosu; Santiago, Josu
2017-01-01
This study presents a methodology for the automated analysis of commercial medium-range sonar signals for detecting presence/absence of bluefin tuna (Tunnus thynnus) in the Bay of Biscay. The approach uses image processing techniques to analyze sonar screenshots. For each sonar image we extracted measurable regions and analyzed their characteristics. Scientific data was used to classify each region into a class (“tuna” or “no-tuna”) and build a dataset to train and evaluate classification models by using supervised learning. The methodology performed well when validated with commercial sonar screenshots, and has the potential to automatically analyze high volumes of data at a low cost. This represents a first milestone towards the development of acoustic, fishery-independent indices of abundance for bluefin tuna in the Bay of Biscay. Future research lines and additional alternatives to inform stock assessments are also discussed. PMID:28152032
Transportation Energy Conservation Data Book: A Selected Bibliography. Edition 3,
1978-11-01
[Garbled bibliographic index residue; recoverable entry titles include "Computer-Based Resource Accounting Model for Automobile Technology Impact Evaluation", "Methodology for the Design of Urban Transportation", and "Energy Flows in the U.S., 1973 and 1974, Volume 1: Methodology".]
Li, Xue; Ahmad, Imad A Haidar; Tam, James; Wang, Yan; Dao, Gina; Blasko, Andrei
2018-02-05
A Total Organic Carbon (TOC) based analytical method to quantitate trace residues of the clean-in-place (CIP) detergents CIP100® and CIP200® on the surfaces of pharmaceutical manufacturing equipment was developed and validated. Five factors affecting the development and validation of the method were identified: diluent composition, diluent volume, extraction method, location for TOC sample preparation, and oxidant flow rate. Key experimental parameters were optimized to minimize contamination and to improve the sensitivity, recovery, and reliability of the method. The optimized concentration of phosphoric acid in the swabbing solution was 0.05 M, and the optimal volume of the sample solution was 30 mL. The swab extraction method was 1 min of sonication. The use of a clean room, as compared to an isolated lab environment, was not required for method validation. The method was demonstrated to be linear, with a correlation coefficient (R) of 0.9999. The average recoveries from stainless steel surfaces at multiple spike levels were >90%. The repeatability and intermediate precision results were ≤5% across the 2.2-6.6 ppm range (50-150% of the target maximum carryover, MACO, limit). The method was also shown to be sensitive, with a detection limit (DL) of 38 ppb and a quantitation limit (QL) of 114 ppb. The method validation demonstrated that the developed method is suitable for its intended use. The methodology developed in this study is generally applicable to the cleaning verification of any organic detergents used for the cleaning of pharmaceutical manufacturing equipment made of electropolished stainless steel.
C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1
NASA Astrophysics Data System (ADS)
Wilson, J. L.; Jolly, M. B.
1984-01-01
A methodology (i.e., consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine tune the form of a final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to control both the excruciating detail and user requirement for transparency.
Philip, Bobby; Berrill, Mark A.; Allu, Srikanth; ...
2015-01-26
We describe an efficient and nonlinearly consistent parallel solution methodology for solving coupled nonlinear thermal transport problems that occur in nuclear reactor applications over hundreds of individual 3D physical subdomains. Efficiency is obtained by leveraging knowledge of the physical domains, the physics on individual domains, and the couplings between them for preconditioning within a Jacobian Free Newton Krylov method. Details of the computational infrastructure that enabled this work, namely the open source Advanced Multi-Physics (AMP) package developed by the authors, are described. The details of verification and validation experiments, and parallel performance analysis in weak and strong scaling studies demonstrating the achieved efficiency of the algorithm, are presented. Moreover, numerical experiments demonstrate that the preconditioner developed is independent of the number of fuel subdomains in a fuel rod, which is particularly important when simulating different types of fuel rods. Finally, we demonstrate the power of the coupling methodology by considering problems with couplings between surface and volume physics and coupling of nonlinear thermal transport in fuel rods to an external radiation transport code.
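As a minimal stand-in for the solver strategy described above, a Jacobian-free Newton-Krylov solve of a small nonlinear heat problem using SciPy's newton_krylov; the paper's actual contribution, physics-based preconditioning across hundreds of subdomains, is not reproduced here.

```python
# Sketch: Jacobian-free Newton-Krylov solve of steady nonlinear heat
# conduction with temperature-dependent conductivity, k(T) = 1 + 0.05*T,
# and Dirichlet T = 0 at both ends. Numbers are illustrative.
import numpy as np
from scipy.optimize import newton_krylov

n = 50                     # interior nodes
h = 1.0 / (n + 1)          # grid spacing
source = 200.0 * np.ones(n)

def residual(T):
    Tpad = np.concatenate([[0.0], T, [0.0]])
    k = 1.0 + 0.05 * Tpad
    kface = 0.5 * (k[:-1] + k[1:])        # face-centred conductivity
    flux = kface * np.diff(Tpad) / h      # Fourier flux at cell faces
    return np.diff(flux) / h + source     # d/dx(k dT/dx) + q = 0

T = newton_krylov(residual, np.zeros(n), f_tol=1e-8)
print("max temperature:", round(float(T.max()), 3))
```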
A validated methodology for the 3D reconstruction of cochlea geometries using human microCT images
NASA Astrophysics Data System (ADS)
Sakellarios, A. I.; Tachos, N. S.; Rigas, G.; Bibas, T.; Ni, G.; Böhnke, F.; Fotiadis, D. I.
2017-05-01
Accurate reconstruction of the inner ear is a prerequisite for modelling and understanding inner ear mechanics. In this study, we present a semi-automated methodology for accurate reconstruction of the major inner ear structures (scalae, basilar membrane, stapes and semicircular canals). For this purpose, high-resolution microCT images of a human specimen were used. The segmentation methodology is based on an iterative level set algorithm which provides the borders of the structures of interest. An enhanced coupled level set method, which allows simultaneous labeling of multiple structures without overlapping regions, was developed for this purpose. The marching cubes algorithm was then applied to extract the surface from the segmented volume. The reconstructed geometries are post-processed so that the basilar membrane geometry realistically represents physiologic dimensions. The final reconstructed model is compared to the available data from the literature. The results show that our generated inner ear structures are in good agreement with published ones, while our approach is the most realistic in terms of basilar membrane thickness and width reconstruction.
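A minimal sketch of the surface-extraction step, applying marching cubes to a segmented volume with scikit-image; a synthetic sphere stands in for a segmented inner-ear structure.

```python
# Sketch: surface extraction from a segmented (binary) volume with the
# marching cubes algorithm. A synthetic sphere stands in for a scala.
import numpy as np
from skimage import measure

z, y, x = np.mgrid[:64, :64, :64]
volume = ((x - 32)**2 + (y - 32)**2 + (z - 32)**2 < 20**2).astype(np.float32)

verts, faces, normals, values = measure.marching_cubes(
    volume, level=0.5, spacing=(1.0, 1.0, 1.0))
print(verts.shape[0], "vertices,", faces.shape[0], "triangles")
```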
Development of Probabilistic Rigid Pavement Design Methodologies for Military Airfields.
1983-12-01
Project 4A161102AT22, Task AO, Work Unit 009, "Methodology for Considering Material Variability in Pavement Design." The OCE Project Monitor was Mr. S. S. Gillespie. Contents include: Volume I: State of the Art Variability of Airfield Pavement Materials; Volume II: Mathematical Formulation; Volume IV: Probabilistic Analysis of Rigid Airfield Design by Elastic Layered Theory.
Borri, Marco; Schmidt, Maria A.; Powell, Ceri; Koh, Dow-Mu; Riddell, Angela M.; Partridge, Mike; Bhide, Shreerang A.; Nutting, Christopher M.; Harrington, Kevin J.; Newbold, Katie L.; Leach, Martin O.
2015-01-01
Purpose To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes. To evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. Material and Methods The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. Results The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4), determined with cluster validation, produced the best separation between reducing and non-reducing clusters. Conclusion The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes. PMID:26398888
Definition and Demonstration of a Methodology for Validating Aircraft Trajectory Predictors
NASA Technical Reports Server (NTRS)
Vivona, Robert A.; Paglione, Mike M.; Cate, Karen T.; Enea, Gabriele
2010-01-01
This paper presents a new methodology for validating an aircraft trajectory predictor, inspired by the lessons learned from a number of field trials, flight tests and simulation experiments for the development of trajectory-predictor-based automation. The methodology introduces new techniques and a new multi-staged approach to reduce the effort in identifying and resolving validation failures, avoiding the potentially large costs associated with failures during a single-stage, pass/fail approach. As a case study, the validation effort performed by the Federal Aviation Administration for its En Route Automation Modernization (ERAM) system is analyzed to illustrate the real-world applicability of this methodology. During this validation effort, ERAM initially failed to achieve six of its eight requirements associated with trajectory prediction and conflict probe. The ERAM validation issues have since been addressed, but to illustrate how the methodology could have benefited the FAA effort, additional techniques are presented that could have been used to resolve some of these issues. Using data from the ERAM validation effort, it is demonstrated that these new techniques could have identified trajectory prediction error sources that contributed to several of the unmet ERAM requirements.
Chandra, Santanu; Gnanaruban, Vimalatharmaiyah; Riveros, Fabian; Rodriguez, Jose F.; Finol, Ender A.
2016-01-01
In this work, we present a novel method for the derivation of the unloaded geometry of an abdominal aortic aneurysm (AAA) from a pressurized geometry in turn obtained by 3D reconstruction of computed tomography (CT) images. The approach was experimentally validated with an aneurysm phantom loaded with gauge pressures of 80, 120, and 140 mm Hg. The unloaded phantom geometries estimated from these pressurized states were compared to the actual unloaded phantom geometry, resulting in mean nodal surface distances of up to 3.9% of the maximum aneurysm diameter. An in-silico verification was also performed using a patient-specific AAA mesh, resulting in maximum nodal surface distances of 8 μm after running the algorithm for eight iterations. The methodology was then applied to 12 patient-specific AAA for which their corresponding unloaded geometries were generated in 5–8 iterations. The wall mechanics resulting from finite element analysis of the pressurized (CT image-based) and unloaded geometries were compared to quantify the relative importance of using an unloaded geometry for AAA biomechanics. The pressurized AAA models underestimate peak wall stress (quantified by the first principal stress component) on average by 15% compared to the unloaded AAA models. The validation and application of the method, readily compatible with any finite element solver, underscores the importance of generating the unloaded AAA volume mesh prior to using wall stress as a biomechanical marker for rupture risk assessment. PMID:27538124
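A hedged sketch of a backward-displacement style iteration for the unloaded-geometry problem: inflate the current unloaded guess to the imaging pressure and pull the nodes back by the mismatch. Here solve_forward is a hypothetical stand-in for the FE solver, and the paper's algorithm may differ in detail.

```python
# Sketch: fixed-point recovery of an unloaded geometry from an imaged,
# pressurized one. solve_forward() is a hypothetical FE-solver stand-in.
import numpy as np

def recover_unloaded(x_image, solve_forward, pressure, iters=8, tol=1e-6):
    """x_image: (n, 3) node coords from CT; returns estimated unloaded coords."""
    x0 = x_image.copy()                        # initial guess: imaged shape
    for _ in range(iters):
        x_loaded = solve_forward(x0, pressure) # FE inflation of the guess
        gap = x_loaded - x_image               # nodal mismatch to the image
        x0 = x0 - gap                          # pull the guess back
        if np.linalg.norm(gap, axis=1).max() < tol:
            break
    return x0
```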
DOT National Transportation Integrated Search
1981-06-01
This study provides information about public attitudes towards proposed highway safety countermeasures in three program areas: alcohol and drugs, unsafe driving behaviors, and pedestrian safety. This volume describes the three research methodologies ...
Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.
DOT National Transportation Integrated Search
1979-09-01
This last volume, includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...
Estilo, Emil Emmanuel C; Gabriel, Alonzo A
2018-02-01
This study was conducted to determine the effects of the intrinsic juice characteristics insoluble solids (IS, 0-3% w/v) and soluble solids (SS, 0-70 °Brix), and the extrinsic process parameter treated volume (250-1000 mL), on the UV-C inactivation rates of heat-stressed Salmonella enterica in simulated fruit juices (SFJs). A rotatable central composite design (CCRD) was used to determine combinations of the test variables, while response surface methodology (RSM) was used to characterize and quantify the influences of the test variables on microbial inactivation. The heat-stressed cells exhibited log-linear UV-C inactivation behavior (R² 0.952 to 0.999) in all CCRD combinations, with D-values for UV-C (DUV-C) ranging from 10.0 to 80.2 mJ/cm². The DUV-C values obtained from the CCRD fitted significantly into a quadratic model (P < 0.0001). RSM results showed that the individual linear terms (IS, SS, volume), the individual quadratic terms (IS² and volume²), and the factor interactions (IS × volume and SS × volume) significantly influenced UV-C inactivation. Validation of the model in SFJs with combinations not included in the CCRD showed that the predictions were within acceptable error margins.
An Analysis of Measured Pressure Signatures From Two Theory-Validation Low-Boom Models
NASA Technical Reports Server (NTRS)
Mack, Robert J.
2003-01-01
Two wing/fuselage/nacelle/fin concepts were designed to check the validity and the applicability of sonic-boom minimization theory, sonic-boom analysis methods, and the low-boom design methodology in use at the end of the 1980s. Models of these concepts were built, and the pressure signatures they generated were measured in the wind tunnel. The results of these measurements led to three conclusions: (1) the existing methods could adequately predict the sonic-boom characteristics of wing/fuselage/fin(s) configurations if the equivalent area distributions of each component were smooth and continuous; (2) these methods needed revision so that the engine-nacelle volume and the nacelle-wing interference lift disturbances could be accurately predicted; and (3) current nacelle-configuration integration methods had to be updated. With these changes in place, the existing sonic-boom analysis and minimization methods could be effectively applied to supersonic-cruise concepts for acceptable/tolerable sonic-boom overpressures during cruise.
Machine intelligence and autonomy for aerospace systems
NASA Technical Reports Server (NTRS)
Heer, Ewald (Editor); Lum, Henry (Editor)
1988-01-01
The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.
Learning Methodology in the Classroom to Encourage Participation
ERIC Educational Resources Information Center
Luna, Esther; Folgueiras, Pilar
2014-01-01
Service learning is a methodology that promotes the participation of citizens in their community. This article presents a brief conceptualization of citizen participation, characteristics of service learning methodology, and validation of a programme that promotes service-learning projects. This validation highlights the suitability of this…
Shephard, Roy J
2017-03-01
The Douglas bag technique is reviewed as one in a series of articles looking at historical insights into the measurement of whole body metabolic rate. Consideration of all articles looking at the Douglas bag technique and chemical gas analysis has here focused on the growing appreciation of errors in measuring expired volumes and gas composition, and of subjective reactions to airflow resistance and dead space. Multiple small sources of error have been identified and appropriate remedies proposed over a century of use of the methodology. Changes in the bag lining have limited gas diffusion, laboratories conducting gas analyses have undergone validation, and WHO guidelines on airflow resistance have minimized reactive effects. One remaining difficulty is contamination of the expirate by dead space air, minimized by keeping the dead space <70 mL. Care must also be taken to ensure a steady state, and a formal validation of the Douglas bag method still needs to be carried out. We may conclude that the Douglas bag method has helped to define key concepts in exercise physiology. Although now superseded in many applications, the errors in a meticulously completed measurement are sufficiently low to warrant retention of the Douglas bag as the gold standard when evaluating newer open-circuit methodology.
A methodology to derive Synthetic Design Hydrographs for river flood management
NASA Astrophysics Data System (ADS)
Tomirotti, Massimo; Mignosa, Paolo
2017-12-01
The design of flood protection measures requires in many cases not only the estimation of peak discharges, but also of flood volumes and their time distribution. A typical solution to this kind of problem is the formulation of Synthetic Design Hydrographs (SDHs). In this paper a methodology to derive SDHs is proposed on the basis of the estimation of the Flow Duration Frequency (FDF) reduction curve and of a Peak-Duration (PD) relationship, which furnish, respectively, the quantiles of the maximum average discharge and the average peak position for each duration. The methodology is intended to synthesize the main features of the historical floods into a single SDH for each return period. The shape of the SDH is not selected a priori but results from the behaviour of the FDF and PD curves, which conveniently accounts for the variability of the shapes of the observed hydrographs at the local time scale. The validation of the methodology is performed with reference to flood routing problems in reservoirs, lakes and rivers. The results obtained demonstrate the capability of the SDHs to describe the effects of different hydraulic systems on the statistical regime of floods, even in the presence of strong modifications induced on the probability distribution of peak flows.
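One way such a construction can be realized, assuming discretized FDF quantiles Q(D) and peak positions r(D) for the chosen return period (the authors' exact algorithm may differ): each duration shell carries the ordinate d(Q·D)/dD, with a fraction r(D) of its width placed before the peak, so that the maximum mean discharge over every duration D reproduces Q(D).

```python
import numpy as np

def synthetic_design_hydrograph(durations, q_fdf, r_pd):
    """Step-function SDH from FDF quantiles and a peak-duration relation.

    durations : increasing durations D (h)
    q_fdf     : FDF quantiles Q(D) (m³/s), max mean discharge over D
    r_pd      : fraction r(D) of each duration lying before the peak
    """
    D = np.asarray(durations, float)
    W = np.asarray(q_fdf, float) * D                      # volume within duration
    dD = np.diff(np.concatenate(([0.0], D)))
    q_shell = np.diff(np.concatenate(([0.0], W))) / dD    # shell ordinates d(QD)/dD
    r = np.asarray(r_pd, float)
    edges = np.concatenate((-np.cumsum(dD * r)[::-1], [0.0],
                            np.cumsum(dD * (1.0 - r))))   # shell boundaries (h)
    steps = np.concatenate((q_shell[::-1], q_shell))      # nested shells
    return edges, steps   # plot with matplotlib: plt.stairs(steps, edges)

# illustrative quantiles: Q(6 h)=900, Q(12 h)=700, Q(24 h)=450 m³/s, r=0.3
edges, steps = synthetic_design_hydrograph([6, 12, 24], [900, 700, 450],
                                           [0.3, 0.3, 0.3])
```

By construction, the mean discharge over the central 6, 12, and 24 h windows of this hydrograph equals the corresponding FDF quantile, which is the defining property of the SDH.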
Calculation of the radiative properties of photosynthetic microorganisms
NASA Astrophysics Data System (ADS)
Dauchet, Jérémi; Blanco, Stéphane; Cornet, Jean-François; Fournier, Richard
2015-08-01
A generic methodological chain for the predictive calculation of the light-scattering and absorption properties of photosynthetic microorganisms within the visible spectrum is presented here. This methodology has been developed in order to provide the radiative properties needed for the analysis of radiative transfer within photobioreactor processes, with a view to enabling their optimization for large-scale sustainable production of chemicals for energy and chemistry. It gathers an electromagnetic model of light-particle interaction along with detailed and validated protocols for the determination of input parameters: the morphological and structural characteristics of the studied microorganisms as well as their photosynthetic-pigment content. The microorganisms are described as homogeneous equivalent particles whose shape and size distribution are characterized by image analysis. The imaginary part of their refractive index is obtained from a new and quite extensive database of the in vivo absorption spectra of photosynthetic pigments (which is made available to the reader). The real part of the refractive index is then calculated by using the singly subtractive Kramers-Kronig approximation, for which the anchor point is determined with the Bruggeman mixing rule, based on the volume fraction of the microorganism internal structures and their refractive indices (extracted from a database). Afterwards, the radiative properties are estimated using the Schiff approximation for spheroidal or cylindrical particles, as a first step toward the description of the complexity and diversity of the shapes encountered within the microbial world. Finally, these predictive results are compared with experimental normal-hemispherical transmittance spectra for validation. This entire procedure is implemented for Rhodospirillum rubrum, Arthrospira platensis and Chlamydomonas reinhardtii, each representative of one of the three main kinds of photosynthetic microorganisms, i.e. photosynthetic bacteria, cyanobacteria and eukaryotic microalgae, respectively. The obtained results are in very good agreement with the experimental measurements when the shape of the microorganisms is well described (in comparison to the standard volume-equivalent sphere approximation). As a main perspective, consideration of the helical shape of Arthrospira platensis appears to be key to an accurate estimation of its radiative properties. On the whole, the presented methodological chain should also be of great interest to other scientific communities such as atmospheric science, oceanography, astrophysics and engineering.
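The anchor-point step can be illustrated with a closed-form two-phase Bruggeman calculation; the quadratic below solves the Bruggeman condition for two phases. The refractive indices and volume fraction shown are illustrative assumptions, not the paper's database values.

```python
import numpy as np

def bruggeman_two_phase(n1, n2, f1):
    """Effective refractive index of a two-phase mixture (Bruggeman rule).

    n1, n2 : complex refractive indices of the two phases
    f1     : volume fraction of phase 1 (phase 2 fills the rest)
    """
    e1, e2 = n1**2, n2**2                      # permittivities
    f2 = 1.0 - f1
    # Bruggeman condition reduces to 2e² - b·e - e1·e2 = 0 for two phases
    b = f1 * (2*e1 - e2) + f2 * (2*e2 - e1)
    disc = np.sqrt(b**2 + 8 * e1 * e2)
    roots = ((b + disc) / 4, (b - disc) / 4)
    # physical branch: non-negative absorption, positive real permittivity
    e_eff = next(e for e in roots if e.imag >= 0 and e.real > 0)
    return np.sqrt(e_eff)

# illustrative anchor point: 20% internal structures (n ~ 1.44) in cytoplasm
print(bruggeman_two_phase(1.44 + 0.0j, 1.36 + 0.0j, 0.20))
```

For these purely real example values the effective index falls between the two phase indices, weighted toward the majority phase, as expected of an effective-medium rule.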
Keyes, S D; Gillard, F; Soper, N; Mavrogordato, M N; Sinclair, I; Roose, T
2016-06-14
The mechanical impedance of soils inhibits the growth of plant roots, often being the most significant physical limitation to root system development. Non-invasive imaging techniques have recently been used to investigate the development of root system architecture over time, but the relationship with soil deformation is usually neglected. Correlative mapping approaches parameterised using 2D and 3D image data have recently gained prominence for quantifying physical deformation in composite materials including fibre-reinforced polymers and trabecular bone. Digital Image Correlation (DIC) and Digital Volume Correlation (DVC) are computational techniques which use the inherent material texture of surfaces and volumes, captured using imaging techniques, to map full-field deformation components in samples during physical loading. Here we develop an experimental assay and methodology for four-dimensional, in vivo X-ray Computed Tomography (XCT) and apply a DVC approach to the data to quantify deformation. The method is validated for a field-derived soil under conditions of uniaxial compression, and a calibration study is used to quantify thresholds of displacement and strain measurement. The validated and calibrated approach is then demonstrated for an in vivo test case in which an extending maize root in field-derived soil was imaged hourly using XCT over a growth period of 19 h. This allowed full-field soil deformation data and 3D root tip dynamics to be quantified in parallel for the first time. This fusion of methods paves the way for comparative studies of contrasting soils and plant genotypes, improving our understanding of the fundamental mechanical processes which influence root system development. Copyright © 2016 Elsevier Ltd. All rights reserved.
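A toy illustration of the DVC step (not the authors' implementation): exhaustively search integer-voxel shifts that maximize the zero-normalized cross-correlation of one subvolume. Production DVC adds subvoxel interpolation and derives strain fields from the displacement gradients.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation of two equally sized subvolumes."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def dvc_subvolume(ref, defo, corner, size=16, search=3):
    """Integer-voxel DVC for one subvolume: exhaustive ZNCC search.

    ref, defo : 3D greyscale volumes (reference and deformed XCT scans)
    corner    : (i, j, k) corner of the subvolume in the reference volume
    """
    i, j, k = corner
    template = ref[i:i+size, j:j+size, k:k+size]
    best = (-2.0, (0, 0, 0))
    for di in range(-search, search + 1):
        for dj in range(-search, search + 1):
            for dk in range(-search, search + 1):
                a, b, c = i + di, j + dj, k + dk
                if min(a, b, c) < 0:
                    continue                       # window outside the volume
                cand = defo[a:a+size, b:b+size, c:c+size]
                if cand.shape != template.shape:
                    continue                       # window outside the volume
                best = max(best, (zncc(template, cand), (di, dj, dk)))
    return best                                    # (peak ZNCC, voxel shift)

rng = np.random.default_rng(0)
ref = rng.random((64, 64, 64))
defo = np.roll(ref, shift=(2, -1, 0), axis=(0, 1, 2))   # known displacement
print(dvc_subvolume(ref, defo, corner=(16, 16, 16)))    # recovers (2, -1, 0)
```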
Pirat, Bahar; Little, Stephen H; Igo, Stephen R; McCulloch, Marti; Nosé, Yukihiko; Hartley, Craig J; Zoghbi, William A
2009-03-01
The proximal isovelocity surface area (PISA) method is useful in the quantitation of aortic regurgitation (AR). We hypothesized that actual measurement of the PISA provided by real-time 3-dimensional (3D) color Doppler yields more accurate regurgitant volumes than those estimated by 2-dimensional (2D) color Doppler PISA. We developed a pulsatile flow model for AR with an imaging chamber in which interchangeable regurgitant orifices with defined shapes and areas were incorporated. An ultrasonic flow meter was used to calculate the reference regurgitant volumes. A total of 29 different flow conditions for 5 orifices with different shapes were tested at a rate of 72 beats/min. The 2D PISA was calculated as 2πr², and the 3D PISA was measured from 8 equidistant radial planes of the 3D PISA. Regurgitant volume was derived as PISA × aliasing velocity × (time velocity integral of AR / peak AR velocity). Regurgitant volumes by flow meter ranged between 12.6 and 30.6 mL/beat (mean 21.4 ± 5.5 mL/beat). Regurgitant volumes estimated by 2D PISA correlated well with volumes measured by flow meter (r = 0.69); however, a significant underestimation was observed (y = 0.5x + 0.6). Correlation with flow meter volumes was stronger for 3D PISA-derived regurgitant volumes (r = 0.83); significantly less underestimation of regurgitant volumes was seen, with a regression line close to identity (y = 0.9x + 3.9). Direct measurement of the PISA is feasible, without geometric assumptions, using real-time 3D color Doppler. Calculation of aortic regurgitant volumes with 3D color Doppler using this methodology is more accurate than the conventional 2D method with its hemispheric PISA assumption.
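The flow-convergence arithmetic translates directly into code; the input values below are hypothetical but chosen to land in the study's measured range (12.6 to 30.6 mL/beat).

```python
import math

def regurgitant_volume(pisa_cm2, v_aliasing, vti_ar, v_peak):
    """RVol (mL/beat) = PISA × aliasing velocity × VTI of AR / peak AR velocity.

    pisa_cm2 in cm², velocities in cm/s, VTI in cm.
    """
    return pisa_cm2 * v_aliasing * vti_ar / v_peak

r_cm = 0.5                              # measured PISA radius (cm), hypothetical
pisa_2d = 2 * math.pi * r_cm**2         # hemispheric 2D assumption: 2πr²
print(regurgitant_volume(pisa_2d, 38.0, 150.0, 400.0))   # ≈ 22.4 mL/beat
```

With a measured 3D PISA substituted for `pisa_2d`, the same formula gives the 3D estimate, which is the comparison the study makes.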
SeaWiFS Postlaunch Calibration and Validation Analyses
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); McClain, Charles R.; Ainsworth, Ewa J.; Barnes, Robert A.; Eplee, Robert E., Jr.; Patt, Frederick S.; Robinson, Wayne D.; Wang, Menghua; Bailey, Sean W.
2000-01-01
The effort to resolve data quality issues and improve on the initial data evaluation methodologies of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Project was an extensive one. These evaluations have resulted, to date, in three major reprocessings of the entire data set, where each reprocessing addressed the data quality issues that had been identified by that time. The number of chapters (21) needed to document this extensive work in the SeaWiFS Postlaunch Technical Report Series requires three volumes. The chapters in Volumes 9, 10, and 11 are in a logical order sequencing through sensor calibration, atmospheric correction, masks and flags, product evaluations, and bio-optical algorithms. The first chapter of Volume 9 is an overview of the calibration and validation program, including a table of activities from the inception of the SeaWiFS Project. Chapter 2 describes the fine adjustments of sensor detector knee radiances, i.e., radiance levels where three of the four detectors in each SeaWiFS band saturate. Chapters 3 and 4 describe the analyses of the lunar and solar calibration time series, respectively, which are used to track the temporal changes in radiometric sensitivity in each band. Chapter 5 outlines the procedure used to adjust band 7 relative to band 8 to derive reasonable aerosol radiances in band 7 as compared to those in band 8 in the vicinity of Lanai, Hawaii, the vicarious calibration site. Chapter 6 presents the procedure used to estimate the vicarious calibration gain adjustment factors for bands 1-6 using the water-leaving radiances from the Marine Optical Buoy (MOBY) offshore of Lanai. Chapter 7 provides the adjustments to the coccolithophore flag algorithm which were required for improved performance over the prelaunch version. Chapter 8 is an overview of the numerous modifications to the atmospheric correction algorithm that have been implemented. Chapter 9 describes the methodology used to remove artifacts of sun glint contamination for portions of the imagery outside the sun glint mask. Finally, Chapter 10 explains a modification to the ozone interpolation method to account for actual time differences between the SeaWiFS and Total Ozone Mapping Spectrometer (TOMS) orbits.
Harris, Joshua D; Erickson, Brandon J; Cvetanovich, Gregory L; Abrams, Geoffrey D; McCormick, Frank M; Gupta, Anil K; Verma, Nikhil N; Bach, Bernard R; Cole, Brian J
2014-02-01
Condition-specific questionnaires are important components in evaluation of outcomes of surgical interventions. No condition-specific study methodological quality questionnaire exists for evaluation of outcomes of articular cartilage surgery in the knee. To develop a reliable and valid knee articular cartilage-specific study methodological quality questionnaire. Cross-sectional study. A stepwise, a priori-designed framework was created for development of a novel questionnaire. Relevant items to the topic were identified and extracted from a recent systematic review of 194 investigations of knee articular cartilage surgery. In addition, relevant items from existing generic study methodological quality questionnaires were identified. Items for a preliminary questionnaire were generated. Redundant and irrelevant items were eliminated, and acceptable items modified. The instrument was pretested and items weighed. The instrument, the MARK score (Methodological quality of ARticular cartilage studies of the Knee), was tested for validity (criterion validity) and reliability (inter- and intraobserver). A 19-item, 3-domain MARK score was developed. The 100-point scale score demonstrated face validity (focus group of 8 orthopaedic surgeons) and criterion validity (strong correlation to Cochrane Quality Assessment score and Modified Coleman Methodology Score). Interobserver reliability for the overall score was good (intraclass correlation coefficient [ICC], 0.842), and for all individual items of the MARK score, acceptable to perfect (ICC, 0.70-1.000). Intraobserver reliability ICC assessed over a 3-week interval was strong for 2 reviewers (≥0.90). The MARK score is a valid and reliable knee articular cartilage condition-specific study methodological quality instrument. This condition-specific questionnaire may be used to evaluate the quality of studies reporting outcomes of articular cartilage surgery in the knee.
Construction concepts and validation of the 3D printed UST_2 modular stellarator
NASA Astrophysics Data System (ADS)
Queral, V.
2015-03-01
High accuracy requirements, geometric complexity and thus high cost tend to hinder the advance of stellarator research. Nowadays, new manufacturing methods might be developed for the production of small and middle-size stellarators. Such methods should demonstrate advantages with respect to common fabrication methods, like casting, cutting, forging and welding, for the construction of advanced, highly convoluted modular stellarators. UST_2 is a small modular three-period quasi-isodynamic stellarator, with a major radius of 0.26 m and a plasma volume of 10 litres, currently being built to validate additive manufacturing (3D printing) for stellarator construction. The modular coils are wound in grooves defined on six 3D printed half-period frames designed as light truss structures filled by a strong filler. A geometrically simple assembling configuration has been devised for UST_2 so as to lower the cost of the device while keeping the positioning accuracy of the different elements. The paper summarizes the construction and assembling concepts developed, the devised positioning methodology, the design of the coil frames and positioning elements, and an initial validation of the assembling of the components.
Mavel, Sylvie; Lefèvre, Antoine; Bakhos, David; Dufour-Rainfray, Diane; Blasco, Hélène; Emond, Patrick
2018-05-22
Although there are some data from animal studies, the metabolome of inner ear fluid in humans remains unknown. Characterization of the metabolome of the perilymph would allow for better understanding of its role in auditory function and for identification of biomarkers that might allow prediction of the response to therapeutics. There is a major technical challenge due to the small sample of perilymph fluid available for analysis (sub-microliter). The objectives of this study were to develop and validate a methodology for analysis of the perilymph metabolome using liquid chromatography-high resolution mass spectrometry (LC-HRMS). Due to the low availability of perilymph fluid, a methodological study was first performed using low volumes (0.8 μL) of cerebrospinal fluid (CSF), and the LC-HRMS parameters were optimized using targeted and non-targeted metabolomics approaches. We obtained excellent reproducibility parameters for about 100 metabolites. This methodology was then used to analyze perilymph fluid using two complementary chromatographic supports: reverse phase (RP-C18) and hydrophilic interaction liquid chromatography (HILIC). Both methods were highly robust and showed their complementarity, reinforcing the interest in combining these chromatographic supports. A fingerprint was obtained from 98 robust metabolites (analytical variability <30%), in which amino acids (e.g., asparagine, valine, glutamine, alanine), carboxylic acids and derivatives (e.g., lactate, carnitine, trigonelline, creatinine) were observed as first-order signals. This work lays the foundations of a robust analytical workflow for the exploration of the perilymph metabolome dedicated to the search for biomarkers for the diagnosis/prognosis of auditory pathologies. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Dung Nguyen, The; Kappas, Martin
2017-04-01
In the last several years, interest in forest biomass and carbon stock estimation has increased due to its importance for forest management, modelling of the carbon cycle, and other ecosystem services. However, no estimates of biomass and carbon stocks exist for the different forest cover types in the Xuan Lien Nature Reserve, Thanh Hoa, Viet Nam. This study investigates the relationship between above-ground carbon stock and different vegetation indices and identifies the vegetation index that best correlates with forest carbon stock. The terrestrial inventory data come from 380 randomly located sample plots. Individual tree parameters such as DBH and tree height were collected to calculate the above-ground volume, biomass and carbon for different forest types. SPOT6 satellite data from 2013 were used to obtain five vegetation indices: NDVI, RDVI, MSR, RVI, and EVI. The relationships between the forest carbon stock and the vegetation indices were investigated using multiple linear regression analysis. R-square and RMSE values and cross-validation were used to measure the strength and validate the performance of the models. The methodology presented here demonstrates the possibility of estimating forest volume, biomass and carbon stock. It can be further improved by incorporating additional spectral bands and/or elevation data.
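A sketch of the regression step with synthetic data (the 380 plots and the SPOT6-derived indices are not reproduced here): multiple linear regression of carbon stock on the five indices, scored by R², RMSE, and 10-fold cross-validation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 380                                      # plots, matching the study design
X = rng.uniform(0.1, 0.9, size=(n, 5))       # NDVI, RDVI, MSR, RVI, EVI (synthetic)
carbon = 120 * X[:, 0] + 40 * X[:, 4] + rng.normal(0, 10, n)  # t/ha, synthetic

model = LinearRegression().fit(X, carbon)
rmse = np.sqrt(np.mean((model.predict(X) - carbon) ** 2))
cv_r2 = cross_val_score(model, X, carbon, cv=10, scoring='r2')
print(f'R2={model.score(X, carbon):.3f}  RMSE={rmse:.1f} t/ha  '
      f'CV R2={cv_r2.mean():.3f}')
```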
Scalability and Validation of Big Data Bioinformatics Software.
Yang, Andrian; Troup, Michael; Ho, Joshua W K
2017-01-01
This review examines two important aspects that are central to modern big data bioinformatics analysis - software scalability and validity. We argue that not only are the issues of scalability and validation common to all big data bioinformatics analyses, they can be tackled by conceptually related methodological approaches, namely divide-and-conquer (scalability) and multiple executions (validation). Scalability is defined as the ability of a program to scale with workload. It has always been an important consideration when developing bioinformatics algorithms and programs. Nonetheless, the surge in the volume and variety of biological and biomedical data has posed new challenges. We discuss how modern cloud computing and big data programming frameworks such as MapReduce and Spark are being used to effectively implement divide-and-conquer in a distributed computing environment. Validation of software is another important issue in big data bioinformatics that is often ignored. Software validation is the process of determining whether the program under test fulfils the task for which it was designed. Determining the correctness of the computational output of big data bioinformatics software is especially difficult due to the large input space and complex algorithms involved. We discuss how state-of-the-art software testing techniques that are based on the idea of multiple executions, such as metamorphic testing, can be used to implement an effective bioinformatics quality assurance strategy. We hope this review will raise awareness of these critical issues in bioinformatics.
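As an illustration of the multiple-executions idea, the snippet below applies one metamorphic relation, permutation invariance of the input, to a stand-in pipeline; `variant_caller` is a hypothetical placeholder rather than a real bioinformatics tool.

```python
import random

def variant_caller(reads):
    """Stand-in for a pipeline under test (hypothetical placeholder)."""
    return sorted({r for r in reads if r.startswith('chr1')})

def metamorphic_permutation_test(pipeline, reads, trials=10):
    """Metamorphic relation: shuffling the input order must not change the
    output. Repeated executions substitute for an unknown test oracle."""
    baseline = pipeline(reads)
    for _ in range(trials):
        shuffled = reads[:]
        random.shuffle(shuffled)
        if pipeline(shuffled) != baseline:
            return False          # relation violated: a real defect signal
    return True

reads = ['chr1:100A>T', 'chr2:5G>C', 'chr1:42C>G'] * 100
assert metamorphic_permutation_test(variant_caller, reads)
```

The point of the relation is that it checks a necessary property of correct behavior without needing the (unknown) expected output itself.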
A methodology for collecting valid software engineering data
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Weiss, David M.
1983-01-01
An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To ensure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. The feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.
Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui
2017-12-01
Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Due to the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions on the CFD simulation results. However, existing CFD validation approaches do not quantify the error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out the other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software STAR-CCM+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the STAR-CCM+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
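A compact sketch of the error-parsing idea, written in the spirit of ASME V&V 20 rather than the authors' exact equations: the comparison error E = S - D is judged against a validation uncertainty that pools the numerical, input, and experimental contributions. All values below are illustrative.

```python
import numpy as np

def model_error(s_cfd, d_piv, u_num, u_input, u_exp):
    """Parse the model error out of a CFD-vs-experiment comparison.

    s_cfd, d_piv : simulated and measured velocities along the validation line
    u_num   : numerical (discretization) uncertainty of the simulation
    u_input : uncertainty propagated from simulation input parameters
    u_exp   : uncertainty of the PIV measurement
    """
    E = s_cfd - d_piv                                 # comparison error
    u_val = np.sqrt(u_num**2 + u_input**2 + u_exp**2)  # pooled uncertainty
    return E, u_val                                   # model error in E ± u_val

E, u_val = model_error(np.array([0.52, 0.61]), np.array([0.50, 0.57]),
                       u_num=0.01, u_input=0.015, u_exp=0.02)
print('relative error (%):', np.abs(E) / np.array([0.50, 0.57]) * 100,
      ' u_val:', round(float(u_val), 4))
```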
Mesquita, D P; Dias, O; Amaral, A L; Ferreira, E C
2009-04-01
In recent years, a great deal of attention has been focused on research into activated sludge processes, where the solid-liquid separation phase is frequently considered of critical importance, due to the different problems that severely affect the compaction and the settling of the sludge. Bearing that in mind, in this work image analysis routines were developed in the Matlab environment, allowing the identification and characterization of microbial aggregates and protruding filaments in eight different wastewater treatment plants over a combined period of 2 years. The monitoring of the activated sludge contents allowed for the detection of bulking events, proving that the developed image analysis methodology is adequate for a continuous examination of the morphological changes in microbial aggregates and subsequent estimation of the sludge volume index. The obtained results showed that the developed image analysis methodology is feasible for the continuous monitoring of activated sludge systems and the identification of disturbances.
The Effects of Twitter Sentiment on Stock Price Returns.
Ranco, Gabriele; Aleksovski, Darko; Caldarelli, Guido; Grčar, Miha; Mozetič, Igor
2015-01-01
Social media increasingly reflect and influence the behavior of other complex systems. In this paper we investigate the relations between a well-known micro-blogging platform, Twitter, and financial markets. In particular, we consider, over a period of 15 months, the Twitter volume and sentiment about the 30 stock companies that form the Dow Jones Industrial Average (DJIA) index. We find a relatively low Pearson correlation and Granger causality between the corresponding time series over the entire time period. However, we find a significant dependence between the Twitter sentiment and abnormal returns during the peaks of Twitter volume. This holds not only for the expected Twitter volume peaks (e.g., quarterly announcements), but also for peaks corresponding to less obvious events. We formalize the procedure by adapting the well-known "event study" from economics and finance to the analysis of Twitter data. The procedure allows events to be automatically identified as Twitter volume peaks, the prevailing sentiment (positive or negative) expressed in tweets at these peaks to be computed, and finally the "event study" methodology to be applied to relate them to stock returns. We show that the sentiment polarity of Twitter peaks implies the direction of cumulative abnormal returns. The magnitude of the cumulative abnormal returns is relatively low (about 1-2%), but the dependence is statistically significant for several days after the events.
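A minimal sketch of the adapted event study under simplifying assumptions (peaks defined as mean + k·SD exceedances, abnormal returns assumed precomputed, e.g. as market-model residuals); the paper's peak-detection and CAR conventions are more elaborate.

```python
import numpy as np

def event_study(volume, sentiment, abn_returns, window=2, k=3.0):
    """Identify Twitter-volume peaks and cumulative abnormal returns (CAR).

    volume, sentiment, abn_returns : aligned daily series for one stock
    """
    threshold = volume.mean() + k * volume.std()
    events = np.flatnonzero(volume > threshold)     # volume-peak days
    results = []
    for t in events:
        if window <= t < len(abn_returns) - window:
            car = abn_returns[t:t + window + 1].sum()   # CAR after the peak
            results.append((int(t), float(np.sign(sentiment[t])), float(car)))
    return results   # (day index, sentiment polarity, CAR)

rng = np.random.default_rng(1)
vol = rng.poisson(100, 300).astype(float); vol[150] = 900.0   # synthetic peak
sent = rng.normal(0, 1, 300); ret = rng.normal(0, 0.01, 300)
print(event_study(vol, sent, ret))
```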
Hofmeester, Ilse; Kollen, Boudewijn J; Steffens, Martijn G; Bosch, J L H Ruud; Drake, Marcus J; Weiss, Jeffrey P; Blanker, Marco H
2015-04-01
To systematically review and evaluate the impact of the International Continence Society (ICS)-2002 report on standardisation of terminology in nocturia, on publications reporting on nocturia and nocturnal polyuria (NP). In 2002, the ICS defined NP as a Nocturnal Polyuria Index (nocturnal urine volume/total 24-h urine volume) of >0.2-0.33, depending on age. In April 2013 the PubMed and Embase databases were searched for studies (in English, German, French or Dutch) based on original data and adult participants, investigating the relationship between nocturia and NP. A methodological quality assessment was performed, including scores on external validity, internal validity and informativeness. Quality scores of items were compared between studies published before and after the ICS-2002 report. The search yielded 78 publications based on 66 studies. Quality scores of studies were generally high for internal validity (median 5, interquartile range [IQR] 4-6) but low for external validity. After publication of the ICS-2002 report, external validity showed a significant change from 1 (IQR 1-2) to 2 (IQR 1-2.5; P = 0.019). Nocturia remained undefined in 12 studies. In all, 19 different definitions were used for NP, most often being the ICS (or similar) definition: this covered 52% (n = 11) of studies before and 66% (n = 27) after the ICS-2002 report. Clear definitions of both nocturia and NP were identified in 67% and 76% before, and in 88% and 88% of the studies after the ICS-2002 report, respectively. The ICS-2002 report on standardisation of terminology in nocturia appears to have had a beneficial impact on reporting definitions of nocturia and NP, enabling better interpretation of results and comparisons between research projects. Because the external validity of most of the 66 studies is considered a problem, the results of these studies may not be validly extrapolated to other populations. The ICS definition of NP is used most often. However, its discriminative value seems limited due to the estimated difference of 0.6 nocturnal voids between individuals with and without NP. Refinement of current definitions based on robust research is required. Based on pathophysiological reasoning, we argue that it may be more appropriate to define NP based on nocturnal urine production or nocturnal voided volumes, rather than on a diurnal urine production pattern. © 2014 The Authors. BJU International © 2014 BJU International.
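The ICS-2002 definition reduces to a one-line ratio; the age cut-offs below follow the >0.20 (younger) to >0.33 (older) range quoted above, with the middle-age threshold an assumption added for illustration.

```python
def nocturnal_polyuria_index(nocturnal_ml, total_24h_ml):
    """ICS-2002 Nocturnal Polyuria Index: nocturnal / total 24-h urine volume."""
    return nocturnal_ml / total_24h_ml

def has_nocturnal_polyuria(npi, age):
    """Age-dependent ICS threshold, >0.20 (young) to >0.33 (elderly).
    The 0.25 middle-age value is an illustrative assumption, not from the ICS."""
    cutoff = 0.20 if age < 35 else 0.33 if age > 65 else 0.25
    return npi > cutoff

npi = nocturnal_polyuria_index(700, 1900)          # e.g. 700 mL of 1900 mL
print(round(npi, 3), has_nocturnal_polyuria(npi, age=70))   # 0.368 True
```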
ERIC Educational Resources Information Center
Bethel, James; Green, James L.; Nord, Christine; Kalton, Graham; West, Jerry
2005-01-01
This report is Volume 2 of the methodology report that provides information about the development, design, and conduct of the 9-month data collection of the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B). This volume begins with a brief overview of the ECLS-B, but focuses on the sample design, calculation of response rates, development…
ERIC Educational Resources Information Center
Afzal, Waseem
2017-01-01
Introduction: The purpose of this paper is to propose a methodology to conceptualize, operationalize, and empirically validate the concept of information need. Method: The proposed methodology makes use of both qualitative and quantitative perspectives, and includes a broad array of approaches such as literature reviews, expert opinions, focus…
NASA Astrophysics Data System (ADS)
Hubbard, J.; Onac, B. P.; Kruse, S.; Forray, F. L.
2017-12-01
Research at Scărișoara Ice Cave has proceeded for over 150 years, primarily driven by the presence and paleoclimatic importance of the large perennial ice block and various ice speleothems located within its galleries. Previous observations of the ice block led to rudimentary volume estimates of 70,000 to 120,000 cubic meters (m³), prospectively placing it among the world's largest cave ice deposits. The cave morphology and the surface of the ice block have now been recreated in a total station survey-validated 3D model, produced using Structure from Motion (SfM) software. With the total station survey and the novel use of ArcGIS tools, the SfM validation process is drastically simplified, producing a scaled, georeferenced, and photo-texturized 3D model of the cave environment with a root-mean-square error (RMSE) of 0.24 m. Furthermore, ground penetrating radar data were collected and spatially oriented with the total station survey to recreate the ice block basal surface, and were combined with the SfM model to create a model of the ice block itself. The resulting ice block model has a volume of over 118,000 m³ with an uncertainty of 9.5%, with additional volumes left un-surveyed. The varying elevation of the ice block basal surface model reflects specific features of the cave roof, such as areas of enlargement, shafts, and potential joints, which offer further validation and inform theories on cave and ice genesis. Specifically, a large depression was identified as a potential area of initial ice growth. Finally, an ice thickness map was produced that will aid in the design of future ice coring projects. This methodology presents a powerful means to observe, accurately characterize, and measure cave and cave ice morphologies with ease and affordability. The results further establish the significance of Scărișoara's ice block to paleoclimate research, provide insights into cave and ice block genesis, and aid future study design.
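The volume computation behind such a model amounts to integrating the thickness between two gridded surfaces. The sketch below uses synthetic grids and applies the 9.5% figure as a bulk relative uncertainty; the paper derives its uncertainty from the survey itself.

```python
import numpy as np

def ice_volume(top_z, base_z, cell_area, rel_uncertainty=0.095):
    """Volume between a gridded ice surface (SfM model) and basal surface
    (GPR), integrated cell by cell; NaNs mark un-surveyed cells."""
    thickness = np.nan_to_num(top_z - base_z, nan=0.0)   # skip un-surveyed cells
    thickness = np.clip(thickness, 0.0, None)            # no negative thickness
    volume = float(thickness.sum() * cell_area)
    return volume, volume * rel_uncertainty

# synthetic 100 m x 60 m grids at 1 m spacing, roughly 2 m mean thickness
rng = np.random.default_rng(2)
top = rng.normal(10.0, 0.3, (100, 60))
base = top - rng.normal(2.0, 0.5, (100, 60))
print(ice_volume(top, base, cell_area=1.0))   # (m³, ± m³), synthetic values
```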
NASA Astrophysics Data System (ADS)
Martin, Ffion A.; Warrior, Nicholas A.; Simacek, Pavel; Advani, Suresh; Hughes, Adrian; Darlington, Roger; Senan, Eissa
2018-03-01
Very short manufacturing cycle times are required if continuous carbon fibre and epoxy composite components are to be economically viable solutions for high volume composite production in the automotive industry. Here, a manufacturing process variant of resin transfer moulding (RTM) targets a reduction of in-mould manufacture time by reducing the time to inject and cure components. The process involves two stages: resin injection followed by compression. A flow simulation methodology using an RTM solver for the process has been developed. This paper compares the simulation predictions to experiments performed using industrial equipment. The issues encountered during manufacturing are included in the simulation, and the sensitivity of the process to them is explored.
The RAAF Logistics Study. Volume 4,
1986-10-01
Contents fragments: Use of Issue-Based Root Definitions; Application of Soft Systems Methodology to Information Systems Analysis; Conclusion; List of Abbreviations. References include 'Management Control Systems', Journal of Applied Systems Analysis, Volume 6, 1979, pages 51 to 67. The soft systems methodology was developed to tackle ... Although the soft systems methodology has many advantages which recommend it to this type of study area, it does not model the time evolution of a system.
Validation of Imaging With Pathology in Laryngeal Cancer: Accuracy of the Registration Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caldas-Magalhaes, Joana, E-mail: J.CaldasMagalhaes@umcutrecht.nl; Kasperts, Nicolien; Kooij, Nina
2012-02-01
Purpose: To investigate the feasibility and accuracy of an automated method to validate gross tumor volume (GTV) delineations with pathology in laryngeal and hypopharyngeal cancer. Methods and Materials: High-resolution computed tomography (CT_HR), magnetic resonance imaging (MRI), and positron emission tomography (PET) scans were obtained from 10 patients before total laryngectomy. The GTV was delineated separately in each imaging modality. The laryngectomy specimen was sliced transversely in 3-mm-thick slices, and whole-mount hematoxylin-eosin stained (H&E) sections were obtained. A pathologist delineated tumor tissue in the H&E sections (GTV_PATH). An automatic three-dimensional (3D) reconstruction of the specimen was performed, and the CT_HR, MRI, and PET were semiautomatically and rigidly registered to the 3D specimen. The accuracy of the pathology-imaging registration and the specimen deformation and shrinkage were assessed. The tumor delineation inaccuracies were compared with the registration errors. Results: Good agreement was observed between anatomical landmarks in the 3D specimen and in the in vivo images. Limited deformations and shrinkage (3% ± 1%) were found inside the cartilage skeleton. The root mean squared error of the registration between the 3D specimen and the CT, MRI, and PET was on average 1.5, 3.0, and 3.3 mm, respectively, in the cartilage skeleton. The GTV_PATH volume was 7.2 mL, on average. The GTVs based on CT, MRI, and PET generated a mean volume of 14.9, 18.3, and 9.8 mL and covered the GTV_PATH by 85%, 88%, and 77%, respectively. The tumor delineation inaccuracies exceeded the registration error in all the imaging modalities. Conclusions: Validation of GTV delineations with pathology is feasible with an average overall accuracy below 3.5 mm inside the laryngeal skeleton. The tumor delineation inaccuracies were larger than the registration error. Therefore, an accurate histological validation of anatomical and functional imaging techniques for GTV delineation is possible in laryngeal cancer patients.
External Validity in the Study of Human Development: Theoretical and Methodological Issues
ERIC Educational Resources Information Center
Hultsch, David F.; Hickey, Tom
1978-01-01
An examination of the concept of external validity from two theoretical perspectives: a traditional mechanistic approach and a dialectical organismic approach. Examines the theoretical and methodological implications of these perspectives. (BD)
New methodology to reconstruct in 2-D the cuspal enamel of modern human lower molars.
Modesto-Mata, Mario; García-Campos, Cecilia; Martín-Francés, Laura; Martínez de Pinillos, Marina; García-González, Rebeca; Quintino, Yuliet; Canals, Antoni; Lozano, Marina; Dean, M Christopher; Martinón-Torres, María; Bermúdez de Castro, José María
2017-08-01
In recent years, different methodologies have been developed to reconstruct worn teeth. In this article, we propose a new 2-D methodology to reconstruct the worn enamel of lower molars. Our main goals are to reconstruct molars with a high level of accuracy when measuring relevant histological variables, and to validate the methodology by calculating the errors associated with the measurements. The methodology is based on polynomial regression equations and has been validated using two different dental variables: the cuspal enamel thickness and the crown height of the protoconid. For the validation process, simulated worn modern human molars were employed. The errors associated with the measurements were also estimated by applying methodologies previously proposed by other authors. The mean percentage error estimated in reconstructed molars for these two variables, in comparison with their real values, is -2.17% for the cuspal enamel thickness of the protoconid and -3.18% for the crown height of the protoconid. This error improves significantly on the results of other methodologies, both in interobserver error and in the accuracy of the measurements. The new methodology based on polynomial regressions can be confidently applied to the reconstruction of cuspal enamel of lower molars, as it improves the accuracy of the measurements and reduces the interobserver error. The present study shows that it is important to validate all methodologies in order to know their associated errors. The new methodology can be easily exported to other modern human populations, the human fossil record and the forensic sciences. © 2017 Wiley Periodicals, Inc.
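A toy version of the regression-based reconstruction (illustrative profile, not the study's equations): fit a polynomial to the preserved flanks of a 2-D cusp profile, extrapolate across the worn apex, and report the crown-height error as a percentage.

```python
import numpy as np

def reconstruct_cusp(x_preserved, y_preserved, x_worn, degree=4):
    """Extrapolate a worn cusp profile from the preserved outline with a
    polynomial regression fitted to the unworn flanks."""
    coeffs = np.polyfit(x_preserved, y_preserved, degree)
    return np.polyval(coeffs, x_worn)

rng = np.random.default_rng(3)
x = np.linspace(-3, 3, 61)                        # buccolingual position (mm)
y_true = 7.0 - 0.6 * x**2 - 0.02 * x**4           # unworn outline (mm), synthetic
y_obs = y_true + rng.normal(0.0, 0.02, x.size)    # measurement noise
keep = np.abs(x) > 1.0                            # apex region lost to wear
y_hat = reconstruct_cusp(x[keep], y_obs[keep], x[~keep])
err = 100 * (y_hat.max() - y_true.max()) / y_true.max()
print(f'crown-height error: {err:+.2f}%')         # cf. -3.18% in the study
```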
ERIC Educational Resources Information Center
Wu, Amery D.; Stone, Jake E.; Liu, Yan
2016-01-01
This article proposes and demonstrates a methodology for test score validation through abductive reasoning. It describes how abductive reasoning can be utilized in support of the claims made about test score validity. This methodology is demonstrated with a real data example of the Canadian English Language Proficiency Index Program…
Results of Fall 2001 Pilot: Methodology for Validation of Course Prerequisites.
ERIC Educational Resources Information Center
Serban, Andreea M.; Fleming, Steve
The purpose of this study was to test a methodology that will help Santa Barbara City College (SBCC), California, to validate the course prerequisites that fall under the category of highest level of scrutiny--data collection and analysis--as defined by the Chancellor's Office. This study gathered data for the validation of prerequisites for three…
NASA Technical Reports Server (NTRS)
Mueller, J. L. (Editor); Fargion, Giuletta S. (Editor); McClain, Charles R. (Editor); Pegau, Scott; Zaneveld, J. Ronald V.; Mitchell, B. Gregg; Kahru, Mati; Wieland, John; Stramska, Malgorzat
2003-01-01
This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 (Mueller and Fargion 2002, Volumes 1 and 2) is entirely superseded by the six volumes of Revision 4 listed above.
ERIC Educational Resources Information Center
Rubilar, Álvaro Sebastián Bustos; Badillo, Gonzalo Zubieta
2017-01-01
In this article, we report how a geometric task based on the ACODESA methodology (collaborative learning, scientific debate and self-reflection) promotes the reformulation of the students' validations and allows revealing the students' aims in each of the stages of the methodology. To do so, we present the case of a team and, particularly, one of…
Simulation validation and management
NASA Astrophysics Data System (ADS)
Illgen, John D.
1995-06-01
Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, it has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique set of computer-aided software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.
ERIC Educational Resources Information Center
Wylie, Ruth C.
This volume of the revised edition describes and evaluates measurement methods, research designs, and procedures which have been or might appropriately be used in self-concept research. Working from the perspective that self-concept or phenomenal personality theories can be scientifically investigated, methodological flaws and questionable…
Effects of obesity on lung volume and capacity in children and adolescents: a systematic review
Winck, Aline Dill; Heinzmann-Filho, João Paulo; Soares, Rafaela Borges; da Silva, Juliana Severo; Woszezenki, Cristhiele Taís; Zanatta, Letiane Bueno
2016-01-01
Objective: To assess the effects of obesity on lung volume and capacity in children and adolescents. Data source: This is a systematic review, carried out in the Pubmed, Lilacs, Scielo and PEDro databases, using the following keywords: Plethysmography; Whole Body OR Lung Volume Measurements OR Total Lung Capacity OR Functional Residual Capacity OR Residual Volume AND Obesity. Observational studies or clinical trials that assessed the effects of obesity on lung volume and capacity in children and adolescents (0-18 years) without any other associated disease, published in English, Portuguese or Spanish, were selected. Methodological quality was assessed by the Agency for Healthcare Research and Quality criteria. Data synthesis: Of the 1,030 articles, only four were included in the review. The studies amounted to 548 participants, predominantly males, with sample sizes ranging from 45 to 327 individuals. All of the studies evaluated nutritional status through BMI (z-score), and 50.0% reported data on abdominal circumference. All demonstrated that obesity has negative effects on lung volume and capacity, with reductions mainly in functional residual capacity in 75.0% of the studies, in expiratory reserve volume in 50.0%, and in residual volume in 25.0%. The methodological quality ranged from moderate to high, with 75.0% of the studies classified as having high methodological quality. Conclusions: Obesity has deleterious effects on lung volume and capacity in children and adolescents, mainly by reducing functional residual capacity, expiratory reserve volume and residual volume. PMID:27130483
NASA Astrophysics Data System (ADS)
Dai, Xiaoyu; Haussener, Sophia
2018-02-01
A multi-scale methodology for the radiative transfer analysis of heterogeneous media composed of morphologically complex components on two distinct scales is presented. The methodology incorporates the exact morphology at the various scales and utilizes volume-averaging approaches with the corresponding effective properties to couple the scales. At the continuum level, the volume-averaged coupled radiative transfer equations are solved utilizing (i) effective radiative transport properties obtained by direct Monte Carlo simulations at the pore level, and (ii) averaged bulk material properties obtained at the particle level by Lorenz-Mie theory or discrete dipole approximation calculations. This model is applied to a soot-contaminated snow layer, and is experimentally validated with reflectance measurements of such layers. A quantitative and decoupled understanding of the morphological effect on the radiative transport is achieved, and a significant influence of the dual-scale morphology on the macroscopic optical behavior is observed. Our results show that with a small amount of soot particles, of the order of 1 ppb in volume fraction, the reduction in reflectance of a snow layer with large ice grains can reach up to 77% (at a wavelength of 0.3 μm). Soot impurities modeled as compact agglomerates yield a 2-3% lower reduction of the reflectance in a thick snow layer compared to snow with soot impurities modeled as chain-like agglomerates. Soot impurities modeled as equivalent spherical particles underestimate the reflectance reduction by 2-8%. This study implies that the morphology of the heterogeneities in a medium significantly affects the macroscopic optical behavior and, specifically for soot-contaminated snow, indicates the non-negligible role of soot in the absorption behavior of snow layers. The methodology can equally be used in technical applications for the assessment and optimization of optical performance in multi-scale media.
NASA Technical Reports Server (NTRS)
Gault, J. W. (Editor); Trivedi, K. S. (Editor); Clary, J. B. (Editor)
1980-01-01
The validation process comprises the activities required to ensure the agreement of the system realization with the system specification. A preliminary validation methodology for fault tolerant systems is documented. A general framework for a validation methodology is presented, along with a set of specific tasks intended for the validation of two specimen systems, SIFT and FTMP. Two major areas of research are identified: first, those activities required to support the ongoing development of the validation process itself, and second, those activities required to support the design, development, and understanding of fault tolerant systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.
Stabilized Finite Elements in FUN3D
NASA Technical Reports Server (NTRS)
Anderson, W. Kyle; Newman, James C.; Karman, Steve L.
2017-01-01
A streamline upwind Petrov-Galerkin (SUPG) stabilized finite-element discretization has been implemented as a library in the FUN3D unstructured-grid flow solver. Motivation for the selection of this methodology is given, details of the implementation are provided, and the discretization for the interior scheme is verified for linear and quadratic elements by using the method of manufactured solutions. A methodology is also described for capturing shocks, and simulation results are compared to the finite-volume formulation that is currently the primary method employed for routine engineering applications. The finite-element methodology is demonstrated to be more accurate than the finite-volume technology, particularly on tetrahedral meshes, where solutions obtained using the finite-volume scheme can suffer from adverse effects caused by bias in the grid. Although no effort has been made to date to optimize computational efficiency, the finite-element scheme is competitive with the finite-volume scheme in terms of computer time to reach convergence.
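Verification by the method of manufactured (here, exact) solutions reduces to checking the observed order of accuracy on refined grids. The sketch below uses a 1-D second-order model problem as a stand-in; FUN3D's SUPG discretization itself is well beyond a snippet.

```python
import numpy as np

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Observed order of accuracy from errors on two grids."""
    return np.log(err_coarse / err_fine) / np.log(refinement)

def l2_error(n):
    """Discretize -u'' = f on (0,1) with u_exact = sin(pi x), f = pi^2 sin(pi x),
    using a second-order scheme, and return the discrete L2 error."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi**2 * np.sin(np.pi * x[1:-1])
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.sqrt(h * np.sum((u - np.sin(np.pi * x[1:-1]))**2))

print(observed_order(l2_error(32), l2_error(64)))   # ~2, matching the scheme
```

In a production verification study the same order check is repeated for each element type (linear vs. quadratic) against its theoretical convergence rate.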
Quirk, Emma; Doggett, Adrian; Bretnall, Alison
2014-08-05
Spray Dried Dispersions (SDDs) are uniform mixtures of a specific ratio of amorphous active pharmaceutical ingredient (API) and polymer prepared via a spray drying process. Volatile solvents are employed during spray drying to facilitate the formation of the SDD material. Following manufacture, analytical methodology is required to determine residual levels of the spray drying solvent and its associated impurities. Due to the high level of polymer in the SDD samples, direct liquid injection with gas chromatography (GC) is not a viable option for analysis. This work describes the development and validation of an analytical approach to determine residual levels of acetone and acetone-related impurities, mesityl oxide (MO) and diacetone alcohol (DAA), in drug product intermediates prepared as SDDs using GC with headspace (HS) autosampling. The method development for these analytes presented a number of analytical challenges which had to be overcome before the levels of the volatiles of interest could be accurately quantified. GC-HS could be used after two critical factors were implemented: (1) calculation and application of conversion factors to 'correct' for the reactions occurring between acetone, MO and DAA during generation of the headspace volume for analysis; and (2) the addition of an equivalent amount of polymer to all reference solutions used for quantitation, to ensure comparability between the headspace volumes generated for samples and external standards. This work describes the method development and optimisation of the standard preparation, the headspace autosampler operating parameters and the chromatographic conditions, together with a summary of the validation of the methodology. The approach has been demonstrated to be robust and suitable to accurately determine levels of acetone, MO and DAA in SDD materials over the linear concentration range 0.008-0.4 μL/mL, with minimum quantitation limits of 20 ppm for acetone and MO, and 80 ppm for DAA. Copyright © 2014 Elsevier B.V. All rights reserved.
Vila, Marlene; Llompart, Maria; Garcia-Jares, Carmen; Homem, Vera; Dagnac, Thierry
2018-06-06
A methodology based on solid-phase microextraction (SPME) followed by gas chromatography-tandem mass spectrometry (GC-MS/MS) has been developed for the simultaneous analysis of eleven multiclass ultraviolet (UV) filters in beach sand. To the best of our knowledge, this is the first time that this extraction technique has been applied to the analysis of UV filters in sand samples, or in any other kind of environmental solid sample. The main extraction parameters, such as the fibre coating, the amount of sample, the addition of salt, the volume of water added to the sand, and the temperature, were optimized. An experimental design approach was implemented in order to find the most favourable conditions. The final conditions consisted of adding 1 mL of water to 1 g of sample, followed by headspace SPME for 20 min at 100 °C, using PDMS/DVB as the fibre coating. The SPME-GC-MS/MS method was validated in terms of linearity, accuracy, limits of detection and quantification, and precision. Recovery studies were also performed at three concentration levels in real Atlantic and Mediterranean sand samples. The recoveries were generally above 85% and relative standard deviations below 11%. The limits of detection were at the pg g⁻¹ level. The validated methodology was successfully applied to the analysis of real sand samples collected from Atlantic Ocean beaches on the northwest coast of Spain and Portugal, the Canary Islands (Spain), and from Mediterranean Sea beaches on Mallorca Island (Spain). The most frequently found UV filters were ethylhexyl salicylate (EHS), homosalate (HMS), 4-methylbenzylidene camphor (4MBC), 2-ethylhexyl methoxycinnamate (2EHMC) and octocrylene (OCR), with concentrations up to 670 ng g⁻¹. Copyright © 2018 Elsevier B.V. All rights reserved.
Lunven, Catherine; Turpault, Sandrine; Beyer, Yann-Joel; O'Brien, Amy; Delfolie, Astrid; Boyanova, Neli; Sanderink, Ger-Jan; Baldinetti, Francesca
2016-01-01
Background: Teriflunomide, a once-daily oral immunomodulator approved for treatment of relapsing-remitting multiple sclerosis, is eliminated slowly from plasma. If it is necessary to rapidly lower plasma concentrations of teriflunomide, an accelerated elimination procedure using cholestyramine or activated charcoal may be used. The current bioanalytical assay for determination of plasma teriflunomide concentration requires laboratory facilities for blood centrifugation and plasma storage. An alternative method, with potential for greater convenience, is dried blood spot (DBS) methodology. Analytical and clinical validations are required to switch from plasma to DBS (finger-prick sampling) methodology. Methods: Using blood samples from healthy subjects, an LC-MS/MS assay for quantification of teriflunomide in DBS over a range of 0.01–10 mcg/mL was developed and validated for specificity, selectivity, accuracy, precision, reproducibility, and stability. Results were compared with those from the current assay for determination of plasma teriflunomide concentration. Results: The method was specific and selective relative to endogenous compounds, with a process efficiency of ∼88% and no matrix effect. Inaccuracy and imprecision for intraday and interday analyses were <15% at all concentrations tested. Quantification of teriflunomide in the DBS assay was not affected by blood deposit volume or punch position within the spot, and hematocrit level had a limited but acceptable effect on measurement accuracy. Teriflunomide was stable for at least 4 months at room temperature, and for at least 24 hours at 37°C with and without 95% relative humidity, covering sampling, drying, and shipment conditions in the field. The correlation between DBS and plasma concentrations (R² = 0.97), with an average blood-to-plasma ratio of 0.59, was concentration independent and constant over time. Conclusions: DBS sampling is a simple and practical method for monitoring teriflunomide concentrations. PMID:27015245
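As an illustration of how the reported blood-to-plasma ratio might be used in practice, here is a minimal sketch (not part of the validated assay; the input value is arbitrary) converting a DBS result to a plasma-equivalent concentration.

```python
# Minimal sketch, not the validated assay: convert a dried-blood-spot
# concentration to a plasma-equivalent estimate using the reported mean
# blood-to-plasma ratio of 0.59.
BLOOD_TO_PLASMA_RATIO = 0.59

def plasma_equivalent(dbs_conc_mcg_per_ml: float) -> float:
    """Estimate the plasma concentration implied by a DBS measurement."""
    return dbs_conc_mcg_per_ml / BLOOD_TO_PLASMA_RATIO

print(plasma_equivalent(2.5))  # arbitrary example: ~4.24 mcg/mL
```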
Gustafsson, P M; Robinson, P D; Lindblad, A; Oberli, D
2016-11-01
Multiple-breath inert gas washout (MBW) is ideally suited for early detection and monitoring of serious lung disease, such as cystic fibrosis, in infants and young children. Validated commercial options for the MBW technique are limited, and the suitability of nitrogen (N2)-based MBW is of concern given the detrimental effect of exposure to pure O2 on infant breathing pattern. We propose novel methodology using commercially available N2 MBW equipment to facilitate 4% sulfur hexafluoride (SF6) multiple-breath inert gas wash-in and washout suitable for the infant age range. CO2, O2, and sidestream molar mass sensor signals were used to accurately calculate SF6 concentrations. An improved dynamic method for synchronization of gas and respiratory flow signals was developed to take into account variations in sidestream sample flow during MBW measurement. In vitro validation of triplicate functional residual capacity (FRC) assessments was undertaken under dry ambient conditions using lung models ranging from 90 to 267 ml, with tidal volumes of 28-79 ml and respiratory rates of 20-60 per minute. The relative mean (SD, 95% confidence interval) error of triplicate FRC determinations was -0.26 (1.84, -3.86 to +3.35)% for washout and 0.57 (2.66, -4.66 to +5.79)% for wash-in. The standard deviations [mean (SD)] of percentage error among FRC triplicates were 1.40 (1.14) and 1.38 (1.32) for washout and wash-in, respectively. The novel methodology achieved the FRC accuracy outlined by current MBW consensus recommendations (95% of measurements within 5% accuracy). Further clinical evaluation is required, but this new technique, using existing commercially available equipment, has exciting potential for research and clinical use. Copyright © 2016 the American Physiological Society.
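A minimal sketch of the standard FRC calculation underlying such washout measurements follows (the published method includes BTPS and sensor-synchronization corrections omitted here; the example numbers are illustrative only).

```python
# Sketch of the standard tracer-gas FRC calculation (BTPS and flow
# synchronization corrections omitted; numbers are illustrative).
def frc_from_washout(net_tracer_expired_ml, c_start, c_end):
    """FRC = net expired tracer volume / drop in end-tidal tracer fraction."""
    return net_tracer_expired_ml / (c_start - c_end)

# 4% SF6 wash-in, washout ended at 1/40th of the starting concentration:
frc = frc_from_washout(net_tracer_expired_ml=7.0, c_start=0.04, c_end=0.001)
print(f"FRC ~ {frc:.1f} mL")  # ~179.5 mL, within the lung-model range above
```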
NASA Technical Reports Server (NTRS)
Mueller, J. L.; Fargion, G. S.; McClain, C. R. (Editor); Pegau, S.; Zanefeld, J. R. V.; Mitchell, B. G.; Kahru, M.; Wieland, J.; Stramska, M.
2003-01-01
This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into six separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background, and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version, Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3, is entirely superseded by the six volumes of Revision 4 listed above.
Membranes with artificial free-volume for biofuel production
Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; Chen, X. Chelsea; Cotanda, Pepa; Hill, Anita J.; Balsara, Nitash P.
2015-01-01
Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. We have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. We found that the introduction of artificial free-volume improves both alcohol permeability and selectivity. PMID:26104672
Methodological convergence of program evaluation designs.
Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa
2014-01-01
The dichotomous opposition between experimental/quasi-experimental and non-experimental/ethnographic studies persists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been based on experimental and quasi-experimental studies. This hinders the practice of evaluators and planners in empirical program evaluation, a sphere in which the distinction between types of study is continually changing and increasingly blurred. Based on the classical validity framework of experimental/quasi-experimental studies, we review the literature to analyze the convergence of design elements bearing on methodological quality in primary studies in systematic reviews and in ethnographic research. We specify the relevant design elements that should be taken into account to improve validity and generalization in program evaluation practice across methodologies, from a practical and complementary methodological standpoint. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.
Towards a sharp-interface volume-of-fluid methodology for modeling evaporation
NASA Astrophysics Data System (ADS)
Pathak, Ashish; Raessi, Mehdi
2017-11-01
In modeling evaporation, the diffuse-interface (one-domain) formulation yields inaccurate results. Recent efforts approaching the problem via a sharp-interface (two-domain) formulation have shown significant improvements, and the reasons behind their better performance are discussed in the present work. All available sharp-interface methods, however, exclusively employ the level-set method. In the present work, we develop a sharp-interface evaporation model in a volume-of-fluid (VOF) framework in order to leverage its mass-conserving property as well as its ability to handle large topological changes. We start with a critical review of the assumptions underlying the mathematical equations governing evaporation; for example, it is shown that the assumption of incompressibility can be applied only in special circumstances. The well-known d² law used for benchmarking is valid only for steady-state test problems, whereas transient effects persist over a significant portion of the lifetime of a micron-sized droplet. A 1D spherical, fully transient model is therefore developed to provide a benchmark transient solution. Finally, a 3D Cartesian Navier-Stokes evaporation solver is developed, and preliminary validation test cases are presented for static and moving drop evaporation. This material is based upon work supported by the Department of Energy, Office of Energy Efficiency and Renewable Energy and the Department of Defense, Tank and Automotive Research, Development, and Engineering Center, under Award Number DEEE0007292.
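For reference, the d² law mentioned above states that, under quasi-steady evaporation, the squared droplet diameter decays linearly in time, d(t)^2 = d0^2 - K t. A short sketch (with an assumed evaporation constant K, purely illustrative) shows the implied diameter history and lifetime.

```python
# Sketch of the classical d^2 law: squared diameter decays linearly in time.
# d0 and K below are assumed values for illustration, not from the paper.
import numpy as np

def droplet_diameter(t, d0=50e-6, K=1.0e-9):
    """d0: initial diameter (m); K: evaporation constant (m^2/s), assumed."""
    d_sq = d0**2 - K * t
    return np.sqrt(np.maximum(d_sq, 0.0))

t = np.linspace(0.0, 2.5e-3, 6)      # lifetime = d0^2 / K = 2.5 ms here
print(droplet_diameter(t) * 1e6)     # diameters in microns, decaying to 0
```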
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xia, Yidong; Andrs, David; Martineau, Richard Charles
This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next, along with a Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration. Although the multi-fluid formulation is still under development, BIGHORN has been designed to handle multi-fluid problems; owing to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification and validation benchmark test problems for BIGHORN. The intent of this suite is to provide baseline comparison data that demonstrate the performance of the BIGHORN solution methods on problems varying in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and to suggest best practices when using BIGHORN.
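BIGHORN itself is not reproduced here, but the cell-average/flux-difference structure that any such finite volume solver builds on can be sketched for a 1D scalar conservation law. The toy example below (first-order Rusanov flux, periodic domain, Burgers flux assumed) is illustrative only, not BIGHORN's second-order scheme.

```python
# Toy first-order finite-volume update for u_t + f(u)_x = 0 with a
# Rusanov flux on a periodic 1D grid. Burgers flux f(u) = u^2/2 assumed.
import numpy as np

def rusanov_step(u, dx, dt, f=lambda u: 0.5 * u**2, df=np.abs):
    """Advance cell averages one step; f = flux, df = |f'(u)| bound."""
    up = np.roll(u, -1)                               # right neighbors (periodic)
    a = np.maximum(df(u), df(up))                     # local wave-speed bound
    flux = 0.5 * (f(u) + f(up)) - 0.5 * a * (up - u)  # interface fluxes F_{i+1/2}
    return u - dt / dx * (flux - np.roll(flux, 1))    # u_i^{n+1} update

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u = np.sin(2 * np.pi * x) + 1.5
for _ in range(100):                                  # CFL ~ 0.5 here
    u = rusanov_step(u, dx=1 / 200, dt=0.001)
print(u.min(), u.max())
```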
ERIC Educational Resources Information Center
Osler, James Edward, II
2015-01-01
This monograph provides an epistemological rationale for the Accumulative Manifold Validation Analysis [also referred to by the acronym "AMOVA"] statistical methodology designed to test psychometric instruments. This form of inquiry is a type of mathematical optimization in the discipline of linear stochastic modelling. AMOVA is an in-depth…
Pirat, Bahar; Little, Stephen H.; Igo, Stephen R.; McCulloch, Marti; Nosé, Yukihiko; Hartley, Craig J.; Zoghbi, William A.
2012-01-01
Objective The proximal isovelocity surface area (PISA) method is useful in the quantitation of aortic regurgitation (AR). We hypothesized that actual measurement of PISA provided by real-time 3-dimensional (3D) color Doppler yields more accurate regurgitant volumes than those estimated by 2-dimensional (2D) color Doppler PISA. Methods We developed a pulsatile flow model for AR with an imaging chamber in which interchangeable regurgitant orifices with defined shapes and areas were incorporated. An ultrasonic flow meter was used to calculate the reference regurgitant volumes. A total of 29 different flow conditions for 5 orifices with different shapes were tested at a rate of 72 beats/min. 2D PISA was calculated as 2πr², and the 3D PISA was measured from 8 equidistant radial planes. Regurgitant volume was derived as PISA × aliasing velocity × time velocity integral of AR/peak AR velocity. Results Regurgitant volumes by flow meter ranged between 12.6 and 30.6 mL/beat (mean 21.4 ± 5.5 mL/beat). Regurgitant volumes estimated by 2D PISA correlated well with volumes measured by flow meter (r = 0.69); however, a significant underestimation was observed (y = 0.5x + 0.6). Correlation with flow meter volumes was stronger for 3D PISA-derived regurgitant volumes (r = 0.83); significantly less underestimation of regurgitant volumes was seen, with a regression line close to identity (y = 0.9x + 3.9). Conclusion Direct measurement of PISA is feasible, without geometric assumptions, using real-time 3D color Doppler. Calculation of aortic regurgitant volumes with 3D color Doppler using this methodology is more accurate than the conventional 2D method with its hemispheric PISA assumption. PMID:19168322
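The quoted formula is straightforward to compute; the sketch below evaluates it for the 2D hemispheric PISA assumption with invented but physiologically plausible inputs (velocities in cm/s, VTI in cm, giving volume in mL).

```python
# Evaluate RV = PISA x aliasing velocity x VTI(AR) / peak AR velocity.
# All input values below are invented but physiologically plausible.
import math

def regurgitant_volume(pisa_cm2, v_alias_cm_s, vti_ar_cm, v_peak_cm_s):
    """Regurgitant volume per beat, in mL (cm^3)."""
    return pisa_cm2 * v_alias_cm_s * vti_ar_cm / v_peak_cm_s

r_cm = 0.4                               # measured 2D PISA radius
pisa_2d = 2.0 * math.pi * r_cm**2        # hemispheric 2D PISA assumption
print(regurgitant_volume(pisa_2d, v_alias_cm_s=40.0,
                         vti_ar_cm=250.0, v_peak_cm_s=450.0))  # ~22 mL/beat
```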
Validating a new methodology for optical probe design and image registration in fNIRS studies
Wijeakumar, Sobanawartiny; Spencer, John P.; Bohache, Kevin; Boas, David A.; Magnotta, Vincent A.
2015-01-01
Functional near-infrared spectroscopy (fNIRS) is an imaging technique that relies on the principle of shining near-infrared light through tissue to detect changes in hemodynamic activation. An important methodological issue encountered is the creation of optimized probe geometry for fNIRS recordings. Here, across three experiments, we describe and validate a processing pipeline designed to create an optimized, yet scalable probe geometry based on selected regions of interest (ROIs) from the functional magnetic resonance imaging (fMRI) literature. In experiment 1, we created a probe geometry optimized to record changes in activation from target ROIs important for visual working memory. Positions of the sources and detectors of the probe geometry on an adult head were digitized using a motion sensor and projected onto a generic adult atlas and a segmented head obtained from the subject's MRI scan. In experiment 2, the same probe geometry was scaled down to fit a child's head and later digitized and projected onto the generic adult atlas and a segmented volume obtained from the child's MRI scan. Using visualization tools and by quantifying the amount of intersection between target ROIs and channels, we show that out of 21 ROIs, 17 and 19 ROIs intersected with fNIRS channels from the adult and child probe geometries, respectively. Further, both the adult atlas and adult subject-specific MRI approaches yielded similar results and can be used interchangeably. However, results suggest that segmented heads obtained from MRI scans be used for registering children's data. Finally, in experiment 3, we further validated our processing pipeline by creating a different probe geometry designed to record from target ROIs involved in language and motor processing. PMID:25705757
Effects of obesity on lung volume and capacity in children and adolescents: a systematic review.
Winck, Aline Dill; Heinzmann-Filho, João Paulo; Soares, Rafaela Borges; da Silva, Juliana Severo; Woszezenki, Cristhiele Taís; Zanatta, Letiane Bueno
2016-12-01
To assess the effects of obesity on lung volume and capacity in children and adolescents. This is a systematic review, carried out in the PubMed, LILACS, SciELO and PEDro databases, using the following keywords: Plethysmography; Whole Body OR Lung Volume Measurements OR Total Lung Capacity OR Functional Residual Capacity OR Residual Volume AND Obesity. Observational studies or clinical trials that assessed the effects of obesity on lung volume and capacity in children and adolescents (0-18 years) without any other associated disease, published in English, Portuguese or Spanish, were selected. Methodological quality was assessed with the Agency for Healthcare Research and Quality instrument. Of the 1,030 articles, only four were included in the review. The studies amounted to 548 participants, predominantly males, with sample sizes ranging from 45 to 327 individuals. All of the studies evaluated nutritional status through BMI (z-score), and 50.0% reported data on abdominal circumference. All demonstrated that obesity has negative effects on lung volume and capacity, causing a reduction mainly in functional residual capacity (in 75.0% of the studies), in expiratory reserve volume (50.0%) and in residual volume (25.0%). The methodological quality ranged from moderate to high, with 75.0% of the studies classified as having high methodological quality. Obesity causes deleterious effects on lung volume and capacity in children and adolescents, mainly by reducing functional residual capacity, expiratory reserve volume and residual volume. Copyright © 2016 Sociedade de Pediatria de São Paulo. Publicado por Elsevier Editora Ltda. All rights reserved.
Zito, Felicia; De Bernardi, Elisabetta; Soffientini, Chiara; Canzi, Cristina; Casati, Rosangela; Gerundini, Paolo; Baselli, Giuseppe
2012-09-01
In recent years, segmentation algorithms and activity quantification methods have been proposed for oncological (18)F-fluorodeoxyglucose (FDG) PET. A full assessment of these algorithms, necessary for clinical transfer, requires a validation on data sets provided with a reliable ground truth as to the imaged activity distribution, which must be as realistic as possible. The aim of this work is to propose a strategy to simulate lesions of uniform uptake and irregular shape in an anthropomorphic phantom, with the possibility of easily obtaining a ground truth as to lesion activity and borders. Lesions were simulated with samples of clinoptilolite, a family of natural zeolites of irregular shape, able to absorb aqueous solutions of (18)F-FDG, available in a wide size range, and nontoxic. Zeolites were soaked in solutions of (18)F-FDG for increasing times up to 120 min and their absorptive properties were characterized as a function of soaking duration, solution concentration, and zeolite dry weight. Saturated zeolites were wrapped in Parafilm, positioned inside an Alderson thorax-abdomen phantom and imaged with a PET-CT scanner. The ground truth for the activity distribution of each zeolite was obtained by segmenting high-resolution, finely aligned CT images on the basis of independently obtained volume measurements. The fine alignment between CT and PET was validated by comparing the CT-derived ground truth to a set of zeolite PET threshold segmentations in terms of Dice index and volume error. The soaking time necessary to achieve saturation increases with zeolite dry weight, with a maximum of about 90 min for the largest sample. At saturation, a linear dependence of the uptake normalized to the solution concentration on zeolite dry weight (R(2) = 0.988), as well as a uniform distribution of the activity over the entire zeolite volume from PET imaging, were demonstrated. These findings indicate that the (18)F-FDG solution is able to saturate the zeolite pores and that the concentration does not influence the distribution uniformity of either solution or solute, at least at the trace concentrations used for zeolite activation. An additional proof of uniformity of zeolite saturation was obtained by observing a correspondence between uptake and adsorbed volume of solution, corresponding to about 27.8% of the zeolite volume. As to the ground truth for zeolites positioned inside the phantom, the segmentation of finely aligned CT images provided reliable borders, as demonstrated by a mean absolute volume error of 2.8% with respect to the PET threshold segmentation corresponding to the maximum Dice index. The proposed methodology allowed us to obtain an experimental phantom data set that can be used as a feasible tool to test and validate quantification and segmentation algorithms for PET in oncology. The phantom is currently under consideration for inclusion in a benchmark designed by AAPM TG211, which will be available to the community to evaluate PET automatic segmentation methods.
NASA Astrophysics Data System (ADS)
Moffitt, Blake Almy
Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is possible with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low temperature operation, modularity, and rapid refuelability, making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development, with fuel cell gravimetric and volumetric power density nearly doubling every 2-3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion-based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all-electric aircraft subsystems. In addition, fuel cell design and performance data are closely protected, which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion-based propulsion result in more highly constrained design spaces that are problematic for design space exploration. To begin addressing the current gaps in fuel cell aircraft development, a methodology has been developed to explore and characterize the near-term performance of fuel cell powered UAVs. The first step of the methodology is the development of a valid MDA. This is accomplished by using propagated uncertainty estimates to guide the decomposition of an MDA into key contributing analyses (CAs) that can be individually refined and validated to increase the overall accuracy of the MDA. To assist in MDA development, a flexible framework for simultaneously solving the CAs is specified. This enables the MDA to be easily adapted to changes in technology and to the changes in data that occur throughout a design process. Various CAs that model a polymer electrolyte membrane fuel cell (PEMFC) UAV are developed, validated, and shown to be in agreement with hardware-in-the-loop simulations of a fully developed fuel cell propulsion system. After creating a valid MDA, the final step of the methodology is the synthesis of the MDA with an uncertainty propagation analysis, an optimization routine, and a chance-constrained problem formulation. This synthesis allows an efficient calculation of the probabilistic constraint boundaries and Pareto frontiers that govern the design space and influence design decisions relating to optimization and uncertainty mitigation. A key element of the methodology is uncertainty propagation.
The methodology uses Systems Sensitivity Analysis (SSA) to estimate the uncertainty of key performance metrics due to uncertainties in design variables and in the accuracy of the CAs. A summary of SSA is given, along with key rules for properly decomposing an MDA for use with SSA. Verification of SSA uncertainty estimates via Monte Carlo simulations is provided for both an example problem and a detailed MDA of a fuel cell UAV. Implementation of the methodology was performed on a small fuel cell UAV designed to carry a 2.2 kg payload with 24 hours of endurance. Uncertainty distributions for both the design variables and the CAs were estimated based on experimental results and were found to dominate the design space. To reduce uncertainty and test the flexibility of the MDA framework, CAs were replaced with either empirical or semi-empirical relationships during the optimization process. The final design was validated via a hardware-in-the-loop simulation. Finally, the fuel cell UAV probabilistic design space was studied. A graphical representation of the design space was generated and the optima due to deterministic and probabilistic constraints were identified. The methodology was used to identify Pareto frontiers, which were shown on contour plots of the design space. Unanticipated discontinuities of the Pareto fronts were observed as different constraints became active, providing useful information on which to base design and development decisions.
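As a toy illustration of the Monte Carlo verification step described above (the model, variables and uncertainty values here are invented and are not the dissertation's MDA), first-order propagation of design-variable uncertainty can be checked against direct sampling.

```python
# Toy check of first-order (SSA-style) uncertainty propagation against
# Monte Carlo. The endurance model and all numbers are invented.
import numpy as np

rng = np.random.default_rng(0)

def endurance_hr(eta, m_kg):                 # notional performance metric
    return 24.0 * eta / 0.5 * (10.0 / m_kg) ** 1.5

mu = np.array([0.5, 10.0])                   # means: efficiency, mass (kg)
sig = np.array([0.02, 0.4])                  # assumed 1-sigma uncertainties

# First-order propagation: var(g) ~ sum_i (dg/dx_i)^2 var(x_i)
eps = 1e-6
grad = np.array([
    (endurance_hr(mu[0] + eps, mu[1]) - endurance_hr(mu[0] - eps, mu[1])) / (2 * eps),
    (endurance_hr(mu[0], mu[1] + eps) - endurance_hr(mu[0], mu[1] - eps)) / (2 * eps),
])
sigma_lin = np.sqrt(np.sum((grad * sig) ** 2))

# Monte Carlo reference
samples = rng.normal(mu, sig, size=(100_000, 2))
sigma_mc = endurance_hr(samples[:, 0], samples[:, 1]).std()

print(f"first-order: {sigma_lin:.3f} h, Monte Carlo: {sigma_mc:.3f} h")
```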
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gatsonis, Nikolaos A.; Spirkin, Anton
2009-06-01
The mathematical formulation and computational implementation of a three-dimensional particle-in-cell methodology on unstructured Delaunay-Voronoi tetrahedral grids is presented. The method allows simulation of plasmas in complex domains and incorporates the duality of the Delaunay-Voronoi grids in all aspects of the particle-in-cell cycle. Charge assignment and field interpolation weighting schemes of zero- and first-order are formulated based on the theory of long-range constraints. Electric potential and fields are derived from a finite-volume formulation of Gauss' law using the Voronoi-Delaunay dual. Boundary conditions and the algorithms for injection, particle loading, particle motion, and particle tracking are implemented for unstructured Delaunay grids. Error and sensitivity analysis examines the effects of particles per cell, grid scaling, and timestep on the numerical heating, the slowing-down time, and the deflection times. The problem of current collection by cylindrical Langmuir probes in collisionless plasmas is used for validation. Numerical results compare favorably with previous numerical and analytical solutions for a wide range of probe radius to Debye length ratios, probe potentials, and electron to ion temperature ratios. The versatility of the methodology is demonstrated with the simulation of a complex plasma microsensor, a directional micro-retarding potential analyzer that includes a low-transparency micro-grid.
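The weighting schemes formulated in the paper are specific to Delaunay-Voronoi grids; as a much simplified structured-grid analogue, first-order (cloud-in-cell) charge assignment in 1D looks like the following sketch.

```python
# Simplified structured-grid analogue of first-order charge assignment:
# cloud-in-cell (linear) deposition of particle charge onto a periodic
# 1D grid. Not the paper's unstructured Delaunay-Voronoi scheme.
import numpy as np

def assign_charge_cic(x_p, q_p, n_cells, dx):
    """Deposit particle charges onto nodes with linear (first-order) weights."""
    rho = np.zeros(n_cells)
    cell = np.floor(x_p / dx).astype(int)          # left node index
    w = x_p / dx - cell                            # fractional position in cell
    np.add.at(rho, cell % n_cells, q_p * (1 - w))  # left-node share
    np.add.at(rho, (cell + 1) % n_cells, q_p * w)  # right-node share
    return rho / dx                                # node charge -> density

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 10_000)
rho = assign_charge_cic(x, q_p=np.full(10_000, 1e-3), n_cells=32, dx=1 / 32)
print(rho.mean())   # ~10 for uniform loading: total charge 10 over length 1
```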
Dark Energy Survey Year 1 Results: galaxy mock catalogues for BAO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avila, S.; et al.
Mock catalogues are a crucial tool in the analysis of galaxy survey data, both for the accurate computation of covariance matrices and for the optimisation of analysis methodology and validation of data sets. In this paper, we present a set of 1800 galaxy mock catalogues designed to match the Dark Energy Survey Year-1 BAO sample (Crocce et al. 2017) in abundance, observational volume, redshift distribution and uncertainty, and redshift-dependent clustering. The simulated samples were built upon HALOGEN (Avila et al. 2015) halo catalogues, based on a 2LPT density field with an exponential bias. For each of them, a lightcone is constructed by the superposition of snapshots in the redshift range 0.45…
Igual, Laura; Soliva, Joan Carles; Escalera, Sergio; Gimeno, Roger; Vilarroya, Oscar; Radeva, Petia
2012-12-01
We present a fully automatic diagnostic imaging test for Attention-Deficit/Hyperactivity Disorder diagnosis assistance based on previously reported evidence of caudate nucleus volumetric abnormalities. The proposed method consists of the following steps: a new automatic method for external and internal segmentation of the caudate based on machine learning methodologies, and the definition of a set of new volume relation features, 3D Dissociated Dipoles, used for caudate representation and classification. We separately validate these contributions using real data from a pediatric population, showing precise internal caudate segmentation and strong discrimination power of the diagnostic test, with significant performance improvements in comparison to other state-of-the-art methods. Copyright © 2012 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanninen, M.F.; O'Donoghue, P.E.; Popelar, C.F.
1993-02-01
The project was undertaken to quantify the Battelle slow crack growth (SCG) test for predicting the long-term performance of polyethylene (PE) gas distribution pipes, and to demonstrate the applicability of the methodology for use by the gas industry in accelerated characterization testing, thereby bringing the SCG test development effort to closure. The work revealed that the Battelle SCG test, and the linear fracture mechanics interpretation it currently utilizes, is valid for a class of PE materials. The long-term performance of these materials under various operating conditions can therefore be effectively predicted.
Multi-physics optimization of three-dimensional microvascular polymeric components
NASA Astrophysics Data System (ADS)
Aragón, Alejandro M.; Saksena, Rajat; Kozola, Brian D.; Geubelle, Philippe H.; Christensen, Kenneth T.; White, Scott R.
2013-01-01
This work discusses the computational design of microvascular polymeric materials, which aim to mimic the behavior found in some living organisms that contain a vascular system. The optimization of the topology of the embedded three-dimensional microvascular network is carried out by coupling a multi-objective constrained genetic algorithm with a finite-element-based physics solver, the latter validated through experiments. The optimization is carried out on multiple conflicting objective functions, namely the void volume fraction left by the network, the energy required to drive the fluid through the network, and the maximum temperature when the material is subjected to thermal loads. The methodology presented in this work provides a viable alternative for the multi-physics optimization of these materials for active-cooling applications.
The Effects of Twitter Sentiment on Stock Price Returns
Ranco, Gabriele; Aleksovski, Darko; Caldarelli, Guido; Grčar, Miha; Mozetič, Igor
2015-01-01
Social media increasingly reflect and influence the behavior of other complex systems. In this paper we investigate the relations between the well-known micro-blogging platform Twitter and financial markets. In particular, we consider, over a period of 15 months, the Twitter volume and sentiment about the 30 stock companies that form the Dow Jones Industrial Average (DJIA) index. We find a relatively low Pearson correlation and Granger causality between the corresponding time series over the entire time period. However, we find a significant dependence between the Twitter sentiment and abnormal returns during the peaks of Twitter volume. This holds not only for the expected Twitter volume peaks (e.g., quarterly announcements), but also for peaks corresponding to less obvious events. We formalize the procedure by adapting the well-known “event study” from economics and finance to the analysis of Twitter data. The procedure automatically identifies events as Twitter volume peaks, computes the prevailing sentiment (positive or negative) expressed in tweets at these peaks, and finally applies the “event study” methodology to relate them to stock returns. We show that the sentiment polarity of Twitter peaks implies the direction of cumulative abnormal returns. The amount of cumulative abnormal returns is relatively low (about 1–2%), but the dependence is statistically significant for several days after the events. PMID:26390434
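A minimal sketch of the adapted event-study computation follows (synthetic data; the market-model estimation window and event window lengths are assumptions, not the paper's exact choices).

```python
# Sketch of the event-study step: abnormal returns are the stock's returns
# minus market-model expectations, cumulated after a Twitter-volume peak.
# All data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(42)
market = rng.normal(0.0, 0.01, 60)                # daily market returns
stock = 0.9 * market + rng.normal(0, 0.005, 60)   # stock tracks the market
stock[40:44] += 0.004                             # drift after a day-40 "peak"

# Market model fit on the pre-event window, then CAR over the event window
alpha, beta = np.polynomial.polynomial.polyfit(market[:40], stock[:40], 1)
abnormal = stock - (alpha + beta * market)
car = abnormal[40:50].cumsum()
print(f"CAR(+10 days): {car[-1]:.4f}")            # ~1-2% in this toy setup
```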
ERIC Educational Resources Information Center
Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).
The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…
Methodological challenges of validating a clinical decision-making tool in the practice environment.
Brennan, Caitlin W; Daly, Barbara J
2015-04-01
Validating a measurement tool intended for use in the practice environment poses challenges that may not be present when validating a tool intended solely for research purposes. The aim of this article is to describe the methodological challenges of validating a clinical decision-making tool, the Oncology Acuity Tool, which nurses use to make nurse assignment and staffing decisions prospectively each shift. Data were derived from a larger validation study, during which several methodological challenges arose. Revisions to the tool, including conducting iterative feedback cycles with end users, were necessary before the validation study was initiated. The "true" value of patient acuity is unknown, and thus, two approaches to inter-rater reliability assessment were used. Discordant perspectives existed between experts and end users. Balancing psychometric rigor with clinical relevance may be achieved through establishing research-practice partnerships, seeking active and continuous feedback with end users, and weighing traditional statistical rules of thumb with practical considerations. © The Author(s) 2014.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Felicione, F. S.
2006-01-23
The potential for generation of gases in transuranic (TRU) waste by microbial activity, chemical interactions, corrosion, and radiolysis was addressed in the Argonne National Laboratory-West (ANL-West) Gas-Generation Experiments (GGE). Data were collected over several years by simulating the conditions in the Waste Isolation Pilot Plant (WIPP) after the eventual intrusion of brine into the repository. Fourteen test containers with various actual TRU waste immersed in representative brine were inoculated with WIPP-relevant microbes, pressurized with inert gases, and kept in an inert-atmosphere environment for several years to provide estimates of the gas-generation rates to be used in computer models for future WIPP Performance Assessments. Modest temperature variations occurred during the long-term ANL-West experiments. Although the experiment temperatures always remained well within specifications, the small temperature variation affected the test container pressure far more than had been anticipated. In fact, the pressure variations were so large, and seemingly erratic, that it was impossible to discern whether the data were even valid and whether the long-term pressure trend was increasing, decreasing, or constant. As a result, no useful estimates of gas-generation rates could be deduced from the pressure data. Several initial attempts were made to quantify the pressure fluctuations by relating them to the measured temperature variation, but none was successful. The work reported here carefully analyzed the pressure measurements to determine whether they were valid or erroneous. It was found that a thorough consideration of the physical phenomena that were occurring can, in conjunction with suitable gas laws, account quite accurately for the observed pressure changes. Failure of the earlier attempts to validate the data was traced to the omission of several phenomena, the most important being the variation in headspace volume caused by thermal expansion and contraction of the brine and waste. A further effort was directed at recovering useful results from the voluminous archived pressure data. An analytic methodology was developed and applied to each archived pressure measurement to nullify temperature and other effects, yielding an adjusted pressure from which gas-generation rates could be calculated. A review of the adjusted-pressure data indicated that generated-gas concentrations among these containers after approximately 3.25 years of test operation ranged from zero to over 17,000 ppm by volume. Four test containers experienced significant gas generation; all of them contained carbon steel in the waste, indicating that corrosion was the predominant source of gas generation.
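A hedged reconstruction of the pressure-adjustment idea (the brine expansion coefficient and volumes below are invented placeholders, not the experiment's values): the measured pressure is normalized to a reference temperature and headspace volume with the ideal gas law, so that only generated gas remains in the long-term trend.

```python
# Hypothetical sketch of normalizing measured pressure for temperature and
# for headspace-volume changes driven by brine thermal expansion.
# All parameter values are illustrative placeholders.
def adjusted_pressure(p_meas_kpa, t_k, v_head_l, t_ref_k=303.15,
                      v_ref_l=2.0, beta_brine=2.1e-4, v_brine_l=8.0):
    """Return pressure normalized to reference temperature and volume."""
    # Headspace volume shrinks as the brine/waste thermally expand:
    v_actual = v_head_l - beta_brine * v_brine_l * (t_k - t_ref_k)
    # Ideal-gas normalization: p_adj = p * (T_ref / T) * (V_actual / V_ref)
    return p_meas_kpa * (t_ref_k / t_k) * (v_actual / v_ref_l)

print(adjusted_pressure(p_meas_kpa=151.2, t_k=305.0, v_head_l=2.0))
```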
Causal Interpretations of Psychological Attributes
ERIC Educational Resources Information Center
Kane, Mike
2017-01-01
In the article "Rethinking Traditional Methods of Survey Validation" Andrew Maul describes a minimalist validation methodology for survey instruments, which he suggests is widely used in some areas of psychology and then critiques this methodology empirically and conceptually. He provides a reduction ad absurdum argument by showing that…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, Paolo; Theiler, C.; Fasoli, A.
A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and, finally, how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
NASA Astrophysics Data System (ADS)
Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB
2017-11-01
Increased demand for Internet of Things (IoT) applications has driven a move towards integrated circuits of ever higher complexity supporting system-on-chip (SoC) designs. This increase in complexity poses correspondingly complicated validation challenges and has led researchers to propose a variety of methodologies to address them, notably dynamic verification, formal verification and hybrid techniques. It is also very important to discover bugs early in the SoC verification process in order to reduce time consumed and achieve a fast time-to-market for the system. This paper therefore focuses on verification methodology at the register transfer level (RTL) of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as an effort towards fast time-to-market. OVM is thus proposed in this paper as the verification method for larger designs, to avert bottlenecks in the validation platform.
Shea, Beverley J; Grimshaw, Jeremy M; Wells, George A; Boers, Maarten; Andersson, Neil; Hamel, Candyce; Porter, Ashley C; Tugwell, Peter; Moher, David; Bouter, Lex M
2007-02-15
Our objective was to develop an instrument to assess the methodological quality of systematic reviews, building upon previous tools, empirical evidence and expert consensus. A 37-item assessment tool was formed by combining 1) the enhanced Overview Quality Assessment Questionnaire (OQAQ), 2) a checklist created by Sacks, and 3) three additional items recently judged to be of methodological importance. This tool was applied to 99 paper-based and 52 electronic systematic reviews. Exploratory factor analysis was used to identify underlying components. The results were considered by methodological experts using a nominal group technique aimed at item reduction and design of an assessment tool with face and content validity. The factor analysis identified 11 components. From each component, one item was selected by the nominal group. The resulting instrument was judged to have face and content validity. A measurement tool for the 'assessment of multiple systematic reviews' (AMSTAR) was developed. The tool consists of 11 items and has good face and content validity for measuring the methodological quality of systematic reviews. Additional studies are needed with a focus on the reproducibility and construct validity of AMSTAR, before strong recommendations can be made on its use.
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.
2009-01-01
Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.
Passenger rail vehicle safety assessment methodology. Volume I, Summary of safe performance limits.
DOT National Transportation Integrated Search
2000-04-01
This report presents a methodology based on computer simulation that assesses the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical design parameters and characteristic properties of bo...
Ratcliffe, Elizabeth; Hourd, Paul; Guijarro-Leach, Juan; Rayment, Erin; Williams, David J; Thomas, Robert J
2013-01-01
Commercial regenerative medicine will require large quantities of clinical-specification human cells. The cost and quality of manufacture is notoriously difficult to control due to highly complex processes with poorly defined tolerances. As a step to overcome this, we aimed to demonstrate the use of 'quality-by-design' tools to define the operating space for economic passage of a scalable human embryonic stem cell production method with minimal cell loss. Design of experiments response surface methodology was applied to generate empirical models to predict optimal operating conditions for a unit of manufacture of a previously developed automatable and scalable human embryonic stem cell production method. Two models were defined to predict cell yield and cell recovery rate postpassage, in terms of the predictor variables of media volume, cell seeding density, media exchange and length of passage. Predicted operating conditions for maximized productivity were successfully validated. Such 'quality-by-design' type approaches to process design and optimization will be essential to reduce the risk of product failure and patient harm, and to build regulatory confidence in cell therapy manufacturing processes.
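As an illustration of the response-surface step (synthetic data; the factor ranges and true coefficients are invented, and only two of the four predictor variables are shown), a quadratic model can be fitted by ordinary least squares.

```python
# Illustrative response-surface fit: a quadratic model predicting cell
# yield from media volume and seeding density. Data are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(7)
vol = rng.uniform(1.0, 3.0, 30)            # media volume (mL), assumed range
dens = rng.uniform(1e4, 1e5, 30) / 1e5     # seeding density, scaled to [0.1, 1]
yield_true = 5 + 2 * vol - 0.4 * vol**2 + 3 * dens - 1.5 * dens**2 + vol * dens
y = yield_true + rng.normal(0, 0.1, 30)    # add measurement noise

# Quadratic design matrix: [1, v, d, v^2, d^2, v*d]
X = np.column_stack([np.ones_like(vol), vol, dens, vol**2, dens**2, vol * dens])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 2))   # should recover ~[5, 2, 3, -0.4, -1.5, 1]
```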
Toward a Digital Thread and Data Package for Metals-Additive Manufacturing.
Kim, D B; Witherell, P; Lu, Y; Feng, S
2017-01-01
Additive manufacturing (AM) has been envisioned by many as a driving factor of the next industrial revolution. Potential benefits of AM adoption include the production of low-volume, customized, complicated parts/products, supply chain efficiencies, shortened time-to-market, and environmental sustainability. Work remains, however, for AM to reach the status of a full production-ready technology. Whereas the ability to create unique 3D geometries has been generally proven, production challenges remain, including lack of (1) data manageability through information management systems, (2) traceability to promote product producibility, process repeatability, and part-to-part reproducibility, and (3) accountability through mature certification and qualification methodologies. To address these challenges in part, this paper discusses the building of data models to support the development of validation and conformance methodologies in AM. We present an AM information map that leverages informatics to facilitate part producibility, process repeatability, and part-to-part reproducibility in an AM process. We present three separate case studies to demonstrate the importance of establishing baseline data structures and part provenance through an AM digital thread.
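A minimal, hypothetical sketch of the kind of baseline data structure such a digital thread implies, with part provenance captured as linked build records; all class and field names are invented for illustration.

```python
# Hypothetical AM provenance data model; names invented for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessRecord:
    machine_id: str
    laser_power_w: float
    layer_thickness_um: float
    timestamp: str

@dataclass
class AMPartProvenance:
    part_id: str
    design_file: str                 # e.g. STL/AMF source
    powder_lot: str                  # material traceability
    process_history: List[ProcessRecord] = field(default_factory=list)
    inspections: List[str] = field(default_factory=list)

part = AMPartProvenance("P-001", "bracket_v3.stl", "LOT-17-442")
part.process_history.append(
    ProcessRecord("EOS-M290-02", 285.0, 40.0, "2017-01-12T09:30:00Z"))
print(part.part_id, len(part.process_history))
```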
Railroad Classification Yard Technology Manual: Volume II : Yard Computer Systems
DOT National Transportation Integrated Search
1981-08-01
This volume (Volume II) of the Railroad Classification Yard Technology Manual documents the railroad classification yard computer systems methodology. The subjects covered are: functional description of process control and inventory computer systems,...
NASA Astrophysics Data System (ADS)
Ward, T.; Fleming, J. S.; Hoffmann, S. M. A.; Kemp, P. M.
2005-11-01
Simulation is useful in the validation of functional image analysis methods, particularly given the number of analysis techniques currently available that lack thorough validation. Problems exist with current simulation methods due to long run times or unrealistic results, making it difficult to generate complete datasets. A method is presented for simulating known abnormalities within normal brain SPECT images using a measured point spread function (PSF) and incorporating a stereotactic atlas of the brain for anatomical positioning. This allows the simulation of realistic images through the use of prior information regarding disease progression. SPECT images of cerebral perfusion have been generated consisting of a control database and a group of simulated abnormal subjects, to be used in a UK audit of analysis methods. The abnormality is defined in the stereotactic space, then transformed to the individual subject space, convolved with a measured PSF and removed from the normal subject image. The dataset was analysed using SPM99 (Wellcome Department of Imaging Neuroscience, University College, London) and the MarsBaR volume of interest (VOI) analysis toolbox. The results were evaluated by comparison with the known ground truth. The analysis showed improvement when using a smoothing kernel equal to the system resolution rather than the slightly larger kernel used routinely. Significant correlation was found between the effective volume of a simulated abnormality and the size detected using SPM99. Improvements in VOI analysis sensitivity were found when using the region median rather than the region mean. The method and dataset provide an efficient methodology for use in the comparison and cross-validation of semi-quantitative analysis methods in brain SPECT, and allow the optimization of analysis parameters.
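Conceptually, the simulation step can be sketched as blurring a defined abnormality map with the measured PSF (approximated here by a Gaussian) and subtracting it from a normal image; the dimensions, uptake values and PSF width below are invented.

```python
# Conceptual sketch of the abnormality-simulation step: blur a defined
# deficit map with a PSF (Gaussian stand-in) and subtract it from a
# normal image. All values are invented placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter

normal = np.full((64, 64, 64), 100.0)       # stand-in normal SPECT volume
abnormality = np.zeros_like(normal)
abnormality[30:36, 30:36, 30:36] = 20.0     # a 20% perfusion deficit

fwhm_vox = 3.0                              # assumed PSF FWHM, in voxels
sigma = fwhm_vox / 2.355                    # FWHM -> Gaussian sigma
simulated = normal - gaussian_filter(abnormality, sigma)
print(simulated.min())                      # reduced uptake in the lesion
```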
Manterola, Carlos; Torres, Rodrigo; Burgos, Luis; Vial, Manuel; Pineda, Viviana
2006-07-01
Surgery is a curative treatment for gastric cancer (GC). As relapse is frequent, adjuvant therapies such as postoperative chemoradiotherapy have been tried. In Chile, some hospitals adopted Macdonald's study as a protocol for the treatment of GC. Our aim was to determine the methodological quality and the internal and external validity of the Macdonald study. Three instruments that assess methodological quality were applied. A critical appraisal was performed, and the internal and external validity of the methodological quality was analyzed with two scales: MINCIR (Methodology and Research in Surgery), valid for therapy studies, and CONSORT (Consolidated Standards of Reporting Trials), valid for randomized controlled trials (RCT). Guides and scales were applied by 5 researchers with training in clinical epidemiology. The reader's guide verified that the Macdonald study was not directed at answering a clearly defined question. There was random assignment, but the method used is not described and the patients were not followed until the end of the study (36% of the group with surgery plus chemoradiotherapy did not complete treatment). The MINCIR scale identified a multicentric RCT, not blinded, with an unclear randomization sequence, erroneous sample size estimation, vague objectives and no exclusion criteria. The CONSORT system demonstrated the lack of a working hypothesis and specific objectives, the absence of exclusion criteria and of identification of the primary variable, an imprecise estimation of sample size, ambiguities in the randomization process, no blinding, an absence of statistical adjustment and the omission of a subgroup analysis. The instruments applied demonstrated methodological shortcomings that compromise the internal and external validity of the Macdonald study.
Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.
The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighter's cognitive state and prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, which is the extent to which the results apply to operational contexts, and internal validity, which reflects the reliability of performance measures and the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate the operational significance of mitigations; thus it is important to demonstrate the effectiveness of mitigations under specific conditions. This chapter reviews cognitive science and methodological considerations in designing augmented cognition research studies and associated human performance metrics and analysis methods to assess the impact of augmented cognition mitigations.
Collecting and validating experiential expertise is doable but poses methodological challenges.
Burda, Marika H F; van den Akker, Marjan; van der Horst, Frans; Lemmens, Paul; Knottnerus, J André
2016-04-01
To give an overview of important methodological challenges in collecting, validating, and further processing experiential expertise, and how to address these challenges. Based on our own experiences in studying the concept, operationalization, and contents of experiential expertise, we formulated methodological issues regarding the inventory and application of experiential expertise. The methodological challenges can be categorized into six developmental research stages: the conceptualization of experiential expertise, methods to harvest experiential expertise, the validation of experiential expertise, evaluation of effectiveness, translation of experiential expertise into acceptable guidelines, and implementation of those guidelines. The description of the methodological challenges and ways to handle them is illustrated using diabetes mellitus as an example. Experiential expertise can be defined and operationalized in terms of successful illness-related behaviors and translated into recommendations regarding life domains. Pathways have been identified to bridge the gaps between the world of patients' daily lives and the medical world.
Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005
ERIC Educational Resources Information Center
Coffman, Julia, Ed.
2005-01-01
This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…
Micro-Raman Technology to Interrogate Two-Phase Extraction on a Microfluidic Device.
Nelson, Gilbert L; Asmussen, Susan E; Lines, Amanda M; Casella, Amanda J; Bottenus, Danny R; Clark, Sue B; Bryan, Samuel A
2018-05-21
Microfluidic devices provide ideal environments to study solvent extraction. When droplets form and generate plug flow down the microfluidic channel, the device acts as a microreactor in which the kinetics of chemical reactions and interfacial transfer can be examined. Here, we present a methodology that combines chemometric analysis with online micro-Raman spectroscopy to monitor biphasic extractions within a microfluidic device. Among the many benefits of microreactors is the ability to maintain small sample volumes, which is especially important when studying solvent extraction in harsh environments, such as in separations related to the nuclear fuel cycle. In solvent extraction, the efficiency of the process depends on complex formation and rates of transfer in biphasic systems. Thus, it is important to understand the kinetic parameters of an extraction system to maintain high efficiency and effectiveness of the process. The Raman monitoring provided concentration measurements in both organic and aqueous plugs as they were pumped through the microfluidic channel. The biphasic system studied comprised HNO3 as the aqueous phase and 30% (v/v) tributyl phosphate in n-dodecane as the organic phase, simulating the plutonium uranium reduction extraction (PUREX) process. Using pre-equilibrated solutions (post extraction), the validity of the technique and methodology is illustrated. Following this validation, solutions that were not equilibrated were examined and the kinetics of interfacial mass transfer within the biphasic system were established. Kinetic results of extraction were compared to kinetics already determined on a macro scale to demonstrate the efficacy of the technique.
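One way such plug-by-plug concentration data might be reduced to a transfer-rate constant is a first-order approach-to-equilibrium fit; the model choice, residence times, and concentrations below are assumptions for illustration, not values from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def approach_to_equilibrium(t, c_eq, k):
    # assumed first-order interfacial transfer: C(t) = C_eq * (1 - exp(-k t))
    return c_eq * (1.0 - np.exp(-k * t))

# hypothetical residence times (s) and organic-phase concentrations (mol/L)
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
c = np.array([0.9, 1.6, 2.6, 3.5, 4.0, 4.1])

(c_eq, k), _ = curve_fit(approach_to_equilibrium, t, c, p0=(4.0, 0.5))
print(f"C_eq = {c_eq:.2f} mol/L, k = {k:.2f} 1/s")
```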
Risk-based Methodology for Validation of Pharmaceutical Batch Processes.
Wiles, Frederick
2013-01-01
In January 2011, the U.S. Food and Drug Administration published new process validation guidance for pharmaceutical processes. The new guidance debunks the long-held industry notion that three consecutive validation batches or runs are all that are required to demonstrate that a process is operating in a validated state. Instead, the new guidance emphasizes that the level of monitoring and testing performed during process performance qualification (PPQ) studies must be sufficient to demonstrate statistical confidence both within and between batches. In some cases, three qualification runs may not be enough. Nearly two years after the guidance was first published, little has been written defining a statistical methodology for determining the number of samples and qualification runs required to satisfy Stage 2 requirements of the new guidance. This article proposes using a combination of risk assessment, control charting, and capability statistics to define the monitoring and testing scheme required to show that a pharmaceutical batch process is operating in a validated state. In this methodology, an assessment of process risk is performed through application of a process failure mode, effects, and criticality analysis (PFMECA). The output of PFMECA is used to select appropriate levels of statistical confidence and coverage which, in turn, are used in capability calculations to determine when significant Stage 2 (PPQ) milestones have been met. The achievement of Stage 2 milestones signals the release of batches for commercial distribution and the reduction of monitoring and testing to commercial production levels. Individuals, moving range, and range/sigma charts are used in conjunction with capability statistics to demonstrate that the commercial process is operating in a state of statistical control. In summary, the new guidance indicates that the number of process validation batches or runs required should be based on sound statistical principles; the old rule of "three consecutive batches and you're done" is no longer sufficient. Because the guidance does not provide a specific methodology for determining the number of runs required, and little has been published to address this shortcoming, the methodology proposed here offers a statistically sound way of determining when a valid number of validation runs has been acquired, based on risk assessment and calculation of process capability.
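The confidence/coverage logic can be made concrete with a tolerance-interval check. Below is a minimal sketch using Howe's approximation to the two-sided normal tolerance factor; the assay values, specification limits, and the 95%/99% confidence/coverage pair stand in for whatever levels a PFMECA would actually select, and none of it is the article's own calculation.

```python
import numpy as np
from scipy.stats import norm, chi2

def k_two_sided(n, coverage=0.99, confidence=0.95):
    # Howe's approximation to the two-sided normal tolerance factor
    z = norm.ppf(0.5 * (1.0 + coverage))
    nu = n - 1
    return z * np.sqrt(nu * (1.0 + 1.0 / n) / chi2.ppf(1.0 - confidence, nu))

# hypothetical PPQ assay results (% label claim) and specification limits
x = np.array([99.2, 100.1, 99.8, 100.4, 99.5, 100.0, 99.9, 100.3,
              99.7, 100.2, 99.6, 100.1, 99.9, 100.0, 99.8])
lsl, usl = 97.0, 103.0

k = k_two_sided(len(x))
lo, hi = x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)
print(f"k = {k:.2f}; tolerance interval ({lo:.2f}, {hi:.2f}); "
      f"within spec: {lsl <= lo and hi <= usl}")
```

If the tolerance interval sits inside the specification limits at the chosen confidence and coverage, the corresponding Stage 2 milestone is demonstrated under this sketch's assumptions.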
Construct Validity: Advances in Theory and Methodology
Strauss, Milton E.; Smith, Gregory T.
2008-01-01
Measures of psychological constructs are validated by testing whether they relate to measures of other constructs as specified by theory. Each test of relations between measures reflects on the validity of both the measures and the theory driving the test. Construct validation concerns the simultaneous process of measure and theory validation. In this chapter, we review the recent history of validation efforts in clinical psychological science that has led to this perspective, and we review five recent advances in validation theory and methodology of importance for clinical researchers. These are: the emergence of nonjustificationist philosophy of science; an increasing appreciation for theory and the need for informative tests of construct validity; valid construct representation in experimental psychopathology; the need to avoid representing multidimensional constructs with a single score; and the emergence of effective new statistical tools for the evaluation of convergent and discriminant validity. PMID:19086835
A compressible Navier-Stokes solver with two-equation and Reynolds stress turbulence closure models
NASA Technical Reports Server (NTRS)
Morrison, Joseph H.
1992-01-01
This report outlines the development of a general-purpose aerodynamic solver for compressible turbulent flows. Turbulence closure is achieved using either two-equation or Reynolds stress transport equations. The applicable equation set consists of Favre-averaged conservation equations for mass, momentum and total energy, and transport equations for the turbulent stresses and turbulent dissipation rate. In order to develop a scheme with good shock capturing capabilities, good accuracy and general geometric capabilities, a multi-block cell-centered finite volume approach is used. Viscous fluxes are discretized using a finite volume representation of a central difference operator, and the source terms are treated as an integral over the control volume. The methodology is validated by testing the algorithm on both two- and three-dimensional flows. Both the two-equation and Reynolds stress models are used on a two-dimensional 10 degree compression ramp at Mach 3, and the two-equation model is used on the three-dimensional flow over a cone at angle of attack at Mach 3.5. With the development of this algorithm, it is now possible to compute complex, compressible high-speed flow fields using both two-equation and Reynolds stress turbulence closure models, with the capability of eventually evaluating their predictive performance.
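To make the flux treatment concrete, here is a minimal 1-D finite-volume analogue of the central-difference viscous flux and volume-integrated source described above; it illustrates the discretization idea only and is not code from the NASA solver.

```python
import numpy as np

# Steady 1-D diffusion: d/dx(mu du/dx) + S = 0, u = 0 at both walls.
# Each cell balances central-difference face fluxes against the source
# integrated over its control volume.
n, L, mu = 50, 1.0, 0.1
dx = L / n
S = np.ones(n)                 # source per unit volume
A = np.zeros((n, n))
b = -S * dx                    # volume-integrated source moved to the RHS
for i in range(n):
    for nb in (i - 1, i + 1):
        if 0 <= nb < n:        # interior face: flux mu*(u_nb - u_i)/dx
            A[i, i] -= mu / dx
            A[i, nb] += mu / dx
        else:                  # wall face at half-cell distance, u_wall = 0
            A[i, i] -= 2.0 * mu / dx
u = np.linalg.solve(A, b)
print(f"max u = {u.max():.4f} (exact S*L^2/(8*mu) = {1.0 * L**2 / (8 * mu):.4f})")
```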
Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines.
Ma, Ping; Lien, Fue-Sang; Yee, Eugene
2017-01-01
This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz.
1977-04-01
AFFDL-TR-77-7, Volume III: Validation of MIL-F-9490D, General Specification for Flight Control Systems for Piloted Military Aircraft. ... Validation of specification MIL-F-9490D (USAF), "Flight Control Systems - Design, Installation and Test of Piloted Aircraft, General Specification for," dated 6 June 1975, by ...
de Souza, Vanessa; Zeitoun, Sandra Salloum; Lopes, Camila Takao; de Oliveira, Ana Paula Dias; Lopes, Juliana de Lima; de Barros, Alba Lucia Botura Leite
2014-06-01
To consensually validate the operational definitions of the nursing diagnoses activity intolerance, excessive fluid volume, and decreased cardiac output in patients with decompensated heart failure. Consensual validation was performed in two stages: analogy by similarity of defining characteristics, and development of operational definitions and validation with experts. A total of 38 defining characteristics were found. Operational definitions were developed and content-validated. One hundred percent agreement was achieved among the seven experts after five rounds. "Ascites" was added to the nursing diagnosis excessive fluid volume. The consensual validation improves interpretation of the human response, grounding the selection of nursing interventions and contributing to improved nursing outcomes. These definitions support the assessment of patients with decompensated heart failure.
Preliminary Validation of Composite Material Constitutive Characterization
John G. Michopoulos; Athanasios Iliopoulos; John C. Hermanson; Adrian C. Orifici; Rodney S. Thomson
2012-01-01
This paper describes the preliminary results of an effort to validate a methodology developed for composite material constitutive characterization. This methodology involves using massive amounts of data produced from multiaxially tested coupons via a 6-DoF robotic system called NRL66.3, developed at the Naval Research Laboratory. The testing is followed by...
Pérez-Beteta, J.; Molina, D.; Martínez-González, A.; Arregui, E.; Asenjo, B.; Iglesias, L.; Martino, J.; Pérez-Romasanta, L.; Arana, E.; Pérez-García, V. M.
2017-01-01
Abstract Introduction: Glioblastoma is the most frequent and lethal malignant brain tumor in adults. Preoperative magnetic resonance imaging is routinely used for diagnosis and treatment planning. One of the gold standards in imaging is the post-contrast T1-weighted magnetic resonance image (MRI), used to define the macroscopic part of the tumor. A modern mathematical model has predicted a relation between the tumor's growth speed, and hence patient outcome, and the width of contrast-enhancing areas measured on the pre-surgery post-contrast 3D T1-weighted images. Other works have hypothesized a role for the tumor's surface as a driver of growth and infiltration in cancer. Materials and Methods: A retrospective study involving 7 hospitals was organized for the validation of the model predictions. Inclusion criteria were unifocal tumors and availability of 3D T1w MRIs and clinical data (age, survival, type of treatment, etc.). A total of 219 patients were included in the study. Tumors were manually segmented and thirty quantitative 3D geometrical variables were computed, including volumes (total tumor, contrast-enhancing and necrotic), surfaces, volume ratios, 3D maximal diameter and several measures of the contrast-enhancing 'rim'. A novel measure was defined accounting for how much the tumor surface, as measured on the post-contrast T1w MRIs, deviates from a sphere of the same volume. Kaplan-Meier and Cox regression methods were used to validate the relevance of the different features. Results: Patients with a small contrast-enhancing width (< 3.7 mm) showed a significant improvement in survival (p = 0.002, difference in medians = 191 days, HR = 1.701). The tumor surface irregularity turned out to be a very powerful predictor of survival (p = 0.004, difference in medians = 104 days, HR = 1.516). Both parameters were significant in multivariate analysis together with age (p = 0.007 for surface irregularity and p = 0.009 for rim width). No other parameters, including volumes, volume ratios, surfaces, maximal diameters and other features, were associated with survival in either univariate or multivariate analyses. Conclusions: The width of the contrast-enhancing rim and a novel surface irregularity parameter were predictors of survival in a large cohort of GBM patients. Since the study methodology was very robust (high-resolution MRIs, tumor segmentation methodology) and the number of geometrically meaningful parameters large in comparison with previous studies, this study may shed some light on the long-debated topic of the role of geometrical measures in GBM patients' survival. FUNDING: James S. McDonnell Foundation (USA) 21st Century Science Initiative in Mathematical and Complex Systems Approaches for Brain Cancer [Collaborative award 220020450 and planning grant 220020420], MINECO/FEDER [MTM2015-71200-R], JCCM [PEII-2014-031-P].
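A minimal sketch of the survival comparison described above, using the lifelines package; the 3.7 mm dichotomization threshold follows the abstract, but the cohort table below is entirely hypothetical.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# hypothetical cohort: survival (days), death observed, CE rim width (mm)
df = pd.DataFrame({
    "days":  [310, 455, 120, 600, 380, 210, 520, 290, 700, 150],
    "event": [1,   1,   1,   0,   1,   1,   0,   1,   0,   1],
    "rim":   [2.1, 3.0, 5.2, 2.8, 4.4, 6.0, 3.1, 4.9, 2.5, 5.5],
})
thin = df["rim"] < 3.7          # dichotomize at the 3.7 mm threshold

km = KaplanMeierFitter()
for name, grp in (("rim < 3.7 mm", df[thin]), ("rim >= 3.7 mm", df[~thin])):
    km.fit(grp["days"], grp["event"], label=name)
    print(name, "median survival:", km.median_survival_time_)

res = logrank_test(df.days[thin], df.days[~thin], df.event[thin], df.event[~thin])
print("log-rank p =", res.p_value)
```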
A semi-automatic method for left ventricle volume estimate: an in vivo validation study
NASA Technical Reports Server (NTRS)
Corsi, C.; Lamberti, C.; Sarti, A.; Saracino, G.; Shiota, T.; Thomas, J. D.
2001-01-01
This study aims at validating the left ventricular (LV) volume estimates obtained by processing volumetric data with a segmentation model based on the level set technique. The validation was performed by comparing real-time volumetric echo data (RT3DE) and magnetic resonance (MRI) data. A validation protocol was defined and applied to twenty-four estimates (range 61-467 ml) obtained from normal and pathologic subjects who underwent both RT3DE and MRI. A statistical analysis was performed on each estimate and on clinical parameters such as stroke volume (SV) and ejection fraction (EF). Taking the MRI estimates (x) as a reference, an excellent correlation was found with the volume measured using the segmentation procedure (y) (y=0.89x + 13.78, r=0.98). The mean error on SV was 8 ml and the mean error on EF was 2%. This study demonstrated that the segmentation technique is reliably applicable to human hearts in clinical practice.
LES of Swirling Reacting Flows via the Unstructured scalar-FDF Solver
NASA Astrophysics Data System (ADS)
Ansari, Naseem; Pisciuneri, Patrick; Strakey, Peter; Givi, Peyman
2011-11-01
Swirling flames pose a significant challenge for computational modeling due to the presence of recirculation regions and vortex shedding. In this work, results are presented from LES of two swirl-stabilized non-premixed flames (SM1 and SM2) via the FDF methodology. These flames are part of the database for validation of turbulent-combustion models. The scalar-FDF is simulated on a domain discretized by unstructured meshes and is coupled with a finite volume flow solver. In the SM1 flame (with a low swirl number), chemistry is described by the flamelet model based on the full GRI 2.11 mechanism. The SM2 flame (with a high swirl number) is simulated via a 46-step, 17-species mechanism. The simulated results are assessed via comparison with experimental data.
Digital Microfluidics for Nucleic Acid Amplification
Veigas, Bruno; Fortunato, Elvira; Martins, Rodrigo; Águas, Hugo; Igreja, Rui; Baptista, Pedro V.
2017-01-01
Digital Microfluidics (DMF) has emerged as a disruptive methodology for the control and manipulation of low volume droplets. In DMF, each droplet acts as a single reactor, which allows for extensive multiparallelization of biological and chemical reactions at a much smaller scale. DMF devices open entirely new and promising pathways for multiplex analysis and reaction occurring in a miniaturized format, thus allowing for healthcare decentralization from major laboratories to point-of-care with accurate, robust and inexpensive molecular diagnostics. Here, we shall focus on DMF platforms specifically designed for nucleic acid amplification, which is key for molecular diagnostics of several diseases and conditions, from pathogen identification to cancer mutations detection. Particular attention will be given to the device architecture, materials and nucleic acid amplification applications in validated settings. PMID:28672827
Serra, H; Nogueira, J M F
2005-11-11
In the present contribution, a new automated on-line hydride generation methodology was developed for dibutyltin and tributyltin speciation at the trace level, using a programmable temperature-vaporizing inlet followed by capillary gas chromatography coupled to mass spectrometry in the selected ion-monitoring acquisition mode (PTV-GC/MS(SIM)). The methodology involves a sequence defined by two running methods, the first configured for hydride generation with sodium tetrahydroborate as derivatising agent and the second configured for speciation purposes, using a conventional autosampler and data acquisition controlled by the instrument's software. The method-development experiments established that the injector configuration has a great effect on the speciation performance of the methodology, particularly the initial inlet temperature (-20 degrees C; He: 150 ml/min), injection volume (2 microl) and solvent characteristics in the solvent venting mode. Under optimized conditions, remarkable instrumental performance was obtained for dibutyltin and tributyltin, respectively, including very good precision (RSD < 4%), an excellent linear dynamic range (up to 50 microg/ml) and limits of detection of 0.12 microg/ml and 9 ng/ml. The feasibility of the present methodology was validated through assays on in-house spiked water (2 ng/ml) and a certified reference sediment matrix (Community Bureau of Reference, CRM 462, Nr. 330; dibutyltin: 68+/-12 ng/g; tributyltin: 54+/-15 ng/g on a dry mass basis), using liquid-liquid extraction (LLE) and solid-phase extraction (SPE) sample enrichment and multiple injections (2 x 5 microl) for sensitivity enhancement. The methodology proved highly reproducible, easy to work up and sensitive, and is a suitable alternative to the currently dedicated analytical systems for organotin speciation in environmental matrices at the trace level.
Longo, Umile Giuseppe; Saris, Daniël; Poolman, Rudolf W; Berton, Alessandra; Denaro, Vincenzo
2012-10-01
The aims of this study were to obtain an overview of the methodological quality of studies on the measurement properties of rotator cuff questionnaires and to describe how well various aspects of the design and statistical analyses of studies on measurement properties are performed. A systematic review of published studies on the measurement properties of rotator cuff questionnaires was performed. Two investigators independently rated the quality of the studies using the Consensus-based Standards for the selection of health Measurement Instruments checklist. This checklist was developed in an international Delphi consensus study. Sixteen studies were included, in which two measurement instruments were evaluated, namely the Western Ontario Rotator Cuff Index and the Rotator Cuff Quality-of-Life Measure. The methodological quality of the included studies was adequate for some properties (construct validity, reliability, responsiveness, internal consistency, and translation) but needs to be improved for other aspects. The most important methodological aspects that need to be developed are measurement error, content validity, structural validity, cross-cultural validity, criterion validity, and interpretability. Considering the importance of adequate measurement properties, it is concluded that, in the field of rotator cuff pathology, there is room for improvement in the methodological quality of studies on measurement properties. Level of evidence: II.
Density matters: Review of approaches to setting organism-based ballast water discharge standards
Lee II; Frazier; Ruiz
2010-01-01
As part of their effort to develop national ballast water discharge standards under NPDES permitting, the Office of Water requested that WED scientists identify and review existing approaches to generating organism-based discharge standards for ballast water. Six potential approaches were identified, and the utility and uncertainties of each approach were evaluated. During the review of the existing approaches, the WED scientists, in conjunction with scientists at the USGS and Smithsonian Institution, developed a new approach (per capita invasion probability, or "PCIP") that addresses many of the limitations of the previous methodologies. The PCIP approach allows risk managers to generate quantitative discharge standards using historical invasion rates, ballast water discharge volumes, and ballast water organism concentrations. The statistical power of sampling ballast water with the existing methods is limited, both for the validation of ballast water treatment systems and for ship-board compliance monitoring, though it should be possible to obtain sufficient samples during treatment validation. The report will go to a National Academy of Sciences expert panel that will use it in their evaluation of approaches to developing ballast water discharge standards for the Office of Water.
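The PCIP arithmetic reduces to two divisions, sketched below; the function names and every number are hypothetical stand-ins, not values from the report.

```python
def pcip(invasions_per_year, discharge_m3_per_year, organisms_per_m3):
    """Per capita invasion probability estimated from the historical record."""
    return invasions_per_year / (discharge_m3_per_year * organisms_per_m3)

def allowable_concentration(target_invasions_per_year, p, discharge_m3_per_year):
    """Organism-based discharge standard meeting a target invasion rate."""
    return target_invasions_per_year / (p * discharge_m3_per_year)

p = pcip(0.5, 5.0e7, 1.0e4)                      # hypothetical history
print(allowable_concentration(0.01, p, 5.0e7))   # organisms per m3
```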
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The final report for the project comprises five volumes. This volume presents the study conclusions, summarizes the methodology used (more detail is found in Volume 3), discusses four case study applications of the model, and contains profiles of coastal communities in an Appendix.
Sagen, Ase; Kåresen, Rolf; Skaane, Per; Risberg, May Arna
2009-05-01
To evaluate the concurrent and construct validity of the Simplified Water Displacement Instrument (SWDI), an instrument for measuring arm volumes and arm lymphedema resulting from breast cancer surgery. Design: validity study. Setting: hospital. Participants: women (N=23; mean age, 64+/-11 y) examined 6 years after breast cancer surgery with axillary node dissection. Interventions: not applicable. The SWDI was used to measure arm volumes to estimate arm lymphedema resulting from breast cancer surgery. A computed tomography (CT) scan was included to examine the cross-sectional areas (CSAs) in square millimeters of the subcutaneous tissue and the muscle tissue, and to measure tissue density in Hounsfield units. Magnetic resonance imaging (MRI) with T2-weighted sequences was included to show increased signal intensity in subcutaneous and muscle tissue areas. The affected arm volume measured by the SWDI was significantly correlated with the total CSA of the affected upper limb (R=.904) and also with the CSAs of the subcutaneous tissue and muscle tissue (R=.867 and R=.725, respectively; P<.001). The CSA of the subcutaneous tissue of the upper limb was significantly larger than that of the control limb (11%). Tissue density measured in Hounsfield units did not correlate significantly with arm volume (P>.05). The affected arm volume was significantly larger (5%) than the control arm volume (P<.05). Five (22%) women had arm lymphedema, defined as a 10% increase in the affected arm volume compared with the control arm volume, and increased signal intensity was identified in all 5 women on MRI (T2-weighted, kappa=.777, P<.001). The SWDI showed high concurrent and construct validity, as shown by the significant correlations between the CSA (CT) of the subcutaneous and muscle areas of the affected limb and the affected arm volume (P<.001). There was high agreement between the subjects diagnosed with arm lymphedema using the SWDI and increased signal intensity on MRI, with a kappa value of .777 (P<.001). High construct validity of the SWDI was confirmed for arm lymphedema as a volume increase, but not for lymphedema without an increase in arm volume (swelling). The SWDI is a simple and valid tool for estimating arm volume and arm lymphedema after breast cancer surgery.
Jiang, Zheng; Wang, Hong; Wu, Qi-nan
2015-06-01
To optimize the process of polysaccharide extraction from Spirodela polyrrhiza, five factors related to the extraction rate of polysaccharide were screened by a Plackett-Burman design. Based on this screening, three factors, namely alcohol volume fraction, extraction temperature and ratio of material to liquid, were selected as investigation factors for Box-Behnken response surface methodology. The order of the effects of the three factors on the extraction rate of polysaccharide from Spirodela polyrrhiza was: extraction temperature, alcohol volume fraction, ratio of material to liquid. According to the Box-Behnken response surface, the best extraction conditions were: alcohol volume fraction of 81%, ratio of material to liquid of 1:42, extraction temperature of 100 degrees C, and four extractions of 60 min each. The combination of Plackett-Burman design and Box-Behnken response surface methodology used to optimize the extraction process for the polysaccharide in this study is effective and stable.
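The response-surface step amounts to an ordinary least-squares fit of a full quadratic model on coded factor levels. The sketch below uses the standard 13-run, 3-factor Box-Behnken design; the response values are hypothetical, not the study's measurements.

```python
import numpy as np

# 3-factor Box-Behnken design in coded units (-1, 0, +1), one center point
X = np.array([[-1, -1,  0], [ 1, -1,  0], [-1,  1,  0], [ 1,  1,  0],
              [-1,  0, -1], [ 1,  0, -1], [-1,  0,  1], [ 1,  0,  1],
              [ 0, -1, -1], [ 0,  1, -1], [ 0, -1,  1], [ 0,  1,  1],
              [ 0,  0,  0]])
y = np.array([6.1, 6.8, 7.0, 7.9, 5.9, 6.5, 7.2, 8.1,
              6.3, 7.4, 7.1, 8.0, 8.4])   # hypothetical extraction rates (%)

def quad_terms(X):
    # intercept, linear, two-factor interaction, and pure quadratic terms
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)

# locate the optimum on a grid inside the coded region
g = np.linspace(-1, 1, 41)
G = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
pred = quad_terms(G) @ beta
print("optimum (coded units):", G[pred.argmax()])
```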
Virupaksha, Harve Shanmugam; Kalmady, Sunil V.; Shivakumar, Venkataram; Arasappa, Rashmi; Venkatasubramanian, Ganesan; Gangadhar, Bangalore N.
2012-01-01
Context: Insula, which is a vital brain region for self-awareness, empathy, and sensory stimuli processing, is critically implicated in schizophrenia pathogenesis. Existing studies on insula volume abnormalities report inconsistent findings potentially due to the evaluation of ‘antipsychotic-treated’ schizophrenia patients as well as suboptimal methodology. Aim: To understand the role of insula in schizophrenia. Materials and Methods: In this first-time 3-T magnetic resonance imaging study, we examined antipsychotic-naive schizophrenic patients (N=30) and age-, sex-, handedness- and education-matched healthy controls (N=28). Positive and negative symptoms were scored with good interrater reliability (intraclass correlation coefficient (ICC)>0.9) by using the scales for negative and positive symptoms. Gray matter volume of insula and its anterior/posterior subregions were measured by using a three-dimensional, interactive, semiautomated software based on the valid method with good interrater reliability (ICC>0.85). Intracranial volume was automatically measured by using the FreeSurfer software. Results: Patients had significantly deficient gray matter volumes of left (F=33.4; P<0.00001) and right (F=11.9; P=0.001) insula after controlling for the effects of age, sex, and intracranial volume. Patients with predominantly negative symptoms had a significantly deficient right posterior insula volume than those with predominantly positive symptoms (F=6.3; P=0.02). Asymmetry index analysis revealed anterior insular asymmetry to be significantly reversed (right>left) in male patients in comparison with male controls (left>right) (t=2.7; P=0.01). Conclusions: Robust insular volume deficits in antipsychotic-naive schizophrenia support intrinsic role for insula in pathogenesis of this disorder. The first-time demonstration of a relationship between right posterior insular deficit and negative symptoms is in tune with the background neurobiological literature. Another novel observation of sex-specific anterior insular asymmetry reversal in patients supports evolutionary postulates of schizophrenia pathogenesis. PMID:23162188
Volume, conservation and instruction: A classroom-based Solomon four group study of conflict
NASA Astrophysics Data System (ADS)
Rowell, J. A.; Dawson, C. J.
The research reported is an attempt to widen the applicability of Piagetian theory-based conflict methodology from individual situations to whole classes. A Solomon four group experimental design augmented by a delayed posttest, was used to provide a controlled framework for studying the effects of conflict instruction on Grade 8 students' ability to conserve volume of noncompressible matter, and to apply that knowledge to gas volume. The results, reported for individuals and groups, show the methodology can be effective, particularly when instruction is preceded by a pretest. Immediate posttest differences in knowledge of gas volume between spontaneous (pretest) conservers and instructed conservers of volume of noncompressible matter were no longer in evidence on the delayed posttest. This observation together with the effects of pretesting and of the instructional sequence are shown to have a consistent Piagetian interpretation. Practical implications are discussed.
Discrete and continuous dynamics modeling of a mass moving on a flexible structure
NASA Technical Reports Server (NTRS)
Herman, Deborah Ann
1992-01-01
A general discrete methodology for modeling the dynamics of a mass that moves on the surface of a flexible structure is developed. This problem was motivated by the Space Station/Mobile Transporter system. A model reduction approach is developed to make the methodology applicable to large structural systems. To validate the discrete methodology, continuous formulations are also developed. Three different systems are examined: (1) simply-supported beam, (2) free-free beam, and (3) free-free beam with two points of contact between the mass and the flexible beam. In addition to validating the methodology, parametric studies were performed to examine how the system's physical properties affect its dynamics.
Evaluating performance of stormwater sampling approaches using a dynamic watershed model.
Ackerman, Drew; Stein, Eric D; Ritter, Kerry J
2011-09-01
Accurate quantification of stormwater pollutant levels is essential for estimating overall contaminant discharge to receiving waters. Numerous sampling approaches exist that attempt to balance accuracy against the costs associated with the sampling method. This study employs a novel and practical approach to evaluating the accuracy of different stormwater monitoring methodologies, using stormflows and constituent concentrations produced by a fully validated continuous-simulation watershed model. A major advantage of using a watershed model to simulate pollutant concentrations is that a large number of storms representing a broad range of conditions can be applied in testing the various sampling approaches. Seventy-eight distinct methodologies were evaluated by "virtual samplings" of 166 simulated storms of varying size, intensity and duration, representing 14 years of storms in Ballona Creek near Los Angeles, California. The 78 methods can be grouped into four general strategies: volume-paced compositing, time-paced compositing, pollutograph sampling, and microsampling. The performance of each sampling strategy was evaluated by comparing (1) the median relative error between the virtually sampled and the true modeled event mean concentration (EMC) of each storm (accuracy), (2) the median absolute deviation about the median ("MAD") of the relative error (precision), and (3) the percentage of storms where sampling methods were within 10% of the true EMC (a combined measure of accuracy and precision). Finally, costs associated with site setup, sampling, and laboratory analysis were estimated for each method. Pollutograph sampling consistently outperformed the other three methods both in terms of accuracy and precision, but was the most costly method evaluated. Time-paced sampling consistently underestimated, while volume-paced sampling overestimated, the storm EMCs. Microsampling performance approached that of pollutograph sampling at a substantial cost savings. The most efficient method for routine stormwater monitoring, in terms of a balance between performance and cost, was volume-paced microsampling with variable sample pacing to ensure that the entirety of the storm was captured. Pollutograph sampling is recommended if the data are to be used for detailed analysis of runoff dynamics.
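The core "virtual sampling" comparison can be sketched as follows: compute the true flow-weighted EMC from a full pollutograph, then compare it with time-paced and volume-paced composites. The synthetic storm below stands in for the watershed-model output, and the pacing schemes are simplified versions of the strategies named above.

```python
import numpy as np

# synthetic storm: time (h), flow (m3/s), concentration (mg/L)
t = np.linspace(0.0, 12.0, 721)
q = 10.0 * np.exp(-0.5 * ((t - 3.0) / 1.2) ** 2) + 0.1
c = 50.0 * np.exp(-t / 4.0) + 5.0            # first-flush style decay

def true_emc(t, q, c):
    # flow-weighted event mean concentration from the full pollutograph
    return np.trapz(c * q, t) / np.trapz(q, t)

def time_paced_emc(t, q, c, n=12):
    idx = np.linspace(0, len(t) - 1, n).astype(int)   # samples at fixed times
    return c[idx].mean()

def volume_paced_emc(t, q, c, n=12):
    # samples triggered each time an equal increment of runoff volume passes
    v = np.concatenate([[0.0], np.cumsum(0.5 * (q[1:] + q[:-1]) * np.diff(t))])
    idx = np.searchsorted(v, np.linspace(0.0, v[-1], n + 2)[1:-1])
    return c[idx].mean()

emc = true_emc(t, q, c)
for name, est in (("time-paced", time_paced_emc(t, q, c)),
                  ("volume-paced", volume_paced_emc(t, q, c))):
    print(f"{name}: relative error {100.0 * (est - emc) / emc:+.1f}%")
```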
ERIC Educational Resources Information Center
Genovesi, Giovanni, Ed.
This collection, the last of four volumes on the history of compulsory education among the nations of Europe and the western hemisphere, analyzes statistics, methodology, reforms, and new tendencies. Twelve of the document's 18 articles are written in English, 3 are written in French and 3 are in Italian. Summaries accompany most articles; three…
A standardised protocol for the validation of banking methodologies for arterial allografts.
Lomas, R J; Dodd, P D F; Rooney, P; Pegg, D E; Hogg, P A; Eagle, M E; Bennett, K E; Clarkson, A; Kearney, J N
2013-09-01
The objective of this study was to design and test a protocol for the validation of banking methodologies for arterial allografts. A series of in vitro biomechanical and biological assessments were derived, and applied to paired fresh and banked femoral arteries. The ultimate tensile stress and strain, suture pullout stress and strain, expansion/rupture under hydrostatic pressure, histological structure and biocompatibility properties of disinfected and cryopreserved femoral arteries were compared to those of fresh controls. No significant differences were detected in any of the test criteria. This validation protocol provides an effective means of testing and validating banking protocols for arterial allografts.
2011-09-01
a quality evaluation with limited data, a model-based assessment must be... that affect system performance, a multistage approach to system validation, a modeling and experimental methodology for efficiently addressing a wide range...
Methodological Validation of Quality of Life Questionnaire for Coal Mining Groups-Indian Scenario
ERIC Educational Resources Information Center
Sen, Sayanti; Sen, Goutam; Tewary, B. K.
2012-01-01
Maslow's hierarchy-of-needs theory has been used to predict the development of Quality of Life (QOL) in countries over time. In this paper an attempt has been made to derive a methodological validation of the quality of life questionnaire prepared for the study area. The objective of the study is to standardize a questionnaire tool to…
CFD Analysis of the SBXC Glider Airframe
2016-06-01
...based mathematically on finite element methods. To validate and verify the methodology developed, a mathematical comparison was made with the previous research data... greater than 15 m/s. Subject terms: finite element method, computational fluid dynamics, Y Plus, mesh element quality, aerodynamic data, fluid...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toltz, A; Seuntjens, J; Hoesl, M
Purpose: With the aim of reducing acute esophageal radiation toxicity in pediatric patients receiving craniospinal irradiation (CSI), we investigated the implementation of an in-vivo, adaptive proton therapy range verification methodology. Simulation experiments and in-phantom measurements were conducted to validate the range verification technique for this clinical application. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification for a prostate treatment case by correlating properties of the detector signal to the water equivalent path length (WEPL). We propose to extend the methodology to verify range distal to the vertebral body for pediatric CSI cases by placing this small volume dosimeter in the esophagus of the anesthetized patient immediately prior to treatment. A set of calibration measurements was performed to establish a time signal to WEPL fit for a "scout" beam in a solid water phantom. Measurements are compared against Monte Carlo simulation in GEANT4 using the Tool for Particle Simulation (TOPAS). Results: Measurements with the diode array in a spread out Bragg peak of 14 cm modulation width and 15 cm range (177 MeV passively scattered beam) in solid water were successfully validated against proton fluence rate simulations in TOPAS. The resulting calibration curve allows for a sensitivity analysis of detector system response with dose rate in simulation and with individual diode position through simulation on patient CT data. Conclusion: Feasibility has been shown for the application of this range verification methodology to pediatric CSI. An in-vivo measurement to determine the WEPL to the inner surface of the esophagus will allow for personalized adjustment of the treatment plan to ensure sparing of the esophagus while confirming target coverage. A Toltz acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290)
Zarit, Steven H.; Liu, Yin; Bangerter, Lauren R.; Rovine, Michael J.
2017-01-01
Objectives There is growing emphasis on empirical validation of the efficacy of community-based services for older people and their families, but research on services such as respite care faces methodological challenges that have limited the growth of outcome studies. We identify problems associated with the usual research approaches for studying respite care, with the goal of stimulating use of novel and more appropriate research designs that can lead to improved studies of community-based services. Method Using the concept of research validity, we evaluate the methodological approaches in the current literature on respite services, including adult day services, in-home respite and overnight respite. Results Although randomized control trials (RCTs) are possible in community settings, validity is compromised by practical limitations of randomization and other problems. Quasi-experimental and interrupted time series designs offer comparable validity to RCTs and can be implemented effectively in community settings. Conclusion An emphasis on RCTs by funders and researchers is not supported by scientific evidence. Alternative designs can lead to development of a valid body of research on community services such as respite. PMID:26729467
Reevaluation of tephra volumes for the 1982 eruption of El Chichón volcano, Mexico
NASA Astrophysics Data System (ADS)
Nathenson, M.; Fierstein, J.
2012-12-01
In a recent numerical simulation of tephra transport and deposition for the 1982 eruption, Bonasia et al. (2012) used masses for the tephra layers (A-1, B, and C) based on the volume data of Carey and Sigurdsson (1986), calculated by the methodology of Rose et al. (1973). For reasons that are not clear, using the same methodology we obtained volumes for layers A-1 and B much smaller than those previously reported. For example, for layer A-1, Carey and Sigurdsson (1986) reported a volume of 0.60 km3, whereas we obtain 0.23 km3. Moreover, applying the more recent methodology of tephra-volume calculation (Pyle, 1989; Fierstein and Nathenson, 1992) to the isopach maps in Carey and Sigurdsson (1986), we calculate a total tephra volume of 0.52 km3 (A-1, 0.135; B, 0.125; and C, 0.26 km3). In contrast, Carey and Sigurdsson (1986) report a much larger total volume of 2.19 km3. Such disagreement not only reflects the differing methodologies; we propose that the volumes calculated with the methodology of Pyle and of Fierstein and Nathenson, which fits straight lines on a plot of log thickness versus square root of isopach area, better represent the actual fall deposits. After measuring the areas of the isomass contours for the HAZMAP and FALL3D simulations in Bonasia et al. (2012), we applied the Pyle-Fierstein and Nathenson methodology to calculate the tephra masses deposited on the ground. These masses from five of the simulations range from 70% to 110% of those reported by Carey and Sigurdsson (1986), whereas that for layer B in the HAZMAP calculation is 160%. In the Bonasia et al. (2012) study, the mass erupted by the volcano is a critical input used in the simulation to produce an ash cloud that deposits tephra on the ground. Masses on the ground (as calculated by us) for five of the simulations range from 20% to 46% of the masses used as simulation inputs, whereas that for layer B in the HAZMAP calculation is 74%. It is not clear why the percentages are so variable, nor why the output volumes are such small percentages of the input erupted mass. From our volume calculations, the masses on the ground from the simulations are factors of 2.3 to 10 times what was actually deposited. Given this finding from our reevaluation of volumes, the simulations appear to overestimate the hazards from eruptions of the sizes that occurred at El Chichón. References: Bonasia, R., A. Costa, A. Folch, G. Macedonio, and L. Capra (2012), Numerical simulation of tephra transport and deposition of the 1982 El Chichón eruption and implications for hazard assessment, J. Volcanol. Geotherm. Res., 231-232, 39-49. Carey, S., and H. Sigurdsson (1986), The 1982 eruptions of El Chichón volcano, Mexico: Observations and numerical modelling of tephra-fall distribution, Bull. Volcanol., 48, 127-141. Fierstein, J., and M. Nathenson (1992), Another look at the calculation of fallout tephra volumes, Bull. Volcanol., 54, 156-167. Pyle, D.M. (1989), The thickness, volume and grainsize of tephra fall deposits, Bull. Volcanol., 51, 1-15. Rose, W.I., Jr., S. Bonis, R.E. Stoiber, M. Keller, and T. Bickford (1973), Studies of volcanic ash from two recent Central American eruptions, Bull. Volcanol., 37, 338-364.
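The straight-line calculation referred to above has a closed form: if thickness thins exponentially with the square root of isopach area, T = T0*exp(-k*sqrt(A)), integrating over area gives V = 2*T0/k^2 (Pyle, 1989; Fierstein and Nathenson, 1992). The sketch below fits the line and evaluates the integral; the isopach pairs are hypothetical, not the El Chichón data.

```python
import numpy as np

# hypothetical isopach data: areas (km^2) and thicknesses (m)
A_km2 = np.array([4.0, 30.0, 150.0, 800.0])
T_m = np.array([1.0, 0.40, 0.10, 0.02])

# fit ln(T) = ln(T0) - k*sqrt(A)
slope, lnT0 = np.polyfit(np.sqrt(A_km2), np.log(T_m), 1)
k, T0 = -slope, np.exp(lnT0)            # k in 1/km, T0 in m

V_km3 = 2.0 * T0 / k**2 / 1000.0        # m*km^2 -> km^3
print(f"T0 = {T0:.2f} m, k = {k:.3f} /km, volume = {V_km3:.3f} km^3")
```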
Kim, Jung-Hee; Shin, Sujin; Park, Jin-Hwa
2015-04-01
The purpose of this study was to evaluate the methodological quality of nursing studies using structural equation modeling in Korea. Databases of KISS, DBPIA, and the National Assembly Library up to March 2014 were searched using the MeSH terms 'nursing', 'structure', 'model'. A total of 152 studies were screened. After removal of duplicates and non-relevant titles, 61 papers were read in full. Of the sixty-one articles retrieved, 14 studies were published between 1992 and 2000; 27 between 2001 and 2010; and 20 between 2011 and March 2014. The methodological quality of the reviewed studies varied considerably. The findings suggest that more rigorous research is needed to address issues of theoretical identification, the two-indicator rule, sample distribution, treatment of missing values, mediator effects, discriminant validity, convergent validity, post hoc model modification, equivalent models, and alternative models. Further research with robust, consistent methodological study designs, from model identification to model respecification, is needed to improve the validity of the research.
Ma, Yunzhi; Lacroix, Fréderic; Lavallée, Marie-Claude; Beaulieu, Luc
2015-01-01
To validate the Advanced Collapsed cone Engine (ACE) dose calculation engine of the Oncentra Brachy (OcB) treatment planning system using an (192)Ir source. Two levels of validation were performed, conformant to the model-based dose calculation algorithm commissioning guidelines of the American Association of Physicists in Medicine TG-186 report. Level 1 uses all-water phantoms, and the validation is against the TG-43 methodology. Level 2 uses real-patient cases, and the validation is against Monte Carlo (MC) simulations. For each case, the ACE and TG-43 calculations were performed in the OcB treatment planning system. The ALGEBRA MC system was used to perform MC simulations. In Level 1, the ray effect depends on both the accuracy mode and the number of dwell positions. The volume fraction with dose error ≥2% quickly reduces from 23% (13%) for a single dwell to 3% (2%) for eight dwell positions in the standard (high) accuracy mode. In Level 2, the 10% and higher isodose lines were observed to overlap between ACE (both standard and high-resolution modes) and MC. Major clinical indices (V100, V150, V200, D90, D50, and D2cc) were investigated and validated by MC. For example, among the Level 2 cases, the maximum deviation in V100 of ACE from MC is 2.75% but up to ~10% for TG-43. Similarly, the maximum deviation in D90 is 0.14 Gy between ACE and MC but up to 0.24 Gy for TG-43. ACE demonstrated good agreement with MC in most clinically relevant regions in the cases tested. Departure from MC is significant for specific situations but limited to low-dose (<10% isodose) regions.
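For reference, the clinical indices compared above reduce to simple operations on the per-voxel dose array; the sketch below uses stand-in doses, not the study's data.

```python
import numpy as np

# per-voxel doses (Gy) inside the target contour; stand-in values
dose = np.random.default_rng(3).gamma(9.0, 1.0, 100_000)
rx = 8.0                                     # prescription dose (assumed)

V100 = 100.0 * np.mean(dose >= rx)           # % of volume receiving >= 100% rx
V150 = 100.0 * np.mean(dose >= 1.5 * rx)
V200 = 100.0 * np.mean(dose >= 2.0 * rx)
D90 = np.percentile(dose, 10.0)              # dose covering 90% of the volume
print(f"V100 = {V100:.1f}%, V150 = {V150:.1f}%, "
      f"V200 = {V200:.1f}%, D90 = {D90:.2f} Gy")
```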
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.
In the present work, a Verification and Validation procedure is presented and applied, showing through a practical example how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
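As an illustration of code verification by the method of manufactured solutions, the sketch below checks the second-order convergence of a central-difference Poisson solve against the manufactured solution u = sin(pi x); it mirrors the procedure in miniature and is not taken from the GBS code.

```python
import numpy as np

def mms_error(n):
    # solve -u'' = S on (0,1), u(0)=u(1)=0, with manufactured solution
    # u_m = sin(pi x), hence S = pi^2 sin(pi x)
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)[1:-1]          # interior nodes
    A = (np.diag(2.0 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1) - np.diag(np.ones(n - 2), -1))
    u = np.linalg.solve(A, h * h * np.pi**2 * np.sin(np.pi * x))
    return np.abs(u - np.sin(np.pi * x)).max()

e_coarse, e_fine = mms_error(32), mms_error(64)
order = np.log(e_coarse / e_fine) / np.log(2.0)
print(f"observed order of accuracy: {order:.2f}")   # ~2 for this scheme
```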
Second Language Listening Strategy Research: Methodological Challenges and Perspectives
ERIC Educational Resources Information Center
Santos, Denise; Graham, Suzanne; Vanderplank, Robert
2008-01-01
This paper explores methodological issues related to research into second language listening strategies. We argue that a number of central questions regarding research methodology in this line of enquiry are underexamined, and we engage in the discussion of three key methodological questions: (1) To what extent is a verbal report a valid and…
ERIC Educational Resources Information Center
Construction Systems Management, Inc., Anchorage, AK.
Volume II of a 3-volume report demonstrates the use of Design Determinants and Options (presented in Volume I) in the planning and design of small rural Alaskan secondary schools. Section I, a checklist for gathering site-specific information to be used as a data base for facility design, is organized in the same format as Volume I, which can be…
Chang, W-K; Chao, Y-C; Mcclave, S-A; Yeh, M-K
2005-10-01
Gastric residual volumes are widely used to evaluate gastric emptying in patients receiving enteral feeding, but controversy exists about what constitutes a gastric residual volume. We have developed a method using a refractometer and derived mathematical equations to calculate the formula concentration, total residual volume (TRV), and formula volume. In this study, we aimed to validate these mathematical equations before they are implemented for clinical patient care. Four dietary formulas were evaluated in two consecutive validation experiments. First, dietary formula volumes of 50, 100, 200, and 400 ml were diluted with 50 ml of water, and the Brix value (BV) was measured by the refractometer. Second, 50 ml of water and then 100 ml of dietary formula were infused into a beaker, followed by a BV measurement; 50 ml of water was then infused, followed by a second BV measurement. The entire procedure of infusing dietary formula (100 ml) and water (50 ml) was repeated twice, each time followed by a BV measurement. The formula contents (formula concentration, TRV, and formula volume) were calculated with the mathematical equations. The formula concentrations, TRVs, and formula volumes calculated from the mathematical equations were very close to the true values in both validation experiments (R2>0.98, P<0.001). A refractometer and the derived mathematical equations may be used to accurately measure the formula concentration, TRV, and formula volume, and may serve as a tool to monitor gastric emptying in patients receiving enteral feeding.
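One plausible mass balance behind such a dilution method, assuming the Brix value scales linearly with formula concentration (our assumption, not necessarily the authors' exact equations): adding a known water volume W to an unknown residual volume V with reading B0 dilutes the reading to B1 = B0*V/(V+W), which inverts to V = W*B1/(B0-B1).

```python
def residual_volume(w_ml, b0, b1):
    """Residual volume before dilution, from Brix readings (assumed linear
    dilution model, not necessarily the paper's derived equations)."""
    return w_ml * b1 / (b0 - b1)

# e.g. 50 ml of water drops the reading from 12.0 to 8.0 -> 100 ml residual
print(residual_volume(50.0, 12.0, 8.0))
```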
Methodological issues in microdialysis sampling for pharmacokinetic studies.
de Lange, E C; de Boer, A G; Breimer, D D
2000-12-15
Microdialysis is an in vivo technique that permits monitoring of local concentrations of drugs and metabolites at specific sites in the body. Microdialysis has several characteristics which make it an attractive tool for pharmacokinetic research. About a decade ago the microdialysis technique entered the field of pharmacokinetic research, first in the brain and later also in peripheral tissues and blood. Within this period much has been learned about the proper use of this technique. Today, it has outgrown its teething problems and its potentials and limitations have become more or less well defined. As microdialysis is a delicate technique for which experimental factors appear to be critical with respect to the validity of the experimental outcomes, several factors should be considered. These include the probe; the perfusion solution; the post-surgery interval in relation to surgical trauma, tissue integrity and repeated experiments; the analysis of microdialysate samples; and the quantification of microdialysate data. Provided that experimental conditions are optimized to give valid and quantitative results, microdialysis can provide numerous data points from a relatively small number of individual animals to determine detailed pharmacokinetic information. An example of one of the added values of this technique compared with other in vivo pharmacokinetic techniques is that microdialysis reflects free concentrations in tissues and plasma. This gives the opportunity to assess information on drug transport equilibration across membranes such as the blood-brain barrier, which has already provided new insights. With the progress of analytical methodology, especially with respect to low volume/low concentration measurements and simultaneous measurement of multiple compounds, the applications and importance of the microdialysis technique in pharmacokinetic research will continue to increase.
Automatic tree parameter extraction by a Mobile LiDAR System in an urban context.
Herrero-Huerta, Mónica; Lindenbergh, Roderik; Rodríguez-Gonzálvez, Pablo
2018-01-01
In an urban context, tree data are used in city planning, in locating hazardous trees and in environmental monitoring. This study focuses on developing an innovative methodology to automatically estimate the most relevant individual structural parameters of urban trees sampled by a Mobile LiDAR System at city level. These parameters include the Diameter at Breast Height (DBH), which was estimated by circle fitting of the points belonging to different height bins using RANSAC. In the case of non-circular trees, DBH is calculated from the maximum distance between extreme points. Tree sizes were extracted through a connectivity analysis. Crown Base Height, defined as the length to the bottom of the live crown, was calculated by voxelization techniques. For estimating Canopy Volume, procedures of mesh generation and α-shape methods were implemented. Also, tree location coordinates were obtained by means of Principal Component Analysis. The workflow has been validated on 29 trees of different species sampling a 750 m stretch of road in Delft (The Netherlands) and tested on a larger dataset containing 58 individual trees. The validation was done against field measurements. The DBH parameter had a correlation R2 value of 0.92 for the 20 cm height bin, which provided the best results. Moreover, the influence of the number of points used for DBH estimation, considering different height bins, was investigated. The assessment of the other inventory parameters yields correlation coefficients higher than 0.91. The quality of the results confirms the feasibility of the proposed methodology, providing scalability to a comprehensive analysis of urban trees.
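As an illustration of the circle-fitting step named above, here is a hedged sketch of RANSAC DBH estimation on the 2-D points of one height bin; the tolerance, iteration count and helper names are assumptions, not the authors' implementation.

```python
import numpy as np

def circle_from_3pts(p):
    # (x-cx)^2 + (y-cy)^2 = r^2 rewritten as a linear system in (cx, cy, k),
    # with k = r^2 - cx^2 - cy^2.
    A = np.column_stack([2.0 * p, np.ones(3)])
    b = (p ** 2).sum(axis=1)
    (cx, cy, k), *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.array([cx, cy]), np.sqrt(k + cx ** 2 + cy ** 2)

def ransac_dbh(xy, n_iter=500, tol=0.01, seed=0):
    """Stem diameter (m) from the 2-D points of one height bin."""
    rng = np.random.default_rng(seed)
    best_r, best_inliers = 0.0, 0
    for _ in range(n_iter):
        c, r = circle_from_3pts(xy[rng.choice(len(xy), 3, replace=False)])
        # Inliers: points whose distance to the circle is within tolerance.
        inliers = int(np.sum(np.abs(np.linalg.norm(xy - c, axis=1) - r) < tol))
        if inliers > best_inliers:
            best_inliers, best_r = inliers, r
    return 2.0 * best_r

# Synthetic trunk slice of radius 0.15 m with 2 mm measurement noise:
t = np.linspace(0.0, 2.0 * np.pi, 200)
pts = 0.15 * np.c_[np.cos(t), np.sin(t)] \
      + 0.002 * np.random.default_rng(1).standard_normal((200, 2))
print(ransac_dbh(pts))  # ~0.30
```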
Low validity of Google Trends for behavioral forecasting of national suicide rates.
Tran, Ulrich S; Andel, Rita; Niederkrotenthaler, Thomas; Till, Benedikt; Ajdacic-Gross, Vladeta; Voracek, Martin
2017-01-01
Recent research suggests that search volumes of the most popular search engine worldwide, Google, provided via Google Trends, could be associated with national suicide rates in the USA, UK, and some Asian countries. However, search volumes have mostly been studied in an ad hoc fashion, without controls for spurious associations. This study evaluated the validity and utility of Google Trends search volumes for behavioral forecasting of suicide rates in the USA, Germany, Austria, and Switzerland. Suicide-related search terms were systematically collected and respective Google Trends search volumes evaluated for availability. Time spans covered 2004 to 2010 (USA, Switzerland) and 2004 to 2012 (Germany, Austria). Temporal associations of search volumes and suicide rates were investigated with time-series analyses that rigorously controlled for spurious associations. The number and reliability of analyzable search volume data increased with country size. Search volumes showed various temporal associations with suicide rates. However, associations differed both across and within countries and mostly followed no discernable patterns. The total number of significant associations roughly matched the number of expected Type I errors. These results suggest that the validity of Google Trends search volumes for behavioral forecasting of national suicide rates is low. The utility and validity of search volumes for the forecasting of suicide rates depend on two key assumptions ("the population that conducts searches consists mostly of individuals with suicidal ideation", "suicide-related search behavior is strongly linked with suicidal behavior"). We discuss strands of evidence that these two assumptions are likely not met. Implications for future research with Google Trends in the context of suicide research are also discussed.
The validity of ultrasound estimation of muscle volumes.
Infantolino, Benjamin W; Gales, Daniel J; Winter, Samantha L; Challis, John H
2007-08-01
The purpose of this study was to validate ultrasound muscle volume estimation in vivo. To examine validity, vastus lateralis ultrasound images were collected from cadavers before muscle dissection; after dissection, the volumes were determined by hydrostatic weighing. Seven thighs from cadaver specimens were scanned using a 7.5-MHz ultrasound probe (SSD-1000, Aloka, Japan). The perimeter of the vastus lateralis was identified in the ultrasound images and manually digitized. Volumes were then estimated using the Cavalieri principle, by measuring the image areas of sets of parallel two-dimensional slices through the muscles. The muscles were then dissected from the cadavers, and muscle volume was determined via hydrostatic weighing. There was no statistically significant difference between the ultrasound estimation of muscle volume and that estimated using hydrostatic weighing (p > 0.05). The mean percentage error between the two volume estimates was 0.4% ± 6.9%. Three operators all performed four digitizations of all images from one randomly selected muscle; there was no statistical difference between operators or trials, and the intraclass correlation was high (>0.8). The results of this study indicate that ultrasound is an accurate method for estimating muscle volumes in vivo.
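The Cavalieri estimate named above reduces to summing the digitized slice areas and multiplying by the slice spacing; a minimal sketch (variable names and numbers assumed):

```python
def cavalieri_volume(slice_areas_cm2, spacing_cm):
    """Volume (cm^3) from areas of parallel, equally spaced 2-D slices."""
    return sum(slice_areas_cm2) * spacing_cm

# Example: five ultrasound slices through a muscle, taken 2 cm apart.
print(cavalieri_volume([10.1, 14.3, 16.0, 13.2, 8.4], 2.0))  # 124.0 cm^3
```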
Gradient-Based Aerodynamic Shape Optimization Using ADI Method for Large-Scale Problems
NASA Technical Reports Server (NTRS)
Pandya, Mohagna J.; Baysal, Oktay
1997-01-01
A gradient-based shape optimization methodology, intended for practical three-dimensional aerodynamic applications, has been developed. It is based on quasi-analytical sensitivities. The flow analysis is rendered by a fully implicit, finite volume formulation of the Euler equations. The aerodynamic sensitivity equation is solved using the alternating-direction-implicit (ADI) algorithm for memory efficiency. A flexible wing geometry model, based on surface parameterization and planform schedules, is utilized. The present methodology and its components have been tested via several comparisons. Initially, the flow analysis for a wing is compared with those obtained using an unfactored, preconditioned conjugate gradient approach (PCG), and an extensively validated CFD code. Then, the sensitivities computed with the present method have been compared with those obtained using the finite-difference and the PCG approaches. Effects of grid refinement and convergence tolerance on the analysis and shape optimization have been explored. Finally, the new procedure has been demonstrated in the design of a cranked-arrow wing at Mach 2.4. Despite the expected increase in computational time, the results indicate that shape optimization problems, which require large numbers of grid points, can be resolved with a gradient-based approach.
NASA Astrophysics Data System (ADS)
Guan, Xuefei; Rasselkorde, El Mahjoub; Abbasi, Waheed; Zhou, S. Kevin
2015-03-01
The study presents a data processing methodology for weld build-up using multiple scan patterns. To achieve an overall high probability of detection for flaws with different orientations, an inspection procedure with three different scan patterns is proposed. The three scan patterns are radial-tangential longitude wave pattern, axial-radial longitude wave pattern, and tangential shear wave pattern. Scientific fusion of the inspection data is implemented using volume reconstruction techniques. The idea is to perform spatial domain forward data mapping for all sampling points. A conservative scheme is employed to handle the case that multiple sampling points are mapped to one grid location. The scheme assigns the maximum value for the grid location to retain the largest equivalent reflector size for the location. The methodology is demonstrated and validated using a realistic ring of weld build-up. Tungsten balls and bars are embedded to the weld build-up during manufacturing process to represent natural flaws. Flat bottomed holes and side drilled holes are installed as artificial flaws. Automatic flaw identification and extraction are demonstrated. Results indicate the inspection procedure with multiple scan patterns can identify all the artificial and natural flaws.
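A short sketch of the conservative fusion scheme described above: each sampling point is forward-mapped into a common voxel grid, and where points collide the maximum amplitude is kept. Grid shape, voxel size, units and function names are assumptions.

```python
import numpy as np

def fuse_scans(points_mm, amplitudes, grid_shape, voxel_mm):
    """Spatial forward mapping of sampling points from several scan patterns
    into one grid; where multiple points map to a voxel, the maximum value
    is kept so the largest equivalent reflector size is retained."""
    grid = np.zeros(grid_shape)
    dims = np.asarray(grid_shape)
    idx = np.floor(np.asarray(points_mm) / voxel_mm).astype(int)
    keep = np.all((idx >= 0) & (idx < dims), axis=1)  # drop out-of-grid points
    for (i, j, k), a in zip(idx[keep], np.asarray(amplitudes)[keep]):
        grid[i, j, k] = max(grid[i, j, k], a)
    return grid

# Two scan patterns hitting the same voxel: the larger amplitude wins.
g = fuse_scans([[1.2, 0.4, 0.7], [1.3, 0.4, 0.6]], [0.35, 0.80], (4, 4, 4), 1.0)
print(g[1, 0, 0])  # 0.8
```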
Venturelli, Massimo; Jeong, Eun-Kee; Richardson, Russell S.
2014-01-01
The assessment of muscle volume, and changes over time, has significant clinical and research-related implications. Methods to assess muscle volume vary from simple and inexpensive to complex and expensive. Therefore this study sought to examine the validity of muscle volume estimated simply by anthropometry compared with the more complex proton magnetic resonance imaging (1H-MRI) across a wide spectrum of individuals, including those with a spinal cord injury (SCI), a group recognized to exhibit significant muscle atrophy. Accordingly, muscle volume of the thigh and lower leg of eight subjects with a SCI and eight able-bodied subjects (controls) was determined by anthropometry and 1H-MRI. With either method, muscle volumes were significantly lower in the SCI group than in the controls (P < 0.05) and, using pooled data from both groups, anthropometric measurements of muscle volume were strongly correlated with the values assessed by 1H-MRI in both the thigh (r2 = 0.89; P < 0.05) and lower leg (r2 = 0.98; P < 0.05). However, the anthropometric approach systematically overestimated muscle volume compared with 1H-MRI in both the thigh (mean bias = 2407 cm3) and the lower leg (mean bias = 170 cm3). Thus, with an appropriate correction for this systematic overestimation, muscle volume estimated from anthropometric measurements is a valid approach and provides acceptable accuracy across a spectrum of adults, from normal muscle mass to SCI with severe muscle atrophy. In practical terms this study provides the formulas that add validity to the already simple and inexpensive anthropometric approach to assessing muscle volume in clinical and research settings. PMID:24458749
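On the simplest reading of the reported bias, a corrected estimate subtracts the mean overestimation per segment. The sketch below uses the mean biases quoted in the abstract; the paper's actual correction formulas may be regression-based and differ.

```python
# Mean biases reported above: anthropometric estimate minus 1H-MRI estimate.
MEAN_BIAS_CM3 = {"thigh": 2407.0, "lower_leg": 170.0}

def corrected_muscle_volume(v_anthro_cm3, segment):
    """Anthropometric volume corrected for the systematic overestimation."""
    return v_anthro_cm3 - MEAN_BIAS_CM3[segment]

print(corrected_muscle_volume(7500.0, "thigh"))  # 5093.0 cm^3
```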
A novel adaptive scoring system for segmentation validation with multiple reference masks
NASA Astrophysics Data System (ADS)
Moltz, Jan H.; Rühaak, Jan; Hahn, Horst K.; Peitgen, Heinz-Otto
2011-03-01
The development of segmentation algorithms for different anatomical structures and imaging protocols is an important task in medical image processing. The validation of these methods, however, is often treated as a subordinate task. Since manual delineations, which are widely used as a surrogate for the ground truth, exhibit an inherent uncertainty, it is preferable to use multiple reference segmentations for an objective validation. This requires a consistent framework that should fulfill three criteria: 1) it should treat all reference masks equally a priori and not demand consensus between the experts; 2) it should evaluate the algorithmic performance in relation to the inter-reference variability, i.e., be more tolerant where the experts disagree about the true segmentation; 3) it should produce results that are comparable for different test data. We show why current state-of-the-art frameworks, such as the one used at several MICCAI segmentation challenges, do not fulfill these criteria, and propose a new validation methodology. A score is computed in an adaptive way for each individual segmentation problem, using a combination of volume- and surface-based comparison metrics. These are transformed into the score by relating them to the variability between the reference masks, which can be measured by comparing the masks with each other or with an estimated ground truth. We present examples from a study on liver tumor segmentation in CT scans where our score shows a more adequate assessment of the segmentation results than the MICCAI framework.
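A hypothetical sketch of the adaptive idea: each algorithm-vs-reference error is scaled by the inter-reference error measured on the same case, so the score is more tolerant where the experts disagree. The 100 - 25*ratio mapping is borrowed from MICCAI-style scoring and is an assumption, not the authors' exact formula.

```python
import numpy as np

def adaptive_score(algo_errors, interref_errors):
    """Each error metric for the algorithm (volume- or surface-based) is
    divided by the corresponding inter-reference error for this case, then
    mapped to a score: 100 = perfect, ~75 = roughly expert level, floored
    at 0. Cases where experts disagree more are scored more leniently."""
    ratios = np.asarray(algo_errors, float) / np.asarray(interref_errors, float)
    return float(np.mean(np.maximum(100.0 - 25.0 * ratios, 0.0)))

# Algorithm error equals the experts' own variability on both metrics:
print(adaptive_score([1.0, 2.0], [1.0, 2.0]))  # 75.0
```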
Evaluation Of The MODIS-VIIRS Land Surface Reflectance Fundamental Climate Data Record.
NASA Astrophysics Data System (ADS)
Roger, J. C.; Vermote, E.; Skakun, S.; Murphy, E.; Holben, B. N.; Justice, C. O.
2016-12-01
The land surface reflectance is a fundamental climate data record at the basis of the derivation of other climate data records (albedo, LAI/FPAR, vegetation indices) and has been recognized as a key parameter in the understanding of land-surface-climate processes. Here, we present the validation of the land surface reflectance products derived from MODIS and VIIRS data. This methodology uses the 6SV code and data from the AERONET network. The first part was to define a protocol for using the AERONET data. To correctly take into account the aerosol model, we used the aerosol microphysical properties provided by the AERONET network, including the size distribution (%Cf, %Cc, rf, rc, σr, σc), complex refractive indices and sphericity. Of the 670 available AERONET sites, we selected 230 sites with sufficient data. To be useful for validation, the aerosol model should be readily available at any time, which is rarely the case. We therefore used regressions for each microphysical parameter, with the aerosol optical thickness at 440 nm and the Angström coefficient as parameters. Comparisons with the AERONET dataset give good APU (Accuracy, Precision, Uncertainty) values for each parameter. The second part of the study relies on theoretical land surface retrieval. We generated TOA synthetic data using aerosol models from AERONET and determined the APU of the surface reflectance retrieval while applying the MODIS and VIIRS atmospheric correction software. Over 250 AERONET sites, the global uncertainty for MODIS band 1 (red) is always lower than 0.0015 (when surface reflectance is > 0.04). This very good result shows the validity of our reference. We then used this reference to validate the MODIS and VIIRS surface reflectance products; the overall accuracy clearly reaches specifications. Finally, we will present an error budget of the surface reflectance retrieval. To better understand how to improve the methodology, we defined an exhaustive error budget including all inputs (sensor calibration, aerosol properties, atmospheric conditions, etc.). This latter work shows, for example, that the aerosol optical thickness drives the uncertainties of the retrieval, and that the absorption and the volume concentration of the fine aerosol mode also have an important impact.
Via, Esther; Radua, Joaquim; Cardoner, Narcis; Happé, Francesca; Mataix-Cols, David
2011-04-01
Studies investigating abnormalities of regional gray matter volume in autism spectrum disorder (ASD) have yielded contradictory results. It is unclear whether the current subtyping of ASD into autistic disorder and Asperger disorder is neurobiologically valid. Our aims were to conduct a quantitative meta-analysis of voxel-based morphometry studies exploring gray matter volume abnormalities in ASD, to examine potential neurobiological differences among ASD subtypes, and to create an online database to facilitate replication and further analyses by other researchers. We retrieved studies from PubMed, ScienceDirect, Scopus, and Web of Knowledge databases between June 3, 1999, the date of the first voxel-based morphometry study in ASD, and October 31, 2010. Studies were also retrieved from reference lists and review articles. We contacted authors soliciting additional data. Twenty-four data sets met inclusion criteria, comprising 496 participants with ASD and 471 healthy control individuals. Peak coordinates of clusters of regional gray matter differences between participants with ASD and controls, as well as demographic, clinical, and methodologic variables, were extracted from each study or obtained from the authors. No differences in overall gray matter volume were found between participants with ASD and healthy controls. Participants with ASD were found to have robust decreases of gray matter volume in the bilateral amygdala-hippocampus complex and the bilateral precuneus. A small increase of gray matter volume in the middle-inferior frontal gyrus was also found. No significant differences in overall or regional gray matter volumes were found between autistic disorder and Asperger disorder. Decreases of gray matter volume in the right precuneus were greater in adults than in adolescents with ASD. These results confirm the crucial involvement of structures linked to social cognition in ASD. The absence of significant differences between ASD subtypes may have important nosologic implications for the DSM-5. The publicly available database will be a useful resource for future research.
Longo, Benedetto; Farcomeni, Alessio; Ferri, Germano; Campanale, Antonella; Sorotos, Micheal; Santanelli, Fabio
2013-07-01
Breast volume assessment enhances preoperative planning of both aesthetic and reconstructive procedures, helping the surgeon in the decision-making process of shaping the breast. Numerous methods of breast size determination are currently reported but are limited by methodologic flaws and variable estimations. The authors aimed to develop a unifying predictive formula for volume assessment in small to large breasts based on anthropomorphic values. Ten anthropomorphic breast measurements and direct volumes of 108 mastectomy specimens from 88 women were collected prospectively. The authors performed a multivariate regression to build the optimal model for development of the predictive formula. The final model was then internally validated. A previously published formula was used as a reference. Mean (±SD) breast weight was 527.9 ± 227.6 g (range, 150 to 1250 g). After model selection, sternal notch-to-nipple, inframammary fold-to-nipple, and inframammary fold-to-fold projection distances emerged as the most important predictors. The resulting formula (the BREAST-V) showed an adjusted R2 of 0.73. The estimated expected absolute error on new breasts is 89.7 g (95 percent CI, 62.4 to 119.1 g) and the expected relative error is 18.4 percent (95 percent CI, 12.9 to 24.3 percent). Application of the reference formula to the sample yielded worse predictions than those derived from the new formula, showing an R2 of 0.55. The BREAST-V is a reliable tool for predicting small to large breast volumes accurately, for use as a complementary device in surgeon evaluation. An app entitled BREAST-V for both iOS and Android devices is currently available for free download in the Apple App Store and Google Play Store. Level of Evidence: Diagnostic, II.
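These are not the published BREAST-V coefficients; the sketch below only illustrates how such a formula is built, i.e., least-squares regression of specimen weight on the three selected distances, with hypothetical data throughout.

```python
import numpy as np

def fit_volume_model(X, y):
    """X columns (cm): sternal notch-to-nipple, IMF-to-nipple, IMF
    fold-to-fold projection; y (g): mastectomy specimen weight.
    Returns [intercept, b1, b2, b3] by ordinary least squares."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def predict_volume(beta, sn_nipple, imf_nipple, imf_projection):
    return beta[0] + beta[1] * sn_nipple + beta[2] * imf_nipple \
           + beta[3] * imf_projection

# Synthetic stand-in for the 108-specimen dataset (values are fabricated):
rng = np.random.default_rng(0)
X = rng.uniform([18, 6, 4], [32, 14, 12], size=(108, 3))
y = 20 * X[:, 0] + 30 * X[:, 1] + 25 * X[:, 2] + rng.normal(0, 50, 108)
beta = fit_volume_model(X, y)
print(predict_volume(beta, 25.0, 10.0, 8.0))
```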
Approaches for advancing scientific understanding of macrosystems
Levy, Ofir; Ball, Becky A.; Bond-Lamberty, Ben; Cheruvelil, Kendra S.; Finley, Andrew O.; Lottig, Noah R.; Surangi W. Punyasena,; Xiao, Jingfeng; Zhou, Jizhong; Buckley, Lauren B.; Filstrup, Christopher T.; Keitt, Tim H.; Kellner, James R.; Knapp, Alan K.; Richardson, Andrew D.; Tcheng, David; Toomey, Michael; Vargas, Rodrigo; Voordeckers, James W.; Wagner, Tyler; Williams, John W.
2014-01-01
The emergence of macrosystems ecology (MSE), which focuses on regional- to continental-scale ecological patterns and processes, builds upon a history of long-term and broad-scale studies in ecology. Scientists face the difficulty of integrating the many elements that make up macrosystems, which consist of hierarchical processes at interacting spatial and temporal scales. Researchers must also identify the most relevant scales and variables to be considered, the required data resources, and the appropriate study design to provide the proper inferences. The large volumes of multi-thematic data often associated with macrosystem studies typically require validation, standardization, and assimilation. Finally, analytical approaches need to describe how cross-scale and hierarchical dynamics and interactions relate to macroscale phenomena. Here, we elaborate on some key methodological challenges of MSE research and discuss existing and novel approaches to meet them.
Ensuring near-optimum homogeneity and densification levels in nano-reinforced ceramics
NASA Astrophysics Data System (ADS)
Dassios, Konstantinos G.; Barkoula, Nektaria-Marianthi; Alafogianni, Panagiota; Bonnefont, Guillaume; Fantozzi, Gilbert; Matikas, Theodore E.
2016-04-01
The development of a new generation of high-temperature ceramic materials for aerospace applications, reinforced by embedded carbon nanotubes at a scale closer to the molecular level and three orders of magnitude smaller than conventional fibrous reinforcements, has recently emerged as a uniquely challenging scientific effort. The properties of such materials depend strongly on two main factors: i) the homogeneity of the dispersion of the hydrophobic medium throughout the ceramic volume, and ii) the ultimate density of the resultant product after sintering of the green body at the high temperatures and pressures required for ceramic consolidation. The present work reports the establishment of two independent experimental strategies that ensure near-perfect levels of tube dispersion homogeneity and fully dense final products. The proposed methodologies are validated against non-destructive evaluation data of material performance.
Combustion chamber analysis code
NASA Technical Reports Server (NTRS)
Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.
1993-01-01
A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinate (BFC) capability. Higher-order differencing methodologies such as the MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulence models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite-rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.
NASA Astrophysics Data System (ADS)
Fontchastagner, Julien; Lubin, Thierry; Mezani, Smaïl; Takorabet, Noureddine
2018-03-01
This paper presents a design optimization of an axial-flux eddy-current magnetic coupling. The design procedure is based on a torque formula derived from a 3D analytical model and a population algorithm method. The main objective of this paper is to determine the best design in terms of magnets volume in order to transmit a torque between two movers, while ensuring a low slip speed and a good efficiency. The torque formula is very accurate and computationally efficient, and is valid for any slip speed values. Nevertheless, in order to solve more realistic problems, and then, take into account the thermal effects on the torque value, a thermal model based on convection heat transfer coefficients is also established and used in the design optimization procedure. Results show the effectiveness of the proposed methodology.
Boka, Vasiliki-Ioanna; Argyropoulou, Aikaterini; Gikas, Evangelos; Angelis, Apostolis; Aligiannis, Nektarios; Skaltsounis, Alexios-Leandros
2015-11-01
A high-performance thin-layer chromatographic methodology was developed and validated for the isolation and quantitative determination of oleuropein in two extracts of Olea europaea leaves. OLE_A was a crude acetone extract, while OLE_AA was its defatted residue. Initially, high-performance thin-layer chromatography was employed in the purification of oleuropein by fast centrifugal partition chromatography, replacing high-performance liquid chromatography at the stage of determining the distribution coefficient and the retention volume. A densitometric method was developed for the determination of the distribution coefficient, Kc = Cs/Cm: the total concentrations of the target compound in the stationary phase (Cs) and in the mobile phase (Cm) were calculated from the areas measured in the high-performance thin-layer chromatogram. The estimated Kc was also used to calculate the retention volume, VR, with a chromatographic retention equation. The obtained data were successfully applied to the purification of oleuropein, and the experimental results confirmed the theoretical predictions, indicating that high-performance thin-layer chromatography can be an important counterpart in the phytochemical study of natural products. The isolated oleuropein (purity > 95%) was subsequently used to estimate its content in each extract with a simple, sensitive and accurate high-performance thin-layer chromatography method. The best-fit calibration curve from 1.0 µg/track to 6.0 µg/track of oleuropein was polynomial, and quantification was achieved by UV detection at λ = 240 nm. The method was validated, giving rise to an efficient and high-throughput procedure, with the relative standard deviation (%) of repeatability and intermediate precision not exceeding 4.9% and accuracy between 92% and 98% (recovery rates). Moreover, the method was validated for robustness, limit of quantitation, and limit of detection. The oleuropein content of OLE_A, OLE_AA, and an aqueous extract of olive leaves was estimated to be 35.5% ± 2.7, 51.5% ± 1.4, and 12.5% ± 0.12, respectively. Statistical analysis proved that the method is repeatable and selective, and can be effectively applied to the estimation of oleuropein in olive leaf extracts, potentially replacing the high-performance liquid chromatography methodologies developed so far. Thus, the phytochemical investigation of oleuropein can be based on high-performance thin-layer chromatography coupled with separation processes such as fast centrifugal partition chromatography, showing efficacy and credibility. Georg Thieme Verlag KG Stuttgart · New York.
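A small sketch of the two quantities involved: Kc from the densitometric areas (taken as proportional to the phase concentrations) and the retention volume via the classic counter-current relation VR = VM + Kc * VS, which we assume is the retention equation referred to. All numbers are hypothetical.

```python
def distribution_coefficient(area_stationary, area_mobile):
    """Kc = Cs/Cm, with concentrations proportional to HPTLC peak areas."""
    return area_stationary / area_mobile

def retention_volume(v_mobile_ml, v_stationary_ml, kc):
    """Classic counter-current chromatography relation: VR = VM + Kc * VS
    (assumed form; the abstract does not spell the equation out)."""
    return v_mobile_ml + kc * v_stationary_ml

kc = distribution_coefficient(5200.0, 4000.0)   # hypothetical peak areas
print(kc, retention_volume(60.0, 140.0, kc))    # 1.3, 242.0 ml
```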
Abdo-Man: a 3D-printed anthropomorphic phantom for validating quantitative SIRT.
Gear, Jonathan I; Cummings, Craig; Craig, Allison J; Divoli, Antigoni; Long, Clive D C; Tapner, Michael; Flux, Glenn D
2016-12-01
The use of selective internal radiation therapy (SIRT) is rapidly increasing, and the need for quantification and dosimetry is becoming more widespread to facilitate treatment planning and verification. The aim of this project was to develop an anthropomorphic phantom that can be used as a validation tool for post-SIRT imaging and its application to dosimetry. The phantom design was based on anatomical data obtained from a T1-weighted volume-interpolated breath-hold examination (VIBE) on a Siemens Aera 1.5 T MRI scanner. The liver, lungs and abdominal trunk were segmented using the Hermes image processing workstation. Organ volumes were then uploaded to the Delft Visualization and Image processing Development Environment for smoothing and surface rendering. Triangular meshes defining the iso-surfaces were saved as stereolithography (STL) files and imported into the Autodesk® Meshmixer software. Organ volumes were subtracted from the abdomen, and a removable base was designed to allow access to the liver cavity. Connection points for placing lesion inserts and filling holes were also included. The phantom was manufactured using a Stratasys Connex3 PolyJet 3D printer, which combines stereolithography technology with ink-jet printing. The print material is a solid acrylic plastic with properties similar to polymethylmethacrylate (PMMA); measured Hounsfield units and calculated attenuation coefficients of the material were likewise shown to be similar to PMMA. Total print time for the phantom was approximately 5 days. Initial scans of the phantom have been performed with Y-90 bremsstrahlung SPECT/CT, Y-90 PET/CT and Tc-99m SPECT/CT. The CT component of these images compared well with the original anatomical reference, and measurements of volume agreed to within 9%. Quantitative analysis of the phantom was performed using all three imaging techniques. Lesion and normal-liver absorbed doses were calculated from the quantitative images in three dimensions using the local deposition method. 3D printing is a flexible and cost-efficient technology for the manufacture of anthropomorphic phantoms. Application of such phantoms will enable quantitative imaging and dosimetry methodologies to be evaluated, which, with optimisation, could help improve outcomes for patients.
NASA Technical Reports Server (NTRS)
Baker, A. J.; Iannelli, G. S.; Manhardt, Paul D.; Orzechowski, J. A.
1993-01-01
This report documents the user input and output data requirements for the FEMNAS finite element Navier-Stokes code for real-gas simulations of external aerodynamics flowfields. This code was developed for the configuration aerodynamics branch of NASA ARC, under SBIR Phase 2 contract NAS2-124568 by Computational Mechanics Corporation (COMCO). This report is in two volumes. Volume 1 contains the theory for the derived finite element algorithm and describes the test cases used to validate the computer program described in the Volume 2 user guide.
Select Methodology for Validating Advanced Satellite Measurement Systems
NASA Technical Reports Server (NTRS)
Larar, Allen M.; Zhou, Daniel K.; Liu, Xi; Smith, William L.
2008-01-01
Advanced satellite sensors are tasked with improving global measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring capability, and environmental change detection. Measurement system validation is crucial to achieving this goal and maximizing research and operational utility of resultant data. Field campaigns including satellite under-flights with well calibrated FTS sensors aboard high-altitude aircraft are an essential part of the validation task. This presentation focuses on an overview of validation methodology developed for assessment of high spectral resolution infrared systems, and includes results of preliminary studies performed to investigate the performance of the Infrared Atmospheric Sounding Interferometer (IASI) instrument aboard the MetOp-A satellite.
Xu, Lingyu; Xu, Yuancheng; Coulden, Richard; Sonnex, Emer; Hrybouski, Stanislau; Paterson, Ian; Butler, Craig
2018-05-11
Epicardial adipose tissue (EAT) volume derived from contrast-enhanced (CE) computed tomography (CT) scans is not well validated. We aimed to establish a reliable threshold to accurately quantify EAT volume from CE datasets. We analyzed EAT volume on paired non-contrast (NC) and CE datasets from 25 patients to derive appropriate Hounsfield unit (HU) cutpoints that equalize the two EAT volume estimates. The gold-standard threshold (-190 HU, -30 HU) was used to assess EAT volume on NC datasets. For CE datasets, EAT volumes were estimated using three previously reported thresholds: (-190 HU, -30 HU), (-190 HU, -15 HU) and (-175 HU, -15 HU), and were analyzed by a semi-automated 3D fat analysis software. Subsequently, we applied a threshold correction to (-190 HU, -30 HU) based on the mean difference in radiodensity between NC and CE images (ΔEATrd = CE radiodensity - NC radiodensity). We then validated our findings on the EAT threshold in 21 additional patients with paired CT datasets. EAT volume from CE datasets using previously published thresholds consistently underestimated the NC-dataset EAT volume standard by 8.2%-19.1%. Using our corrected threshold (-190 HU, -3 HU) in CE datasets yielded statistically identical EAT volume to the NC EAT volume in the validation cohort (186.1 ± 80.3 vs. 185.5 ± 80.1 cm3; Δ = 0.6 cm3, 0.3%; p = 0.374). Estimating EAT volume from contrast-enhanced CT scans using a corrected threshold of (-190 HU, -3 HU) provided excellent agreement with EAT volume from non-contrast CT scans using the standard threshold of (-190 HU, -30 HU). Copyright © 2018. Published by Elsevier B.V.
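The correction amounts to shifting the upper fat cutpoint by the mean radiodensity difference between paired CE and NC images; a minimal sketch in which a ΔEATrd of +27 HU (an assumed value) reproduces the reported -3 HU cutpoint:

```python
def corrected_ce_threshold(nc_threshold_hu, delta_eat_rd_hu):
    """Shift only the upper cutpoint: contrast raises fat radiodensity,
    while the lower bound (-190 HU) is kept as in the NC standard."""
    lo, hi = nc_threshold_hu
    return (lo, hi + delta_eat_rd_hu)

print(corrected_ce_threshold((-190, -30), 27))  # (-190, -3)
```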
Development and Validation of a Photonumeric Scale for Evaluation of Volume Deficit of the Hand
Donofrio, Lisa; Hardas, Bhushan; Murphy, Diane K.; Carruthers, Jean; Carruthers, Alastair; Sykes, Jonathan M.; Creutz, Lela; Marx, Ann; Dill, Sara
2016-01-01
BACKGROUND A validated scale is needed for objective and reproducible comparisons of hand appearance before and after treatment in practice and clinical studies. OBJECTIVE To describe the development and validation of the 5-point photonumeric Allergan Hand Volume Deficit Scale. METHODS The scale was developed to include an assessment guide, verbal descriptors, morphed images, and real-subject images for each grade. The clinical significance of a 1-point score difference was evaluated in a review of image pairs representing varying differences in severity. Interrater and intrarater reliability was evaluated in a live-subject validation study (N = 296) completed during 2 sessions occurring 3 weeks apart. RESULTS A score difference of ≥1 point was shown to reflect a clinically significant difference (mean [95% confidence interval] absolute score difference, 1.12 [0.99–1.26] for clinically different image pairs and 0.45 [0.33–0.57] for not clinically different pairs). Intrarater agreement between the 2 validation sessions was almost perfect (mean weighted kappa = 0.83). Interrater agreement was almost perfect during the second session (0.82, primary end point). CONCLUSION The Allergan Hand Volume Deficit Scale is a validated and reliable scale for physician rating of hand volume deficit. PMID:27661741
Epistemological Dialogue of Validity: Building Validity in Educational and Social Research
ERIC Educational Resources Information Center
Cakir, Mustafa
2012-01-01
The notion of validity in the social sciences is evolving and is influenced by philosophy of science, critiques of objectivity, and epistemological debates. Methodology for validation of knowledge claims is diverse across different philosophies of science. In other words, the definition of validity and the way to establish it have evolved as…
Statistical Anomalies of Bitflips in SRAMs to Discriminate SBUs From MCUs
NASA Astrophysics Data System (ADS)
Clemente, Juan Antonio; Franco, Francisco J.; Villa, Francesca; Baylac, Maud; Rey, Solenne; Mecha, Hortensia; Agapito, Juan A.; Puchner, Helmut; Hubert, Guillaume; Velazco, Raoul
2016-08-01
Recently, the occurrence of multiple events in static tests has been investigated by checking the statistical distribution of the difference between the addresses of the words containing bitflips. That method has been successfully applied to Field Programmable Gate Arrays (FPGAs) and the original authors indicate that it is also valid for SRAMs. This paper presents a modified methodology that is based on checking the XORed addresses with bitflips, rather than on the difference. Irradiation tests on CMOS 130 & 90 nm SRAMs with 14-MeV neutrons have been performed to validate this methodology. Results in high-altitude environments are also presented and cross-checked with theoretical predictions. In addition, this methodology has also been used to detect modifications in the organization of said memories. Theoretical predictions have been validated with actual data provided by the manufacturer.
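A toy sketch of the XOR idea: addresses of cells hit by one multiple-cell upset differ in only a few bits, so their pairwise XOR has an anomalously low popcount, whereas independent single-bit upsets spread XOR values over the whole address space. The fixed popcount cutoff below is a simplification of the paper's statistical criterion, and all names and numbers are illustrative.

```python
import numpy as np

def xor_pair_signatures(addresses):
    """XOR of every pair of bitflip word addresses from one static test."""
    addr = np.asarray(addresses, dtype=np.uint32)
    i, j = np.triu_indices(len(addr), k=1)
    return addr[i] ^ addr[j]

def suspected_mcu_pairs(addresses, max_weight=2):
    """Flag pairs whose XORed address has very few bits set, i.e., whose
    words are likely physical neighbours upset by a single particle."""
    sig = xor_pair_signatures(addresses)
    weights = np.array([bin(int(v)).count("1") for v in sig])
    return sig[weights <= max_weight]

# 0x1A2B and 0x1A2A differ in one bit -> likely one physical event (MCU).
print(suspected_mcu_pairs([0x1A2B, 0x1A2A, 0x7F00]))  # [1]
```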
Schaefer, J; Burckhardt, B B; Tins, J; Bartel, A; Laeer, S
2017-01-01
The pharmacotherapy of pediatric patients suffering from heart failure is extrapolated from adults due to missing data in children. We therefore developed and validated a low-volume immunoassay for the reliable determination of renin. The immunoassay was validated according to international guidelines. The assay allows the reliable determination of renin in 40 μL of plasma within a calibration range of 4-128 pg/mL. Between-run accuracy varied from -3.3 to +3.0% (relative error), while between-run precision ranged from 4.9 to 11.3% (coefficient of variation). The low-volume immunoassay facilitates the reliable collection of pharmacodynamic data in children.
Camacho-Sandoval, Rosa; Sosa-Grande, Eréndira N; González-González, Edith; Tenorio-Calvo, Alejandra; López-Morales, Carlos A; Velasco-Velázquez, Marco; Pavón-Romero, Lenin; Pérez-Tapia, Sonia Mayra; Medina-Rivero, Emilio
2018-06-05
Physicochemical and structural properties of the proteins used as active pharmaceutical ingredients of biopharmaceuticals are determinant for carrying out their biological activity. In this regard, the assays intended to evaluate the functionality of biopharmaceuticals provide confirmatory evidence that they possess the appropriate physicochemical properties and structural conformation. The validation of the methodologies used for the assessment of critical quality attributes of biopharmaceuticals is a key requirement for manufacturing under GMP environments. Herein we present the development and validation of a flow cytometry-based methodology for the evaluation of adalimumab's affinity towards membrane-bound TNFα (mTNFα) on recombinant CHO cells. This in vitro methodology measures the interaction between an in-solution antibody and its target molecule on the cell surface through a fluorescent signal. The characteristics evaluated during the validation exercise showed that this methodology is suitable for its intended purpose. The assay proved to be accurate (r2 = 0.92, slope = 1.20), precise (%CV ≤ 18.31) and specific (curve fitting, r2 = 0.986-0.997) for evaluating the binding of adalimumab to mTNFα. The results obtained here provide evidence that detection by flow cytometry is a viable alternative for bioassays used in the pharmaceutical industry. In addition, this methodology could be standardized for the evaluation of other biomolecules acting through the same mechanism of action. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Parilla, Philip A.; Gross, Karl; Hurst, Katherine; Gennett, Thomas
2016-03-01
The ultimate goal of the hydrogen economy is the development of hydrogen storage systems that meet or exceed the US DOE's goals for onboard storage in hydrogen-powered vehicles. In order to develop new materials to meet these goals, it is extremely critical to accurately, uniformly and precisely measure materials' properties relevant to the specific goals. Without this assurance, such measurements are not reliable and, therefore, do not provide a benefit toward the work at hand. In particular, capacity measurements for hydrogen storage materials must be based on valid and accurate results to ensure proper identification of promising materials for further development. Volumetric capacity determinations are becoming increasingly important for identifying promising materials, yet there exists controversy over how such determinations are made and whether they are valid, owing to differing methodologies for counting the hydrogen content. These issues are discussed herein, and we show mathematically that capacity determinations can be made rigorously and unambiguously if the constituent volumes are well defined and measurable in practice. It is widely accepted that this holds for excess capacity determinations, and we show here that it can hold for the total capacity determination. Because the adsorption volume is undefined, the absolute capacity determination remains imprecise. Furthermore, we show that there is a direct relationship between determining the respective capacities and the calibration constants used for the manometric and gravimetric techniques. Several suggested volumetric-capacity figures of merit are defined and discussed, and reporting requirements are recommended. Finally, an example is provided to illustrate these protocols and concepts.
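A sketch of the point about well-defined volumes: excess and total capacities follow unambiguously once the void volume and the sample's pore volume are measurable. Function names, variables and numbers are assumptions, not the paper's notation.

```python
def excess_capacity(n_dosed_mol, gas_density_mol_ml, void_volume_ml):
    """Excess = dosed gas minus what the measured void volume would hold at
    the bulk gas density (the widely accepted determination)."""
    return n_dosed_mol - gas_density_mol_ml * void_volume_ml

def total_capacity(n_excess_mol, gas_density_mol_ml, pore_volume_ml):
    """Total = excess + bulk gas residing in the sample's own pore volume;
    well defined whenever that constituent volume is measurable."""
    return n_excess_mol + gas_density_mol_ml * pore_volume_ml

n_ex = excess_capacity(0.050, 2.0e-4, 150.0)     # mol of H2 adsorbed in excess
print(n_ex, total_capacity(n_ex, 2.0e-4, 40.0))  # 0.02, 0.028 mol
```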
Airport Landside. Volume I. Planning Guide.
DOT National Transportation Integrated Search
1982-01-01
This volume describes a methodology for performing airport landside planning by applying the Airport Landside Simulation Model (ALSIM) developed by TSC. For this analysis, the airport landside is defined as extending from the airport boundary to the ...
FAA Rotorcraft Research, Engineering, and Development Bibliography 1962-1989
1990-05-01
(Albert G. Delucien) (NTIS: ADA 102 521)
FAA/CT-88/10 Digital Systems Validation Handbook - Volume II (R.L. McDowall, Hardy P. Curd, Lloyd N. Popish...)
Digital Systems in Avionics and Flight Control Applications, Handbook - Volume I (Ellis F. Hilt, Donald Eldredge, Jeff Webb, Charles Lucius, Michael S...)
...Structure Statistics of Helicopter GPS Navigation with the Magnavox Z-Set (Robert D. Till)
FAA/CT-82/115 Handbook - Volume I, Validation of Digital...
Market projections of cellulose nanomaterial-enabled products-- Part 2: Volume estimates
John Cowie; E.M. (Ted) Bilek; Theodore H. Wegner; Jo Anne Shatkin
2014-01-01
Nanocellulose has enormous potential to provide an important materials platform in numerous product sectors. This study builds on previous work by the same authors in which likely high-volume, low-volume, and novel applications for cellulosic nanomaterials were identified. In particular, this study creates a transparent methodology and estimates the potential annual...
Human Rehabilitation Techniques. Project Papers. Volume IV, Part B.
ERIC Educational Resources Information Center
Dudek, R. A.; And Others
Volume IV, Part B of a six-volume final report (which covers the findings of a research project on policy and technology related to rehabilitation of disabled individuals) presents a continuation of papers (Part A) giving an overview of project methodology, much of the data used in projecting consequences and policymaking impacts in project…
Stavrou, Elissaios; Zaug, Joseph M.; Bastea, Sorin; ...
2016-04-07
Quasi-hydrostatic high-pressure equations of state (EOS) are typically determined, for crystalline solids, by measuring unit-cell volumes using x-ray diffraction (XRD) techniques. However, when characterizing low-symmetry materials with large unit cells, conventional XRD approaches may become problematic. To overcome this issue, we examined the utility of a "direct" approach toward determining high-pressure material volume by measuring surface area and sample thickness using optical microscopy and interferometry (OMI), respectively. We have validated this experimental approach by comparing results obtained for TATB (2,4,6-triamino-1,3,5-trinitrobenzene) with an EOS determined from synchrotron XRD measurements, and a good match is observed. We have measured the high-pressure EOS of 5-nitro-2,4-dihydro-1,2,4-triazol-3-one (α-NTO) up to 33 GPa. No high-pressure XRD EOS data have been published on α-NTO, probably due to its complex crystal structure. Furthermore, the results of this study suggest that OMI is a reliable and versatile alternative for determining EOSs, especially when conventional methodologies are impractical.
Han, Yuqian; Ma, Qinchuan; Lu, Jie; Xue, Yong; Xue, Changhu
2012-12-15
A simple, rapid and sensitive method was developed for determination of 17α-methyltestosterone in aquatic products by extraction with subcritical 1,1,1,2-tetrafluoroethane (R134a) extraction and high performance liquid chromatography (HPLC). Response surface methodology (RSM) was adopted to optimise extraction pressure, temperature and co-solvent volume. The optimum extraction conditions predicted within the experimental ranges were as follows: pressure 5 MPa, temperature 31°C, and co-solvent volume 3.35ml. The analysis was carried out on XDB-C(18) column (4.6 mm × 250 mm, 5 μm) with the mobile phase acetonitrile-water (55:45, v/v), flow rate 0.8 ml/min, temperature 30°C and wavelength 245 nm. Good linearity of detection was obtained for 17α-methyltestosterone between concentrations of 50-250 ng/ml, r(2)=0.999. The method was validated using samples fortified with 17α-methyltestosterone at levels of 10, 30 and 50 ng/g, the mean recovery exceeds 90%, and the RSD values were less than 10%. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
Sabour, Siamak
2018-03-08
The purpose of this letter, in response to Hall, Mehta, and Fackrell (2017), is to provide important knowledge about methodology and statistical issues in assessing the reliability and validity of an audiologist-administered tinnitus loudness matching test and a patient-reported tinnitus loudness rating. The author uses reference textbooks and published articles regarding scientific assessment of the validity and reliability of a clinical test to discuss the statistical test and the methodological approach in assessing validity and reliability in clinical research. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess reliability and validity. The qualitative variables of sensitivity, specificity, positive predictive value, negative predictive value, false positive and false negative rates, likelihood ratio positive and likelihood ratio negative, as well as odds ratio (i.e., ratio of true to false results), are the most appropriate estimates to evaluate validity of a test compared to a gold standard. In the case of quantitative variables, depending on distribution of the variable, Pearson r or Spearman rho can be applied. Diagnostic accuracy (validity) and diagnostic precision (reliability or agreement) are two completely different methodological issues. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess validity.
Burruss, Robert
2009-01-01
Geologically based methodologies to assess the possible volumes of subsurface CO2 storage must apply clear and uniform definitions of resource and reserve concepts to each assessment unit (AU). Application of the current state of knowledge of geologic, hydrologic, geochemical, and geophysical parameters (contingencies) that control storage volume and injectivity allows definition of the contingent resource (CR) of storage. The parameters known with the greatest certainty are based on observations on known traps (KTs) within the AU that produced oil, gas, and water. The aggregate volume of KTs within an AU defines the most conservation volume of contingent resource. Application of the concept of reserve growth to CR volume provides a logical path for subsequent reevaluation of the total resource as knowledge of CO2 storage processes increases during implementation of storage projects. Increased knowledge of storage performance over time will probably allow the volume of the contingent resource of storage to grow over time, although negative growth is possible.
Burruss, R.C.
2009-01-01
Geologically based methodologies to assess the possible volumes of subsurface CO2 storage must apply clear and uniform definitions of resource and reserve concepts to each assessment unit (AU). Application of the current state of knowledge of geologic, hydrologic, geochemical, and geophysical parameters (contingencies) that control storage volume and injectivity allows definition of the contingent resource (CR) of storage. The parameters known with the greatest certainty are based on observations on known traps (KTs) within the AU that produced oil, gas, and water. The aggregate volume of KTs within an AU defines the most conservation volume of contingent resource. Application of the concept of reserve growth to CR volume provides a logical path for subsequent reevaluation of the total resource as knowledge of CO2 storage processes increases during implementation of storage projects. Increased knowledge of storage performance over time will probably allow the volume of the contingent resource of storage to grow over time, although negative growth is possible. ?? 2009 Elsevier Ltd. All rights reserved.
Kimhy, David; Delespaul, Philippe; Ahn, Hongshik; Cai, Shengnan; Shikhman, Marina; Lieberman, Jeffrey A; Malaspina, Dolores; Sloan, Richard P
2010-11-01
Psychosis has been repeatedly suggested to be affected by increases in stress and arousal. However, there is a dearth of evidence supporting the temporal link between stress, arousal, and psychosis during "real-world" functioning. This paucity of evidence may stem from limitations of current research methodologies. Our aim is to the test the feasibility and validity of a novel methodology designed to measure concurrent stress and arousal in individuals with psychosis during "real-world" daily functioning. Twenty patients with psychosis completed a 36-hour ambulatory assessment of stress and arousal. We used experience sampling method with palm computers to assess stress (10 times per day, 10 AM → 10 PM) along with concurrent ambulatory measurement of cardiac autonomic regulation using a Holter monitor. The clocks of the palm computer and Holter monitor were synchronized, allowing the temporal linking of the stress and arousal data. We used power spectral analysis to determine the parasympathetic contributions to autonomic regulation and sympathovagal balance during 5 minutes before and after each experience sample. Patients completed 79% of the experience samples (75% with a valid concurrent arousal data). Momentary increases in stress had inverse correlation with concurrent parasympathetic activity (ρ = -.27, P < .0001) and positive correlation with sympathovagal balance (ρ = .19, P = .0008). Stress and heart rate were not significantly related (ρ = -.05, P = .3875). The findings support the feasibility and validity of our methodology in individuals with psychosis. The methodology offers a novel way to study in high time resolution the concurrent, "real-world" interactions between stress, arousal, and psychosis. The authors discuss the methodology's potential applications and future research directions.
Konz, Tobias; Migliavacca, Eugenia; Dayon, Loïc; Bowman, Gene; Oikonomidi, Aikaterini; Popp, Julius; Rezzi, Serge
2017-05-05
We here describe the development, validation and application of a quantitative methodology for the simultaneous determination of 29 elements in human serum using state-of-the-art inductively coupled plasma triple quadrupole mass spectrometry (ICP-MS/MS). This new methodology offers high-throughput elemental profiling using simple dilution of minimal quantity of serum samples. We report the outcomes of the validation procedure including limits of detection/quantification, linearity of calibration curves, precision, recovery and measurement uncertainty. ICP-MS/MS-based ionomics was used to analyze human serum of 120 older adults. Following a metabolomic data mining approach, the generated ionome profiles were subjected to principal component analysis revealing gender and age-specific differences. The ionome of female individuals was marked by higher levels of calcium, phosphorus, copper and copper to zinc ratio, while iron concentration was lower with respect to male subjects. Age was associated with lower concentrations of zinc. These findings were complemented with additional readouts to interpret micronutrient status including ceruloplasmin, ferritin and inorganic phosphate. Our data supports a gender-specific compartmentalization of the ionome that may reflect different bone remodelling in female individuals. Our ICP-MS/MS methodology enriches the panel of validated "Omics" approaches to study molecular relationships between the exposome and the ionome in relation with nutrition and health.
Low validity of Google Trends for behavioral forecasting of national suicide rates
Niederkrotenthaler, Thomas; Till, Benedikt; Ajdacic-Gross, Vladeta; Voracek, Martin
2017-01-01
Recent research suggests that search volumes of the most popular search engine worldwide, Google, provided via Google Trends, could be associated with national suicide rates in the USA, UK, and some Asian countries. However, search volumes have mostly been studied in an ad hoc fashion, without controls for spurious associations. This study evaluated the validity and utility of Google Trends search volumes for behavioral forecasting of suicide rates in the USA, Germany, Austria, and Switzerland. Suicide-related search terms were systematically collected and respective Google Trends search volumes evaluated for availability. Time spans covered 2004 to 2010 (USA, Switzerland) and 2004 to 2012 (Germany, Austria). Temporal associations of search volumes and suicide rates were investigated with time-series analyses that rigorously controlled for spurious associations. The number and reliability of analyzable search volume data increased with country size. Search volumes showed various temporal associations with suicide rates. However, associations differed both across and within countries and mostly followed no discernable patterns. The total number of significant associations roughly matched the number of expected Type I errors. These results suggest that the validity of Google Trends search volumes for behavioral forecasting of national suicide rates is low. The utility and validity of search volumes for the forecasting of suicide rates depend on two key assumptions (“the population that conducts searches consists mostly of individuals with suicidal ideation”, “suicide-related search behavior is strongly linked with suicidal behavior”). We discuss strands of evidence that these two assumptions are likely not met. Implications for future research with Google Trends in the context of suicide research are also discussed. PMID:28813490
Automated MRI parcellation of the frontal lobe
Ranta, Marin E.; Chen, Min; Crocetti, Deana; Prince, Jerry L.; Subramaniam, Krish; Fischl, Bruce; Kaufmann, Walter E.; Mostofsky, Stewart H.
2014-01-01
Examination of associations between specific disorders and physical properties of functionally relevant frontal lobe sub-regions is a fundamental goal in neuropsychiatry. Here we present and evaluate automated methods of frontal lobe parcellation with the programs FreeSurfer(FS) and TOADS-CRUISE(T-C), based on the manual method described in Ranta et al. (2009) in which sulcal-gyral landmarks were used to manually delimit functionally relevant regions within the frontal lobe: i.e., primary motor cortex, anterior cingulate, deep white matter, premotor cortex regions (supplementary motor complex, frontal eye field and lateral premotor cortex) and prefrontal cortex (PFC) regions (medial PFC, dorsolateral PFC, inferior PFC, lateral orbitofrontal cortex (OFC) and medial OFC). Dice's coefficient, a measure of overlap, and percent volume difference were used to measure the reliability between manual and automated delineations for each frontal lobe region. For FS, mean Dice's coefficient for all regions was 0.75 and percent volume difference was 21.2%. For T-C the mean Dice's coefficient was 0.77 and the mean percent volume difference for all regions was 20.2%. These results, along with a high degree of agreement between the two automated methods (mean Dice's coefficient = 0.81, percent volume difference = 12.4%) and a proof-of-principle group difference analysis that highlights the consistency and sensitivity of the automated methods, indicate that the automated methods are valid techniques for parcellation of the frontal lobe into functionally relevant sub-regions. Thus, the methodology has the potential to increase efficiency, statistical power and reproducibility for population analyses of neuropsychiatric disorders with hypothesized frontal lobe contributions. PMID:23897577
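Both agreement metrics used in this evaluation are simple to compute on binary label volumes. A minimal sketch (array names are illustrative):

    # Dice's coefficient and percent volume difference between a manual
    # parcellation `a` and an automated parcellation `b` (binary arrays).
    import numpy as np

    def dice(a, b):
        # Dice's coefficient: 2|A intersect B| / (|A| + |B|)
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    def percent_volume_difference(a, b):
        # absolute volume difference as a percentage of the manual volume
        return 100.0 * abs(int(a.sum()) - int(b.sum())) / a.sum()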
Are validated outcome measures used in distal radial fractures truly valid?
Nienhuis, R. W.; Bhandari, M.; Goslings, J. C.; Poolman, R. W.; Scholtes, V. A. B.
2016-01-01
Objectives Patient-reported outcome measures (PROMs) are often used to evaluate the outcome of treatment in patients with distal radial fractures. Which PROM to select is often based on assessment of measurement properties, such as validity and reliability. Measurement properties are assessed in clinimetric studies, and results are often reviewed without considering the methodological quality of these studies. Our aim was to systematically review the methodological quality of clinimetric studies that evaluated measurement properties of PROMs used in patients with distal radial fractures, and to make recommendations for the selection of PROMs based on the level of evidence of each individual measurement property. Methods A systematic literature search was performed in PubMed, EMbase, CINAHL and PsycINFO databases to identify relevant clinimetric studies. Two reviewers independently assessed the methodological quality of the studies on measurement properties, using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Level of evidence (strong / moderate / limited / lacking) for each measurement property per PROM was determined by combining the methodological quality and the results of the different clinimetric studies. Results In all, 19 out of 1508 identified unique studies were included, in which 12 PROMs were rated. The Patient-rated wrist evaluation (PRWE) and the Disabilities of Arm, Shoulder and Hand questionnaire (DASH) were evaluated on most measurement properties. The evidence for the PRWE is moderate that its reliability, validity (content and hypothesis testing), and responsiveness are good. The evidence is limited that its internal consistency and cross-cultural validity are good, and its measurement error is acceptable. There is no evidence for its structural and criterion validity. The evidence for the DASH is moderate that its responsiveness is good. The evidence is limited that its reliability and the validity on hypothesis testing are good. There is no evidence for the other measurement properties. Conclusion According to this systematic review, there is, at best, moderate evidence that the responsiveness of the PRWE and DASH are good, as are the reliability and validity of the PRWE. We recommend these PROMs in clinical studies in patients with distal radial fractures; however, more clinimetric studies of higher methodological quality are needed to adequately determine the other measurement properties. Cite this article: Dr Y. V. Kleinlugtenbelt. Are validated outcome measures used in distal radial fractures truly valid?: A critical assessment using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Bone Joint Res 2016;5:153–161. DOI: 10.1302/2046-3758.54.2000462. PMID:27132246
NASA Technical Reports Server (NTRS)
Martino, J. P.; Lenz, R. C., Jr.; Chen, K. L.; Kahut, P.; Sekely, R.; Weiler, J.
1979-01-01
The appendices for the cross impact methodology are presented. These include: user's guide, telecommunication events, cross impacts, projection of historical trends, and projection of trends in satellite communications.
Abdallah, Faraj W; Yu, Eugene; Cholvisudhi, Phantila; Niazi, Ahtsham U; Chin, Ki J; Abbas, Sherif; Chan, Vincent W
2017-01-01
Ultrasound (US) imaging of the airway may be useful in predicting difficulty of airway management (DAM); but its use is limited by lack of proof of its validity and reliability. We sought to validate US imaging of the airway by comparison to CT-scan, and to assess its inter- and intra-observer reliability. We used submandibular sonographic imaging of the mouth and oropharynx to examine how well the ratio of tongue thickness to oral cavity height correlates with the ratio of tongue volume to oral cavity volume, an established tomographic measure of DAM. A cohort of 34 patients undergoing CT-scan was recruited. Study standardized assessments included CT-measured ratios of tongue volume to oropharyngeal cavity volume; tongue thickness to oral cavity height; and US-measured ratio of tongue thickness to oral cavity height. Two sonographers independently performed US imaging of the airway before and after CT-scan. Our findings indicate that the US-measured ratio of tongue thickness to oral cavity height highly correlates with the CT-measured ratio of tongue volume to oral cavity volume. US measurements also demonstrated strong inter- and intra-observer reliability. This study suggests that US is a valid and reliable tool for imaging the oral and oropharyngeal parts of the airway, as well as for measuring the volumetric relationship between the tongue and oral cavity, and may therefore be a useful predictor of DAM.
Alternative occupied volume integrity (OVI) tests and analyses.
DOT National Transportation Integrated Search
2013-10-01
FRA, supported by the Volpe Center, conducted research on alternative methods of evaluating occupied volume integrity (OVI) in passenger railcars. Guided by this research, an alternative methodology for evaluating OVI that ensures an equivalent or gr...
Methodology update for estimating volume to service flow ratio.
DOT National Transportation Integrated Search
2015-12-01
Volume/service flow ratio (VSF) is calculated by the Highway Performance Monitoring System (HPMS) software as an indicator of peak hour congestion. It is an essential input to the Kentucky Transportation Cabinet's (KYTC) key planning applications, ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, S.C.
1993-08-01
This report discusses a field demonstration of a methodology for characterizing an aquifer's geohydrology in the detail required to design an optimum network of wells and/or infiltration galleries for bioreclamation systems. The project work was conducted on a 1-hectare test site at Columbus AFB, Mississippi. The technical report is divided into two volumes. Volume I describes the test site and the well network, the assumptions, and the application of equations that define groundwater flow to a well, the results of three large-scale aquifer tests, and the results of 160 single-pump tests. Volume II describes the bore hole flowmeter tests, the tracer tests, the geological investigations, the geostatistical analysis and the guidelines for using groundwater models to design bioreclamation systems. Site characterization, Hydraulic conductivity, Groundwater flow, Geostatistics, Geohydrology, Monitoring wells.
Note: Methodology for the analysis of Bluetooth gateways in an implemented scatternet.
Etxaniz, J; Monje, P M; Aranguren, G
2014-03-01
This Note introduces a novel methodology to analyze the time performance of Bluetooth gateways in multi-hop networks, known as scatternets. The methodology is focused on distinguishing between the processing time and the time that each communication between nodes takes along an implemented scatternet. This technique is not only valid for Bluetooth networks but also for other wireless networks that offer access to their middleware in order to include beacons in the operation of the nodes. We show in this Note the results of the tests carried out on a Bluetooth scatternet in order to highlight the reliability and effectiveness of the methodology. The results also validate this technique showing convergence in the results when subtracting the time for the beacons from the delay measurements.
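The core of the technique is a subtraction: given end-to-end delay and the beacon timestamps recorded at each gateway, the per-gateway processing time can be separated from pure communication time. A hedged sketch of that step (all names are hypothetical; the Note's actual beacon format is not specified here):

    # Separate gateway processing time from communication time using
    # (t_in, t_out) beacon timestamp pairs recorded at each gateway.
    def decompose_delay(t_send, beacons, t_recv):
        processing = sum(t_out - t_in for t_in, t_out in beacons)
        communication = (t_recv - t_send) - processing
        return processing, communication

    proc, comm = decompose_delay(0.000, [(0.012, 0.045), (0.078, 0.101)], 0.140)
    print(f"processing={proc:.3f}s, communication={comm:.3f}s")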
Validating agent oriented methodology (AOM) for netlogo modelling and simulation
NASA Astrophysics Data System (ADS)
WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan
2017-10-01
AOM (Agent Oriented Modeling) is a comprehensive and unified methodology for agent-oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent system development. Although AOM is claimed to be capable of supporting the development of complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and its adoption is still in its infancy. Among the reasons is that there are few case studies or success stories of AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. They demonstrate how AOM is useful for an epidemiology study and an ecological study, further validating AOM in a qualitative manner.
[The added value of information summaries supporting clinical decisions at the point-of-care].
Banzi, Rita; González-Lorenzo, Marien; Kwag, Koren Hyogene; Bonovas, Stefanos; Moja, Lorenzo
2016-11-01
Evidence-based healthcare requires the integration of the best research evidence with clinical expertise and patients' values. International publishers are developing evidence-based information services and resources designed to overcome the difficulties in retrieving, assessing and updating medical information, as well as to facilitate rapid access to valid clinical knowledge. Point-of-care information summaries are defined as web-based medical compendia that are specifically designed to deliver pre-digested, rapidly accessible, comprehensive, and periodically updated information to health care providers. Their validity must be assessed against marketing claims that they are evidence-based. We periodically evaluate the content development processes of several international point-of-care information summaries. The number of these products has increased along with their quality. The last analysis, done in 2014, identified 26 products and found that three of them (Best Practice, DynaMed and UpToDate) scored the highest across all evaluated dimensions (volume, quality of the editorial process and evidence-based methodology). Point-of-care information summaries, as stand-alone products or integrated with other systems, are gaining ground to support clinical decisions. The choice of one product over another depends both on the properties of the service and the preferences of users. However, even the most innovative information system must rely on transparent and valid contents. Individuals and institutions should regularly assess the value of point-of-care summaries, as their quality changes rapidly over time.
OPUS: Optimal Projection for Uncertain Systems. Volume 1
1991-09-01
unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for ... effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that ... characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of
Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L
2018-02-01
Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for the clustering of data from such visualizations. The methodology is theoretically justified and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
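A minimal sketch of the general idea, not the authors' estimator: represent each calcium time series by a small set of basis coefficients (a simple polynomial stand-in for a functional representation), then cluster the coefficient vectors with a Gaussian mixture model.

    # Functional-data-style clustering of time series via Gaussian mixtures.
    import numpy as np
    from numpy.polynomial import polynomial as P
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 200)
    series = rng.normal(size=(1000, 200))    # 1000 pixel traces (synthetic)

    # fit a degree-5 polynomial to each trace; use coefficients as features
    coeffs = np.array([P.polyfit(t, y, deg=5) for y in series])
    labels = GaussianMixture(n_components=3, random_state=0).fit_predict(coeffs)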
1984-08-01
produce even the most basic binary cloud data and methodologies needed to support the evaluation programs." In view of this recognized deficiency, the ... There was an exchange of information with non-DoD agencies, with presentations made by NASA and NOAA (see pp. 537, 569). A brief report by the steering ... on cloud data bases and methodologies for users. To achieve these actions requires explicit support. *See classified supplementary volume.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much-needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude, and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, having sparse earthquake activity with rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exists from the region to empirically quantify their effects. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. This document, Volume II, contains Appendices 2, 3, 5, 6, and 7 covering the following topics: Eastern North American Empirical Ground Motion Data; Examination of Variance of Seismographic Network Data; Soil Amplification and Vertical-to-Horizontal Ratios from Analysis of Strong Motion Data From Active Tectonic Regions; Revision and Calibration of Ou and Herrmann Method; Generalized Ray Procedure for Modeling Ground Motion Attenuation; Crustal Models for Velocity Regionalization; Depth Distribution Models; Development of Generic Site Effects Model; Validation and Comparison of One-Dimensional Site Response Methodologies; Plots of Amplification Factors; Assessment of Coupling Between Vertical & Horizontal Motions in Nonlinear Site Response Analysis; and Modeling of Dynamic Soil Properties.
Optimized Design and Analysis of Sparse-Sampling fMRI Experiments
Perrachione, Tyler K.; Ghosh, Satrajit S.
2013-01-01
Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power. PMID:23616742
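Recommendation (1) above, a physiologically informed model incorporating hemodynamic response convolution, can be sketched as follows. This is an illustrative construction, not the authors' code: the double-gamma HRF constants follow a common parameterization and the timing values are assumptions.

    # Build a sparse-fMRI regressor: convolve the stimulus train with a
    # canonical double-gamma HRF, then sample at the sparse acquisition times.
    import numpy as np
    from scipy.stats import gamma

    dt = 0.1                                  # fine time grid (s)
    t = np.arange(0, 30, dt)
    hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)   # double-gamma HRF (assumed constants)
    hrf /= hrf.max()

    stim = np.zeros(6000)                     # 600 s experiment on the fine grid
    stim[::40] = 1.0                          # stimulus every 4 s (high event rate)
    bold = np.convolve(stim, hrf)[:stim.size] # predicted BOLD time course

    tr = 10.0                                 # sparse TR: one volume every 10 s
    acq = (np.arange(0, 600, tr) / dt).astype(int)
    regressor = bold[acq]                     # design-matrix column at acquisition times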
DOT National Transportation Integrated Search
2000-04-01
This report presents detailed analytic tools and results on dynamic response which are used to develop the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical parameters and characteris...
Tunnel and Station Cost Methodology Volume II: Stations
DOT National Transportation Integrated Search
1981-01-01
The main objective of this study was to develop a model for estimating the cost of subway station and tunnel construction. This report describes a cost estimating methodology for subway tunnels that can be used by planners, designers, owners, and gov...
Understanding socio-economic impacts of geohazards aided by cyber-enabled systems
NASA Astrophysics Data System (ADS)
Klose, C. D.; Webersik, C.
2008-12-01
Due to an increase in the volume of geohazards worldwide, not only are impoverished regions in less developed countries such as Haiti vulnerable to risk, but so are low-income regions in industrialized countries such as the USA. This was exemplified once again by Hurricanes Gustav, Hanna and Ike and their impact on the Caribbean countries during the summer of 2008. To date, extensive research has been conducted to improve the monitoring of human-nature coupled systems. However, there is little emphasis on improving and developing methodologies to a) interpret multi-dimensional and complex data and b) validate prediction and modeling results. This presentation tries to motivate more research initiatives to address the aforementioned issues, bringing together two academic disciplines, earth and social sciences, to research the relationship between natural and socio-economic processes. Results are presented where cyber-enabled methods based on artificial intelligence are applied to different geohazards and regions in the world. They include 1) modeling of public health risks associated with volcanic gas hazards, 2) prediction and validation of potential areas of mining-triggered earthquakes, and 3) modeling of socio-economic risks associated with tropical storms in Haiti and the Dominican Republic.
Validation of Storm Water Management Model Storm Control Measures Modules
NASA Astrophysics Data System (ADS)
Simon, M. A.; Platz, M. C.
2017-12-01
EPA's Storm Water Management Model (SWMM) is a computational code heavily relied upon by industry for the simulation of wastewater and stormwater infrastructure performance. Many municipalities are relying on SWMM results to design multi-billion-dollar, multi-decade infrastructure upgrades. Since the 1970s, EPA and others have developed five major releases, the most recent ones containing storm control measures modules for green infrastructure. The main objective of this study was to quantify the accuracy with which SWMM v5.1.10 simulates the hydrologic activity of previously monitored low impact developments. Model performance was evaluated with a mathematical comparison of outflow hydrographs and total outflow volumes, using empirical data and a multi-event, multi-objective calibration method. The calibration methodology utilized PEST++ Version 3, a parameter estimation tool, which aided in the selection of unmeasured hydrologic parameters. From the validation study and sensitivity analysis, several model improvements were identified to advance SWMM LID Module performance for permeable pavements, infiltration units and green roofs, and these were performed and reported herein. Overall, it was determined that SWMM can successfully simulate low impact development controls given accurate model confirmation, parameter measurement, and model calibration.
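One standard way to score simulated against observed outflow hydrographs is the Nash-Sutcliffe efficiency; the study's exact comparison metric is not restated here, so treat this as an illustrative sketch only:

    # Nash-Sutcliffe efficiency: 1 = perfect fit; <= 0 means the model is
    # no better than predicting the observed mean.
    import numpy as np

    def nse(observed, simulated):
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)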
NASA Astrophysics Data System (ADS)
Schrooyen, Pierre; Chatelain, Philippe; Hillewaert, Koen; Magin, Thierry E.
2014-11-01
The atmospheric entry of spacecraft presents several challenges in simulating the aerothermal flow around the heat shield. Predicting an accurate heat flux is a complex task, especially regarding the interaction between the flow in the free stream and the erosion of the thermal protection material. To capture this interaction, a continuum approach is developed to go progressively from the region fully occupied by fluid to a receding porous medium. The volume-averaged Navier-Stokes equations are used to model both phases in the same computational domain, considering a single set of conservation laws. The porosity is itself a variable of the computation, allowing volumetric ablation to be taken into account through adequate source terms. This approach is implemented within a computational tool based on a high-order discontinuous Galerkin discretization. The multi-dimensional tool has already been validated and has demonstrated an efficient parallel implementation. Within this platform, a fully implicit method was developed to simulate multi-phase reacting flows. Numerical results to verify and validate the methodology are considered within this work. Interactions between the flow and the ablated geometry are also presented. Supported by the Fund for Research Training in Industry and Agriculture.
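As a point of reference for the volume-averaged formulation, the continuity equation in such approaches typically takes a porosity-weighted form; this is a generic sketch, not the paper's exact equation set:

    \frac{\partial (\phi \rho)}{\partial t} + \nabla \cdot (\phi \rho \mathbf{u}) = \dot{\omega}_{\mathrm{abl}}

where \phi is the (computed) porosity, \rho the gas density, \mathbf{u} the velocity, and \dot{\omega}_{\mathrm{abl}} a mass source accounting for volumetric ablation.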
Gartner, Joseph E.; Cannon, Susan H.; Santi, Paul M
2014-01-01
Debris flows and sediment-laden floods in the Transverse Ranges of southern California pose severe hazards to nearby communities and infrastructure. Frequent wildfires denude hillslopes and increase the likelihood of these hazardous events. Debris-retention basins protect communities and infrastructure from the impacts of debris flows and sediment-laden floods and also provide critical data for volumes of sediment deposited at watershed outlets. In this study, we supplement existing data for the volumes of sediment deposited at watershed outlets with newly acquired data to develop new empirical models for predicting volumes of sediment produced by watersheds located in the Transverse Ranges of southern California. The sediment volume data represent a broad sample of conditions found in Ventura, Los Angeles and San Bernardino Counties, California. The measured volumes of sediment, watershed morphology, distributions of burn severity within each watershed, the time since the most recent fire, triggering storm rainfall conditions, and engineering soil properties were analyzed using multiple linear regressions to develop two models. A “long-term model” was developed for predicting volumes of sediment deposited by both debris flows and floods at various times since the most recent fire from a database of volumes of sediment deposited by a combination of debris flows and sediment-laden floods with no time limit since the most recent fire (n = 344). A subset of this database was used to develop an “emergency assessment model” for predicting volumes of sediment deposited by debris flows within two years of a fire (n = 92). Prior to developing the models, 32 volumes of sediment, and related parameters for watershed morphology, burn severity and rainfall conditions were retained to independently validate the long-term model. Ten of these volumes of sediment were deposited by debris flows within two years of a fire and were used to validate the emergency assessment model. The models were validated by comparing predicted and measured volumes of sediment. These validations were also performed for previously developed models and identify that the models developed here best predict volumes of sediment for burned watersheds in comparison to previously developed models.
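The multiple-linear-regression fitting and predicted-versus-measured validation described above can be illustrated compactly. The predictor set, units, and functional form below are assumptions for illustration, not the paper's published model; a log-transformed volume response is a common choice in this literature.

    # Illustrative OLS fit of ln(sediment volume) on watershed/storm descriptors.
    import numpy as np

    # columns: ln(watershed area), fraction burned at high severity, storm rainfall (mm)
    X = np.array([[1.2, 0.6, 25.0],
                  [0.4, 0.9, 40.0],
                  [2.1, 0.3, 12.0],
                  [1.7, 0.7, 33.0]])
    ln_volume = np.array([8.1, 9.0, 7.2, 8.8])   # ln(m^3), synthetic values

    A = np.column_stack([np.ones(len(X)), X])    # add intercept column
    coef, *_ = np.linalg.lstsq(A, ln_volume, rcond=None)
    predicted = A @ coef                          # compare to measured volumes to validate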
Leighton, Angela; Weinborn, Michael; Maybery, Murray
2014-10-01
Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.
Volume, Conservation and Instruction: A Classroom Based Solomon Four Group Study of Conflict.
ERIC Educational Resources Information Center
Rowell, J. A.; Dawson, C. J.
1981-01-01
Summarizes a study to widen the applicability of Piagetian theory-based conflict methodology from individual situations to entire classes. A Solomon four group design was used to study effects of conflict instruction on students' (N=127) ability to conserve volume of noncompressible matter and to apply that knowledge to gas volume. (Author/JN)
ERIC Educational Resources Information Center
Weinberg, Jessica P., Ed.; O'Bryan, Erin L., Ed.; Moll, Laura A., Ed.; Haugan, Jason D., Ed.
The five papers included in this volume approach the study of American Indian languages from a diverse array of methodological and theoretical approaches to linguistics. Two papers focus on approaches that come from the applied linguistics tradition, emphasizing ethnolinguistics and discourse analysis: Sonya Bird's paper "A Cross Cultural…
ERIC Educational Resources Information Center
Schaumann, Leif
Intended as a companion piece to volume 7 in the Method Series, Pharmaceutical Supply System Planning (CE 024 234), this fifth of six volumes in the International Health Planning Reference Series is a combined literature review and annotated bibliography dealing with alternative methodologies for planning and analyzing pharmaceutical supply…
ERIC Educational Resources Information Center
Tao, Fumiyo; And Others
This volume contains technical and supporting materials that supplement Volume I, which describes upward mobility programs for disadvantaged and dislocated workers in the service sector. Appendix A is a detailed description of the project methodology, including data collection methods and information on data compilation, processing, and analysis.…
NASA Astrophysics Data System (ADS)
Richardson, Chris T.; Kannappan, Sheila; Bittner, Ashley; Isaac, Rohan; RESOLVE
2017-01-01
We present a novel methodology for modeling emission line galaxy samples that span the entire BPT diagram. Our methodology has several advantages over current modeling schemes: the free variables in the model are identical for both AGN and SF galaxies; these free variables are more closely linked to observable galaxy properties; and the ionizing spectra including an AGN and starlight are handled self-consistently rather than empirically. We show that our methodology is capable of fitting the vast majority of SDSS galaxies that fall within the traditional regions of galaxy classification on the BPT diagram. We also present current results for relaxing classification boundaries and extending our galaxies into the dwarf regime, using the REsolved Spectroscopy of a Local VolumE (RESOLVE) survey and the Environmental COntext (ECO) catalog, with special attention to compact blue E/S0s. We compare this methodology to PCA decomposition of the spectra. This work is supported by National Science Foundation awards AST-0955368 and CISE/ACI-1156614.
DOT National Transportation Integrated Search
2017-02-01
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...
The development and validation of the Bronchiectasis Health Questionnaire.
Spinou, Arietta; Siegert, Richard J; Guan, Wei-Jie; Patel, Amit S; Gosker, Harry R; Lee, Kai K; Elston, Caroline; Loebinger, Michael R; Wilson, Robert; Garrod, Rachel; Birring, Surinder S
2017-05-01
Health-related quality of life or health status is significantly impaired in bronchiectasis. There is a paucity of brief, simple-to-use, disease-specific health status measures. The aim of this study was to develop and validate the Bronchiectasis Health Questionnaire (BHQ), a new health status measure that is brief and generates a single overall score. Patients with bronchiectasis were recruited from two outpatient clinics, during a clinically stable stage. The development of the questionnaire followed three phases: item generation and item reduction using Rasch analysis, validation, and repeatability testing. The BHQ was translated into 11 languages using standardised methodology. 206 patients with bronchiectasis completed a preliminary 65-item questionnaire. 55 items were removed due to redundancy or poor fit to the Rasch model. The final version of the BHQ consisted of 10 items. Internal consistency was good (Cronbach's α=0.85). Convergent validity of the BHQ with the St George's Respiratory Questionnaire was high (r = -0.82; p<0.001) and moderate with lung function (forced expiratory volume in 1 s % predicted: r = -0.27; p=0.001). There was a significant association between BHQ scores and the number of exacerbations of bronchiectasis in the last 12 months (p<0.001), hospital admissions (p=0.001) and computed tomography scan bronchiectasis pulmonary lobe counts (p<0.001). BHQ scores were significantly worse in patients with sputum bacterial colonisation versus no colonisation (p=0.048). The BHQ was highly repeatable after 2 weeks (intraclass correlation coefficient 0.89). The BHQ is a brief, valid and repeatable, self-completed health status questionnaire for bronchiectasis that generates a single total score. It can be used in the clinic to assess bronchiectasis from the patient's perspective.
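The internal-consistency statistic quoted above has a simple closed form. A minimal sketch of Cronbach's alpha on an items matrix (rows = patients, columns = the 10 BHQ items; real data would be the questionnaire responses):

    # Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / total-score variance)
    import numpy as np

    def cronbach_alpha(items):
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars / total_var)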
Determining blood and plasma volumes using bioelectrical response spectroscopy
NASA Technical Reports Server (NTRS)
Siconolfi, S. F.; Nusynowitz, M. L.; Suire, S. S.; Moore, A. D. Jr; Leig, J.
1996-01-01
We hypothesized that an electric field (inductance) produced by charged blood components passing through the many branches of arteries and veins could assess total blood volume (TBV) or plasma volume (PV). Individual (N = 29) electrical circuits (inductors, two resistors, and a capacitor) were determined from bioelectrical response spectroscopy (BERS) using a Hewlett Packard 4284A Precision LCR Meter. Inductance, capacitance, and resistance from the circuits of 19 subjects modeled TBV (sum of PV and computed red cell volume) and PV (based on 125I-albumin). Each model (N = 10, cross validation group) had good validity based on 1) mean differences (-2.3 to 1.5%) between the methods that were not significant and less than the propagated errors (+/- 5.2% for TBV and PV), 2) high correlations (r > 0.92) with low SEE (< 7.7%) between dilution and BERS assessments, and 3) Bland-Altman pairwise comparisons that indicated "clinical equivalency" between the methods. Given the limitation of this study (10 validity subjects), we concluded that BERS models accurately assessed TBV and PV. Further evaluations of the models' validities are needed before they are used in clinical or research settings.
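The Bland-Altman pairwise comparison used in the validation reduces to a short computation: the mean bias between methods and its 95% limits of agreement. A hedged sketch (the study's exact equivalence criteria are not restated here):

    # Bland-Altman agreement between two volume-measurement methods.
    import numpy as np

    def bland_altman(method_a, method_b):
        a, b = np.asarray(method_a, dtype=float), np.asarray(method_b, dtype=float)
        diff = a - b
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)   # bias, limits of agreement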
Measuring Standards in Primary English: The Validity of PIRLS--A Response to Mary Hilton
ERIC Educational Resources Information Center
Whetton, Chris; Twist, Liz; Sainsbury, Marian
2007-01-01
Hilton (2006) criticises the PIRLS (Progress in International Reading Literacy Study) tests and the survey conduct, raising questions about the validity of international surveys of reading. Her criticisms fall into four broad areas: cultural validity, methodological issues, construct validity and the survey in England. However, her criticisms are…
Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI
ERIC Educational Resources Information Center
Forer, Barry; Zumbo, Bruno D.
2011-01-01
The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…
Somatic Sensitivity and Reflexivity as Validity Tools in Qualitative Research
ERIC Educational Resources Information Center
Green, Jill
2015-01-01
Validity is a key concept in qualitative educational research. Yet, it is often not addressed in methodological writing about dance. This essay explores validity in a postmodern world of diverse approaches to scholarship, by looking at the changing face of validity in educational qualitative research and at how new understandings of the concept…
Pacilio, M; Basile, C; Shcherbinin, S; Caselli, F; Ventroni, G; Aragno, D; Mango, L; Santini, E
2011-06-01
Positron emission tomography (PET) and single-photon emission computed tomography (SPECT) imaging play an important role in the segmentation of functioning parts of organs or tumours, but an accurate and reproducible delineation is still a challenging task. In this work, an innovative iterative thresholding method for tumour segmentation has been proposed and implemented for a SPECT system. This method, which is based on experimental threshold-volume calibrations, also implements the recovery coefficients (RC) of the imaging system, so it has been called the recovering iterative thresholding method (RIThM). The possibility of employing Monte Carlo (MC) simulations for system calibration was also investigated. The RIThM is an iterative algorithm coded using MATLAB: after an initial rough estimate of the volume of interest, the following calculations are repeated: (i) the corresponding source-to-background ratio (SBR) is measured and corrected by means of the RC curve; (ii) the threshold corresponding to the amended SBR value and the volume estimate is then found using threshold-volume data; (iii) a new volume estimate is obtained by image thresholding. The process goes on until convergence. The RIThM was implemented for an Infinia Hawkeye 4 (GE Healthcare) SPECT/CT system, using a Jaszczak phantom and several test objects. Two MC codes were tested to simulate the calibration images: SIMIND and SimSet. For validation, test images consisting of hot spheres and some anatomical structures of the Zubal head phantom were simulated with the SIMIND code. Additional test objects (flasks and vials) were also imaged experimentally. Finally, the RIThM was applied to evaluate three cases of brain metastases and two cases of high-grade gliomas. Comparing experimental thresholds and those obtained by MC simulations, a maximum difference of about 4% was found, within the errors (±2% and ±5%, for volumes ≥5 ml or <5 ml, respectively). Also for the RC data, the comparison showed differences (up to 8%) within the assigned error (±6%). An ANOVA test demonstrated that the calibration results (in terms of thresholds or RCs at various volumes) obtained by MC simulations were indistinguishable from those obtained experimentally. The accuracy in volume determination for the simulated hot spheres was between -9% and 15% in the range 4-270 ml, whereas for volumes less than 4 ml (in the range 1-3 ml) the difference increased abruptly, reaching values greater than 100%. For the Zubal head phantom, errors ranged between 9% and 18%. For the experimental test images, the accuracy level was within ±10%, for volumes in the range 20-110 ml. The preliminary test of application on patients evidenced the suitability of the method in a clinical setting. The MC-guided delineation of tumour volume may reduce the acquisition time required for the experimental calibration. Analysis of images of several simulated and experimental test objects, the Zubal head phantom and clinical cases demonstrated the robustness, suitability, accuracy, and speed of the proposed method. Nevertheless, studies concerning tumours of irregular shape and/or nonuniform distribution of the background activity are still in progress.
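The iterative loop (i)-(iii) can be sketched schematically. This is a simplified illustration of the published method, not its MATLAB implementation: `threshold_lut` and `recovery_coefficient` are hypothetical stand-ins for the experimental threshold-volume and RC calibration curves.

    # Schematic RIThM loop: re-threshold until the volume estimate converges.
    import numpy as np

    def rithm(image, background, threshold_lut, recovery_coefficient,
              volume0, voxel_ml, tol=0.01, max_iter=50):
        volume = volume0
        mask = image >= 0.5 * image.max()              # rough starting region
        for _ in range(max_iter):
            sbr = image[mask].mean() / background      # measured source-to-background ratio
            sbr = sbr / recovery_coefficient(volume)   # amend SBR with the RC curve
            thr = threshold_lut(sbr, volume)           # calibrated threshold fraction
            mask = image >= thr * image.max()
            new_volume = mask.sum() * voxel_ml
            if abs(new_volume - volume) <= tol * volume:   # converged
                return new_volume, mask
            volume = new_volume
        return volume, mask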
Load and resistance factor rating (LRFR) in New York State : volume II.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology : for New York bridges. The methodology is applicable for the rating of existing : bridges, the posting of under-strength bridges, and checking Permit trucks. The : propo...
Load and resistance factor rating (LRFR) in NYS : volume II final report.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...
Load and resistance factor rating (LRFR) in NYS : volume I final report.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...
ERIC Educational Resources Information Center
Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen
2017-01-01
Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…
1998 motor vehicle occupant safety survey. Volume 1, methodology report
DOT National Transportation Integrated Search
2000-03-01
This is the Methodology Report for the 1998 Motor Vehicle Occupant Safety Survey. The survey is conducted on a biennial basis (initiated in 1994), and is administered by telephone to a randomly selected national sample. Two questionnaires are used, e...
Load and resistance factor rating (LRFR) in New York State : volume I.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology : for New York bridges. The methodology is applicable for the rating of existing : bridges, the posting of under-strength bridges, and checking Permit trucks. The : propo...
National survey of drinking and driving attitudes and behaviors : 2008. Volume 3, methodology report
DOT National Transportation Integrated Search
2010-08-01
This report presents the details of the methodology used for the 2008 National Survey of Drinking and Driving Attitudes and Behaviors conducted by Gallup, Inc. for : the National Highway Traffic Safety Administration (NHTSA). This survey represents t...
Chen, Xin-Lin; Zhong, Liang-Huan; Wen, Yi; Liu, Tian-Wen; Li, Xiao-Ying; Hou, Zheng-Kun; Hu, Yue; Mo, Chuan-Wei; Liu, Feng-Bin
2017-09-15
This review aims to critically appraise and compare the measurement properties of inflammatory bowel disease (IBD)-specific health-related quality of life instruments. Medline, EMBASE and ISI Web of Knowledge were searched from their inception to May 2016. IBD-specific instruments for patients with Crohn's disease, ulcerative colitis or IBD were included. The basic characteristics and domains of the instruments were collected. The methodological quality of the measurement-property studies and the measurement properties of the instruments were assessed. Fifteen IBD-specific instruments were included, of which twelve were for adult IBD patients and three for paediatric IBD patients. All of the instruments were developed in North American and European countries. The following common domains were identified: IBD-related symptoms and the physical, emotional and social domains. The methodological quality was satisfactory for content validity; fair for internal consistency, reliability, structural validity, hypotheses testing and criterion validity; and poor for measurement error, cross-cultural validity and responsiveness. For adult IBD patients, the IBDQ-32 and its short version (SIBDQ) had good measurement properties and were the most widely used worldwide. For paediatric IBD patients, the IMPACT-III had good measurement properties and more translated versions. Methodological quality should be improved in most respects, especially measurement error, cross-cultural validity and responsiveness. The IBDQ-32 was the most widely used instrument with good reliability and validity, followed by the SIBDQ and IMPACT-III. Further validation studies are necessary to support the use of other instruments.
Measurement properties of tools measuring mental health knowledge: a systematic review.
Wei, Yifeng; McGrath, Patrick J; Hayden, Jill; Kutcher, Stan
2016-08-23
Mental health literacy has received great attention recently to improve mental health knowledge, decrease stigma and enhance help-seeking behaviors. We conducted a systematic review to critically appraise the qualities of studies evaluating the measurement properties of mental health knowledge tools and the quality of included measurement properties. We searched PubMed, PsycINFO, EMBASE, CINAHL, the Cochrane Library, and ERIC for studies addressing psychometrics of mental health knowledge tools and published in English. We applied the COSMIN checklist to assess the methodological quality of each study as "excellent", "good", "fair", or "indeterminate". We ranked the level of evidence of the overall quality of each measurement property across studies as "strong", "moderate", "limited", "conflicting", or "unknown". We identified 16 mental health knowledge tools in 17 studies, addressing reliability, validity, responsiveness or measurement errors. The methodological quality of included studies ranged from "poor" to "excellent" including 6 studies addressing the content validity, internal consistency or structural validity demonstrating "excellent" quality. We found strong evidence of the content validity or internal consistency of 6 tools; moderate evidence of the internal consistency, the content validity or the reliability of 8 tools; and limited evidence of the reliability, the structural validity, the criterion validity, or the construct validity of 12 tools. Both the methodological qualities of included studies and the overall evidence of measurement properties are mixed. Based on the current evidence, we recommend that researchers consider using tools with measurement properties of strong or moderate evidence that also reached the threshold for positive ratings according to COSMIN checklist.
Aldekhayel, Salah A; Alselaim, Nahar A; Magzoub, Mohi Eldin; Al-Qattan, Mohammad M; Al-Namlah, Abdullah M; Tamim, Hani; Al-Khayal, Abdullah; Al-Habdan, Sultan I; Zamakhshary, Mohammed F
2012-10-24
The Script Concordance Test (SCT) is a new assessment tool that reliably assesses clinical reasoning skills. Previous descriptions of developing SCT question banks were merely subjective. This study addresses two gaps in the literature: 1) conducting the first phase of a multistep validation process of the SCT in Plastic Surgery, and 2) providing an objective methodology to construct a question bank based on the SCT. After developing a test blueprint, 52 test items were written. Five validation questions were developed and a validation survey was established online. Seven reviewers were asked to answer this survey. They were recruited from two countries, Saudi Arabia and Canada, to improve the test's external validity. Their ratings were transformed into percentages. Analysis was performed to compare reviewers' ratings by looking at correlations, ranges, means, medians, and overall scores. Scores of reviewers' ratings were between 76% and 95% (mean 86% ± 5). We found poor correlations between reviewers (Pearson's: +0.38 to -0.22). Ratings of individual validation questions ranged between 0 and 4 (on a scale of 1-5). Means and medians of these ranges were computed for each test item (mean: 0.8 to 2.4; median: 1 to 3). A subset of 27 test items was generated based on a set of inclusion and exclusion criteria. This study proposes an objective methodology for the validation of an SCT question bank. Analysis of the validation survey is done from all angles, i.e., reviewers, validation questions, and test items. Finally, a subset of test items is generated based on a set of criteria.
Prabhu, David; Mehanna, Emile; Gargesha, Madhusudhana; Brandt, Eric; Wen, Di; van Ditzhuijzen, Nienke S; Chamie, Daniel; Yamamoto, Hirosada; Fujino, Yusuke; Alian, Ali; Patel, Jaymin; Costa, Marco; Bezerra, Hiram G; Wilson, David L
2016-04-01
Evidence suggests high-resolution, high-contrast, [Formula: see text] intravascular optical coherence tomography (IVOCT) can distinguish plaque types, but further validation is needed, especially for automated plaque characterization. We developed experimental and three-dimensional (3-D) registration methods to provide validation of IVOCT pullback volumes using microscopic, color, and fluorescent cryo-image volumes with optional registered cryo-histology. A specialized registration method matched IVOCT pullback images acquired in the catheter reference frame to a true 3-D cryo-image volume. Briefly, an 11-parameter registration model including a polynomial virtual catheter was initialized within the cryo-image volume, and perpendicular images were extracted, mimicking IVOCT image acquisition. Virtual catheter parameters were optimized to maximize cryo and IVOCT lumen overlap. Multiple assessments suggested that the registration error was better than the [Formula: see text] spacing between IVOCT image frames. Tests on a digital synthetic phantom gave a registration error of only [Formula: see text] (signed distance). Visual assessment of randomly presented nearby frames suggested registration accuracy within 1 IVOCT frame interval ([Formula: see text]). This would eliminate potential misinterpretations confronted by the typical histological approaches to validation, with estimated 1-mm errors. The method can be used to create annotated datasets and automated plaque classification methods and can be extended to other intravascular imaging modalities.
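The optimization at the heart of the registration, adjusting virtual-catheter parameters to maximize lumen overlap, can be sketched at a high level. This is a heavily simplified illustration: `extract_ivoct_like_slices` is a hypothetical stand-in for the perpendicular-slice extraction step, and the overlap objective shown (intersection over union) is an assumption about how lumen agreement might be scored.

    # Maximize lumen overlap over the 11 virtual-catheter parameters.
    import numpy as np
    from scipy.optimize import minimize

    def overlap_cost(params, cryo_lumen, ivoct_lumen, extract_ivoct_like_slices):
        virtual = extract_ivoct_like_slices(cryo_lumen, params)  # slices perpendicular to catheter
        inter = np.logical_and(virtual, ivoct_lumen).sum()
        union = np.logical_or(virtual, ivoct_lumen).sum()
        return -inter / union                # maximize overlap = minimize negative IoU

    # result = minimize(overlap_cost, x0=np.zeros(11),
    #                   args=(cryo, ivoct, extractor), method="Nelder-Mead")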
Broadband Fan Noise Prediction System for Turbofan Engines. Volume 3; Validation and Test Cases
NASA Technical Reports Server (NTRS)
Morin, Bruce L.
2010-01-01
Pratt & Whitney has developed a Broadband Fan Noise Prediction System (BFaNS) for turbofan engines. This system computes the noise generated by turbulence impinging on the leading edges of the fan and fan exit guide vane, and noise generated by boundary-layer turbulence passing over the fan trailing edge. BFaNS has been validated on three fan rigs that were tested during the NASA Advanced Subsonic Technology Program (AST). The predicted noise spectra agreed well with measured data. The predicted effects of fan speed, vane count, and vane sweep also agreed well with measurements. The noise prediction system consists of two computer programs: Setup_BFaNS and BFaNS. Setup_BFaNS converts user-specified geometry and flow-field information into a BFaNS input file. From this input file, BFaNS computes the inlet and aft broadband sound power spectra generated by the fan and FEGV. The output file from BFaNS contains the inlet, aft and total sound power spectra from each noise source. This report is the third volume of a three-volume set documenting the Broadband Fan Noise Prediction System: Volume 1: Setup_BFaNS User's Manual and Developer's Guide; Volume 2: BFaNS User's Manual and Developer's Guide; and Volume 3: Validation and Test Cases. The present volume begins with an overview of the Broadband Fan Noise Prediction System, followed by validation studies that were done on three fan rigs. It concludes with recommended improvements and additional studies for BFaNS.
Marinus, Nastasia; Bervoets, Liene; Massa, Guy; Verboven, Kenneth; Stevens, An; Takken, Tim; Hansen, Dominique
2017-12-01
Cardiopulmonary exercise testing is advised ahead of exercise intervention in obese adolescents to assess the medical safety of exercise and physical fitness. Optimal validity and reliability of test results are required to identify maximal exercise effort. As fat oxidation during exercise is disturbed in obese individuals, it remains an unresolved methodological issue whether the respiratory gas exchange ratio (RER) is a valid marker for maximal effort during exercise testing in this population. RER during maximal exercise testing (RERpeak) and RER trajectories were compared between obese and lean adolescents, and relationships between RERpeak, RER slope and subject characteristics (age, gender, Body Mass Index [BMI], Tanner stage, physical activity level) were explored. Thirty-four obese (BMI: 35.1±5.1 kg/m²) and 18 lean (BMI: 18.8±1.9 kg/m²) adolescents (aged 12-18 years) performed a maximal cardiopulmonary exercise test on a bike, with comparison of oxygen uptake (VO2), heart rate (HR), expiratory volume (VE), carbon dioxide output (VCO2), and cycling power output (W). RERpeak (1.09±0.06 vs. 1.14±0.06 in obese vs. lean adolescents, respectively) and RER slope (0.03±0.01 vs. 0.05±0.01 per 10% increase in VO2, in obese vs. lean adolescents, respectively) were significantly lower in obese adolescents, and independently related to BMI (P<0.05). Adjusted for HRpeak and VEpeak, RERpeak and RER slope remained significantly lower in obese adolescents (P<0.05). RER trajectories (in relation to %VO2peak and %Wpeak) were significantly different between groups (P<0.001). RERpeak is significantly lowered in obese adolescents. This may have important methodological implications for cardiopulmonary exercise testing in this population.
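The two quantities compared above, peak RER and the RER slope per 10% increase in VO2, reduce to simple computations on breath-by-breath gas-exchange data. A minimal sketch (array names are illustrative):

    # RER = VCO2/VO2; slope of RER against %VO2peak, rescaled per 10% VO2.
    import numpy as np

    def rer_peak_and_slope(vo2, vco2):
        rer = np.asarray(vco2, dtype=float) / np.asarray(vo2, dtype=float)
        pct_vo2 = 100.0 * np.asarray(vo2, dtype=float) / np.max(vo2)
        slope = np.polyfit(pct_vo2, rer, 1)[0]        # change in RER per 1% VO2
        return rer.max(), 10.0 * slope                # RERpeak, slope per 10% VO2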
NASA Astrophysics Data System (ADS)
Reynerson, Charles Martin
This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered as revenue producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates potential return on investments, initial investment requirements and number of years to return on the initial investment. Example cases are analyzed for both performance and cost driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability for multiple space business park markets.
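The payback logic of the mass-based life-cycle cost model can be illustrated with a toy calculation. Every constant below is a placeholder assumption, not a figure from the research:

    # Years to recover the initial investment for a mass-driven facility cost.
    def payback_years(facility_mass_kg, cost_per_kg, launch_cost_per_kg,
                      annual_revenue, annual_operating_cost):
        initial_investment = facility_mass_kg * (cost_per_kg + launch_cost_per_kg)
        net_annual = annual_revenue - annual_operating_cost
        if net_annual <= 0:
            return float("inf")              # investment is never recovered
        return initial_investment / net_annual

    # e.g. 100 t facility at $50k/kg build + $10k/kg launch, $0.7B net/year
    print(payback_years(100_000, 50_000, 10_000, 9e8, 2e8))  # -> ~8.6 years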
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.
2017-03-28
The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, integrated with a modern materials data infrastructure, is needed.
Coating and Patterning Functional Materials for Large Area Electrofluidic Arrays
Wu, Hao; Tang, Biao; Hayes, Robert A.; Dou, Yingying; Guo, Yuanyuan; Jiang, Hongwei; Zhou, Guofu
2016-01-01
Industrialization of electrofluidic devices requires both high performance coating laminates and efficient material utilization on large area substrates. Here we show that screen printing can be effectively used to provide homogeneous pin-hole free patterned amorphous fluoropolymer dielectric layers to provide both the insulating and fluidic reversibility required for devices. Subsequently, we over-coat photoresist using slit coating on this normally extremely hydrophobic layer. In this way, we are able to pattern the photoresist by conventional lithography to provide the chemical contrast required for liquids dosing by self-assembly and highly-reversible electrofluidic switching. Materials, interfacial chemistry, and processing all contribute to the provision of the required engineered substrate properties. Coating homogeneity as characterized by metrology and device performance data are used to validate the methodology, which is well-suited for transfer to high volume production in existing LCD cell-making facilities. PMID:28773826
Detection of MRI artifacts produced by intrinsic heart motion using a saliency model
NASA Astrophysics Data System (ADS)
Salguero, Jennifer; Velasco, Nelson; Romero, Eduardo
2017-11-01
Cardiac Magnetic Resonance (CMR) requires synchronization with the ECG to correct many types of noise. However, the complex heart motion frequently produces displaced slices that have to be either ignored or manually corrected, since the ECG correction is useless in this case. This work presents a novel methodology that detects motion artifacts in CMR using a saliency method that highlights the region where the heart chambers are located. Once the Region of Interest (RoI) is set, its center of gravity is determined for each slice composing the volume. The deviation of the gravity center is an estimate of the coherence between slices and is used to detect displaced slices. Validation was performed with distorted real images in which one slice was artificially misaligned with respect to the rest of the set. The displaced slice is detected with a recall of 84% and an F-score of 68%.
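A minimal sketch of the centre-of-gravity coherence check described above, assuming the saliency step has already produced one non-empty boolean RoI mask per slice; the tolerance and pixel spacing are hypothetical parameters.

```python
import numpy as np

def misaligned_slices(roi_masks, spacing=(1.0, 1.0), tol_mm=5.0):
    """Flag slices whose RoI centroid deviates from the stack median."""
    centroids = []
    for mask in roi_masks:                     # one boolean mask per slice
        ys, xs = np.nonzero(mask)              # assumes a non-empty RoI
        centroids.append((ys.mean() * spacing[0], xs.mean() * spacing[1]))
    centroids = np.asarray(centroids)
    deviation = np.linalg.norm(centroids - np.median(centroids, axis=0), axis=1)
    return np.nonzero(deviation > tol_mm)[0]   # indices of displaced slices
```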
Towards the Development of a Smart Flying Sensor: Illustration in the Field of Precision Agriculture
Hernandez, Andres; Murcia, Harold; Copot, Cosmin; De Keyser, Robin
2015-01-01
Sensing is an important element to quantify productivity, product quality and to make decisions. Applications, such as mapping, surveillance, exploration and precision agriculture, require a reliable platform for remote sensing. This paper presents the first steps towards the development of a smart flying sensor based on an unmanned aerial vehicle (UAV). The concept of smart remote sensing is illustrated and its performance tested for the task of mapping the volume of grain inside a trailer during forage harvesting. Novelty lies in: (1) the development of a position-estimation method with time delay compensation based on inertial measurement unit (IMU) sensors and image processing; (2) a method to build a 3D map using information obtained from a regular camera; and (3) the design and implementation of a path-following control algorithm using model predictive control (MPC). Experimental results on a lab-scale system validate the effectiveness of the proposed methodology. PMID:26184205
Murovec, Boštjan; Kolbl, Sabina; Stres, Blaž
2015-01-01
The aim of this study was to develop and validate a community-supported online infrastructure and bioresource for methane yield data and accompanying metadata collected from published literature. In total, 1164 entries described by 15,749 data points were assembled. Analysis of the data collection showed little congruence in reporting of methodological approaches. The largest identifiable source of variation in reported methane yields was authorship (i.e. substrate batches within a particular substrate class), within which experimental scale (volumes of 0.02-5 L), incubation temperature (34-40 °C) and % VS of substrate also played an important role (p < 0.05, n permutations = 999). The largest fraction of variability, however, remained unaccounted for and thus unexplained (>63%). This calls for reconsideration of accepted approaches to reporting data in currently published literature to increase the capacity to service industrial decision making. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Borthwick, J.; Knight, B.; Bender, A.; Loveder, P.
These two volumes provide information on the scope of adult and community education (ACE) in Australia and implications for improved data collection and reporting. Volume 1 begins with a glossary. Chapter 1 addresses project objectives, processes, and methodology. Chapter 2 analyzes the scope and diversity of ACE in terms of what is currently…
Surrogate Plant Data Base : Volume 2. Appendix C : Facilities Planning Baseline Data
DOT National Transportation Integrated Search
1983-05-01
This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...
Surrogate Plant Data Base : Volume 4. Appendix E : Medium and Heavy Truck Manufacturing
DOT National Transportation Integrated Search
1983-05-01
This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...
Supplementary Computer Generated Cueing to Enhance Air Traffic Controller Efficiency
2013-03-01
assess the complexity of air traffic control (Mogford, Guttman, Morrow, & Kopardekar, 1995; Laudeman, Shelden, Branstrom, & Brasil, 1998).
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Complexity, Representation and Practice: Case Study as Method and Methodology
ERIC Educational Resources Information Center
Miles, Rebecca
2015-01-01
While case study is considered a common approach to examining specific and particular examples in research disciplines such as law, medicine and psychology, in the social sciences case study is often treated as a lesser, flawed or undemanding methodology which is less valid, reliable or theoretically rigorous than other methodologies. Building on…
Prediction of resource volumes at untested locations using simple local prediction models
Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.
2006-01-01
This paper shows how local spatial nonparametric prediction models can be applied to estimate volumes of recoverable gas resources at individual undrilled sites, at multiple sites on a regional scale, and to compute confidence bounds for regional volumes based on the distribution of those estimates. An approach that combines cross-validation, the jackknife, and bootstrap procedures is used to accomplish this task. Simulation experiments show that cross-validation can be applied beneficially to select an appropriate prediction model. The cross-validation procedure worked well for a wide range of different states of nature and levels of information. Jackknife procedures are used to compute individual prediction estimation errors at undrilled locations. The jackknife replicates also are used with a bootstrap resampling procedure to compute confidence bounds for the total volume. The method was applied to data (partitioned into a training set and target set) from the Devonian Antrim Shale continuous-type gas play in the Michigan Basin in Otsego County, Michigan. The analysis showed that the model estimate of total recoverable volumes at prediction sites is within 4 percent of the total observed volume. The model predictions also provide frequency distributions of the cell volumes at the production unit scale. Such distributions are the basis for subsequent economic analyses. ?? Springer Science+Business Media, LLC 2007.
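The bootstrap step for regional confidence bounds can be sketched as below; the lognormal cell volumes stand in for the model's per-site predictions, and directly resampling predicted volumes is a simplification of the paper's combined jackknife-bootstrap procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in per-cell predicted gas volumes at undrilled sites (arbitrary units).
pred = rng.lognormal(mean=3.0, sigma=0.8, size=200)

# Bootstrap the regional total and report 90% confidence bounds.
totals = np.array([rng.choice(pred, size=pred.size, replace=True).sum()
                   for _ in range(2000)])
lo, hi = np.percentile(totals, [5, 95])
print(f"total = {pred.sum():.0f}, 90% bounds = [{lo:.0f}, {hi:.0f}]")
```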
Mansutti, Irene; Saiani, Luisa; Grassetti, Luca; Palese, Alvisa
2017-03-01
The clinical learning environment is fundamental to nursing education paths, capable of affecting learning processes and outcomes. Several instruments have been developed in nursing education, aimed at evaluating the quality of the clinical learning environments; however, no systematic review of the psychometric properties and methodological quality of these studies has been performed to date. The aims of the study were: 1) to identify validated instruments evaluating the clinical learning environments in nursing education; 2) to evaluate critically the methodological quality of the psychometric property estimation used; and 3) to compare psychometric properties across the instruments available. A systematic review of the literature (using the Preferred Reporting Items for Systematic Reviews and Meta-Analysis guidelines) and an evaluation of the methodological quality of psychometric properties (using the COnsensus-based Standards for the selection of health Measurement INstruments guidelines). The Medline and CINAHL databases were searched. Eligible studies were those that satisfied the following criteria: a) validation studies of instruments evaluating the quality of clinical learning environments; b) in nursing education; c) published in English or Italian; d) before April 2016. The included studies were evaluated for the methodological quality of the psychometric properties measured and then compared in terms of both the psychometric properties and the methodological quality of the processes used. The search strategy yielded a total of 26 studies and eight clinical learning environment evaluation instruments. A variety of psychometric properties have been estimated for each instrument, with differing qualities in the methodology used. Concept and construct validity were poorly assessed in terms of their significance and rarely judged by the target population (nursing students). Some properties were rarely considered (e.g., reliability, measurement error, criterion validity), whereas others were frequently estimated, but using different coefficients and statistical analyses (e.g., internal consistency, structural validity), thus rendering comparison across instruments difficult. Moreover, the methodological quality adopted in the property assessments was poor or fair in most studies, compromising the goodness of the psychometric values estimated. Clinical learning placements represent the key strategies in educating the future nursing workforce: instruments evaluating the quality of the settings, as well as their capacity to promote significant learning, are strongly recommended. Studies estimating psychometric properties, using an increased quality of research methodologies are needed in order to support nursing educators in the process of clinical placements accreditation and quality improvement. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Bateman, Donald R.; Zidonis, Frank J.
In the introduction to this volume of a two volume document (See also TE 002 131.) written for curriculum developers, Donald Bateman identifies the recent periods in the development of linguistic thought and methodology, and presents language curriculum development as the continuing exploration of the processes of evolving linguistic structures.…
A prototype software methodology for the rapid evaluation of biomanufacturing process options.
Chhatre, Sunil; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel J; Newcombe, Anthony R; Keshavarz-Moore, Eli
2007-10-01
A three-layered simulation methodology is described that rapidly evaluates biomanufacturing process options. In each layer, inferior options are screened out, while more promising candidates are evaluated further in the subsequent, more refined layer, which uses more rigorous models that require more data from time-consuming experimentation. Screening ensures laboratory studies are focused only on options showing the greatest potential. To simplify the screening, outputs of production level, cost and time are combined into a single value using multi-attribute-decision-making techniques. The methodology was illustrated by evaluating alternatives to an FDA (U.S. Food and Drug Administration)-approved process manufacturing rattlesnake antivenom. Currently, antivenom antibodies are recovered from ovine serum by precipitation/centrifugation and proteolyzed before chromatographic purification. Alternatives included increasing the feed volume, replacing centrifugation with microfiltration and replacing precipitation/centrifugation with a Protein G column. The best alternative used a higher feed volume and a Protein G step. By rapidly evaluating the attractiveness of options, the methodology facilitates efficient and cost-effective process development.
Bayesian cross-entropy methodology for optimal design of validation experiments
NASA Astrophysics Data System (ADS)
Jiang, X.; Mahadevan, S.
2006-07-01
An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
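A compact sketch of the core loop, assuming Gaussian stand-ins for both the model's predictive distribution and the experimental response; the Monte Carlo cross-entropy estimate, the annealing schedule, and all functions below are illustrative choices, not the paper's implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def model_prediction(x):
    # Stand-in computational model: predictive distribution at input x.
    return stats.norm(loc=np.sin(x), scale=0.10 + 0.05 * x**2)

def experiment_params(x):
    # Stand-in experimental response (mean, std); assumed, not measured.
    return np.sin(x) + 0.05, 0.15

def expected_cross_entropy(x, n=2000):
    # Monte Carlo estimate of -E_p[log q], p = experiment, q = model.
    mu, sd = experiment_params(x)
    y = rng.normal(mu, sd, n)
    return -model_prediction(x).logpdf(y).mean()

# Simulated annealing over the design variable x in [0, 2] (minimization).
x, best = 1.0, (1.0, expected_cross_entropy(1.0))
for T in np.geomspace(1.0, 1e-3, 200):
    cand = float(np.clip(x + rng.normal(scale=0.2), 0.0, 2.0))
    f_cand, f_cur = expected_cross_entropy(cand), expected_cross_entropy(x)
    if f_cand < f_cur or rng.random() < np.exp((f_cur - f_cand) / T):
        x = cand
        if f_cand < best[1]:
            best = (cand, f_cand)
print("candidate design point:", best)
```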
Validation of source approval of HMA surface mix aggregate : final report.
DOT National Transportation Integrated Search
2016-04-01
The main focus of this research project was to develop methodologies for the validation of source approval of hot : mix asphalt surface mix aggregate. In order to further enhance the validation process, a secondary focus was also to : create a spectr...
34 CFR 462.11 - What must an application contain?
Code of Federal Regulations, 2010 CFR
2010-07-01
... the methodology and procedures used to measure the reliability of the test. (h) Construct validity... previous test, and results from validity, reliability, and equating or standard-setting studies undertaken... NRS educational functioning levels (content validity). Documentation of the extent to which the items...
Temporal validation for landsat-based volume estimation model
Renaldo J. Arroyo; Emily B. Schultz; Thomas G. Matney; David L. Evans; Zhaofei Fan
2015-01-01
Satellite imagery can potentially reduce the costs and time associated with ground-based forest inventories; however, for satellite imagery to provide reliable forest inventory data, it must produce consistent results from one time period to the next. The objective of this study was to temporally validate a Landsat-based volume estimation model in a four county study...
When is good, good enough? Methodological pragmatism for sustainable guideline development.
Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C
2015-03-06
Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.
NASA Astrophysics Data System (ADS)
Campanelli, Monica; Mascitelli, Alessandra; Sanò, Paolo; Diémoz, Henri; Estellés, Victor; Federico, Stefano; Iannarelli, Anna Maria; Fratarcangeli, Francesca; Mazzoni, Augusto; Realini, Eugenio; Crespi, Mattia; Bock, Olivier; Martínez-Lozano, Jose A.; Dietrich, Stefano
2018-01-01
The estimation of the precipitable water vapour content (W) with high temporal and spatial resolution is of great interest to both meteorological and climatological studies. Several methodologies based on remote sensing techniques have been recently developed in order to obtain accurate and frequent measurements of this atmospheric parameter. Among them, the relatively low cost and easy deployment of sun-sky radiometers, or sun photometers, operating in several international networks, allowed the development of automatic estimations of W from these instruments with high temporal resolution. However, the great problem of this methodology is the estimation of the sun-photometric calibration parameters. The objective of this paper is to validate a new methodology based on the hypothesis that the calibration parameters characterizing the atmospheric transmittance at 940 nm are dependent on vertical profiles of temperature, air pressure and moisture typical of each measurement site. To obtain the calibration parameters, some simultaneous seasonal measurements of W from independent sources, taken over a large range of solar zenith angles and covering a wide range of W, are needed. In this work, yearly GNSS/GPS datasets were used to obtain a table of photometric calibration constants, and the methodology was applied and validated in three European ESR-SKYNET network sites characterized by different atmospheric and climatic conditions: Rome, Valencia and Aosta. Results were validated against the GNSS/GPS and AErosol RObotic NETwork (AERONET) W estimations. In both validations the agreement was very high, with a percentage RMSD of about 6%, 13% and 8% for the GPS intercomparison at Rome, Aosta and Valencia, respectively, and of 8% for the AERONET comparison at Valencia. Analysing the results by W classes, the present methodology was found to clearly improve the W estimation at low W content when compared against AERONET in terms of % bias, bringing the agreement with GPS (taken as the reference) from a bias of 5.76% to 0.52%.
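The agreement statistics quoted above reduce to percentage RMSD and bias against the GNSS/GPS reference; a minimal sketch with invented W series follows.

```python
import numpy as np

def pct_rmsd(w_test, w_ref):
    """Root-mean-square difference as a percentage of the reference mean."""
    return 100.0 * np.sqrt(np.mean((w_test - w_ref) ** 2)) / np.mean(w_ref)

def pct_bias(w_test, w_ref):
    """Mean difference as a percentage of the reference mean."""
    return 100.0 * np.mean(w_test - w_ref) / np.mean(w_ref)

w_gps = np.array([8.0, 12.5, 20.1, 27.4, 33.0])   # invented W values (mm)
w_sun = np.array([8.4, 12.1, 21.0, 26.5, 34.2])   # sun-photometer retrievals
print(pct_rmsd(w_sun, w_gps), pct_bias(w_sun, w_gps))
```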
QESA: Quarantine Extraterrestrial Sample Analysis Methodology
NASA Astrophysics Data System (ADS)
Simionovici, A.; Lemelle, L.; Beck, P.; Fihman, F.; Tucoulou, R.; Kiryukhina, K.; Courtade, F.; Viso, M.
2018-04-01
Our nondestructive, nm-sized, hyperspectral analysis methodology of combined X-rays/Raman/IR probes in BSL4 quarantine, renders our patented mini-sample holder ideal for detecting extraterrestrial life. Our Stardust and Archean results validate it.
Assessment of Component-level Emission Measurements ...
Oil and natural gas (ONG) production facilities have the potential to emit a substantial amount of greenhouse gases, hydrocarbons and hazardous air pollutants into the atmosphere. These emissions come from a wide variety of sources including engine exhaust, combustor gases, atmospheric venting from uncontrolled tanks and leaks. Engine exhaust, combustor gases and atmospheric tank venting are included in the initial estimation of a production facility's cumulative emissions. However, there is a large amount of uncertainty associated with the magnitude and composition of leaks at these facilities. In order to understand the environmental impacts of these emissions, we must first be able to characterize the emission flow rate and chemical composition of these leaks/venting. An accurate methodology for quantifying hydrocarbon leaks/venting is needed to support both emission inventories and environmental compliance. This interim report summarizes recent results from a small leak survey completed at ONG production facilities in Utah to characterize their flow rate and chemical composition using a suite of instruments, including a high-volume sampler (Bacharach Hi Flow Sampler; Bacharach, Inc.), as well as infrared (IR) cameras, a photoionization detector (PID), a fl
Valavanis, Ioannis; Pilalis, Eleftherios; Georgiadis, Panagiotis; Kyrtopoulos, Soterios; Chatziioannou, Aristotelis
2015-01-01
DNA methylation profiling exploits microarray technologies, thus yielding a wealth of high-volume data. Here, an intelligent framework is applied, encompassing epidemiological genome-scale DNA methylation data produced from Illumina's Infinium Human Methylation 450K Bead Chip platform, in an effort to correlate interesting methylation patterns with cancer predisposition and, in particular, breast cancer and B-cell lymphoma. Feature selection and classification are employed in order to select, from an initial set of ~480,000 methylation measurements at CpG sites, predictive cancer epigenetic biomarkers and assess their classification power for discriminating healthy versus cancer-related classes. Feature selection exploits evolutionary algorithms or a graph-theoretic methodology which makes use of the semantic information included in the Gene Ontology (GO) tree. The selected features, corresponding to methylation of CpG sites, attained moderate-to-high classification accuracies when imported to a series of classifiers evaluated by resampling or blindfold validation. The semantics-driven selection revealed sets of CpG sites performing similarly to evolutionary selection in the classification tasks. However, gene enrichment and pathway analysis showed that it additionally provides more descriptive sets of GO terms and KEGG pathways regarding the cancer phenotypes studied here. Results support the expediency of this methodology regarding its application in epidemiological studies. PMID:27600245
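A leakage-safe stand-in for the selection-plus-classification pipeline described above: a univariate filter replaces the evolutionary and GO-based selectors, and the beta values and labels are random placeholders at a reduced scale.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
# Stand-in beta values: 200 subjects x 5000 CpG sites (real arrays: ~480k).
X = rng.beta(2.0, 5.0, size=(200, 5000))
y = rng.integers(0, 2, size=200)          # healthy vs. cancer-related class

# Selecting inside the pipeline keeps test folds out of the CpG ranking.
clf = make_pipeline(SelectKBest(f_classif, k=50),
                    LogisticRegression(max_iter=1000))
print(cross_val_score(clf, X, y, cv=5).mean())
```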
An economic analysis of robotically assisted hysterectomy.
Wright, Jason D; Ananth, Cande V; Tergas, Ana I; Herzog, Thomas J; Burke, William M; Lewin, Sharyn N; Lu, Yu-Shiang; Neugut, Alfred I; Hershman, Dawn L
2014-05-01
To perform an econometric analysis to examine the influence of procedure volume, variation in hospital accounting methodology, and use of various analytic methodologies on cost of robotically assisted hysterectomy for benign gynecologic disease and endometrial cancer. A national sample was used to identify women who underwent laparoscopic or robotically assisted hysterectomy for benign indications or endometrial cancer from 2006 to 2012. Surgeon and hospital volume were classified as the number of procedures performed before the index surgery. Total costs as well as fixed and variable costs were modeled using multivariable quantile regression methodology. A total of 180,230 women, including 169,324 women who underwent minimally invasive hysterectomy for benign indications and 10,906 patients whose hysterectomy was performed for endometrial cancer, were identified. The unadjusted median cost of robotically assisted hysterectomy for benign indications was $8,152 (interquartile range [IQR] $6,011-10,932) compared with $6,535 (IQR $5,127-8,357) for laparoscopic hysterectomy (P<.001). The cost differential decreased with increasing surgeon and hospital volume. The unadjusted median cost of robotically assisted hysterectomy for endometrial cancer was $9,691 (IQR $7,591-12,428) compared with $8,237 (IQR $6,400-10,807) for laparoscopic hysterectomy (P<.001). The cost differential decreased with increasing hospital volume from $2,471 for the first 5 to 15 cases to $924 for more than 50 cases. Based on surgeon volume, robotically assisted hysterectomy for endometrial cancer was $1,761 more expensive than laparoscopy for those who had performed fewer than five cases; the differential declined to $688 for more than 50 procedures compared with laparoscopic hysterectomy. The cost of robotic gynecologic surgery decreases with increased procedure volume. However, in all of the scenarios modeled, robotically assisted hysterectomy remained substantially more costly than laparoscopic hysterectomy.
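The quantile-regression approach lends itself to a short sketch; the toy records below are invented, and a statsmodels median regression stands in for the paper's multivariable quantile models.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented discharge-style records; costs in USD.
df = pd.DataFrame({
    "cost":           [8152, 6535, 9691, 8237, 7400, 10100, 6900, 8800],
    "robotic":        [1, 0, 1, 0, 1, 1, 0, 1],
    "surgeon_volume": [3, 40, 10, 55, 25, 5, 60, 35],
})

# Median (q = 0.5) regression of cost on surgical approach and volume.
fit = smf.quantreg("cost ~ robotic + surgeon_volume", df).fit(q=0.5)
print(fit.params)
```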
DOT National Transportation Integrated Search
2009-08-01
The Federal Railroad Administration tasked the Volpe Center with developing a methodology for determining the avoidable and fully allocated costs of Amtrak routes. Avoidable costs are costs that would not be incurred if an Amtrak route were discontin...
DOT National Transportation Integrated Search
2009-08-01
The Federal Railroad Administration tasked the Volpe Center with developing a methodology for determining the avoidable and fully allocated costs of Amtrak routes. Avoidable costs are costs that would not be incurred if an Amtrak route were discontin...
DOT National Transportation Integrated Search
2009-08-01
The Federal Railroad Administration tasked the Volpe Center with developing a methodology for determining the avoidable and fully allocated costs of Amtrak routes. Avoidable costs are costs that would not be incurred if an Amtrak route were discontin...
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Greco, Patricia
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Lean for Education. Design/methodology/approach: The paper presents the origins, theoretical foundations, core concepts and a case study demonstrating an application in US education,…
Validation of source approval of HMA surface mix aggregate using spectrometer : final report.
DOT National Transportation Integrated Search
2016-04-01
The main focus of this research project was to develop methodologies for the validation of source approval of hot : mix asphalt surface mix aggregate. In order to further enhance the validation process, a secondary focus was also to : create a spectr...
Kanter, Michael H; Huang, Yii-Chieh; Kally, Zina; Gordon, Margo A; Meltzer, Charles
2018-06-01
A well-documented association exists between higher surgeon volumes and better outcomes for many procedures, but surgeons may be reluctant to change practice patterns without objective, credible, and near real-time data on their performance. In addition, published thresholds for procedure volumes may be biased or perceived as arbitrary; typical reports compare surgeons grouped into discrete procedure volume categories, even though the volume-outcomes relationship is likely continuous. The concentration curves methodology, which has been used to analyze whether health outcomes vary with socioeconomic status, was adapted to explore the association between procedure volume and outcomes as a continuous relationship so that data for all surgeons within a health care organization could be included. Using widely available software and requiring minimal analytic expertise, this approach plots cumulative percentages of two variables of interest against each other and assesses the characteristics of the resulting curve. Organization-specific relationships between surgeon volumes and outcomes were examined for three example types of procedures: uncomplicated hysterectomies, infant circumcisions, and total thyroidectomies. The concentration index was used to assess whether outcomes were equally distributed unrelated to volumes. For all three procedures, the concentration curve methodology identified associations between surgeon procedure volumes and selected outcomes that were specific to the organization. The concentration indices confirmed the higher prevalence of examined outcomes among low-volume surgeons. The curves supported organizational discussions about surgical quality. Concentration curves require minimal resources to identify organization- and procedure-specific relationships between surgeon procedure volumes and outcomes and can support quality improvement. Copyright © 2018 The Joint Commission. Published by Elsevier Inc. All rights reserved.
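A small sketch of the concentration-index calculation underlying this approach, using the covariance formulation with surgeons ranked by procedure volume; the volumes and complication rates below are invented.

```python
import numpy as np

def concentration_index(volumes, outcomes):
    """C = 2*cov(y, r)/mean(y), with r the fractional rank by volume.

    C < 0 indicates the outcome is concentrated among low-volume surgeons.
    """
    y = np.asarray(outcomes, float)[np.argsort(volumes)]
    r = (np.arange(1, y.size + 1) - 0.5) / y.size   # fractional ranks
    return 2.0 * np.cov(y, r, bias=True)[0, 1] / y.mean()

volume       = np.array([2, 5, 8, 15, 30, 60])            # invented
complication = np.array([0.09, 0.08, 0.06, 0.05, 0.03, 0.02])
print(concentration_index(volume, complication))          # negative here
```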
Automated MRI parcellation of the frontal lobe.
Ranta, Marin E; Chen, Min; Crocetti, Deana; Prince, Jerry L; Subramaniam, Krish; Fischl, Bruce; Kaufmann, Walter E; Mostofsky, Stewart H
2014-05-01
Examination of associations between specific disorders and physical properties of functionally relevant frontal lobe sub-regions is a fundamental goal in neuropsychiatry. Here, we present and evaluate automated methods of frontal lobe parcellation with the programs FreeSurfer (FS) and TOADS-CRUISE (T-C), based on the manual method described in Ranta et al. [2009]: Psychiatry Res 172:147-154, in which sulcal-gyral landmarks were used to manually delimit functionally relevant regions within the frontal lobe: i.e., primary motor cortex, anterior cingulate, deep white matter, premotor cortex regions (supplementary motor complex, frontal eye field, and lateral premotor cortex) and prefrontal cortex (PFC) regions (medial PFC, dorsolateral PFC, inferior PFC, lateral orbitofrontal cortex [OFC] and medial OFC). Dice's coefficient, a measure of overlap, and percent volume difference were used to measure the reliability between manual and automated delineations for each frontal lobe region. For FS, the mean Dice's coefficient for all regions was 0.75 and the percent volume difference was 21.2%. For T-C, the mean Dice's coefficient was 0.77 and the mean percent volume difference for all regions was 20.2%. These results, along with a high degree of agreement between the two automated methods (mean Dice's coefficient = 0.81, percent volume difference = 12.4%) and a proof-of-principle group difference analysis that highlights the consistency and sensitivity of the automated methods, indicate that the automated methods are valid techniques for parcellation of the frontal lobe into functionally relevant sub-regions. Thus, the methodology has the potential to increase efficiency, statistical power and reproducibility for population analyses of neuropsychiatric disorders with hypothesized frontal lobe contributions. Copyright © 2013 Wiley Periodicals, Inc.
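Dice's coefficient and percent volume difference reduce to a few lines on boolean masks; the percent-difference denominator below (the manual volume) is an assumption about the paper's exact definition.

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two boolean segmentation masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2.0 * (a & b).sum() / (a.sum() + b.sum())

def percent_volume_difference(a, b):
    """Unsigned volume difference, here relative to the manual mask a."""
    va, vb = np.asarray(a, bool).sum(), np.asarray(b, bool).sum()
    return 100.0 * abs(int(va) - int(vb)) / va
```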
NASA Astrophysics Data System (ADS)
Hendricks, S.; Hoppmann, M.; Hunkeler, P. A.; Kalscheuer, T.; Gerdes, R.
2015-12-01
In Antarctica, ice crystals (platelets) form and grow in supercooled waters below ice shelves. These platelets rise and accumulate beneath nearby sea ice to form a several meter thick sub-ice platelet layer. This special ice type is a unique habitat, influences sea-ice mass and energy balance, and its volume can be interpreted as an indicator for ice-ocean interactions. Although progress has been made in determining and understanding its spatio-temporal variability based on point measurements, an investigation of this phenomenon on a larger scale remains a challenge due to logistical constraints and a lack of suitable methodology. In the present study, we applied a laterally constrained Marquardt-Levenberg inversion to a unique multi-frequency electromagnetic (EM) induction sounding dataset obtained on the ice-shelf influenced fast-ice regime of Atka Bay, eastern Weddell Sea. We adapted the inversion algorithm to incorporate a sensor-specific signal bias, and confirmed the reliability of the algorithm by performing a sensitivity study using synthetic data. We inverted the field data for sea-ice and sub-ice platelet-layer thickness and electrical conductivity, and calculated ice-volume fractions from platelet-layer conductivities using Archie's Law. The thickness results agreed well with drill-hole validation datasets within the uncertainty range, and the ice-volume fraction also yielded plausible results. Our findings imply that multi-frequency EM induction sounding is a suitable approach to efficiently map sea-ice and platelet-layer properties. However, we emphasize that the successful application of this technique requires a break with traditional EM sensor calibration strategies due to the need of absolute calibration with respect to a physical forward model.
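The Archie's-law step maps an inverted layer conductivity to an ice-volume fraction; the seawater conductivity and cementation exponent below are placeholder values, not those calibrated in the study.

```python
def ice_volume_fraction(sigma_layer, sigma_seawater=2.7, m=1.75):
    """Archie's law: sigma_layer = sigma_seawater * phi**m, where phi is
    the liquid (brine) volume fraction; the solid ice fraction is 1 - phi."""
    phi = (sigma_layer / sigma_seawater) ** (1.0 / m)
    return 1.0 - phi

print(ice_volume_fraction(1.2))   # ~0.37 with these placeholder constants
```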
Miguel, Susana; Caldeira, Sílvia; Vieira, Margarida
2018-04-01
This article describes the adequacy of the Q methodology as a new option for the validation of nursing diagnoses related to subjective foci. It is a discussion paper on the characteristics of the Q methodology. The method has been used in nursing research, particularly in relation to subjective concepts, and includes both a quantitative and a qualitative dimension. The Q methodology seems to be an adequate and innovative method for the clinical validation of nursing diagnoses. The validation of nursing diagnoses related to subjective foci using the Q methodology could improve the level of evidence and provide nurses with clinical indicators for clinical reasoning and for the planning of effective interventions. © 2016 NANDA International, Inc.
Psychometric evaluation of commonly used game-specific skills tests in rugby: A systematic review
Oorschot, Sander; Chiwaridzo, Matthew; CM Smits-Engelsman, Bouwien
2017-01-01
Objectives: To (1) give an overview of commonly used game-specific skills tests in rugby and (2) evaluate the available psychometric information on these tests. Methods: The databases PubMed, MEDLINE, CINAHL and Africa-Wide Information were systematically searched for articles published between January 1995 and March 2017. First, commonly used game-specific skills tests were identified. Second, the available psychometrics of these tests were evaluated and the methodological quality of the studies assessed using the Consensus-based Standards for the selection of health Measurement Instruments checklist. Studies included in the first step had to report detailed information on the construct and testing procedure of at least one game-specific skill; studies included in the second step additionally had to report at least one psychometric property evaluating reliability, validity or responsiveness. Results: 287 articles were identified in the first step, of which 30 met the inclusion criteria; 64 articles were identified in the second step, of which 10 were included. Reactive agility, tackling and simulated rugby games were the most commonly used tests. All 10 studies reporting psychometrics reported reliability outcomes, revealing mainly strong evidence; however, all studies scored poor or fair on methodological quality. Four studies reported validity outcomes, indicating mainly moderate evidence, but all had fair methodological quality. Conclusion: Game-specific skills tests indicated mainly high reliability and validity evidence, but the studies lacked methodological quality. Reactive agility seems to be a promising domain, but the specific tests need further development. Future studies of high methodological quality are required in order to develop valid and reliable test batteries for rugby talent identification. Trial registration number: PROSPERO CRD42015029747. PMID:29259812
NASA Technical Reports Server (NTRS)
Ferber, R. (Editor); Evans, D. (Editor)
1978-01-01
The background, objectives and methodology used for the Small Power Systems Solar Electric Workshop are described, and a summary of the results and conclusions developed at the workshop regarding small solar thermal electric power systems is presented.
DOT National Transportation Integrated Search
2012-05-05
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the ICM AMS methodology successfully and effectively. It provides a step-by-step approach to ...
Russkij jazyk za rubezom. Jahrgang 1974 ("The Russian Language Abroad." Volume 1974)
ERIC Educational Resources Information Center
Huebner, Wolfgang
1975-01-01
Articles in the 1974 volume of this periodical are briefly reviewed, preponderantly under the headings of teaching materials, methodology, linguistics, scientific reports, and chronicle. Reviews and supplements, tapes and other materials are also included. (Text is in German.) (IFS/WGA)
Kessel, Kerstin A; Habermehl, Daniel; Jäger, Andreas; Floca, Ralf O; Zhang, Lanlan; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E
2013-06-07
In radiation oncology, recurrence analysis is an important part of the evaluation process and clinical quality assurance of treatment concepts. Using the example of 9 patients with locally advanced pancreatic cancer, we developed and validated interactive analysis tools to support the evaluation workflow. After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes, the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volumes. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Recurrence analysis of the 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border (out-of-field). With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm; however, this might be due to very rapid growth and/or late detection of the tumor progression. The main goal of using automatic analysis tools is to reduce the time and effort of conducting clinical analyses. We showed a first approach and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations, we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition.
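On co-registered dose and recurrence grids, the isodose-overlap statistic reduces to a mask intersection; the array names and the uniform-prescription handling below are assumptions for illustration.

```python
import numpy as np

def fraction_inside_isodose(recurrence_mask, dose, rx_dose, level=0.80):
    """Fraction of recurrence voxels receiving >= level * prescription dose."""
    rec = np.asarray(recurrence_mask, bool)
    return (rec & (dose >= level * rx_dose)).sum() / rec.sum()

# Toy 3D grids: uniform 50 Gy prescription, recurrence partly in a cold spot.
dose = np.full((4, 4, 4), 50.0); dose[0] = 30.0
rec  = np.zeros_like(dose, bool); rec[0:2, 1, 1] = True
print(fraction_inside_isodose(rec, dose, rx_dose=50.0))   # 0.5
```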
Three-dimensional analysis of anisotropic spatially reinforced structures
NASA Technical Reports Server (NTRS)
Bogdanovich, Alexander E.
1993-01-01
The material-adaptive three-dimensional analysis of inhomogeneous structures based on the meso-volume concept and application of deficient spline functions for displacement approximations is proposed. The general methodology is demonstrated on the example of a brick-type mosaic parallelepiped arbitrarily composed of anisotropic meso-volumes. A partition of each meso-volume into sub-elements, application of deficient spline functions for a local approximation of displacements and, finally, the use of the variational principle allow one to obtain displacements, strains, and stresses at any point within the structural part. All of the necessary external and internal boundary conditions (including the conditions of continuity of transverse stresses at interfaces between adjacent meso-volumes) can be satisfied with requisite accuracy by increasing the density of the sub-element mesh. The application of the methodology to textile composite materials is described. Several numerical examples for woven and braided rectangular composite plates and stiffened panels under transverse bending are considered. Some typical effects of stress concentrations due to the material inhomogeneities are demonstrated.
Matias-Guiu, Pau; Rodríguez-Bencomo, Juan José; Pérez-Correa, José R; López, Francisco
2018-04-15
Developing new distillation strategies can help the spirits industry to improve quality, safety and process efficiency. Batch stills equipped with a packed column and an internal partial condenser are an innovative experimental system, allowing a fast and flexible management of the rectification. In this study, the impact of four factors (heart-cut volume, head-cut volume, pH and cooling flow rate of the internal partial condenser during the head-cut fraction) on 18 major volatile compounds of Muscat spirits was optimized using response surface methodology and desirability function approaches. Results have shown that high rectification at the beginning of the heart-cut enhances the overall positive aroma compounds of the product, reducing off-flavor compounds. In contrast, optimum levels of heart-cut volume, head-cut volume and pH factors varied depending on the process goal. Finally, three optimal operational conditions (head off-flavors reduction, flowery terpenic enhancement and fruity ester enhancement) were evaluated by chemical and sensory analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
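The desirability-function step combines several responses into a single score to be optimized; a minimal Derringer-Suich-style sketch follows, with invented aroma bounds.

```python
import numpy as np

def d_max(y, lo, hi):
    """Desirability of a response to be maximized (e.g., a fruity ester)."""
    return float(np.clip((y - lo) / (hi - lo), 0.0, 1.0))

def d_min(y, lo, hi):
    """Desirability of a response to be minimized (e.g., a head off-flavor)."""
    return float(np.clip((hi - y) / (hi - lo), 0.0, 1.0))

def overall(ds):
    """Geometric mean of individual desirabilities."""
    return float(np.prod(ds) ** (1.0 / len(ds)))

print(overall([d_max(2.1, 0.5, 3.0), d_min(0.8, 0.1, 2.0)]))  # invented bounds
```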
Methodological challenges when doing research that includes ethnic minorities: a scoping review.
Morville, Anne-Le; Erlandsson, Lena-Karin
2016-11-01
There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems within occupational therapy research, when ethnic minorities are included. A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, Cinahl, Web of Science and PsychInfo. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues concerning the entire research process from defining and recruiting samples, the conceptual understanding, lack of appropriate instruments, data collection using interpreters to analyzing data. In order to avoid excluding the ethnic minorities from adequate occupational therapy research and interventions, development of methods for the entire research process is needed. It is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.
Experimental Validation of an Integrated Controls-Structures Design Methodology
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.
1996-01-01
The first experimental validation of an integrated controls-structures design methodology for a class of large order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Guassian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.
NASA Astrophysics Data System (ADS)
Pawar, Sumedh; Sharma, Atul
2018-01-01
This work presents mathematical model and solution methodology for a multiphysics engineering problem on arc formation during welding and inside a nozzle. A general-purpose commercial CFD solver ANSYS FLUENT 13.0.0 is used in this work. Arc formation involves strongly coupled gas dynamics and electro-dynamics, simulated by solution of coupled Navier-Stoke equations, Maxwell's equations and radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated with an excellent agreement with the published results. The developed mathematical model and the user defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc-formation, in 2D axisymmetric coordinates system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles arc chamber of SF6 gas circuit breaker; thus, this methodology can be extended to simulate arcing phenomenon during current interruption.
Payload training methodology study
NASA Technical Reports Server (NTRS)
1990-01-01
The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PTC) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept covers the entire payload training program, from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) the Training and Simulation Needs Assessment Methodology; (2) the Simulation Approach Methodology; (3) the Simulation Definition Analysis Methodology; (4) the Simulator Requirements Standardization Methodology; (5) the Simulator Development Verification Methodology; and (6) the Simulator Validation Methodology.
Patrick D. Miles; Andrew D. Hill
2010-01-01
The U.S. Forest Service's Forest Inventory and Analysis (FIA) program collects sample plot data on all forest ownerships across the United States. This report documents the methodology used to estimate live-tree gross, net, and sound volume for the 24 States inventoried by the Northern Research Station's (NRS) FIA unit. Sound volume is of particular interest...
Current Concerns in Validity Theory.
ERIC Educational Resources Information Center
Kane, Michael
Validity is concerned with the clarification and justification of the intended interpretations and uses of observed scores. It has not been easy to formulate a general methodology set of principles for validation, but progress has been made, especially as the field has moved from relatively limited criterion-related models to sophisticated…
Validation of SMAP surface soil moisture products with core validation sites
USDA-ARS?s Scientific Manuscript database
The NASA Soil Moisture Active Passive (SMAP) mission has utilized a set of core validation sites as the primary methodology in assessing the soil moisture retrieval algorithm performance. Those sites provide well-calibrated in situ soil moisture measurements within SMAP product grid pixels for diver...
Initial Development and Validation of the Global Citizenship Scale
ERIC Educational Resources Information Center
Morais, Duarte B.; Ogden, Anthony C.
2011-01-01
The purpose of this article is to report on the initial development of a theoretically grounded and empirically validated scale to measure global citizenship. The methodology employed is multi-faceted, including two expert face validity trials, extensive exploratory and confirmatory factor analyses with multiple datasets, and a series of three…
Ver Elst, K; Vermeiren, S; Schouwers, S; Callebaut, V; Thomson, W; Weekx, S
2013-12-01
CLSI recommends a minimal citrate tube fill volume of 90%. A validation protocol with clinical and analytical components was set up to determine the tube fill threshold for the international normalized ratio of prothrombin time (PT-INR), activated partial thromboplastin time (aPTT) and fibrinogen. Citrated coagulation samples from 16 healthy donors and eight patients receiving vitamin K antagonists (VKA) were evaluated. Eighty-nine tubes were filled to varying volumes of >50%. Coagulation tests were performed on the ACL TOP 500 CTS®. Receiver Operating Characteristic (ROC) plots, with total error (TE) and critical difference (CD) as possible acceptance criteria, were used to determine the fill threshold. ROC was most accurate with CD for PT-INR and TE for aPTT, resulting in thresholds of 63% for PT and 80% for aPTT. By adapted ROC, with the threshold set at the point of 100% sensitivity at maximum specificity, CD was best for PT and TE for aPTT, resulting in thresholds of 73% for PT and 90% for aPTT. For fibrinogen, the method was only valid with the TE criterion, at a 63% fill volume. In our study, we validated minimal citrate tube fill volumes of 73%, 90% and 63% for PT-INR, aPTT and fibrinogen, respectively. © 2013 John Wiley & Sons Ltd.
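The adapted ROC rule (the threshold at 100% sensitivity with maximum specificity) can be sketched as below; the paired fill-volume/deviation data are invented, chosen so the sketch reproduces a 73% PT threshold.

```python
import numpy as np

# Invented pairs: tube fill volume (%) and whether PT-INR deviated beyond
# the critical difference (CD) from the matching full-tube result.
fill     = np.array([55, 60, 63, 68, 73, 78, 85, 90, 95, 100])
deviates = np.array([ 1,  1,  1,  1,  0,  0,  0,  0,  0,   0], bool)

best = None
for thr in np.unique(fill):
    flagged = fill < thr                       # tubes rejected as underfilled
    sens = flagged[deviates].mean()            # deviating samples caught
    spec = (~flagged[~deviates]).mean()        # acceptable samples retained
    if sens == 1.0 and (best is None or spec > best[1]):
        best = (int(thr), float(spec))
print("minimal acceptable fill volume:", best)  # (73, 1.0) for these data
```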
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Cudney, Elizabeth A.
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Six Sigma. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an application of Six Sigma in a…
Positive Deviance: Learning from Positive Anomalies
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Gale, Dick
2017-01-01
Purpose: This paper is one of seven in this volume, each elaborating different approaches to quality improvement in education. The purpose of this paper is to delineate a methodology called positive deviance. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an…
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
NASA Technical Reports Server (NTRS)
Greitzer, E. M.; Bonnefoy, P. A.; delaRosaBlanco, E.; Dorbian, C. S.; Drela, M.; Hall, D. K.; Hansman, R. J.; Hileman, J. I.; Liebeck, R. H.; Lovegren, J.;
2010-01-01
Appendices A to F present the theory behind the TASOPT methodology and code. Appendix A describes the bulk of the formulation, while Appendices B to F develop the major sub-models for the engine, fuselage drag, BLI accounting, etc.
Predeployment validation of fault-tolerant systems through software-implemented fault insertion
NASA Technical Reports Server (NTRS)
Czeck, Edward W.; Siewiorek, Daniel P.; Segall, Zary Z.
1989-01-01
The fault-injection-based automated testing (FIAT) environment, which can be used to experimentally characterize and evaluate distributed real-time systems under fault-free and faulted conditions, is described. A survey of validation methodologies is presented, and the need for fault insertion within them is demonstrated. The origins and models of faults, and the motivation for the FIAT concept, are reviewed. FIAT employs a validation methodology which builds confidence in the system by first providing a baseline of fault-free performance data and then characterizing the behavior of the system with faults present. Fault insertion is accomplished through software and allows faults, or the manifestations of faults, to be inserted either by seeding faults into memory or by triggering error-detection mechanisms. FIAT is capable of emulating a variety of fault-tolerant strategies and architectures, can monitor system activity, and can automatically orchestrate experiments involving the insertion of faults. A common system interface eases use and decreases experiment development and run time. Fault models chosen for experiments on FIAT have generated system responses which parallel those observed in real systems under faulty conditions. These capabilities are shown by two example experiments, each using a different fault-tolerance strategy.
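A toy Python sketch of the memory-seeding idea behind FIAT; the checksum detector and payload are invented for illustration, and the real environment targets distributed real-time software rather than a single buffer:

    import random

    def checksum(data):
        # toy error-detection mechanism: sum of bytes modulo 256
        return sum(data) % 256

    def seed_fault(data, index=None, bit=None):
        """Emulate a memory fault by flipping one bit of one byte."""
        index = random.randrange(len(data)) if index is None else index
        bit = random.randrange(8) if bit is None else bit
        corrupted = bytearray(data)
        corrupted[index] ^= 1 << bit
        return bytes(corrupted)

    payload = b"baseline fault-free payload"
    reference = checksum(payload)

    # Fault-free baseline run, then a faulted run, mirroring the FIAT workflow.
    assert checksum(payload) == reference
    faulted = seed_fault(payload)
    print("fault detected:", checksum(faulted) != reference)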
Repetitive deliberate fires: Development and validation of a methodology to detect series.
Bruenisholz, Eva; Delémont, Olivier; Ribaux, Olivier; Wilson-Wilde, Linzi
2017-08-01
The detection of repetitive deliberate fire events is challenging and still often ineffective due to a case-by-case approach. A previous study provided a critical review of the situation and an analysis of the main challenges, and suggested that the intelligence process, integrating forensic data, could be a valid framework for systematic follow-up and analysis, provided it is adapted to the specificities of repetitive deliberate fires. In the current manuscript, a specific methodology to detect deliberate fire series, i.e. fires set by the same perpetrators, is presented and validated. It is based on case profiles relying on specific elements previously identified. The method was validated using a dataset of approximately 8000 deliberate fire events collected over 12 years in a Swiss state. Twenty possible series were detected, including 6 of 9 known series. These results are very promising and lead the way to a systematic implementation of this methodology in an intelligence framework, whilst demonstrating the need for, and benefit of, increasing the collection of forensic-specific information to strengthen the value of links between cases. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
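The linking step can be pictured as pairwise similarity over categorical case-profile elements; in the Python sketch below the profile features, the similarity measure and the threshold are all placeholders rather than the paper's actual elements:

    from itertools import combinations

    # Hypothetical case profiles: categorical elements describing each fire event.
    cases = {
        "A": {"target": "vehicle", "ignition": "accelerant", "time": "night", "area": "N"},
        "B": {"target": "vehicle", "ignition": "accelerant", "time": "night", "area": "N"},
        "C": {"target": "bin", "ignition": "naked_flame", "time": "day", "area": "S"},
    }

    def similarity(p, q):
        """Fraction of shared profile elements (simple matching coefficient)."""
        keys = p.keys() & q.keys()
        return sum(p[k] == q[k] for k in keys) / len(keys)

    THRESHOLD = 0.75  # placeholder link-acceptance threshold
    links = [(a, b) for a, b in combinations(cases, 2)
             if similarity(cases[a], cases[b]) >= THRESHOLD]
    print("candidate series links:", links)  # [('A', 'B')]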
Reverse Engineering Validation using a Benchmark Synthetic Gene Circuit in Human Cells
Kang, Taek; White, Jacob T.; Xie, Zhen; Benenson, Yaakov; Sontag, Eduardo; Bleris, Leonidas
2013-01-01
Multi-component biological networks are often understood incompletely, in large part due to the lack of reliable and robust methodologies for network reverse engineering and characterization. As a consequence, developing automated and rigorously validated methodologies for unraveling the complexity of biomolecular networks in human cells remains a central challenge to life scientists and engineers. Today, when it comes to experimental and analytical requirements, there exists a great deal of diversity in reverse engineering methods, which renders the independent validation and comparison of their predictive capabilities difficult. In this work we introduce an experimental platform customized for the development and verification of reverse engineering and pathway characterization algorithms in mammalian cells. Specifically, we stably integrate a synthetic gene network in human kidney cells and use it as a benchmark for validating reverse engineering methodologies. The network, which is orthogonal to endogenous cellular signaling, contains a small set of regulatory interactions that can be used to quantify the reconstruction performance. By performing successive perturbations to each modular component of the network and comparing protein and RNA measurements, we study the conditions under which we can reliably reconstruct the causal relationships of the integrated synthetic network. PMID:23654266
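The perturbation-based reconstruction logic lends itself to a compact sketch: given measured responses to perturbing each module, infer an edge wherever a response exceeds a significance cut-off. The matrix, threshold and sign convention below are illustrative assumptions, not the authors' analysis:

    import numpy as np

    # response[i, j]: log2 fold-change of readout j after perturbing component i
    # (hypothetical measurements for a three-node benchmark network)
    response = np.array([
        [ 0.0, -2.1,  0.1],   # perturbing node 0 represses node 1
        [ 0.2,  0.0,  1.8],   # perturbing node 1 activates node 2
        [ 0.0,  0.1,  0.0],   # node 2 has no downstream targets
    ])

    THRESHOLD = 1.0  # placeholder significance cut-off
    edges = [(i, j, "activates" if response[i, j] > 0 else "represses")
             for i in range(response.shape[0])
             for j in range(response.shape[1])
             if i != j and abs(response[i, j]) >= THRESHOLD]
    print(edges)  # [(0, 1, 'represses'), (1, 2, 'activates')]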
Using remote sensing for volumetric analyses of soil degradation by erosion
NASA Astrophysics Data System (ADS)
Vlacilova, Marketa; Krasa, Josef; Kavka, Petr
2014-05-01
Soil degradation by erosion can be effectively monitored and quantified with modern remote sensing tools at a variable level of detail. The presented study deals with rill erosion assessment using stereoscopic images and orthophotos obtained by UAV (unmanned aerial vehicle). The advantages of UAVs are high-resolution data (1-10 cm/pixel), flexibility of data acquisition, and price in comparison with standard aerial photography. A location affected by an intensive rainfall event in spring 2013 was selected for this study of volumetric assessment of soil degradation by erosion. After the storm, rills and ephemeral gullies at different scales were detected on several fields in the target area. The study focused on a single-parcel catchment (12.5 ha) attached to the main ephemeral gully in the monitored field. A DEM of the location was obtained from UAV stereo images and official LIDAR data. At the same time, in-situ monitoring was carried out for comparison and validation of the methodology. The field measurement consisted of soil sampling and taking detailed stereo photographs of erosion rills. The photographs were processed with PhotoModeler Scanner software to obtain detailed surface data (TIN) for particular rills. A model for automatic and precise volumetric assessment of single rills was developed within ArcGIS. The DEM of the whole study area obtained from the UAV was also analysed in ArcGIS, using a similar methodology for the computation of rill volumes. The UAV DEM detected most rill bottoms and shapes; however, the level of detail was too low for an actual sediment transport volume estimate. Therefore, the volume obtained from the UAV DEM was calibrated against the detailed models of single rills acquired by field measurement. Prior to calibration, the UAV DEM volume was underestimated by 40-85% depending on rill size. Afterwards, the target area was split into twelve separate regions defined by the intensity and form of soil degradation (orthophoto-classified rill density), and at least one representative square plot was created in each region. Next, the volume of erosion rills in each square plot was calculated and corrected by the reference relation. These results were extrapolated to the whole study catchment. The study contains a volumetric evaluation of actual soil loss by rill erosion at a detailed scale and, in addition, a model for rill volume evaluation in heavily eroded fields. The results illustrate that the volume of soil loss can reach extreme values in affected areas after only one intensive rainfall event. Hundreds of cubic metres of soil can be transported in rills and ephemeral gullies from a single hectare of arable land. The findings are useful for the development and verification of procedures for the identification and evaluation of actual degradation of agricultural land by water erosion. The research has been supported by the project No. QJ330118 "Using Remote Sensing for Monitoring of Soil Degradation by Erosion and Erosion Effects".
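The core volumetric operation, integrating incision depth between a reconstructed pre-erosion surface and the measured DEM and then applying a field-derived correction, can be sketched as follows; the cell size, calibration factor and elevation grids are placeholders, not values from the study:

    import numpy as np

    CELL_AREA = 0.05 * 0.05        # m^2 per DEM cell (placeholder 5 cm resolution)
    CALIBRATION = 1.0 / (1 - 0.6)  # placeholder correction for ~60% underestimation

    # Hypothetical elevation grids (m): smoothed pre-erosion surface and measured DEM
    pre_erosion = np.full((4, 4), 100.00)
    measured = pre_erosion - np.array([
        [0.00, 0.02, 0.03, 0.00],
        [0.00, 0.05, 0.08, 0.01],
        [0.00, 0.04, 0.06, 0.00],
        [0.00, 0.00, 0.01, 0.00],
    ])

    depth = np.clip(pre_erosion - measured, 0, None)  # incision depth per cell
    raw_volume = depth.sum() * CELL_AREA              # m^3 from the UAV DEM
    print(f"calibrated rill volume: {raw_volume * CALIBRATION:.4f} m^3")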
Validation of urban freeway models.
DOT National Transportation Integrated Search
2015-01-01
This report describes the methodology, data, conclusions, and enhanced models regarding the validation of two sets of models developed in the Strategic Highway Research Program 2 (SHRP 2) Reliability Project L03, Analytical Procedures for Determining...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... Program the ability to conduct pretests which evaluate the validity and reliability of information... the proposed collection of information, including the validity of the methodology and assumptions used...
Multiple reaction monitoring (MRM) of plasma proteins in cardiovascular proteomics.
Dardé, Verónica M; Barderas, Maria G; Vivanco, Fernando
2013-01-01
Different methodologies have been used over the years to discover new potential biomarkers related to cardiovascular risk. The conventional proteomic strategy involves a discovery phase that requires the use of mass spectrometry (MS) and a validation phase, usually on an alternative platform such as immunoassays, that can then be implemented in clinical practice. This approach is suitable for a single biomarker, but when large panels of biomarkers must be validated, the process becomes inefficient and costly. Therefore, it is essential to find an alternative methodology to perform biomarker discovery, validation, and quantification. The capabilities of quantitative MS make it an extremely attractive alternative to antibody-based technologies. Although it has traditionally been used for the quantification of small molecules in clinical chemistry, MRM is now emerging as an alternative to traditional immunoassays for candidate protein biomarker validation.
Validation of virtual learning object to support the teaching of nursing care systematization.
Salvador, Pétala Tuani Candido de Oliveira; Mariz, Camila Maria Dos Santos; Vítor, Allyne Fortes; Ferreira Júnior, Marcos Antônio; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira
2018-01-01
to describe the content validation process of a Virtual Learning Object to support the teaching of nursing care systematization to nursing professionals. methodological study, with quantitative approach, developed according to the methodological reference of Pasquali's psychometry and conducted from March to July 2016, from two-stage Delphi procedure. in the Delphi 1 stage, eight judges evaluated the Virtual Object; in Delphi 2 stage, seven judges evaluated it. The seven screens of the Virtual Object were analyzed as to the suitability of its contents. The Virtual Learning Object to support the teaching of nursing care systematization was considered valid in its content, with a Total Content Validity Coefficient of 0.96. it is expected that the Virtual Object can support the teaching of nursing care systematization in light of appropriate and effective pedagogical approaches.
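For orientation, one common formulation of a content validity coefficient (Hernández-Nieto's CVC, shown here purely as an assumption; the study itself follows Pasquali's framework) can be computed like this:

    import numpy as np

    def cvc(ratings, scale_max=5):
        """Per-item CVC corrected for judge bias: mean/scale_max - (1/J)**J."""
        ratings = np.asarray(ratings, dtype=float)
        J = ratings.shape[0]                       # number of judges
        cvci = ratings.mean(axis=0) / scale_max    # per-item coefficient
        return cvci - (1.0 / J) ** J               # subtract possible judge bias

    # Hypothetical ratings: 7 judges x 3 screens, 5-point adequacy scale
    ratings = [[5, 5, 4], [5, 4, 5], [4, 5, 5], [5, 5, 5],
               [5, 4, 4], [5, 5, 5], [4, 5, 5]]
    per_item = cvc(ratings)
    print(per_item, per_item.mean())  # total CVC = mean over items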
Snodgrass, Melinda R; Chung, Moon Y; Meadan, Hedda; Halle, James W
2018-03-01
Single-case research (SCR) has been a valuable methodology in special education research. Montrose Wolf (1978), an early pioneer in single-case methodology, coined the term "social validity" to refer to the social importance of the goals selected, the acceptability of procedures employed, and the effectiveness of the outcomes produced in applied investigations. Since 1978, many contributors to SCR have included social validity as a feature of their articles and several authors have examined the prevalence and role of social validity in SCR. We systematically reviewed all SCR published in six highly-ranked special education journals from 2005 to 2016 to establish the prevalence of social validity assessments and to evaluate their scientific rigor. We found relatively low, but stable prevalence with only 28 publications addressing all three factors of the social validity construct (i.e., goals, procedures, outcomes). We conducted an in-depth analysis of the scientific rigor of these 28 publications. Social validity remains an understudied construct in SCR, and the scientific rigor of social validity assessments is often lacking. Implications and future directions are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, C; Yin, Y
Purpose: The purpose of this study was to compare radiation therapy treatment plans that would spare active bone marrow and whole pelvic bone marrow using 18F FLT PET/CT imaging. Methods: We have developed an IMRT planning methodology to incorporate functional PET imaging using 18F FLT PET/CT scans. Plans were generated for two cervical cancer patients, where the pelvic active bone marrow region was incorporated as an avoidance region based on the range SUV > 2; another region was the whole pelvic bone marrow. Dose objectives were set to reduce the irradiated volume of active bone marrow and whole bone marrow. The volumes receiving 10 Gy (V10) and 20 Gy (V20) for active bone marrow were evaluated. Results: Active bone marrow regions identified by 18F FLT with an SUV > 2 represented an average of 48.0% of the total osseous pelvis for the two cases studied. Improved dose-volume histograms for the identified bone marrow SUV volumes and decreases in V10 (average 18%) and V20 (average 14%) were achieved without clinically significant changes to PTV or OAR doses. Conclusion: Incorporation of 18F FLT PET/CT in IMRT planning provides a methodology to reduce radiation dose to active bone marrow without compromising PTV or OAR dose objectives in cervical cancer.
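The V10/V20 endpoints reduce to simple dose-volume bookkeeping over the avoidance structure; a minimal Python sketch with an invented dose grid and active-marrow mask:

    import numpy as np

    def v_x(dose, mask, x_gy):
        """Percent of the structure volume receiving at least x_gy."""
        structure_dose = dose[mask]
        return 100.0 * (structure_dose >= x_gy).sum() / structure_dose.size

    # Hypothetical 3D dose grid (Gy) and active-bone-marrow mask (SUV > 2 region)
    rng = np.random.default_rng(0)
    dose = rng.uniform(0, 50, size=(20, 20, 20))
    mask = rng.random((20, 20, 20)) > 0.5

    print(f"V10 = {v_x(dose, mask, 10):.1f}%  V20 = {v_x(dose, mask, 20):.1f}%")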
A Validated Methodology for Genetic Identification of Tuna Species (Genus Thunnus)
Viñas, Jordi; Tudela, Sergi
2009-01-01
Background: Tuna species of the genus Thunnus, such as the bluefin tunas, are some of the most important and yet most endangered trade fish in the world. Identification of these species in traded forms, however, may be difficult depending on the presentation of the products, which may hamper conservation efforts on trade control. In this paper, we validated a genetic methodology that can fully distinguish between the eight Thunnus species from any kind of processed tissue. Methodology: After testing several genetic markers, complete discrimination of the eight tuna species was achieved using Forensically Informative Nucleotide Sequencing, based primarily on the sequence variability of the hypervariable mitochondrial DNA control region (mtDNA CR), followed, in some specific cases, by a second validation with a nuclear marker, the rDNA first internal transcribed spacer (ITS1). This methodology was able to distinguish all tuna species, including those belonging to the subgenus Neothunnus, which are very closely related and consequently cannot be differentiated with other genetic markers of lower variability. The methodology also took into consideration the presence of introgression between T. thynnus, T. orientalis and T. alalunga that has been reported in past studies. Finally, we applied the methodology to cross-check the species identity of 26 processed tuna samples. Conclusions: Using the combination of two genetic markers, one mitochondrial and one nuclear, allows full discrimination between all eight tuna species. Unexpectedly, the genetic marker traditionally used for DNA barcoding, cytochrome oxidase 1, could not differentiate all species; thus its use as a genetic marker for tuna species identification is questioned. PMID:19898615
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This program report is the final report covering all the work done on this project. The goal of this project is the technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between the probabilistic design methodology and the axiomatic design methodology.
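For orientation, the kind of probabilistic design analysis that NESSUS automates can be illustrated with plain Monte Carlo on a textbook stress-strength limit state; the distributions and parameters below are invented, and NESSUS itself uses more efficient reliability methods:

    import numpy as np

    rng = np.random.default_rng(42)
    N = 1_000_000

    # Limit state g = strength - stress; failure when g < 0
    strength = rng.normal(loc=400.0, scale=25.0, size=N)  # MPa, hypothetical
    stress = rng.normal(loc=300.0, scale=30.0, size=N)    # MPa, hypothetical

    p_failure = np.mean(strength - stress < 0)
    print(f"estimated probability of failure: {p_failure:.2e}")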
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niedzielski, J; Martel, M; Tucker, S
2014-06-15
Purpose: Radiation induces an inflammatory response in the esophagus, discernible on CT studies. This work objectively quantifies the voxel-level esophageal radiation response for patients with acute esophagitis. This knowledge is an important first step towards predicting the effect of complex dose distributions on patient esophagitis symptoms. Methods: A previously validated voxel-based methodology for quantifying esophagitis severity was used to identify the voxel dose-response for 18 NSCLC patients with severe esophagitis (CTCAE grading criteria, grade 2 or higher). The response is quantified as percent voxel volume change for a given dose. During treatment (6-8 weeks), patients had weekly 4DCT studies and esophagitis scoring. Planning CT esophageal contours were deformed to each weekly CT using a demons DIR algorithm. An algorithm using the Jacobian map from the DIR of the planning CT to all weekly CTs was used to quantify voxel volume change, along with the corresponding delivered voxel dose, for each planning voxel. Dose for each voxel at each time-point was calculated on each previous weekly CT image and accumulated using DIR. Thus, for each voxel, the volume change and delivered dose were calculated at each time-point. The data were binned according to when the volume change first increased by a threshold volume (10%-100%, in 10% increments), and the average delivered dose was calculated for each bin. Results: The average dose resulting in a voxel volume increase of 10-100% was 21.6 to 45.9 Gy, respectively. The mean population dose giving a 50% volume increase was 36.3±4.4 Gy (range: 29.8 to 43.5 Gy). The average week of 50% response was 4.1 (range: 2.8 to 4.9 weeks). All 18 patients showed similar dose-to-first-response curves, indicating a common trend in the initial inflammatory response. Conclusion: We extracted the dose-response curve of the esophagus on a voxel-to-voxel level. This may be useful for estimating the esophagus response (and patient symptoms) to complicated dose distributions.
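The binning step, finding for each voxel the dose delivered when its volume change first crosses a threshold, can be sketched as follows; the per-voxel trajectories and the 50% threshold are illustrative:

    import numpy as np

    # Hypothetical per-voxel weekly data: volume change (%) and accumulated dose (Gy)
    volume_change = np.array([[5, 20, 55, 80],    # voxel 0 crosses 50% at week 3
                              [2, 10, 30, 60]])   # voxel 1 crosses 50% at week 4
    dose = np.array([[ 8, 18, 30, 41],
                     [ 8, 17, 28, 39]])

    THRESHOLD = 50  # percent volume increase
    first_cross = (volume_change >= THRESHOLD).argmax(axis=1)  # first crossing index
    dose_at_response = dose[np.arange(len(dose)), first_cross]
    print(f"mean dose at {THRESHOLD}% response: {dose_at_response.mean():.1f} Gy")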
NASA Astrophysics Data System (ADS)
Hoppmann, Mario; Hunkeler, Priska A.; Hendricks, Stefan; Kalscheuer, Thomas; Gerdes, Rüdiger
2016-04-01
In Antarctica, ice crystals (platelets) form and grow in supercooled waters below ice shelves. These platelets rise, accumulate beneath nearby sea ice, and subsequently form a several-meter-thick, porous sub-ice platelet layer. This special ice type is a unique habitat, influences sea-ice mass and energy balance, and its volume can be interpreted as an indicator of the health of an ice shelf. Although progress has been made in determining and understanding its spatio-temporal variability based on point measurements, investigating this phenomenon on a larger scale remains a challenge due to logistical constraints and a lack of suitable methodology. In the present study, we applied a laterally constrained Marquardt-Levenberg inversion to a unique multi-frequency electromagnetic (EM) induction sounding dataset obtained on the ice-shelf-influenced fast-ice regime of Atka Bay, eastern Weddell Sea. We adapted the inversion algorithm to incorporate a sensor-specific signal bias, and confirmed the reliability of the algorithm by performing a sensitivity study using synthetic data. We inverted the field data for sea-ice and platelet-layer thickness and electrical conductivity, and calculated ice-volume fractions within the platelet layer using Archie's Law. The thickness results agreed well with drillhole validation datasets within the uncertainty range, and the ice-volume fraction yielded results comparable to other studies. Both parameters together enable an estimation of the total ice volume within the platelet layer, which was found to be comparable to the volume of landfast sea ice in this region, and corresponded to more than a quarter of the annual basal melt volume of the nearby Ekström Ice Shelf. Our findings show that multi-frequency EM induction sounding is a suitable approach to efficiently map sea-ice and platelet-layer properties, with important implications for research into ocean/ice-shelf/sea-ice interactions. However, a successful application of this technique requires a break with traditional EM sensor calibration strategies, due to the need for absolute calibration with respect to a physical forward model.
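Archie's Law, used above to convert inverted bulk conductivities into ice-volume fractions, is compact enough to sketch directly; the cementation exponent and conductivity values below are placeholders rather than the study's calibrated choices:

    def ice_volume_fraction(sigma_bulk, sigma_brine, m=1.5):
        """Archie's Law: sigma_bulk = sigma_brine * porosity**m, so
        porosity = (sigma_bulk / sigma_brine)**(1/m); the solid (ice)
        fraction of the platelet layer is its complement."""
        porosity = (sigma_bulk / sigma_brine) ** (1.0 / m)
        return 1.0 - porosity

    # Hypothetical values: platelet-layer bulk conductivity vs. seawater brine (S/m)
    print(f"ice volume fraction: {ice_volume_fraction(1.2, 2.7):.2f}")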
DOT National Transportation Integrated Search
1982-09-01
This project provides information about norms and attitudes related to alcohol use and driving. This volume reports the methodology, findings, discussions, and conclusions of individual interviews conducted with early adolescents (ages 13-14), middle...
DOT National Transportation Integrated Search
1983-05-01
This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...
DOT National Transportation Integrated Search
2014-04-01
This report describes the methodology and results of analyses performed to identify and evaluate : alternative methods to control traffic entering a lane closure on a two-lane, two-way road from low-volume : access points. Researchers documented the ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dokhane, A.; Canepa, S.; Ferroukhi, H.
For stability analyses of the Swiss operating Boiling Water Reactors (BWRs), the methodology employed and validated so far at the Paul Scherrer Institute (PSI) was based on the RAMONA-3 code with a hybrid upstream static lattice/core analysis approach using CASMO-4 and PRESTO-2. More recently, steps were undertaken towards a new methodology based on the SIMULATE-3K (S3K) code for the dynamical analyses, combined with the CMSYS system relying on the CASMO/SIMULATE-3 suite of codes, which was established at PSI to serve as a framework for the development and validation of reference core models of all the Swiss reactors and operated cycles. This paper presents a first validation of the new methodology on the basis of a benchmark recently organised by a Swiss utility and including the participation of several international organisations with various codes/methods. In parallel, a transition from CASMO-4E (C4E) to CASMO-5M (C5M) as the basis for the CMSYS core models was also recently initiated at PSI. Consequently, it was considered appropriate to address the impact of this transition both for the steady-state core analyses and for the stability calculations, and thereby achieve an integral approach for the validation of the new S3K methodology. Therefore, a comparative assessment of C4E versus C5M is also presented in this paper, with particular emphasis on the void coefficients and their impact on the downstream stability analysis results. (authors)
Assessing validity of observational intervention studies - the Benchmarking Controlled Trials.
Malmivaara, Antti
2016-09-01
Benchmarking Controlled Trial (BCT) is a concept which covers all observational studies aiming to assess the impact of interventions or health care system features on patients and populations. The aim was to create and pilot-test a checklist for appraising the methodological validity of a BCT. The checklist was created by extracting the most essential elements from the comprehensive set of criteria in the previous paper on BCTs; checklists and scientific papers on observational studies and the respective systematic reviews were also utilized. Ten BCTs published in the Lancet and in the New England Journal of Medicine were used to assess the feasibility of the created checklist. The appraised studies seem to have several methodological limitations, some of which could be avoided in the planning, conducting and reporting phases of the studies. The checklist can be used for planning, conducting, reporting, reviewing, and critical reading of observational intervention studies. However, the piloted checklist should be validated in further studies. Key messages: Benchmarking Controlled Trial (BCT) is a concept which covers all observational studies aiming to assess the impact of interventions or health care system features on patients and populations. This paper presents a checklist for appraising the methodological validity of BCTs and pilot-tests the checklist with ten BCTs published in leading medical journals. The appraised studies seem to have several methodological limitations, some of which could be avoided in the planning, conducting and reporting phases of the studies. The checklist can be used for planning, conducting, reporting, reviewing, and critical reading of observational intervention studies.
NPS Transit System Passenger Boardings Study: Converting Ticket Sales to Passenger Boardings.
DOT National Transportation Integrated Search
2016-01-01
This report examines the reporting of passenger boardings (unlinked passenger trips) by NPS transit systems that use a ticket sales conversion methodology. By studying and validating the park units' passenger boarding methodology from converting tick...
High-Order Moving Overlapping Grid Methodology in a Spectral Element Method
NASA Astrophysics Data System (ADS)
Merrill, Brandon E.
A moving overlapping mesh methodology that achieves spectral accuracy in space and up to second-order accuracy in time is developed for the solution of the unsteady incompressible flow equations in three-dimensional domains. The targeted applications are in the aerospace and mechanical engineering domains and involve problems in turbomachinery, rotary aircraft, wind turbines and others. The methodology is built within the dual-session communication framework initially developed for stationary overlapping meshes. It employs semi-implicit spectral element discretization of the equations in each subdomain and explicit treatment of subdomain interfaces, with spectrally accurate spatial interpolation and high-order accurate temporal extrapolation, and requires few, if any, iterations, yet maintains the global accuracy and stability of the underlying flow solver. Mesh movement is enabled through the Arbitrary Lagrangian-Eulerian formulation of the governing equations, which allows for the prescription of arbitrary velocity values at discrete mesh points. The stationary and moving overlapping mesh methodologies are thoroughly validated using two- and three-dimensional benchmark problems in laminar and turbulent flows. The spatial and temporal global convergence, for both methods, is documented and is in agreement with the nominal order of accuracy of the underlying solver. The stationary overlapping mesh methodology was validated to assess the influence of long integration times and inflow-outflow global boundary conditions on performance. In a turbulent benchmark of fully developed turbulent pipe flow, the turbulence statistics are validated against the available data. Moving overlapping mesh simulations are validated on the problems of a two-dimensional oscillating cylinder and a three-dimensional rotating sphere. The aerodynamic forces acting on these moving rigid bodies are determined, and all results are compared with published data. Scaling tests, with both methodologies, show near-linear strong scaling, even for moderately large processor counts. The moving overlapping mesh methodology is utilized to investigate the effect of an upstream turbulent wake on a three-dimensional oscillating NACA0012 extruded airfoil. A direct numerical simulation (DNS) at Reynolds number 44,000 is performed for steady inflow incident upon the airfoil oscillating between angles of attack of 5.6° and 25° with reduced frequency k=0.16. Results are contrasted with a subsequent DNS of the same oscillating airfoil in a turbulent wake generated by an upstream stationary cylinder.
2014-01-01
To describe flow or transport phenomena in porous media, relations between aquifer hydraulic conductivity and effective porosity can prove useful, avoiding the need to perform expensive and time-consuming measurements. Practical applications generally require the determination of this parameter at field scale, while most of the empirical and semi-empirical formulas, based on grain size analysis and allowing determination of the hydraulic conductivity from the porosity, relate to the laboratory scale and thus are not representative of the aquifer volumes to which one refers. Therefore, following the grain size distribution methodology, a new experimental relation between hydraulic conductivity and effective porosity, representative of aquifer volumes at field scale, is given for a confined aquifer. The experimental values used to determine this law were obtained for both parameters using field measurement methods only. The experimental results, although strictly valid only for the investigated aquifer, can give useful suggestions for other alluvial aquifers with analogous grain-size distribution characteristics. Limited to the investigated range, a useful comparison with the best-known empirical formulas based on grain size analysis was carried out. The experimental data also allowed investigation of the existence of scaling behaviour for both parameters considered. PMID:25180202
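The kind of field-scale law described can be illustrated as a power-law fit of hydraulic conductivity against effective porosity in log space; the data pairs, and the power-law form itself as used here, are assumptions for the sketch rather than the paper's fitted relation:

    import numpy as np

    # Hypothetical field-scale pairs: effective porosity (-) and K (m/s)
    porosity = np.array([0.10, 0.14, 0.18, 0.22, 0.26])
    K = np.array([2.0e-5, 6.5e-5, 1.6e-4, 3.4e-4, 6.3e-4])

    # Fit K = a * porosity**b by least squares in log space
    b, log_a = np.polyfit(np.log(porosity), np.log(K), 1)
    a = np.exp(log_a)
    print(f"K ~ {a:.3e} * porosity^{b:.2f}")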
Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K
2015-01-01
Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (eg, variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708
van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre
2017-09-01
Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets the transferability or the reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it suggests a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than its reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohorts for one of the three tested models [area under the receiver operating curve (AUC) of the cohort differences model: 0.85], signaling that the validation leans towards transferability. Two out of three models had a lower AUC in validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
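A minimal sketch of the cohort differences model, assuming scikit-learn and two hypothetical feature tables; this is an illustrative reduction of the method, not the authors' implementation:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    X_train_cohort = rng.normal(0.0, 1.0, size=(200, 5))   # hypothetical features
    X_valid_cohort = rng.normal(0.8, 1.0, size=(154, 5))   # shifted: cohorts differ

    X = np.vstack([X_train_cohort, X_valid_cohort])
    y = np.r_[np.zeros(len(X_train_cohort)), np.ones(len(X_valid_cohort))]

    # Predict cohort membership; AUC near 0.5 suggests reproducibility is being
    # validated, AUC near 1.0 suggests the validation targets transferability.
    proba = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                              cv=5, method="predict_proba")[:, 1]
    print(f"cohort differences AUC: {roc_auc_score(y, proba):.2f}")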
Jones, Jeryl C; Appt, Susan E; Werre, Stephen R; Tan, Joshua C; Kaplan, Jay R
2010-06-01
The purpose of this study was to validate low-radiation-dose, contrast-enhanced, multi-detector computed tomography (MDCT) as a non-invasive method for measuring ovarian volume in macaques. Computed tomography scans of four known-volume phantoms and nine mature female cynomolgus macaques were acquired using a previously described low-radiation-dose scanning protocol, intravenous contrast enhancement, and a 32-slice MDCT scanner. Immediately following MDCT, ovaries were surgically removed and the ovarian weights were measured. The ovarian volumes were determined using water displacement. A veterinary radiologist who was unaware of the actual volumes measured ovarian CT volumes three times, using a laptop computer, pen display tablet, hand-traced regions of interest, and free image analysis software. A statistician selected and performed all tests comparing the actual and CT data. Ovaries were successfully located in all MDCT scans. The iliac arteries and veins, uterus, fallopian tubes, cervix, ureters, urinary bladder, rectum, and colon were also consistently visualized. Large antral follicles were detected in six ovaries. Phantom mean CT volume was 0.702 +/- 0.504 cc (SD) and mean actual volume was 0.743 +/- 0.526 cc (SD). Ovary mean CT volume was 0.258 +/- 0.159 cc (SD) and mean water displacement volume was 0.257 +/- 0.145 cc (SD). For phantoms, the mean coefficient of variation for CT volumes was 2.5%. For ovaries, the least squares mean coefficient of variation for CT volumes was 5.4%. The ovarian CT volume was significantly associated with the actual ovarian volume (ICC coefficient 0.79, regression coefficient 0.5, P=0.0006) and the actual ovarian weight (ICC coefficient 0.62, regression coefficient 0.6, P=0.015). There was no association between CT volume accuracy and mean ovarian CT density (degree of intravenous contrast enhancement), and there was no proportional or fixed bias in the CT volume measurements. Findings from this study indicate that MDCT is a valid non-invasive technique for measuring ovarian volume in macaques.
Validation of the ULCEAT methodology by applying it in retrospect to the Roboticbed.
Nakamura, Mio; Suzurikawa, Jun; Tsukada, Shohei; Kume, Yohei; Kawakami, Hideo; Inoue, Kaoru; Inoue, Takenobu
2015-01-01
In answer to the increasing demand for care from the oldest segment of the Japanese population, an extensive programme of life-support robots is under development, advocated by the Japanese government. Roboticbed® (RB) was developed to help patients in their daily life make independent transfers to and from the bed. The bed is intended both for elderly persons and for persons with a disability. The purpose of this study is to examine the validity of the user and user's life centred clinical evaluation of assistive technology (ULCEAT) methodology. The ULCEAT method was developed to support the user-centred development of life-support robots. By means of the ULCEAT method, the target users and the use environment were re-established in an earlier study. The validity of the method is tested by re-evaluating the development of RB in retrospect. Six participants used the first prototype of RB (RB1) and eight participants used the second prototype (RB2). The results indicated that the functionality was improved owing to the end-user evaluations; therefore, we confirmed the content validity of the proposed ULCEAT method. In this study we confirmed the validity of the ULCEAT methodology by applying it in retrospect to the RB development process. This method will be used for the development of life-support robots and prototype assistive technologies.
ERIC Educational Resources Information Center
Grasso, Janet; Fosburg, Steven
Fifth in a series of seven volumes reporting the design, methodology, and findings of the 4-year National Day Care Home Study (NDCHS), this volume presents a descriptive and statistical analysis of the day care institutions that administer day care systems. These systems, such as Learning Unlimited in Los Angeles and the family day care program of…
NASA Astrophysics Data System (ADS)
Michelon, M. F.; Antonelli, A.
2010-03-01
We have developed a methodology to study the thermodynamics of order-disorder transformations in n-component substitutional alloys that combines nonequilibrium methods, which can efficiently compute free energies, with Monte Carlo simulations, in which configurational and vibrational degrees of freedom are considered simultaneously on an equal footing. Furthermore, with this methodology one can easily perform simulations in the canonical and isobaric-isothermal ensembles, which allows investigation of the bulk volume effect. We have applied this methodology to calculate the configurational and vibrational contributions to the entropy of the Ni3Al alloy as functions of temperature. The simulations show that when the volume of the system is kept constant, the vibrational entropy does not change upon transition, while constant-pressure calculations indicate that the volume increase at the order-disorder transition causes a vibrational entropy increase of 0.08 kB/atom. This is significant when compared to the configurational entropy increase of 0.27 kB/atom. Our calculations also indicate that the inclusion of vibrations reduces the order-disorder transition temperature, determined solely considering the configurational degrees of freedom, by about 30%.
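In LaTeX form, the constant-pressure entropy bookkeeping reported above works out as follows (the decomposition simply sums the two quoted contributions):

    \Delta S_{\mathrm{trans}} = \Delta S_{\mathrm{conf}} + \Delta S_{\mathrm{vib}}
                              = (0.27 + 0.08)\,k_{B}/\mathrm{atom}
                              = 0.35\,k_{B}/\mathrm{atom},
    \qquad
    \frac{\Delta S_{\mathrm{vib}}}{\Delta S_{\mathrm{trans}}} = \frac{0.08}{0.35} \approx 23\%.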
Child Language Research: Building on the Past, Looking to the Future.
ERIC Educational Resources Information Center
Perera, Katharine
1994-01-01
Outlines descriptive, theoretical, and methodological advances in child language research since the first volume of the "Journal of Child Language" was published. Papers in this volume build on earlier research, point the way to new research avenues, and open new lines of inquiry. (Contains 36 references.) (JP)
DOT National Transportation Integrated Search
1983-03-01
This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...
DOT National Transportation Integrated Search
1982-09-01
This project provides information about norms and attitudes related to alcohol usage and driving. This volume reports the methodology, findings, discussion and conclusions of three focus groups: two with parents of teenaged drivers and one with adult...
Air Pollution. Part A: Analysis.
ERIC Educational Resources Information Center
Ledbetter, Joe O.
Two facets of the engineering control of air pollution (the analysis of possible problems and the application of effective controls) are covered in this two-volume text. Part A covers Analysis, and Part B, Prevention and Control. (This review is concerned with Part A only.) This volume deals with the terminology, methodology, and symptomatology…
Language Learners in Study Abroad Contexts. Second Language Acquisition
ERIC Educational Resources Information Center
DuFon, Margaret A., Ed.; Churchill, Eton, Ed.
2006-01-01
Examining the overseas experience of language learners in diverse contexts through a variety of theoretical and methodological approaches, studies in this volume look at the acquisition of language use, socialization processes, learner motivation, identity and learning strategies. In this way, the volume offers a privileged window into learner…
METHANE EMISSIONS FROM THE NATURAL GAS INDUSTRY VOLUME 3: GENERAL METHODOLOGY
The 15-volume report summarizes the results of a comprehensive program to quantify methane (CH4) emissions from the U.S. natural gas industry for the base year. The objective was to determine CH4 emissions from the wellhead and ending downstream at the customer's meter. The accur...
METHANE EMISSIONS FROM THE NATURAL GAS INDUSTRY VOLUME 4: STATISTICAL METHODOLOGY
The 15-volume report summarizes the results of a comprehensive program to quantify methane (CH4) emissions from the U.S. natural gas industry for the base year. The objective was to determine CH4 emissions from the wellhead and ending downstream at the customer's meter. The accur...
Tax Wealth in Fifty States. 1977 Supplement.
ERIC Educational Resources Information Center
Halstead, D. Kent; Weldon, H. Kent
This first supplement to the basic volume presents tax capacity, effort, and collected revenue data for state and local governments for 1977. Planned for issuance every other year, the supplement consists of computer printout tables with the earlier basic volume continuing to serve as reference for theory, analysis, and methodology. Figures for…
ERIC Educational Resources Information Center
System Development Corp., Santa Monica, CA.
A national data program for the marine environment is recommended. Volume 2 includes: (1) objectives, scope, and methodology; (2) summary of the technical development plan; (3) agency development plans - Great Lakes and coastal development and (4) marine data network development plans. (Author)
Empirical Assessment of the Mean Block Volume of Rock Masses Intersected by Four Joint Sets
NASA Astrophysics Data System (ADS)
Morelli, Gian Luca
2016-05-01
The estimation of a representative value for the rock block volume (V_b) is of great interest in rock engineering for rock mass characterization purposes. However, while mathematical relationships to precisely estimate this parameter from the spacing of joints can be found in the literature for rock masses intersected by three dominant joint sets, corresponding relationships do not actually exist when more than three sets occur. In these cases, a consistent assessment of V_b can only be achieved by directly measuring the dimensions of several representative natural rock blocks in the field or by means of more sophisticated 3D numerical modeling approaches. Nevertheless, Palmström's empirical relationship, based on the volumetric joint count J_v and on a block shape factor β, is commonly used in practice, although it is strictly valid only for rock masses intersected by three joint sets. Starting from these considerations, the present paper is primarily intended to investigate the reliability of a set of empirical relationships linking the block volume with the indexes most commonly used to characterize the degree of jointing in a rock mass (i.e. J_v and the mean value of the joint set spacings), specifically applicable to rock masses intersected by four sets of persistent discontinuities. Based on the analysis of artificial 3D block assemblies generated using the software AutoCAD, the most accurate best-fit regression was found between the mean block volume (V_bm) of the tested rock mass samples and the geometric mean value of the spacings of the joint sets delimiting the blocks, indicating this mean value as a promising parameter for the preliminary characterization of block size. Tests on field outcrops demonstrated that the proposed empirical methodology has the potential to predict the mean block volume of multiple-set jointed rock masses with an accuracy acceptable for most practical rock engineering applications.
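The fitted regression itself is not reproduced in the abstract, but its predictor is easy to compute; as a rough illustration, one might take the cube of the geometric mean spacing as a first-order block-volume scale (the cubing step is this sketch's assumption, not the paper's fitted relation):

    import numpy as np

    spacings = np.array([0.4, 0.6, 0.9, 1.2])  # hypothetical joint-set spacings (m)
    s_gm = np.exp(np.mean(np.log(spacings)))   # geometric mean spacing
    print(f"geometric mean spacing: {s_gm:.2f} m, "
          f"first-order V_b scale: {s_gm**3:.2f} m^3")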
Focus control enhancement and on-product focus response analysis methodology
NASA Astrophysics Data System (ADS)
Kim, Young Ki; Chen, Yen-Jen; Hao, Xueli; Samudrala, Pavan; Gomez, Juan-Manuel; Mahoney, Mark O.; Kamalizadeh, Ferhad; Hanson, Justin K.; Lee, Shawn; Tian, Ye
2016-03-01
With decreasing CDOF (critical depth of focus) for 20/14nm technology and beyond, focus errors are becoming increasingly critical for on-product performance. Current on-product focus control techniques in high-volume manufacturing are limited; it is difficult to define measurable focus error and to optimize the focus response on product with existing methods, due to the lack of credible focus measurement methodologies. Next to developments in the imaging and focus control capability of scanners and general tool stability maintenance, on-product focus control improvements are also required to meet on-product imaging specifications. In this paper, we discuss focus monitoring, wafer (edge) fingerprint correction and on-product focus budget analysis through a diffraction-based focus (DBF) measurement methodology. Several examples will be presented showing better focus response and control on product wafers. Also, a method will be discussed for a focus interlock automation system on product in a high-volume manufacturing (HVM) environment.
Environmental exposure effects on composite materials for commercial aircraft
NASA Technical Reports Server (NTRS)
Hoffman, D. J.
1978-01-01
Activities reported include completion of the program design tasks, resolution of a high-fiber-volume problem and resumption of specimen fabrication, fixture fabrication, and progress on the analysis methodology and the definition of the typical aircraft environment. Program design activities, including test specimens, specimen holding fixtures, flap-track fairing tailcones, and ground exposure racks, were completed. The problem experienced in obtaining acceptable fiber volume fraction results on two of the selected graphite-epoxy material systems was resolved with an alteration to the bagging procedure called out in BAC 5562. The revised bagging procedure, involving fewer bleeder plies, produces acceptable results. All required laminates for the contract have now been laid up and cured. Progress in the area of analysis methodology has centered on the definition of the environment that a commercial transport aircraft undergoes. The selected methodology is analogous to fatigue life assessment.
Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory
ERIC Educational Resources Information Center
Long, Haiying
2017-01-01
Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…
Situating Standard Setting within Argument-Based Validity
ERIC Educational Resources Information Center
Papageorgiou, Spiros; Tannenbaum, Richard J.
2016-01-01
Although there has been substantial work on argument-based approaches to validation as well as standard-setting methodologies, it might not always be clear how standard setting fits into argument-based validity. The purpose of this article is to address this lack in the literature, with a specific focus on topics related to argument-based…
Optimal Modality Selection for Cooperative Human-Robot Task Completion.
Jacob, Mithun George; Wachs, Juan P
2016-12-01
Human-robot cooperation in complex environments must be fast, accurate, and resilient. This requires efficient communication channels where robots need to assimilate information using a plethora of verbal and nonverbal modalities such as hand gestures, speech, and gaze. However, even though hybrid human-robot communication frameworks and multimodal communication have been studied, a systematic methodology for designing multimodal interfaces does not exist. This paper addresses the gap by proposing a novel methodology to generate multimodal lexicons which maximizes multiple performance metrics over a wide range of communication modalities (i.e., lexicons). The metrics are obtained through a mixture of simulation and real-world experiments. The methodology is tested in a surgical setting where a robot cooperates with a surgeon to complete a mock abdominal incision and closure task by delivering surgical instruments. Experimental results show that predicted optimal lexicons significantly outperform predicted suboptimal lexicons (p < 0.05) in all metrics, validating the predictive power of the methodology. The methodology is validated in two scenarios (with and without modeling the risk of a human-robot collision) and the differences in the lexicons are analyzed.
Neuroimaging correlates of parent ratings of working memory in typically developing children
Mahone, E. Mark; Martin, Rebecca; Kates, Wendy R.; Hay, Trisha; Horská, Alena
2009-01-01
The purpose of the present study was to investigate the construct validity of parent ratings of working memory in children, using a multi-trait/multi-method design including neuroimaging, rating scales, and performance-based measures. Thirty-five typically developing children completed performance-based tests of working memory and of non-executive-function skills, received volumetric MRI, and were rated by parents on both executive function (EF)-specific and broad behavior rating scales. After controlling for total cerebral volume and age, parent ratings of working memory were significantly correlated with frontal gray matter, but not with temporal, parietal, or occipital gray matter, or any lobar white matter volumes. Performance-based measures of working memory were also moderately correlated with frontal lobe gray matter volume; however, non-EF parent ratings and non-EF performance-based measures were not correlated with frontal lobe volumes. Results provide preliminary support for the convergent and discriminant validity of parent ratings of working memory, and emphasize their utility in exploring brain-behavior relationships in children. Rating scales that directly examine EF skills may potentially have ecological validity, not only for "everyday" function, but also as correlates of brain volume. PMID:19128526
Steppan, Martin; Kraus, Ludwig; Piontek, Daniela; Siciliano, Valeria
2013-01-01
Prevalence estimation of cannabis use is usually based on self-report data. Although there is evidence on the reliability of this data source, its cross-cultural validity is still a major concern, and external objective criteria are needed for this purpose. In this study, cannabis-related search engine query data are used as an external criterion. Data on cannabis use were taken from the 2007 European School Survey Project on Alcohol and Other Drugs (ESPAD). Provincial data came from three Italian nation-wide studies using the same methodology (2006-2008; ESPAD-Italia). Information on cannabis-related search engine queries was based on Google search volume indices (GSI). (1) Reliability analysis was conducted for GSI. (2) Latent measurement models of "true" cannabis prevalence were tested using perceived availability, web-based cannabis searches and self-reported prevalence as indicators. (3) Structural models were set up to test the influences of response tendencies and geographical position (latitude, longitude). In order to test the stability of the models, analyses were conducted at country level (Europe, US) and at provincial level in Italy. Cannabis-related GSI were found to be highly reliable and constant over time. The overall measurement model was highly significant in both data sets. At country level, no significant effects of response bias indicators or geographical position on perceived availability, web-based cannabis searches or self-reported prevalence were found. At provincial level, latitude had a significant positive effect on availability, indicating that the perceived availability of cannabis in northern Italy was higher than expected from the other indicators. Although GSI showed weaker associations with cannabis use than perceived availability, the findings underline the external validity and usefulness of search engine query data as an external criterion. The findings suggest an acceptable relative comparability of national (provincial) prevalence estimates of cannabis use that are based on a common survey methodology. Search engine query data alone are too weak an indicator to base prevalence estimates on, but in combination with other sources (waste water analysis, sales of cigarette paper) they may provide satisfactory estimates. Copyright © 2012. Published by Elsevier B.V.
Method validation for methanol quantification present in working places
NASA Astrophysics Data System (ADS)
Muna, E. D. M.; Bizarri, C. H. B.; Maciel, J. R. M.; da Rocha, G. P.; de Araújo, I. O.
2015-01-01
Given the widespread use of methanol in different industry sectors and the high toxicity associated with this substance, an analytical method is needed that can determine methanol levels in the air of working environments sensitively, precisely and accurately. Based on the methodology established by the National Institute for Occupational Safety and Health (NIOSH), a methodology for the determination of methanol in silica gel tubes was validated; its effectiveness was demonstrated through participation in the international collaborative program sponsored by the American Industrial Hygiene Association (AIHA).
System for the Analysis of Global Energy Markets - Vol. I, Model Documentation
2003-01-01
Documents the objectives and the conceptual and methodological approach used in the development of projections for the International Energy Outlook. The first volume of this report describes the System for the Analysis of Global Energy Markets (SAGE) methodology and provides an in-depth explanation of the equations of the model.
ERIC Educational Resources Information Center
Sandieson, Robert W.; Kirkpatrick, Lori C.; Sandieson, Rachel M.; Zimmerman, Walter
2010-01-01
Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge to find pertinent information from the volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…
DOT National Transportation Integrated Search
1980-06-01
This report presents the findings of a workshop on epidemiology in drugs and highway safety. A cross-disciplinary panel of experts (1) identified methodological issues and constraints present in research to define the nature and magnitude of the drug...
Challenges in Rotorcraft Acoustic Flight Prediction and Validation
NASA Technical Reports Server (NTRS)
Boyd, D. Douglas, Jr.
2003-01-01
Challenges associated with rotorcraft acoustic flight prediction and validation are examined. First, an outline of a state-of-the-art rotorcraft aeroacoustic prediction methodology is presented. Components including rotorcraft aeromechanics, high resolution reconstruction, and rotorcraft acoustic prediction are discussed. Next, to illustrate challenges and issues involved, a case study is presented in which an analysis of flight data from a specific XV-15 tiltrotor acoustic flight test is discussed in detail. Issues related to validation of methodologies using flight test data are discussed. Primary flight parameters such as velocity, altitude, and attitude are discussed and compared for repeated flight conditions. Other measured steady state flight conditions are examined for consistency and steadiness. A representative example prediction is presented and suggestions are made for future research.
de Las Hazas, María Carmen López; Motilva, Maria José; Piñol, Carme; Macià, Alba
2016-10-01
In this study, a fast and simple blood sampling and sample pre-treatment method, based on the use of dried blood spot (DBS) cards and ultra-performance liquid chromatography coupled to tandem mass spectrometry (UPLC-MS/MS), was developed and validated for the quantification of olive oil phenolic metabolites in human blood. After validation, the method was applied to determine hydroxytyrosol metabolites in human blood samples after the acute intake of an olive oil phenolic extract. Using the FTA DMPK-A DBS card under optimum conditions, with 20 µL as the blood solution volume, 100 µL of methanol/Milli-Q water (50/50, v/v) as the extraction solvent and 7 disks punched out from the card, the main hydroxytyrosol metabolites (hydroxytyrosol-3-O-sulphate and hydroxytyrosol acetate sulphate) were identified and quantified. The developed methodology allowed the generated metabolites to be detected and quantified at low µM levels. The proposed method is a significant improvement over existing methods for determining phenolic metabolites circulating in blood and plasma: blood sampling becomes possible with the volunteer pricking their own finger, with subsequent storage of the blood on the DBS cards prior to chromatographic analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
Malaei, Reyhane; Ramezani, Amir M; Absalan, Ghodratollah
2018-05-04
A sensitive and reliable ultrasound-assisted dispersive liquid-liquid microextraction (UA-DLLME) procedure was developed and validated for extraction and analysis of malondialdehyde (MDA), an important lipid-peroxidation biomarker, in human plasma. In this methodology, to achieve an applicable extraction procedure, the entire optimization process was performed in human plasma. To convert MDA into a readily extractable species, it was derivatized to a hydrazone structure with 2,4-dinitrophenylhydrazine (DNPH) at 40 °C within 60 min. Influences of experimental variables on the extraction process, including the type and volume of extraction and disperser solvents, amount of derivatization agent, temperature, pH, ionic strength, and sonication and centrifugation times, were evaluated. Under the optimal experimental conditions, the enhancement factor and extraction recovery were 79.8 and 95.8%, respectively. The analytical signal responded linearly (R^2 = 0.9988) over a concentration range of 5.00-4000 ng/mL with a limit of detection of 0.75 ng/mL (S/N = 3) in the plasma sample. To validate the developed procedure, the recommended Food and Drug Administration guidelines for bioanalytical method validation were employed. Copyright © 2018. Published by Elsevier B.V.
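To make the calibration arithmetic above concrete, here is a hedged sketch of a linear calibration with an S/N = 3 detection limit; the response values and baseline noise are invented, not the study's data (Python).

import numpy as np

conc = np.array([5.0, 50.0, 250.0, 1000.0, 2000.0, 4000.0])    # ng/mL spiked MDA
signal = 12.0 * conc + np.random.default_rng(1).normal(0, 150, conc.size)

slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)

noise_sd = 3.0               # assumed baseline noise (signal units)
lod = 3 * noise_sd / slope   # S/N = 3 convention
print(f"R^2 = {r2:.4f}, LOD = {lod:.2f} ng/mL")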
Podometrics as a Potential Clinical Tool for Glomerular Disease Management.
Kikuchi, Masao; Wickman, Larysa; Hodgin, Jeffrey B; Wiggins, Roger C
2015-05-01
Chronic kidney disease culminating in end-stage kidney disease is a major public health problem costing in excess of $40 billion per year with high morbidity and mortality. Current tools for glomerular disease monitoring lack precision and contribute to poor outcome. The podocyte depletion hypothesis describes the major mechanisms underlying the progression of glomerular diseases, which are responsible for more than 80% of cases of end-stage kidney disease. The question arises of whether this new knowledge can be used to improve outcomes and reduce costs. Podocytes have unique characteristics that make them an attractive monitoring tool. Methodologies for estimating podocyte number, size, density, glomerular volume and other parameters in routine kidney biopsies, and the rate of podocyte detachment from glomeruli into urine (podometrics) now have been developed and validated. They potentially fill important gaps in the glomerular disease monitoring toolbox. The application of these tools to glomerular disease groups shows good correlation with outcome, although data validating their use for individual decision making is not yet available. Given the urgency of the clinical problem, we argue that the time has come to focus on testing these tools for application to individualized clinical decision making toward more effective progression prevention. Copyright © 2015 Elsevier Inc. All rights reserved.
A Methodological Critique of the ProPublica Surgeon Scorecard
Friedberg, Mark W.; Pronovost, Peter J.; Shahian, David M.; Safran, Dana Gelb; Bilimoria, Karl Y.; Elliott, Marc N.; Damberg, Cheryl L.; Dimick, Justin B.; Zaslavsky, Alan M.
2016-01-01
On July 14, 2015, ProPublica published its Surgeon Scorecard, which displays “Adjusted Complication Rates” for individual, named surgeons for eight surgical procedures performed in hospitals. Public reports of provider performance have the potential to improve the quality of health care that patients receive. A valid performance report can drive quality improvement and usefully inform patients' choices of providers. However, performance reports with poor validity and reliability are potentially damaging to all involved. This article critiques the methods underlying the Scorecard and identifies opportunities for improvement. Until these opportunities are addressed, the authors advise users of the Scorecard—most notably, patients who might be choosing their surgeons—not to consider the Scorecard a valid or reliable predictor of the health outcomes any individual surgeon is likely to provide. The authors hope that this methodological critique will contribute to the development of more-valid and more-reliable performance reports in the future. PMID:28083411
From field data to volumes: constraining uncertainties in pyroclastic eruption parameters
NASA Astrophysics Data System (ADS)
Klawonn, Malin; Houghton, Bruce F.; Swanson, Donald A.; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.
2014-07-01
In this study, we aim to understand the variability in eruption volume estimates derived from field studies of pyroclastic deposits. We distributed paper maps of the 1959 Kīlauea Iki tephra to 101 volcanologists worldwide, who produced hand-drawn isopachs. Across the returned maps, uncertainty in isopach areas is 7% across the well-sampled deposit but increases to over 30% for isopachs that are governed by the largest and smallest thickness measurements. We fit the exponential, power-law, and Weibull functions through the isopach thickness versus area^1/2 values and find volume estimate variations up to a factor of 4.9 for a single map. Across all maps and methodologies, we find an average standard deviation for a total volume of s = 29%. The volume uncertainties are largest for the most proximal (s = 62%) and distal field (s = 53%) and small for the densely sampled intermediate deposit (s = 8%). For the Kīlauea Iki 1959 eruption, we find that the deposit beyond the 5-cm isopach contains only 2% of the total erupted volume, whereas the near-source deposit contains 48% and the intermediate deposit 50% of the total volume. Thus, the relative uncertainty within each zone impacts the total volume estimates differently. The observed uncertainties for the different deposit regions in this study illustrate a fundamental problem of estimating eruption volumes: while some methodologies may provide better fits to the isopach data or rely on fewer free parameters, the main issue remains the predictive capabilities of the empirical functions for the regions where measurements are missing.
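As one concrete instance of the fitting step described above, the sketch below applies the exponential thinning model T(x) = T0·exp(-k·x), with x = area^1/2, one of the three functions the authors compare; integrating over the deposit gives V = 2·T0/k^2. The isopach values are invented for illustration (Python).

import numpy as np

thickness_m = np.array([2.0, 1.0, 0.5, 0.1, 0.05])    # isopach thickness (m), synthetic
area_km2 = np.array([0.5, 1.8, 4.0, 20.0, 45.0])      # enclosed isopach area (km^2)

x = np.sqrt(area_km2)                                  # km
slope, ln_t0 = np.polyfit(x, np.log(thickness_m), 1)   # ln T = ln T0 - k x
k, t0 = -slope, np.exp(ln_t0)                          # thinning rate (1/km), max thickness (m)

volume_km3 = 2 * (t0 / 1000.0) / k**2                  # T0 converted to km
print(f"T0 = {t0:.2f} m, k = {k:.2f}/km, V = {volume_km3:.4f} km^3")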
Methodological Issues in Curriculum-Based Reading Assessment.
ERIC Educational Resources Information Center
Fuchs, Lynn S.; And Others
1984-01-01
Three studies involving elementary students examined methodological issues in curriculum-based reading assessment. Results indicated that (1) whereas sample duration did not affect concurrent validity, increasing duration reduced performance instability and increased performance slopes and (2) domain size was related inversely to performance slope…
Application of Control Volume Analysis to Cerebrospinal Fluid Dynamics
NASA Astrophysics Data System (ADS)
Wei, Timothy; Cohen, Benjamin; Anor, Tomer; Madsen, Joseph
2011-11-01
Hydrocephalus is among the most common birth defects and currently can be neither prevented nor cured. Afflicted individuals face serious issues, which at present are too complicated and not well enough understood to treat via systematic therapies. This talk outlines the framework and application of a control volume methodology to clinical Phase Contrast MRI data. Specifically, integral control volume analysis utilizes a fundamental fluid dynamics methodology to quantify intracranial dynamics within a precise, direct, and physically meaningful framework. A chronically shunted, hydrocephalic patient in need of a revision procedure was used as an in vivo case study. Magnetic resonance velocity measurements within the patient's aqueduct were obtained in four biomedical states and were analyzed using the methods presented in this dissertation. Pressure force estimates were obtained, showing distinct differences in amplitude, phase, and waveform shape for different intracranial states within the same individual. Thoughts on the physiological and diagnostic research and development implications/opportunities will be presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, Paula D.; Rudeen, David Keith; Lord, David L.
2014-08-01
SANSMIC is solution mining software that was developed and utilized by SNL in its role as geotechnical advisor to the US DOE SPR for planning purposes. Three SANSMIC leach modes - withdrawal, direct, and reverse leach - have been revalidated with multiple test cases for each mode. The withdrawal mode was validated using high quality data from recent leach activity while the direct and reverse modes utilized data from historical cavern completion reports. Withdrawal results compared very well with observed data, including the location and size of shelves due to string breaks, with relative leached volume differences ranging from 6 - 10% and relative radius differences from 1.5 - 3%. Profile comparisons for the direct mode were very good with relative leached volume differences ranging from 6 - 12% and relative radius differences from 5 - 7%. First, second, and third reverse configurations were simulated in order to validate SANSMIC over a range of relative hanging string and OBI locations. The first-reverse was simulated reasonably well with relative leached volume differences ranging from 1 - 9% and relative radius differences from 5 - 12%. The second-reverse mode showed the largest discrepancies in leach profile. Leached volume differences ranged from 8 - 12% and relative radius differences from 1 - 10%. In the third-reverse, relative leached volume differences ranged from 10 - 13% and relative radius differences were ~4%. Comparisons to historical reports were quite good, indicating that SANSMIC is essentially the same as documented and validated in the early 1980's.
Registration of in vivo MR to histology of rodent brains using blockface imaging
NASA Astrophysics Data System (ADS)
Uberti, Mariano; Liu, Yutong; Dou, Huanyu; Mosley, R. Lee; Gendelman, Howard E.; Boska, Michael
2009-02-01
Registration of MRI to histopathological sections can enhance bioimaging validation for use in pathobiologic, diagnostic, and therapeutic evaluations. However, commonly used registration methods fall short of this goal due to tissue shrinkage and tearing after brain extraction and preparation. In an attempt to overcome these limitations, we developed a software toolbox using 3D blockface imaging as the common space of reference. This toolbox includes a semi-automatic brain extraction technique using constraint level sets (CLS), 3D reconstruction methods for the blockface and MR volume, and a 2D warping technique using thin-plate splines with landmark optimization. Using this toolbox, the rodent brain volume is first extracted from the whole head MRI using CLS. The blockface volume is reconstructed, followed by 3D brain MRI registration to the blockface volume to correct the global deformations due to brain extraction and fixation. Finally, registered MRI and histological slices are warped to corresponding blockface images to correct slice-specific deformations. The CLS brain extraction technique was validated by comparison with manual results, showing 94% overlap. The image warping technique was validated by calculating target registration error (TRE); results showed a registration accuracy of TRE < 1 pixel. Lastly, the registration method and the software tools developed were used to validate cell migration in murine human immunodeficiency virus type one encephalitis.
Magnetic Resonance Imaging of Human Tissue-Engineered Adipose Substitutes
Proulx, Maryse; Aubin, Kim; Lagueux, Jean; Audet, Pierre; Auger, Michèle
2015-01-01
Adipose tissue (AT) substitutes are being developed to answer the strong demand in reconstructive surgery. To facilitate the validation of their functional performance in vivo, and to avoid resorting to an excessive number of animals, it is crucial at this stage to develop biomedical imaging methodologies enabling the follow-up of reconstructed AT substitutes. Until now, biomedical imaging of AT substitutes has scarcely been reported in the literature. Therefore, the optimal parameters enabling good resolution, appropriate contrast, and graft delineation, as well as blood perfusion validation, must be studied and reported. In this study, human adipose substitutes produced from adipose-derived stem/stromal cells using the self-assembly approach of tissue engineering were implanted into athymic mice. The fate of the reconstructed AT substitutes implanted in vivo was successfully followed by magnetic resonance imaging (MRI), which is the imaging modality of choice for visualizing soft ATs. T1-weighted images allowed clear delineation of the grafts, followed by volume integration. The magnetic resonance (MR) signal of reconstructed AT was studied in vitro by proton nuclear magnetic resonance (1H-NMR). This confirmed the presence of a strong triglyceride peak of short longitudinal proton relaxation time (T1) values (200±53 ms) in reconstructed AT substitutes (total T1=813±76 ms), which establishes a clear signal difference between adjacent muscle, connective tissue, and native fat (total T1 ∼300 ms). Graft volume retention was followed up to 6 weeks after implantation, revealing a gradual resorption rate averaging 44% of the initial substitute volume. In addition, vascular perfusion measured by dynamic contrast-enhanced MRI confirmed the graft's vascularization postimplantation (14 and 21 days after grafting). Histological analysis of the grafted tissues revealed the persistence of numerous adipocytes without evidence of cysts or tissue necrosis. This study describes the in vivo grafting of human adipose substitutes devoid of exogenous matrix components and, for the first time, the optimal parameters necessary to achieve efficient MRI visualization of grafted tissue-engineered adipose substitutes. PMID:25549069
Systematic Review of Childhood Sedentary Behavior Questionnaires: What do We Know and What is Next?
Hidding, Lisan M; Altenburg, Teatske M; Mokkink, Lidwine B; Terwee, Caroline B; Chinapaw, Mai J M
2017-04-01
Accurate measurement of child sedentary behavior is necessary for monitoring trends, examining health effects, and evaluating the effectiveness of interventions. We therefore aimed to summarize studies examining the measurement properties of self-report or proxy-report sedentary behavior questionnaires for children and adolescents under the age of 18 years. Additionally, we provided an overview of the characteristics of the evaluated questionnaires. We performed systematic literature searches in the EMBASE, PubMed, and SPORTDiscus electronic databases. Studies had to report on at least one measurement property of a questionnaire assessing sedentary behavior. Questionnaire data were extracted using a standardized checklist, i.e. the Quality Assessment of Physical Activity Questionnaire (QAPAQ) checklist, and the methodological quality of the included studies was rated using a standardized tool, i.e. the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Forty-six studies on 46 questionnaires met our inclusion criteria, of which 33 examined test-retest reliability, nine examined measurement error, two examined internal consistency, 22 examined construct validity, eight examined content validity, and two examined structural validity. The majority of the included studies were of fair or poor methodological quality. Of the studies with at least a fair methodological quality, six scored positive on test-retest reliability, and two scored positive on construct validity. None of the questionnaires included in this review were considered as both valid and reliable. High-quality studies on the most promising questionnaires are required, with more attention to the content validity of the questionnaires. PROSPERO registration number: CRD42016035963.
Cerebrospinal fluid volume measurements in hydrocephalic rats.
Basati, Sukhraaj; Desai, Bhargav; Alaraj, Ali; Charbel, Fady; Linninger, Andreas
2012-10-01
Object: Experimental data about the evolution of intracranial volume and pressure in cases of hydrocephalus are limited due to the lack of available monitoring techniques. In this study, the authors validate intracranial CSF volume measurements within the lateral ventricle, while simultaneously using impedance sensors and pressure transducers in hydrocephalic animals. Methods: A volume sensor was fabricated and connected to a catheter that was used as a shunt to withdraw CSF. In vitro bench-top calibration experiments were created to provide data for the animal experiments and to validate the sensors. To validate the measurement technique in a physiological system, hydrocephalus was induced in weanling rats by kaolin injection into the cisterna magna. At 28 days after induction, the sensor was implanted into the lateral ventricles. After sealing the skull using dental cement, an acute CSF drainage/infusion protocol consisting of 4 sequential phases was performed with a pump. Implant location was confirmed via radiography using intraventricular iohexol contrast administration. Results: Controlled CSF shunting in vivo with hydrocephalic rats resulted in precise and accurate sensor measurements (r = 0.98). Shunting resulted in a 17.3% maximum measurement error between measured volume and actual volume as assessed by a Bland-Altman plot. A secondary outcome confirmed that both ventricular volume and intracranial pressure decreased during CSF shunting and increased during infusion. Ventricular enlargement consistent with successful hydrocephalus induction was confirmed using imaging, as well as postmortem. These results indicate that volume monitoring is feasible for clinical cases of hydrocephalus. Conclusions: This work marks a departure from traditional shunting systems currently used to treat hydrocephalus. The overall clinical application is to provide alternative monitoring and treatment options for patients. Future work includes development and testing of a chronic (long-term) volume monitoring system.
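A minimal sketch of the agreement statistics reported above (correlation plus a Bland-Altman bias and limits of agreement); the paired sensor/pump volumes are placeholders, not the study's measurements (Python).

import numpy as np

measured = np.array([48.0, 61.0, 75.0, 90.0, 104.0, 118.0])  # sensor volume (uL), synthetic
actual = np.array([50.0, 60.0, 78.0, 92.0, 100.0, 120.0])    # pump-controlled volume (uL)

diff = measured - actual
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                  # 95% limits of agreement
r = np.corrcoef(measured, actual)[0, 1]
print(f"r = {r:.2f}, bias = {bias:.1f} uL, limits of agreement = +/-{loa:.1f} uL")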
Drake, David; Kennedy, Rodney; Wallace, Eric
2017-12-01
Researchers and practitioners working in sports medicine and science require valid tests to determine the effectiveness of interventions and to enhance understanding of the mechanisms underpinning adaptation. Such decision making is influenced by the supporting evidence describing the validity of tests within current research. The objective of this study is to review the validity of lower-body isometric multi-joint tests for assessing muscular strength and to determine the current level of supporting evidence. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed in a systematic fashion to search, assess and synthesize existing literature on this topic. Electronic databases such as Web of Science, CINAHL and PubMed were searched up to 18 March 2015. Potential inclusions were screened against eligibility criteria relating to type of test, measurement instrument, properties of validity assessed and population group, and were required to be published in English. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist was used to assess the methodological quality and measurement property ratings of included studies. Studies rated as fair or better in methodological quality were included in the best evidence synthesis. Fifty-nine studies met the eligibility criteria for quality appraisal. The ten studies that rated fair or better in methodological quality were included in the best evidence synthesis. The most frequently investigated lower-body isometric multi-joint tests for validity were the isometric mid-thigh pull and isometric squat. The validity of each of these tests was strong in terms of reliability and construct validity. The evidence for responsiveness of tests was found to be moderate for the isometric squat test and unknown for the isometric mid-thigh pull. No tests using the isometric leg press met the criteria for inclusion in the best evidence synthesis. Researchers and practitioners can use the isometric squat and isometric mid-thigh pull with confidence in terms of reliability and construct validity. Further work to investigate other validity components such as criterion validity, smallest detectable change and responsiveness to resistance exercise interventions would add to the current level of evidence.
NASA Astrophysics Data System (ADS)
Prabhu, David; Mehanna, Emile; Gargesha, Madhusudhana; Wen, Di; Brandt, Eric; van Ditzhuijzen, Nienke S.; Chamie, Daniel; Yamamoto, Hirosada; Fujino, Yusuke; Farmazilian, Ali; Patel, Jaymin; Costa, Marco; Bezerra, Hiram G.; Wilson, David L.
2016-03-01
High resolution, 100 frames/sec intravascular optical coherence tomography (IVOCT) can distinguish plaque types, but further validation is needed, especially for automated plaque characterization. We developed experimental and 3D registration methods to provide validation of IVOCT pullback volumes using microscopic, brightfield and fluorescent cryoimage volumes, with optional, exactly registered cryo-histology. The innovation was a method to match IVOCT pullback images, acquired in the catheter reference frame, to a true 3D cryo-image volume. Briefly, an 11-parameter, polynomial virtual catheter was initialized within the cryo-image volume, and perpendicular images were extracted, mimicking IVOCT image acquisition. Virtual catheter parameters were optimized to maximize cryo and IVOCT lumen overlap. Local minima were possible, but when we started within reasonable ranges, every one of 24 digital phantom cases converged to a good solution with a registration error of only +1.34+/-2.65 μm (signed distance). Registration was applied to 10 ex-vivo cadaver coronary arteries (LADs), resulting in 10 registered cryo and IVOCT volumes yielding a total of 421 registered 2D-image pairs. Image overlays demonstrated high continuity between vascular and plaque features. Bland-Altman analysis comparing cryo and IVOCT lumen area showed a mean and standard deviation of differences of 0.01+/-0.43 mm^2. DICE coefficients were 0.91+/-0.04. Finally, visual assessment of 20 representative cases with easily identifiable features suggested registration accuracy within one frame of IVOCT (+/-200 μm), eliminating significant misinterpretations introduced by 1 mm errors in the literature. The method will provide 3D data for training of IVOCT plaque algorithms and can be used for validation of other intravascular imaging modalities.
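Since the DICE coefficient is the headline overlap metric above, here is a small self-contained sketch of its computation on binary lumen masks; the toy masks stand in for real cryo/IVOCT segmentations (Python).

import numpy as np

def dice(mask_a, mask_b):
    # DICE = 2|A intersect B| / (|A| + |B|) for boolean masks
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())

a = np.zeros((64, 64), dtype=bool); a[16:48, 16:48] = True   # "cryo" lumen (toy)
b = np.zeros((64, 64), dtype=bool); b[18:50, 18:50] = True   # "IVOCT" lumen (toy)
print(f"DICE = {dice(a, b):.2f}")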
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1981-10-29
This volume is the software description for the National Utility Regulatory Model (NUREG). This is the third of three volumes provided by ICF under contract number DEAC-01-79EI-10579. These three volumes are: a manual describing the NUREG methodology; a users guide; and a description of the software. This manual describes the software which has been developed for NUREG. This includes a listing of the source modules. All computer code has been written in FORTRAN.
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.
2016-01-01
The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS(Registered Trademark) and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.
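To illustrate the kind of joint constraint force the CFE framework models (this is a generic Lagrange-multiplier sketch, not the POST2/CFE implementation), consider a point mass on a rigid massless rod: with constraint Phi(q) = 0, one solves the augmented system [M J^T; J 0][qdd; lam] = [f; -Jdot*qdot] and recovers the reaction force at the joint (Python).

import numpy as np

m, rod_len, g = 1.0, 2.0, 9.81
q = np.array([rod_len, 0.0])       # position satisfying the constraint |q| = L
qd = np.array([0.0, 1.0])          # tangential velocity

M = m * np.eye(2)
f = np.array([0.0, -m * g])        # applied force: gravity
J = q.reshape(1, 2)                # Jacobian of Phi = 0.5*(q.q - L^2)
jdot_qd = np.array([qd @ qd])      # (dJ/dt) qdot term

A = np.block([[M, J.T], [J, np.zeros((1, 1))]])
b = np.concatenate([f, -jdot_qd])
sol = np.linalg.solve(A, b)
qdd, lam = sol[:2], sol[2]
print("constraint (joint) force:", -lam * q)   # centripetal reaction along the rod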
Assessing validity of observational intervention studies – the Benchmarking Controlled Trials
Malmivaara, Antti
2016-01-01
Background: Benchmarking Controlled Trial (BCT) is a concept which covers all observational studies aiming to assess the impact of interventions or health care system features on patients and populations. Aims: To create and pilot test a checklist for appraising the methodological validity of a BCT. Methods: The checklist was created by extracting the most essential elements from the comprehensive set of criteria in the previous paper on BCTs. Checklists and scientific papers on observational studies and respective systematic reviews were also utilized. Ten BCTs published in the Lancet and in the New England Journal of Medicine were used to assess the feasibility of the created checklist. Results: The appraised studies seem to have several methodological limitations, some of which could be avoided in the planning, conducting and reporting phases of the studies. Conclusions: The checklist can be used for planning, conducting, reporting, reviewing, and critical reading of observational intervention studies. However, the piloted checklist should be validated in further studies. Key messages: Benchmarking Controlled Trial (BCT) is a concept which covers all observational studies aiming to assess the impact of interventions or health care system features on patients and populations. This paper presents a checklist for appraising the methodological validity of BCTs and pilot-tests the checklist with ten BCTs published in leading medical journals. The appraised studies seem to have several methodological limitations, some of which could be avoided in the planning, conducting and reporting phases of the studies. The checklist can be used for planning, conducting, reporting, reviewing, and critical reading of observational intervention studies. PMID:27238631
Assessing Construct Validity Using Multidimensional Item Response Theory.
ERIC Educational Resources Information Center
Ackerman, Terry A.
The concept of a user-specified validity sector is discussed. The idea of the validity sector combines the work of M. D. Reckase (1986) and R. Shealy and W. Stout (1991). Reckase developed a methodology to represent an item in a multidimensional latent space as a vector. Item vectors are computed using multidimensional item response theory item…
A Review of Validation Research on Psychological Variables Used in Hiring Police Officers.
ERIC Educational Resources Information Center
Malouff, John M.; Schutte, Nicola S.
This paper reviews the methods and findings of published research on the validity of police selection procedures. As a preface to the review, the typical police officer selection process is briefly described. Several common methodological deficiencies of the validation research are identified and discussed in detail: (1) use of past-selection…
FIELD VALIDATION OF EXPOSURE ASSESSMENT MODELS. VOLUME 1. DATA
This is the first of two volumes describing work done to evaluate the PAL-DS model, a Gaussian diffusion code modified to account for dry deposition and settling. This first volume describes the experimental techniques employed to dispense, collect, and measure depositing (zinc s...
Handbook of the Economics of Education. Volume 4
ERIC Educational Resources Information Center
Hanushek, Eric A., Ed.; Machin, Stephen J., Ed.; Woessmann, Ludger, Ed.
2011-01-01
What is the value of an education? Volume 4 of the Handbooks in the Economics of Education combines recent data with new methodologies to examine this and related questions from diverse perspectives. School choice and school competition, educator incentives, the college premium, and other considerations help make sense of the investments and…
Handbook of the Economics of Education. Volume 3
ERIC Educational Resources Information Center
Hanushek, Eric A., Ed.; Machin, Stephen J., Ed.; Woessmann, Ludger, Ed.
2011-01-01
How does education affect economic and social outcomes, and how can it inform public policy? Volume 3 of the Handbooks in the Economics of Education uses newly available high quality data from around the world to address these and other core questions. With the help of new methodological approaches, contributors cover econometric methods and…
Race and Ethnicity in Research Methods. Sage Focus Editions, Volume 157.
ERIC Educational Resources Information Center
Stanfield, John H., II, Ed.; Dennis, Rutledge M., Ed.
The contributions in this volume examine the array of methods used in quantitative, qualitative, and comparative and historical research to show how research sensitive to ethnic issues can best be conducted. Rethinking and revising traditional methodologies and applying new ones can portray racial and ethnic issues as they really exist. The…
ERIC Educational Resources Information Center
Schwarzer, David, Ed.; Petron, Mary, Ed.; Luke, Christopher, Ed.
2011-01-01
"Research Informing Practice--Practice Informing Research: Innovative Teaching Methodologies for World Language Educators" is an edited volume that focuses on innovative, nontraditional methods of teaching and learning world languages. Using teacher-research projects, each author in the volume guides readers through their own personal…
Persian Basic Course: Volume I, Lesson 1-18.
ERIC Educational Resources Information Center
Defense Language Inst., Monterey, CA.
The first of 10 volumes of a basic course in Persian, designed for use in the Defense Language Institute's intensive programs, is presented. The course, employing the audiolingual methodology, is designed to train native English speakers to level three proficiency in comprehension and speaking and level two proficiency in reading and writing…
Becoming Life-Long Learners--"A Pedagogy for Learning about Visionary Leadership"
ERIC Educational Resources Information Center
McNeil, Mary, Ed.; Nevin, Ann, Ed.
2014-01-01
In this volume we apply a personal narrative methodology to understanding what we have learned about visionary leadership. Authors in this volume developed their reflections of life-long learning as they investigated existing leadership theories and theories about future leadership. Graduate program faculty and authors read and critically reviewed…
Exact finite volume expectation values of local operators in excited states
NASA Astrophysics Data System (ADS)
Pozsgay, B.; Szécsényi, I. M.; Takács, G.
2015-04-01
We present a conjecture for the exact expression of finite volume expectation values in excited states in integrable quantum field theories, which is an extension of an earlier conjecture to the case of general diagonal factorized scattering with bound states and a nontrivial bootstrap structure. The conjectured expression is a spectral expansion which uses the exact form factors and the excited state thermodynamic Bethe Ansatz as building blocks. The conjecture is proven for the case of the trace of the energy-moment tensor. Concerning its validity for more general operators, we provide numerical evidence using the truncated conformal space approach. It is found that the expansion fails to be well-defined for small values of the volume in cases when the singularity structure of the TBA equations undergoes a non-trivial rearrangement under some critical value of the volume. Despite these shortcomings, the conjectured expression is expected to be valid for all volumes for most of the excited states, and as an expansion above the critical volume for the rest.
Validation of equations for pleural effusion volume estimation by ultrasonography.
Hassan, Maged; Rizk, Rana; Essam, Hatem; Abouelnour, Ahmed
2017-12-01
To validate the accuracy of previously published equations that estimate pleural effusion volume using ultrasonography. Only equations using simple measurements were tested. Three measurements were taken at the posterior axillary line for each case with effusion: lateral height of effusion ( H ), distance between collapsed lung and chest wall ( C ) and distance between lung and diaphragm ( D ). Cases whose effusion was aspirated to dryness were included and drained volume was recorded. Intra-class correlation coefficient (ICC) was used to determine the predictive accuracy of five equations against the actual volume of aspirated effusion. 46 cases with effusion were included. The most accurate equation in predicting effusion volume was ( H + D ) × 70 (ICC 0.83). The simplest and yet accurate equation was H × 100 (ICC 0.79). Pleural effusion height measured by ultrasonography gives a reasonable estimate of effusion volume. Incorporating distance between lung base and diaphragm into estimation improves accuracy from 79% with the first method to 83% with the latter.
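For reference, the two best-performing equations above are trivial to apply; this sketch assumes the study's convention of measurements in centimetres yielding millilitres (Python).

def effusion_volume_simple(h_cm):
    # V = H x 100 (ICC 0.79): simplest accurate equation
    return h_cm * 100.0

def effusion_volume_hd(h_cm, d_cm):
    # V = (H + D) x 70 (ICC 0.83): most accurate tested equation
    return (h_cm + d_cm) * 70.0

print(effusion_volume_simple(8.0))     # about 800 mL for H = 8 cm
print(effusion_volume_hd(8.0, 3.5))    # about 805 mL for H = 8 cm, D = 3.5 cm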
Relationship of Temporal Lobe Volumes to Neuropsychological Test Performance in Healthy Children
Wells, Carolyn T.; Matson, Melissa A.; Kates, Wendy R.; Hay, Trisha; Horska, Alena
2008-01-01
Ecological validity of neuropsychological assessment includes the ability of tests to predict real-world functioning and/or covary with brain structures. Studies have examined the relationship between adaptive skills and test performance, with less focus on the association between regional brain volumes and neurobehavioral function in healthy children. The present study examined the relationship between temporal lobe gray matter volumes and performance on two neuropsychological tests hypothesized to measure temporal lobe functioning (Visual Perception-VP; Peabody Picture Vocabulary Test, Third Edition-PPVT-III) in 48 healthy children ages 5-18 years. After controlling for age and gender, left and right temporal and left occipital volumes were significant predictors of VP. Left and right frontal and temporal volumes were significant predictors of PPVT-III. Temporal volume emerged as the strongest lobar correlate with both tests. These results provide convergent and discriminant validity supporting VP as a measure of the “what” system; but suggest the PPVT-III as a complex measure of receptive vocabulary, potentially involving executive function demands. PMID:18513844
Appendix B: Methodology. [2014 Teacher Prep Review
ERIC Educational Resources Information Center
Greenberg, Julie; Walsh, Kate; McKee, Arthur
2014-01-01
The "NCTQ Teacher Prep Review" evaluates the quality of programs that provide preservice preparation of public school teachers. This appendix describes the scope, methodology, timeline, staff, and standards involved in the production of "Teacher Prep Review 2014." Data collection, validation, and analysis for the report are…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weitz, R.; Thomas, C.; Klemm, J.
1982-03-03
External radiation doses are reconstructed for crews of support and target ships of Joint Task Force One at Operation CROSSROADS, 1946. Volume I describes the reconstruction methodology, which consists of modeling the radiation environment, to include the radioactivity of lagoon water, target ships, and support ship contamination; retracing ship paths through this environment; and calculating the doses to shipboard personnel. The USS RECLAIMER, a support ship, is selected as a representative ship to demonstrate this methodology. Doses for all other ships are summarized. Volume II (Appendix A) details the results for target ship personnel. Volume III (Appendix B) details the results for support ship personnel. Calculated doses for more than 36,000 personnel aboard support ships while at Bikini range from zero to 1.7 rem. Of those, approximately 34,000 are less than 0.5 rem. From the models provided, doses due to target ship reboarding and doses accrued after departure from Bikini can be calculated, based on the individual circumstances of exposure.
An, Yonghao; Wood, Brandon C.; Ye, Jianchao; ...
2015-06-08
Although crystalline silicon (c-Si) anodes promise very high energy densities in Li-ion batteries, their practical use is complicated by amorphization, large volume expansion and severe plastic deformation upon lithium insertion. Recent experiments have revealed the existence of a sharp interface between crystalline Si (c-Si) and the amorphous LixSi alloy during lithiation, which propagates with a velocity that is orientation dependent; the resulting anisotropic swelling generates substantial strain concentrations that initiate cracks even in nanostructured Si. Here we describe a novel strategy to mitigate lithiation-induced fracture by using pristine c-Si structures with engineered anisometric morphologies that are deliberately designed to counteract the anisotropy in the crystalline/amorphous interface velocity. This produces a much more uniform volume expansion, significantly reducing strain concentration. Based on a new, validated methodology that improves previous models of anisotropic swelling of c-Si, we propose optimal morphological designs for c-Si pillars and particles. The advantages of the new morphologies are clearly demonstrated by mesoscale simulations and verified by experiments on engineered c-Si micropillars. The results of this study illustrate that morphological design is effective in improving the fracture resistance of micron-sized Si electrodes, which will facilitate their practical application in next-generation Li-ion batteries. In conclusion, the model and design approach presented in this paper also have general implications for the study and mitigation of mechanical failure of electrode materials that undergo large anisotropic volume change upon ion insertion and extraction.
Contribution of dental tissues to sex determination in modern human populations.
García-Campos, Cecilia; Martinón-Torres, María; Martín-Francés, Laura; Martínez de Pinillos, Marina; Modesto-Mata, Mario; Perea-Pérez, Bernardo; Zanolli, Clément; Labajo González, Elena; Sánchez Sánchez, José Antonio; Ruiz Mediavilla, Elena; Tuniz, Claudio; Bermúdez de Castro, José María
2018-02-20
Accurate sex estimation is an essential step in the reconstruction of the biological profile of human remains. Earlier studies have shown that elements of the human permanent dentition are sexually dimorphic. The aims of this study are to determine the degree of sexual dimorphism in the dental tissue volumes and surface areas of mandibular canines and to explore its potential for reliable sex determination. The teeth included in this study (n = 69) were selected from anthropological collections from Spain, South Africa and Sudan. In all cases, the sex of the individuals was known. The teeth were scanned and three-dimensional (3D) measurements (volumes and surface areas) were obtained. Finally, a discriminant function analysis was applied. Our results showed that sexual dimorphism in canine size is due to males having greater amounts of dentine, whereas enamel volume does not contribute significantly to overall tooth size dimorphism. Classification accuracy of the multivariable equations tested on slightly worn teeth ranged from 78 to 90.2% for the cross-validation, and from 71.43 to 84.62% for the hold-out sample validation. When all functions were applied together, the sex was correctly assigned 92.30% of the time. Our results suggest that the 3D variables from mandibular canine dental tissues are useful for sex determination as they present a high degree of dimorphism. The results obtained show the importance of 3D dental tissue measurements as a methodology in sex determination, whose application should be considered as a supplement to other methods. © 2018 Wiley Periodicals, Inc.
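A hedged sketch of the discriminant-function step described above, using scikit-learn's LinearDiscriminantAnalysis with cross-validation; the dentine/enamel volumes are simulated to mimic the reported pattern (males with more dentine, little enamel dimorphism), not the study's measurements (Python).

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 69
sex = rng.integers(0, 2, n)                        # 0 = female, 1 = male
dentine = 180 + 25 * sex + rng.normal(0, 12, n)    # mm^3, dimorphic by construction
enamel = 90 + rng.normal(0, 10, n)                 # mm^3, little dimorphism
X = np.column_stack([dentine, enamel])

acc = cross_val_score(LinearDiscriminantAnalysis(), X, sex, cv=5)
print(f"cross-validated accuracy: {acc.mean():.1%}")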
Development of a weight/sizing design synthesis computer program. Volume 1: Program formulation
NASA Technical Reports Server (NTRS)
Garrison, J. M.
1973-01-01
The development of a weight/sizing design synthesis methodology for use in support of the mainline space shuttle program is discussed. The methodology requires a minimum number of data inputs and offers quick turnaround. It makes it possible to: (1) make weight comparisons between current shuttle configurations and proposed changes, (2) determine the effects of various subsystem trades on total system weight, and (3) determine the effects of weight on performance and performance on weight.
1983-12-30
Architecture, Design, and System Performance Assessment and Development Methodology (NSWC TR 83-324). Naval Surface Weapons Center, Silver Spring, MD.
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A
2017-01-15
Researchers often rely on simple methods to identify involvement of neurons in a particular motor task. The historical approach has been to inspect large groups of neurons and subjectively separate neurons into groups based on the expertise of the investigator. In cases where neuron populations are small it is reasonable to inspect these neuronal recordings and their firing rates carefully to avoid data omissions. In this paper, a new methodology is presented for automatic objective classification of neurons recorded in association with behavioral tasks into groups. By identifying characteristics of neurons in a particular group, the investigator can then identify functional classes of neurons based on their relationship to the task. The methodology is based on integration of a multiple signal classification (MUSIC) algorithm to extract relevant features from the firing rate and an expectation-maximization Gaussian mixture algorithm (EM-GMM) to cluster the extracted features. The methodology is capable of identifying and clustering similar firing rate profiles automatically based on specific signal features. An empirical wavelet transform (EWT) was used to validate the features found in the MUSIC pseudospectrum and the resulting signal features captured by the methodology. Additionally, this methodology was used to inspect behavioral elements of neurons to physiologically validate the model. This methodology was tested using a set of data collected from awake behaving non-human primates. Copyright © 2016 Elsevier B.V. All rights reserved.
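A simplified, self-contained sketch of the pipeline described above: a MUSIC-style pseudospectrum supplies a dominant-frequency feature for each neuron's firing rate, and a Gaussian mixture fitted by EM clusters the neurons. The rates are synthetic and the embedding/subspace sizes are assumptions, not the authors' settings (Python).

import numpy as np
from sklearn.mixture import GaussianMixture

def music_peak_freq(x, fs, m=20, n_sources=2):
    # MUSIC: eigendecompose the covariance of lag-embedded data, then scan
    # steering vectors against the noise subspace for spectral peaks.
    emb = np.lib.stride_tricks.sliding_window_view(x, m)
    cov = emb.T @ emb / emb.shape[0]
    _, vecs = np.linalg.eigh(cov)              # eigenvalues in ascending order
    noise = vecs[:, : m - n_sources]           # noise subspace
    freqs = np.linspace(0.1, fs / 2, 200)
    k = np.arange(m)
    pseudo = [1.0 / np.linalg.norm(noise.conj().T @ np.exp(-2j * np.pi * f * k / fs)) ** 2
              for f in freqs]
    return freqs[int(np.argmax(pseudo))]

fs = 100.0
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(7)
rates = [np.sin(2 * np.pi * f0 * t) + 0.3 * rng.normal(size=t.size)
         for f0 in (1.0, 1.2, 6.0, 6.5, 1.1, 6.2)]        # two firing-rate "classes"

feats = np.array([[music_peak_freq(r, fs)] for r in rates])
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(feats)
print(labels)   # neurons with similar firing-rate dynamics share a label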
Remote sensing for site characterization
Kuehn, Friedrich; King, Trude V.; Hoerig, Bernhard; Peters, Douglas C.
2000-01-01
This volume, Remote Sensing for Site Characterization, describes the feasibility of aircraft- and satellite-based methods of revealing environmental-geological problems. A balanced ratio between explanations of the methodological/technical side and presentations of case studies is maintained. The comparison of case studies from North America and Germany shows how the respective territorial conditions lead to distinct methodological approaches.
Quality in the Basic Grant Delivery System: Volume 3, Methodology.
ERIC Educational Resources Information Center
Advanced Technology, Inc., McLean, VA.
The research methodology of a study to assess 1980-1981 award accuracy of the Basic Educational Opportunity Grants (BEOG), or Pell grants, is described. The study is the first stage of a three-stage quality control project. During the spring of 1981 a nationally representative sample of 305 public, private, and proprietary institutions was…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martens, Milou H., E-mail: mh.martens@hotmail.com; Department of Surgery, Maastricht University Medical Center, Maastricht; GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht
2015-12-01
Purpose: To review the available literature on tumor size/volume measurements on magnetic resonance imaging for response assessment after chemoradiotherapy, and to validate these cut-offs in an independent multicenter patient cohort. Methods and Materials: The study included 2 parts. (1) Review of the literature: articles were included that assessed the accuracy of tumor size/volume measurements on magnetic resonance imaging for tumor response assessment. Size/volume cut-offs were extracted. (2) Multicenter validation: extracted cut-offs from the literature were tested in a multicenter cohort (n=146). Accuracies were calculated and compared with reported results from the literature. Results: The review included 14 articles, in which 3 different measurement methods were assessed: (1) tumor length; (2) 3-dimensional tumor size; and (3) whole volume. Study outcomes consisted of (1) complete response (ypT0) versus residual tumor; (2) tumor regression grade 1 to 2 versus 3 to 5; and (3) T-downstaging (ypT…
Predicting Vessel Trajectories from Ais Data Using R
2017-06-01
future position at the expectation level set by the user, therefore producing a valid methodology for both estimating the future vessel location and for assessing anomalous vessel behavior. A key idea in the current literature is that the series of vessel locations
Climate change vulnerability for species-Assessing the assessments.
Wheatley, Christopher J; Beale, Colin M; Bradbury, Richard B; Pearce-Higgins, James W; Critchlow, Rob; Thomas, Chris D
2017-09-01
Climate change vulnerability assessments are commonly used to identify species at risk from global climate change, but the wide range of methodologies available makes it difficult for end users, such as conservation practitioners or policymakers, to decide which method to use as a basis for decision-making. In this study, we evaluate whether different assessments consistently assign species to the same risk categories and whether any of the existing methodologies perform well at identifying climate-threatened species. We compare the outputs of 12 climate change vulnerability assessment methodologies, using both real and simulated species, and validate the methods using historic data for British birds and butterflies (i.e. using historical data to assign risks and more recent data for validation). Our results show that the different vulnerability assessment methods are not consistent with one another; different risk categories are assigned for both the real and simulated sets of species. Validation of the different vulnerability assessments suggests that methods incorporating historic trend data into the assessment perform best at predicting distribution trends in subsequent time periods. This study demonstrates that climate change vulnerability assessments should not be used interchangeably due to the poor overall agreement between methods when considering the same species. The results of our validation provide more support for the use of trend-based rather than purely trait-based approaches, although further validation will be required as data become available. © 2017 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
The Methodology of Selecting the Transport Mode for Companies on the Slovak Transport Market
NASA Astrophysics Data System (ADS)
Černá, Lenka; Zitrický, Vladislav; Daniš, Jozef
2017-03-01
Transport volume in the Slovak Republic grows continuously every year. This rising trend is driven by the development of the car industry and its suppliers. The Slovak Republic also occupies a strategic geographic position on the central European transport corridors (east-west and north-south). The growth of freight transport volume depends on trade between the European Union and China, which gives the Slovak Republic an opportunity to capture transit transport flows. Road transport holds a dominant position in the Slovak transport market; its volume has gradually increased over recent years, most visibly on the highways and expressways in regions with higher economic potential. The increase in rail transport on the main rail corridors is less pronounced than in road transport. Trade globalization also drives growth in intermodal transport, whose volume is predicted to rise from 2.3 million tons per year at present to 8 million tons in 2020. Selection of transport mode and carrier is an important aspect of logistics management, because companies (customers) want to reduce the number of carriers they trade with and build a system around several key carriers. Larger transport volumes and higher-quality transport services make it possible to reduce transport costs. This trend also benefits carriers, because they can focus on selected customers and provide higher-quality services. The paper focuses on the selection of transport mode based on the proposed methodology. Its aims are to define the criteria that directly influence the selection of transport mode, to weight those criteria using subjective methods, to design the selection process, and to apply the proposed methodology in practice.
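As a minimal illustration of the multi-criteria selection such a methodology formalizes, the sketch below ranks modes by a weighted score; the criteria, weights, and 1-5 scores are invented placeholders, not the paper's values (Python).

weights = {"cost": 0.35, "transit_time": 0.25, "reliability": 0.25, "emissions": 0.15}

modes = {   # scores: 1 (worst) to 5 (best) per criterion, illustrative only
    "road": {"cost": 3, "transit_time": 5, "reliability": 4, "emissions": 2},
    "rail": {"cost": 4, "transit_time": 3, "reliability": 4, "emissions": 5},
    "intermodal": {"cost": 4, "transit_time": 3, "reliability": 3, "emissions": 4},
}

ranking = sorted(((sum(w * scores[c] for c, w in weights.items()), mode)
                  for mode, scores in modes.items()), reverse=True)
for total, mode in ranking:
    print(f"{mode:>11}: {total:.2f}")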
Igras, Susan; Diakité, Mariam; Lundgren, Rebecka
2017-07-01
In West Africa, social factors influence whether couples with unmet need for family planning act on birth-spacing desires. Tékponon Jikuagou is testing a social network-based intervention to reduce social barriers by diffusing new ideas. Individuals and groups judged socially influential by their communities provide entrée to networks. A participatory social network mapping methodology was designed to identify these diffusion actors. Analysis of monitoring data, in-depth interviews, and evaluation reports assessed the methodology's acceptability to communities and staff and whether it produced valid, reliable data to identify influential individuals and groups who diffuse new ideas through their networks. Results indicated the methodology's acceptability. Communities were actively and equitably engaged. Staff appreciated its ability to yield timely, actionable information. The mapping methodology also provided valid and reliable information by enabling communities to identify highly connected and influential network actors. Consistent with social network theory, this methodology resulted in the selection of informal groups and individuals in both informal and formal positions. In-depth interview data suggest these actors were diffusing new ideas, further confirming their influence/connectivity. The participatory methodology generated insider knowledge of who has social influence, challenging commonly held assumptions. Collecting and displaying information fostered staff and community learning, laying groundwork for social change.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2014-01-01
Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
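One of the methodologies named above, logistic regression on Hit-Miss data, reduces to a short script; the flaw sizes and outcomes below are simulated, and a real POD demonstration would additionally require confidence bounds (e.g. a90/95), which this sketch omits (Python).

import numpy as np
from scipy.optimize import brentq
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
size = rng.uniform(0.2, 5.0, 200)                  # flaw size (mm), synthetic
p_true = 1 / (1 + np.exp(-3.0 * (size - 2.0)))     # underlying detection probability
hit = (rng.uniform(size=size.size) < p_true).astype(int)

model = LogisticRegression().fit(size.reshape(-1, 1), hit)

def pod(a):
    return model.predict_proba([[a]])[0, 1]

a90 = brentq(lambda a: pod(a) - 0.90, 0.2, 5.0)    # size detected 90% of the time
print(f"a90 = {a90:.2f} mm")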
Diffusion and decay chain of radioisotopes in stagnant water in saturated porous media.
Guzmán, Juan; Alvarez-Ramirez, Jose; Escarela-Pérez, Rafael; Vargas, Raúl Alejandro
2014-09-01
The analysis of the diffusion of radioisotopes in stagnant water in saturated porous media is important to validate the performance of barrier systems used in radioactive repositories. In this work a methodology is developed to determine the radioisotope concentration in a two-reservoir configuration: a saturated porous medium with stagnant water is surrounded by two reservoirs. The concentrations are obtained for all the radioisotopes of the decay chain using the concept of overvalued concentration. A methodology based on the variable separation method is proposed for the solution of the transport equation. The novelty of the proposed methodology is the factorization of the overvalued concentration into two factors: one that describes the diffusion without decay and another that describes the decay without diffusion. With the proposed methodology it is possible to determine the time required to obtain equal injective and diffusive concentrations in the reservoirs; this time is inversely proportional to the diffusion coefficient. In addition, the proposed methodology allows finding the time required to reach a linear and constant spatial distribution of the concentration in the porous medium, which is likewise inversely proportional to the diffusion coefficient. In order to validate the proposed methodology, the distributions of the radioisotope concentrations are compared with other experimental and numerical works. Copyright © 2014 Elsevier Ltd. All rights reserved.
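For a single member of the chain, the factorization described in this abstract can be written compactly; this is an illustrative reconstruction in standard notation, not the authors' own:

```latex
\[
\frac{\partial C}{\partial t} = D\,\frac{\partial^2 C}{\partial x^2} - \lambda C,
\qquad
C(x,t) = u(x,t)\,e^{-\lambda t}
\;\Longrightarrow\;
\frac{\partial u}{\partial t} = D\,\frac{\partial^2 u}{\partial x^2}.
\]
```

Here u carries the diffusion without decay and e^(-λt) the decay without diffusion; daughter members additionally couple to the parent's decay, which the authors handle through the overvalued concentration.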
Development of Metabolic Function Biomarkers in the Common Marmoset, Callithrix jacchus
Ziegler, Toni E.; Colman, Ricki J.; Tardif, Suzette D.; Sosa, Megan E.; Wegner, Fredrick H.; Wittwer, Daniel J.; Shrestha, Hemanta
2013-01-01
Metabolic assessment of a nonhuman primate model of metabolic syndrome and obesity requires the necessary biomarkers specific to the species. While the rhesus monkey has a number of specific assays for assessing metabolic syndrome, the marmoset does not. Furthermore, the common marmoset (Callithrix jacchus) has a small blood volume that necessitates using a single blood volume for multiple analyses. The common marmoset holds a great potential as an alternative primate model for the study of human disease but assay methods need to be developed and validated for the biomarkers of metabolic syndrome. Here we report on the adaptation, development and validation of commercially available immunoassays for common marmoset samples in small volumes. We have performed biological validations for insulin, adiponectin, leptin, and ghrelin to demonstrate the use of these biomarkers in examining metabolic syndrome and other related diseases in the common marmoset. PMID:23447060
Mani, Suresh; Sharma, Shobha; Omar, Baharudin; Paungmali, Aatit; Joseph, Leonard
2017-04-01
Purpose: The purpose of this review is to systematically explore and summarise the validity and reliability of telerehabilitation (TR)-based physiotherapy assessment for musculoskeletal disorders. Method: A comprehensive systematic literature review was conducted using a number of electronic databases: PubMed, EMBASE, PsycINFO, Cochrane Library and CINAHL, covering publications between January 2000 and May 2015. Studies that examined the validity and the inter- and intra-rater reliability of TR-based physiotherapy assessment for musculoskeletal conditions were included. Two independent reviewers used the Quality Appraisal Tool for studies of diagnostic Reliability (QAREL) and the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool to assess the methodological quality of the reliability and validity studies, respectively. Results: A total of 898 hits were retrieved, of which 11 articles meeting the inclusion criteria were reviewed. Nine studies explored concurrent validity together with inter- and intra-rater reliability, while two studies examined only concurrent validity. The reviewed studies were of moderate to good methodological quality. Physiotherapy assessments such as pain, swelling, range of motion, muscle strength, balance, gait and functional assessment demonstrated good concurrent validity. However, the reported concurrent validity of lumbar spine posture, special orthopaedic tests, neurodynamic tests and scar assessments ranged from low to moderate. Conclusion: TR-based physiotherapy assessment was technically feasible with overall good concurrent validity and excellent reliability, except for lumbar spine posture, orthopaedic special tests, neurodynamic tests and scar assessment.
Hulteen, Ryan M; Lander, Natalie J; Morgan, Philip J; Barnett, Lisa M; Robertson, Samuel J; Lubans, David R
2015-10-01
It has been suggested that young people should develop competence in a variety of 'lifelong physical activities' to ensure that they can be active across the lifespan. The primary aim of this systematic review is to report the methodological properties, validity, reliability, and test duration of field-based measures that assess movement skill competency in lifelong physical activities. A secondary aim was to clearly define those characteristics unique to lifelong physical activities. A search of four electronic databases (Scopus, SPORTDiscus, ProQuest, and PubMed) was conducted between June 2014 and April 2015 with no date restrictions. Studies addressing the validity and/or reliability of lifelong physical activity tests were reviewed. Included articles were required to assess lifelong physical activities using process-oriented measures, as well as report either one type of validity or reliability. Assessment criteria for methodological quality were adapted from a checklist used in a previous review of sport skill outcome assessments. Movement skill assessments for eight different lifelong physical activities (badminton, cycling, dance, golf, racquetball, resistance training, swimming, and tennis) in 17 studies were identified for inclusion. Methodological quality, validity, reliability, and test duration (time to assess a single participant), for each article were assessed. Moderate to excellent reliability results were found in 16 of 17 studies, with 71% reporting inter-rater reliability and 41% reporting intra-rater reliability. Only four studies in this review reported test-retest reliability. Ten studies reported validity results; content validity was cited in 41% of these studies. Construct validity was reported in 24% of studies, while criterion validity was only reported in 12% of studies. Numerous assessments for lifelong physical activities may exist, yet only assessments for eight lifelong physical activities were included in this review. Generalizability of results may be more applicable if more heterogeneous samples are used in future research. Moderate to excellent levels of inter- and intra-rater reliability were reported in the majority of studies. However, future work should look to establish test-retest reliability. Validity was less commonly reported than reliability, and further types of validity other than content validity need to be established in future research. Specifically, predictive validity of 'lifelong physical activity' movement skill competency is needed to support the assertion that such activities provide the foundation for a lifetime of activity.
Tobacco industry manipulation of data on and press coverage of the illicit tobacco trade in the UK
Rowell, A; Evans-Reeves, K; Gilmore, A B
2014-01-01
Background: In the UK, transnational tobacco companies (TTCs) have been arguing that levels of illicit trade are high and increasing and will rise further if standardised packaging is implemented. This paper examines trends in, and the accuracy of, media reporting of, and industry data on, illicit tobacco in the UK. Methods: Quantification of the volume, nature and quality of press articles citing industry data on illicit tobacco in UK newspapers from March 2008 to March 2013. Examination of published TTC data on illicit tobacco, including a comparison with independent data, and of TTC reporting of Her Majesty's Revenue and Customs data on illicit tobacco. Results: Media stories citing industry data on illicit tobacco began in June 2011, 2 months after the Tobacco Control Plan for England, which heralded standardised packaging, was published. The majority of data cited are based on industry Empty Pack Surveys for which no methodology is available. For almost all parts of the country where repeat data were cited in press stories, they indicated an increase, often substantial, in non-domestic/illicit cigarettes that is not supported by independent data. Similarly, national data from two published industry sources show a sudden large increase in non-domestic product between 2011 and 2012, yet the methodology of one report changes over this period and the other provides no published methodology. In contrast, independent data show steady declines in non-domestic and illicit cigarette penetration from 2006 to 2012 and either a continued decline or a small increase to 2013. Conclusions: Industry claims that use of Non-UK Duty Paid/illicit cigarettes in the UK is sharply increasing are inconsistent with historical trends and recent independent data. TTCs are exaggerating the threat of illicit tobacco by commissioning surveys whose methodology and validity remain uncertain, planting misleading stories and misquoting government data. Industry data on levels of illicit tobacco should be treated with extreme caution. PMID:24614041
SU-E-T-278: Dose Conformity Index for the Target in a Multitarget Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harikrishnaperumal, Sudahar
2015-06-15
Purpose: The existing conformity index formulations fail when multiple targets are present outside the target of interest with the same or different dose prescriptions. In the present study a novel methodology is introduced to solve this issue. Methods: The conformity index used by Nakamura et al (Int J Radiat Oncol Biol Phys 2001; 51(5):1313–1319) is taken as the base for this methodology. In this proposal, the prescription isodose volume (PIV), which normally includes the normal tissue and other target regions, is restricted to the PIV within annular regions of different thickness around the target of interest. The graphical line plotted between the thickness of the annular region and the corresponding conformity index will increase in the beginning, reach a flat region, and then increase again. The second increase in the conformity index depends basically on the distance between the targets, the dose prescriptions, and the size of the targets. The conformity index in the flat region should be the conformity index of the target of interest. This methodology was validated in a dual-target environment on a skull phantom in the Multiplan planning system (Accuray Inc., Sunnyvale, USA). Results: When the surrounding target's (sphere) size was changed from 1.5 cm to 6 cm diameter, the conformity index of the target of interest (3 cm diameter) changed from 1.09 to 1.25. When the distance between the targets changed from 7.5 cm to 2.5 cm, the conformity index changed from 1.10 to 1.17. Similarly, when the prescribed dose changed from 25 Gy to 50 Gy the conformity index changed from 1.09 to 1.42. These values were above 2.0 when the Nakamura et al formula was used. Conclusion: The proposed conformity index methodology eliminates the influence of surrounding targets to a great extent. However, the limitations of this method should be studied further. Application of this method in clinical situations is the future scope.
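A toy numerical sketch of the annular-restriction idea: compute a simple PIV-to-target-volume ratio while the PIV is clipped to shells of growing thickness around the target of interest. The geometry, dose model, and CI definition below are simplified stand-ins for the study's planning-system calculation.

```python
# Toy sketch: conformity index with the prescription isodose volume (PIV)
# restricted to an annular shell of growing thickness around the target.
# Geometry, the Gaussian dose model, and CI = (PIV in shell)/TV are
# illustrative assumptions.
import numpy as np
from scipy.ndimage import distance_transform_edt

n = 96
idx = np.indices((n, n, n))
d1 = np.sqrt(((idx - np.array([48, 48, 34]).reshape(3, 1, 1, 1))**2).sum(0))
d2 = np.sqrt(((idx - np.array([48, 48, 64]).reshape(3, 1, 1, 1))**2).sum(0))

dose = np.maximum(np.exp(-(d1 / 12)**2), np.exp(-(d2 / 12)**2))  # toy dose
target = d1 <= 8                        # target of interest
piv = dose >= np.exp(-(8 / 12)**2)      # prescription isodose at target edge

dist = distance_transform_edt(~target)  # distance to the target surface
tv = target.sum()
for t in (2, 4, 8, 16, 32):
    shell = dist <= t                   # target plus annulus of thickness t
    ci = (piv & shell).sum() / tv       # restricted conformity index
    print(f"annulus {t:2d} voxels: CI = {ci:.2f}")
```

As the annulus grows past the gap between the targets, the second target's isodose volume enters the shell and the CI rises again, reproducing the flat-then-increasing behavior the abstract describes.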
Development and Validation of a Translation Test.
ERIC Educational Resources Information Center
Ghonsooly, Behzad
1993-01-01
Translation testing methodology has been criticized for its subjective character. No real strides have so far been made in developing an objective translation test. In this paper, certain detailed procedures including various phases of pretesting have been performed to achieve objectivity and scorability in translation testing methodology. In…
ERIC Educational Resources Information Center
Brooks, Keith; And Others
1979-01-01
Discusses the benefits of the International Communication Association Communication Audit as a methodology for evaluation of organizational communication processes and outcomes. An "after" survey of 16 audited organizations confirmed the audit as a valid diagnostic methodology and organization development intervention technique which…
Bookbinder, Marilyn; Hugodot, Amandine; Freeman, Katherine; Homel, Peter; Santiago, Elisabeth; Riggs, Alexa; Gavin, Maggie; Chu, Alice; Brady, Ellen; Lesage, Pauline; Portenoy, Russell K
2018-02-01
Quality improvement in end-of-life care generally acquires data from charts or caregivers. "Tracer" methodology, which assesses real-time information from multiple sources, may provide complementary information. The objective of this study was to develop a valid brief audit tool that can guide assessment and rate care when used in a clinician tracer to evaluate the quality of care for the dying patient. To identify items for a brief audit tool, 248 items were created to evaluate overall quality, quality in specific content areas (e.g., symptom management), and specific practices. Collected into three instruments, these items were used to interview professional caregivers and evaluate the charts of hospitalized patients who died. Evidence that this information could be validly captured using a small number of items was obtained through factor analyses, canonical correlations, and group comparisons. A nurse manager field tested tracer methodology using candidate items to evaluate the care provided to other patients who died. The survey of 145 deaths provided chart data and data from 445 interviews (26 physicians, 108 nurses, 18 social workers, and nine chaplains). The analyses yielded evidence of construct validity for a small number of items, demonstrating significant correlations between these items and content areas identified as latent variables in factor analyses. Criterion validity was suggested by significant differences in the ratings on these items between the palliative care unit and other units. The field test evaluated 127 deaths, demonstrated the feasibility of tracer methodology, and informed reworking of the candidate items into the 14-item Tracer EoLC v1. The Tracer EoLC v1 can be used with tracer methodology to guide the assessment and rate the quality of end-of-life care. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
Differences in regional grey matter volumes in currently ill patients with anorexia nervosa.
Phillipou, Andrea; Rossell, Susan Lee; Gurvich, Caroline; Castle, David Jonathan; Abel, Larry Allen; Nibbs, Richard Grant; Hughes, Matthew Edward
2018-01-01
Neurobiological findings in anorexia nervosa (AN) are inconsistent, including differences in regional grey matter volumes. Methodological limitations often contribute to the inconsistencies reported. The aim of this study was to improve on these methodologies by utilising voxel-based morphometry (VBM) analysis with the use of diffeomorphic anatomic registration through an exponentiated lie algebra algorithm (DARTEL), in a relatively large group of individuals with AN. Twenty-six individuals with AN and 27 healthy controls underwent a T1-weighted magnetic resonance imaging (MRI) scan. AN participants were found to have reduced grey matter volumes in a number of areas including regions of the basal ganglia (including the ventral striatum), and parietal and temporal cortices. Body mass index (BMI) and global scores on the Eating Disorder Examination Questionnaire (EDE-Q) were also found to correlate with grey matter volumes in a region of the brainstem (including the substantia nigra and ventral tegmental area) in AN, and predicted 56% of the variance in grey matter volumes in this area. The brain regions associated with grey matter reductions in AN are consistent with regions responsible for cognitive deficits associated with the illness including anhedonia, deficits in affect perception and saccadic eye movement abnormalities. Overall, the findings suggest reduced grey matter volumes in AN that are associated with eating disorder symptomatology. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Socas-Rodríguez, Bárbara; González-Sálamo, Javier; Hernández-Borges, Javier; Rodríguez Delgado, Miguel Ángel
2016-05-01
In this work, a simple and environmentally friendly methodology has been developed for the analysis of a group of six mycotoxins with estrogenic activity produced by Fusarium species (i.e. zearalanone, zearalenone, α-zearalanol, β-zearalanol, α-zearalenol, and β-zearalenol), using micro-dispersive solid-phase extraction (µ-dSPE) with multiwalled carbon nanotubes as sorbent. Separation, determination, and quantification were achieved by HPLC coupled to ion trap MS with an ESI interface. Parameters affecting the extraction efficiency of µ-dSPE, such as pH of the sample, amount of multiwalled carbon nanotubes, and type and volume of elution solvent, were studied and optimized. The methodology was validated for mineral, pond, and wastewater as well as for powdered infant milk using 17β-estradiol-2,4,16,16,17-d5 (17β-E2-D5) as internal standard, obtaining recoveries ranging from 85 to 120% for the three types of water samples and from 77 to 115% for powdered infant milk. RSD values were lower than 10%. The LOQs achieved were in the range 0.05-2.90 μg/L for water samples and 2.02-31.9 μg/L for powdered infant milk samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Ladbury, Ray
2018-01-01
In 1972, when engineers at Hughes Aircraft Corporation discovered that errors in their satellite avionics were being caused by cosmic rays (so-called single-event effects, or SEE), Moore's Law was only 7 years old. Now, more than 45 years on, the scaling that drove Moore's Law for its first 35 years has reached its limits. However, electronics technology continues to evolve exponentially and SEE remain a formidable issue for use of electronics in space. SEE occur when a single ionizing particle passes through a sensitive volume in an active semiconductor device and generates sufficient charge to cause anomalous behavior or failure in the device. Because SEE can occur at any time during the mission, the emphasis of SEE risk management methodologies is ensuring that all SEE modes in a device under test are detected by the test. Because a particle's probability of causing an SEE generally increases as the particle becomes more ionizing, heavy-ion beams have been and remain the preferred tools for elucidating SEE vulnerabilities. In this talk we briefly discuss space radiation environments and SEE mechanisms, describe SEE test methodologies and discuss current and future challenges for use of heavy-ion beams for SEE testing in an era when the continued validity of Moore's law depends on innovation rather than CMOS scaling.
A validated methodology for genetic identification of tuna species (genus Thunnus).
Viñas, Jordi; Tudela, Sergi
2009-10-27
Tuna species of the genus Thunnus, such as the bluefin tunas, are some of the most important and yet most endangered trade fish in the world. Identification of these species in traded forms, however, may be difficult depending on the presentation of the products, which may hamper conservation efforts on trade control. In this paper, we validated a genetic methodology that can fully distinguish between the eight Thunnus species from any kind of processed tissue. After testing several genetic markers, a complete discrimination of the eight tuna species was achieved using Forensically Informative Nucleotide Sequencing based primarily on the sequence variability of the hypervariable mitochondrial DNA control region (mtDNA CR), followed, in some specific cases, by a second validation with the nuclear marker rDNA first internal transcribed spacer (ITS1). This methodology was able to distinguish all tuna species, including those belonging to the subgenus Neothunnus that are very closely related and consequently cannot be differentiated with other genetic markers of lower variability. This methodology also took into consideration the introgression that has been reported in past studies between T. thynnus, T. orientalis and T. alalunga. Finally, we applied the methodology to cross-check the species identity of 26 processed tuna samples. The combination of two genetic markers, one mitochondrial and one nuclear, allows full discrimination between all eight tuna species. Unexpectedly, the genetic marker traditionally used for DNA barcoding, cytochrome oxidase 1, could not differentiate all species, so its use as a genetic marker for tuna species identification is questionable.
Investigation of Effective Material Properties of Stony Meteorites
NASA Technical Reports Server (NTRS)
Agrawal, Parul; Carlozzi, Alex; Bryson, Kathryn
2016-01-01
To assess the threat posed by an asteroid entering Earth's atmosphere, one must predict if, when, and how it fragments during entry. A comprehensive understanding of asteroid material properties is needed to achieve this objective. At present, meteorite material found on Earth provides the only objects from an entering asteroid that can be used as representative material and tested in a laboratory setting. Therefore, unit cell models are developed to determine the effective material properties of stony meteorites and in turn deduce the properties of asteroids. The unit cell is a representative volume that accounts for the diverse minerals, porosity, and matrix composition inside a meteorite. The classes under investigation include H-class, L-class, and LL-class chondrites. The effective mechanical properties, such as Young's modulus and Poisson's ratio, of the unit cell are calculated by performing several hundred Monte Carlo simulations. Terrestrial analogs such as basalt and gabbro are being used to validate the unit cell methodology.
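A minimal sketch of the Monte-Carlo unit-cell idea under stated assumptions: sample random mineral fractions and porosity, bound the effective modulus with Voigt/Reuss-style averages, and accumulate statistics. The mineral moduli and ranges are illustrative placeholders, not measured meteorite values.

```python
# Monte Carlo sketch of an effective-modulus estimate for a porous mineral mix.
# Moduli (GPa) and volume-fraction ranges are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
E = {"olivine": 196.0, "pyroxene": 175.0, "feldspar": 75.0}  # assumed values
e = np.array(list(E.values()))

samples = []
for _ in range(1000):
    f = rng.dirichlet(np.ones(3))       # random mineral fractions (sum to 1)
    phi = rng.uniform(0.02, 0.12)       # porosity
    E_voigt = (1 - phi) * (f @ e)       # upper bound: pores carry no load
    # Harmonic mean of the solid phases scaled by solidity; excluding the
    # pores from the Reuss average is a deliberate simplification.
    E_reuss = (1 - phi) / np.sum(f / e)
    samples.append(0.5 * (E_voigt + E_reuss))   # Voigt-Reuss-Hill average

samples = np.array(samples)
print(f"effective E: {samples.mean():.1f} +/- {samples.std():.1f} GPa")
```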
Phase 2 STS new user development program. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Mcdowell, J. R.
1976-01-01
A methodology for developing new users for STS other than NASA and DoD, thereby maximizing the use of the STS system was developed. The approach to user development, reflected in the implementation plan, and attendant informational material to be used were evaluated by conducting a series of test cases with selected user organizations. These test case organizations were, in effect, used as consultants to evaluate the effectiveness, the needs, the completeness, and the adequacy of the user development approach and informational material. The selection of the test cases provided a variety of potential STS users covering industry, other government agencies, and the educational sector. The test cases covered various use areas and provided a mix of user organization types. A summary of the actual test cases conducted is given. The conduct of the test cases verified the general approach of the implementation plan, the validity of the user development strategy prepared for each test case organization and the effectiveness of the STS basic and user customized informational material.
Fluid-structure interaction of turbulent boundary layer over a compliant surface
NASA Astrophysics Data System (ADS)
Anantharamu, Sreevatsa; Mahesh, Krishnan
2016-11-01
Turbulent flows induce unsteady loads on surfaces in contact with them, which affect material stresses, surface vibrations and far-field acoustics. We are developing a numerical methodology to study the coupled interaction of a turbulent boundary layer with the underlying surface. The surface is modeled as a linear elastic solid, while the fluid follows the spatially filtered incompressible Navier-Stokes equations. An incompressible Large Eddy Simulation finite volume flow approach based on the algorithm of Mahesh et al. is used in the fluid domain. The discrete kinetic energy conserving property of the method ensures robustness at high Reynolds number. The linear elastic model in the solid domain is integrated in space using finite element method and in time using the Newmark time integration method. The fluid and solid domain solvers are coupled using both weak and strong coupling methods. Details of the algorithm, validation, and relevant results will be presented. This work is supported by NSWCCD, ONR.
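A compact sketch of the Newmark time integration named for the solid domain, applied to a linear system M u'' + K u = f(t) with average-acceleration parameters (beta = 1/4, gamma = 1/2); the two-degree-of-freedom matrices are made-up stand-ins for assembled finite-element operators.

```python
# Newmark-beta (average acceleration) integrator for M u'' + K u = f(t).
# The 2-DOF matrices are stand-ins for assembled finite-element operators.
import numpy as np

M = np.diag([1.0, 1.0])
K = np.array([[40.0, -20.0], [-20.0, 20.0]])
f = lambda t: np.array([0.0, np.sin(5.0 * t)])

beta, gamma, dt = 0.25, 0.5, 1e-3
u, v = np.zeros(2), np.zeros(2)
a = np.linalg.solve(M, f(0.0) - K @ u)        # consistent initial acceleration
A = M + beta * dt**2 * K                      # effective (constant) matrix

for step in range(1, 5001):
    t = step * dt
    u_pred = u + dt * v + dt**2 * (0.5 - beta) * a
    v_pred = v + dt * (1.0 - gamma) * a
    a = np.linalg.solve(A, f(t) - K @ u_pred)  # new acceleration
    u = u_pred + beta * dt**2 * a
    v = v_pred + gamma * dt * a

print(u)  # displacement after 5 s of forced vibration
```

The average-acceleration variant is unconditionally stable for linear problems and conserves discrete energy in the unforced case, which pairs naturally with the energy-conserving fluid scheme described above.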
Tian, Jing; Varga, Boglarka; Tatrai, Erika; Fanni, Palya; Somfai, Gabor Mark; Smiddy, William E.
2016-01-01
Over the past two decades a significant number of OCT segmentation approaches have been proposed in the literature. Each methodology has been conceived for and/or evaluated using specific datasets that do not reflect the complexities of the majority of widely available retinal features observed in clinical settings. In addition, there is no appropriate OCT dataset with ground truth that reflects the realities of everyday retinal features observed in clinical settings. While the need for unbiased performance evaluation of automated segmentation algorithms is obvious, the validation of segmentation algorithms has usually been performed by comparison with manual labelings from each study, and there has been a lack of common ground truth. Therefore, a performance comparison of different algorithms using the same ground truth has never been performed. This paper reviews research-oriented tools for automated segmentation of the retinal tissue on OCT images. It also evaluates and compares the performance of these software tools with a common ground truth. PMID:27159849
Hoeffelin, H; Jacquemin, D; Defaweux, V; Nizet, J L
2014-01-01
Breast surgery currently remains very subjective and each intervention depends on the ability and experience of the operator. To date, no objective measurement of this anatomical region can codify surgery. In this light, we wanted to compare and validate a new technique for 3D scanning (LifeViz 3D) and its clinical application. We tested the use of the 3D LifeViz system (Quantificare) to perform volumetric calculations in various settings (in situ in cadaveric dissection, of control prostheses, and in clinical patients) and we compared this system to other techniques (CT scanning and Archimedes' principle) under the same conditions. We were able to identify the benefits (feasibility, safety, portability, and low patient stress) and limitations (underestimation of the in situ volume, subjectivity of contouring, and patient selection) of the LifeViz 3D system, concluding that the results are comparable with other measurement techniques. The prospects of this technology seem promising in numerous applications in clinical practice to limit the subjectivity of breast surgery.
Methods to determine hydration states of minerals and cement hydrates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baquerizo, Luis G., E-mail: luis.baquerizoibarra@holcim.com; Matschei, Thomas; Scrivener, Karen L.
2014-11-15
This paper describes a novel approach to the quantitative investigation of the impact of varying relative humidity (RH) and temperature on the structure and thermodynamic properties of salts and crystalline cement hydrates in different hydration states (i.e. varying molar water contents). The multi-method approach developed here is capable of deriving physico-chemical boundary conditions and the thermodynamic properties of hydrated phases, many of which are currently missing from or insufficiently reported in the literature. As an example the approach was applied to monosulfoaluminate, a phase typically found in hydrated cement pastes. New data on the dehydration and rehydration of monosulfoaluminate are presented. Some of the methods used were validated with the system Na2SO4–H2O and new data related to the absorption of water by anhydrous sodium sulfate are presented. The methodology and data reported here should permit better modeling of the volume stability of cementitious systems exposed to various different climatic conditions.
Current Status of 3-Dimensional Speckle Tracking Echocardiography: A Review from Our Experiences
Ishizu, Tomko; Aonuma, Kazutaka
2014-01-01
Cardiac function analysis is the main focus of echocardiography. Left ventricular ejection fraction (LVEF) has been the clinical standard; however, LVEF is not enough to investigate myocardial function. For the last decade, speckle tracking echocardiography (STE) has been the novel clinical tool for regional and global myocardial function analysis. However, 2-dimensional imaging methods have limitations in assessing 3-dimensional (3D) cardiac motion. In contrast, 3D echocardiography has also been widely used, in particular for LV volume measurements and the assessment of valvular diseases. Joining the technology bandwagon, 3D-STE was introduced in 2008. Experimental studies and clinical investigations revealed the reliability and feasibility of 3D-STE-derived data. In addition, 3D-STE provides a novel deformation parameter, the area change ratio, which has the potential for more accurate assessment of overall and regional myocardial function. In this review, we introduce the features of the methodology, validation, and clinical application of 3D-STE based on our experiences over 7 years. PMID:25031794
Temperature modelling and prediction for activated sludge systems.
Lippi, S; Rosso, D; Lubello, C; Canziani, R; Stenstrom, M K
2009-01-01
Temperature is an important factor affecting biomass activity, which is critical to maintain efficient biological wastewater treatment, as well as physiochemical properties of mixed liquor such as dissolved oxygen saturation and settling velocity. Controlling temperature is not normally possible for treatment systems, but incorporating factors impacting temperature in the design process, such as the aeration system, surface-to-volume ratio, and tank geometry, can reduce the range of temperature extremes and improve the overall process performance. Determining how much these design or upgrade options affect the tank temperature requires a temperature model that can be used with existing design methodologies. This paper presents a new steady-state temperature model developed by incorporating the best aspects of previously published models, introducing new functions for selected heat exchange paths and improving the method for predicting the effects of covering aeration tanks. Numerical improvements with embedded reference data provide simpler formulation, faster execution, and easier sensitivity analyses, using an ordinary spreadsheet. The paper presents several cases to validate the model.
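The steady-state idea reduces to finding the tank temperature at which the summed heat-exchange paths balance; a minimal sketch with two placeholder flux terms (a full model like the one described would include several more paths, e.g. evaporation, aeration, and solar gain).

```python
# Steady-state tank temperature sketch: find T where net heat exchange is
# zero. Both flux terms and all coefficients are illustrative placeholders.
from scipy.optimize import brentq

T_air, T_in = 10.0, 18.0          # air and influent temperature, deg C (assumed)
UA_surface = 2.5e4                # W/K, lumped surface exchange (placeholder)
rho_cp_Q = 4.18e6 * 0.05          # W/K, influent advection: rho * cp * Q

def net_heat(T):
    advection = rho_cp_Q * (T_in - T)     # influent brings water at T_in
    surface = UA_surface * (T_air - T)    # combined surface losses/gains
    return advection + surface

T_eq = brentq(net_heat, -5.0, 40.0)       # root of the heat balance
print(f"equilibrium tank temperature: {T_eq:.1f} C")
```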
Cerebral magnetic resonance changes associated with fibromyalgia syndrome.
Murga, Iñigo; Guillen, Virginia; Lafuente, José-Vicente
2017-06-07
Fibromyalgia syndrome is a chronic disease of unknown origin whose diagnostic criteria were established in 1990 by the American College of Rheumatology. New criteria were proposed in 2010 that have not yet been validated. It is characterized by generalized chronic musculoskeletal pain, accompanied by hyperalgesia and allodynia, as well as other motor, vegetative, cognitive and affective symptoms and signs. We have reviewed a set of cerebral magnetic resonance studies (morphometry, connectivity and spectroscopy) that report changes in areas involved in pain processing. Modifications in gray and white matter volume, as well as in levels of N-acetylaspartate, choline or glutamate, among other metabolites, have been observed in the hippocampus, insula, and prefrontal and cingulate cortex. Neuroradiological findings are nonspecific and similar to those found in other examples of chronic pain. An increase in sample size and a standardized methodology would facilitate comparison, allowing the drawing of general conclusions. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
Adjoint-Based Methodology for Time-Dependent Optimization
NASA Technical Reports Server (NTRS)
Yamaleev, N. K.; Diskin, B.; Nielsen, E. J.
2008-01-01
This paper presents a discrete adjoint method for a broad class of time-dependent optimization problems. The time-dependent adjoint equations are derived in terms of the discrete residual of an arbitrary finite volume scheme which approximates unsteady conservation law equations. Although only the 2-D unsteady Euler equations are considered in the present analysis, this time-dependent adjoint method is applicable to the 3-D unsteady Reynolds-averaged Navier-Stokes equations with minor modifications. The discrete adjoint operators involving the derivatives of the discrete residual and the cost functional with respect to the flow variables are computed using a complex-variable approach, which provides discrete consistency and drastically reduces the implementation and debugging cycle. The implementation of the time-dependent adjoint method is validated by comparing the sensitivity derivative with that obtained by forward mode differentiation. Our numerical results show that O(10) optimization iterations of the steepest descent method are needed to reduce the objective functional by 3-6 orders of magnitude for test problems considered.
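The complex-variable approach mentioned for the adjoint operators is, in its simplest form, the complex-step derivative, which avoids the subtractive cancellation of finite differences; a standalone sketch with a toy function standing in for the discrete residual.

```python
# Complex-step derivative: df/dx ~= Im(f(x + i*h)) / h, second-order accurate
# with no subtractive cancellation, so h can be made tiny. The function below
# is a toy stand-in for a discrete flow residual.
import numpy as np

def f(x):
    return x**3 * np.exp(-x) + np.sin(x)

x0, h = 1.3, 1e-30
d_cs = np.imag(f(x0 + 1j * h)) / h                 # complex-step derivative
d_exact = 3 * x0**2 * np.exp(-x0) - x0**3 * np.exp(-x0) + np.cos(x0)
print(d_cs, d_exact, abs(d_cs - d_exact))          # agreement to machine precision
```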
Borycki, Elizabeth; Kushniruk, Andre; Carvalho, Christopher
2013-01-01
Internationally, health information systems (HIS) safety has emerged as a significant concern for governments. Recently, research has emerged that has documented the ability of HIS to be implicated in the harm and death of patients. Researchers have attempted to develop methods that can be used to prevent or reduce technology-induced errors. Some researchers are developing methods that can be employed prior to systems release. These methods include the development of safety heuristics and clinical simulations. In this paper, we outline our methodology for developing safety heuristics specific to identifying the features or functions of a HIS user interface design that may lead to technology-induced errors. We follow this with a description of a methodological approach to validate these heuristics using clinical simulations. PMID:23606902
Methodology and issues of integral experiments selection for nuclear data validation
NASA Astrophysics Data System (ADS)
Tatiana, Ivanova; Ivanov, Evgeny; Hill, Ian
2017-09-01
Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications. [1] Often benchmarks are taken from international Handbooks. [2, 3] Depending on the application, IEs have different degrees of usefulness in validation, and usually the use of a single benchmark is not advised; indeed, it may lead to erroneous interpretation and results. [1] This work aims at quantifying the importance of the benchmarks used in application-dependent cross section validation. The approach is based on the well-known Generalized Linear Least Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections (within a given energy interval). The statistical treatment results in a vector of weighting factors for the integral benchmarks. These factors characterize the value added by a benchmark for nuclear data validation for the given application. The methodology is illustrated by one example, selecting benchmarks for 239Pu cross section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files) established at the Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).
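A schematic of the GLLSM update behind the weighting-factor idea, with invented matrices: given sensitivity matrix S, prior cross-section covariance C, and benchmark covariance V, the gain distributes each benchmark's discrepancy over the cross sections, and its column norms give a crude per-benchmark weight. All sizes and values below are made up purely to show the linear algebra.

```python
# Schematic GLLSM adjustment: x' = x + C S^T (S C S^T + V)^(-1) (d - S x).
# Sizes and values are invented purely to show the structure.
import numpy as np

nx, nb = 4, 3                        # cross-section groups, benchmarks
rng = np.random.default_rng(2)
S = rng.normal(size=(nb, nx))        # benchmark sensitivities to cross sections
C = 0.04 * np.eye(nx)                # prior cross-section covariance
V = np.diag([1e-4, 4e-4, 1e-3])      # benchmark (experiment + model) covariance

x = np.zeros(nx)                     # prior relative adjustment
d = rng.normal(scale=0.02, size=nb)  # synthetic C/E - 1 discrepancies

G = C @ S.T @ np.linalg.inv(S @ C @ S.T + V)   # gain: one column per benchmark
x_post = x + G @ (d - S @ x)                    # adjusted cross sections
C_post = C - G @ S @ C                          # reduced posterior covariance

print(np.linalg.norm(G, axis=0))     # crude per-benchmark weighting factors
```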
From field data to volumes: constraining uncertainties in pyroclastic eruption parameters
Klawonn, Malin; Houghton, Bruce F.; Swanson, Don; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.
2014-01-01
In this study, we aim to understand the variability in eruption volume estimates derived from field studies of pyroclastic deposits. We distributed paper maps of the 1959 Kīlauea Iki tephra to 101 volcanologists worldwide, who produced hand-drawn isopachs. Across the returned maps, uncertainty in isopach areas is 7 % across the well-sampled deposit but increases to over 30 % for isopachs that are governed by the largest and smallest thickness measurements. We fit the exponential, power-law, and Weibull functions through the isopach thickness versus area^1/2 values and find volume estimate variations up to a factor of 4.9 for a single map. Across all maps and methodologies, we find an average standard deviation for a total volume of s = 29 %. The volume uncertainties are largest for the most proximal (s = 62 %) and distal field (s = 53 %) and small for the densely sampled intermediate deposit (s = 8 %). For the Kīlauea Iki 1959 eruption, we find that the deposit beyond the 5-cm isopach contains only 2 % of the total erupted volume, whereas the near-source deposit contains 48 % and the intermediate deposit 50 % of the total volume. Thus, the relative uncertainty within each zone impacts the total volume estimates differently. The observed uncertainties for the different deposit regions in this study illustrate a fundamental problem of estimating eruption volumes: while some methodologies may provide better fits to the isopach data or rely on fewer free parameters, the main issue remains the predictive capabilities of the empirical functions for the regions where measurements are missing.
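For the exponential member of the fitted family, the volume estimate has a closed form: fitting T(A) = T0 exp(-k A^1/2) and integrating over area gives V = 2 T0 / k^2. A sketch with invented isopach data follows; the thickness/area pairs are not the Kīlauea Iki measurements.

```python
# Exponential thinning fit to isopach data: T(A) = T0 * exp(-k * sqrt(A)),
# whose integral over area gives V = 2*T0/k**2. Data pairs are invented.
import numpy as np

area_km2 = np.array([0.5, 2.0, 8.0, 30.0, 110.0])   # isopach areas
thick_m = np.array([3.0, 1.4, 0.55, 0.18, 0.04])    # isopach thicknesses

sqrt_a = np.sqrt(area_km2)
slope, lnT0 = np.polyfit(sqrt_a, np.log(thick_m), 1)  # linear fit in log space
k = -slope                                            # thinning rate (> 0)
T0 = np.exp(lnT0)                                     # extrapolated vent thickness

V = 2.0 * T0 / k**2 * 1e-3    # m * km^2 -> km^3
print(f"T0 = {T0:.2f} m, k = {k:.3f} /km, V = {V:.3f} km^3")
```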
NASA Technical Reports Server (NTRS)
Chidester, Thomas R.; Kanki, Barbara G.; Helmreich, Robert L.
1989-01-01
The crew-factors research program at NASA Ames has developed a methodology for studying the impact of a variety of variables on the effectiveness of crews flying realistic but high workload simulated trips. The validity of investigations using the methodology is enhanced by careful design of full-mission scenarios, performance assessment using converging sources of data, and recruitment of representative subjects. Recently, portions of this methodology have been adapted for use in assessing the effectiveness of crew coordination among participants in line-oriented flight training.
Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D
2015-09-01
Pesticides are frequently responsible for human poisoning, and often information on the substance involved is lacking. The great variety of pesticides that could be responsible for an intoxication makes it necessary to develop powerful and versatile analytical methodologies that allow the identification of the unknown toxic substance. Here we developed a methodology for the simultaneous identification and quantification in human blood of 109 highly toxic pesticides. The application of this analytical scheme would help minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology that we present here, we use a liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
Creating, generating and comparing random network models with NetworkRandomizer.
Tosadori, Gabriele; Bestvina, Ivan; Spoto, Fausto; Laudanna, Carlo; Scardoni, Giovanni
2016-01-01
Biological networks are becoming a fundamental tool for the investigation of high-throughput data in several fields of biology and biotechnology. With the increasing amount of information, network-based models are gaining more and more interest and new techniques are required in order to mine the information and to validate the results. To fill the validation gap we present an app for the Cytoscape platform that aims at creating randomised networks and randomising existing, real networks. Since there is a lack of tools that allow performing such operations, our app aims at enabling researchers to exploit different, well-known random network models that can be used as a benchmark for validating real, biological datasets. We also propose a novel methodology for creating random weighted networks, i.e. the multiplication algorithm, starting from real, quantitative data. Finally, the app provides a statistical tool that compares real versus randomly computed attributes, in order to validate the numerical findings. In summary, our app aims at creating a standardised methodology for the validation of results in the context of the Cytoscape platform.
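The randomize-and-compare workflow can be approximated outside Cytoscape; a sketch using degree-preserving double-edge swaps as the null model, with a placeholder network and metric (the app itself offers several random models beyond this one).

```python
# Degree-preserving randomization as a null model: rewire a network with
# double-edge swaps and compare a real metric against the random ensemble.
# The network and the clustering metric are placeholders.
import networkx as nx
import numpy as np

G = nx.karate_club_graph()                 # placeholder "real" network
real_clust = nx.average_clustering(G)

null = []
for seed in range(100):
    R = G.copy()
    nx.double_edge_swap(R, nswap=4 * R.number_of_edges(),
                        max_tries=100000, seed=seed)
    null.append(nx.average_clustering(R))

null = np.array(null)
z = (real_clust - null.mean()) / null.std()   # how unusual is the real value?
print(f"real={real_clust:.3f}, null={null.mean():.3f}+/-{null.std():.3f}, z={z:.1f}")
```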
A comprehensive review on the quasi-induced exposure technique.
Jiang, Xinguo; Lyles, Richard W; Guo, Runhua
2014-04-01
The goal is to comprehensively examine the state-of-the-art applications and methodological development of quasi-induced exposure and consequently pinpoint the future research directions in terms of implementation guidelines, limitations, and validity tests. The paper conducts a comprehensive review on approximately 45 published papers relevant to quasi-induced exposure regarding four key topics of interest: applications, responsibility assignment, validation of assumptions, and methodological development. Specific findings include that: (1) there is no systematic data screening procedure in place and how the eliminated crash data will impact the responsibility assignment is generally unknown; (2) there is a lack of necessary efforts to assess the validity of assumptions prior to its application and the validation efforts are mostly restricted to the aggregated levels due to the limited availability of exposure truth; and (3) there is a deficiency of quantitative analyses to evaluate the magnitude and directions of bias as a result of injury risks and crash avoidance ability. The paper points out the future research directions and insights in terms of the validity tests and implementation guidelines. Copyright © 2013 Elsevier Ltd. All rights reserved.
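The core computation of quasi-induced exposure is compact enough to show directly: not-at-fault drivers in clean two-vehicle crashes serve as the exposure proxy, and each group's relative accident involvement ratio compares its at-fault share to that proxy. The crash counts below are fabricated for illustration.

```python
# Quasi-induced exposure sketch: relative accident involvement ratio (RAIR)
# per driver group, using not-at-fault drivers in two-vehicle crashes as the
# exposure proxy. All counts are fabricated.
at_fault =     {"<25": 420, "25-64": 900,  "65+": 180}
not_at_fault = {"<25": 250, "25-64": 1050, "65+": 200}

tot_af = sum(at_fault.values())
tot_naf = sum(not_at_fault.values())

for group in at_fault:
    rair = (at_fault[group] / tot_af) / (not_at_fault[group] / tot_naf)
    print(f"{group:6s} RAIR = {rair:.2f}")  # > 1 => over-involved vs. exposure
```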
Prideaux, Andrew R.; Song, Hong; Hobbs, Robert F.; He, Bin; Frey, Eric C.; Ladenson, Paul W.; Wahl, Richard L.; Sgouros, George
2010-01-01
Phantom-based and patient-specific imaging-based dosimetry methodologies have traditionally yielded mean organ-absorbed doses or spatial dose distributions over tumors and normal organs. In this work, radiobiologic modeling is introduced to convert the spatial distribution of absorbed dose into biologically effective dose and equivalent uniform dose parameters. The methodology is illustrated using data from a thyroid cancer patient treated with radioiodine. Methods: Three registered SPECT/CT scans were used to generate 3-dimensional images of radionuclide kinetics (clearance rate) and cumulated activity. The cumulated activity image and corresponding CT scan were provided as input into an EGSnrc-based Monte Carlo calculation: the cumulated activity image was used to define the distribution of decays, and an attenuation image derived from CT was used to define the corresponding spatial tissue density and composition distribution. The rate images were used to convert the spatial absorbed dose distribution to a biologically effective dose distribution, which was then used to estimate a single equivalent uniform dose for segmented volumes of interest. Equivalent uniform dose was also calculated from the absorbed dose distribution directly. Results: We validate the method using simple models; compare the dose-volume histogram with a previously analyzed clinical case; and give the mean absorbed dose, mean biologically effective dose, and equivalent uniform dose for an illustrative case of a pediatric thyroid cancer patient with diffuse lung metastases. The mean absorbed dose, mean biologically effective dose, and equivalent uniform dose for the tumor were 57.7, 58.5, and 25.0 Gy, respectively. Corresponding values for normal lung tissue were 9.5, 9.8, and 8.3 Gy, respectively. Conclusion: The analysis demonstrates the impact of radiobiologic modeling on response prediction. The 57% reduction in the equivalent dose value for the tumor reflects a high level of dose nonuniformity in the tumor and a corresponding reduced likelihood of achieving a tumor response. Such analyses are expected to be useful in treatment planning for radionuclide therapy. PMID:17504874
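The absorbed-dose-to-BED-to-EUD chain can be sketched per voxel for mono-exponential dose-rate decay using Dale's formula, BED = D[1 + D*lambda/((mu + lambda)(alpha/beta))], with EUD = -ln(mean exp(-alpha*BED))/alpha over the volume; the parameter values and dose distribution below are generic radiobiologic assumptions, not the paper's data.

```python
# Voxelized BED and EUD sketch for radionuclide therapy assuming
# mono-exponential dose-rate decay (Dale's formula). Parameters are generic
# textbook-style assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(3)
D = rng.lognormal(mean=np.log(40.0), sigma=0.6, size=10000)  # voxel doses, Gy

alpha, alpha_beta = 0.3, 10.0   # 1/Gy and Gy (assumed tumor values)
lam = np.log(2) / 67.0          # effective clearance, 1/h (assumed)
mu = np.log(2) / 1.5            # sublethal damage repair rate, 1/h (assumed)

bed = D * (1.0 + D * lam / ((mu + lam) * alpha_beta))
eud = -np.log(np.mean(np.exp(-alpha * bed))) / alpha  # nonuniformity penalty

print(f"mean D = {D.mean():.1f} Gy, mean BED = {bed.mean():.1f} Gy, "
      f"EUD = {eud:.1f} Gy")
```

Because the EUD exponentially penalizes cold voxels, a nonuniform distribution yields an EUD well below the mean dose, which is exactly the tumor effect the abstract reports.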
Tankiewicz, Maciej; Biziuk, Marek
2018-02-01
A simple and efficient dispersive liquid-liquid microextraction (DLLME) technique was developed using a mixture of two solvents: 40 μL of tetrachloroethylene (extraction solvent) and 1.0 mL of methanol (disperser solvent), which was rapidly injected with a syringe into 10 mL of water sample. Important parameters affecting the extraction efficiency, such as type and volume of solvents, water sample volume, extraction time, temperature, pH adjustment and salt addition, were investigated. Simultaneous determination of 34 commonly used pesticides was performed by gas chromatography coupled with mass spectrometry (GC-MS). The procedure was validated in order to obtain the highest efficiency at the lowest concentration levels of analytes, to fulfil the requirements of regulations on maximum residue limits. Under the optimum conditions, the linearity range was within 0.0096-100 μg L⁻¹. The limits of detection (LODs) of the developed DLLME-GC-MS methodology for all investigated pesticides were in the range of 0.0032 (endrin) to 0.0174 (diazinon) μg L⁻¹, and limits of quantification (LOQs) ranged from 0.0096 to 0.052 μg L⁻¹. At a low concentration of 1 μg L⁻¹ for each pesticide, recoveries ranged between 84% (tebufenpyrad) and 108% (deltamethrin) with relative standard deviations (RSDs) (n = 7) from 1.1% (metconazole) to 11% (parathion-methyl). This methodology was successfully applied to check the contamination of environmental samples. The procedure has proved to be selective, sensitive and precise for the simultaneous determination of various pesticides. The optimized analytical method is very simple and rapid (less than 5 min). Graphical abstract: The analytical procedure for testing water samples consists of dispersive liquid-liquid microextraction (DLLME) and gas chromatography coupled with mass spectrometry (GC-MS).
NASA Technical Reports Server (NTRS)
Williams, Daniel M.
2006-01-01
Described is the research process that NASA researchers used to validate the Small Aircraft Transportation System (SATS) Higher Volume Operations (HVO) concept. The four phase building-block validation and verification process included multiple elements ranging from formal analysis of HVO procedures to flight test, to full-system architecture prototype that was successfully shown to the public at the June 2005 SATS Technical Demonstration in Danville, VA. Presented are significant results of each of the four research phases that extend early results presented at ICAS 2004. HVO study results have been incorporated into the development of the Next Generation Air Transportation System (NGATS) vision and offer a validated concept to provide a significant portion of the 3X capacity improvement sought after in the United States National Airspace System (NAS).
Autocalibration method for non-stationary CT bias correction.
Vegas-Sánchez-Ferrero, Gonzalo; Ledesma-Carbayo, Maria J; Washko, George R; Estépar, Raúl San José
2018-02-01
Computed tomography (CT) is a widely used imaging modality for screening and diagnosis. However, the deleterious effects of radiation exposure inherent in CT imaging require the development of image reconstruction methods which can reduce exposure levels. The development of iterative reconstruction techniques is now enabling the acquisition of low-dose CT images whose quality is comparable to that of CT images acquired with much higher radiation dosages. However, the characterization and calibration of the CT signal as dosage and reconstruction approaches change is crucial to provide clinically relevant data. Although CT scanners are calibrated as part of the imaging workflow, the calibration is limited to select global reference values and does not consider other inherent factors of the acquisition that depend on the subject scanned (e.g. photon starvation, partial volume effect, beam hardening) and result in a non-stationary noise response. In this work, we analyze the effect of reconstruction biases caused by non-stationary noise and propose an autocalibration methodology to compensate for it. Our contributions are: 1) the derivation of a functional relationship between the observed bias and non-stationary noise; 2) a robust and accurate method to estimate the local variance; 3) an autocalibration methodology that does not necessarily rely on a calibration phantom, attenuates the bias caused by noise, and removes the systematic bias observed in devices from different vendors. The validation of the proposed methodology was performed with a physical phantom and clinical CT scans acquired with different configurations (kernels, doses, algorithms including iterative reconstruction). The results confirmed the suitability of the proposed methods for removing the intra-device and inter-device reconstruction biases. Copyright © 2017 Elsevier B.V. All rights reserved.
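The paper's functional form is not reproduced here, but the moving parts can be sketched: estimate the local (non-stationary) variance with a moving window and remove a bias modeled, purely as a placeholder, as linear in that variance.

```python
# Sketch of a noise-dependent bias correction: estimate local variance with a
# moving window, then remove a bias modeled (as a placeholder assumption) as
# linear in the local variance. The paper's actual form is not reproduced.
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(img, size=7):
    """Moving-window variance via E[x^2] - E[x]^2."""
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img * img, size)
    return np.maximum(mean_sq - mean * mean, 0.0)

rng = np.random.default_rng(4)
truth = np.zeros((128, 128)) - 50.0          # homogeneous phantom, HU
sigma = np.linspace(5, 40, 128)[None, :]     # non-stationary noise level
noisy = truth + rng.normal(size=truth.shape) * sigma

k = 0.002                                    # placeholder bias-per-variance slope
biased = noisy + k * sigma**2                # synthetic noise-dependent bias

var_hat = local_variance(biased)             # estimated local variance
corrected = biased - k * var_hat             # autocalibration-style compensation
print(abs(biased.mean() - truth.mean()), abs(corrected.mean() - truth.mean()))
```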
ERIC Educational Resources Information Center
Brophy, Jere, Ed.; Pinnegar, Stefinee, Ed.
2005-01-01
This volume is designed to accomplish three primary purposes: (1) illustrate a variety of qualitative methods that researchers have used to study teaching and teacher education; (2) assess the affordances and constraints of these methods and the ways that they focus and shape explorations of teaching; and (3) illuminate representative questions…
ERIC Educational Resources Information Center
Kridel, Craig, Ed.
This collection examines many influences of biographical inquiry in education and discusses methodological issues from the perspectives of veteran and novice biographers. The section on qualitative research and educational biography contains the following chapters: "Musings on Life Writing: Biography and Case Studies in Teacher Education" (Robert…
DOT National Transportation Integrated Search
1975-01-01
The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...
DOT National Transportation Integrated Search
1975-01-01
The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...
ERIC Educational Resources Information Center
Henderson, Harold L.; And Others
Surveys of 188 transit properties and on-site visits were conducted to determine training needs of operators and mechanics in the urban mass transportation industry. Volume I presents findings and conclusions of the study with reference to survey methodology, site visit interviews and observations, questionnaire results, and specific…
DOT National Transportation Integrated Search
1975-01-01
The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...
DOT National Transportation Integrated Search
1975-01-01
The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...
An Introduction to Advertising Research; A Report from the Communications Research Center.
ERIC Educational Resources Information Center
Haskins, Jack B.
The purpose of this volume is to present, in nontechnical language, most of the basic concepts of advertising research. Since the volume is intended to be comprehensible to the lay person, discussion does not go too deeply into the technical details of advertising or research methodology. However, used as an introduction and outline to be…
ERIC Educational Resources Information Center
Dunn, William N.; And Others
This volume presents in one collection a systematic inventory of research and analytic procedures appropriate for generating information on knowledge production, diffusion, and utilization, gathered by the University of Pittsburgh Program for the Study of Knowledge Use. The main concern is with those procedures that focus on the utilization of…
A Compatible Stem Taper-Volume-Weight System For Intensively Managed Fast Growing Loblolly Pine
Yugia Zhang; Bruce E. Borders; Robert L Bailey
2002-01-01
Geometry-oriented methodology yielded a compatible taper-volume-weight system of models whose parameters were estimated using data from intensively managed loblolly pine (Pinus taeda L.) plantations in the lower coastal plain of Georgia. Data analysis showed that fertilization has significantly reduced taper (inside and outside bark) on the upper...
DOT National Transportation Integrated Search
1975-01-01
The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...
Methodological Issues in Measuring the Development of Character
ERIC Educational Resources Information Center
Card, Noel A.
2017-01-01
In this article I provide an overview of the methodological issues involved in measuring constructs relevant to character development and education. I begin with a nontechnical overview of the 3 fundamental psychometric properties of measurement: reliability, validity, and equivalence. Developing and evaluating measures to ensure evidence of all 3…
Complicating Methodological Transparency
ERIC Educational Resources Information Center
Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.
2016-01-01
A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…
A Review of Traditional Cloze Testing Methodology.
ERIC Educational Resources Information Center
Heerman, Charles E.
To analyze the validity of W. L. Taylor's cloze testing methodology, this paper first examines three areas contributing to Taylor's thinking: communications theory, the psychology of speech and communication, and the theory of dispositional mechanisms--or nonessential words--in speech. It then evaluates Taylor's research to determine how he…
Activities for Engaging Schools in Health Promotion
ERIC Educational Resources Information Center
Bardi, Mohammad; Burbank, Andrea; Choi, Wayne; Chow, Lawrence; Jang, Wesley; Roccamatisi, Dawn; Timberley-Berg, Tonia; Sanghera, Mandeep; Zhang, Margaret; Macnab, Andrew J.
2014-01-01
Purpose: The purpose of this paper is to describe activities used to initiate health promotion in the school setting. Design/Methodology/Approach: Description of successful pilot Health Promoting School (HPS) initiatives in Canada and Uganda and the validated measures central to each program. Evaluation methodologies: quantitative data from the…