Science.gov

Sample records for accuracy assessment procedures

  1. Landsat classification accuracy assessment procedures

    USGS Publications Warehouse

    Mead, R. R.; Szajgin, John

    1982-01-01

A working conference on Landsat classification accuracy assessment procedures was held in Sioux Falls, South Dakota, 12-14 November 1980. Thirteen formal presentations were made on three general topics: (1) sampling procedures, (2) statistical analysis techniques, and (3) examples of projects which included accuracy assessment and the associated costs, logistical problems, and value of the accuracy data to the remote sensing specialist and the resource manager. Nearly twenty conference attendees participated in two discussion sessions addressing various issues associated with accuracy assessment. This paper presents an account of the accomplishments of the conference.

  2. Airborne Topographic Mapper Calibration Procedures and Accuracy Assessment

    NASA Technical Reports Server (NTRS)

    Martin, Chreston F.; Krabill, William B.; Manizade, Serdar S.; Russell, Rob L.; Sonntag, John G.; Swift, Robert N.; Yungel, James K.

    2012-01-01

Description of NASA Airborne Topographic Mapper (ATM) lidar calibration procedures, including analysis of the accuracy and consistency of various ATM instrument parameters and the resulting influence on topographic elevation measurements. The ATM elevation measurements from a nominal operating altitude of 500 to 750 m above the ice surface were found to have a horizontal accuracy of 74 cm, horizontal precision of 14 cm, vertical accuracy of 6.6 cm, and vertical precision of 3 cm.
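
The distinction between the accuracy and precision figures quoted above (e.g. vertical accuracy 6.6 cm vs. vertical precision 3 cm) can be illustrated with a toy computation. This is a generic sketch, not the ATM calibration procedure itself, and the elevation values are hypothetical:

```python
import statistics as st

def accuracy_precision(measured, truth):
    """Accuracy here = magnitude of the mean offset from the known truth;
    precision = standard deviation (spread) of the repeated measurements."""
    errors = [m - truth for m in measured]
    return abs(st.mean(errors)), st.stdev(errors)

# hypothetical repeated lidar elevations (cm) over a surface of known height 0
elevations = [5.1, 7.2, 6.0, 6.8, 7.9, 5.5]
acc, prec = accuracy_precision(elevations, 0.0)
print(round(acc, 2), round(prec, 2))  # systematic bias vs. scatter
```

A sensor can thus be precise (small scatter) while carrying a large systematic bias, which is why both numbers are reported separately.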

  3. Procedural Documentation and Accuracy Assessment of Bathymetric Maps and Area/Capacity Tables for Small Reservoirs

    USGS Publications Warehouse

    Wilson, Gary L.; Richards, Joseph M.

    2006-01-01

Because of the increasing use and importance of lakes for water supply to communities, a repeatable and reliable procedure to determine lake bathymetry and capacity is needed. A method to determine the accuracy of the procedure will help ensure proper collection and use of the data and resulting products. It is important to clearly define the intended products and desired accuracy before conducting the bathymetric survey to ensure proper data collection. A survey-grade echo sounder and differential global positioning system receivers were used to collect water-depth and position data in December 2003 at Sugar Creek Lake near Moberly, Missouri. Data were collected along planned transects, with an additional set of quality-assurance data collected for use in accuracy computations. All collected data were imported into a geographic information system database. A bathymetric surface model, contour map, and area/capacity tables were created from the geographic information system database. An accuracy assessment was completed on the collected data, bathymetric surface model, area/capacity table, and contour map products. Using established vertical accuracy standards, the accuracy of the collected data, bathymetric surface model, and contour map product was 0.67 foot, 0.91 foot, and 1.51 feet at the 95 percent confidence level. By comparing results from different transect intervals with the quality-assurance transect data, it was determined that a transect interval of 1 percent of the longitudinal length of Sugar Creek Lake produced nearly as good results as a 0.5 percent transect interval for the bathymetric surface model, area/capacity table, and contour map products.
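
Vertical accuracy "at the 95 percent confidence level" as quoted above is conventionally obtained by scaling the RMSE of check-point residuals by 1.9600 (the NSSDA vertical accuracy statistic). A minimal sketch with hypothetical residuals, not the survey's actual data:

```python
import math

def vertical_accuracy_95(residuals):
    """NSSDA-style vertical accuracy: 1.9600 * RMSE of check-point residuals."""
    rmse = math.sqrt(sum(r * r for r in residuals) / len(residuals))
    return 1.9600 * rmse

# hypothetical residuals (feet) between quality-assurance depths and the model
residuals = [0.2, -0.4, 0.1, 0.5, -0.3, 0.2]
print(round(vertical_accuracy_95(residuals), 2))
```

The same statistic can be computed separately for the raw data, the surface model, and the contour map, which is how a single survey yields the three accuracy figures reported.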

  4. An Automated Grass-Based Procedure to Assess the Geometrical Accuracy of the Openstreetmap Paris Road Network

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.

    2016-06-01

OpenStreetMap (OSM) is the largest spatial database of the world. One of the most frequently occurring geospatial elements within this database is the road network, whose quality is crucial for applications such as routing and navigation. Several methods have been proposed for the assessment of OSM road network quality; however, they are often tightly coupled to the characteristics of the authoritative dataset involved in the comparison, which makes them hard to replicate and extend. This study relies on an automated procedure which was recently developed for comparing OSM with any road network dataset. It is based on three Python modules for the open source GRASS GIS software and provides measures of OSM road network spatial accuracy and completeness. Users familiar with the authoritative dataset involved can adjust the values of the parameters thanks to the flexibility of the procedure. The method is applied to assess the quality of the Paris OSM road network dataset through a comparison against the French official dataset provided by the French National Institute of Geographic and Forest Information (IGN). The results show that the Paris OSM road network has both high completeness and high spatial accuracy. It has a greater length than the IGN road network, and is found to be suitable for applications requiring spatial accuracies up to 5-6 m. Also, the results confirm the flexibility of the procedure for supporting users in carrying out their own comparisons between OSM and reference road datasets.

  5. GEOSPATIAL DATA ACCURACY ASSESSMENT

    EPA Science Inventory

The development of robust accuracy assessment methods for the validation of spatial data represents a difficult scientific challenge for the geospatial science community. The importance and timeliness of this issue is related directly to the dramatic escalation in the developmen...

  6. Ground Truth Sampling and LANDSAT Accuracy Assessment

    NASA Technical Reports Server (NTRS)

    Robinson, J. W.; Gunther, F. J.; Campbell, W. J.

    1982-01-01

It is noted that the key factor in any accuracy assessment of remote sensing data is the method used for determining the ground truth, independent of the remote sensing data itself. The sampling and accuracy procedures developed for a nuclear power plant siting study are described. The purpose of the sampling procedure was to provide data for developing supervised classifications for two study sites and for assessing the accuracy of that classification and of the other procedures used. The purpose of the accuracy assessment was to allow the comparison of the cost and accuracy of various classification procedures as applied to various data types.

  7. Accuracy of remotely sensed data: Sampling and analysis procedures

    NASA Technical Reports Server (NTRS)

    Congalton, R. G.; Oderwald, R. G.; Mead, R. A.

    1982-01-01

A review and update of the discrete multivariate analysis techniques used for accuracy assessment is given, along with a listing of the computer program written to implement them. New work on evaluating accuracy assessment using Monte Carlo simulation with different sampling schemes is presented, together with error matrices from the mapping effort of the San Juan National Forest. A method for estimating the sample size requirements for implementing the accuracy assessment procedures is described, and a method is proposed for determining the reliability of change detection between two maps of the same area produced at different times.
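
The discrete multivariate techniques referenced here center on the error (confusion) matrix and the chance-corrected KHAT (kappa) agreement statistic. A self-contained sketch with a hypothetical 3-class error matrix:

```python
def kappa(matrix):
    """KHAT statistic: chance-corrected agreement from a square error matrix
    (rows = classified data, columns = reference data)."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    po = sum(matrix[i][i] for i in range(k)) / n              # observed agreement
    row_tot = [sum(row) for row in matrix]
    col_tot = [sum(matrix[i][j] for i in range(k)) for j in range(k)]
    pe = sum(r * c for r, c in zip(row_tot, col_tot)) / n**2  # chance agreement
    return (po - pe) / (1 - pe)

# hypothetical error matrix for three cover classes
m = [[65, 4, 22],
     [6, 81, 5],
     [0, 11, 85]]
print(round(kappa(m), 3))
```

Unlike raw percent agreement, KHAT discounts the agreement expected by chance alone, which is why it is preferred for comparing classifications with different class distributions.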

  8. Indirect assessment of stifle angle for improved accuracy of preoperative planning of tibial osteotomy procedures in dogs.

    PubMed

    Barnes, D C; Owen, M R

    2015-07-25

To assess reliability of the mechanical axis stifle angle in dogs positioned for radiography with a neutral stifle (neutral stifle angle (nSA)), and to investigate radiographic landmarks for assessment of nSA from a collimated radiographic view. One hundred radiographs were taken of normal stifles belonging to 55 skeletally mature medium and large breed dogs, positioned using a repeatable protocol. Radiographs were widely collimated to include the femoral head and the talus. The angle of Blumensaat's line through the intercondylar fossa relative to the mechanical axis of the femur (intercondylar fossa angle, IFA), the angle of inclination of a tibial crest tangent line relative to the mechanical axis of the tibia (tibial crest angle, TCA) and the tibial plateau angle (TPA) were recorded. Mean nSA was 133.5°. Mean IFA was 155.5°. TCA had a mean of 6.7°. Estimates for nSA were calculated using mean IFA combined with mean TCA (enSA1), mean TPA (enSA2) and the mechanical axis of the tibia (enSA3). Mean relative percentage error was 2.99 per cent for enSA1, 2.82 per cent for enSA2, and 1.67 per cent for enSA3. Blumensaat's line provides a consistent radiological feature for assessment of nSA. Assessment of nSA and correction for values varying from 135° may allow more consistent and accurate measurement of patellar tendon angle for presurgical planning.

  9. Improving Accuracy of Assessment Procedures.

    ERIC Educational Resources Information Center

    Nash, A. H.

This paper reviews the grading practices of various departments in the Western Australian Institute of Technology. The study was initiated in 1969, when an examination of scores given by various departments revealed a large year-to-year fluctuation. It was noted that some departments consistently graded higher than others. A…

  10. When Does Choice of Accuracy Measure Alter Imputation Accuracy Assessments?

    PubMed

    Ramnarine, Shelina; Zhang, Juan; Chen, Li-Shiun; Culverhouse, Robert; Duan, Weimin; Hancock, Dana B; Hartz, Sarah M; Johnson, Eric O; Olfson, Emily; Schwantes-An, Tae-Hwi; Saccone, Nancy L

    2015-01-01

Imputation, the process of inferring genotypes for untyped variants, is used to identify and refine genetic association findings. Inaccuracies in imputed data can distort the observed association between variants and a disease. Many statistics are used to assess accuracy; some compare imputed to genotyped data and others are calculated without reference to true genotypes. Prior work has shown that the Imputation Quality Score (IQS), which is based on Cohen's kappa statistic and compares imputed genotype probabilities to true genotypes, appropriately adjusts for chance agreement; however, it is not commonly used. To identify differences in accuracy assessment, we compared IQS with concordance rate, squared correlation, and accuracy measures built into imputation programs. Genotypes from the 1000 Genomes reference populations (AFR N = 246 and EUR N = 379) were masked to match the typed single nucleotide polymorphism (SNP) coverage of several SNP arrays and were imputed with BEAGLE 3.3.2 and IMPUTE2 in regions associated with smoking behaviors. Additional masking and imputation were conducted for sequenced subjects from the Collaborative Genetic Study of Nicotine Dependence and the Genetic Study of Nicotine Dependence in African Americans (N = 1,481 African Americans and N = 1,480 European Americans). Our results offer further evidence that concordance rate inflates accuracy estimates, particularly for rare and low frequency variants. For common variants, squared correlation, BEAGLE R2, IMPUTE2 INFO, and IQS produce similar assessments of imputation accuracy. However, for rare and low frequency variants, compared to IQS, the other statistics tend to be more liberal in their assessment of accuracy. IQS is important to consider when evaluating imputation accuracy, particularly for rare and low frequency variants. PMID:26458263
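
The abstract's central point, that concordance rate inflates accuracy for rare variants while a kappa-based statistic such as IQS corrects for chance agreement, can be reproduced with a toy example. The genotypes below are hypothetical, and plain Cohen's kappa stands in for the full IQS calculation:

```python
def concordance(true_g, imp_g):
    """Fraction of genotype calls that simply agree."""
    return sum(t == i for t, i in zip(true_g, imp_g)) / len(true_g)

def cohens_kappa(true_g, imp_g):
    """Chance-corrected agreement over genotype classes (the basis of IQS)."""
    n = len(true_g)
    po = concordance(true_g, imp_g)
    pe = sum((true_g.count(c) / n) * (imp_g.count(c) / n)
             for c in set(true_g) | set(imp_g))
    return (po - pe) / (1 - pe)

# rare variant: 2 carriers among 100 subjects; imputation calls everyone
# homozygous reference (genotype 0)
true_g = [0] * 98 + [1] * 2
imp_g = [0] * 100
print(concordance(true_g, imp_g))   # high: looks accurate
print(cohens_kappa(true_g, imp_g))  # zero: no better than chance
```

An imputation that never calls the minor allele still scores 98 percent concordance here, while kappa correctly reports zero information, illustrating why concordance is liberal for rare variants.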

  11. Orbit accuracy assessment for Seasat

    NASA Technical Reports Server (NTRS)

    Schutz, B. E.; Tapley, B. D.

    1980-01-01

    Laser range measurements are used to determine the orbit of Seasat during the period from July 28, 1978, to Aug. 14, 1978, and the influence of the gravity field, atmospheric drag, and solar radiation pressure on the orbit accuracy is investigated. It is noted that for the orbits of three-day duration, little distinction can be made between the influence of different atmospheric models. It is found that the special Seasat gravity field PGS-S3 is most consistent with the data for three-day orbits, but an unmodeled systematic effect in radiation pressure is noted. For orbits of 18-day duration, little distinction can be made between the results derived from the PGS gravity fields. It is also found that the geomagnetic field is an influential factor in the atmospheric modeling during this time period. Seasat altimeter measurements are used to determine the accuracy of the altimeter measurement time tag and to evaluate the orbital accuracy.

  12. Accuracy assessment of GPS satellite orbits

    NASA Technical Reports Server (NTRS)

    Schutz, B. E.; Tapley, B. D.; Abusali, P. A. M.; Ho, C. S.

    1991-01-01

GPS orbit accuracy is examined using several evaluation procedures. Unmodeled effects that correlate with the eclipsing of the sun are shown to exist. The ability to obtain geodetic results with an accuracy of 1-2 parts in 10^8 or better has not diminished.

  13. Skinfold Assessment: Accuracy and Application

    ERIC Educational Resources Information Center

    Ball, Stephen; Swan, Pamela D.; Altena, Thomas S.

    2006-01-01

    Although not perfect, skinfolds (SK), or the measurement of fat under the skin, remains the most popular and practical method available to assess body composition on a large scale (Kuczmarski, Flegal, Campbell, & Johnson, 1994). Even for practitioners who have been using SK for years and are highly proficient at locating the correct anatomical…

  14. Estimating Classification Consistency and Accuracy for Cognitive Diagnostic Assessment

    ERIC Educational Resources Information Center

    Cui, Ying; Gierl, Mark J.; Chang, Hua-Hua

    2012-01-01

    This article introduces procedures for the computation and asymptotic statistical inference for classification consistency and accuracy indices specifically designed for cognitive diagnostic assessments. The new classification indices can be used as important indicators of the reliability and validity of classification results produced by…

  15. Arizona Vegetation Resource Inventory (AVRI) accuracy assessment

    USGS Publications Warehouse

    Szajgin, John; Pettinger, L.R.; Linden, D.S.; Ohlen, D.O.

    1982-01-01

A quantitative accuracy assessment was performed for the vegetation classification map produced as part of the Arizona Vegetation Resource Inventory (AVRI) project. This project was a cooperative effort between the Bureau of Land Management (BLM) and the Earth Resources Observation Systems (EROS) Data Center. The objective of the accuracy assessment was to estimate (with a precision of ±10 percent at the 90 percent confidence level) the commission error in each of the eight level II hierarchical vegetation cover types. A stratified two-phase (double) cluster sample was used. Phase I consisted of 160 photointerpreted plots representing clusters of Landsat pixels, and phase II consisted of ground data collection at 80 of the phase I cluster sites. Ground data were used to refine the phase I error estimates by means of a linear regression model. The classified image was stratified by assigning each 15-pixel cluster to the stratum corresponding to the dominant cover type within each cluster. This method is known as stratified plurality sampling. Overall error was estimated to be 36 percent with a standard error of 2 percent. Estimated error for individual vegetation classes ranged from a low of 10 percent ±6 percent for evergreen woodland to 81 percent ±7 percent for cropland and pasture. Total cost of the accuracy assessment was $106,950 for the one-million-hectare study area. The combination of the stratified plurality sampling (SPS) method of sample allocation with double sampling provided the desired estimates within the required precision levels. The overall accuracy results confirmed that highly accurate digital classification of vegetation is difficult to perform in semiarid environments, due largely to the sparse vegetation cover. Nevertheless, these techniques show promise for providing more accurate information than is presently available for many BLM-administered lands.

  16. Positional Accuracy Assessment of Googleearth in Riyadh

    NASA Astrophysics Data System (ADS)

    Farah, Ashraf; Algarni, Dafer

    2014-06-01

Google Earth is a virtual globe, map and geographical information program provided by Google. It maps the Earth by superimposing images obtained from satellite imagery, aerial photography and geographic information system (GIS) data onto a 3D globe. With millions of users all around the globe, GoogleEarth® has become the ultimate source of spatial data and information for private and public decision-support systems besides many types and forms of social interactions. Many users, mostly in developing countries, are also using it for surveying applications, which raises questions about the positional accuracy of the Google Earth program. This research presents a small-scale assessment study of the positional accuracy of GoogleEarth® Imagery in Riyadh, capital of the Kingdom of Saudi Arabia (KSA). The results show that the RMSE of the GoogleEarth imagery is 2.18 m and 1.51 m for the horizontal and height coordinates respectively.
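
An RMSE figure like the 2.18 m horizontal value above is typically computed from the differences between GPS-surveyed check points and the corresponding coordinates read from the imagery. A sketch with hypothetical coordinates, not the study's actual check points:

```python
import math

def horizontal_rmse(ref_pts, test_pts):
    """RMSE of horizontal (easting, northing) discrepancies, in metres."""
    sq = [(xr - xt) ** 2 + (yr - yt) ** 2
          for (xr, yr), (xt, yt) in zip(ref_pts, test_pts)]
    return math.sqrt(sum(sq) / len(sq))

# hypothetical check points: GPS reference vs. Google Earth readings (metres)
ref = [(1000.0, 2000.0), (1500.0, 2500.0), (1800.0, 2100.0)]
ge  = [(1001.2, 1999.0), (1498.9, 2501.5), (1801.8, 2099.2)]
print(round(horizontal_rmse(ref, ge), 2))
```

The vertical RMSE is computed the same way from height differences alone, which is why horizontal and height accuracies are reported as separate numbers.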

  17. Environmental Impact Assessment: A Procedure.

    ERIC Educational Resources Information Center

    Stover, Lloyd V.

    Prepared by a firm of consulting engineers, this booklet outlines the procedural "whys and hows" of assessing environmental impact, particularly for the construction industry. Section I explores the need for environmental assessment and evaluation to determine environmental impact. It utilizes a review of the National Environmental Policy Act and…

  18. Accuracy Assessment of Altimeter Derived Geostrophic Velocities

    NASA Astrophysics Data System (ADS)

    Leben, R. R.; Powell, B. S.; Born, G. H.; Guinasso, N. L.

    2002-12-01

Along track sea surface height anomaly gradients are proportional to cross track geostrophic velocity anomalies, allowing satellite altimetry to provide much-needed observations of changes in the geostrophic component of surface ocean currents. Often, surface height gradients are computed from altimeter data archives that have been corrected to give the most accurate absolute sea level, a practice that may unnecessarily increase the error in the cross track velocity anomalies and thereby require excessive smoothing to mitigate noise. Because differentiation along track acts as a high-pass filter, many of the path length corrections applied to altimeter data for absolute height accuracy are unnecessary for the corresponding gradient calculations. We report on a study to investigate appropriate altimetric corrections and processing techniques for improving geostrophic velocity accuracy. Accuracy is assessed by comparing cross track current measurements from two moorings placed along the descending TOPEX/POSEIDON ground track number 52 in the Gulf of Mexico to the corresponding altimeter velocity estimates. The buoys are deployed and maintained by the Texas Automated Buoy System (TABS) under Interagency Contracts with Texas A&M University. The buoys telemeter observations in near real-time via satellite to the TABS station located at the Geochemical and Environmental Research Group (GERG) at Texas A&M. Buoy M is located in shelf waters of 57 m depth with a second, Buoy N, 38 km away on the shelf break at 105 m depth. Buoy N has been operational since the beginning of 2002 and has a current meter at 2 m depth providing in situ measurements of surface velocities coincident with Jason and TOPEX/POSEIDON altimeter overflights. This allows one of the first detailed comparisons of shallow water near surface current meter time series to coincident altimetry.

  19. Accuracy assessment of fluoroscopy-transesophageal echocardiography registration

    NASA Astrophysics Data System (ADS)

    Lang, Pencilla; Seslija, Petar; Bainbridge, Daniel; Guiraudon, Gerard M.; Jones, Doug L.; Chu, Michael W.; Holdsworth, David W.; Peters, Terry M.

    2011-03-01

This study assesses the accuracy of a new transesophageal (TEE) ultrasound (US) fluoroscopy registration technique designed to guide percutaneous aortic valve replacement. In this minimally invasive procedure, a valve is inserted into the aortic annulus via a catheter. Navigation and positioning of the valve is guided primarily by intra-operative fluoroscopy. Poor anatomical visualization of the aortic root region can result in incorrect positioning, leading to heart valve embolization, obstruction of the coronary ostia and acute kidney injury. The use of TEE US images to augment intra-operative fluoroscopy provides significant improvements to image-guidance. Registration is achieved using an image-based TEE probe tracking technique and US calibration. TEE probe tracking is accomplished using a single-perspective pose estimation algorithm. Pose estimation from a single image allows registration to be achieved using only images collected in standard OR workflow. Accuracy of this registration technique is assessed using three models: a point target phantom, a cadaveric porcine heart with implanted fiducials, and in-vivo porcine images. Results demonstrate that registration can be achieved with an RMS error of less than 1.5 mm, which is within the clinical accuracy requirement of 5 mm. US-fluoroscopy registration based on single-perspective pose estimation demonstrates promise as a method for providing guidance to percutaneous aortic valve replacement procedures. Future work will focus on real-time implementation and a visualization system that can be used in the operating room.

  20. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. Often VSA is regarded as subjective, so there is a need to verify VSA. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA, while taking into account soil type. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. Quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, of which half were significant.
For the reproducibility study, a group of 9 soil scientists and 7

  1. Test procedures help ensure accuracy of orifice meters

    SciTech Connect

    Fillman, C.R.

    1996-07-01

    Orifice meter measurement with a chart recorder has been a standard in the petroleum industry for years. The meter consists of the plate/tube and recorder, requires minimal maintenance and can accurately measure a wide range of flow rates. It must be routinely tested to ensure sustained accuracy. The orifice meter measures differential pressure, static pressure, and temperature. However, the accuracy of the measurement is only as good as the calibration devices used in the test. A typical meter test consists of meter calibration, orifice plate inspection, quality of gas tests, and documentation (test report) to verify the data. The paper describes 19 steps that a gas technician can follow to conduct a thorough meter test.

  2. 15 CFR 971.1001 - Assessment procedure.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS Enforcement § 971.1001 Assessment procedure. Subpart B of 15 CFR part 904 governs the procedures for assessing a civil penalty...

  3. 15 CFR 971.1001 - Assessment procedure.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS Enforcement § 971.1001 Assessment procedure. Subpart B of 15 CFR part 904 governs the procedures for assessing a civil penalty...

  4. 15 CFR 971.1001 - Assessment procedure.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS Enforcement § 971.1001 Assessment procedure. Subpart B of 15 CFR part 904 governs the procedures for assessing a civil penalty...

  5. 15 CFR 971.1001 - Assessment procedure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS Enforcement § 971.1001 Assessment procedure. Subpart B of 15 CFR part 904 governs the procedures for assessing a civil penalty...

  6. 15 CFR 971.1001 - Assessment procedure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS Enforcement § 971.1001 Assessment procedure. Subpart B of 15 CFR part 904 governs the procedures for assessing a civil penalty...

  7. Accuracy Assessment for AG500, Electromagnetic Articulograph

    ERIC Educational Resources Information Center

    Yunusova, Yana; Green, Jordan R.; Mefferd, Antje

    2009-01-01

    Purpose: The goal of this article was to evaluate the accuracy and reliability of the AG500 (Carstens Medizinelectronik, Lenglern, Germany), an electromagnetic device developed recently to register articulatory movements in three dimensions. This technology seems to have unprecedented capabilities to provide rich information about time-varying…

  8. Accuracy Of Stereometry In Assessing Orthognathic Surgery

    NASA Astrophysics Data System (ADS)

    King, Geoffrey E.; Bays, R. A.

    1983-07-01

An X-ray stereometric technique has been developed for the determination of 3-dimensional coordinates of spherical metallic markers previously implanted in monkey skulls. The accuracy of the technique is better than 0.5 mm, and it uses readily available demountable X-ray equipment. The technique is used to study the effects and stability of experimental orthognathic surgery.

  9. Assessment Of Accuracies Of Remote-Sensing Maps

    NASA Technical Reports Server (NTRS)

    Card, Don H.; Strong, Laurence L.

    1992-01-01

    Report describes study of accuracies of classifications of picture elements in map derived by digital processing of Landsat-multispectral-scanner imagery of coastal plain of Arctic National Wildlife Refuge. Accuracies of portions of map analyzed with help of statistical sampling procedure called "stratified plurality sampling", in which all picture elements in given cluster classified in stratum to which plurality of them belong.

  10. Multimodality Image Fusion-Guided Procedures: Technique, Accuracy, and Applications

    SciTech Connect

Abi-Jaoudeh, Nadine; Kruecker, Jochen; Kadoury, Samuel; Kobeiter, Hicham; Venkatesan, Aradhana M.; Levy, Elliot; Wood, Bradford J.

    2012-10-15

    Personalized therapies play an increasingly critical role in cancer care: Image guidance with multimodality image fusion facilitates the targeting of specific tissue for tissue characterization and plays a role in drug discovery and optimization of tailored therapies. Positron-emission tomography (PET), magnetic resonance imaging (MRI), and contrast-enhanced computed tomography (CT) may offer additional information not otherwise available to the operator during minimally invasive image-guided procedures, such as biopsy and ablation. With use of multimodality image fusion for image-guided interventions, navigation with advanced modalities does not require the physical presence of the PET, MRI, or CT imaging system. Several commercially available methods of image-fusion and device navigation are reviewed along with an explanation of common tracking hardware and software. An overview of current clinical applications for multimodality navigation is provided.

  11. Accuracy assessment of NLCD 2006 land cover and impervious surface

    USGS Publications Warehouse

    Wickham, James D.; Stehman, Stephen V.; Gass, Leila; Dewitz, Jon; Fry, Joyce A.; Wade, Timothy G.

    2013-01-01

    Release of NLCD 2006 provides the first wall-to-wall land-cover change database for the conterminous United States from Landsat Thematic Mapper (TM) data. Accuracy assessment of NLCD 2006 focused on four primary products: 2001 land cover, 2006 land cover, land-cover change between 2001 and 2006, and impervious surface change between 2001 and 2006. The accuracy assessment was conducted by selecting a stratified random sample of pixels with the reference classification interpreted from multi-temporal high resolution digital imagery. The NLCD Level II (16 classes) overall accuracies for the 2001 and 2006 land cover were 79% and 78%, respectively, with Level II user's accuracies exceeding 80% for water, high density urban, all upland forest classes, shrubland, and cropland for both dates. Level I (8 classes) accuracies were 85% for NLCD 2001 and 84% for NLCD 2006. The high overall and user's accuracies for the individual dates translated into high user's accuracies for the 2001–2006 change reporting themes water gain and loss, forest loss, urban gain, and the no-change reporting themes for water, urban, forest, and agriculture. The main factor limiting higher accuracies for the change reporting themes appeared to be difficulty in distinguishing the context of grass. We discuss the need for more research on land-cover change accuracy assessment.
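
Overall, user's, and producer's accuracies quoted above are all read off the same error matrix (rows = map classes, columns = reference classes). A minimal sketch with a hypothetical 3-class matrix, not NLCD data:

```python
def accuracy_measures(matrix):
    """Overall accuracy, user's accuracy per map class (row-wise),
    and producer's accuracy per reference class (column-wise)."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    overall = sum(matrix[i][i] for i in range(k)) / n
    users = [matrix[i][i] / sum(matrix[i]) for i in range(k)]
    producers = [matrix[i][i] / sum(matrix[r][i] for r in range(k))
                 for i in range(k)]
    return overall, users, producers

# hypothetical error matrix for three cover classes (e.g. water, urban, forest)
m = [[45, 3, 2],
     [4, 38, 6],
     [1, 4, 47]]
overall, users, producers = accuracy_measures(m)
print(round(overall, 3))
```

User's accuracy answers "if the map says urban, how often is it really urban?" (commission error), while producer's accuracy answers "of the true urban pixels, how many did the map catch?" (omission error).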

  12. Alaska national hydrography dataset positional accuracy assessment study

    USGS Publications Warehouse

    Arundel, Samantha; Yamamoto, Kristina H.; Constance, Eric; Mantey, Kim; Vinyard-Houx, Jeremy

    2013-01-01

Initial visual assessments showed a wide range in the quality of fit between features in the NHD and these new image sources, but no statistical analysis has been performed to actually quantify accuracy. Determining absolute accuracy is cost prohibitive (independent, well-defined test points must be collected), whereas quantitative analysis of relative positional error is feasible.

  13. Contemporary flow meters: an assessment of their accuracy and reliability.

    PubMed

    Christmas, T J; Chapple, C R; Rickards, D; Milroy, E J; Turner-Warwick, R T

    1989-05-01

The accuracy, reliability and cost effectiveness of 5 currently marketed flow meters have been assessed. The mechanics of each meter are briefly described in relation to its accuracy and robustness. The merits and faults of the meters are discussed, and the important features of flow measurements that need to be taken into account when making diagnostic interpretations are emphasised.

  14. Assessment of the Thematic Accuracy of Land Cover Maps

    NASA Astrophysics Data System (ADS)

    Höhle, J.

    2015-08-01

    Several land cover maps are generated from aerial imagery and assessed by different approaches. The test site is an urban area in Europe for which six classes (`building', `hedge and bush', `grass', `road and parking lot', `tree', `wall and car port') had to be derived. Two classification methods were applied (`Decision Tree' and `Support Vector Machine') using only two attributes (height above ground and normalized difference vegetation index), both of which are derived from the images. The assessment of the thematic accuracy applied a stratified design and was based on accuracy measures such as user's and producer's accuracy, and kappa coefficient. In addition, confidence intervals were computed for several accuracy measures. The achieved accuracies and confidence intervals are thoroughly analysed and recommendations are derived from the gained experiences. Reliable reference values are obtained using stereovision, false-colour image pairs, and positioning to the checkpoints with 3D coordinates. The influence of the training areas on the results is studied. Cross validation has been tested with a few reference points in order to derive approximate accuracy measures. The two classification methods perform equally for five classes. Trees are classified with a much better accuracy and a smaller confidence interval by means of the decision tree method. Buildings are classified by both methods with an accuracy of 99% (95% CI: 95%-100%) using independent 3D checkpoints. The average width of the confidence interval of six classes was 14% of the user's accuracy.
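The kappa coefficient and per-class confidence intervals used in assessments like this one can be sketched as follows. The error matrix counts are invented, and the Wilson score interval is one common interval choice, not necessarily the exact method of the paper:

```python
import math
import numpy as np

# Hypothetical two-class error matrix (rows = mapped, columns = reference).
m = np.array([[90.0, 5.0],
              [10.0, 95.0]])
n = m.sum()

po = np.trace(m) / n                                 # observed agreement
pe = (m.sum(axis=1) * m.sum(axis=0)).sum() / n**2    # chance agreement
kappa = (po - pe) / (1 - pe)

def wilson_interval(successes, trials, z=1.96):
    """Approximate 95% Wilson score interval for a proportion such as a
    per-class user's accuracy."""
    p = successes / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2)) / denom
    return centre - half, centre + half

# User's accuracy of the first mapped class: 90 correct out of 95 mapped.
lo, hi = wilson_interval(90, 95)
```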

  15. Evaluating the Accuracy of Pharmacy Students' Self-Assessment Skills

    PubMed Central

    Gregory, Paul A. M.

    2007-01-01

    Objectives To evaluate the accuracy of self-assessment skills of senior-level bachelor of science pharmacy students. Methods A method proposed by Kruger and Dunning involving comparisons of pharmacy students' self-assessment with weighted average assessments of peers, standardized patients, and pharmacist-instructors was used. Results Eighty students participated in the study. Differences between self-assessment and external assessments were found across all performance quartiles. These differences were particularly large and significant in the third and fourth (lowest) quartiles and particularly marked in the areas of empathy, and logic/focus/coherence of interviewing. Conclusions The quality and accuracy of pharmacy students' self-assessment skills were not as strong as expected, particularly given recent efforts to include self-assessment in the curriculum. Further work is necessary to ensure this important practice competency and life skill is at the level expected for professional practice and continuous professional development. PMID:17998986

  16. The 'obsolescence' of assessment procedures.

    PubMed

    Russell, Elbert W

    2010-01-01

    The concept that obsolescence or being "out of date" makes a test or procedure invalid ("inaccurate," "inappropriate," "not useful," "creating wrong interpretations," etc.) has been widely accepted in psychology and neuropsychology. Such obsolescence, triggered by publishing a new version of a test, has led to the nullification of an extensive body of research (probably 10,000 Wechsler studies). The arguments attempting to justify obsolescence include the Flynn Effect, the creation of a new version of a test, or simply time. However, the Flynn Effect appears to have plateaued. In psychometric theory, validated tests do not lose their validity due to the creation of newer versions. Nor does time invalidate tests through the improvement of neurological methodology, such as magnetic resonance imaging. This assumption is unscientific, unproven and, if true, would discredit all older neuropsychological and neurological knowledge. In science, no method, theory, or information, once validated, loses that validation merely due to time or the creation of another test or procedure. Once validated, a procedure is only disproved or replaced by means of new research. PMID:20146123

  17. Robust methods for assessing the accuracy of linear interpolated DEM

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Shi, Wenzhong; Liu, Eryong

    2015-02-01

    Methods for assessing the accuracy of a digital elevation model (DEM), with emphasis on robust methods, have been studied in this paper. Based on the squared DEM residual population generated by the bi-linear interpolation method, three average-error statistics including (a) mean, (b) median, and (c) M-estimator are thoroughly investigated for measuring the interpolated DEM accuracy. Correspondingly, confidence intervals are also constructed for each average-error statistic to further evaluate the DEM quality. The first method mainly utilizes the Student distribution while the second and third are derived from robust theories. These robust methods possess the capability of counteracting the effects of outliers or even skew-distributed residuals in DEM accuracy assessment. Experimental studies using Monte Carlo simulation have investigated the asymptotic convergence behavior of confidence intervals constructed by these three methods as the sample size increases. It is demonstrated that the robust methods can produce more reliable DEM accuracy assessment results compared with those by the classical t-distribution-based method. Consequently, these proposed robust methods are strongly recommended for assessing DEM accuracy, particularly for those cases where the DEM residual population is evidently non-normal or heavily contaminated with outliers.
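The three average-error statistics can be illustrated on simulated residuals. The Huber M-estimator below, with scale fixed at the normalized MAD, is one common formulation; the tuning constant and the residual data are assumptions for illustration only:

```python
import numpy as np

def huber_m_estimate(x, c=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimator of location via iteratively reweighted averaging,
    with scale fixed at the normalized MAD (a common robust choice)."""
    mu = np.median(x)
    scale = 1.4826 * np.median(np.abs(x - mu))
    if scale == 0:
        scale = 1.0
    for _ in range(max_iter):
        r = np.abs(x - mu) / scale
        w = np.minimum(1.0, c / np.maximum(r, 1e-12))   # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            return mu_new
        mu = mu_new
    return mu

# Simulated DEM residuals (metres) contaminated with a few gross outliers.
rng = np.random.default_rng(0)
residuals = np.concatenate([rng.normal(0.0, 0.5, 50), [8.0, -9.0, 12.0]])

print(residuals.mean())             # sample mean, inflated by the outliers
print(np.median(residuals))         # robust
print(huber_m_estimate(residuals))  # robust, more efficient than the median
```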

  18. ASSESSING ACCURACY OF NET CHANGE DERIVED FROM LAND COVER MAPS

    EPA Science Inventory

    Net change derived from land-cover maps provides important descriptive information for environmental monitoring and is often used as an input or explanatory variable in environmental models. The sampling design and analysis for assessing net change accuracy differ from traditio...

  19. A new procedure to measure children's reading speed and accuracy in Italian.

    PubMed

    Morlini, Isabella; Stella, Giacomo; Scorza, Maristella

    2014-02-01

    Impaired readers in primary school should be recognized early, in order to plan a targeted intervention within the school and to start teaching that respects their difficulties in learning to read, to write and to perform calculations. Screening procedures inside primary schools aimed at detecting children with difficulties in reading are of fundamental importance for guaranteeing an early identification of dyslexic children and reducing both the primary negative effects--on learning--and the secondary negative effects--on the development of the personality--of this disturbance. In this study, we propose a new screening procedure measuring reading speed and accuracy. This procedure is very fast (exactly 1 min long), simple, cheap and can be administered by teachers without technical knowledge. By contrast, most of the currently used diagnostic tests are about 10 min long and must be administered by experts. These two major flaws prevent the widespread use of those tests. On the basis of the results obtained in a survey of about 1500 students attending primary school in Italy, we investigate the relationships between the variables used in the screening procedure and the variables measuring speed and accuracy in the diagnostic tests currently used in Italy. We then analyse the validity of the screening procedure from a statistical point of view and, with an explorative factor analysis, show that reading speed and accuracy seem to be two separate symptoms of the dyslexia phenomenon.

  20. 12 CFR 717.42 - Reasonable policies and procedures concerning the accuracy and integrity of furnished information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Furnishers of Information § 717.42 Reasonable policies and procedures concerning the accuracy and integrity of furnished information. (a) Policies and procedures. Each furnisher must establish and implement reasonable written policies and procedures regarding the accuracy and integrity of the information...

  1. 12 CFR 334.42 - Reasonable policies and procedures concerning the accuracy and integrity of furnished information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Furnishers of Information § 334.42 Reasonable policies and procedures concerning the accuracy and integrity of furnished information. (a) Policies and procedures. Each furnisher must establish and implement reasonable written policies and procedures regarding the accuracy and integrity of the information...

  2. Commissioning Procedures for Mechanical Precision and Accuracy in a Dedicated LINAC

    SciTech Connect

    Ballesteros-Zebadua, P.; Larrga-Gutierrez, J. M.; Garcia-Garduno, O. A.; Juarez, J.; Prieto, I.; Moreno-Jimenez, S.; Celis, M. A.

    2008-08-11

    Mechanical precision measurements are fundamental procedures for the commissioning of a dedicated LINAC. At our Radioneurosurgery Unit, these procedures serve as quality assurance routines that allow verification of the equipment's geometrical accuracy and precision. In this work mechanical tests were performed for gantry and table rotation, obtaining mean associated uncertainties of 0.3 mm and 0.71 mm, respectively. Using an anthropomorphic phantom and a series of localized surface markers, isocenter accuracy was shown to be smaller than 0.86 mm for radiosurgery procedures and 0.95 mm for fractionated treatments with mask. All uncertainties were below tolerances. The highest contribution to mechanical variations is due to table rotation, so it is important to correct these variations using a localization frame with printed overlays. Knowledge of the mechanical precision allows the statistical errors to be taken into account in the treatment planning volume margins.

  3. [Navigation in implantology: Accuracy assessment regarding the literature].

    PubMed

    Barrak, Ibrahim Ádám; Varga, Endre; Piffko, József

    2016-06-01

    Our objective was to assess the literature regarding the accuracy of the different static guided systems. After applying electronic literature search we found 661 articles. After reviewing 139 articles, the authors chose 52 articles for full-text evaluation. 24 studies involved accuracy measurements. Fourteen of our selected references were clinical and ten of them were in vitro (model or cadaver). Variance-analysis (Tukey's post-hoc test; p < 0.05) was conducted to summarize the selected publications. Regarding 2819 results the average mean error at the entry point was 0.98 mm. At the level of the apex the average deviation was 1.29 mm while the mean of the angular deviation was 3.96 degrees. Significant difference could be observed between the two methods of implant placement (partially and fully guided sequence) in terms of deviation at the entry point, apex and angular deviation. Different levels of quality and quantity of evidence were available for assessing the accuracy of the different computer-assisted implant placement. The rapidly evolving field of digital dentistry and the new developments will further improve the accuracy of guided implant placement. In the interest of being able to draw dependable conclusions and for the further evaluation of the parameters used for accuracy measurements, randomized, controlled single or multi-centered clinical trials are necessary. PMID:27544966

  5. Survey methods for assessing land cover map accuracy

    USGS Publications Warehouse

    Nusser, S.M.; Klaas, E.E.

    2003-01-01

    The increasing availability of digital photographic materials has fueled efforts by agencies and organizations to generate land cover maps for states, regions, and the United States as a whole. Regardless of the information sources and classification methods used, land cover maps are subject to numerous sources of error. In order to understand the quality of the information contained in these maps, it is desirable to generate statistically valid estimates of accuracy rates describing misclassification errors. We explored a full sample survey framework for creating accuracy assessment study designs that balance statistical and operational considerations in relation to study objectives for a regional assessment of GAP land cover maps. We focused not only on appropriate sample designs and estimation approaches, but on aspects of the data collection process, such as gaining cooperation of land owners and using pixel clusters as an observation unit. The approach was tested in a pilot study to assess the accuracy of Iowa GAP land cover maps. A stratified two-stage cluster sampling design addressed sample size requirements for land covers and the need for geographic spread while minimizing operational effort. Recruitment methods used for private land owners yielded high response rates, minimizing a source of nonresponse error. Collecting data for a 9-pixel cluster centered on the sampled pixel was simple to implement, and provided better information on rarer vegetation classes as well as substantial gains in precision relative to observing data at a single pixel.
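The idea of drawing a stratified sample of pixels and observing a 9-pixel cluster around each can be sketched roughly as follows. The map, class count, and per-class sample size are hypothetical, and the paper's actual two-stage design is considerably more elaborate:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 100x100 land cover map with integer class labels 0-3.
land_cover = rng.integers(0, 4, size=(100, 100))

def sample_cluster_centres(lc, n_per_class, margin=1):
    """Stratified random sample of centre pixels per mapped class, keeping
    a margin so the 3x3 cluster stays inside the map."""
    interior = lc[margin:-margin, margin:-margin]
    centres = {}
    for cls in np.unique(lc):
        rows, cols = np.nonzero(interior == cls)
        pick = rng.choice(len(rows), size=min(n_per_class, len(rows)), replace=False)
        centres[int(cls)] = [(int(r) + margin, int(c) + margin)
                             for r, c in zip(rows[pick], cols[pick])]
    return centres

def cluster(lc, r, c):
    """The observation unit: the 9-pixel block centred on (r, c)."""
    return lc[r - 1:r + 2, c - 1:c + 2]

centres = sample_cluster_centres(land_cover, n_per_class=5)
block = cluster(land_cover, *centres[0][0])
print(block.shape)   # (3, 3)
```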

  7. Improvement of olfactometric measurement accuracy and repeatability by optimization of panel selection procedures.

    PubMed

    Capelli, L; Sironi, S; Del Rosso, R; Céntola, P; Bonati, S

    2010-01-01

    The EN 13725:2003, which standardizes the determination of odour concentration by dynamic olfactometry, fixes the limits for panel selection in terms of individual threshold towards a reference gas (n-butanol in nitrogen) and of standard deviation of the responses. Nonetheless, laboratories have some degrees of freedom in developing their own procedures for panel selection and evaluation. Most Italian olfactometric laboratories use a similar procedure for panel selection, based on the repeated analysis of samples of n-butanol at a concentration of 60 ppm. The first part of this study demonstrates that this procedure may originate a sort of "smartening" of the assessors, which means that they become able to guess the right answers in order to maintain their qualification as panel members, independently from their real olfactory perception. For this reason, the panel selection procedure has been revised with the aim of making it less repetitive, therefore preventing the possibility for panel members to be able to guess the best answers in order to comply with the selection criteria. The selection of new panel members and the screening of the active ones according to this revised procedure proved this new procedure to be more selective than the "standard" one. Finally, the results of the tests with n-butanol conducted after the introduction of the revised procedure for panel selection and regular verification showed an effective improvement of the laboratory measurement performances in terms of accuracy and precision. PMID:20220249

  8. Classification, change-detection and accuracy assessment: Toward fuller automation

    NASA Astrophysics Data System (ADS)

    Podger, Nancy E.

    This research aims to automate methods for conducting change detection studies using remotely sensed images. Five major objectives were tested on two study sites, one encompassing Madison, Wisconsin, and the other Fort Hood, Texas. (Objective 1) Enhance accuracy assessments by estimating standard errors using bootstrap analysis. Bootstrap estimates of the standard errors were found to be comparable to parametric statistical estimates. Also, results show that bootstrapping can be used to evaluate the consistency of a classification process. (Objective 2) Automate the guided clustering classifier. This research shows that the guided clustering classification process can be automated while maintaining highly accurate results. Three different evaluation methods were used. (Evaluation 1) Appraised the consistency of 25 classifications produced from the automated system. The classifications differed from one another by only two to four percent. (Evaluation 2) Compared accuracies produced by the automated system to classification accuracies generated following a manual guided clustering protocol. Results: The automated system produced higher overall accuracies in 50 percent of the tests and was comparable for all but one of the remaining tests. (Evaluation 3) Assessed the time and effort required to produce accurate classifications. Results: The automated system produced classifications in less time and with less effort than the manual 'protocol' method. (Objective 3) Built a flexible, interactive software tool to aid in producing binary change masks. (Objective 4) Reduced by automation the amount of training data needed to classify the second image of a two-time-period change detection project. Locations of the training sites in 'unchanged' areas employed to classify the first image were used to identify sites where spectral information was automatically extracted from the second image. 
Results: The automatically generated training data produces classification accuracies
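Bootstrapping the standard error of overall accuracy, as in Objective 1, can be sketched as follows. The reference and mapped labels are simulated with an arbitrary error rate, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated reference vs. mapped labels for 300 assessment pixels: roughly
# 85% of pixels are mapped correctly (all numbers are illustrative).
reference = rng.integers(0, 3, size=300)
mapped = np.where(rng.random(300) < 0.85, reference, (reference + 1) % 3)

def bootstrap_se(ref, pred, n_boot=2000):
    """Standard error of overall accuracy estimated by resampling the
    assessment pixels with replacement."""
    n = len(ref)
    accs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        accs[b] = np.mean(ref[idx] == pred[idx])
    return accs.std(ddof=1)

overall = np.mean(reference == mapped)
se = bootstrap_se(reference, mapped)

# For comparison, the parametric estimate is sqrt(p * (1 - p) / n).
parametric = np.sqrt(overall * (1 - overall) / len(reference))
```

For a simple proportion the two estimates nearly coincide, which matches the paper's finding that bootstrap standard errors were comparable to parametric ones.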

  9. Assessing accuracy of an electronic provincial medication repository

    PubMed Central

    2012-01-01

    Background Jurisdictional drug information systems are being implemented in many regions around the world. British Columbia, Canada has had a provincial medication dispensing record, PharmaNet, system since 1995. Little is known about how accurately PharmaNet reflects actual medication usage. Methods This prospective, multi-centre study compared pharmacist collected Best Possible Medication Histories (BPMH) to PharmaNet profiles to assess accuracy of the PharmaNet profiles for patients receiving a BPMH as part of clinical care. A review panel examined the anonymized BPMHs and discrepancies to estimate clinical significance of discrepancies. Results 16% of medication profiles were accurate, with 48% of the discrepant profiles considered potentially clinically significant by the clinical review panel. Cardiac medications tended to be more accurate (e.g. ramipril was accurate >90% of the time), while insulin, warfarin, salbutamol and pain relief medications were often inaccurate (80–85% of the time). 1215 sequential BPMHs were collected and reviewed for this study. Conclusions The PharmaNet medication repository has a low accuracy and should be used in conjunction with other sources for medication histories for clinical or research purposes. This finding is consistent with other, smaller medication repository accuracy studies in other jurisdictions. Our study highlights specific medications that tend to be lower in accuracy. PMID:22621690

  10. OECD Recommends Procedures for Assessing Chemicals

    ERIC Educational Resources Information Center

    Idman, Mariatta

    1977-01-01

    Previously the OECD Council recommended assessment of all chemicals before their production or sale. In this article five guidelines for this process are put forth. The guidelines include procedures for chemical analysis, surveillance, and review. This scheme is proposed as a cooperative measure among all countries to reduce chemical pollutants.…

  12. Standardized accuracy assessment of the calypso wireless transponder tracking system

    NASA Astrophysics Data System (ADS)

    Franz, A. M.; Schmitt, D.; Seitel, A.; Chatrasingh, M.; Echner, G.; Oelfke, U.; Nill, S.; Birkfellner, W.; Maier-Hein, L.

    2014-11-01

    Electromagnetic (EM) tracking allows localization of small EM sensors in a magnetic field of known geometry without line-of-sight. However, this technique requires a cable connection to the tracked object. A wireless alternative based on magnetic fields, referred to as transponder tracking, has been proposed by several authors. Although most of the transponder tracking systems are still in an early stage of development and not ready for clinical use yet, Varian Medical Systems Inc. (Palo Alto, California, USA) presented the Calypso system for tumor tracking in radiation therapy which includes transponder technology. However, it has not so far been used for computer-assisted interventions (CAI) in general, nor been assessed for accuracy in a standardized manner. In this study, we apply a standardized assessment protocol presented by Hummel et al (2005 Med. Phys. 32 2371-9) to the Calypso system for the first time. The results show that transponder tracking with the Calypso system provides a precision and accuracy below 1 mm in ideal clinical environments, which is comparable to other EM tracking systems. As with other systems, the tracking accuracy was affected by metallic distortion, which led to errors of up to 3.2 mm. The potential of the wireless transponder tracking technology for use in many future CAI applications can be regarded as extremely high.

  13. Standardized accuracy assessment of the calypso wireless transponder tracking system.

    PubMed

    Franz, A M; Schmitt, D; Seitel, A; Chatrasingh, M; Echner, G; Oelfke, U; Nill, S; Birkfellner, W; Maier-Hein, L

    2014-11-21

    Electromagnetic (EM) tracking allows localization of small EM sensors in a magnetic field of known geometry without line-of-sight. However, this technique requires a cable connection to the tracked object. A wireless alternative based on magnetic fields, referred to as transponder tracking, has been proposed by several authors. Although most of the transponder tracking systems are still in an early stage of development and not ready for clinical use yet, Varian Medical Systems Inc. (Palo Alto, California, USA) presented the Calypso system for tumor tracking in radiation therapy which includes transponder technology. However, it has not so far been used for computer-assisted interventions (CAI) in general, nor been assessed for accuracy in a standardized manner. In this study, we apply a standardized assessment protocol presented by Hummel et al (2005 Med. Phys. 32 2371-9) to the Calypso system for the first time. The results show that transponder tracking with the Calypso system provides a precision and accuracy below 1 mm in ideal clinical environments, which is comparable to other EM tracking systems. As with other systems, the tracking accuracy was affected by metallic distortion, which led to errors of up to 3.2 mm. The potential of the wireless transponder tracking technology for use in many future CAI applications can be regarded as extremely high. PMID:25332308
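The precision and accuracy figures quoted above measure two different things: scatter of repeated readings about their own mean versus offset of that mean from ground truth. A rough sketch of the distinction on simulated transponder readings (all numbers invented; this is only loosely in the spirit of the Hummel et al protocol, not a reproduction of it):

```python
import numpy as np

# Simulated repeated 3D readings (mm) of a transponder at a known location:
# a fixed systematic offset plus random jitter (all values invented).
rng = np.random.default_rng(7)
truth = np.array([0.0, 0.0, 0.0])
offset = np.array([0.4, -0.2, 0.1])
readings = truth + offset + rng.normal(0.0, 0.15, size=(50, 3))

mean_pos = readings.mean(axis=0)

# Accuracy: distance of the mean measured position from ground truth.
accuracy = float(np.linalg.norm(mean_pos - truth))

# Precision: RMS scatter of the readings about their own mean (jitter).
precision = float(np.sqrt(np.mean(np.sum((readings - mean_pos) ** 2, axis=1))))
```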

  14. Demons deformable registration for CBCT-guided procedures in the head and neck: Convergence and accuracy

    SciTech Connect

    Nithiananthan, S.; Brock, K. K.; Daly, M. J.; Chan, H.; Irish, J. C.; Siewerdsen, J. H.

    2009-10-15

    Purpose: The accuracy and convergence behavior of a variant of the Demons deformable registration algorithm were investigated for use in cone-beam CT (CBCT)-guided procedures of the head and neck. Online use of deformable registration for guidance of therapeutic procedures such as image-guided surgery or radiation therapy places trade-offs on accuracy and computational expense. This work describes a convergence criterion for Demons registration developed to balance these demands; the accuracy of a multiscale Demons implementation using this convergence criterion is quantified in CBCT images of the head and neck. Methods: Using an open-source "symmetric" Demons registration algorithm, a convergence criterion based on the change in the deformation field between iterations was developed to advance among multiple levels of a multiscale image pyramid in a manner that optimized accuracy and computation time. The convergence criterion was optimized in cadaver studies involving CBCT images acquired using a surgical C-arm prototype modified for 3D intraoperative imaging. CBCT-to-CBCT registration was performed and accuracy was quantified in terms of the normalized cross-correlation (NCC) and target registration error (TRE). The accuracy and robustness of the algorithm were then tested in clinical CBCT images of ten patients undergoing radiation therapy of the head and neck. Results: The cadaver model allowed optimization of the convergence factor and initial measurements of registration accuracy: Demons registration exhibited TRE = (0.8 ± 0.3) mm and NCC = 0.99 in the cadaveric head compared to TRE = (2.6 ± 1.0) mm and NCC = 0.93 with rigid registration. Similarly for the patient data, Demons registration gave mean TRE = (1.6 ± 0.9) mm compared to rigid registration TRE = (3.6 ± 1.9) mm, suggesting registration accuracy at or near the voxel size of the patient images (1 × 1 × 2 mm³). 
The multiscale implementation based on optimal convergence criteria completed registration in
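The two accuracy metrics used in this study, NCC and TRE, can be sketched as follows; the volume and landmark coordinates are synthetic stand-ins, not CBCT data:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally shaped volumes."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

def mean_tre(ref_points, reg_points):
    """Mean target registration error: average Euclidean distance between
    corresponding landmarks after registration."""
    return float(np.linalg.norm(ref_points - reg_points, axis=1).mean())

vol = np.random.default_rng(3).random((8, 8, 8))
print(ncc(vol, vol))   # identical volumes correlate at ~1.0

pts = np.array([[0.0, 0.0, 0.0], [10.0, 5.0, 2.0]])
print(mean_tre(pts, pts + np.array([0.5, 0.0, 0.0])))   # uniform 0.5 mm shift
```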

  15. Simplified Expert Elicitation Procedure for Risk Assessment of Operating Events

    SciTech Connect

    Ronald L. Boring; David Gertman; Jeffrey Joe; Julie Marble; William Galyean; Larry Blackwood; Harold Blackman

    2005-06-01

    This report describes a simplified, tractable, and usable procedure within the US Nuclear Regulatory Commission (NRC) for seeking expert opinion and judgment. The NRC has increased efforts to document the reliability and risk of nuclear power plants (NPPs) through Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) models. The Significance Determination Process (SDP) and Accident Sequence Precursor (ASP) programs at the NRC utilize expert judgment on the probability of failure, human error, and the operability of equipment in cases where otherwise insufficient operational data exist to make meaningful estimates. In the past, the SDP and ASP programs informally sought the opinion of experts inside and outside the NRC. This document represents a formal, documented procedure to take the place of informal expert elicitation. The procedures outlined in this report follow existing formal expert elicitation methodologies, but are streamlined as appropriate to the degree of accuracy required and the schedule for producing SDP and ASP analyses.

  16. Accuracy Assessment of Digital Elevation Models Using GPS

    NASA Astrophysics Data System (ADS)

    Farah, Ashraf; Talaat, Ashraf; Farrag, Farrag A.

    2008-01-01

    A Digital Elevation Model (DEM) is a digital representation of ground surface topography or terrain, with different accuracies required for different application fields. DEMs have been applied to a wide range of civil engineering and military planning tasks. DEMs are obtained using a number of techniques such as photogrammetry, digitizing, laser scanning, radar interferometry, classical survey and GPS techniques. This paper presents an assessment study of DEMs generated using GPS Stop&Go and kinematic techniques, compared with a classical survey. The results show that a DEM generated from the Stop&Go GPS technique has the highest accuracy, with a RMS error of 9.70 cm. The RMS error of the DEM derived by kinematic GPS is 12.00 cm.
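The RMS errors quoted above compare DEM heights against reference observations. A minimal sketch with invented checkpoint heights:

```python
import numpy as np

# Invented DEM heights interpolated at five GPS checkpoints (metres),
# alongside the reference heights observed at the same points.
dem_heights = np.array([102.31, 98.75, 101.10, 99.84, 100.52])
gps_heights = np.array([102.40, 98.66, 101.22, 99.80, 100.45])

residuals = dem_heights - gps_heights
rmse = float(np.sqrt(np.mean(residuals ** 2)))
print(f"RMSE = {rmse * 100:.1f} cm")   # RMSE = 8.6 cm
```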

  17. Whole-procedure clinical accuracy of Gamma Knife treatments of large lesions

    SciTech Connect

    Ma Lijun; Chuang, Cynthia; Descovich, Martina; Petti, Paula; Smith, Vernon; Verhey, Lynn

    2008-11-15

The mechanical accuracy of Gamma Knife radiosurgery based on single-isocenter measurement has been established to within 0.3 mm. However, the full delivery accuracy for Gamma Knife treatments of large lesions has only been estimated via quadrature-sum analysis. In this study, the authors directly measured the whole-procedure accuracy for Gamma Knife treatments of large lesions to examine the validity of such estimation. The measurements were conducted on a head phantom, simulating the whole treatment procedure including frame placement, computed tomography imaging, treatment planning, and treatment delivery. The results of the measurements were compared with the dose calculations from the treatment planning system. Average agreements of 0.1-1.6 mm for the isodose lines ranging from 25% to 90% of the maximum dose were found despite potentially large contributing uncertainties such as 3-mm imaging resolution, 2-mm dose grid size, 1-mm frame registration, and multi-isocenter deliveries. The measured errors were significantly smaller (by >50%) than the values calculated from quadrature-sum analysis. In conclusion, Gamma Knife treatments of large lesions can be delivered much more accurately than predicted by a quadrature-sum analysis of the major sources of uncertainty from each step of the delivery chain.
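The quadrature-sum estimate that the measured whole-procedure accuracy was compared against combines independent per-step uncertainties as the square root of the sum of their squares. A minimal sketch (the per-step values below are hypothetical, for illustration only, not the study's figures):

```python
import math

def quadrature_sum(sigmas_mm):
    """Combine independent per-step uncertainties (mm) in quadrature:
    sigma_total = sqrt(sum of sigma_i^2)."""
    return math.sqrt(sum(s * s for s in sigmas_mm))

# Hypothetical uncertainty chain: frame placement, CT imaging,
# planning dose grid, delivery (illustrative values only).
sigma_total = quadrature_sum([0.5, 1.0, 0.7, 0.3])
```

Because the quadrature sum assumes every step contributes independently at its full magnitude, a direct end-to-end measurement, as in this study, can come out well below the combined estimate.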

  18. The Social Accuracy Model of Interpersonal Perception: Assessing Individual Differences in Perceptive and Expressive Accuracy

    ERIC Educational Resources Information Center

    Biesanz, Jeremy C.

    2010-01-01

    The social accuracy model of interpersonal perception (SAM) is a componential model that estimates perceiver and target effects of different components of accuracy across traits simultaneously. For instance, Jane may be generally accurate in her perceptions of others and thus high in "perceptive accuracy"--the extent to which a particular…

  19. Assessment of flash flood warning procedures

    NASA Astrophysics Data System (ADS)

    Johnson, Lynn E.

    2000-01-01

Assessment of four alternate flash flood warning procedures was conducted to ascertain their suitability for forecast operations using radar-rainfall imagery. The procedures include (1) areal mean basin effective rainfall, (2) unit hydrograph, (3) time-area, and (4) 2-D numerical modeling. The Buffalo Creek flash flood of July 12, 1996, was used as a case study for application of each of the procedures. A significant feature of the Buffalo Creek event was a forest fire that occurred a few months before the flood and significantly affected watershed runoff characteristics. Objectives were to assess the applicability of the procedures for watersheds having spatial and temporal scale similarities to Buffalo Creek, to compare their technical characteristics, and to consider forecaster usability. Geographic information system techniques for hydrologic database development and flash flood potential computations are illustrated. Generalizations of the case study results are offered relative to their suitability for flash flood forecasting operations. Although all four methods have relative advantages, their application to the Buffalo Creek event resulted in mixed performance. Failure of any method was due primarily to uncertainties of the land surface response (i.e., burn area imperviousness). Results underscore the need for model calibration, a difficult requirement for real-time forecasting.

  20. Accuracy assessment, using stratified plurality sampling, of portions of a LANDSAT classification of the Arctic National Wildlife Refuge Coastal Plain

    NASA Technical Reports Server (NTRS)

    Card, Don H.; Strong, Laurence L.

    1989-01-01

    An application of a classification accuracy assessment procedure is described for a vegetation and land cover map prepared by digital image processing of LANDSAT multispectral scanner data. A statistical sampling procedure called Stratified Plurality Sampling was used to assess the accuracy of portions of a map of the Arctic National Wildlife Refuge coastal plain. Results are tabulated as percent correct classification overall as well as per category with associated confidence intervals. Although values of percent correct were disappointingly low for most categories, the study was useful in highlighting sources of classification error and demonstrating shortcomings of the plurality sampling method.

  1. Assessing the accuracy of different simplified frictional rolling contact algorithms

    NASA Astrophysics Data System (ADS)

    Vollebregt, E. A. H.; Iwnicki, S. D.; Xie, G.; Shackleton, P.

    2012-01-01

This paper presents an approach for assessing the accuracy of different frictional rolling contact theories. The main characteristic of the approach is that it takes a statistically oriented view. This yields better insight into the behaviour of the methods in diverse circumstances (varying contact patch ellipticities; mixed longitudinal, lateral and spin creepages) than is obtained when only a small number of (basic) circumstances are used in the comparison. The range of contact parameters that occurs for realistic vehicles and tracks is assessed using simulations with the Vampire vehicle system dynamics (VSD) package. This shows that larger values of spin creepage occur rather frequently. On this basis, the approach is applied to typical cases for which railway VSD packages are used. The results show that the USETAB approach in particular, but also FASTSIM, gives considerably better results than the linear theory and the Vermeulen-Johnson, Shen-Hedrick-Elkins and Polach methods, when compared with the 'complete theory' of the CONTACT program.

  2. 12 CFR 571.42 - Reasonable policies and procedures concerning the accuracy and integrity of furnished information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... information. (a) Policies and procedures. Each furnisher must establish and implement reasonable written policies and procedures regarding the accuracy and integrity of the information relating to consumers that... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Reasonable policies and procedures...

  3. 12 CFR 41.42 - Reasonable policies and procedures concerning the accuracy and integrity of furnished information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... information. (a) Policies and procedures. Each furnisher must establish and implement reasonable written policies and procedures regarding the accuracy and integrity of the information relating to consumers that... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Reasonable policies and procedures...

  4. Accuracy Assessment of a Uav-Based Landslide Monitoring System

    NASA Astrophysics Data System (ADS)

    Peppa, M. V.; Mills, J. P.; Moore, P.; Miller, P. E.; Chambers, J. E.

    2016-06-01

Landslides are hazardous events with often disastrous consequences. Monitoring landslides with observations of high spatio-temporal resolution can help mitigate such hazards. Mini unmanned aerial vehicles (UAVs) complemented by structure-from-motion (SfM) photogrammetry and modern per-pixel image matching algorithms can deliver a time series of landslide elevation models in an automated and inexpensive way. This research investigates the potential of a mini UAV, equipped with a Panasonic Lumix DMC-LX5 compact camera, to provide surface deformations at acceptable levels of accuracy for landslide assessment. The study adopts a self-calibrating bundle adjustment-SfM pipeline using ground control points (GCPs). It evaluates misalignment biases and unresolved systematic errors that are transferred through the SfM process into the derived elevation models. To cross-validate the research outputs, results are compared to benchmark observations obtained by standard surveying techniques. Data were collected at a 6 cm ground sample distance (GSD) and are shown to achieve planimetric and vertical accuracies of a few centimetres at independent check points (ICPs). The co-registration error of the generated elevation models is also examined in areas of stable terrain. Through this error assessment, the study estimates that the vertical sensitivity to real terrain change of the tested landslide is 9 cm.
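Accuracy at independent check points of the kind used here is conventionally summarised as a root-mean-square error (RMSE) of the residuals between the derived elevation model and the surveyed heights. A minimal sketch (the residual values are hypothetical, not the study's):

```python
import math

def rmse(residuals_m):
    """Root-mean-square error of check-point residuals (metres)."""
    return math.sqrt(sum(r * r for r in residuals_m) / len(residuals_m))

# Hypothetical vertical residuals (m): UAV-SfM elevation model minus
# independently surveyed check-point heights.
vertical_rmse = rmse([0.03, -0.02, 0.05, -0.04, 0.01])
```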

  5. Assessment of optical localizer accuracy for computer aided surgery systems.

    PubMed

    Elfring, Robert; de la Fuente, Matías; Radermacher, Klaus

    2010-01-01

    The technology for localization of surgical tools with respect to the patient's reference coordinate system in three to six degrees of freedom is one of the key components in computer aided surgery. Several tracking methods are available, of which optical tracking is the most widespread in clinical use. Optical tracking technology has proven to be a reliable method for intra-operative position and orientation acquisition in many clinical applications; however, the accuracy of such localizers is still a topic of discussion. In this paper, the accuracy of three optical localizer systems, the NDI Polaris P4, the NDI Polaris Spectra (in active and passive mode) and the Stryker Navigation System II camera, is assessed and compared critically. Static tests revealed that only the Polaris P4 shows significant warm-up behavior, with a significant shift of accuracy being observed within 42 minutes of being switched on. Furthermore, the intrinsic localizer accuracy was determined for single markers as well as for tools using a volumetric measurement protocol on a coordinate measurement machine. To determine the relative distance error within the measurement volume, the Length Measurement Error (LME) was determined at 35 test lengths. As accuracy depends strongly on the marker configuration employed, the error to be expected in typical clinical setups was estimated in a simulation for different tool configurations. The two active localizer systems, the Stryker Navigation System II camera and the Polaris Spectra (active mode), showed the best results, with trueness values (mean +/- standard deviation) of 0.058 +/- 0.033 mm and 0.089 +/- 0.061 mm, respectively. The Polaris Spectra (passive mode) showed a trueness of 0.170 +/- 0.090 mm, and the Polaris P4 showed the lowest trueness at 0.272 +/- 0.394 mm with a higher number of outliers than for the other cameras. 
The simulation of the different tool configurations in a typical clinical setup revealed that the tracking error can

  6. Accuracy assessment of gridded precipitation datasets in the Himalayas

    NASA Astrophysics Data System (ADS)

    Khan, A.

    2015-12-01

    Accurate precipitation data are vital for hydro-climatic modelling and water resources assessments. Based on mass balance calculations and Turc-Budyko analysis, this study investigates the accuracy of twelve widely used precipitation gridded datasets for sub-basins in the Upper Indus Basin (UIB) in the Himalayas-Karakoram-Hindukush (HKH) region. These datasets are: 1) Global Precipitation Climatology Project (GPCP), 2) Climate Prediction Centre (CPC) Merged Analysis of Precipitation (CMAP), 3) NCEP / NCAR, 4) Global Precipitation Climatology Centre (GPCC), 5) Climatic Research Unit (CRU), 6) Asian Precipitation Highly Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE), 7) Tropical Rainfall Measuring Mission (TRMM), 8) European Reanalysis (ERA) interim data, 9) PRINCETON, 10) European Reanalysis-40 (ERA-40), 11) Willmott and Matsuura, and 12) WATCH Forcing Data based on ERA interim (WFDEI). Precipitation accuracy and consistency was assessed by physical mass balance involving sum of annual measured flow, estimated actual evapotranspiration (average of 4 datasets), estimated glacier mass balance melt contribution (average of 4 datasets), and ground water recharge (average of 3 datasets), during 1999-2010. Mass balance assessment was complemented by Turc-Budyko non-dimensional analysis, where annual precipitation, measured flow and potential evapotranspiration (average of 5 datasets) data were used for the same period. Both analyses suggest that all tested precipitation datasets significantly underestimate precipitation in the Karakoram sub-basins. For the Hindukush and Himalayan sub-basins most datasets underestimate precipitation, except ERA-interim and ERA-40. 
The analysis indicates that for this large region with complicated terrain features and stark spatial precipitation gradients the reanalysis datasets have better consistency with flow measurements than datasets derived from records of only sparsely distributed climatic

  7. Accuracy Assessment in rainfall upscaling in multiple time scales

    NASA Astrophysics Data System (ADS)

    Yu, H.; Wang, C.; Lin, Y.

    2008-12-01

Long-term hydrologic parameters, e.g. annual precipitation, are usually used to represent the general hydrologic characteristics of a region. Recently, analysis of the impact of climate change on hydrological patterns has relied primarily on measurements and/or estimations over long time scales, e.g. a year. Given the prevalence of short-term measurements, it is therefore important to understand the accuracy of upscaling for long-term estimations of hydrologic parameters. This study applies a spatiotemporal geostatistical method to analyze and discuss the accuracy of precipitation upscaling in Taiwan under different time scales, and also quantifies the uncertainty in the upscaled long-term precipitation. Two space-time upscaling approaches developed with the Bayesian Maximum Entropy (BME) method are presented: 1) UM1, data aggregation followed by BME estimation, and 2) UM2, BME estimation followed by aggregation. The two upscaling approaches are investigated and compared to assess the performance of rainfall estimation in multiple time scales in Taiwan.

Keywords: upscaling, geostatistics, BME, uncertainty analysis

  8. A comparison of two in vitro methods for assessing the fitting accuracy of composite inlays.

    PubMed

    Qualtrough, A J; Piddock, V; Kypreou, V

    1993-06-19

Composite inlays were fabricated in standardised cavities cut into aluminum and perspex blocks using a computer-controlled milling process. Four materials were used to construct the inlays. These were fabricated using an indirect technique following the manufacturers' recommendations, where applicable. In addition, for one of the composites, the fabrication procedures were modified. The fitting accuracy of the restorations was assessed by taking elastomeric impression wash replicas of the luting space and by examination of sectioned restored units using image analysis. The former method indicated significantly reduced fitting accuracy, resulting in incomplete seating, when either use of die spacer or secondary curing was omitted from restoration construction. The sectioning technique indicated that more factors appeared to significantly reduce fitting accuracy, including bulk packing, alteration in curing time, omission of die spacer, and the final polishing procedure. This method also provided more specific information concerning sites of premature contact. One material gave rise to significantly greater film thicknesses by both methods of assessment. No direct correlation was found between the two techniques of fit evaluation, but taken together the two methods provided complementary information.

  9. Assessing the Accuracy of the Precise Point Positioning Technique

    NASA Astrophysics Data System (ADS)

    Bisnath, S. B.; Collins, P.; Seepersad, G.

    2012-12-01

The Precise Point Positioning (PPP) GPS data processing technique has developed over the past 15 years to become a standard method for growing categories of positioning and navigation applications. The technique relies on single-receiver point positioning combined with the use of precise satellite orbit and clock information and high-fidelity error modelling. The research presented here uniquely addresses the current accuracy of the technique, explains the limits of performance, and defines paths to improvement. For geodetic purposes, performance refers to daily static position accuracy. PPP processing of over 80 IGS stations over one week results in RMS positioning errors of a few millimetres in the north and east components and a few centimetres in the vertical (all one-sigma values). Larger error statistics for real-time and kinematic processing are also given. GPS PPP with ambiguity-resolution processing is also carried out, producing slight improvements over the float-solution results. These results are categorised into quality classes in order to analyse the root causes of the resultant accuracies: "best", "worst", multipath, site displacement effects, satellite availability and geometry, etc. Also of interest in PPP performance is the solution convergence period. Static, conventional solutions are slow to converge, with approximately 35 minutes required for 95% of solutions to reach 20 cm or better horizontal accuracy. Ambiguity resolution can significantly reduce this period without biasing solutions. Defining a PPP error budget is a complex task even with the resulting numerical assessment because, unlike the epoch-by-epoch processing in the Standard Positioning Service, PPP processing involves filtering. An attempt is made here to 1) define the magnitude of each error source in terms of range, 2) transform ranging error to position error via Dilution Of Precision (DOP), and 3) scale the DOP through the filtering process. The result is a deeper
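The second step of the error-budget construction described above rests on the classical first-order relation position error ≈ ranging error × DOP. A minimal illustration (the UERE and PDOP numbers below are hypothetical, not the paper's values):

```python
def position_error(uere_m, dop):
    """First-order mapping of user-equivalent range error (UERE, one sigma,
    metres) to position error via a Dilution Of Precision factor."""
    return uere_m * dop

# Hypothetical comparison under the same satellite geometry (PDOP = 2.0):
# broadcast-ephemeris pseudorange positioning vs. PPP-level carrier-phase
# ranging after precise orbit/clock corrections.
spp = position_error(5.0, 2.0)    # metre-level position error
ppp = position_error(0.02, 2.0)   # centimetre-level position error
```

The same scaling explains why shrinking the ranging error, rather than the geometry, is what moves PPP from metre to centimetre accuracy.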

  10. Quantitative modeling of the accuracy in registering preoperative patient-specific anatomic models into left atrial cardiac ablation procedures

    SciTech Connect

Rettmann, Maryam E.; Holmes, David R.; Camp, Jon J.; Cameron, Bruce M.; Robb, Richard A.; Kwartowitz, David M.; Gunawan, Mia; Johnson, Susan B.; Packer, Douglas L.; Dalegrave, Charles; Kolasa, Mark W.

    2014-02-15

Purpose: In cardiac ablation therapy, accurate anatomic guidance is necessary to create effective tissue lesions for elimination of left atrial fibrillation. While fluoroscopy, ultrasound, and electroanatomic maps are important guidance tools, they lack information regarding detailed patient anatomy which can be obtained from high resolution imaging techniques. For this reason, there has been significant effort in incorporating detailed, patient-specific models generated from preoperative imaging datasets into the procedure. Both clinical and animal studies have investigated registration and targeting accuracy when using preoperative models; however, the effect of various error sources on registration accuracy has not been quantitatively evaluated. Methods: Data from phantom, canine, and patient studies are used to model and evaluate registration accuracy. In the phantom studies, data are collected using a magnetically tracked catheter on a static phantom model. Monte Carlo simulation studies were run to evaluate both baseline errors as well as the effect of different sources of error that would be present in a dynamic in vivo setting. Error is simulated by varying the variance parameters on the landmark fiducial, physical target, and surface point locations in the phantom simulation studies. In vivo validation studies were undertaken in six canines in which metal clips were placed in the left atrium to serve as ground truth points. A small clinical evaluation was completed in three patients. Landmark-based and combined landmark and surface-based registration algorithms were evaluated in all studies. In the phantom and canine studies, both target registration error and point-to-surface error are used to assess accuracy. In the patient studies, no ground truth is available and registration accuracy is quantified using point-to-surface error only. Results: The phantom simulation studies demonstrated that combined landmark and surface-based registration improved
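Landmark-based rigid registration of the kind evaluated here is commonly solved with the least-squares Kabsch/Procrustes method, and accuracy is then reported as target registration error (TRE) at ground-truth points. A minimal NumPy sketch (a generic formulation, not the authors' implementation; the point sets in any usage would be hypothetical):

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid (rotation + translation) landmark registration
    (Kabsch/Procrustes). src, dst: (N, 3) corresponding fiducial points.
    Returns R, t such that dst ~= src @ R.T + t."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

def target_registration_error(R, t, targets_src, targets_dst):
    """RMS distance between transformed targets and their true positions."""
    mapped = targets_src @ R.T + t
    return np.sqrt(np.mean(np.sum((mapped - targets_dst) ** 2, axis=1)))
```

In a Monte Carlo study like the one described, noise would be added to the fiducial and target locations before calling `rigid_register`, and the distribution of the resulting TRE values would be examined.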

  11. Assessment of ambulatory blood pressure recorders: accuracy and clinical performance.

    PubMed

    White, W B

    1991-06-01

There are now more than ten different manufacturers of non-invasive, portable blood pressure monitors in North America, Europe, and Japan. These ambulatory blood pressure recorders measure blood pressure by either auscultatory or oscillometric methodology. Technologic advances in the recorders have resulted in reduction in monitor size, reduction in or absence of motor noise during cuff inflation, ability to program the recorder without an external computer system, and enhanced precision. Recently, there has been concern that more structured validation protocols have not been implemented prior to the widespread marketing of ambulatory blood pressure recorders. There is a need for proper assessment of recorders prior to use in clinical research or practice. Data on several existing recorders suggest that while most are reasonably accurate during resting measurements, many lose this accuracy during motion, and clinical performance may vary among the monitors. Validation studies of ambulatory recorders should include comparison with mercury column and intra-arterial determinations, resting and motion measurements, and assessment of clinical performance in hypertensive patients. PMID:1893652

  12. Inertial Measures of Motion for Clinical Biomechanics: Comparative Assessment of Accuracy under Controlled Conditions – Changes in Accuracy over Time

    PubMed Central

    Lebel, Karina; Boissy, Patrick; Hamel, Mathieu; Duval, Christian

    2015-01-01

Background: Interest in 3D inertial motion tracking devices (AHRS) has been growing rapidly among the biomechanical community. Although the convenience of such tracking devices seems to open a whole new world of possibilities for evaluation in clinical biomechanics, their limitations have not been extensively documented. The objectives of this study are: 1) to assess the change in absolute and relative accuracy of multiple units of 3 commercially available AHRS over time; and 2) to identify different sources of errors affecting AHRS accuracy and to document how they may affect the measurements over time. Methods: This study used an instrumented gimbal table on which AHRS modules were carefully attached and put through a series of velocity-controlled sustained motions, including 2-minute motion trials (2MT) and 12-minute multiple dynamic phase motion trials (12MDP). Absolute accuracy was assessed by comparison of the AHRS orientation measurements to those of an optical gold standard. Relative accuracy was evaluated using the variation in relative orientation between modules during the trials. Findings: Both absolute and relative accuracy decreased over time during 2MT. 12MDP trials showed a significant decrease in accuracy over multiple phases, but accuracy could be enhanced significantly by resetting the reference point and/or compensating for the initial inertial frame estimation reference for each phase. Interpretation: The variation in AHRS accuracy observed between the different systems and with time can be attributed in part to the dynamic estimation error, but also, and foremost, to the ability of AHRS units to locate the same inertial frame. Conclusions: Mean accuracies obtained under the gimbal table's sustained conditions of motion suggest that AHRS are promising tools for clinical mobility assessment under constrained conditions of use. However, improvements in magnetic compensation and alignment between AHRS modules are desirable in order for AHRS to reach their

  13. Accuracy of a semiquantitative method for Dermal Exposure Assessment (DREAM)

    PubMed Central

    van Wendel, de Joo... B; Vermeulen, R; van Hemmen, J J; Fransman, W; Kromhout, H

    2005-01-01

Background: The authors recently developed a Dermal Exposure Assessment Method (DREAM), an observational semiquantitative method to assess dermal exposures by systematically evaluating exposure determinants using pre-assigned default values. Aim: To explore the accuracy of the DREAM method by comparing its estimates with quantitative dermal exposure measurements in several occupational settings. Methods: Occupational hygienists observed workers performing a given task and filled in the DREAM questionnaire, while the workers' exposure to chemical agents on skin or clothing was simultaneously measured quantitatively. DREAM estimates were compared with measurement data by estimating Spearman correlation coefficients for each task and for individual observations. In addition, mixed linear regression models were used to study the effect of DREAM estimates on the variability in measured exposures between tasks, between workers, and from day to day. Results: For skin exposures, Spearman correlation coefficients for individual observations ranged from 0.19 to 0.82. DREAM estimates for exposure levels on hands and forearms showed a fixed effect between and within surveys, explaining mainly between-task variance. In general, exposure levels on the clothing layer were only predicted in a meaningful way by detailed DREAM estimates, which comprised detailed information on the concentration of the agent in the formulation to which exposure occurred. Conclusions: The authors expect that the DREAM method can be successfully applied for semiquantitative dermal exposure assessment in epidemiological and occupational hygiene surveys of groups of workers with considerable contrast in dermal exposure levels (variability between groups >1.0). For surveys with less contrasting exposure levels, quantitative dermal exposure measurements are preferable. PMID:16109819

  14. An accuracy assessment of Cartesian-mesh approaches for the Euler equations

    NASA Technical Reports Server (NTRS)

    Coirier, William J.; Powell, Kenneth G.

    1995-01-01

    A critical assessment of the accuracy of Cartesian-mesh approaches for steady, transonic solutions of the Euler equations of gas dynamics is made. An exact solution of the Euler equations (Ringleb's flow) is used not only to infer the order of the truncation error of the Cartesian-mesh approaches, but also to compare the magnitude of the discrete error directly to that obtained with a structured mesh approach. Uniformly and adaptively refined solutions using a Cartesian-mesh approach are obtained and compared to each other and to uniformly refined structured mesh results. The effect of cell merging is investigated as well as the use of two different K-exact reconstruction procedures. The solution methodology of the schemes is explained and tabulated results are presented to compare the solution accuracies.
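Inferring the order of the truncation error from an exact solution such as Ringleb's flow follows the standard grid-refinement relation e ≈ C·h^p: two error levels on meshes related by a refinement ratio r determine p directly. A minimal sketch (the error values are illustrative, not the paper's):

```python
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Infer the truncation-error order p from discrete errors on two grids,
    assuming err ~ C * h^p:  p = log(e_coarse / e_fine) / log(r)."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# Hypothetical discrete L2 errors against the exact solution on two
# uniformly refined meshes (h halved): a factor-of-4 error reduction
# indicates second-order accuracy.
p = observed_order(4.0e-3, 1.0e-3)
```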

  15. 16 CFR 660.3 - Reasonable policies and procedures concerning the accuracy and integrity of furnished information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... integrity of furnished information. (a) Policies and procedures. Each furnisher must establish and implement reasonable written policies and procedures regarding the accuracy and integrity of the information relating... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Reasonable policies and...

  16. 12 CFR 222.42 - Reasonable policies and procedures concerning the accuracy and integrity of furnished information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... (REGULATION V) Duties of Furnishers of Information § 222.42 Reasonable policies and procedures concerning the accuracy and integrity of furnished information. (a) Policies and procedures. Each furnisher must establish... information relating to consumers that it furnishes to a consumer reporting agency. The policies...

  17. Accuracy Assessment of Response Surface Approximations for Supersonic Turbine Design

    NASA Technical Reports Server (NTRS)

    Papila, Nilay; Papila, Melih; Shyy, Wei; Haftka, Raphael T.; FitzCoy, Norman

    2001-01-01

There is a growing trend to employ CFD tools to supply the necessary information for design optimization of fluid dynamics components/systems. Such results are prone to uncertainties due to reasons including discretization errors, incomplete convergence of computational procedures, and errors associated with physical models such as turbulence closures. Based on this type of information, gradient-based optimization algorithms often suffer from noisy calculations, which can seriously compromise the outcome. Similar problems arise from experimental measurements. Global optimization techniques, such as those based on the response surface (RS) concept, are becoming popular in part because they can overcome some of these barriers. However, there are also fundamental issues with global optimization techniques such as RS. For example, in high-dimensional design spaces, typically only a small number of function evaluations are available due to computational and experimental costs. On the other hand, complex features of the design variables do not allow one to model the global characteristics of the design space with simple quadratic polynomials. Consequently, a main challenge is to reduce the size of the region where we fit the RS, or to make it more accurate in the regions where the optimum is likely to reside. Response surface techniques using either polynomials or Neural Network (NN) methods offer designers alternatives for conducting design optimization. The RS technique employs statistical and numerical techniques to establish the relationship between design variables and objective/constraint functions, typically using polynomials. In this study, we aim at addressing issues related to the following questions: (1) How to identify outliers associated with a given RS representation and improve the RS model via appropriate treatments?
(2) How to focus on selected design data so that RS can give better performance in regions critical to design optimization? (3

  18. High-temperature flaw assessment procedure

    SciTech Connect

Ruggles, M.B.; Takahashi, Y.; Ainsworth, R.A.

    1991-08-01

    Described is the background work performed jointly by the Electric Power Research Institute in the United States, the Central Research Institute of Electric Power Industry in Japan and Nuclear Electric plc in the United Kingdom with the purpose of developing a high-temperature flaw assessment procedure for reactor components. Existing creep-fatigue crack-growth models are reviewed, and the most promising methods are identified. Sources of material data are outlined, and results of the fundamental deformation and crack-growth tests are discussed. Results of subcritical crack-growth exploratory tests, creep-fatigue crack-growth tests under repeated thermal transient conditions, and exploratory failure tests are presented and contrasted with the analytical modeling. Crack-growth assessment methods are presented and applied to a typical liquid-metal reactor component. The research activities presented herein served as a foundation for the Flaw Assessment Guide for High-Temperature Reactor Components Subjected to Creep-Fatigue Loading published separately. 30 refs., 108 figs., 13 tabs.

  19. An assessment of reservoir storage change accuracy from SWOT

    NASA Astrophysics Data System (ADS)

    Clark, Elizabeth; Moller, Delwyn; Lettenmaier, Dennis

    2013-04-01

The anticipated Surface Water and Ocean Topography (SWOT) satellite mission will provide water surface height and areal extent measurements for terrestrial water bodies at an unprecedented accuracy, with essentially global coverage and a 22-day repeat cycle. These measurements will provide a unique opportunity to observe storage changes in naturally occurring lakes, as well as manmade reservoirs. Given political constraints on the sharing of water information, international databases of reservoir characteristics, such as the Global Reservoir and Dam Database, are limited to the largest reservoirs for which countries have voluntarily provided information. Impressive efforts have been made to combine currently available altimetry data with satellite-based imagery of water surface extent; however, these data sets are limited to large reservoirs located on an altimeter's flight track. SWOT's global coverage and simultaneous measurement of height and water surface extent remove, in large part, the constraint of location relative to flight path. Previous studies based on Arctic lakes suggest that SWOT will be able to provide a noisy, but meaningful, storage change signal for lakes as small as 250 m x 250 m. Here, we assess the accuracy of monthly storage change estimates over 10 reservoirs in the U.S. and consider the plausibility of estimating total storage change. Published maps of reservoir bathymetry were combined with a historical time series of daily storage to produce daily time series of maps of water surface elevation. These time series were then sampled based on realistic SWOT orbital parameters and noise characteristics to create a time series of synthetic SWOT observations of water surface elevation and extent for each reservoir. We then plotted area versus elevation for the true values and for the synthetic SWOT observations. For each reservoir, a curve was fit to the synthetic SWOT observations, and its integral was used to estimate total storage
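Integrating an area-versus-elevation curve, as described above, amounts to V = ∫ A(h) dh. A minimal trapezoidal-rule sketch over a tabulated area/elevation relation (a simplified illustration with hypothetical values, not the study's actual curve-fitting procedure):

```python
def capacity(elev_m, area_m2):
    """Reservoir storage volume (m^3) by trapezoidal integration of the
    area-elevation curve: V = sum of 0.5*(A_i + A_{i+1})*(h_{i+1} - h_i)."""
    pts = list(zip(elev_m, area_m2))
    return sum(0.5 * (a0 + a1) * (h1 - h0)
               for (h0, a0), (h1, a1) in zip(pts, pts[1:]))

# Hypothetical table: water surface elevation (m) vs. surface area (m^2).
elev = [100.0, 101.0, 102.0, 103.0]
area = [1.0e5, 1.5e5, 2.1e5, 2.8e5]
total = capacity(elev, area)
```

Storage change between two water levels is then the difference of this integral evaluated up to each level, which is why joint height-and-extent observations are what make the SWOT estimate possible.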

  20. Assessing and Ensuring GOES-R Magnetometer Accuracy

    NASA Technical Reports Server (NTRS)

    Kronenwetter, Jeffrey; Carter, Delano R.; Todirita, Monica; Chu, Donald

    2016-01-01

    The GOES-R magnetometer accuracy requirement is 1.7 nanoteslas (nT). During quiet times (100 nT), accuracy is defined as absolute mean plus 3 sigma. During storms (300 nT), accuracy is defined as absolute mean plus 2 sigma. To achieve this, the sensor itself has better than 1 nT accuracy. Because zero offset and scale factor drift over time, it is also necessary to perform annual calibration maneuvers. To predict performance, we used covariance analysis and attempted to corroborate it with simulations. Although not perfect, the two generally agree and show the expected behaviors. With the annual calibration regimen, these predictions suggest that the magnetometers will meet their accuracy requirements.

  1. Assessing and Ensuring GOES-R Magnetometer Accuracy

    NASA Technical Reports Server (NTRS)

    Carter, Delano R.; Todirita, Monica; Kronenwetter, Jeffrey; Chu, Donald

    2016-01-01

    The GOES-R magnetometer subsystem accuracy requirement is 1.7 nanoteslas (nT). During quiet times (100 nT), accuracy is defined as absolute mean plus 3 sigma. During storms (300 nT), accuracy is defined as absolute mean plus 2 sigma. Error comes both from outside the magnetometers, e.g. spacecraft fields and misalignments, as well as inside, e.g. zero offset and scale factor errors. Because zero offset and scale factor drift over time, it will be necessary to perform annual calibration maneuvers. To predict performance before launch, we have used Monte Carlo simulations and covariance analysis. Both behave as expected, and their accuracy predictions agree within 30%. With the proposed calibration regimen, both suggest that the GOES-R magnetometer subsystem will meet its accuracy requirements.
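The accuracy definitions quoted above (absolute mean plus 3 sigma during quiet times, plus 2 sigma during storms) can be evaluated directly from a sample of per-axis field errors. The simulated error distribution below is an assumption for illustration, not GOES-R data.

```python
import numpy as np

# Simulated per-axis magnetometer errors (nT); the bias and spread
# are invented for illustration.
rng = np.random.default_rng(1)
errors_nt = rng.normal(0.2, 0.4, 10_000)

# Quiet-time accuracy: absolute mean plus 3 sigma.
quiet_accuracy = abs(errors_nt.mean()) + 3.0 * errors_nt.std()
# Storm-time accuracy: absolute mean plus 2 sigma.
storm_accuracy = abs(errors_nt.mean()) + 2.0 * errors_nt.std()

# Compare against the 1.7 nT requirement.
meets_requirement = quiet_accuracy <= 1.7
```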

  2. Assessing accuracy and precision for field and laboratory data: a perspective in ecosystem restoration

    USGS Publications Warehouse

    Stapanian, Martin A.; Lewis, Timothy E; Palmer, Craig J.; Middlebrook Amos, Molly

    2016-01-01

    Unlike most laboratory studies, rigorous quality assurance/quality control (QA/QC) procedures may be lacking in ecosystem restoration (“ecorestoration”) projects, despite legislative mandates in the United States. This is due, in part, to ecorestoration specialists making the false assumption that some types of data (e.g. discrete variables such as species identification and abundance classes) are not subject to evaluations of data quality. Moreover, emergent behavior manifested by complex, adapting, and nonlinear organizations responsible for monitoring the success of ecorestoration projects tend to unconsciously minimize disorder, QA/QC being an activity perceived as creating disorder. We discuss similarities and differences in assessing precision and accuracy for field and laboratory data. Although the concepts for assessing precision and accuracy of ecorestoration field data are conceptually the same as laboratory data, the manner in which these data quality attributes are assessed is different. From a sample analysis perspective, a field crew is comparable to a laboratory instrument that requires regular “recalibration,” with results obtained by experts at the same plot treated as laboratory calibration standards. Unlike laboratory standards and reference materials, the “true” value for many field variables is commonly unknown. In the laboratory, specific QA/QC samples assess error for each aspect of the measurement process, whereas field revisits assess precision and accuracy of the entire data collection process following initial calibration. Rigorous QA/QC data in an ecorestoration project are essential for evaluating the success of a project, and they provide the only objective “legacy” of the dataset for potential legal challenges and future uses.

  3. Accuracy Assessment of Coastal Topography Derived from Uav Images

    NASA Astrophysics Data System (ADS)

    Long, N.; Millescamps, B.; Pouget, F.; Dumon, A.; Lachaussée, N.; Bertin, X.

    2016-06-01

    To monitor coastal environments, Unmanned Aerial Vehicles (UAVs) are a low-cost and easy-to-use solution enabling data acquisition with high temporal frequency and spatial resolution. Compared to Light Detection And Ranging (LiDAR) or Terrestrial Laser Scanning (TLS), this solution produces Digital Surface Models (DSMs) with similar accuracy. To evaluate DSM accuracy in a coastal environment, a campaign was carried out with a flying wing (eBee) combined with a digital camera. Using the Photoscan software and a photogrammetric workflow (Structure From Motion algorithm), a DSM and an orthomosaic were produced. The DSM accuracy was estimated by comparison with GNSS surveys. Two parameters were tested: the influence of the methodology (number and distribution of Ground Control Points, GCPs) and the influence of spatial image resolution (4.6 cm vs 2 cm). The results show that this solution is able to reproduce the topography of a coastal area with high vertical accuracy (< 10 cm). Georeferencing the DSM requires a homogeneous distribution and a large number of GCPs. Accuracy is correlated with the number of GCPs (using 19 GCPs instead of 10 reduces the difference by 4 cm); the required accuracy should depend on the research question. Last, in this particular environment, the presence of very small water surfaces on the sand bank does not allow the accuracy to improve when the spatial resolution of the images is decreased.
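The vertical accuracy check described above amounts to computing the RMSE of DSM elevations against GNSS check points. The elevation pairs below are invented for illustration; only the comparison itself follows the abstract.

```python
import numpy as np

# Vertical accuracy of a UAV-derived DSM checked against GNSS survey
# points. These elevation pairs are invented example values.
dsm_z = np.array([2.15, 3.42, 1.98, 4.51, 2.77])    # DSM elevations (m)
gnss_z = np.array([2.10, 3.47, 2.03, 4.46, 2.81])   # GNSS check points (m)

rmse = float(np.sqrt(np.mean((dsm_z - gnss_z) ** 2)))   # vertical RMSE (m)
print(round(rmse * 100, 1), "cm")
```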

  4. Assessing and ensuring GOES-R magnetometer accuracy

    NASA Astrophysics Data System (ADS)

    Carter, Delano; Todirita, Monica; Kronenwetter, Jeffrey; Dahya, Melissa; Chu, Donald

    2016-05-01

    The GOES-R magnetometer subsystem accuracy requirement is 1.7 nanoteslas (nT). During quiet times (100 nT), accuracy is defined as absolute mean plus 3 sigma error per axis. During storms (300 nT), accuracy is defined as absolute mean plus 2 sigma error per axis. Error comes both from outside the magnetometers, e.g. spacecraft fields and misalignments, as well as inside, e.g. zero offset and scale factor errors. Because zero offset and scale factor drift over time, it will be necessary to perform annual calibration maneuvers. To predict performance before launch, we have used Monte Carlo simulations and covariance analysis. With the proposed calibration regimen, both suggest that the magnetometer subsystem will meet its accuracy requirements.

  5. Probabilistic Risk Assessment of disassembly procedures

    SciTech Connect

    O'Brien, D.A.; Bement, T.R.; Letellier, B.C.

    1993-10-01

    Probabilistic Risk (Safety) Assessment (PRA or PSA) is an analytic methodology for identifying the combinations of events that, if they occur, lead to accidents. Accidents are defined as events causing loss or injury to people, property, or the environment. PRA also provides a method for estimating the frequency of occurrence of each combination of events and the consequences of each accident. The Los Alamos effort for this study is summarized as follows: the focus of the Los Alamos study was on evaluating the risks specifically associated with disassembling a Los Alamos-designed device. The PRA for the disassembly operation included a detailed evaluation only for those potential accident sequences which could lead to significant off-site consequences and affect public health. The overall purpose of this study was to investigate the feasibility of a risk consequence goal for DOE operations. Often called a Level 3 PRA (or PSA), the methods are general and can, with little modification, be applied to other procedures or processes.

  6. Pixels, Blocks of Pixels, and Polygons: Choosing a Spatial Unit for Thematic Accuracy Assessment

    EPA Science Inventory

    Pixels, polygons, and blocks of pixels are all potentially viable spatial assessment units for conducting an accuracy assessment. We develop a statistical population-based framework to examine how the spatial unit chosen affects the outcome of an accuracy assessment. The populati...

  7. Does it Make a Difference? Investigating the Assessment Accuracy of Teacher Tutors and Student Tutors

    ERIC Educational Resources Information Center

    Herppich, Stephanie; Wittwer, Jörg; Nückles, Matthias; Renkl, Alexander

    2013-01-01

    Tutors often have difficulty with accurately assessing a tutee's understanding. However, little is known about whether the professional expertise of tutors influences their assessment accuracy. In this study, the authors examined the accuracy with which 21 teacher tutors and 25 student tutors assessed a tutee's understanding of the human…

  8. A statistical filtering procedure to improve the accuracy of estimating population parameters in feed composition databases.

    PubMed

    Yoder, P S; St-Pierre, N R; Weiss, W P

    2014-09-01

    Accurate estimates of mean nutrient composition of feeds, nutrient variance (i.e., standard deviation), and covariance (i.e., correlation) are needed to develop a more quantitative approach of formulating diets to reduce risk and optimize safety factors. Commercial feed-testing laboratories have large databases of composition values for many feeds, but because of potentially misidentified feeds or poorly defined feed names, these databases are possibly contaminated by incorrect results and could generate inaccurate statistics. The objectives of this research were to (1) design a procedure (also known as a mathematical filter) that generates accurate estimates of the first 2 moments [i.e., the mean and (co)variance] of the nutrient distributions for the largest subpopulation within a feed in the presence of outliers and multiple subpopulations, and (2) use the procedure to generate feed composition tables with accurate means, variances, and correlations. Feed composition data (>1,300,000 samples) were collected from 2 major US commercial laboratories. A combination of a univariate step and 2 multivariate steps (principal components analysis and cluster analysis) was used to filter the data. On average, 13.5% of the total samples of a particular feed population were removed, of which the multivariate steps removed the majority (66% of removed samples). For some feeds, inaccurate identification (e.g., corn gluten feed samples included in the corn gluten meal population) was a primary reason for outliers, whereas for other feeds, subpopulations of a broader population were identified (e.g., immature alfalfa silage within a broad population of alfalfa silage). Application of the procedure did not usually affect the mean concentration of nutrients but greatly reduced the standard deviation and often changed the correlation estimates among nutrients. More accurate estimates of the variation of feeds and how they tend to vary will improve the economic evaluation of feeds.
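The two-stage filtering idea above can be sketched with a univariate screen followed by a multivariate screen. This numpy-only sketch uses a Mahalanobis-distance cut in place of the paper's principal components and cluster analysis, and the synthetic "feed" data, thresholds, and subpopulation sizes are all invented assumptions.

```python
import numpy as np

# Synthetic two-nutrient feed data: a main subpopulation plus a small
# cluster of misidentified samples (both invented for illustration).
rng = np.random.default_rng(2)
good = rng.multivariate_normal([30.0, 8.0], [[4.0, 2.0], [2.0, 3.0]], 500)
bad = rng.multivariate_normal([60.0, 2.0], [[1.0, 0.0], [0.0, 1.0]], 25)
x = np.vstack([good, bad])

# Step 1: univariate screen. Drop samples more than 4 SD from the
# median on any nutrient (threshold is an assumption).
z = np.abs(x - np.median(x, axis=0)) / x.std(axis=0)
x1 = x[(z < 4.0).all(axis=1)]

# Step 2: multivariate screen via squared Mahalanobis distance,
# cut at the chi-square (2 df) 99% point.
centered = x1 - x1.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(centered, rowvar=False))
d2 = np.einsum("ij,jk,ik->i", centered, inv_cov, centered)
x2 = x1[d2 < 9.21]

removed_frac = 1.0 - len(x2) / len(x)   # fraction of samples filtered out
```

As in the abstract, filtering barely moves the mean of the main subpopulation but removes the contaminating cluster, shrinking the apparent variance.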

  9. Constraint on Absolute Accuracy of Metacomprehension Assessments: The Anchoring and Adjustment Model vs. the Standards Model

    ERIC Educational Resources Information Center

    Kwon, Heekyung

    2011-01-01

    The objective of this study is to provide a systematic account of three typical phenomena surrounding absolute accuracy of metacomprehension assessments: (1) the absolute accuracy of predictions is typically quite low; (2) there exist individual differences in absolute accuracy of predictions as a function of reading skill; and (3) postdictions…

  10. ASSESSING THE ACCURACY OF NATIONAL LAND COVER DATASET AREA ESTIMATES AT MULTIPLE SPATIAL EXTENTS

    EPA Science Inventory

    Site specific accuracy assessments provide fine-scale evaluation of the thematic accuracy of land use/land cover (LULC) datasets; however, they provide little insight into LULC accuracy across varying spatial extents. Additionally, LULC data are typically used to describe lands...

  11. Bilingual Language Assessment: A Meta-Analysis of Diagnostic Accuracy

    ERIC Educational Resources Information Center

    Dollaghan, Christine A.; Horner, Elizabeth A.

    2011-01-01

    Purpose: To describe quality indicators for appraising studies of diagnostic accuracy and to report a meta-analysis of measures for diagnosing language impairment (LI) in bilingual Spanish-English U.S. children. Method: The authors searched electronically and by hand to locate peer-reviewed English-language publications meeting inclusion criteria;…

  12. Assessing genomic selection prediction accuracy in a dynamic barley breeding

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Genomic selection is a method to improve quantitative traits in crops and livestock by estimating breeding values of selection candidates using phenotype and genome-wide marker data sets. Prediction accuracy has been evaluated through simulation and cross-validation, however validation based on prog...

  13. Pollutant Assessments Group Procedures Manual: Volume 1, Administrative and support procedures

    SciTech Connect

    Not Available

    1992-03-01

    This manual describes procedures currently in use by the Pollutant Assessments Group. The manual is divided into two volumes: Volume 1 includes administrative and support procedures, and Volume 2 includes technical procedures. These procedures are revised in an ongoing process to incorporate new developments in hazardous waste assessment technology and changes in administrative policy. Format inconsistencies will be corrected in subsequent revisions of individual procedures. The purpose of the Pollutant Assessments Group Procedures Manual is to provide a standardized set of procedures documenting in an auditable manner the activities performed by the Pollutant Assessments Group (PAG) of the Health and Safety Research Division (HASRD) of the Environmental Measurements and Applications Section (EMAS) at Oak Ridge National Laboratory (ORNL). The Procedures Manual ensures that the organizational, administrative, and technical activities of PAG conform properly to protocol outlined by funding organizations. This manual also ensures that the techniques and procedures used by PAG and other contractor personnel meet the requirements of applicable governmental, scientific, and industrial standards. The Procedures Manual is sufficiently comprehensive for use by PAG and contractor personnel in the planning, performance, and reporting of project activities and measurements. The Procedures Manual provides procedures for conducting field measurements and includes program planning, equipment operation, and quality assurance elements. Successive revisions of this manual will be archived in the PAG Document Control Department to facilitate tracking of the development of specific procedures.

  14. Accuracy of DIF Estimates and Power in Unbalanced Designs Using the Mantel-Haenszel DIF Detection Procedure

    ERIC Educational Resources Information Center

    Paek, Insu; Guo, Hongwen

    2011-01-01

    This study examined how much improvement was attainable with respect to accuracy of differential item functioning (DIF) measures and DIF detection rates in the Mantel-Haenszel procedure when employing focal and reference groups with notably unbalanced sample sizes where the focal group has a fixed small sample which does not satisfy the minimum…

  15. Recent Advances in Image Assisted Neurosurgical Procedures: Improved Navigational Accuracy and Patient Safety

    ScienceCinema

    Olivi, Alessandro, M.D.

    2016-07-12

    Neurosurgical procedures require precise planning and intraoperative support. Recent advances in image guided technology have provided neurosurgeons with improved navigational support for more effective and safer procedures. A number of exemplary cases will be presented.

  16. Recent Advances in Image Assisted Neurosurgical Procedures: Improved Navigational Accuracy and Patient Safety

    SciTech Connect

    Olivi, Alessandro, M.D.

    2010-08-28

    Neurosurgical procedures require precise planning and intraoperative support. Recent advances in image guided technology have provided neurosurgeons with improved navigational support for more effective and safer procedures. A number of exemplary cases will be presented.

  17. 78 FR 29071 - Assessment of Mediation and Arbitration Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-17

    ... Assessment of Mediation and Arbitration Procedures, 75 FR 52054. Assessment of Mediation and Arbitration... Resolution, 60 FR 19494, 19499-500 (April 19, 1995) (codified at 18 CFR 385.605 (Rule 605)) (describing...

  18. The Measurement of Values: Effects of Different Assessment Procedures

    ERIC Educational Resources Information Center

    Feather, N. T.

    1973-01-01

    Rating and pair-comparison procedures for assessing the importance of terminal and instrumental values were compared with the standard ranking procedure developed by Rokeach. Effects of order of presentation of the value sets were also investigated. Neither procedure nor order had a replicable effect, though some sex differences were apparent. (TO)

  19. Combining accuracy assessment of land-cover maps with environmental monitoring programs

    USGS Publications Warehouse

    Stehman, S.V.; Czaplewski, R.L.; Nusser, S.M.; Yang, L.; Zhu, Z.

    2000-01-01

    A scientifically valid accuracy assessment of a large-area, land-cover map is expensive. Environmental monitoring programs offer a potential source of data to partially defray the cost of accuracy assessment while still maintaining the statistical validity. In this article, three general strategies for combining accuracy assessment and environmental monitoring protocols are described. These strategies range from a fully integrated accuracy assessment and environmental monitoring protocol, to one in which the protocols operate nearly independently. For all three strategies, features critical to using monitoring data for accuracy assessment include compatibility of the land-cover classification schemes, precisely co-registered sample data, and spatial and temporal compatibility of the map and reference data. Two monitoring programs, the National Resources Inventory (NRI) and the Forest Inventory and Monitoring (FIM), are used to illustrate important features for implementing a combined protocol.

  20. Assessment of Exceptional Students: Educational and Psychological Procedures. Sixth Edition.

    ERIC Educational Resources Information Center

    Taylor, Ronald L.

    This text on the assessment of students with disabilities is divided into six major parts. Part 1, "Introduction to Assessment, Issues and Concerns," discusses the historical, philosophical, legal, practical and ethical bases of assessment and proposes an assessment model. Part 2, "Informal Procedures: Basic Tools for Teachers," includes chapters…

  1. Teacher Growth and Assessment Process Procedural Handbook

    ERIC Educational Resources Information Center

    Howard, Barbara B.

    2005-01-01

    Teacher Growth and Assessment (TGA) is a comprehensive teacher evaluation system that includes structures for both accountability and professional growth, taking teacher evaluation to a new level. TGA provides the opportunity to use teacher evaluation data to plan professional development, involve teachers in self-assessment, and structure…

  2. Precision and accuracy of visual foliar injury assessments

    SciTech Connect

    Gumpertz, M.L.; Tingey, D.T.; Hogsett, W.E.

    1982-07-01

    The study compared three measures of foliar injury: (i) mean percent leaf area injured of all leaves on the plant, (ii) mean percent leaf area injured of the three most injured leaves, and (iii) the proportion of injured leaves to total number of leaves. For the first measure, the variation caused by reader biases and day-to-day variations was compared with the innate plant-to-plant variation. Bean (Phaseolus vulgaris 'Pinto'), pea (Pisum sativum 'Little Marvel'), radish (Rhaphanus sativus 'Cherry Belle'), and spinach (Spinacia oleracea 'Northland') plants were exposed to either 3 µL L⁻¹ SO₂ or 0.3 µL L⁻¹ ozone for 2 h. Three leaf readers visually assessed the percent injury on every leaf of each plant while a fourth reader used a transparent grid to make an unbiased assessment for each plant. The mean leaf area injured of the three most injured leaves was highly correlated with all leaves on the plant only if the three most injured leaves were <100% injured. The proportion of leaves injured was not highly correlated with percent leaf area injured of all leaves on the plant for any species in this study. The largest source of variation in visual assessments was plant-to-plant variation, which ranged from 44 to 97% of the total variance, followed by variation among readers (0-32% of the variance). Except for radish exposed to ozone, the day-to-day variation accounted for <18% of the total. Reader bias in assessment of ozone injury was significant but could be adjusted for each reader by a simple linear regression (R² = 0.89-0.91) of the visual assessments against the grid assessments.
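The reader-bias adjustment described above is a simple linear regression of one reader's visual assessments against the unbiased grid assessments. The paired percentages below are invented example values, not the study's data.

```python
import numpy as np

# Regress unbiased grid assessments on one reader's visual estimates
# to obtain a per-reader bias correction. Values are invented.
grid = np.array([5.0, 10.0, 20.0, 35.0, 50.0, 70.0])    # grid (unbiased) % injury
visual = np.array([8.0, 14.0, 26.0, 40.0, 58.0, 80.0])  # one reader's estimates

slope, intercept = np.polyfit(visual, grid, 1)
adjusted = slope * visual + intercept   # reader-corrected estimates

# Goodness of the adjustment, analogous to the R^2 the abstract reports.
r2 = np.corrcoef(grid, adjusted)[0, 1] ** 2
```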

  3. Vestibular and Oculomotor Assessments May Increase Accuracy of Subacute Concussion Assessment.

    PubMed

    McDevitt, J; Appiah-Kubi, K O; Tierney, R; Wright, W G

    2016-08-01

    In this study, we collected and analyzed preliminary data for the internal consistency of a new condensed model to assess vestibular and oculomotor impairments following a concussion. We also examined this model's ability to discriminate concussed athletes from healthy controls. Each participant was tested in a concussion assessment protocol that consisted of the Neurocom Sensory Organization Test (SOT), the Balance Error Scoring System exam, and a series of 8 vestibular and oculomotor assessments. Of these 10 assessments, only the SOT, near point convergence, and the signs and symptoms (S/S) scores collected following optokinetic stimulation, the horizontal eye saccades test, and the gaze stabilization test were significantly correlated with health status, and were used in further analyses. Multivariate logistic regression for binary outcomes was employed, and the beta weights were used to calculate the area under the receiver operating characteristic curve (AUC). The best model supported by our findings suggests that an exam consisting of the 4 SOT sensory ratios, near point convergence, and the optokinetic stimulation signs and symptoms score is sensitive in discriminating concussed athletes from healthy controls (accuracy=98.6%, AUC=0.983). However, an even more parsimonious model consisting of only the optokinetic stimulation and gaze stabilization test S/S scores and near point convergence was found to be a sensitive model for discriminating concussed athletes from healthy controls (accuracy=94.4%, AUC=0.951) without the need for expensive equipment. Although more investigation is needed, these findings will be helpful to health professionals, potentially providing them with a sensitive and specific battery of simple vestibular and oculomotor assessments for concussion management. PMID:27176886
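The AUC reported above can be computed without any fitted model via its rank-based (Mann-Whitney) form: the fraction of (concussed, healthy) pairs the score orders correctly. The score vectors below are invented composite symptom scores, not the study's data.

```python
import numpy as np

def auc(pos_scores, neg_scores):
    """Rank-based AUC: fraction of (positive, negative) pairs ranked
    correctly, counting ties as half."""
    pos = np.asarray(pos_scores, dtype=float)
    neg = np.asarray(neg_scores, dtype=float)
    diffs = pos[:, None] - neg[None, :]
    return (diffs > 0).mean() + 0.5 * (diffs == 0).mean()

# Invented composite symptom scores for illustration.
concussed = [4.1, 3.8, 5.0, 2.9, 4.6]
healthy = [1.2, 0.8, 2.5, 3.0, 1.1]

print(auc(concussed, healthy))   # 0.96: 24 of 25 pairs correctly ordered
```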

  4. Assessing Data Accuracy When Involving Students in Authentic Paleontological Research.

    ERIC Educational Resources Information Center

    Harnik, Paul G.; Ross, Robert M.

    2003-01-01

    Regards Student-Scientist Partnerships (SSPs) as beneficial collaborations for both students and researchers. Introduces the Paleontological Research Institution (PRI), which developed and pilot tested an SSP that involved grade 4-9 students in paleontological research on Devonian marine fossil assemblages. Reports formative data assessment and…

  5. An assessment of the accuracy of orthotropic photoelasticity

    NASA Technical Reports Server (NTRS)

    Hyer, M. W.; Liu, D. H.

    1984-01-01

    The accuracy of orthotropic photoelasticity was studied. The study consisted of both theoretical and experimental phases. In the theoretical phase a stress-optic law was developed. The stress-optic law included the effects of residual birefringence in the relation between applied stress and the material's optical response. The experimental phase had several portions. First, it was shown that four-point bending tests and the concept of an optical neutral axis could be conveniently used to calibrate the stress-optic behavior of the material. Second, the actual optical response of an orthotropic disk in diametral compression was compared with theoretical predictions. Third, the stresses in the disk were determined from the observed optical response, the stress-optic law, and a finite-difference form of the plane stress equilibrium equations. It was concluded that orthotropic photoelasticity is not as accurate as isotropic photoelasticity. This is believed to be due to the lack of good fringe resolution and the low sensitivity of most orthotropic photoelastic materials.

  6. Accuracy assessment and automation of free energy calculations for drug design.

    PubMed

    Christ, Clara D; Fox, Thomas

    2014-01-27

    As the free energy of binding of a ligand to its target is one of the crucial optimization parameters in drug design, its accurate prediction is highly desirable. In the present study we have assessed the average accuracy of free energy calculations for a total of 92 ligands binding to five different targets. To make this study and future larger scale applications possible we automated the setup procedure. Starting from user defined binding modes, the procedure decides which ligands to connect via a perturbation based on maximum common substructure criteria and produces all necessary parameter files for free energy calculations in AMBER 11. For the systems investigated, errors due to insufficient sampling were found to be substantial in some cases whereas differences in estimators (thermodynamic integration (TI) versus multistate Bennett acceptance ratio (MBAR)) were found to be negligible. Analytical uncertainty estimates calculated from a single free energy calculation were found to be much smaller than the sample standard deviation obtained from two independent free energy calculations. Agreement with experiment was found to be system dependent ranging from excellent to mediocre (RMSE = [0.9, 8.2, 4.7, 5.7, 8.7] kJ/mol). When restricting analyses to free energy calculations with sample standard deviations below 1 kJ/mol agreement with experiment improved (RMSE = [0.8, 6.9, 1.8, 3.9, 5.6] kJ/mol).
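Of the two estimators the abstract compares, thermodynamic integration (TI) is the simpler to sketch: the free energy difference is the integral over the coupling parameter λ of the ensemble average ⟨dU/dλ⟩. The smooth ⟨dU/dλ⟩ profile below is invented; in practice each value comes from a separate simulation window.

```python
import numpy as np

# Thermodynamic integration over lambda windows. The <dU/dlambda>
# profile is a hypothetical smooth example, not simulation output.
lam = np.linspace(0.0, 1.0, 11)      # coupling parameter windows
dudl = 10.0 * (1.0 - lam)            # hypothetical <dU/dlambda> (kJ/mol)

# Trapezoidal-rule integration over the windows.
delta_g = float(((dudl[:-1] + dudl[1:]) / 2.0 * np.diff(lam)).sum())
```

Sampling error in each ⟨dU/dλ⟩ average propagates into ΔG, which is why the abstract finds independent repeat calculations give larger spreads than single-run analytical uncertainty estimates.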

  7. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  8. Accuracy of peak VO2 assessments in career firefighters

    PubMed Central

    2011-01-01

    Background Sudden cardiac death is the leading cause of on-duty death in United States firefighters. Accurately assessing cardiopulmonary capacity is critical to preventing, or reducing, cardiovascular events in this population. Methods A total of 83 male firefighters performed Wellness-Fitness Initiative (WFI) maximal exercise treadmill tests and direct peak VO2 assessments to volitional fatigue. Of the 83, 63 completed WFI sub-maximal exercise treadmill tests for comparison to directly measured peak VO2 and historical estimations. Results Maximal heart rates were overestimated by the traditional 220-age equation by about 5 beats per minute (p < 0.001). Peak VO2 was overestimated by the WFI maximal exercise treadmill test and the historical WFI sub-maximal estimation by ~1 MET and ~2 METs, respectively (p < 0.001). The revised 2008 WFI sub-maximal treadmill estimation was found to accurately estimate peak VO2 when compared to directly measured peak VO2. Conclusion Accurate assessment of cardiopulmonary capacity is critical in determining appropriate duty assignments, and identification of potential cardiovascular problems, for firefighters. Estimation of cardiopulmonary fitness improves using the revised 2008 WFI sub-maximal equation. PMID:21943154
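The age-predicted maximal heart rate comparison above is a one-line calculation. The measured treadmill maxima below are invented to mirror the ~5 bpm overestimate the abstract reports.

```python
# Compare the traditional age-predicted maximal heart rate (220 - age)
# with hypothetical measured treadmill maxima (invented values).
ages = [25, 30, 35, 40, 45]
measured_hrmax = [190, 184, 181, 176, 169]   # hypothetical measured maxima (bpm)

predicted = [220 - a for a in ages]          # age-predicted maxima (bpm)
bias = sum(p - m for p, m in zip(predicted, measured_hrmax)) / len(ages)
print(bias)   # mean overestimate in beats per minute
```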

  9. The Attribute Accuracy Assessment of Land Cover Data in the National Geographic Conditions Survey

    NASA Astrophysics Data System (ADS)

    Ji, X.; Niu, X.

    2014-04-01

    With the widespread national survey of geographic conditions, object-based data has become the most common data organization pattern in land cover research. Assessing the accuracy of object-based land cover data bears on many stages of data production, such as the efficiency of in-house production and the quality of the final land cover data. There is therefore great demand for accuracy assessment of object-based classification maps. Traditional approaches to accuracy assessment in surveying and mapping are not aimed at land cover data, so it is necessary to employ the accuracy assessment used in imagery classification. However, traditional pixel-based accuracy assessment methods are inadequate for these requirements. Our improved measures are based on the error matrix and use objects as sample units, because pixel sample units are not suitable for assessing the accuracy of object-based classification results. Compared to pixel samples, object samples are no longer uniform in size. To make the indexes generated from the error matrix reliable, we use the areas of the object samples as weights when establishing the error matrix of an object-based image classification map. We compare two error matrices, one built from the number of object samples and one from the total area of object samples. The error matrix using the total area of object samples proves to be an intuitive, useful technique for reflecting the actual accuracy of object-based imagery classification results.
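The area-weighted error matrix described above can be sketched as follows: each sample object contributes its area to the matrix cell for its (mapped, reference) class pair, rather than a count of 1. The classes, sample objects, and areas below are invented for illustration.

```python
import numpy as np

# Area-weighted error matrix for object-based accuracy assessment.
# Rows = mapped class, columns = reference class. All values invented.
classes = ["water", "forest", "urban"]
samples = [  # (mapped class, reference class, object area in m^2)
    ("water", "water", 500), ("water", "forest", 100),
    ("forest", "forest", 800), ("forest", "urban", 50),
    ("urban", "urban", 300), ("urban", "forest", 250),
]

idx = {c: i for i, c in enumerate(classes)}
matrix = np.zeros((len(classes), len(classes)))
for mapped, ref, area in samples:
    matrix[idx[mapped], idx[ref]] += area   # weight each object by its area

# Overall accuracy: correctly classified area over total sampled area.
overall_accuracy = np.trace(matrix) / matrix.sum()
print(round(overall_accuracy, 3))
```

With counts instead of areas, the same six objects would give 3/6 correct per class pair counted equally; weighting by area lets large objects dominate, which is the point of the abstract's comparison.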

  10. Mapping with Small UAS: A Point Cloud Accuracy Assessment

    NASA Astrophysics Data System (ADS)

    Toth, Charles; Jozkow, Grzegorz; Grejner-Brzezinska, Dorota

    2015-12-01

    Interest in using inexpensive Unmanned Aerial System (UAS) technology for topographic mapping has recently increased significantly. Small UAS platforms equipped with consumer-grade cameras can easily acquire high-resolution aerial imagery, allowing for dense point cloud generation, followed by surface model creation and orthophoto production. In contrast to conventional airborne mapping systems, UAS has limited ground coverage due to its low flying height and limited flying time, yet it offers an attractive alternative to high-performance airborne systems, as the cost of the sensors and platform, and of the flight logistics, is relatively low. In addition, UAS is better suited to small-area data acquisitions and to acquiring data in difficult-to-access areas, such as urban canyons or densely built-up environments. The main question with respect to the use of UAS is whether the inexpensive consumer sensors installed on UAS platforms can provide geospatial data quality comparable to that provided by conventional systems. This study aims at the performance evaluation of the current practice of UAS-based topographic mapping by reviewing the practical aspects of sensor configuration, georeferencing and point cloud generation, including comparisons between sensor types and processing tools. The main objective is to provide accuracy characterization and practical information for selecting and using UAS solutions in general mapping applications. The analysis is based on statistical evaluation as well as visual examination of experimental data acquired by a Bergen octocopter with three different image sensor configurations: a GoPro HERO3+ Black Edition, a Nikon D800 DSLR and a Velodyne HDL-32. In addition, georeferencing data of varying quality were acquired and evaluated. The optical imagery was processed using three commercial point cloud generation tools, and point clouds created by active and passive sensors of differing quality were compared.
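    Accuracy characterization of this kind typically reduces to comparing mapped elevations against independently surveyed checkpoints. A minimal sketch of the usual vertical-accuracy statistics (bias, RMSE, precision), using hypothetical checkpoint values rather than the study's data:

```python
import numpy as np

# Hypothetical checkpoint elevations (m) and nearest UAS point-cloud elevations (m)
z_check = np.array([101.32, 98.75, 102.10, 99.48, 100.87])
z_cloud = np.array([101.35, 98.69, 102.18, 99.41, 100.95])

err = z_cloud - z_check
mean_err = err.mean()               # systematic bias
rmse = np.sqrt(np.mean(err ** 2))   # overall vertical accuracy
std_err = err.std(ddof=1)           # precision about the bias

print(f"mean error: {mean_err:+.3f} m, RMSE: {rmse:.3f} m, std: {std_err:.3f} m")
```

    RMSE mixes bias and spread, which is why accuracy reports usually quote bias and standard deviation alongside it.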

  11. Attribute-Level and Pattern-Level Classification Consistency and Accuracy Indices for Cognitive Diagnostic Assessment

    ERIC Educational Resources Information Center

    Wang, Wenyi; Song, Lihong; Chen, Ping; Meng, Yaru; Ding, Shuliang

    2015-01-01

    Classification consistency and accuracy are viewed as important indicators for evaluating the reliability and validity of classification results in cognitive diagnostic assessment (CDA). Pattern-level classification consistency and accuracy indices were introduced by Cui, Gierl, and Chang. However, the indices at the attribute level have not yet…
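    The attribute/pattern distinction can be made concrete with a toy computation: pattern-level consistency requires the whole attribute profile to agree across two classifications, while attribute-level consistency is scored per attribute. The profiles below are hypothetical, not from the study:

```python
import numpy as np

# Hypothetical attribute profiles from two parallel classifications
# (rows = examinees, columns = binary attribute masteries)
a = np.array([[1, 0, 1], [1, 1, 0], [0, 0, 1], [1, 1, 1]])
b = np.array([[1, 0, 1], [1, 0, 0], [0, 0, 1], [1, 1, 0]])

# Pattern-level consistency: the whole profile must match
pattern = np.mean(np.all(a == b, axis=1))

# Attribute-level consistency: agreement rate per attribute
attribute = np.mean(a == b, axis=0)

print(pattern)    # 0.5 -- only half the examinees get identical profiles
print(attribute)  # per-attribute agreement is higher than pattern agreement
```

    Pattern-level indices are necessarily no larger than the product of perfect per-attribute agreements, which is why the two levels are reported separately.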

  12. Probabilistic Digital Elevation Model Generation For Spatial Accuracy Assessment

    NASA Astrophysics Data System (ADS)

    Jalobeanu, A.

    2008-12-01

    are presented. A pair of images (including the nadir view) at 30m resolution was used to obtain a DEM with a vertical accuracy better than 10m in well-textured areas. The lack of information in smooth regions naturally led to large uncertainty estimates.

  13. Thermal radiation view factor: Methods, accuracy and computer-aided procedures

    NASA Technical Reports Server (NTRS)

    Kadaba, P. V.

    1982-01-01

    Computer-aided thermal analysis programs that predict whether orbiting equipment will remain within a predetermined acceptable temperature range, in various attitudes with respect to the Sun and the Earth, are examined. The complexity of the surface geometries suggests the use of numerical schemes for the determination of the view factors. Basic definitions and standard methods, which form the basis for various digital computer methods, and various numerical methods are presented. The physical model and the mathematical methods on which a number of available programs are built are summarized. The strengths and weaknesses of the methods employed, the accuracy of the calculations and the time required for computations are evaluated. The situations where accuracies are important for energy calculations are identified, and methods to save computational time are proposed. A guide to the best use of the available programs at several centers and future choices for efficient use of digital computers are included in the recommendations.
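    Analytic view factors are commonly used to validate numerical schemes of this kind. As one standard closed-form example (not specific to this report), the view factor between two coaxial parallel disks:

```python
import math

def vf_coaxial_disks(r1, r2, h):
    """View factor F12 from a disk of radius r1 to a coaxial,
    parallel disk of radius r2 at separation h (standard closed form)."""
    R1, R2 = r1 / h, r2 / h
    S = 1.0 + (1.0 + R2 ** 2) / R1 ** 2
    return 0.5 * (S - math.sqrt(S ** 2 - 4.0 * (R2 / R1) ** 2))

F12 = vf_coaxial_disks(1.0, 1.0, 1.0)
print(round(F12, 4))  # 0.382 for equal disks at spacing == radius
```

    The closely spaced limit (F12 approaching 1) and the far-field limit (F12 approaching (r2/h)^2) provide quick sanity checks on a numerical view-factor code.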

  14. TADS Needs Assessment Procedures Manual, Summer 1980.

    ERIC Educational Resources Information Center

    Black, Talbot; And Others

    The TADS (Technical Assistance Development System) Needs Assessment Manual is designed to guide the comprehensive review of Handicapped Children's Early Education Program (HCEEP) demonstration projects in identifying technical assistance needs. An introduction reviews the TADS technical assistance model which includes program planning, needs…

  15. Procedural ultrasound in pediatric patients: techniques and tips for accuracy and safety.

    PubMed

    Lin, Sophia

    2016-06-01

    Point-of-care ultrasound is becoming more prevalent in pediatric emergency departments as a critical adjunct to both diagnosis and procedure guidance. It is cost-effective, safe for unstable patients, and easily repeatable as a patient's clinical status changes, and it does not expose the patient to ionizing radiation. Because point-of-care ultrasound in pediatric emergency medicine is relatively new, the body of literature evaluating its utility is small, but growing. Data from adult emergency medicine, radiology, critical care, and anesthesia evaluating the utility of ultrasound guidance must therefore be extrapolated to pediatric emergency medicine. This issue will review the adult literature and the available pediatric literature comparing ultrasound guidance to more traditional approaches. Methods for using ultrasound guidance to perform various procedures, and the pitfalls associated with each procedure, will also be described.

  17. Assessment of RFID Read Accuracy for ISS Water Kit

    NASA Technical Reports Server (NTRS)

    Chu, Andrew

    2011-01-01

    The Space Life Sciences Directorate/Medical Informatics and Health Care Systems Branch (SD4) is assessing the benefits of Radio Frequency Identification (RFID) technology for tracking items flown onboard the International Space Station (ISS). As an initial study, the Avionic Systems Division Electromagnetic Systems Branch (EV4) is collaborating with SD4 to affix RFID tags to a water kit supplied by SD4 and to study the read success rate of the tagged items. The tagged water kit inside a Cargo Transfer Bag (CTB) was inventoried using three different RFID technologies: the Johnson Space Center Building 14 Wireless Habitat Test Bed RFID portal, an RFID hand-held reader being targeted for use on board the ISS, and an RFID enclosure designed and prototyped by EV4.

  18. Accuracy of virtual models in the assessment of maxillary defects

    PubMed Central

    Kurşun, Şebnem; Kılıç, Cenk; Özen, Tuncer

    2015-01-01

    Purpose This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Materials and Methods Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields-of-views (FOVs) and voxel sizes: 1) 60×60 mm FOV, 0.125 mm3 (FOV60); 2) 80×80 mm FOV, 0.160 mm3 (FOV80); and 3) 100×100 mm FOV, 0.250 mm3 (FOV100). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicon impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with impressions obtained by scanning silicon models. Gold standard volumes of the impression models were then compared with CBCT and 3D scanner measurements. Further, the general linear model was used, and the significance was set to p=0.05. Results A comparison of the results obtained by the observers and methods revealed the p values to be smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers along with the different cadaver specimens used. Further, the 3D scanner measurements were closer to the gold standard measurements when compared to the CBCT measurements. Conclusion In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements. PMID:25793180

  19. Evaluating the Effect of Learning Style and Student Background on Self-Assessment Accuracy

    ERIC Educational Resources Information Center

    Alaoutinen, Satu

    2012-01-01

    This study evaluates a new taxonomy-based self-assessment scale and examines factors that affect assessment accuracy and course performance. The scale is based on Bloom's Revised Taxonomy and is evaluated by comparing students' self-assessment results with course performance in a programming course. Correlation has been used to reveal possible…

  20. 12 CFR 630.5 - Accuracy of reports and assessment of internal control over financial reporting.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CREDIT SYSTEM General § 630.5 Accuracy of reports and assessment of internal control over financial... assessment of internal control over financial reporting. (1) Annual reports must include a report by the Funding Corporation's management assessing the effectiveness of the internal control over...

  1. Update and review of accuracy assessment techniques for remotely sensed data

    NASA Technical Reports Server (NTRS)

    Congalton, R. G.; Heinen, J. T.; Oderwald, R. G.

    1983-01-01

    Research performed in the accuracy assessment of remotely sensed data is updated and reviewed. The use of discrete multivariate analysis techniques for the assessment of error matrices, the use of computer simulation for assessing various sampling strategies, and an investigation of spatial autocorrelation techniques are examined.

  2. Test-Induced Priming Impairs Source Monitoring Accuracy in the DRM Procedure

    ERIC Educational Resources Information Center

    Dewhurst, Stephen A.; Knott, Lauren M.; Howe, Mark L.

    2011-01-01

    Three experiments investigated the effects of test-induced priming (TIP) on false recognition in the Deese/Roediger-McDermott procedure (Deese, 1959; Roediger & McDermott, 1995). In Experiment 1, TIP significantly increased false recognition for participants who made old/new decisions at test but not for participants who made remember/know…

  3. Accuracy Assessment of GPS Buoy Sea Level Measurements for Coastal Applications

    NASA Astrophysics Data System (ADS)

    Chiu, S.; Cheng, K.

    2008-12-01

    The GPS buoy in this study consists of a geodetic antenna on a compact floater, with the GPS receiver and power supply tethered to a boat. Coastal applications using GPS include monitoring of sea level and its change, calibration of satellite altimeters, modeling of hydrological or geophysical parameters, seafloor geodesy, and others. For all of these applications, understanding the overall data or model quality requires knowledge of the positioning accuracy of GPS buoys or GPS-equipped vessels. Newer GPS data processing techniques, e.g., Precise Point Positioning (PPP) and the virtual reference station (VRS) approach, require a priori information obtained from a regional GPS network; while such a priori information can be obtained on land, it may not be available at sea. Hence, in this study, the GPS buoy was positioned with respect to an onshore GPS reference station using the traditional double-difference technique. Since the atmosphere decorrelates as the baseline (the distance between the buoy and the reference station) increases, the positioning accuracy decreases accordingly. This study therefore aims to assess the buoy position accuracy as the baseline increases, in order to quantify the upper limit of sea level measured by the GPS buoy. A GPS buoy campaign was conducted by National Chung Cheng University in An Ping, Taiwan, with an 8-hour GPS buoy data collection. In addition, a GPS network containing 4 Continuous GPS (CGPS) stations in Taiwan was established with the goal of enabling baselines of different lengths for buoy data processing. A vector relation from the network was utilized to find the correct ambiguities, which were applied to the long-baseline solution to eliminate the position error caused by incorrect ambiguities. After this procedure, a 3.6-cm discrepancy was found in the mean sea level solution between the long (~80 km) and the short (~1.5 km) baselines. The discrepancy between a
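    The double-difference technique mentioned above cancels receiver and satellite clock errors by differencing observations across two receivers and two satellites. A schematic sketch with hypothetical carrier-phase values (illustration only, not real GPS processing):

```python
def double_difference(phi_base, phi_rover, sat_ref, sat):
    """Form a double-difference observable from carrier-phase
    measurements (cycles) of two receivers to two satellites.
    Receiver clock errors cancel in each single difference;
    satellite clock errors cancel between the two."""
    sd_ref = phi_rover[sat_ref] - phi_base[sat_ref]  # single difference, reference satellite
    sd_sat = phi_rover[sat] - phi_base[sat]          # single difference, second satellite
    return sd_sat - sd_ref

# Hypothetical carrier-phase measurements (cycles) per satellite PRN
phi_base = {"G05": 123456.25, "G12": 98765.50}
phi_rover = {"G05": 123458.75, "G12": 98769.25}

dd = double_difference(phi_base, phi_rover, "G05", "G12")
print(dd)  # 1.25
```

    The remaining error terms (integer ambiguities, atmospheric delays) grow with baseline length, which is why the abstract's long-baseline solution needed an external ambiguity constraint.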

  4. Factor-Analytic Procedures for Assessing Response Pattern Scalability

    ERIC Educational Resources Information Center

    Ferrando, Pere J.

    2007-01-01

    This paper proposes procedures for assessing the fit of a psychometric model at the level of the individual respondent. The procedures are intended for personality measures made up of Likert-type items, which, in applied research, are usually analyzed by means of factor analysis. Two scalability indices are proposed, which can be considered as…

  5. [Hygienic assessment of innovation procedures for lesson scheduling at school].

    PubMed

    Stepanova, M I; Sazaniuk, Z I; Polenova, M A

    2012-01-01

    Hygienic assessment established that scheduling school lessons with the innovative procedures reduces the fatigue caused by the schooling load and optimizes the pupils' psychosomatic status and school routine.

  6. 77 FR 19591 - Assessment of Mediation and Arbitration Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ..., EP 699 (STB served Aug. 20, 2010). \\3\\ Assessment of Mediation and Arbitration Procedures, 75 FR 52... authorize parties to a proceeding before the Board, upon mutual request, to participate in mediation with...

  8. Bringing everyday mind reading into everyday life: assessing empathic accuracy with daily diary data.

    PubMed

    Howland, Maryhope; Rafaeli, Eshkol

    2010-10-01

    Individual differences in empathic accuracy (EA) can be assessed using daily diary methods as a complement to more commonly used lab-based behavioral observations. Using electronic dyadic diaries, we distinguished among elements of EA (i.e., accuracy in levels, scatter, and pattern, regarding both positive and negative moods) and examined them as phenomena at both the day and the person level. In a 3-week diary study of cohabiting partners, we found support for differentiating these elements. The proposed indices reflect differing aspects of accuracy, with considerable similarity among same-valenced accuracy indices. Overall there was greater accuracy regarding negative target moods than positive target moods. These methods and findings take the phenomenon of "everyday mindreading" (Ickes, 2003) into everyday life. We conclude by discussing empathic accuracies as a family of capacities for, or tendencies toward, accurate interpersonal sensitivity. Members of this family may have distinct associations with the perceiver's, target's, and relationship's well-being.

  9. Nanopositioning and nanomeasuring machine for high accuracy measuring procedures of small features in large areas

    NASA Astrophysics Data System (ADS)

    Manske, E.; Hausotte, T.; Mastylo, R.; Hofmann, N.; Jäger, G.

    2005-10-01

    Driven by increasing precision and accuracy requirements due to miniaturization and performance enhancement, measuring technologies need alternative ways of positioning, probing and measurement strategies. The paper describes the operation of the high-precision wide-scale three-dimensional nanopositioning and nanomeasuring machine (NPM-Machine), which has a resolution of 0.1 nm over the positioning and measuring range of 25 mm x 25 mm x 5 mm. The NPM-Machine has been developed by the Technische Universität Ilmenau and manufactured by SIOS Messtechnik GmbH Ilmenau. Three plane-mirror miniature interferometers and two angular sensors are arranged so as to realize zero-Abbe-offset measurements in all three coordinates. This device therefore closes a gap in coordinate-measuring technique regarding resolution, accuracy and measuring range. The machines are operating successfully in several German and foreign research institutes, including the Physikalisch-Technische Bundesanstalt (PTB). The integration of several optical and tactile probe systems and scanning force microscopes makes the NPM-Machine suitable for various tasks, such as large-area scanning probe microscopy, mask and wafer inspection, circuit testing, and measuring optical and mechanical precision workpieces such as micro lens arrays, concave lenses and step height standards.
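    The zero-Abbe-offset arrangement matters because, to first order, an angular guidance error couples into the length measurement in proportion to the offset between the measurement axis and the scale axis. A small illustrative calculation (generic first-order relation, not machine-specific):

```python
import math

def abbe_error(offset_m, tilt_arcsec):
    """First-order measurement error caused by an angular guidance
    error acting over an Abbe offset: e = offset * tan(theta)."""
    theta = math.radians(tilt_arcsec / 3600.0)
    return offset_m * math.tan(theta)

# A 1 arcsecond tilt over a 10 mm Abbe offset already costs tens of nanometres
e = abbe_error(0.010, 1.0)
print(f"{e * 1e9:.1f} nm")  # 48.5 nm
```

    With the offset driven to zero, this term vanishes to first order, which is what permits the 0.1 nm resolution claimed above.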

  10. Probabilistic risk assessment of disassembly procedures

    SciTech Connect

    O'Brien, D.A.; Bement, T.R.; Letellier, B.C.

    1993-11-01

    The purpose of this report is to describe the use of Probabilistic Risk (Safety) Assessment (PRA or PSA) at a Department of Energy (DOE) facility. PRA is a methodology for (i) identifying combinations of events that, if they occur, lead to accidents; (ii) estimating the frequency of occurrence of each combination of events; and (iii) estimating the consequences of each accident. Specifically, the study focused on evaluating the risks associated with disassembling a hazardous assembly. The PRA for the operation included a detailed evaluation only of those potential accident sequences which could lead to significant off-site consequences and affect public health. The overall purpose of this study was to investigate the feasibility of establishing a risk-consequence goal for DOE operations.

  11. Assessing the Accuracy and Consistency of Language Proficiency Classification under Competing Measurement Models

    ERIC Educational Resources Information Center

    Zhang, Bo

    2010-01-01

    This article investigates how measurement models and statistical procedures can be applied to estimate the accuracy of proficiency classification in language testing. The paper starts with a concise introduction of four measurement models: the classical test theory (CTT) model, the dichotomous item response theory (IRT) model, the testlet response…

  12. Assessing the accuracy of Landsat Thematic Mapper classification using double sampling

    USGS Publications Warehouse

    Kalkhan, M.A.; Reich, R.M.; Stohlgren, T.J.

    1998-01-01

    Double sampling was used to provide a cost-efficient estimate of the accuracy of a Landsat Thematic Mapper (TM) classification map of a scene located in Rocky Mountain National Park, Colorado. In the first phase, 200 sample points were randomly selected to assess the agreement between Landsat TM data and aerial photography. The overall accuracy and Kappa statistic were 49.5% and 32.5%, respectively. In the second phase, 25 sample points identified in the first phase were selected using stratified random sampling and located in the field. This information was used to correct for misclassification errors associated with the first-phase samples. The overall accuracy and Kappa statistic increased to 59.6% and 45.6%, respectively.
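    The statistics cited here come from an error (confusion) matrix. A sketch of how overall accuracy and the Kappa statistic are computed from such a matrix, using an illustrative matrix rather than the study's data:

```python
import numpy as np

# Hypothetical error (confusion) matrix: rows = classified, cols = reference
cm = np.array([[45,  5, 10],
               [ 8, 30,  7],
               [ 6,  4, 35]])

n = cm.sum()
overall = np.trace(cm) / n                            # overall accuracy
pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
kappa = (overall - pe) / (1 - pe)                     # Cohen's kappa

print(f"overall accuracy: {overall:.1%}, kappa: {kappa:.3f}")
```

    Kappa discounts the agreement expected by chance, which is why it is routinely reported alongside overall accuracy in remote sensing work.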

  13. Identifying the procedural gap and improved methods for maintaining accuracy during total hip arthroplasty.

    PubMed

    Gross, Allan; Muir, Jeffrey M

    2016-09-01

    Osteoarthritis is a ubiquitous condition, affecting 26 million Americans each year, with up to 17% of adults over age 75 suffering from one variation of arthritis. The hip is one of the most commonly affected joints and while there are conservative options for treatment, as symptoms progress, many patients eventually turn to surgery to manage their pain and dysfunction. Early surgical options such as osteotomy or arthroscopy are reserved for younger, more active patients with less severe disease and symptoms. Total hip arthroplasty offers a viable solution for patients with severe degenerative changes; however, post-surgical discrepancies in leg length, offset and component malposition are common and cause significant complications. Such discrepancies are associated with consequences such as low back pain, neurological deficits, instability and overall patient dissatisfaction. Current methods for managing leg length and offset during hip arthroplasty are either inaccurate and susceptible to error or are cumbersome, expensive and lengthen surgical time. There is currently no viable option that provides accurate, real-time data to surgeons regarding leg length, offset and cup position in a cost-effective manner. As such, we hypothesize that a procedural gap exists in hip arthroplasty, a gap into which fall a large majority of arthroplasty patients who are at increased risk of complications following surgery. These complications and associated treatments place significant stress on the healthcare system. The costs associated with addressing leg length and offset discrepancies can be minor, requiring only heel lifts and short-term rehabilitation, but can also be substantial, with revision hip arthroplasty costs of up to $54,000 per procedure. The need for a cost-effective, simple-to-use and unobtrusive technology to address this procedural gap in hip arthroplasty and improve patient outcomes is of increasing importance. Given the aging of the population, the projected

  14. Peaks, plateaus, numerical instabilities, and achievable accuracy in Galerkin and norm minimizing procedures for solving Ax=b

    SciTech Connect

    Cullum, J.

    1994-12-31

    Plots of the residual norms generated by Galerkin procedures for solving Ax = b often exhibit strings of irregular peaks. At seemingly erratic stages in the iterations, peaks appear in the residual norm plot, intervals of iterations over which the norms initially increase and then decrease. Plots of the residual norms generated by related norm minimizing procedures often exhibit long plateaus, sequences of iterations over which reductions in the size of the residual norm are unacceptably small. In an earlier paper the author discussed and derived relationships between such peaks and plateaus within corresponding Galerkin/Norm Minimizing pairs of such methods. In this paper, through a set of numerical experiments, the author examines connections between peaks, plateaus, numerical instabilities, and the achievable accuracy for such pairs of iterative methods. Three pairs of methods, GMRES/Arnoldi, QMR/BCG, and two bidiagonalization methods are studied.
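    The norm-minimizing behavior discussed here can be reproduced with a minimal full-GMRES implementation: because each step minimizes the residual over the growing Krylov subspace, the residual-norm plot is monotone non-increasing, and slow phases show up as plateaus rather than the peaks seen in Galerkin methods. A sketch assuming x0 = 0:

```python
import numpy as np

def gmres_residual_norms(A, b, m):
    """Run m steps of full GMRES (x0 = 0) via the Arnoldi process and
    return the residual norm after each step (monotone non-increasing)."""
    n = len(b)
    beta = np.linalg.norm(b)
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = b / beta
    norms = []
    for k in range(m):
        v = A @ Q[:, k]
        for j in range(k + 1):               # modified Gram-Schmidt orthogonalization
            H[j, k] = Q[:, j] @ v
            v -= H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(v)
        if H[k + 1, k] > 1e-14:
            Q[:, k + 1] = v / H[k + 1, k]
        e1 = np.zeros(k + 2)
        e1[0] = beta
        # least-squares solve of the small Hessenberg system = the GMRES step
        y, *_ = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)
        norms.append(np.linalg.norm(e1 - H[:k + 2, :k + 1] @ y))
        if H[k + 1, k] <= 1e-14:             # lucky breakdown: exact solution found
            break
    return norms

rng = np.random.default_rng(0)
A = np.diag(np.arange(1.0, 21.0)) + 0.1 * rng.standard_normal((20, 20))
b = rng.standard_normal(20)
norms = gmres_residual_norms(A, b, 20)
print(norms[0], norms[-1])
```

    Replacing the least-squares solve with a Galerkin condition (as in FOM or BCG) removes the minimization property, and the same Arnoldi data can then produce the residual peaks analyzed in the paper.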

  15. An improved multivariate analytical method to assess the accuracy of acoustic sediment classification maps.

    NASA Astrophysics Data System (ADS)

    Biondo, M.; Bartholomä, A.

    2014-12-01

    High-resolution hydroacoustic methods have been successfully employed for the detailed classification of sedimentary habitats. The fine-scale mapping of very heterogeneous, patchy sedimentary facies, and the compound effect of multiple non-linear physical processes on the acoustic signal, cause the classification of backscatter images to be subject to a great level of uncertainty. Standard procedures for assessing the accuracy of acoustic classification maps are not yet established. This study applies different statistical techniques to automatically classified acoustic images with the aim of i) quantifying the ability of backscatter to resolve grain size distributions, ii) understanding complex patterns influenced by factors other than grain size variations, and iii) designing innovative, repeatable statistical procedures to spatially assess classification uncertainties. A high-frequency (450 kHz) sidescan sonar survey, carried out in 2012 in the shallow upper-mesotidal Jade Bay inlet (German North Sea), made it possible to map 100 km2 of surficial sediment with a resolution and coverage never before acquired in the area. The backscatter mosaic was ground-truthed using a large dataset of sediment grab sample information (2009-2011). Multivariate procedures were employed to model the relationship between acoustic descriptors and granulometric variables in order to evaluate the correctness of acoustic class allocation and sediment group separation. Complex patterns in the acoustic signal appeared to be controlled by the combined effect of surface roughness, sorting and mean grain size variations. The area is dominated by silt and fine sand in very mixed compositions; in this fine-grained matrix, the percentage of gravel proved to be the prevailing factor affecting backscatter variability. In the absence of coarse material, sorting mostly affected the ability to detect gradual but significant changes in seabed types. Misclassification due to temporal discrepancies

  16. 42 CFR 90.3 - Procedures for requesting health assessments.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Procedures for requesting health assessments. 90.3 Section 90.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES...

  17. Teacher Learning through Participation in a Negotiated Assessment Procedure

    ERIC Educational Resources Information Center

    Verberg, Christel P. M.; Tigelaar, Dineke E. H.; Verloop, Nico

    2013-01-01

    This article focuses on the impact of a specific formative assessment procedure, negotiated assessment, on teacher professional learning. Negotiations between the assessor and the teacher as assessee seem to be especially promising for this teacher learning. However, there is no empirical evidence yet that has confirmed this. We explored…

  18. 42 CFR 90.3 - Procedures for requesting health assessments.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Procedures for requesting health assessments. 90.3 Section 90.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES...

  19. 42 CFR 90.3 - Procedures for requesting health assessments.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Procedures for requesting health assessments. 90.3 Section 90.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES...

  20. 42 CFR 90.3 - Procedures for requesting health assessments.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Procedures for requesting health assessments. 90.3 Section 90.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES...

  1. 42 CFR 90.3 - Procedures for requesting health assessments.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Procedures for requesting health assessments. 90.3 Section 90.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES...

  2. Teacher Compliance and Accuracy in State Assessment of Student Motor Skill Performance

    ERIC Educational Resources Information Center

    Hall, Tina J.; Hicklin, Lori K.; French, Karen E.

    2015-01-01

    Purpose: The purpose of this study was to investigate teacher compliance with state mandated assessment protocols and teacher accuracy in assessing student motor skill performance. Method: Middle school teachers (N = 116) submitted eighth grade student motor skill performance data from 318 physical education classes to a trained monitoring…

  3. Peer Interaction and Corrective Feedback for Accuracy and Fluency Development: Monitoring, Practice, and Proceduralization

    ERIC Educational Resources Information Center

    Sato, Masatoshi; Lyster, Roy

    2012-01-01

    This quasi-experimental study is aimed at (a) teaching learners how to provide corrective feedback (CF) during peer interaction and (b) assessing the effects of peer interaction and CF on second language (L2) development. Four university-level English classes in Japan participated (N = 167), each assigned to one of four treatment conditions. Of…

  4. Accuracy assessment of the integration of GNSS and a MEMS IMU in a terrestrial platform.

    PubMed

    Madeira, Sergio; Yan, Wenlin; Bastos, Luísa; Gonçalves, José A

    2014-11-04

    MEMS Inertial Measurement Units are available at low cost and can replace expensive units in mobile mapping platforms which need direct georeferencing. This is done through integration with GNSS measurements in order to achieve a continuous positioning solution and to obtain orientation angles. This paper presents the results of an assessment of the accuracy of a system that integrates GNSS and a MEMS IMU in a terrestrial platform. We describe the methodology used and the tests performed, in which the accuracy of the position and orientation parameters was assessed using an independent photogrammetric technique employing cameras that are part of the mobile mapping system developed by the authors. Results for the accuracy of attitude angles and coordinates show that accuracies better than a decimeter in position, and under a degree in angles, can be achieved even considering that the terrestrial platform operates in less than favorable environments.
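    A common way to realize this kind of loose GNSS/IMU coupling is a Kalman filter that propagates the state with IMU accelerations and corrects it with GNSS position fixes. A deliberately simplified 1-D sketch (illustrative noise settings, not the authors' filter):

```python
import numpy as np

# 1-D loosely coupled GNSS/IMU sketch: state = [position, velocity]
dt = 0.01                                  # IMU rate: 100 Hz
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity transition
B = np.array([0.5 * dt**2, dt])            # acceleration input mapping
H = np.array([[1.0, 0.0]])                 # GNSS observes position only
Q = 1e-4 * np.eye(2)                       # process noise (tuning assumption)
R = np.array([[0.5**2]])                   # GNSS position noise, 0.5 m std

x = np.zeros(2)
P = np.eye(2)

def predict(x, P, accel):
    x = F @ x + B * accel                  # propagate with IMU acceleration
    return x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (np.atleast_1d(z) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate 1 s of constant 1 m/s^2 acceleration with a GNSS fix every 0.1 s
true_pos = lambda t: 0.5 * 1.0 * t**2
for i in range(1, 101):
    x, P = predict(x, P, 1.0)
    if i % 10 == 0:
        x, P = update(x, P, true_pos(i * dt))
print(x)  # position near 0.5 m, velocity near 1.0 m/s after 1 s
```

    The IMU bridges position between the 10 Hz GNSS fixes, while the fixes bound the drift that integrating accelerations alone would accumulate.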

  6. Procedures for scour assessments at bridges in Pennsylvania

    USGS Publications Warehouse

    Cinotto, Peter J.; White, Kirk E.

    2000-01-01

    Scour is the process and result of flowing water eroding the bed and banks of a stream. Scour at nearly 14,300 bridges(1) spanning water, and the stability of river and stream channels in Pennsylvania, are being assessed by the U.S. Geological Survey (USGS) in cooperation with the Pennsylvania Department of Transportation (PennDOT). Procedures for bridge-scour assessments have been established to address the needs of PennDOT in meeting a 1988 Federal Highway Administration mandate requiring states to establish a program to assess all public bridges over water for their vulnerability to scour. The procedures also have been established to help develop an understanding of the local and regional factors that affect scour and channel stability. This report describes procedures for the assessment of scour at all bridges that are 20 feet or greater in length that span water in Pennsylvania. There are two basic types of assessment: field-viewed bridge site assessments, for which USGS personnel visit the bridge site, and office-reviewed bridge site assessments, for which USGS personnel compile PennDOT data and do not visit the bridge site. Both types of assessments are primarily focused at assisting PennDOT in meeting the requirements of the Federal Highway Administration mandate; however, both assessments include procedures for the collection and processing of ancillary data for subsequent analysis. Date of bridge construction and the accessibility of the bridge substructure units for inspection determine which type of assessment a bridge receives. A Scour-Critical Bridge Indicator Code and a Scour Assessment Rating are computed from selected collected and compiled data. PennDOT personnel assign the final Scour-Critical Bridge Indicator Code and a Scour Assessment Rating on the basis of their review of all data. (1)Words presented in bold type are defined in the Glossary section of this report.

  7. Quality and accuracy assessment of nutrition information on the Web for cancer prevention.

    PubMed

    Shahar, Suzana; Shirley, Ng; Noah, Shahrul A

    2013-01-01

    This study aimed to assess the quality and accuracy of nutrition information about cancer prevention available on the Web. The keywords 'nutrition + diet + cancer + prevention' were submitted to the Google search engine. Out of 400 websites evaluated, 100 met the inclusion and exclusion criteria and were selected as the sample for the assessment of quality and accuracy. Overall, 54% of the studied websites had low quality, 48 and 57% had no author's name or information, respectively, 100% were not updated within 1 month during the study period and 86% did not have the Health on the Net seal. When the websites were assessed for readability using the Flesch Reading Ease test, nearly 44% of the websites were categorised as 'quite difficult'. With regard to accuracy, 91% of the websites did not precisely follow the latest WCRF/AICR 2007 recommendation. The quality scores correlated significantly with the accuracy scores (r = 0.250, p < 0.05). Professional websites (n = 22) had the highest mean quality scores, whereas government websites (n = 2) had the highest mean accuracy scores. The quality of the websites selected in this study was not satisfactory, and there is great concern about the accuracy of the information being disseminated. PMID:22957981

  8. Ecotoxicological effects assessment: A comparison of several extrapolation procedures

    SciTech Connect

    Okkerman, P.C.; v.d. Plassche, E.J.; Slooff, W.; Van Leeuwen, C.J.; Canton, J.H.

    1991-04-01

    In the future, extrapolation procedures will become increasingly important for the effect assessment of compounds in aquatic systems. To achieve a reliable method, these extrapolation procedures have to be evaluated thoroughly. As a first step, three extrapolation procedures are compared by means of two data sets consisting of (semi)chronic and acute toxicity test results for 11 aquatic species and 8 compounds. Because of its statistical basis, the extrapolation procedure of Van Straalen and Denneman is preferred over the procedures of the EPA and Stephan et al. The results of the calculations showed that smaller numbers of toxicity data increase the chance of underestimating the risk of a compound. It is therefore proposed to extend the OECD guidelines for algae, Daphnia, and fish with chronic (aquatic) toxicity tests for more species of different taxonomic groups.

  9. Using composite images to assess accuracy in personality attribution to faces.

    PubMed

    Little, Anthony C; Perrett, David I

    2007-02-01

    Several studies have demonstrated some accuracy in personality attribution based on visual appearance alone. Using composite images of individuals scoring high and low on a particular trait, the current study shows that judges perform better than chance in guessing others' personality, particularly for the traits conscientiousness and extraversion. This study also shows that attractiveness, masculinity and age may all provide cues for assessing personality accurately, and that accuracy is affected by the sex of both those judging and those being judged. Individuals do perform better than chance at guessing another's personality from facial information alone, providing some support for the popular belief that it is possible to accurately assess personality from faces. PMID:17319053

  10. Assessment of the accuracy of pharmacy students' compounded solutions using vapor pressure osmometry.

    PubMed

    Kolling, William M; McPherson, Timothy B

    2013-04-12

    OBJECTIVE. To assess the effectiveness of using a vapor pressure osmometer to measure the accuracy of pharmacy students' compounding skills. DESIGN. Students calculated the theoretical osmotic pressure (mmol/kg) of a solution as a pre-laboratory exercise, compared their calculations with actual values, and then attempted to determine the cause of any errors found. ASSESSMENT. After the introduction of the vapor pressure osmometer, the first-time pass rate for solution compounding has varied from 85% to 100%. Approximately 85% of students surveyed reported that the instrument was valuable as a teaching tool because it objectively assessed their work and provided immediate formative assessment. CONCLUSIONS. This simple technique of measuring compounding accuracy using a vapor pressure osmometer allowed students to see the importance of quality control and assessment in practice for both pharmacists and technicians.
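
    The pre-laboratory calculation described above can be sketched for a simple example. The formula and the normal-saline values below are a textbook illustration, not taken from the article, and the osmolarity/osmolality distinction is ignored for a dilute aqueous solution.

```python
# Theoretical osmotic concentration of a compounded solution, as in the
# pre-laboratory exercise described above. The 0.9% NaCl (normal saline)
# example is an illustrative assumption, not the article's solution.

def theoretical_mosm_per_l(grams_per_l: float, mol_wt: float, n_species: int) -> float:
    """Theoretical osmolarity: mOsm/L = (g/L / MW) * dissociated species * 1000."""
    return grams_per_l / mol_wt * n_species * 1000

# 0.9 g/100 mL NaCl = 9 g/L; NaCl (MW 58.44) dissociates into Na+ and Cl-.
osm = theoretical_mosm_per_l(9.0, 58.44, 2)
print(f"{osm:.0f} mOsm/L")  # -> 308 mOsm/L
```

Comparing such a theoretical value against the osmometer reading is what flags weighing or dilution errors in the exercise.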

  11. Assessment of the Accuracy of Pharmacy Students’ Compounded Solutions Using Vapor Pressure Osmometry

    PubMed Central

    McPherson, Timothy B.

    2013-01-01

    Objective. To assess the effectiveness of using a vapor pressure osmometer to measure the accuracy of pharmacy students’ compounding skills. Design. Students calculated the theoretical osmotic pressure (mmol/kg) of a solution as a pre-laboratory exercise, compared their calculations with actual values, and then attempted to determine the cause of any errors found. Assessment. After the introduction of the vapor pressure osmometer, the first-time pass rate for solution compounding has varied from 85% to 100%. Approximately 85% of students surveyed reported that the instrument was valuable as a teaching tool because it objectively assessed their work and provided immediate formative assessment. Conclusions. This simple technique of measuring compounding accuracy using a vapor pressure osmometer allowed students to see the importance of quality control and assessment in practice for both pharmacists and technicians. PMID:23610476

  12. A novel phantom and procedure providing submillimeter accuracy in daily QA tests of accelerators used for stereotactic radiosurgery.

    PubMed

    Brezovich, Ivan A; Popple, Richard A; Duan, Jun; Shen, Sui; Wu, Xingen; Benhabib, Sidi; Huang, Mi; Cardan, Rex A

    2016-01-01

    Stereotactic radiosurgery (SRS) places great demands on spatial accuracy. Steel BBs used as markers in quality assurance (QA) phantoms are clearly visible in MV and planar kV images, but artifacts compromise cone-beam CT (CBCT) isocenter localization. The purpose of this work was to develop a QA phantom for measuring, with sub-mm accuracy, the isocenter congruence of planar kV, MV, and CBCT imaging systems, and to design a practical QA procedure that includes daily Winston-Lutz (WL) tests and does not require computer aid. The salient feature of the phantom (Universal Alignment Ball (UAB)) is a novel marker for precisely localizing the isocenters of CBCT, planar kV, and MV beams. It consists of a 25.4 mm diameter sphere of polymethylmethacrylate (PMMA) containing a concentric 6.35 mm diameter tungsten carbide ball. The large density difference between PMMA and the polystyrene foam in which the PMMA sphere is embedded yields a sharp image of the sphere for accurate CBCT registration. The tungsten carbide ball serves in finding isocenter in planar kV and MV images and in doing WL tests. With the aid of the UAB, CBCT isocenter was located within 0.10 ± 0.05 mm of its true position, and MV isocenter was pinpointed in planar images to within 0.06 ± 0.04 mm. In clinical morning QA tests extending over an 18-month period, the UAB consistently yielded measurements with sub-mm accuracy. The average distance between the isocenter defined by orthogonal kV images and CBCT measured 0.16 ± 0.12 mm. In WL tests the central ray of anterior beams defined by a 1.5 × 1.5 cm² MLC field agreed with CBCT isocenter within 0.03 ± 0.14 mm in the lateral direction and within 0.10 ± 0.19 mm in the longitudinal direction. Lateral MV beams approached CBCT isocenter within 0.00 ± 0.11 mm in the vertical direction and within -0.14 ± 0.15 mm longitudinally. It took therapists about 10 min to do the tests. The novel QA phantom allows pinpointing CBCT and MV isocenter positions to better than 0.2 mm, using

  13. Assessing the impact of measurement frequency on accuracy and uncertainty of water quality data

    NASA Astrophysics Data System (ADS)

    Helm, Björn; Schiffner, Stefanie; Krebs, Peter

    2014-05-01

    Physico-chemical water quality is a major criterion in the evaluation of the ecological state of a river water body. Physical and chemical water properties are measured to assess the river state, identify prevalent pressures and develop mitigating measures. Water quality is commonly assessed from weekly to quarterly grab samples. The increasing availability of online-sensor data measured at high frequency allows for an enhanced understanding of emission and transport dynamics, as well as the identification of typical and critical states. In this study we present a systematic approach to assess the impact of measurement frequency on the accuracy and uncertainty of derived aggregate indicators of environmental quality. Data on water temperature, pH, turbidity, electric conductivity and concentrations of dissolved oxygen, nitrate, ammonia and phosphate, measured at high frequency (10 min⁻¹ and 15 min⁻¹), are assessed in resampling experiments. The data are collected at 14 sites in eastern and northern Germany, representing catchments between 40 km² and 140,000 km² of varying properties. Resampling is performed to create series of hourly to quarterly frequency, including special restrictions such as sampling during working hours or discharge compensation. Statistical properties and their confidence intervals are determined in a bootstrapping procedure and evaluated along a gradient of sampling frequency. For all variables, the range of the aggregate indicators in the bootstrapping realizations increases markedly with decreasing sampling frequency. Mean values of electric conductivity, pH and water temperature obtained at monthly frequency differ on average by less than five percent from the original data. Mean dissolved oxygen, nitrate and phosphate had less than 15% bias at most stations. Ammonia and turbidity are most sensitive to the decrease of sampling frequency, with up to 30% average and 250% maximum bias at monthly sampling frequency. A systematic bias is recognized
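
    The resampling-plus-bootstrap evaluation described above can be sketched as follows. The series, sampling scheme, and all numbers below are synthetic illustrations, not the study's data; the sketch also shows how sampling at a fixed time of day can produce the kind of systematic bias the abstract mentions.

```python
import random
import statistics

random.seed(42)

# Synthetic 15-minute water-quality series (illustrative only): a daily
# cycle plus noise, 96 samples per day for 30 days.
series = [10 + 3 * ((i % 96) / 96) + random.gauss(0, 1) for i in range(96 * 30)]
true_mean = statistics.mean(series)

def subsample(data, every):
    """Keep one sample per 'every' points (every=96 -> one grab sample per day)."""
    return data[::every]

def bootstrap_mean_ci(data, n_boot=1000, alpha=0.05):
    """Percentile-bootstrap confidence interval for the mean."""
    means = sorted(
        statistics.mean(random.choices(data, k=len(data))) for _ in range(n_boot)
    )
    return means[int(alpha / 2 * n_boot)], means[int((1 - alpha / 2) * n_boot) - 1]

daily = subsample(series, 96)  # always sampled at the same time of day
lo, hi = bootstrap_mean_ci(daily)
bias_pct = abs(statistics.mean(daily) - true_mean) / true_mean * 100
print(f"daily-sampling mean bias: {bias_pct:.1f}%  95% CI: [{lo:.2f}, {hi:.2f}]")
```

Because the daily subsample always falls at the same phase of the diurnal cycle, its mean stays biased no matter how many days are sampled; a higher frequency (e.g. `series[::4]` for hourly) shrinks both the bias and the bootstrap interval.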

  14. Accuracy assessment of novel two-axes rotating and single-axis translating calibration equipment

    NASA Astrophysics Data System (ADS)

    Liu, Bo; Ye, Dong; Che, Rensheng

    2009-11-01

    A new method measures rocket nozzle 3D motion with a motion tracking system based on passive optical markers. However, an important issue remains to be resolved: how to assess the accuracy of the rocket nozzle motion test. Therefore, calibration equipment was designed and manufactured to generate ground-truth nozzle model motion such as translation, angle, velocity, angular velocity, etc. It consists of a base, a lifting platform, a rotary table and a rocket nozzle model with precise geometry. The nozzle model, fitted with the markers, is installed on the rotary table, which can translate or rotate at a known velocity. The overall accuracy of the rocket nozzle motion test is evaluated by comparing the truth values with the static and dynamic test data. This paper puts emphasis on the accuracy assessment of the novel two-axes rotating and single-axis translating calibration equipment. By substituting measured values of the error sources into the error model, the pointing error is less than 0.005°, the rotation center position error is 0.08 mm, and the rate stability is better than 10⁻³. The calibration equipment accuracy is much higher than that of the nozzle motion test system, so the former can be used to assess and calibrate the latter.

  15. Accuracy assessment of a surface electromyogram decomposition system in human first dorsal interosseus muscle

    NASA Astrophysics Data System (ADS)

    Hu, Xiaogang; Rymer, William Z.; Suresh, Nina L.

    2014-04-01

    Objective. The aim of this study is to assess the accuracy of a surface electromyogram (sEMG) motor unit (MU) decomposition algorithm during low levels of muscle contraction. Approach. A two-source method was used to verify the accuracy of the sEMG decomposition system, utilizing simultaneous intramuscular and surface EMG recordings from the human first dorsal interosseous muscle recorded during isometric trapezoidal force contractions. Spike trains from each recording type were decomposed independently using two different algorithms (EMGlab and dEMG). The degree of agreement of the decomposed spike timings was assessed for three different segments of the EMG signals, corresponding to specified regions in the force task. A regression analysis was performed to examine whether certain properties of the sEMG and force signals can predict the decomposition accuracy. Main results. The average accuracy of successful decomposition among the 119 MUs that were common to both intramuscular and surface records was approximately 95%, and the accuracy was comparable between the different segments of the sEMG signals (i.e., force ramp-up versus steady-state force versus combined). The regression function between the accuracy and properties of the sEMG and force signals revealed that the signal-to-noise ratio of the action potential and stability in the action potential records were significant predictors of the surface decomposition accuracy. Significance. The outcomes of our study confirm the accuracy of the sEMG decomposition algorithm during low muscle contraction levels and provide confidence in the overall validity of the surface dEMG decomposition algorithm.
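
    The two-source agreement computation described above can be sketched as tolerance-window matching of spike timings from the two decompositions. The timings and the 5 ms tolerance below are hypothetical, chosen only to illustrate the matching logic.

```python
def spike_agreement(train_a, train_b, tol=0.005):
    """Fraction of spikes in train_a matched by a spike in train_b within
    +/- tol seconds; each spike in train_b may be matched at most once."""
    matched = 0
    used = set()
    for t in train_a:
        for j, u in enumerate(train_b):
            if j not in used and abs(t - u) <= tol:
                matched += 1
                used.add(j)
                break
    return matched / len(train_a) if train_a else 0.0

# Hypothetical spike timings (seconds) from the intramuscular and surface
# decompositions of one motor unit.
intramuscular = [0.012, 0.145, 0.298, 0.431, 0.560, 0.712]
surface       = [0.013, 0.144, 0.300, 0.433, 0.558, 0.850]

print(f"agreement: {spike_agreement(intramuscular, surface):.1%}")  # -> agreement: 83.3%
```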

  16. A Procedure for High Resolution Satellite Imagery Quality Assessment

    PubMed Central

    Crespi, Mattia; De Vendictis, Laura

    2009-01-01

    Data products generated from High Resolution Satellite Imagery (HRSI) are routinely evaluated during the so-called in-orbit test period in order to verify whether their quality meets the desired specifications and, if necessary, to obtain the image correction parameters to be used at the ground processing center. Nevertheless, it is often useful to have tools to evaluate image quality at the final-user level as well. Image quality is defined by parameters such as the radiometric resolution and its accuracy, represented by the noise level, and the geometric resolution and sharpness, described by the Modulation Transfer Function (MTF). This paper proposes a procedure to evaluate these image quality parameters; the procedure was implemented in suitable software and tested on high resolution imagery acquired by the QuickBird, WorldView-1 and Cartosat-1 satellites. PMID:22412312

  17. Assessing map accuracy in a remotely sensed, ecoregion-scale cover map

    USGS Publications Warehouse

    Edwards, T.C.; Moisen, G.G.; Cutler, D.R.

    1998-01-01

    Landscape- and ecoregion-based conservation efforts increasingly use a spatial component to organize data for analysis and interpretation. A challenge particular to remotely sensed cover maps generated from these efforts is how best to assess the accuracy of the cover maps, especially when they can exceed 1000s of km² in size. Here we develop and describe a methodological approach for assessing the accuracy of large-area cover maps, using as a test case the 21.9 million ha cover map developed for Utah Gap Analysis. As part of our design process, we first reviewed the effect of intracluster correlation and a simple cost function on the relative efficiency of cluster sample designs to simple random designs. Our design ultimately combined clustered and subsampled field data stratified by ecological modeling unit and accessibility (hereafter a mixed design). We next outline estimation formulas for simple map accuracy measures under our mixed design and report results for eight major cover types and the three ecoregions mapped as part of the Utah Gap Analysis. Overall accuracy of the map was 83.2% (SE = 1.4). Within ecoregions, accuracy ranged from 78.9% to 85.0%. Accuracy by cover type varied, ranging from a low of 50.4% for barren to a high of 90.6% for man-modified. In addition, we examined gains in efficiency of our mixed design compared with a simple random sample approach. In regard to precision, our mixed design was more precise than a simple random design, given fixed sample costs. We close with a discussion of the logistical constraints facing attempts to assess the accuracy of large-area, remotely sensed cover maps.
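
    For a simple random reference sample, the basic map accuracy measures reduce to a proportion and its binomial standard error; the stratified, clustered estimators used in the paper are more involved. A minimal sketch with a hypothetical confusion matrix:

```python
import math

# Hypothetical confusion matrix: rows = mapped class, columns = reference class.
# Classes (illustrative only): forest, shrub, barren.
matrix = [
    [120, 10,  5],
    [ 15, 90, 10],
    [  5, 12, 33],
]

n = sum(sum(row) for row in matrix)                      # total reference samples
correct = sum(matrix[i][i] for i in range(len(matrix)))  # diagonal = agreements

overall = correct / n                        # overall accuracy (proportion correct)
se = math.sqrt(overall * (1 - overall) / n)  # binomial SE under simple random sampling

print(f"overall accuracy = {overall:.1%} (SE = {se:.1%})")  # -> 81.0% (SE = 2.3%)
```

Per-class user's and producer's accuracies follow the same pattern, dividing each diagonal entry by its row or column total respectively.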

  18. Innovative Approaches to Increasing the Student Assessment Procedures Effectiveness

    ERIC Educational Resources Information Center

    Dorozhkin, Evgenij M.; Chelyshkova, Marina B.; Malygin, Alexey A.; Toymentseva, Irina A.; Anopchenko, Tatiana Y.

    2016-01-01

    The relevance of the investigated problem is determined by the need to improve evaluation procedures in education and student assessment in the context of widening education, the development of new modes of study (such as blended learning, e-learning, and massive open online courses), the necessity of immediate feedback, and reliable and valid…

  19. Maine Educational Assessment (MEA) Operational Procedures, March 2005 Administration.

    ERIC Educational Resources Information Center

    Maine Department of Education, 2004

    2004-01-01

    This document is intended for use in conjunction with "Policies and Procedures for Accommodations and Alternate Assessment to the MEA," and both the "MEA Principal/Test Coordinator's Manual" and the "MEA Test Administrator's Manual." The first section, Enrollment, covers the following subjects: (1) Participation of Enrolled Students; (2) Students…

  20. 34 CFR 303.166 - Evaluation, assessment, and nondiscriminatory procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Statewide System-Application Requirements § 303.166 Evaluation, assessment, and nondiscriminatory procedures. Each application must include information to demonstrate that the requirements in §§ 303.322 and 303.323 are met. (Approved by the Office of Management and Budget under control number...

  1. 34 CFR 303.166 - Evaluation, assessment, and nondiscriminatory procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... System-Application Requirements § 303.166 Evaluation, assessment, and nondiscriminatory procedures. Each application must include information to demonstrate that the requirements in §§ 303.322 and 303.323 are met. (Approved by the Office of Management and Budget under control number 1820-0550) (Authority: 20 U.S.C....

  2. Needs Assessment Procedure: Mainstreaming Handicapped. Volume I. Final Technical Report.

    ERIC Educational Resources Information Center

    Hughes, James H.; Rice, Eric

    The design and development of a needs assessment procedure to assist local vocational education administrators in planning a mainstreaming approach for handicapped students in vocational education was the purpose of this study. The methodology involved a review of the literature on the general mainstreaming topic, program planning, needs…

  3. ICan: An Optimized Ion-Current-Based Quantification Procedure with Enhanced Quantitative Accuracy and Sensitivity in Biomarker Discovery

    PubMed Central

    2015-01-01

    The rapidly expanding availability of high-resolution mass spectrometry has substantially enhanced ion-current-based relative quantification techniques. Despite the increasing interest in ion-current-based methods, quantitative sensitivity, accuracy, and false discovery rate remain major concerns; consequently, comprehensive evaluation and development in these regards are urgently needed. Here we describe a new, integrated procedure for data normalization and protein ratio estimation, termed ICan, for improved ion-current-based analysis of data generated by high-resolution mass spectrometry (MS). ICan achieved significantly better accuracy and precision, and a lower false-positive rate for discovering altered proteins, than current popular pipelines. A spike-in experiment was used to evaluate the performance of ICan in detecting small changes. In this study E. coli extracts were spiked with moderate-abundance proteins from human plasma (MAP, enriched by an IgY14-SuperMix procedure) at two different levels to set a small change of 1.5-fold. Forty-five (92%, with an average ratio of 1.71 ± 0.13) of 49 identified MAP proteins (i.e., the true positives) and none of the reference proteins (1.0-fold) were determined to be significantly altered, with cutoff thresholds of ≥1.3-fold change and p ≤ 0.05. This is the first study to evaluate and prove competitive performance of the ion-current-based approach for assigning significance to proteins with small changes. By comparison, other methods showed markedly inferior performance. ICan is broadly applicable to reliable and sensitive proteomic surveys of multiple biological samples using high-resolution MS. Moreover, many key features evaluated and optimized here, such as normalization, protein ratio determination, and statistical analyses, are also valuable for data analysis by isotope-labeling methods. PMID:25285707

  4. Accuracy Assessment of Underwater Photogrammetric Three Dimensional Modelling for Coral Reefs

    NASA Astrophysics Data System (ADS)

    Guo, T.; Capra, A.; Troyer, M.; Gruen, A.; Brooks, A. J.; Hench, J. L.; Schmitt, R. J.; Holbrook, S. J.; Dubbini, M.

    2016-06-01

    Recent advances in the automation of photogrammetric 3D modelling software packages have stimulated interest in reconstructing highly accurate 3D object geometry in unconventional environments, such as underwater, using simple and low-cost camera systems. The accuracy of underwater 3D modelling is affected by more parameters than in single-medium cases. This study is part of a larger project on 3D measurements of temporal change of coral cover in tropical waters. It compares the accuracies of 3D point clouds generated from images acquired with a system camera mounted in an underwater housing and with the popular GoPro cameras, respectively. A precisely measured calibration frame was placed in the target scene to provide accurate control information and to quantify the errors of the modelling procedure. In addition, several objects (cinder blocks) with various shapes were arranged in air and underwater, and 3D point clouds were generated by automated image matching. These were further used to examine the relative accuracy of the point cloud generation by comparing the point clouds of the individual objects with the objects measured by the system camera in air (the best possible values). Given a working distance of about 1.5 m, the GoPro camera can achieve a relative accuracy of 1.3 mm in air and 2.0 mm in water. The system camera achieved an accuracy of 1.8 mm in water, which meets our requirements for coral measurement in this system.

  5. Calibration of ground-based microwave radiometers - Accuracy assessment and recommendations for network users

    NASA Astrophysics Data System (ADS)

    Pospichal, Bernhard; Küchler, Nils; Löhnert, Ulrich; Crewell, Susanne; Czekala, Harald; Güldner, Jürgen

    2016-04-01

    Ground-based microwave radiometers (MWR) are becoming widely used in atmospheric remote sensing and are starting to be routinely operated by national weather services and other institutions. However, common standards for the calibration of these radiometers and detailed knowledge of their error characteristics are needed in order to assimilate the data into models. Intercomparisons of calibrations by different MWRs have rarely been done. Therefore, two calibration experiments were performed in Lindenberg (2014) and Meckenheim (2015) within the framework of TOPROF (COST Action ES1303) in order to assess uncertainties and differences between various instruments. In addition, a series of experiments was conducted in Oklahoma in autumn 2014. The focus was on the performance of the two main instrument types currently used operationally: the MP-Profiler series by Radiometrics Corporation and the HATPRO series by Radiometer Physics GmbH (RPG). Both instrument types operate in two frequency bands, one along the 22 GHz water vapour line, the other at the lower wing of the 60 GHz oxygen absorption complex. The goal was to establish protocols for providing quality controlled (QC) MWR data and their uncertainties. To this end, standardized calibration procedures for MWR were developed and recommendations for radiometer users were compiled. We focus here mainly on data types, integration times and optimal settings for calibration intervals, both for absolute (liquid nitrogen, tipping curve) and relative (hot load, noise diode) calibrations. Besides the recommendations for ground-based MWR operators, we will present methods to determine the accuracy of the calibration as well as means for automatic data quality control. In addition, some results from the intercomparison of different radiometers will be discussed.

  6. Assessing the Accuracy of MODIS-NDVI Derived Land-Cover Across the Great Lakes Basin

    EPA Science Inventory

    This research describes the accuracy assessment process for a land-cover dataset developed for the Great Lakes Basin (GLB). This land-cover dataset was developed from the 2007 MODIS Normalized Difference Vegetation Index (NDVI) 16-day composite (MOD13Q) 250 m time-series data. Tr...

  7. A PIXEL COMPOSITION-BASED REFERENCE DATA SET FOR THEMATIC ACCURACY ASSESSMENT

    EPA Science Inventory

    Developing reference data sets for accuracy assessment of land-cover classifications derived from coarse spatial resolution sensors such as MODIS can be difficult due to the large resolution differences between the image data and available reference data sources. Ideally, the spa...

  8. The Word Writing CAFE: Assessing Student Writing for Complexity, Accuracy, and Fluency

    ERIC Educational Resources Information Center

    Leal, Dorothy J.

    2005-01-01

    The Word Writing CAFE is a new assessment tool designed for teachers to evaluate objectively students' word-writing ability for fluency, accuracy, and complexity. It is designed to be given to the whole class at one time. This article describes the development of the CAFE and provides directions for administering and scoring it. The author also…

  9. Accuracy of Revised and Traditional Parallel Analyses for Assessing Dimensionality with Binary Data

    ERIC Educational Resources Information Center

    Green, Samuel B.; Redell, Nickalus; Thompson, Marilyn S.; Levy, Roy

    2016-01-01

    Parallel analysis (PA) is a useful empirical tool for assessing the number of factors in exploratory factor analysis. On conceptual and empirical grounds, we argue for a revision to PA that makes it more consistent with hypothesis testing. Using Monte Carlo methods, we evaluated the relative accuracy of the revised PA (R-PA) and traditional PA…

  10. Gender Differences in Structured Risk Assessment: Comparing the Accuracy of Five Instruments

    ERIC Educational Resources Information Center

    Coid, Jeremy; Yang, Min; Ullrich, Simone; Zhang, Tianqiang; Sizmur, Steve; Roberts, Colin; Farrington, David P.; Rogers, Robert D.

    2009-01-01

    Structured risk assessment should guide clinical risk management, but it is uncertain which instrument has the highest predictive accuracy among men and women. In the present study, the authors compared the Psychopathy Checklist-Revised (PCL-R; R. D. Hare, 1991, 2003); the Historical, Clinical, Risk Management-20 (HCR-20; C. D. Webster, K. S.…

  11. In the Right Ballpark? Assessing the Accuracy of Net Price Calculators

    ERIC Educational Resources Information Center

    Anthony, Aaron M.; Page, Lindsay C.; Seldin, Abigail

    2016-01-01

    Large differences often exist between a college's sticker price and net price after accounting for financial aid. Net price calculators (NPCs) were designed to help students more accurately estimate their actual costs to attend a given college. This study assesses the accuracy of information provided by net price calculators. Specifically, we…

  12. APPLICATION OF A "VIRTUAL FIELD REFERENCE DATABASE" TO ASSESS LAND-COVER MAP ACCURACIES

    EPA Science Inventory

    An accuracy assessment was performed for the Neuse River Basin, NC land-cover/use (LCLU) mapping results using a "Virtual Field Reference Database (VFRDB)". The VFRDB was developed using field measurement and digital imagery (camera) data collected at 1,409 sites over a perio...

  13. Modifications to the accuracy assessment analysis routine MLTCRP to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

    Modifications are described that were made to the analysis program MLTCRP in the accuracy assessment software system to produce a disk output file. The output files produced by this modified program are used to aggregate data for regions greater than a single segment.

  14. Assessing the Accuracy of Classwide Direct Observation Methods: Two Analyses Using Simulated and Naturalistic Data

    ERIC Educational Resources Information Center

    Dart, Evan H.; Radley, Keith C.; Briesch, Amy M.; Furlow, Christopher M.; Cavell, Hannah J.

    2016-01-01

    Two studies investigated the accuracy of eight different interval-based group observation methods that are commonly used to assess the effects of classwide interventions. In Study 1, a Microsoft Visual Basic program was created to simulate a large set of observational data. Binary data were randomly generated at the student level to represent…

  15. 12 CFR 620.3 - Accuracy of reports and assessment of internal control over financial reporting.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CREDIT SYSTEM DISCLOSURE TO SHAREHOLDERS General § 620.3 Accuracy of reports and assessment of internal... shall make any disclosure to shareholders or the general public concerning any matter required to be... person shall make such additional or corrective disclosure as is necessary to provide shareholders...

  16. The short- to medium-term predictive accuracy of static and dynamic risk assessment measures in a secure forensic hospital.

    PubMed

    Chu, Chi Meng; Thomas, Stuart D M; Ogloff, James R P; Daffern, Michael

    2013-04-01

    Although violence risk assessment knowledge and practice has advanced over the past few decades, it remains practically difficult to decide which measures clinicians should use to assess and make decisions about the violence potential of individuals on an ongoing basis, particularly in the short to medium term. Within this context, this study sought to compare the predictive accuracy of dynamic risk assessment measures for violence with static risk assessment measures over the short term (up to 1 month) and medium term (up to 6 months) in a forensic psychiatric inpatient setting. Results showed that dynamic measures were generally more accurate than static measures for short- to medium-term predictions of inpatient aggression. These findings highlight the necessity of using risk assessment measures that are sensitive to important clinical risk state variables to improve the short- to medium-term prediction of aggression within the forensic inpatient setting. Such knowledge can assist with the development of more accurate and efficient risk assessment procedures, including the selection of appropriate risk assessment instruments to manage and prevent the violence of offenders with mental illnesses during inpatient treatment.

  17. New Criteria for Assessing the Accuracy of Blood Glucose Monitors meeting, October 28, 2011.

    PubMed

    Walsh, John; Roberts, Ruth; Vigersky, Robert A; Schwartz, Frank

    2012-03-01

    Glucose meters (GMs) are routinely used for self-monitoring of blood glucose by patients and for point-of-care glucose monitoring by health care providers in outpatient and inpatient settings. Although widely assumed to be accurate, numerous reports of inaccuracies with resulting morbidity and mortality have been noted. Insulin dosing errors based on inaccurate GMs are most critical. On October 28, 2011, the Diabetes Technology Society invited 45 diabetes technology clinicians who were attending the 2011 Diabetes Technology Meeting to participate in a closed-door meeting entitled New Criteria for Assessing the Accuracy of Blood Glucose Monitors. This report reflects the opinions of most of the attendees of that meeting. The Food and Drug Administration (FDA), the public, and several medical societies are currently in dialogue to establish a new standard for GM accuracy. This update to the FDA standard is driven by improved meter accuracy, technological advances (pumps, bolus calculators, continuous glucose monitors, and insulin pens), reports of hospital and outpatient deaths, consumer complaints about inaccuracy, and research studies showing that several approved GMs failed to meet FDA or International Organization for Standardization standards in postapproval testing. These circumstances mandate a set of new GM standards that appropriately match the GMs' analytical accuracy to the clinical accuracy required for their intended use, as well as ensuring their ongoing accuracy following approval. The attendees of the New Criteria for Assessing the Accuracy of Blood Glucose Monitors meeting proposed a graduated standard and other methods to improve GM performance, which are discussed in this meeting report.

  18. Accuracy Assessment and Correction of Vaisala RS92 Radiosonde Water Vapor Measurements

    NASA Technical Reports Server (NTRS)

    Whiteman, David N.; Miloshevich, Larry M.; Vomel, Holger; Leblanc, Thierry

    2008-01-01

    Relative humidity (RH) measurements from Vaisala RS92 radiosondes are widely used in both research and operational applications, although the measurement accuracy is not well characterized as a function of its known dependences on height, RH, and time of day (or solar altitude angle). This study characterizes RS92 mean bias error as a function of its dependences by comparing simultaneous measurements from RS92 radiosondes and from three reference instruments of known accuracy. The cryogenic frostpoint hygrometer (CFH) gives the RS92 accuracy above the 700 mb level; the ARM microwave radiometer gives the RS92 accuracy in the lower troposphere; and the ARM SurTHref system gives the RS92 accuracy at the surface using 6 RH probes with NIST-traceable calibrations. These RS92 assessments are combined using the principle of Consensus Referencing to yield a detailed estimate of RS92 accuracy from the surface to the lowermost stratosphere. An empirical bias correction is derived to remove the mean bias error, yielding corrected RS92 measurements whose mean accuracy is estimated to be +/-3% of the measured RH value for nighttime soundings and +/-4% for daytime soundings, plus an RH offset uncertainty of +/-0.5%RH that is significant for dry conditions. The accuracy of individual RS92 soundings is further characterized by the 1-sigma "production variability," estimated to be +/-1.5% of the measured RH value. The daytime bias correction should not be applied to cloudy daytime soundings, because clouds affect the solar radiation error in a complicated and uncharacterized way.
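    The empirical bias correction described above amounts to dividing each measured RH value by one plus a height-dependent fractional bias. A minimal sketch follows; the pressure levels and bias fractions are hypothetical placeholders, not the published RS92 correction coefficients:

```python
import numpy as np

# Hypothetical mean fractional RH bias (e.g. 0.04 = reads 4% of RH high)
# at a few pressure levels (hPa). The real RS92 correction also depends
# on RH and solar altitude angle; this sketch keeps only the height term.
levels_hpa = np.array([1000.0, 700.0, 500.0, 300.0, 100.0])
bias_frac = np.array([0.01, 0.02, 0.04, 0.06, 0.09])

def correct_rh(rh_measured, pressure_hpa):
    """Remove the mean bias: RH_corrected = RH_measured / (1 + bias)."""
    # np.interp needs increasing abscissae, so reverse the profiles
    b = np.interp(pressure_hpa, levels_hpa[::-1], bias_frac[::-1])
    return rh_measured / (1.0 + b)

rh_corr = correct_rh(50.0, 500.0)  # a biased-high 50 %RH reading at 500 hPa
```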

  19. Increasing accuracy in the assessment of motion sickness: A construct methodology

    NASA Technical Reports Server (NTRS)

    Stout, Cynthia S.; Cowings, Patricia S.

    1993-01-01

    The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that when applied will probably resolve some of these problems are described in detail.

  20. Radiative accuracy assessment of CrIS upper level channels using COSMIC RO data

    NASA Astrophysics Data System (ADS)

    Qi, C.; Weng, F.; Han, Y.; Lin, L.; Chen, Y.; Wang, L.

    2012-12-01

    The Cross-track Infrared Sounder (CrIS) onboard the Suomi National Polar-orbiting Partnership (NPP) satellite is designed to provide high-vertical-resolution information on the atmosphere's three-dimensional structure of temperature and water vapor. Much work has been done to verify the observation accuracy of CrIS since its launch on Oct. 28, 2011, such as SNO cross comparisons with other hyperspectral infrared instruments and forward-simulation comparisons using a radiative transfer model based on numerical prediction background profiles. The radio occultation (RO) technique can provide profiles of the Earth's ionosphere and neutral atmosphere with high accuracy, high vertical resolution, and global coverage, and has the advantages of all-weather capability, low expense, and long-term stability. CrIS radiative calibration accuracy was assessed by comparing observations with line-by-line simulations based on COSMIC RO data. The main processing steps include: (a) downloading COSMIC RO data and collocating them with CrIS measurements through a weighting-function (WF) peak-altitude-dependent collocation method; (b) high-spectral-resolution line-by-line radiance simulation using the collocated COSMIC RO profiles; (c) generation of CrIS channel radiances by the FFT transform method; and (d) bias analysis. This absolute calibration accuracy assessment verified a bias error of around 0.3 K in the CrIS measurements.
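    The bias analysis in step (d) reduces to averaging observation-minus-simulation differences per channel. A minimal sketch, with invented brightness-temperature values standing in for the collocated CrIS observations and line-by-line simulations (this is not the actual CrIS processing code):

```python
import numpy as np

def channel_bias(observed_bt, simulated_bt):
    """Mean observation-minus-simulation brightness-temperature bias (K)
    per channel; rows are collocated profiles, columns are channels."""
    diff = np.asarray(observed_bt, dtype=float) - np.asarray(simulated_bt, dtype=float)
    return diff.mean(axis=0)

# Two hypothetical collocations, two channels (values in K, illustrative only)
obs = [[220.1, 230.4], [221.0, 231.1]]
sim = [[219.8, 230.0], [220.6, 230.9]]
bias = channel_bias(obs, sim)
```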

  1. Evaluating the effect of learning style and student background on self-assessment accuracy

    NASA Astrophysics Data System (ADS)

    Alaoutinen, Satu

    2012-06-01

    This study evaluates a new taxonomy-based self-assessment scale and examines factors that affect assessment accuracy and course performance. The scale is based on Bloom's Revised Taxonomy and is evaluated by comparing students' self-assessment results with course performance in a programming course. Correlation has been used to reveal possible connections between student information and both self-assessment and course performance. The results show that students can place their knowledge along the taxonomy-based scale quite well and the scale seems to fit engineering students' learning style. Advanced students assess themselves more accurately than novices. The results also show that reflective students were better at programming than active students. The scale used in this study gives a more objective picture of students' knowledge than general scales, and with modifications it can be used in classes other than programming.

  2. Standardizing the Protocol for Hemispherical Photographs: Accuracy Assessment of Binarization Algorithms

    PubMed Central

    Glatthorn, Jonas; Beckschäfer, Philip

    2014-01-01

    Hemispherical photography is a well-established method to optically assess ecological parameters related to plant canopies; e.g. ground-level light regimes and the distribution of foliage within the crown space. Interpreting hemispherical photographs involves classifying pixels as either sky or vegetation. A wide range of automatic thresholding or binarization algorithms exists to classify the photographs. The variety in methodology hampers the ability to compare results across studies. To identify an optimal threshold selection method, this study assessed the accuracy of seven binarization methods implemented in software currently available for the processing of hemispherical photographs. Therefore, binarizations obtained by the algorithms were compared to reference data generated through a manual binarization of a stratified random selection of pixels. This approach was adopted from the accuracy assessment of map classifications known from remote sensing studies. Percentage correct (Pc) and kappa-statistics (K) were calculated. The accuracy of the algorithms was assessed for photographs taken with automatic exposure settings (auto-exposure) and photographs taken with settings which avoid overexposure (histogram-exposure). In addition, gap fraction values derived from hemispherical photographs were compared with estimates derived from the manually classified reference pixels. All tested algorithms were shown to be sensitive to overexposure. Three of the algorithms showed an accuracy which was high enough to be recommended for the processing of histogram-exposed hemispherical photographs: “Minimum” (Pc 98.8%; K 0.952), “Edge Detection” (Pc 98.1%; K 0.950), and “Minimum Histogram” (Pc 98.1%; K 0.947). The Minimum algorithm overestimated gap fraction least of all (11%). The overestimations by the algorithms Edge Detection (63%) and Minimum Histogram (67%) were considerably larger. For the remaining four evaluated algorithms (IsoData, Maximum Entropy, MinError, and Otsu) an

  3. Standardizing the protocol for hemispherical photographs: accuracy assessment of binarization algorithms.

    PubMed

    Glatthorn, Jonas; Beckschäfer, Philip

    2014-01-01

    Hemispherical photography is a well-established method to optically assess ecological parameters related to plant canopies; e.g. ground-level light regimes and the distribution of foliage within the crown space. Interpreting hemispherical photographs involves classifying pixels as either sky or vegetation. A wide range of automatic thresholding or binarization algorithms exists to classify the photographs. The variety in methodology hampers the ability to compare results across studies. To identify an optimal threshold selection method, this study assessed the accuracy of seven binarization methods implemented in software currently available for the processing of hemispherical photographs. Therefore, binarizations obtained by the algorithms were compared to reference data generated through a manual binarization of a stratified random selection of pixels. This approach was adopted from the accuracy assessment of map classifications known from remote sensing studies. Percentage correct (Pc) and kappa-statistics (K) were calculated. The accuracy of the algorithms was assessed for photographs taken with automatic exposure settings (auto-exposure) and photographs taken with settings which avoid overexposure (histogram-exposure). In addition, gap fraction values derived from hemispherical photographs were compared with estimates derived from the manually classified reference pixels. All tested algorithms were shown to be sensitive to overexposure. Three of the algorithms showed an accuracy which was high enough to be recommended for the processing of histogram-exposed hemispherical photographs: "Minimum" (Pc 98.8%; K 0.952), "Edge Detection" (Pc 98.1%; K 0.950), and "Minimum Histogram" (Pc 98.1%; K 0.947). The Minimum algorithm overestimated gap fraction least of all (11%). The overestimations by the algorithms Edge Detection (63%) and Minimum Histogram (67%) were considerably larger. For the remaining four evaluated algorithms (IsoData, Maximum Entropy, MinError, and Otsu
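    The percentage correct and kappa statistics used above follow the standard confusion-matrix calculation from remote sensing accuracy assessment. A minimal sketch; the 2×2 sky/vegetation counts are invented for illustration:

```python
import numpy as np

def pc_and_kappa(confusion):
    """Percentage correct (Pc) and Cohen's kappa (K) from a confusion
    matrix whose rows are classified labels and columns are reference labels."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                 # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
    kappa = (po - pe) / (1.0 - pe)
    return 100.0 * po, kappa

# Hypothetical counts: 480 sky and 505 vegetation pixels classified correctly
pc, k = pc_and_kappa([[480, 5], [10, 505]])
```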

  4. Comparative Accuracy Assessment of Global Land Cover Datasets Using Existing Reference Data

    NASA Astrophysics Data System (ADS)

    Tsendbazar, N. E.; de Bruin, S.; Mora, B.; Herold, M.

    2014-12-01

    Land cover is a key variable for monitoring the impact of human and natural processes on the biosphere. As one of the Essential Climate Variables, land cover observations are used for climate models and several other applications. Remote sensing technologies have enabled the generation of several global land cover (GLC) products that are based on different data sources and methods (e.g. legends). Moreover, the reported map accuracies result from varying validation strategies. Such differences make the comparison of the GLC products challenging and create confusion when selecting suitable datasets for different applications. This study aims to conduct a comparative accuracy assessment of GLC datasets (LC-CCI 2005, MODIS 2005, and Globcover 2005) using the Globcover 2005 reference data, which can represent the thematic differences of these GLC maps. This GLC reference dataset provides LCCS classifier information for three main land cover types for each sample plot. The LCCS classifier information was translated according to the legends of the GLC maps analysed. The preliminary analysis revealed some challenges in LCCS classifier translation arising from missing important classifier information, differences in class definitions between the legends, and the absence of class proportions of the main land cover types. To overcome these issues, we consolidated the entire reference dataset (i.e. 3857 samples distributed at global scale). The GLC maps and the reference dataset were then harmonized into 13 general classes to perform the comparative accuracy assessments. To help users select suitable GLC dataset(s) for their application, we conducted the map accuracy assessments considering different users' perspectives: climate modelling, biodiversity assessment, agriculture monitoring, and map producers.
This communication will present the method and the results of this study and provide a set of recommendations to the GLC map producers and users with the aim to facilitate the use of GLC maps.

  5. Accuracy assessment of topographic mapping using UAV image integrated with satellite images

    NASA Astrophysics Data System (ADS)

    Azmi, S. M.; Ahmad, Baharin; Ahmad, Anuar

    2014-02-01

    Unmanned Aerial Vehicles (UAVs) are extensively applied in various fields such as military applications, archaeology, agriculture and scientific research. This study focuses on topographic mapping and map updating. The UAV is an alternative way to ease the process of acquiring data, with low manufacturing and operational costs, and it is easy to operate. Furthermore, the UAV images are integrated with QuickBird images that are used as base maps. The objective of this study is to assess and compare the accuracy of topographic mapping using UAV images integrated with aerial photographs and satellite images. The main purpose of using UAV images is as a replacement for cloud-covered areas, which commonly exist in aerial photographs and satellite images, and for updating topographic maps. Meanwhile, spatial resolution, pixel size, scale, geometric accuracy and correction, image quality and information content are important requirements for the generation of topographic maps from these kinds of data. In this study, ground control points (GCPs) and check points (CPs) were established using the real-time kinematic Global Positioning System (RTK-GPS) technique. Two types of analysis were carried out in this study: quantitative and qualitative assessment. The quantitative assessment was carried out by calculating the root mean square error (RMSE). The outputs of this study include a topographic map and an orthophoto. From this study, the accuracy of the UAV image is ±0.460 m. In conclusion, UAV images have the potential to be used for updating topographic maps.
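    The RMSE-based quantitative assessment amounts to comparing mapped coordinates against the surveyed check points. A minimal planimetric sketch; the coordinate pairs are hypothetical, not the study's data:

```python
import numpy as np

def rmse(observed, reference):
    """Planimetric RMSE between map coordinates and RTK-GPS check-point
    coordinates; both inputs have shape (n, 2), in metres."""
    d = np.asarray(observed, dtype=float) - np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean(np.sum(d**2, axis=1))))

# Hypothetical check points: map-derived vs. RTK-GPS coordinates (metres)
map_xy = [[100.3, 200.1], [150.8, 260.4], [199.6, 310.2]]
gps_xy = [[100.0, 200.0], [151.0, 260.0], [200.0, 310.0]]
error_m = rmse(map_xy, gps_xy)
```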

  6. Multinomial tree models for assessing the status of the reference in studies of the accuracy of tools for binary classification

    PubMed Central

    Botella, Juan; Huang, Huiling; Suero, Manuel

    2013-01-01

    Studies that evaluate the accuracy of binary classification tools are needed. Such studies provide 2 × 2 cross-classifications of test outcomes and the categories according to an unquestionable reference (or gold standard). However, sometimes a reference of suboptimal reliability is employed. Several methods have been proposed to deal with studies where the observations are cross-classified with an imperfect reference. These methods require that the status of the reference, as a gold standard or as an imperfect reference, is known. In this paper, a procedure is proposed for determining whether it is appropriate to maintain the assumption that the reference is a gold standard or to treat it as an imperfect reference. This procedure fits two nested multinomial tree models, and assesses and compares their absolute and incremental fit. Its implementation requires the availability of the results of several independent studies. These should be carried out using similar designs to provide frequencies of cross-classification between a test and the reference under investigation. The procedure is applied in two examples with real data. PMID:24106484

  7. Multinomial tree models for assessing the status of the reference in studies of the accuracy of tools for binary classification.

    PubMed

    Botella, Juan; Huang, Huiling; Suero, Manuel

    2013-01-01

    Studies that evaluate the accuracy of binary classification tools are needed. Such studies provide 2 × 2 cross-classifications of test outcomes and the categories according to an unquestionable reference (or gold standard). However, sometimes a reference of suboptimal reliability is employed. Several methods have been proposed to deal with studies where the observations are cross-classified with an imperfect reference. These methods require that the status of the reference, as a gold standard or as an imperfect reference, is known. In this paper, a procedure is proposed for determining whether it is appropriate to maintain the assumption that the reference is a gold standard or to treat it as an imperfect reference. This procedure fits two nested multinomial tree models, and assesses and compares their absolute and incremental fit. Its implementation requires the availability of the results of several independent studies. These should be carried out using similar designs to provide frequencies of cross-classification between a test and the reference under investigation. The procedure is applied in two examples with real data.

  8. The analysis accuracy assessment of CORINE land cover in the Iberian coast

    NASA Astrophysics Data System (ADS)

    Grullón, Yraida R.; Alhaddad, Bahaaeddin; Cladera, Josep R.

    2009-09-01

    Corine Land Cover 2000 (CLC2000) is a project jointly managed by the Joint Research Centre (JRC) and the European Environment Agency (EEA). Its aim is to update the Corine land cover database in Europe for the year 2000. Landsat-7 Enhanced Thematic Mapper (ETM) satellite images, acquired within the framework of the Image2000 project, were used for the update. Knowledge of land status through CORINE Land Cover mapping is of great importance for studying the interaction of land cover and land use categories at the European scale. This paper presents the accuracy assessment methodology designed and implemented to validate the Iberian Coast CORINE Land Cover 2000 cartography. It presents an implementation of a new methodological concept for land cover data production, object-based classification, and automatic generalization to assess the thematic accuracy of CLC2000 by means of an independent data source, based on the comparison of the land cover database with reference data derived from visual interpretation of high-resolution satellite imagery for sample areas. In our case study, the existing object-based classifications are supported with digital maps and attribute databases. According to the quality tests performed, we computed the overall accuracy and the kappa coefficient. We focus on the development of a methodology based on classification and generalization analysis for built-up areas that may improve the investigation. This study can be divided into these fundamental steps: extract artificial areas from land use classifications based on Landsat and Spot images; manual interpretation of high-resolution multispectral images; determine the homogeneity of artificial areas by a generalization process; and apply overall accuracy, kappa coefficient, and special grid (fishnet) tests for quality assessment. Finally, this paper illustrates the accuracy of the CORINE dataset based on the above steps.

  9. Accuracy of ELISA detection methods for gluten and reference materials: a realistic assessment.

    PubMed

    Diaz-Amigo, Carmen; Popping, Bert

    2013-06-19

    The determination of prolamins by ELISA and subsequent conversion of the resulting concentration to gluten content in food appears to be a comparatively simple and straightforward process with which many laboratories have years-long experience. At the end of the process, a value of gluten, expressed in mg/kg or ppm, is obtained. This value often is the basis for the decision if a product can be labeled gluten-free or not. On the basis of currently available scientific information, the accuracy of the obtained values with commonly used commercial ELISA kits has to be questioned. Although recently several multilaboratory studies have been conducted in an attempt to emphasize and ensure the accuracy of the results, data suggest that it was the precision of these assays, not the accuracy, that was confirmed because some of the underlying assumptions for calculating the gluten content lack scientific data support as well as appropriate reference materials for comparison. This paper discusses the issues of gluten determination and quantification with respect to antibody specificity, extraction procedures, reference materials, and their commutability.

  10. Assessing the Accuracy of Alaska National Hydrography Data for Mapping and Science

    NASA Astrophysics Data System (ADS)

    Arundel, S. T.; Yamamoto, K. H.; Mantey, K.; Vinyard-Houx, J.; Miller-Corbett, C. D.

    2012-12-01

    In July, 2011, the National Geospatial Program embarked on a large-scale Alaska Topographic Mapping Initiative. Maps will be published through the USGS US Topo program. Mapping of the state requires an understanding of the spatial quality of the National Hydrography Dataset (NHD), which is the hydrographic source for the US Topo. The NHD in Alaska was originally produced from topographic maps at 1:63,360 scale. It is critical to determine whether the NHD is accurate enough to be represented at the targeted map scale of the US Topo (1:25,000). Concerns are the spatial accuracy of data and the density of the stream network. Unsuitably low accuracy can be a result of the lower positional accuracy standards required for the original 1:63,360 scale mapping, temporal changes in water features, or any combination of these factors. Insufficient positional accuracy results in poor vertical integration with data layers of higher positional accuracy. Poor integration is readily apparent on the US Topo, particularly relative to current imagery and elevation data. In Alaska, current IFSAR-derived digital terrain models meet positional accuracy requirements for 1:24,000-scale mapping. Initial visual assessments indicate a wide range in the quality of fit between features in NHD and the IFSAR. However, no statistical analysis had been performed to quantify NHD feature accuracy. Determining the absolute accuracy is cost prohibitive, because of the need to collect independent, well-defined test points for such analysis; however, quantitative analysis of relative positional error is a feasible alternative. The purpose of this study is to determine the baseline accuracy of Alaska NHD pertinent to US Topo production, and to recommend reasonable guidelines and costs for NHD improvement and updates. A second goal is to detect error trends that might help identify areas or features where data improvements are most needed. There are four primary objectives of the study: 1. Choose study

  11. Technical note: A physical phantom for assessment of accuracy of deformable alignment algorithms

    SciTech Connect

    Kashani, Rojano; Hub, Martina; Kessler, Marc L.; Balter, James M.

    2007-07-15

    The purpose of this study was to investigate the feasibility of a simple deformable phantom as a QA tool for testing and validation of deformable image registration algorithms. A diagnostic thoracic imaging phantom with a deformable foam insert was used in this study. Small plastic markers were distributed through the foam to create a lattice with a measurable deformation as the ground truth data for all comparisons. The foam was compressed in the superior-inferior direction using a one-dimensional drive stage pushing a flat 'diaphragm' to create deformations similar to those from inhale and exhale states. Images were acquired at different compressions of the foam and the location of every marker was manually identified on each image volume to establish a known deformation field with a known accuracy. The markers were removed digitally from corresponding images prior to registration. Different image registration algorithms were tested using this method. Repeat measurement of marker positions showed an accuracy of better than 1 mm in identification of the reference marks. Testing the method on several image registration algorithms showed that the system is capable of evaluating errors quantitatively. This phantom is able to quantitatively assess the accuracy of deformable image registration, using a measure of accuracy that is independent of the signals that drive the deformation parameters.
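    The phantom's marker lattice supplies a ground-truth deformation field, so a registration algorithm can be scored by applying its estimated deformation to the marker positions and measuring the residual distance to the manually identified positions. A minimal sketch under simplified assumptions (a pure compression stands in for the phantom deformation, and the function names are hypothetical):

```python
import numpy as np

def target_registration_error(markers_fixed, markers_deformed, deform):
    """Per-marker error of a deformable registration: apply the estimated
    deformation to marker positions in the fixed image and compare with
    their manually identified positions in the deformed image (mm)."""
    predicted = np.array([deform(p) for p in np.asarray(markers_fixed, float)])
    return np.linalg.norm(predicted - np.asarray(markers_deformed, float), axis=1)

# Toy ground truth: 10% compression along z (illustrative, not phantom data)
fixed = np.array([[10.0, 10.0, 50.0], [20.0, 15.0, 80.0]])
true_def = lambda p: p * np.array([1.0, 1.0, 0.9])
deformed = np.array([true_def(p) for p in fixed])

# An imperfect algorithm estimate (12% compression) yields nonzero errors
errors = target_registration_error(
    fixed, deformed, lambda p: p * np.array([1.0, 1.0, 0.88]))
```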

  12. An assessment of template-guided implant surgery in terms of accuracy and related factors

    PubMed Central

    Lee, Jee-Ho; Park, Ji-Man; Kim, Soung-Min; Kim, Myung-Joo; Lee, Jong-Ho

    2013-01-01

    PURPOSE Template-guided implant therapy has developed hand-in-hand with computed tomography (CT) to improve the accuracy of implant surgery and future prosthodontic treatment. In the present study, the accuracy of, and causative factors for error in, computer-assisted implant surgery were assessed to further validate the stable clinical application of this technique. MATERIALS AND METHODS A total of 102 implants in 48 patients were included in this study. Implant surgery was performed with a stereolithographic template. Pre- and post-operative CTs were used to compare the planned and placed implants. Accuracy and related factors were statistically analyzed with the Spearman correlation method and the linear mixed model. Differences were considered to be statistically significant at P≤.05. RESULTS The mean errors of computer-assisted implant surgery were 1.09 mm at the coronal center and 1.56 mm at the apical center, and the axis deviation was 3.80°. The coronal and apical errors of the implants were found to be strongly correlated. Errors at the coronal center were magnified at the apical center by the fixture length. Anterior edentulous areas and longer fixtures reduced the accuracy of the implant template. CONCLUSION Control of errors at the coronal center and stabilization of the anterior part of the template are needed for safe implant surgery and future prosthodontic treatment. PMID:24353883

  13. Accuracy assessment of minimum control points for UAV photography and georeferencing

    NASA Astrophysics Data System (ADS)

    Skarlatos, D.; Procopiou, E.; Stavrou, G.; Gregoriou, M.

    2013-08-01

    In recent years, Autonomous Unmanned Aerial Vehicles (AUAVs) have become popular among researchers across disciplines because they combine many advantages. One major application is monitoring and mapping. Their ability to fly beyond eyesight autonomously, collecting data over large areas whenever and wherever needed, makes them an excellent platform for monitoring hazardous areas or disasters. In both cases rapid mapping is needed, while human access isn't always a given. Indeed, current automatic processing of aerial photos using photogrammetry and computer vision algorithms allows for rapid orthophotomap production and Digital Surface Model (DSM) generation, as tools for monitoring and damage assessment. In such cases, control point measurement using GPS is either impossible, time consuming, or costly. This work investigates the accuracies that can be attained using few or no control points over areas of one square kilometer, in two test sites: a typical block and a corridor survey. On-board GPS data logged during the AUAV's flight are used for direct georeferencing, while ground check points are used for evaluation. In addition, various control point layouts are tested using bundle adjustment for accuracy evaluation. Results indicate that it is possible to use on-board single-frequency GPS for direct georeferencing in cases of disaster management, areas without easy access, or even featureless areas. Due to the large number of tie points in the bundle adjustment, horizontal accuracy requirements can be fulfilled with a rather small number of control points, but vertical accuracy may not.

  14. Scoring and Testing Procedures Devoted to Probabilistic Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Albarello, Dario; D'Amico, Vera

    2015-03-01

    This review addresses long-term (tens of years) seismic ground-motion forecasting (seismic hazard assessment) in the presence of alternative computational models (the so-called epistemic uncertainty affecting hazard estimates). We review the different approaches that have been proposed to manage epistemic uncertainty in the context of probabilistic seismic hazard assessment (PSHA). Ex-ante procedures (based on the combination of expert judgments about inherent characteristics of the PSHA model) and ex-post approaches (based on empirical comparison of model outcomes and observations) should not be considered as mutually exclusive alternatives but can be combined in a coherent Bayesian view. Therefore, we propose a procedure that allows a better exploitation of available PSHA models to obtain comprehensive estimates, which account for both epistemic and aleatory uncertainty. We also discuss the respective roles of empirical ex-post scoring and testing of alternative models concurring in the development of comprehensive hazard maps. In order to show how the proposed procedure may work, we also present a tentative application to the Italian area. In particular, four PSHA models are evaluated ex-post against macroseismic effects actually observed in a large set of Italian municipalities during the time span 1957-2006. This analysis shows that, when the whole Italian area is considered, all the models provide estimates that do not agree with the observations. However, two of them provide results that are compatible with observations when a subregion of Italy (Apulia Region) is considered. By focusing on this area, we computed a comprehensive hazard curve for a single locality in order to show the feasibility of the proposed procedure.

  15. Formative Assessment of Procedural Skills: Students' Responses to the Objective Structured Clinical Examination and the Integrated Performance Procedural Instrument

    ERIC Educational Resources Information Center

    Nestel, Debra; Kneebone, Roger; Nolan, Carmel; Akhtar, Kash; Darzi, Ara

    2011-01-01

    Assessment of clinical skills is a critical element of undergraduate medical education. We compare a traditional approach to procedural skills assessment--the Objective Structured Clinical Examination (OSCE) with the Integrated Performance Procedural Instrument (IPPI). In both approaches, students work through "stations" or "scenarios" undertaking…

  16. Accuracy assessment of modeling architectural structures and details using terrestrial laser scanning

    NASA Astrophysics Data System (ADS)

    Kedzierski, M.; Walczykowski, P.; Orych, A.; Czarnecka, P.

    2015-08-01

    One of the most important aspects when performing architectural documentation of cultural heritage structures is the accuracy of both the data and the products which are generated from these data: documentation in the form of 3D models or vector drawings. The paper describes an assessment of the accuracy of modelling data acquired using a terrestrial phase scanner in relation to the density of a point cloud representing the surface of different types of construction materials typical for cultural heritage structures. This analysis includes the impact of the scanning geometry: the incidence angle of the laser beam and the scanning distance. For the purposes of this research, a test field consisting of samples of different types of construction materials (brick, wood, plastic, plaster, a ceramic tile, sheet metal) was built. The study involved conducting measurements at different angles and from a range of distances for chosen scanning densities. Data, acquired in the form of point clouds, were then filtered and modelled. An accuracy assessment of the 3D model was conducted by fitting it with the point cloud. The reflection intensity of each type of material was also analyzed, trying to determine which construction materials have the highest reflectance coefficients and which have the lowest, and in turn how this variable changes for different scanning parameters. Additionally, measurements were taken of a fragment of a building in order to compare the results obtained in laboratory conditions with those obtained in field conditions.

  17. Assessing the accuracy and reproducibility of modality independent elastography in a murine model of breast cancer

    PubMed Central

    Weis, Jared A.; Flint, Katelyn M.; Sanchez, Violeta; Yankeelov, Thomas E.; Miga, Michael I.

    2015-01-01

Cancer progression has been linked to mechanics. Therefore, there has been recent interest in developing noninvasive imaging tools for cancer assessment that are sensitive to changes in tissue mechanical properties. We have developed one such method, modality independent elastography (MIE), that estimates the relative elastic properties of tissue by fitting anatomical image volumes acquired before and after the application of compression to biomechanical models. The aim of this study was to assess the accuracy and reproducibility of the method using phantoms and a murine breast cancer model. Magnetic resonance imaging data were acquired, and the MIE method was used to estimate relative volumetric stiffness. Accuracy was assessed using phantom data by comparing to gold-standard mechanical testing of elasticity ratios. Validation error was <12%. Reproducibility analysis was performed on animal data, and within-subject coefficients of variation ranged from 2 to 13% at the bulk level and 32% at the voxel level. To our knowledge, this is the first study to assess the reproducibility of an elasticity imaging metric in a preclinical cancer model. Our results suggest that the MIE method can reproducibly generate accurate estimates of the relative mechanical stiffness and provide guidance on the degree of change needed in order to declare biological changes rather than experimental error in future therapeutic studies. PMID:26158120
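The within-subject coefficient of variation (wCV) used in the reproducibility analysis can be sketched as follows; the replicate stiffness values below are hypothetical, not the study's data.

```python
import math

def within_subject_cv(replicates):
    """Within-subject coefficient of variation: square root of the mean
    within-subject variance, divided by the grand mean of all values."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    mean_within_var = sum(var(r) for r in replicates) / len(replicates)
    grand_mean = (sum(x for r in replicates for x in r)
                  / sum(len(r) for r in replicates))
    return math.sqrt(mean_within_var) / grand_mean

# hypothetical repeat stiffness ratios for three animals
subjects = [[1.8, 2.0], [2.4, 2.6], [3.0, 2.9]]
print(round(within_subject_cv(subjects) * 100, 1))  # wCV as a percentage
```

A wCV computed this way gives the "degree of change needed" threshold the authors mention: observed changes smaller than the wCV cannot be distinguished from measurement noise.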

  18. Gender differences in structured risk assessment: comparing the accuracy of five instruments.

    PubMed

    Coid, Jeremy; Yang, Min; Ullrich, Simone; Zhang, Tianqiang; Sizmur, Steve; Roberts, Colin; Farrington, David P; Rogers, Robert D

    2009-04-01

    Structured risk assessment should guide clinical risk management, but it is uncertain which instrument has the highest predictive accuracy among men and women. In the present study, the authors compared the Psychopathy Checklist-Revised (PCL-R; R. D. Hare, 1991, 2003); the Historical, Clinical, Risk Management-20 (HCR-20; C. D. Webster, K. S. Douglas, D. Eaves, & S. D. Hart, 1997); the Risk Matrix 2000-Violence (RM2000[V]; D. Thornton et al., 2003); the Violence Risk Appraisal Guide (VRAG; V. L. Quinsey, G. T. Harris, M. E. Rice, & C. A. Cormier, 1998); the Offenders Group Reconviction Scale (OGRS; J. B. Copas & P. Marshall, 1998; R. Taylor, 1999); and the total previous convictions among prisoners, prospectively assessed prerelease. The authors compared predischarge measures with subsequent offending and instruments ranked using multivariate regression. Most instruments demonstrated significant but moderate predictive ability. The OGRS ranked highest for violence among men, and the PCL-R and HCR-20 H subscale ranked highest for violence among women. The OGRS and total previous acquisitive convictions demonstrated greatest accuracy in predicting acquisitive offending among men and women. Actuarial instruments requiring no training to administer performed as well as personality assessment and structured risk assessment and were superior among men for violence.

  19. Assessing the quality of studies on the diagnostic accuracy of tumor markers

    PubMed Central

    Goebell, Peter J.; Kamat, Ashish M.; Sylvester, Richard J.; Black, Peter; Droller, Michael; Godoy, Guilherme; Hudson, M’Liss A.; Junker, Kerstin; Kassouf, Wassim; Knowles, Margaret A.; Schulz, Wolfgang A.; Seiler, Roland; Schmitz-Dräger, Bernd J.

    2015-01-01

Objectives With rapidly increasing numbers of publications, assessments of study quality, reporting quality, and classification of studies according to their level of evidence or developmental stage have become key issues in weighing the relevance of new information reported. Diagnostic marker studies are often criticized for yielding highly discrepant and even controversial results. Much of this discrepancy has been attributed to differences in study quality. So far, numerous tools for measuring study quality have been developed, but few of them have been used for systematic reviews and meta-analyses. This is owing to the fact that most tools are complicated and time consuming, suffer from poor reproducibility, and do not permit quantitative scoring. Methods The International Bladder Cancer Network (IBCN) has taken up this problem and has systematically identified the most commonly used tools developed since 2000. Results In this review, those tools addressing study quality (Quality Assessment of Studies of Diagnostic Accuracy and Newcastle-Ottawa Scale), reporting quality (Standards for Reporting of Diagnostic Accuracy), and developmental stage (IBCN phases) of studies on diagnostic markers in bladder cancer are introduced and critically analyzed. Based upon this, the IBCN has launched an initiative to assess and validate existing tools with emphasis on diagnostic bladder cancer studies. Conclusions The development of simple and reproducible tools for quality assessment of diagnostic marker studies permitting quantitative scoring is suggested. PMID:25159014

  20. A comparative study between evaluation methods for quality control procedures for determining the accuracy of PET/CT registration

    NASA Astrophysics Data System (ADS)

    Cha, Min Kyoung; Ko, Hyun Soo; Jung, Woo Young; Ryu, Jae Kwang; Choe, Bo-Young

    2015-08-01

The accuracy of registration between positron emission tomography (PET) and computed tomography (CT) images is one of the important factors for reliable diagnosis in PET/CT examinations. Although quality control (QC) checks of the alignment of PET and CT images should be performed periodically, the procedures have not been fully established. The aim of this study was to determine optimal QC procedures that can be performed at the user level to ensure the accuracy of PET/CT registration. Two phantoms were used to carry out this study: the American College of Radiology (ACR)-approved PET phantom and the National Electrical Manufacturers Association (NEMA) International Electrotechnical Commission (IEC) body phantom, containing fillable spheres. All PET/CT images were acquired on a Biograph TruePoint 40 PET/CT scanner using routine protocols. To measure registration error, the spatial coordinates of the estimated centers of the target slice (spheres) were calculated independently for the PET and the CT images in two ways. We compared the images from the ACR-approved PET phantom to those from the NEMA IEC body phantom. Also, we measured the total time required from phantom preparation to image analysis. The first analysis method showed a total difference of 0.636 ± 0.11 mm for the largest hot sphere and 0.198 ± 0.09 mm for the largest cold sphere in the case of the ACR-approved PET phantom. In the NEMA IEC body phantom, the total difference was 3.720 ± 0.97 mm for the largest hot sphere and 4.800 ± 0.85 mm for the largest cold sphere. The second analysis method showed that the differences in the x location at the line profile of the lesion on PET and CT were (1.33, 1.33) mm for a bone lesion, (-1.26, -1.33) mm for an air lesion and (-1.67, -1.60) mm for a hot sphere lesion for the ACR-approved PET phantom. For the NEMA IEC body phantom, the differences in the x location at the line profile of the lesion on PET and CT were (-1.33, 4.00) mm for the air
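The first analysis method (comparing the estimated sphere centers on PET and CT) amounts to a per-axis offset plus a total 3-D distance. A minimal sketch, with hypothetical coordinates rather than the study's measurements:

```python
import math

def registration_error(pet_center, ct_center):
    """Per-axis offsets and total 3-D distance (mm) between the
    estimated center of the same sphere on the PET and CT images."""
    dx, dy, dz = (p - c for p, c in zip(pet_center, ct_center))
    total = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx, dy, dz), total

# hypothetical sphere centers in scanner coordinates (mm)
offsets, total = registration_error((10.2, -4.9, 30.1), (10.0, -5.0, 30.0))
print(offsets, round(total, 3))
```

Repeating this for each fillable sphere yields the "total difference" figures quoted per phantom in the abstract.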

  1. Development of three dimensional Eulerian numerical procedure toward plate-mantle simulation: accuracy test by the fluid rope coiling

    NASA Astrophysics Data System (ADS)

    Furuichi, M.; Kameyama, M.; Kageyama, A.

    2007-12-01

Reproducing realistic plate tectonics with mantle convection simulation is one of the greatest challenges in computational geophysics. We have developed a three-dimensional Eulerian numerical procedure toward plate-mantle simulation, which includes finite deformation of the plate in the mantle convection. Our method, combining CIP-CSLR (Constrained Interpolation Profile method-Conservative Semi-Lagrangian advection scheme with Rational function) and the ACuTE method, enables us to solve the advection and force balance equations even with a large and sharp viscosity jump, which marks the interface between the plates and the surrounding upper mantle materials. One of the typical phenomena represented by our method is a fluid rope coiling event, in which a stream of viscous fluid is poured onto the bottom plane from a certain height. This coiling motion is due to delicate balances between the bending, twisting and stretching motions of the fluid rope. In the framework of the Eulerian scheme, the fluid rope and surrounding air are treated as a viscosity profile which differs by several orders of magnitude. Our method solves the complex force balances of the fluid rope and air by a multigrid iteration technique of the ACuTE algorithm. In addition, the CIP-CSLR advection scheme allows us to obtain the deforming shape of the fluid rope as a low-diffusion solution in the Eulerian frame of reference. In this presentation, we will show the simulation result of the fluid rope coiling as an accuracy test for our simulation scheme, by comparing with the simplified numerical solution for a thin viscous jet.

  2. Assessing the Accuracy of Cone-Beam Computerized Tomography in Measuring Thinning Oral and Buccal Bone.

    PubMed

    Raskó, Zoltán; Nagy, Lili; Radnai, Márta; Piffkó, József; Baráth, Zoltán

    2016-06-01

The aim of this study was to assess the accuracy and reliability of cone-beam computerized tomography (CBCT) in measuring thinning bone surrounding dental implants. Three implants were inserted into the mandible of a domestic pig at 6 different bone thicknesses on the vestibular and the lingual sides, and measurements were recorded using CBCT. The results were obtained, analyzed, and compared with areas without implants. Our results indicated that the bone thickness and the neighboring implants decreased the accuracy and reliability of CBCT for measuring bone volume around dental implants. We concluded that CBCT slightly under-measured the bone thickness around the implant, both buccally and orally, compared with the same thickness without the implant. These results indicate that the i-CAT NG with a 0.2 voxel size is not accurate enough for either qualitative or quantitative bone evaluation, especially when the bone is thinner than 0.72 mm in the horizontal dimension.

  3. Predictive accuracy of the Miller assessment for preschoolers in children with prenatal drug exposure.

    PubMed

    Fulks, Mary-Ann L; Harris, Susan R

    2005-01-01

The Miller Assessment for Preschoolers (MAP) is a standardized test purported to identify preschool-aged children at risk for later learning difficulties. We evaluated the predictive validity of the MAP Total Score, relative to later cognitive performance and across a range of possible cutoff points, in 37 preschool-aged children with prenatal drug exposure. Criterion measures were the Wechsler Preschool & Primary Scale of Intelligence-Revised (WPPSI-R), Test of Early Reading Ability-2, Peabody Picture Vocabulary Test-Revised, and Developmental Test of Visual Motor Integration. The highest predictive accuracy was demonstrated when the WPPSI-R was the criterion measure. The 14th percentile cutoff point demonstrated the highest predictive accuracy across all measures.

  4. Theory and methods for accuracy assessment of thematic maps using fuzzy sets

    SciTech Connect

    Gopal, S.; Woodcock, C. )

    1994-02-01

The use of fuzzy sets in map accuracy assessment expands the amount of information that can be provided regarding the nature, frequency, magnitude, and source of errors in a thematic map. The need for using fuzzy sets arises from the observation that all map locations do not fit unambiguously in a single map category. Fuzzy sets allow for varying levels of set membership for multiple map categories. A linguistic measurement scale allows the kinds of comments commonly made during map evaluations to be used to quantify map accuracy. Four tables result from the use of fuzzy functions, and when taken together they provide more information than traditional confusion matrices. The use of a hypothetical dataset helps illustrate the benefits of the new methods. It is hoped that the enhanced ability to evaluate maps resulting from the use of fuzzy sets will improve our understanding of uncertainty in maps and facilitate improved error modeling.
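A minimal sketch of the fuzzy evaluation idea, in the spirit of the MAX and RIGHT functions this approach is known for: each reference site carries a linguistic rating (1 = absolutely wrong ... 5 = absolutely right) for every candidate class. The site data below are hypothetical, and the exact table definitions in the paper may differ.

```python
def max_function(ratings, map_label):
    """Site counts as a match only if the map label's rating is at
    least as high as every other class's rating (strictest measure)."""
    return ratings[map_label] >= max(ratings.values())

def right_function(ratings, map_label, threshold=3):
    """Site counts as acceptable if the map label's rating is
    'reasonable or better' (a more tolerant measure)."""
    return ratings[map_label] >= threshold

# (ratings per class, label assigned by the map) for three sites
sites = [
    ({"conifer": 5, "hardwood": 2, "shrub": 1}, "conifer"),
    ({"conifer": 3, "hardwood": 4, "shrub": 1}, "conifer"),
    ({"conifer": 1, "hardwood": 2, "shrub": 5}, "hardwood"),
]
n = len(sites)
print("MAX accuracy:", sum(max_function(r, lab) for r, lab in sites) / n)
print("RIGHT accuracy:", sum(right_function(r, lab) for r, lab in sites) / n)
```

The gap between the two scores is the extra information a confusion matrix hides: site 2 is not the best-matching class, yet it is still an acceptable label.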

  5. Accuracy assessment of a mobile terrestrial lidar survey at Padre Island National Seashore

    USGS Publications Warehouse

    Lim, Samsung; Thatcher, Cindy A.; Brock, John C.; Kimbrow, Dustin R.; Danielson, Jeffrey J.; Reynolds, B.J.

    2013-01-01

    The higher point density and mobility of terrestrial laser scanning (light detection and ranging (lidar)) is desired when extremely detailed elevation data are needed for mapping vertically orientated complex features such as levees, dunes, and cliffs, or when highly accurate data are needed for monitoring geomorphic changes. Mobile terrestrial lidar scanners have the capability for rapid data collection on a larger spatial scale compared with tripod-based terrestrial lidar, but few studies have examined the accuracy of this relatively new mapping technology. For this reason, we conducted a field test at Padre Island National Seashore of a mobile lidar scanner mounted on a sport utility vehicle and integrated with a position and orientation system. The purpose of the study was to assess the vertical and horizontal accuracy of data collected by the mobile terrestrial lidar system, which is georeferenced to the Universal Transverse Mercator coordinate system and the North American Vertical Datum of 1988. To accomplish the study objectives, independent elevation data were collected by conducting a high-accuracy global positioning system survey to establish the coordinates and elevations of 12 targets spaced throughout the 12 km transect. These independent ground control data were compared to the lidar scanner-derived elevations to quantify the accuracy of the mobile lidar system. The performance of the mobile lidar system was also tested at various vehicle speeds and scan density settings (e.g. field of view and linear point spacing) to estimate the optimal parameters for desired point density. After adjustment of the lever arm parameters, the final point cloud accuracy was 0.060 m (east), 0.095 m (north), and 0.053 m (height). The very high density of the resulting point cloud was sufficient to map fine-scale topographic features, such as the complex shape of the sand dunes.
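The accuracy quantification described above reduces to per-component RMSE of the lidar-derived coordinates against the GPS control targets. A minimal sketch with hypothetical coordinate pairs (not the Padre Island data):

```python
import math

def component_rmse(control, lidar):
    """RMSE between control and lidar (east, north, height) tuples,
    computed separately for each component, in meters."""
    diffs = [[l[i] - c[i] for c, l in zip(control, lidar)]
             for i in range(3)]
    return tuple(math.sqrt(sum(d * d for d in ds) / len(ds))
                 for ds in diffs)

# hypothetical (east, north, height) pairs in meters
control = [(100.00, 200.00, 5.00), (150.00, 260.00, 6.10)]
lidar   = [(100.06, 200.09, 5.05), (150.06, 259.91, 6.16)]
east_rmse, north_rmse, height_rmse = component_rmse(control, lidar)
print(east_rmse, north_rmse, height_rmse)
```

Running this over the 12 surveyed targets is what produces the 0.060 m / 0.095 m / 0.053 m figures quoted in the abstract.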

  6. Rectal cancer staging: Multidetector-row computed tomography diagnostic accuracy in assessment of mesorectal fascia invasion

    PubMed Central

    Ippolito, Davide; Drago, Silvia Girolama; Franzesi, Cammillo Talei; Fior, Davide; Sironi, Sandro

    2016-01-01

AIM: To assess the diagnostic accuracy of multidetector-row computed tomography (MDCT) as compared with conventional magnetic resonance imaging (MRI), in identifying mesorectal fascia (MRF) invasion in rectal cancer patients. METHODS: Ninety-one patients with biopsy proven rectal adenocarcinoma referred for thoracic and abdominal CT staging were enrolled in this study. The contrast-enhanced MDCT scans were performed on a 256 row scanner (ICT, Philips) with the following acquisition parameters: tube voltage 120 KV, tube current 150-300 mAs. Imaging data were reviewed as axial and as multiplanar reconstructions (MPRs) images along the rectal tumor axis. MRI study, performed on 1.5 T with dedicated phased array multicoil, included multiplanar T2 and axial T1 sequences and diffusion weighted images (DWI). Axial and MPR CT images independently were compared to MRI and MRF involvement was determined. Diagnostic accuracy of both modalities was compared and statistically analyzed. RESULTS: According to MRI, the MRF was involved in 51 patients and not involved in 40 patients. DWI allowed to recognize the tumor as a focal mass with high signal intensity on high b-value images, compared with the signal of the normal adjacent rectal wall or with the lower tissue signal intensity background. The number of patients correctly staged by the native axial CT images was 71 out of 91 (41 with involved MRF; 30 with not involved MRF), while by using the MPR 80 patients were correctly staged (45 with involved MRF; 35 with not involved MRF). Local tumor staging suggested by MDCT agreed with those of MRI, obtaining for CT axial images sensitivity and specificity of 80.4% and 75%, positive predictive value (PPV) 80.4%, negative predictive value (NPV) 75% and accuracy 78%; while performing MPR the sensitivity and specificity increased to 88% and 87.5%, PPV was 90%, NPV 85.36% and accuracy 88%. MPR images showed higher diagnostic accuracy, in terms of MRF involvement, than native axial images.
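The MPR figures in the RESULTS can be reproduced from the patient counts given in the abstract (51 MRF-involved and 40 not-involved by MRI; 45 and 35 of them correctly staged by MPR):

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Standard diagnostic accuracy measures from a 2x2 table."""
    sens = tp / (tp + fn)                    # sensitivity
    spec = tn / (tn + fp)                    # specificity
    ppv = tp / (tp + fp)                     # positive predictive value
    npv = tn / (tn + fn)                     # negative predictive value
    acc = (tp + tn) / (tp + fn + tn + fp)    # overall accuracy
    return sens, spec, ppv, npv, acc

tp, fn = 45, 51 - 45   # MRF involved: correctly staged / missed
tn, fp = 35, 40 - 35   # MRF not involved: correctly staged / over-called
sens, spec, ppv, npv, acc = diagnostic_metrics(tp, fn, tn, fp)
print(f"sens {sens:.1%}, spec {spec:.1%}, PPV {ppv:.1%}, "
      f"NPV {npv:.1%}, acc {acc:.1%}")
```

The output (88.2%, 87.5%, 90.0%, 85.4%, 87.9%) matches the abstract's rounded MPR figures of 88%, 87.5%, 90%, 85.36%, and 88%.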

  7. Structured Assessment Approach: a procedure for the assessment of fuel cycle safeguard systems

    SciTech Connect

    Parziale, A.A.; Patenaude, C.J.; Renard, P.A.; Sacks, I.J.

    1980-03-06

Lawrence Livermore National Laboratory has developed and tested for the United States Nuclear Regulatory Commission a procedure for the evaluation of Material Control and Accounting (MC and A) systems at nuclear fuel facilities. This procedure, called the Structured Assessment Approach (SAA), subjects the MC and A system at a facility to a series of increasingly sophisticated adversaries and strategies. A fully integrated version of the computer codes which assist the analyst in this assessment was made available in October 1979. The concepts of the SAA and the results of the assessment of a hypothetical but typical facility are presented.

  8. Caspian Rapid Assessment Method: a localized procedure for assessment of wetlands at southern fringe of the Caspian Sea.

    PubMed

    Khorami Pour, Sanaz; Monavari, Seyed Masoud; Riazi, Borhan; Khorasani, Nematollah

    2015-07-01

Although Iran is one of the founders of the Ramsar Convention, there is no comprehensive information available in the country on the status of wetlands in the past or at present. There is also no specific guideline for assessing the status of wetlands in the basin of the Caspian Sea, an ecosystem with unique ecological features. The main aim of this study was to develop a new procedure called the "Caspian Rapid Assessment Method" (CRAM) for assessment of wetlands at the southern fringe of the Caspian Sea. To this end, 16 rapid assessment methods analyzed by the US EPA in 2003 were reviewed to provide an inventory of rapid assessment indices. Excluding less important indices, the inventory was short-listed based on Delphi panelists' consensus. The CRAM was developed with 6 main criteria and 12 sub-criteria. The modified method was used to assess three important wetlands, Anzali, Boojagh and Miyankaleh, at the southern border of the Caspian Sea. According to the obtained results, the highest score of 60 was assigned to the Anzali Wetland. With scores of 56 and 47, the Miyankaleh and Boojagh wetlands were ranked in the next priorities, respectively. At the final stage, the accuracy of the CRAM prioritization values was confirmed using the Friedman test. All of the wetlands were classified into category II, which indicates destroyed wetlands with rehabilitation potential. In recent years, serious threats have degraded the wetlands from class III (normal condition) to class II.

  9. Proposed Testing to Assess the Accuracy of Glass-To-Metal Seal Stress Analyses.

    SciTech Connect

    Chambers, Robert S.; Emery, John M; Tandon, Rajan; Antoun, Bonnie R.; Stavig, Mark E.; Newton, Clay S.; Gibson, Cory S; Bencoe, Denise N.

    2014-09-01

The material characterization tests conducted on 304L VAR stainless steel and Schott 8061 glass have provided higher-fidelity data for calibration of the material models used in glass-to-metal (GTM) seal analyses. Specifically, a thermo-multi-linear elastic plastic (thermo-MLEP) material model has been defined for SS304L, and the Simplified Potential Energy Clock nonlinear viscoelastic model has been calibrated for the S8061 glass. To assess the accuracy of finite element stress analyses of GTM seals, a suite of tests is proposed to provide data for comparison to model predictions.

  10. Comparative study of application accuracy of two frameless neuronavigation systems: experimental error assessment quantifying registration methods and clinically influencing factors.

    PubMed

    Paraskevopoulos, Dimitrios; Unterberg, Andreas; Metzner, Roland; Dreyhaupt, Jens; Eggers, Georg; Wirtz, Christian Rainer

    2010-04-01

    This study aimed at comparing the accuracy of two commercial neuronavigation systems. Error assessment and quantification of clinical factors and surface registration, often resulting in decreased accuracy, were intended. Active (Stryker Navigation) and passive (VectorVision Sky, BrainLAB) neuronavigation systems were tested with an anthropomorphic phantom with a deformable layer, simulating skin and soft tissue. True coordinates measured by computer numerical control were compared with coordinates on image data and during navigation, to calculate software and system accuracy respectively. Comparison of image and navigation coordinates was used to evaluate navigation accuracy. Both systems achieved an overall accuracy of <1.5 mm. Stryker achieved better software accuracy, whereas BrainLAB better system and navigation accuracy. Factors with conspicuous influence (P<0.01) were imaging, instrument replacement, sterile cover drape and geometry of instruments. Precision data indicated by the systems did not reflect measured accuracy in general. Surface matching resulted in no improvement of accuracy, confirming former studies. Laser registration showed no differences compared to conventional pointers. Differences between the two systems were limited. Surface registration may improve inaccurate point-based registrations but does not in general affect overall accuracy. Accuracy feedback by the systems does not always match with true target accuracy and requires critical evaluation from the surgeon.

  11. Mathematical accuracy of Aztec land surveys assessed from records in the Codex Vergara.

    PubMed

    Jorge, María del Carmen; Williams, Barbara J; Garza-Hume, C E; Olvera, Arturo

    2011-09-13

    Land surveying in ancient states is documented not only for Eurasia but also for the Americas, amply attested by two Acolhua-Aztec pictorial manuscripts from the Valley of Mexico. The Codex Vergara and the Códice de Santa María Asunción consist of hundreds of drawings of agricultural fields that uniquely record surface areas as well as perimeter measurements. A previous study of the Codex Vergara examines how Acolhua-Aztecs determined field area by reconstructing their calculation procedures. Here we evaluate the accuracy of their area values using modern mathematics. The findings verify the overall mathematical validity of the codex records. Three-quarters of the areas are within 5% of the maximum possible value, and 85% are within 10%, which compares well with reported errors by Western surveyors that postdate Aztec-Acolhua work by several centuries. PMID:21876138
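The headline figures (three-quarters of areas within 5% of the maximum possible value, 85% within 10%) come from a percent-of-maximum comparison that can be sketched as follows. The field areas below are hypothetical; in the study, the maximum possible area for the recorded side lengths comes from the geometric analysis of each quadrilateral.

```python
def share_within(recorded, maxima, pct):
    """Fraction of fields whose recorded area is within pct percent
    of the maximum area attainable for the recorded side lengths."""
    return sum(abs(m - r) / m * 100 <= pct
               for r, m in zip(recorded, maxima)) / len(recorded)

recorded = [98.0, 91.0, 104.0, 80.0]    # hypothetical recorded areas
maxima   = [100.0, 100.0, 110.0, 100.0] # hypothetical maximum areas
print(share_within(recorded, maxima, 5),
      share_within(recorded, maxima, 10))  # -> 0.25 0.75
```

Using the maximum attainable area as the reference gives a one-sided bound: a recorded area above it would be mathematically impossible, so closeness to the bound is evidence of careful surveying.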

  12. Mathematical accuracy of Aztec land surveys assessed from records in the Codex Vergara

    PubMed Central

    Williams, Barbara J.; Garza-Hume, C. E.; Olvera, Arturo

    2011-01-01

    Land surveying in ancient states is documented not only for Eurasia but also for the Americas, amply attested by two Acolhua–Aztec pictorial manuscripts from the Valley of Mexico. The Codex Vergara and the Códice de Santa María Asunción consist of hundreds of drawings of agricultural fields that uniquely record surface areas as well as perimeter measurements. A previous study of the Codex Vergara examines how Acolhua–Aztecs determined field area by reconstructing their calculation procedures. Here we evaluate the accuracy of their area values using modern mathematics. The findings verify the overall mathematical validity of the codex records. Three-quarters of the areas are within 5% of the maximum possible value, and 85% are within 10%, which compares well with reported errors by Western surveyors that postdate Aztec–Acolhua work by several centuries. PMID:21876138

  13. Application of a Monte Carlo accuracy assessment tool to TDRS and GPS

    NASA Technical Reports Server (NTRS)

    Pavloff, Michael S.

    1994-01-01

In support of a NASA study on the application of radio interferometry to satellite orbit determination, MITRE developed a simulation tool for assessing interferometric tracking accuracy. Initially, the tool was applied to the problem of determining optimal interferometric station siting for orbit determination of the Tracking and Data Relay Satellite (TDRS). Subsequently, the Orbit Determination Accuracy Estimator (ODAE) was expanded to model the general batch maximum likelihood orbit determination algorithms of the Goddard Trajectory Determination System (GTDS), with measurement types including not only group and phase delay from radio interferometry, but also range, range rate, angular measurements, and satellite-to-satellite measurements. The user of ODAE specifies the statistical properties of error sources, including inherent observable imprecision, atmospheric delays, station location uncertainty, and measurement biases. Upon Monte Carlo simulation of the orbit determination process, ODAE calculates the statistical properties of the error in the satellite state vector and any other parameters for which a solution was obtained in the orbit determination. This paper presents results from the application of ODAE to two different problems: (1) determination of optimal geometry for interferometric tracking of TDRS, and (2) expected orbit determination accuracy for Global Positioning System (GPS) tracking of low-earth orbit (LEO) satellites. Conclusions about optimal ground station locations for TDRS orbit determination by radio interferometry are presented, and the feasibility of GPS-based tracking for IRIDIUM, a LEO mobile satellite communications (MOBILSATCOM) system, is demonstrated.
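The Monte Carlo idea behind ODAE can be sketched in miniature: perturb the measurements with user-specified noise and bias statistics, rerun the estimator, and collect statistics of the resulting error. Here the batch orbit determination is stubbed out as a trivial one-parameter least-squares fit; everything below is an illustrative assumption, not ODAE's actual models.

```python
import random
import statistics

random.seed(42)
true_range = 1000.0   # km, hypothetical true value of the solved-for parameter

def estimate(measurements):
    """Stand-in for the batch maximum likelihood fit: for a constant
    parameter with Gaussian noise, least squares is just the mean."""
    return sum(measurements) / len(measurements)

errors = []
for _ in range(2000):                       # Monte Carlo trials
    meas = [true_range
            + random.gauss(0.0, 0.5)        # observable imprecision (sigma)
            + 0.1                           # an unmodeled measurement bias
            for _ in range(10)]
    errors.append(estimate(meas) - true_range)

# mean error recovers the bias; scatter shrinks as 1/sqrt(n_measurements)
print(round(statistics.mean(errors), 2), round(statistics.stdev(errors), 2))
```

The two output statistics separate systematic from random error, which is exactly the kind of breakdown a tool like ODAE reports for the satellite state vector.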

  14. The influence of sampling interval on the accuracy of trail impact assessment

    USGS Publications Warehouse

    Leung, Y.-F.; Marion, J.L.

    1999-01-01

    Trail impact assessment and monitoring (IA&M) programs have been growing in importance and application in recreation resource management at protected areas. Census-based and sampling-based approaches have been developed in such programs, with systematic point sampling being the most common survey design. This paper examines the influence of sampling interval on the accuracy of estimates for selected trail impact problems. A complete census of four impact types on 70 trails in Great Smoky Mountains National Park was utilized as the base data set for the analyses. The census data were resampled at increasing intervals to create a series of simulated point data sets. Estimates of frequency of occurrence and lineal extent for the four impact types were compared with the census data set. The responses of accuracy loss on lineal extent estimates to increasing sampling intervals varied across different impact types, while the responses on frequency of occurrence estimates were consistent, approximating an inverse asymptotic curve. These findings suggest that systematic point sampling may be an appropriate method for estimating the lineal extent but not the frequency of trail impacts. Sample intervals of less than 100 m appear to yield an excellent level of accuracy for the four impact types evaluated. Multiple regression analysis results suggest that appropriate sampling intervals are more likely to be determined by the type of impact in question rather than the length of trail. The census-based trail survey and the resampling-simulation method developed in this study can be a valuable first step in establishing long-term trail IA&M programs, in which an optimal sampling interval range with acceptable accuracy is determined before investing efforts in data collection.
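The resampling-simulation method can be sketched as follows: start from a census (here one presence/absence reading per meter of a hypothetical trail) and estimate frequency of occurrence by keeping every k-th point, mimicking coarser sampling intervals.

```python
def frequency_estimate(census, interval):
    """Share of sampled points at which the impact is present,
    sampling every `interval`-th census point."""
    sampled = census[::interval]
    return sum(sampled) / len(sampled)

# hypothetical census: impact present over 100-150 m and 300-320 m
census = [1 if 100 <= m < 150 or 300 <= m < 320 else 0
          for m in range(1000)]
print("census frequency:", sum(census) / len(census))
for k in (1, 20, 100):   # sampling interval in meters
    print(k, round(frequency_estimate(census, k), 3))
```

As the interval grows, the estimate drifts from the census value, illustrating the accuracy loss the study quantifies; in this toy run the 100 m interval badly overestimates the frequency because two of its ten points happen to land inside impact zones.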

  15. Assessment of the Geodetic and Color Accuracy of Multi-Pass Airborne/Mobile Lidar Data

    NASA Astrophysics Data System (ADS)

    Pack, R. T.; Petersen, B.; Sunderland, D.; Blonquist, K.; Israelsen, P.; Crum, G.; Fowles, A.; Neale, C.

    2008-12-01

The ability to merge lidar and color image data acquired by multiple passes of an aircraft or van is largely dependent on the accuracy of the navigation system that estimates the dynamic position and orientation of the sensor. We report an assessment of the performance of a Riegl Q560 lidar transceiver combined with a Litton LN-200 inertial measurement unit (IMU)-based NovAtel SPAN GPS/IMU system and a Panasonic HD video camera system. Several techniques are reported that were used to maximize the performance of the GPS/IMU system in generating precisely merged point clouds. The airborne data used included eight flight lines, all overflying the same building on the campus of Utah State University. These lines were flown at the FAA minimum altitude of 1000 feet for fixed-wing aircraft. The mobile data were then acquired with the same system mounted to look sideways out of a van several months later. The van was driven around the same building at variable speed in order to avoid pedestrians. An absolute accuracy of about 6 cm and a relative accuracy of less than 2.5 cm (one sigma) are documented for the merged data. Several techniques are also reported for merging the color video stream with the lidar point cloud. A technique for back-projecting and burning lidar points within the video stream enables verification of co-boresighting accuracy. The resulting pixel-level alignment is accurate to within the size of a lidar footprint. The techniques described in this paper enable the display of high-resolution colored points with high detail and color clarity.

  16. Shuttle radar topography mission accuracy assessment and evaluation for hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Mercuri, Pablo Alberto

    Digital Elevation Models (DEMs) are increasingly used even in low relief landscapes for multiple mapping applications and modeling approaches such as surface hydrology, flood risk mapping, agricultural suitability, and generation of topographic attributes. The National Aeronautics and Space Administration (NASA) has produced a nearly global database of highly accurate elevation data, the Shuttle Radar Topography Mission (SRTM) DEM. The main goals of this thesis were to investigate quality issues of SRTM, provide measures of vertical accuracy with emphasis on low relief areas, and to analyze the performance for the generation of physical boundaries and streams for watershed modeling and characterization. The absolute and relative accuracy of the two SRTM resolutions, at 1 and 3 arc-seconds, were investigated to generate information that can be used as a reference in areas with similar characteristics in other regions of the world. The absolute accuracy was obtained from accurate point estimates using the best available federal geodetic network in Indiana. The SRTM root mean square error for this area of the Midwest US surpassed data specifications. It was on the order of 2 meters for the 1 arc-second resolution in flat areas of the Midwest US. Estimates of error were smaller for the global coverage 3 arc-second data with very similar results obtained in the flat plains in Argentina. In addition to calculating the vertical accuracy, the impacts of physiography and terrain attributes, like slope, on the error magnitude were studied. The assessment also included analysis of the effects of land cover on vertical accuracy. Measures of local variability were described to identify the adjacency effects produced by surface features in the SRTM DEM, like forests and manmade features near the geodetic point. 
Spatial relationships among the bare-earth National Elevation Data and SRTM were also analyzed to assess the relative accuracy that was 2.33 meters in terms of the total
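The vertical accuracy figures quoted in the record above come from a standard root-mean-square-error comparison of DEM heights against geodetic benchmarks. A minimal sketch, using made-up elevations rather than the study's data:

```python
import math

def vertical_rmse(dem_elevations, reference_elevations):
    """RMSE between DEM heights and geodetic benchmark heights (meters)."""
    residuals = [d - r for d, r in zip(dem_elevations, reference_elevations)]
    return math.sqrt(sum(e * e for e in residuals) / len(residuals))

# Hypothetical SRTM vs. benchmark heights, in meters (illustrative only)
dem = [251.8, 249.1, 253.4, 250.2]
ref = [250.0, 250.0, 252.0, 251.0]
print(round(vertical_rmse(dem, ref), 2))  # → 1.29
```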

  17. Geometric calibration and accuracy assessment of a multispectral imager on UAVs

    NASA Astrophysics Data System (ADS)

    Zheng, Fengjie; Yu, Tao; Chen, Xingfeng; Chen, Jiping; Yuan, Guoti

    2012-11-01

    The increasing development of Unmanned Aerial Vehicle (UAV) platforms and associated sensing technologies has widely promoted UAV remote sensing applications. UAVs, especially low-cost UAVs, limit the sensor payload in weight and dimension. Cameras on UAVs are mostly panoramic, fisheye-lens, or small-format CCD planar-array cameras; unknown intrinsic parameters and lens optical distortion cause serious image aberrations, leading to ground errors of a few meters to tens of meters per pixel. At the same time, the high spatial resolution of UAV imagery makes accurate geolocation all the more critical for quantitative remote sensing research. A geometric calibration method for the MCC4-12F Multispectral Imager, designed to be carried on UAVs, has been developed and implemented. A multi-image space resection algorithm, suitable for multispectral cameras, was used to estimate geometric calibration parameters from exposures at random positions and different photogrammetric altitudes in a 3D test field. Both theoretical and practical accuracy assessments were performed. The theoretical assessment, which resolves object-space and image-point coordinate differences by space intersection, showed object-space RMSE of 0.2 and 0.14 pixels in the X and Y directions, respectively, and image-space RMSE better than 0.5 pixels. To verify the accuracy and reliability of the calibration parameters, a practical study was carried out in UAV flight experiments in Tianjin; the corrected accuracy, validated against ground checkpoints, was better than 0.3 m. Typical surface reflectance retrieved from the geo-rectified data was compared with ground ASD measurements, with a 4% discrepancy. Hence, the approach presented here is suitable for UAV multispectral imagers.

  18. Accuracy assessment of 3D bone reconstructions using CT: an in vitro comparison.

    PubMed

    Lalone, Emily A; Willing, Ryan T; Shannon, Hannah L; King, Graham J W; Johnson, James A

    2015-08-01

    Computed tomography provides high contrast imaging of the joint anatomy and is used routinely to reconstruct 3D models of the osseous and cartilage geometry (CT arthrography) for use in the design of orthopedic implants, in computer-assisted surgeries, and in computational dynamic and structural analysis. The objective of this study was to assess the accuracy of bone and cartilage surface model reconstructions by comparing reconstructed geometries with bone digitizations obtained using an optical tracking system. Bone surface digitizations obtained in this study served as the ground truth measure of the underlying geometry. We evaluated a commercially available reconstruction technique with clinical CT scanning protocols, using the elbow joint as an example of a surface with complex geometry. To assess the accuracy of the reconstructed models (8 fresh-frozen cadaveric specimens) against the ground truth bony digitization, as defined by this study, proximity mapping was used to calculate residual error. The overall mean error was less than 0.4 mm in the cortical region and 0.3 mm in the subchondral region of the bone. Similarly, creating 3D cartilage surface models from CT scans using air contrast yielded a mean error of less than 0.3 mm. Results from this study indicate that clinical CT scanning protocols and commonly used, commercially available reconstruction algorithms can create models that accurately represent the true geometry.
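The proximity mapping mentioned in the abstract above amounts to measuring, for each digitized surface point, its distance to the reconstructed model. A simplified sketch using nearest-vertex distances (real proximity mapping measures point-to-surface distance; all coordinates here are hypothetical):

```python
import math

def proximity_errors(digitized_points, model_vertices):
    """Distance from each digitized point to the nearest model vertex (mm)."""
    return [min(math.dist(p, v) for v in model_vertices)
            for p in digitized_points]

# Hypothetical digitized points and a tiny stand-in "mesh" (mm)
pts = [(0.0, 0.0, 0.1), (1.0, 0.0, -0.2)]
mesh = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print([round(e, 2) for e in proximity_errors(pts, mesh)])  # → [0.1, 0.2]
```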

  19. Assessment of Accuracy and Reliability in Acetabular Cup Placement Using an iPhone/iPad System.

    PubMed

    Kurosaka, Kenji; Fukunishi, Shigeo; Fukui, Tomokazu; Nishio, Shoji; Fujihara, Yuki; Okahisa, Shohei; Takeda, Yu; Daimon, Takashi; Yoshiya, Shinichi

    2016-07-01

    Implant positioning is one of the critical factors that influences postoperative outcome of total hip arthroplasty (THA). Malpositioning of the implant may lead to an increased risk of postoperative complications such as prosthetic impingement, dislocation, restricted range of motion, polyethylene wear, and loosening. In 2012, the intraoperative use of smartphone technology in THA for improved accuracy of acetabular cup placement was reported. The purpose of this study was to examine the accuracy of an iPhone/iPad-guided technique in positioning the acetabular cup in THA compared with the reference values obtained from the image-free navigation system in a cadaveric experiment. Five hips of 5 embalmed whole-body cadavers were used in the study. Seven orthopedic surgeons (4 residents and 3 senior hip surgeons) participated in the study. All of the surgeons examined each of the 5 hips 3 times. The target angle was 38°/19° for operative inclination/anteversion angles, which corresponded to radiographic inclination/anteversion angles of 40°/15°. The simultaneous assessment using the navigation system showed mean±SD radiographic alignment angles of 39.4°±2.6° and 16.4°±2.6° for inclination and anteversion, respectively. Assessment of cup positioning based on Lewinnek's safe zone criteria showed all of the procedures (n=105) achieved acceptable alignment within the safe zone. A comparison of the performances by resident and senior hip surgeons showed no significant difference between the groups (P=.74 for inclination and P=.81 for anteversion). The iPhone/iPad technique examined in this study could achieve acceptable performance in determining cup alignment in THA regardless of the surgeon's expertise. [Orthopedics. 2016; 39(4):e621-e626.]. PMID:27322169
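The safe-zone check used in the study above is straightforward to express; a minimal sketch assuming Lewinnek's commonly cited limits of 40° ± 10° radiographic inclination and 15° ± 10° anteversion:

```python
def in_lewinnek_safe_zone(inclination_deg, anteversion_deg):
    """True if a cup orientation falls inside Lewinnek's safe zone."""
    return abs(inclination_deg - 40) <= 10 and abs(anteversion_deg - 15) <= 10

print(in_lewinnek_safe_zone(39.4, 16.4))  # the study's mean alignment → True
print(in_lewinnek_safe_zone(55.0, 16.0))  # → False
```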

  1. A PRIOR EVALUATION OF TWO-STAGE CLUSTER SAMPLING FOR ACCURACY ASSESSMENT OF LARGE-AREA LAND-COVER MAPS

    EPA Science Inventory

    Two-stage cluster sampling reduces the cost of collecting accuracy assessment reference data by constraining sample elements to fall within a limited number of geographic domains (clusters). However, because classification error is typically positively spatially correlated, withi...

  2. High-temperature flaw assessment procedure: Interim report

    SciTech Connect

    Ruggles, M.B.; Takahashi, Y.; Ainsworth, R.A.

    1989-08-01

    The current program represents a joint effort between the Electric Power Research Institute (EPRI) in the USA, the Central Research Institute of Electric Power Industry (CRIEPI) in Japan, and the Central Electricity Generating Board (CEGB) in the UK. The goal is to develop an interim high-temperature flaw assessment procedure for high-temperature reactor components. This is to be accomplished through exploratory experimental and analytical studies of high-temperature crack growth. The state-of-the-art assessment and the fracture mechanics database for both types 304 and 316 stainless steels, completed in 1988, serve as a foundation for the present work. Work in the three participating organizations is progressing roughly on schedule. Results to date are presented in this document. Fundamental test results are discussed in Section 2. Section 3 focuses on results of exploratory subcritical crack growth tests. Progress in subcritical crack growth modeling is reported in Section 4. Exploratory failure tests are outlined in Section 5. 21 refs., 70 figs., 7 tabs.

  3. Assessing Reading. Using Cloze Procedure To Assess Reading Skills. [Packet] and Handbook.

    ERIC Educational Resources Information Center

    Vaughan, Judy

    These instructor's materials consist of a handbook directed to the teacher and 33 worksheets teachers can use with adult students in order to use the cloze procedure to assess how readily they can read materials of differing complexity. The handbook introduces the materials by considering such questions as What is meant by reading?, How could…
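A cloze passage of the kind these worksheets use is typically built by deleting every nth word; a minimal sketch of that deletion scheme (the fixed n=5 interval is an assumption, not taken from the packet):

```python
def cloze(text, n=5):
    """Replace every nth word with a blank, the standard cloze deletion scheme."""
    words = text.split()
    return " ".join("_____" if (i + 1) % n == 0 else w
                    for i, w in enumerate(words))

print(cloze("the quick brown fox jumps over the lazy sleeping dog"))
# → "the quick brown fox _____ over the lazy sleeping _____"
```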

  4. The Eye Phone Study: reliability and accuracy of assessing Snellen visual acuity using smartphone technology

    PubMed Central

    Perera, C; Chakrabarti, R; Islam, F M A; Crowston, J

    2015-01-01

    Purpose Smartphone-based Snellen visual acuity charts have become popular; however, their accuracy has not been established. This study aimed to evaluate the equivalence of a smartphone-based visual acuity chart with a standard 6-m Snellen visual acuity (6SVA) chart. Methods First, a review of available Snellen chart applications on iPhone was performed to determine the most accurate application based on optotype size. Subsequently, a prospective comparative study was performed by measuring conventional 6SVA and then iPhone visual acuity using the 'Snellen' application on an Apple iPhone 4. Results Eleven applications were identified, with the accuracy of optotype size ranging from 4.4% to 39.9%. Eighty-eight patients from general medical and surgical wards in a tertiary hospital took part in the second part of the study. The mean difference in logMAR visual acuity between the two charts was 0.02 logMAR (95% limits of agreement -0.332, 0.372 logMAR). The largest mean difference in logMAR acuity was noted in the subgroup of patients with 6SVA worse than 6/18 (n=5), who had a mean difference of two Snellen visual acuity lines between the charts (0.276 logMAR). Conclusion At the time of the study, we did not identify a Snellen visual acuity app that could predict a patient's standard Snellen visual acuity within one line. There was considerable variability in the optotype accuracy of apps. Further validation is required for assessment of acuity in patients with severe vision impairment. PMID:25931170
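The logMAR differences reported above relate to Snellen fractions by a simple logarithm; a sketch of the standard conversion:

```python
import math

def snellen_to_logmar(numerator, denominator):
    """logMAR = log10 of the inverse Snellen fraction, e.g. 6/18 -> log10(3)."""
    return math.log10(denominator / numerator)

print(snellen_to_logmar(6, 6))             # normal acuity → 0.0
print(round(snellen_to_logmar(6, 18), 2))  # the 6/18 subgroup cutoff → 0.48
```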

  5. Quantitative Assessment of Shockwave Lithotripsy Accuracy and the Effect of Respiratory Motion*

    PubMed Central

    Bailey, Michael R.; Shah, Anup R.; Hsi, Ryan S.; Paun, Marla; Harper, Jonathan D.

    2012-01-01

    Abstract Background and Purpose Effective stone comminution during shockwave lithotripsy (SWL) is dependent on precise three-dimensional targeting of the shockwave. Respiratory motion, imprecise targeting or shockwave alignment, and stone movement may compromise treatment efficacy. The purpose of this study was to evaluate the accuracy of shockwave targeting during SWL treatment and the effect of motion from respiration. Patients and Methods Ten patients underwent SWL for the treatment of 13 renal stones. Stones were targeted fluoroscopically using a Healthtronics Lithotron (five cases) or Dornier Compact Delta II (five cases) shockwave lithotripter. Shocks were delivered at a rate of 1 to 2 Hz with ramping shockwave energy settings of 14 to 26 kV or level 1 to 5. After the low energy pretreatment and protective pause, a commercial diagnostic ultrasound (US) imaging system was used to record images of the stone during active SWL treatment. Shockwave accuracy, defined as the proportion of shockwaves that resulted in stone motion with shockwave delivery, and respiratory stone motion were determined by two independent observers who reviewed the ultrasonographic videos. Results Mean age was 51±15 years with 60% men, and mean stone size was 10.5±3.7 mm (range 5–18 mm). A mean of 2675±303 shocks was delivered. Shockwave-induced stone motion was observed with every stone. Accurate targeting of the stone occurred in 60%±15% of shockwaves. Conclusions US imaging during SWL revealed that 40% of shockwaves miss the stone and contribute solely to tissue injury, primarily from movement with respiration. These data support the need for a device to deliver shockwaves only when the stone is in target. US imaging provides real-time assessment of stone targeting and accuracy of shockwave delivery. PMID:22471349

  6. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    SciTech Connect

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H.

    2008-02-15

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter, and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. 
We also found that an improvement of the spatial resolution through the
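The attenuation losses described above follow the narrow-beam exponential law; a sketch with textbook-order attenuation coefficients for water (the exact values here are illustrative assumptions, not the paper's):

```python
import math

def attenuated_fraction(mu_per_cm, depth_cm):
    """Narrow-beam photon transmission through water: exp(-mu * depth)."""
    return math.exp(-mu_per_cm * depth_cm)

# Roughly 0.35/cm near the ~30 keV I-125 emissions, ~0.15/cm at 140 keV (Tc-99m)
print(round(attenuated_fraction(0.35, 2.0), 2))  # I-125, ~2 cm of water → 0.5
print(round(attenuated_fraction(0.15, 2.0), 2))  # Tc-99m, same depth → 0.74
```

The ~50% and ~75% transmitted fractions line up with the isotope-dependent losses the abstract reports.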

  7. Assessing the accuracy of satellite derived global and national urban maps in Kenya.

    PubMed

    Tatem, A J; Noor, A M; Hay, S I

    2005-05-15

    Ninety percent of projected global urbanization will be concentrated in low income countries (United-Nations, 2004). This will have considerable environmental, economic, and public health implications for those populations. Objective and efficient methods of delineating urban extent are a cross-sectoral need, complicated by a diversity of urban definition rubrics world-wide. Large-area maps of urban extents are becoming increasingly available in the public domain, as are a wide range of medium spatial resolution satellite imagery. Here we describe the extension of a methodology based on Landsat ETM and Radarsat imagery to the production of a human settlement map of Kenya. This map was then compared, against an expert opinion coverage, with five satellite imagery-derived global maps of urban extent at the Kenya national level for accuracy assessment. The results showed the map produced using medium spatial resolution satellite imagery was of comparable accuracy to the expert opinion coverage. The five global urban maps exhibited a range of inaccuracies, emphasising that care should be taken with use of these maps at national and sub-national scales.
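Map-accuracy comparisons like the one above usually start from a confusion matrix against the reference coverage; a minimal overall-accuracy sketch with invented counts:

```python
def overall_accuracy(confusion):
    """Fraction of correctly classified samples; rows are mapped classes,
    columns are reference classes."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Hypothetical urban / non-urban counts against an expert-opinion reference
cm = [[80, 20],
      [10, 90]]
print(overall_accuracy(cm))  # → 0.85
```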

  8. Reconstruction Accuracy Assessment of Surface and Underwater 3D Motion Analysis: A New Approach.

    PubMed

    de Jesus, Kelly; de Jesus, Karla; Figueiredo, Pedro; Vilas-Boas, João Paulo; Fernandes, Ricardo Jorge; Machado, Leandro José

    2015-01-01

    This study assessed the accuracy of surface and underwater 3D reconstruction of a calibration volume with and without homography. A calibration volume (6000 × 2000 × 2500 mm) with 236 markers (64 above-water and 88 underwater control points, with 8 common points at the water surface, and 92 validation points) was positioned in a 25 m swimming pool and recorded with two surface and four underwater cameras. A planar homography estimate for each calibration plane was computed to perform image rectification. The direct linear transformation algorithm for 3D reconstruction was then applied, using 1,600,000 different combinations of 32 and 44 points out of the 64 and 88 control points for surface and underwater markers, respectively. The Root Mean Square (RMS) error of control and validation points was lower with homography than without it for surface and underwater cameras (P ≤ 0.03). With homography, RMS errors of control and validation points were similar between surface and underwater cameras (P ≥ 0.47). Without homography, the RMS error of control points was greater for underwater than surface cameras (P ≤ 0.04), and the opposite was observed for validation points (P ≤ 0.04). It is recommended that future studies using 3D reconstruction include homography to improve the accuracy of swimming movement analysis.
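The homography step above maps image points between planes with a 3×3 projective transform; a minimal sketch of applying one (the matrix is a toy example, not a calibrated one):

```python
def apply_homography(H, x, y):
    """Map an image point (x, y) through a 3x3 planar homography H (row-major)."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# A toy homography: pure translation by (+5, -3)
H = [[1, 0, 5],
     [0, 1, -3],
     [0, 0, 1]]
print(apply_homography(H, 10, 10))  # → (15.0, 7.0)
```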

  9. Assessing the Accuracy of a Knowledge-Based System: Special Education Regulations & Procedures. Final Report.

    ERIC Educational Resources Information Center

    Hofmeister, Alan M.

    The purpose of this research project was the development and initial validation of Mandate Consultant, an expert system that provides a second opinion of the appropriateness of the decision-making process used in the development of Individualized Education Programs with handicapped children. (Expert systems are a development of artificial…

  10. Assessing the accuracy of selectivity as a basis for solvent screening in extractive distillation processes

    SciTech Connect

    Momoh, S.O.

    1991-01-01

    An important parameter for consideration in the screening of solvents for an extractive distillation process is selectivity at infinite dilution. The higher the selectivity, the better the solvent. This paper assesses the accuracy of using selectivity as a basis for solvent screening in extractive distillation processes. Three types of binary mixtures that are usually separated by an extractive distillation process were chosen for investigation. Having determined the optimum solvent feed rate to be two times the feed rate of the binary mixture, the total annual costs of extractive distillation processes for each of the chosen mixtures and for various solvents were calculated. The solvents were ranked on the basis of the total annual cost (obtained by design and costing equations) for the extractive distillation processes, and this ranking order was compared with that of selectivity at infinite dilution as determined by the UNIFAC method. This matching of selectivity with total annual cost does not produce a very good correlation.

  11. Accuracy assessment of Kinect for Xbox One in point-based tracking applications

    NASA Astrophysics Data System (ADS)

    Goral, Adrian; Skalski, Andrzej

    2015-12-01

    We present the accuracy assessment of a point-based tracking system built on Kinect v2. In our approach, color, IR and depth data were used to determine the positions of spherical markers. To accomplish this task, we calibrated the depth/infrared and color cameras using a custom method. As a reference tool we used Polaris Spectra optical tracking system. The mean error obtained within the range from 0.9 to 2.9 m was 61.6 mm. Although the depth component of the error turned out to be the largest, the random error of depth estimation was only 1.24 mm on average. Our Kinect-based system also allowed for reliable angular measurements within the range of ±20° from the sensor's optical axis.

  12. Integrated three-dimensional digital assessment of accuracy of anterior tooth movement using clear aligners

    PubMed Central

    Zhang, Xiao-Juan; He, Li; Tian, Jie; Bai, Yu-Xing; Li, Song

    2015-01-01

    Objective To assess the accuracy of anterior tooth movement using clear aligners in integrated three-dimensional digital models. Methods Cone-beam computed tomography was performed before and after treatment with clear aligners in 32 patients. Plaster casts were laser-scanned for virtual setup and aligner fabrication. Differences in predicted and achieved root and crown positions of anterior teeth were compared on superimposed maxillofacial digital images and virtual models and analyzed by Student's t-test. Results The mean discrepancies in maxillary and mandibular crown positions were 0.376 ± 0.041 mm and 0.398 ± 0.037 mm, respectively. Maxillary and mandibular root positions differed by 2.062 ± 0.128 mm and 1.941 ± 0.154 mm, respectively. Conclusions Crowns but not roots of anterior teeth can be moved to designated positions using clear aligners, because these appliances cause tooth movement by tilting motion. PMID:26629473

  13. Accuracy assessment of human trunk surface 3D reconstructions from an optical digitising system.

    PubMed

    Pazos, V; Cheriet, F; Song, L; Labelle, H; Dansereau, J

    2005-01-01

    The lack of reliable techniques to follow up scoliotic deformity from the external asymmetry of the trunk leads to a general use of X-rays and indices of spinal deformity. Young adolescents with idiopathic scoliosis need intensive follow-ups for many years and, consequently, they are repeatedly exposed to ionising radiation, which is hazardous to their long-term health. Furthermore, treatments attempt to improve both spinal and surface deformities, but internal indices do not describe the external asymmetry. The purpose of this study was to assess a commercial, optical 3D digitising system for the 3D reconstruction of the entire trunk for clinical assessment of external asymmetry. The resulting surface is a textured, high-density polygonal mesh. The accuracy assessment was based on repeated reconstructions of a manikin with markers fixed on it. The average normal distance between the reconstructed surfaces and the reference data (markers measured with CMM) was 1.1 ± 0.9 mm. PMID:15742714

  14. Accuracy of Optimized Branched Algorithms to Assess Activity-Specific PAEE

    PubMed Central

    Edwards, Andy G.; Hill, James O.; Byrnes, William C.; Browning, Raymond C.

    2009-01-01

    PURPOSE To assess the activity-specific accuracy achievable by branched algorithm (BA) analysis of simulated daily-living physical activity energy expenditure (PAEE) within a sedentary population. METHODS Sedentary men (n=8) and women (n=8) first performed a treadmill calibration protocol, during which heart rate (HR), accelerometry (ACC), and PAEE were measured in 1-minute epochs. From these data, HR-PAEE and ACC-PAEE regressions were constructed and used in each of six analytic models to predict PAEE from ACC and HR data collected during a subsequent simulated daily-living protocol. Criterion PAEE was measured during both protocols via indirect calorimetry. The accuracy achieved by each model was assessed by the root mean square of the difference between model-predicted daily-living PAEE and the criterion daily-living PAEE (expressed here as % of mean daily-living PAEE). RESULTS Across the range of activities, an unconstrained post hoc optimized branched algorithm best predicted criterion PAEE. Estimates using individual calibration were generally more accurate than those using group calibration (14 vs. 16% error, respectively). These analyses also performed well within each of the six daily-living activities, but systematic errors appeared for several of those activities, which may be explained by an inability of the algorithm to simultaneously accommodate a heterogeneous range of activities. Analyses of mean square error between subjects and between activities suggest that optimization involving minimization of RMS for total daily-living PAEE is associated with decreased error between subjects but increased error between activities. CONCLUSION The performance of post hoc optimized branched algorithms may be limited by heterogeneity in the daily-living activities being performed. PMID:19952842
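A branched algorithm of the kind evaluated above routes each epoch to an HR- or ACC-based regression depending on signal thresholds; a minimal two-branch sketch (thresholds and calibration lines are invented for illustration):

```python
def predict_paee(acc, hr, acc_threshold, hr_threshold, acc_model, hr_model):
    """Use the HR regression when both signals indicate activity,
    otherwise fall back to the ACC regression."""
    if acc > acc_threshold and hr > hr_threshold:
        return hr_model(hr)
    return acc_model(acc)

# Hypothetical per-subject calibration lines from a treadmill protocol
acc_model = lambda counts: 0.01 * counts + 1.0   # PAEE from accelerometry
hr_model = lambda bpm: 0.05 * bpm - 2.0          # PAEE from heart rate

print(predict_paee(200, 70, 500, 100, acc_model, hr_model))   # ACC branch → 3.0
print(predict_paee(900, 130, 500, 100, acc_model, hr_model))  # HR branch
```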

  15. Improving the assessment of ICESat water altimetry accuracy accounting for autocorrelation

    NASA Astrophysics Data System (ADS)

    Abdallah, Hani; Bailly, Jean-Stéphane; Baghdadi, Nicolas; Lemarquand, Nicolas

    2011-11-01

    Given that water resources are scarce and are strained by competing demands, it has become crucial to develop and improve techniques to observe the temporal and spatial variations in the inland water volume. Due to the lack of data and the heterogeneity of water level stations, remote sensing, and especially altimetry from space, appear as complementary techniques for water level monitoring. In addition to spatial resolution and sampling rates in space or time, one of the most relevant criteria for satellite altimetry on inland water is the accuracy of the elevation data. Here, the accuracy of the ICESat LIDAR altimetry product is assessed over the Great Lakes in North America. The accuracy assessment method used in this paper emphasizes autocorrelation in high temporal frequency ICESat measurements, and also accounts for uncertainties in the in situ lake level reference data. A probabilistic upscaling process was developed. This process is based on several successive ICESat shots averaged in a spatial transect, accounting for autocorrelation between successive shots. The method also applies pre-processing of the ICESat data, with saturation correction of ICESat waveforms, spatial filtering to avoid measurement disturbance from the effects of land-water transitions on waveform saturation, and data selection to avoid trends in water elevations across space. Initially this paper analyzes 237 collected ICESat transects, consistent with the available hydrometric ground stations for four of the Great Lakes. Within a geostatistical framework, high frequency autocorrelation between successive shot elevation values was observed and then modeled for 45% of the 237 transects. The modeled autocorrelation was therefore used to estimate water elevations at the transect scale and the resulting uncertainty for the 117 transects without trend. This uncertainty was 8 times greater than the usual computed uncertainty, when no temporal correlation is taken into account. 
This
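The inflation of uncertainty under autocorrelation, central to the assessment above, can be sketched with the standard error of a mean of AR(1)-correlated shots (the sigma, n, and rho values below are illustrative, not the paper's):

```python
import math

def mean_std_error(sigma, n, rho=0.0):
    """Standard error of the mean of n equally spaced shots whose errors
    follow an AR(1) correlation rho; rho=0 recovers sigma / sqrt(n)."""
    s = sum((1 - k / n) * rho ** k for k in range(1, n))
    return sigma / math.sqrt(n) * math.sqrt(1 + 2 * s)

se_indep = mean_std_error(0.1, 50)           # independent shots
se_corr = mean_std_error(0.1, 50, rho=0.9)   # strongly autocorrelated shots
print(round(se_corr / se_indep, 1))  # autocorrelation inflates the uncertainty
```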

  16. Estimating covariate-adjusted measures of diagnostic accuracy based on pooled biomarker assessments.

    PubMed

    McMahan, Christopher S; McLain, Alexander C; Gallagher, Colin M; Schisterman, Enrique F

    2016-07-01

    There is a need for epidemiological and medical researchers to identify new biomarkers (biological markers) that are useful in determining exposure levels and/or for the purposes of disease detection. Often this process is stunted by high testing costs associated with evaluating new biomarkers. Traditionally, biomarker assessments are individually tested within a target population. Pooling has been proposed to help alleviate the testing costs, where pools are formed by combining several individual specimens. Methods for using pooled biomarker assessments to estimate discriminatory ability have been developed. However, all these procedures have failed to acknowledge confounding factors. In this paper, we propose a regression methodology based on pooled biomarker measurements that allows assessment of the discriminatory ability of a biomarker of interest. In particular, we develop covariate-adjusted estimators of the receiver-operating characteristic curve, the area under the curve, and Youden's index. We establish the asymptotic properties of these estimators and develop inferential techniques that allow one to assess whether a biomarker is a good discriminator between cases and controls, while controlling for confounders. The finite sample performance of the proposed methodology is illustrated through simulation. We apply our methods to analyze myocardial infarction (MI) data, with the goal of determining whether the pro-inflammatory cytokine interleukin-6 is a good predictor of MI after controlling for the subjects' cholesterol levels. PMID:26927583
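Youden's index, one of the covariate-adjusted quantities estimated above, is simple to compute in the unadjusted, unpooled case; a sketch with invented biomarker values:

```python
def youdens_index(cases, controls, threshold):
    """Sensitivity + specificity - 1 at a biomarker cutoff (higher values
    are assumed to indicate disease)."""
    sens = sum(x >= threshold for x in cases) / len(cases)
    spec = sum(x < threshold for x in controls) / len(controls)
    return sens + spec - 1

# Hypothetical case/control biomarker measurements
cases = [3.1, 2.8, 4.0, 3.5]
controls = [1.2, 2.0, 2.6, 1.8]
best_j, best_cutoff = max((youdens_index(cases, controls, t), t)
                          for t in cases + controls)
print(best_j, best_cutoff)  # → 1.0 2.8 (perfectly separated toy data)
```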

  17. A retrospective study to validate an intraoperative robotic classification system for assessing the accuracy of kirschner wire (K-wire) placements with postoperative computed tomography classification system for assessing the accuracy of pedicle screw placements

    PubMed Central

    Tsai, Tai-Hsin; Wu, Dong-Syuan; Su, Yu-Feng; Wu, Chieh-Hsin; Lin, Chih-Lung

    2016-01-01

    Abstract The purpose of this retrospective study was to validate an intraoperative robotic grading classification system for assessing the accuracy of Kirschner-wire (K-wire) placements against the postoperative computed tomography (CT)-based classification system for assessing the accuracy of pedicle screw placements. We conducted a retrospective review of prospectively collected data from 35 consecutive patients who underwent robotic-assisted instrumentation of 176 pedicle screws at Kaohsiung Medical University Hospital from September 2014 to November 2015. During the operation, we used a robotic grading classification system to verify the intraoperative accuracy of K-wire placements. Three months after surgery, we used the common CT-based classification system to assess the postoperative accuracy of pedicle screw placements. The distributions of accuracy between the intraoperative robot-assisted and various postoperative CT-based classification systems were compared using kappa statistics of agreement. The intraoperative accuracies of K-wire placements before and after repositioning were classified as excellent (131/176, 74.4% and 133/176, 75.6%, respectively), satisfactory (36/176, 20.5% and 41/176, 23.3%, respectively), and malpositioned (9/176, 5.1% and 2/176, 1.1%, respectively). Postoperative placements were then evaluated with the CT-based classification systems; no screw placements were evaluated as unacceptable under any of these systems. Kappa statistics revealed no significant differences between the proposed system and the aforementioned classification systems (P <0.001). Our results revealed no significant differences between the intraoperative robotic grading system and various postoperative CT-based grading systems. The robotic grading classification system is a feasible method for evaluating the accuracy of K-wire placements. Using the intraoperative robot grading system to classify the accuracy of K-wire placements enables predicting the postoperative accuracy of
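The kappa agreement used above compares observed to chance agreement between two grading systems; a minimal Cohen's kappa sketch with invented gradings (the counts do not match the study's):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    categories = set(ratings_a) | set(ratings_b)
    expected = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical intraoperative vs. postoperative grades for ten screws
robot = ["excellent"] * 6 + ["satisfactory"] * 3 + ["malpositioned"]
ct = ["excellent"] * 5 + ["satisfactory"] * 4 + ["malpositioned"]
print(round(cohens_kappa(robot, ct), 2))  # → 0.82
```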

  18. A retrospective study to validate an intraoperative robotic classification system for assessing the accuracy of kirschner wire (K-wire) placements with postoperative computed tomography classification system for assessing the accuracy of pedicle screw placements.

    PubMed

    Tsai, Tai-Hsin; Wu, Dong-Syuan; Su, Yu-Feng; Wu, Chieh-Hsin; Lin, Chih-Lung

    2016-09-01

    The purpose of this retrospective study was to validate an intraoperative robotic grading classification system for assessing the accuracy of Kirschner-wire (K-wire) placements against the postoperative computed tomography (CT)-based classification system for assessing the accuracy of pedicle screw placements. We conducted a retrospective review of prospectively collected data from 35 consecutive patients who underwent instrumentation with 176 robot-assisted pedicle screws at Kaohsiung Medical University Hospital from September 2014 to November 2015. During the operation, we used a robotic grading classification system for verifying the intraoperative accuracy of K-wire placements. Three months after surgery, we used the common CT-based classification system to assess the postoperative accuracy of pedicle screw placements. The distributions of accuracy between the intraoperative robot-assisted and various postoperative CT-based classification systems were compared using kappa statistics of agreement. The intraoperative accuracies of K-wire placements before and after repositioning were classified as excellent (131/176, 74.4% and 133/176, 75.6%, respectively), satisfactory (36/176, 20.5% and 41/176, 23.3%, respectively), and malpositioned (9/176, 5.1% and 2/176, 1.1%, respectively); postoperative accuracy was evaluated with the CT-based classification systems. No screw placements were evaluated as unacceptable under any of these systems. Kappa statistics revealed no significant differences between the proposed system and the aforementioned classification systems (P <0.001). Our results revealed no significant differences between the intraoperative robotic grading system and various postoperative CT-based grading systems. The robotic grading classification system is a feasible method for evaluating the accuracy of K-wire placements. Using the intraoperative robot grading system to classify the accuracy of K-wire placements enables predicting the postoperative accuracy of pedicle screw
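    The kappa statistic used above to compare the intraoperative and postoperative grading systems measures agreement corrected for chance. A minimal sketch with hypothetical gradings (E = excellent, S = satisfactory, M = malpositioned), not the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two classification systems,
    corrected for the agreement expected by chance alone."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical gradings of 10 screws by the two systems
intraop = list("EEESESEMEE")
postop  = list("EEESESEEEE")
print(round(cohens_kappa(intraop, postop), 3))  # -> 0.75
```

    Values near 1 indicate agreement well beyond chance; values near 0 indicate agreement no better than chance.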

  19. A simple test to assess the static and dynamic accuracy of an inertial sensors system for human movement analysis.

    PubMed

    Cutti, Andrea Giovanni; Giovanardi, Andrea; Rocchi, Laura; Davalli, Angelo

    2006-01-01

    In the present study we introduced a simple test to assess the orientation error of an inertial sensor system for human movement analysis, in both static and dynamic conditions. In particular, the test was intended to quantify the sensitivity of the orientation error to the direction and velocity of rotation. The test procedure was performed on an Xsens acquisition system with five MT9B sensors, and revealed that the system orientation error, expressed by Euler angle decomposition, was sensitive to both direction and velocity, being higher for fast movements: for mean rotation velocities of 180 degrees/s and 360 degrees/s, the worst-case orientation error was 5.4 degrees and 11.6 degrees, respectively. The test can therefore be suggested as a useful tool to verify the user-specific system accuracy without requiring any special equipment. In addition, the test provides further error information concerning the direction and velocity of the movement, which is not supplied by the producer, since it depends on the specific field of application. PMID:17946728

  20. Violence risk assessment and women: predictive accuracy of the HCR-20 in a civil psychiatric sample.

    PubMed

    Garcia-Mansilla, Alexandra; Rosenfeld, Barry; Cruise, Keith R

    2011-01-01

    Research to date has not adequately demonstrated whether the HCR-20 Violence Risk Assessment Scheme (HCR-20; Webster, Douglas, Eaves, & Hart, 1997), a structured violence risk assessment measure with a robust literature supporting its validity in male samples, is a valid indicator of violence risk in women. This study utilized data from the MacArthur Study of Mental Disorder and Violence to retrospectively score an abbreviated version of HCR-20 in 827 civil psychiatric patients. HCR-20 scores and predictive accuracy of community violence were compared for men and women. Results suggested that the HCR-20 is slightly, but not significantly, better for evaluating future risk for violence in men than in women, although the magnitude of the gender differences was small and was largely limited to historical factors. The results do not indicate that the HCR-20 needs to be tailored for use in women or that it should not be used in women, but they do highlight that the HCR-20 should be used cautiously and with full awareness of its potential limitations in women.

  1. Accuracy Assessment of GO Pro Hero 3 (black) Camera in Underwater Environment

    NASA Astrophysics Data System (ADS)

    Helmholz, P.; Long, J.; Munsie, T.; Belton, D.

    2016-06-01

    Modern digital cameras are increasing in quality whilst decreasing in size. In the last decade, a number of waterproof consumer digital cameras (action cameras) have become available, which often cost less than 500. A possible application of such action cameras is in the field of underwater photogrammetry, especially since the change of medium to water can in turn counteract the lens distortions present. The goal of this paper is to investigate the suitability of such action cameras for underwater photogrammetric applications, focusing on the stability of the camera and the accuracy of the derived coordinates. For this paper a series of image sequences was captured in a water tank. A calibration frame was placed in the water tank, allowing the calibration of the camera and the validation of the measurements using check points. The accuracy assessment covered three test sets operating three GoPro sports cameras of the same model (Hero 3 black). The test sets included handling the camera in a controlled manner, where the camera was only dunked into the water tank using 7MP and 12MP resolution, and rough handling, where the camera was shaken as well as removed from the waterproof case using 12MP resolution. The tests showed that camera stability was given, with a maximum standard deviation of the camera constant σc of 0.0031 mm for 7MP (for an average c of 2.720 mm) and 0.0072 mm for 12MP (for an average c of 3.642 mm). The residual test of the check points gave, for the 7MP test series, the largest rms value of only 0.450 mm and the largest maximal residual of only 2.5 mm. For the 12MP test series the maximum rms value is 0.653 mm.
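    The rms figures quoted for the check-point residuals are root-mean-square values over the set of residuals. A minimal sketch with hypothetical residuals in millimetres (not the paper's measurements):

```python
import math

def rms(residuals_mm):
    """Root-mean-square of a list of check-point residuals."""
    return math.sqrt(sum(r * r for r in residuals_mm) / len(residuals_mm))

# Hypothetical residuals (mm) for five check points
print(round(rms([0.3, -0.5, 0.2, 0.6, -0.4]), 3))  # -> 0.424
```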

  2. Research Procedures for Assessing the Effectiveness of Instructional Materials for Vocational Education.

    ERIC Educational Resources Information Center

    Briers, Gary E.; Williams, David L.

    1979-01-01

    The emphases of this investigation were to outline procedures to assess the effectiveness of instructional materials and to examine those procedures. Methods for determining sample size, constructing instruments, and modifying data are described. (SK)

  3. Diagnostic accuracy of refractometry for assessing bovine colostrum quality: A systematic review and meta-analysis.

    PubMed

    Buczinski, S; Vandeweerd, J M

    2016-09-01

    Provision of good quality colostrum [i.e., immunoglobulin G (IgG) concentration ≥50 g/L] is the first step toward ensuring proper passive transfer of immunity for young calves. Precise quantification of colostrum IgG levels cannot be easily performed on the farm. Assessment of the refractive index using a Brix scale with a refractometer has been described as being highly correlated with IgG concentration in colostrum. The aim of this study was to perform a systematic review of the diagnostic accuracy of Brix refractometry to diagnose good quality colostrum. From 101 references initially obtained, 11 were included in the systematic review and meta-analysis, representing 4,251 colostrum samples. The prevalence of good colostrum samples with IgG ≥50 g/L varied from 67.3 to 92.3% (median 77.9%). Specific estimates of accuracy [sensitivity (Se) and specificity (Sp)] were obtained for different reported cut-points using a hierarchical summary receiver operating characteristic curve model. For the cut-point of 22% (n=8 studies), Se=80.2% (95% CI: 71.1-87.0%) and Sp=82.6% (71.4-90.0%). Decreasing the cut-point to 18% increased Se [96.1% (91.8-98.2%)] and decreased Sp [54.5% (26.9-79.6%)]. Modeling the effect of these Brix accuracy estimates using a stochastic simulation and Bayes' theorem showed that a positive result with the 22% Brix cut-point can be used to diagnose good quality colostrum [posttest probability of good colostrum: 94.3% (90.7-96.9%)]. The posttest probability of good colostrum with a Brix value <18% was only 22.7% (12.3-39.2%). Based on this study, the 2 cut-points could be used alternatively to select good quality colostrum (samples with Brix ≥22%) or to discard poor quality colostrum (samples with Brix <18%). When sample results are between these 2 values, colostrum supplementation should be considered. PMID:27423958
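    The post-test probability quoted above follows from Bayes' theorem applied to sensitivity, specificity, and prevalence. A minimal sketch (a point estimate only, not the study's stochastic simulation) using the abstract's figures for the 22% Brix cut-point (Se = 80.2%, Sp = 82.6%, median prevalence 77.9%):

```python
def posttest_probability(sensitivity, specificity, prevalence):
    """Bayes' theorem: probability the condition is present (good colostrum)
    given a positive test result."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

p = posttest_probability(0.802, 0.826, 0.779)
print(round(p, 3))  # -> 0.942, close to the abstract's reported 94.3%
```

    The small difference from 94.3% is expected, since the study propagated uncertainty in Se, Sp, and prevalence rather than using single point estimates.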

  4. Assessment of Exceptional Students: Educational and Psychological Procedures. Fifth Edition.

    ERIC Educational Resources Information Center

    Taylor, Ronald L.

    This book provides information on the assessment of students with disabilities. It is divided into six major parts. Part 1, "Introduction to Assessment: Issues and Concerns," discusses the historical, philosophical, and legal foundations of assessment, introduces psychological assessments, and proposes an assessment model. Part 2, "Informal…

  5. Adapting Assessment Procedures for Delivery via an Automated Format.

    ERIC Educational Resources Information Center

    Kelly, Karen L.; And Others

    The Office of Personnel Management (OPM) decided to explore alternative examining procedures for positions covered by the Administrative Careers with America (ACWA) examination. One requirement for new procedures was that they be automated for use with OPM's recently developed Microcomputer Assisted Rating System (MARS), a highly efficient system…

  6. Accuracy in prescriptions compounded by pharmacy students.

    PubMed

    Shrewsbury, R P; Deloatch, K H

    1998-01-01

    Most compounded prescriptions are not analyzed to determine the accuracy of the employed instruments and procedures. The assumption is that the compounded prescription will be within +/- 5% of the labeled claim. Two classes of School of Pharmacy students who received repeated instruction and supervision on proper compounding techniques and procedures were assessed to determine their accuracy in compounding a diphenhydramine hydrochloride prescription. After two attempts, only 62% to 68% of the students could compound the prescription within +/- 5% of the labeled claim, but 84% to 96% could attain an accuracy of +/- 10%. The results suggest that an accuracy of +/- 10% of the labeled claim is the least variation a pharmacist can expect when extemporaneously compounding prescriptions.
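    The +/- 5% and +/- 10% labeled-claim criteria are simple relative-error checks. A minimal sketch with hypothetical assay values (the milligram amounts are illustrative, not from the study):

```python
def within_label_claim(measured_mg, labeled_mg, tolerance=0.05):
    """True if the assayed drug content is within +/- tolerance
    (as a fraction) of the labeled claim."""
    return abs(measured_mg - labeled_mg) <= tolerance * labeled_mg

# Hypothetical 25 mg diphenhydramine HCl prescription
print(within_label_claim(23.9, 25.0))        # within +/- 5%  -> True
print(within_label_claim(23.0, 25.0))        # outside +/- 5% -> False
print(within_label_claim(23.0, 25.0, 0.10))  # within +/- 10% -> True
```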

  7. Towards an assessment of the accuracy of density functional theory for first principles simulations of water

    NASA Astrophysics Data System (ADS)

    Grossman, Jeffrey C.; Schwegler, Eric; Draeger, Erik W.; Gygi, François; Galli, Giulia

    2004-01-01

    A series of Car-Parrinello (CP) molecular dynamics simulations of water are presented, aimed at assessing the accuracy of density functional theory in describing the structural and dynamical properties of water at ambient conditions. We found negligible differences in structural properties obtained using the Perdew-Burke-Ernzerhof or the Becke-Lee-Yang-Parr exchange and correlation energy functionals; we also found that size effects, although not fully negligible when using 32 molecule cells, are rather small. In addition, we identified a wide range of values of the fictitious electronic mass (μ) entering the CP Lagrangian for which the electronic ground state is accurately described, yielding trajectories and average properties that are independent of the value chosen. However, care must be exercised not to carry out simulations outside this range, where structural properties may artificially depend on μ. In the case of an accurate description of the electronic ground state, and in the absence of proton quantum effects, we obtained an oxygen-oxygen correlation function that is overstructured compared to experiment, and a diffusion coefficient which is approximately ten times smaller.

  8. Cold pressor stress induces opposite effects on cardioceptive accuracy dependent on assessment paradigm.

    PubMed

    Schulz, André; Lass-Hennemann, Johanna; Sütterlin, Stefan; Schächinger, Hartmut; Vögele, Claus

    2013-04-01

    Interoception depends on visceral afferent neurotraffic and central control processes. Physiological arousal and organ activation provide the biochemical and mechanical basis for visceral afferent neurotraffic. Perception of visceral symptoms occurs when attention is directed toward body sensations. Clinical studies suggest that stress contributes to the generation of visceral symptoms. However, during stress exposure attention is normally shifted away from bodily signals. Therefore, the net effects of stress on interoception remain unclear. We, therefore, investigated the impact of the cold pressor test or a control intervention (each n=21) on three established laboratory paradigms to assess cardioceptive accuracy (CA): for the Schandry-paradigm, participants were asked to count heartbeats, while during the Whitehead-tasks subjects were asked to rate whether a cardiac sensation appeared simultaneously with an auditory or visual stimulus. CA was increased by stress when attention was focused on visceral sensations (Schandry), while it decreased when attention was additionally directed toward external stimuli (visual Whitehead). Explanations for these results are offered in terms of internal versus external deployment of attention, as well as specific effects of the cold pressor on the cardiovascular system.

  9. Assessment of the Accuracy of the Bethe-Salpeter (BSE/GW) Oscillator Strengths.

    PubMed

    Jacquemin, Denis; Duchemin, Ivan; Blondel, Aymeric; Blase, Xavier

    2016-08-01

    Aiming to assess the accuracy of the oscillator strengths determined at the BSE/GW level, we performed benchmark calculations using three complementary sets of molecules. In the first, we considered ∼80 states in Thiel's set of compounds and compared the BSE/GW oscillator strengths to recently determined ADC(3/2) and CC3 reference values. The second set includes the oscillator strengths of the low-lying states of 80 medium to large dyes for which we have determined CC2/aug-cc-pVTZ values. The third set contains 30 anthraquinones for which experimental oscillator strengths are available. We find that BSE/GW accurately reproduces the trends for all series with excellent correlation coefficients to the benchmark data and generally very small errors. Indeed, for Thiel's sets, the BSE/GW values are more accurate (using CC3 references) than both CC2 and ADC(3/2) values on both absolute and relative scales. For all three sets, BSE/GW errors also tend to be nicely spread with almost equal numbers of positive and negative deviations as compared to reference values.

  10. The FES2014 tidal atlas, accuracy assessment for satellite altimetry and other geophysical applications

    NASA Astrophysics Data System (ADS)

    Lyard, Florent Henri; Carrère, Loren; Cancet, Mathilde; Boy, Jean-Paul; Gégout, Pascal; Lemoine, Jean-Michel

    2016-04-01

    The FES2014 tidal atlas (elaborated in a CNES-supported joint project involving the LEGOS laboratory, CLS and Noveltis) is the latest release in the FES atlas series. Based on finite element hydrodynamic modelling with data assimilation, the FES atlases are routinely improved by taking advantage of the increasing duration of satellite altimetry missions. However, the most remarkable improvement in the FES2014 atlas is the unprecedentedly low level of prior misfits (i.e., between the hydrodynamic simulations and data), typically less than 1.3 centimeters RMS for the ocean M2 tide. This makes the data assimilation step much more reliable and more consistent with the true tidal dynamics, especially in shelf and coastal seas, and diminishes the sensitivity of the accuracy to the observation distribution (extremely sparse or nonexistent at high latitudes). The FES2014 atlas has been validated and assessed in various geophysical applications (satellite altimetry corrections, gravimetry, etc.), showing significant improvements compared to previous FES releases and other state-of-the-art tidal atlases (such as DTU10, GOT4.8, TPXO8).

  11. Accuracy of Cameriere's cut-off value for third molar in assessing 18 years of age.

    PubMed

    De Luca, S; Biagi, R; Begnoni, G; Farronato, G; Cingolani, M; Merelli, V; Ferrante, L; Cameriere, R

    2014-02-01

    Due to increasingly numerous international migrations, estimating the age of unaccompanied minors is becoming of enormous significance for forensic professionals who are required to deliver expert opinions. The third molar tooth is one of the few anatomical sites available for estimating the age of individuals in late adolescence. This study verifies the accuracy of Cameriere's cut-off value of the third molar index (I3M) in assessing 18 years of age. For this purpose, a sample of orthopantomographs (OPTs) of 397 living subjects aged between 13 and 22 years (192 female and 205 male) was analyzed. Age distribution gradually decreases as I3M increases in both males and females. The results show that the sensitivity of the test was 86.6%, with a 95% confidence interval of (80.8%, 91.1%), and its specificity was 95.7%, with a 95% confidence interval of (92.1%, 98%). The proportion of correctly classified individuals was 91.4%. The estimated post-test probability was 95.6%, with a 95% confidence interval of (92%, 98%). Hence, the probability that a subject testing positive (i.e., I3M <0.08) was 18 years of age or older was 95.6%.

  12. Assessment of the Accuracy of the Bethe-Salpeter (BSE/GW) Oscillator Strengths.

    PubMed

    Jacquemin, Denis; Duchemin, Ivan; Blondel, Aymeric; Blase, Xavier

    2016-08-01

    Aiming to assess the accuracy of the oscillator strengths determined at the BSE/GW level, we performed benchmark calculations using three complementary sets of molecules. In the first, we considered ∼80 states in Thiel's set of compounds and compared the BSE/GW oscillator strengths to recently determined ADC(3/2) and CC3 reference values. The second set includes the oscillator strengths of the low-lying states of 80 medium to large dyes for which we have determined CC2/aug-cc-pVTZ values. The third set contains 30 anthraquinones for which experimental oscillator strengths are available. We find that BSE/GW accurately reproduces the trends for all series with excellent correlation coefficients to the benchmark data and generally very small errors. Indeed, for Thiel's sets, the BSE/GW values are more accurate (using CC3 references) than both CC2 and ADC(3/2) values on both absolute and relative scales. For all three sets, BSE/GW errors also tend to be nicely spread with almost equal numbers of positive and negative deviations as compared to reference values. PMID:27403612

  13. Accuracy assessment of the large-scale dynamic ocean topography from TOPEX/POSEIDON altimetry

    NASA Technical Reports Server (NTRS)

    Tapley, B. D.; Chambers, D. P.; Shum, C. K.; Eanes, R. J.; Ries, J. C.; Stewart, R. H.

    1994-01-01

    The quality of TOPEX/POSEIDON determinations of the global scale dynamic ocean topography has been assessed by determining mean topography solutions for successive 10-day repeat cycles and by examining the temporal changes in the sea surface topography to identify known features. The assessment is based on the analysis of TOPEX altimeter data cycles 1 through 36. Important errors in the tide model used to correct the altimeter data have been identified. The errors were reduced significantly by use of a new tide model derived with the TOPEX/POSEIDON measurements. Maps of the global 1-year mean topography, produced using four of the most accurate models of the marine geoid, show expected features, such as the known annual hemispherical sea surface rise and fall and the seasonal variability due to monsoon influence in the Indian Ocean. Changes in the sequence of 10-day topography maps show the development and propagation of an equatorial Kelvin wave in the Pacific beginning in December 1992 with a propagation velocity of approximately 3 m/s. The observations are consistent with observed changes in the equatorial trade winds, and with tide gauge and other in situ observations of the strengthening of the El Nino. Comparison of TOPEX-determined sea surface height at points near oceanic tide gauges shows agreement at the 4 cm root-mean-square (RMS) level over the tropical Pacific. The results show that the TOPEX altimeter data set can be used to map the ocean surface with a temporal resolution of 10 days and an accuracy that is consistent with traditional in situ methods for the determination of sea level variations.

  14. Assessing the Accuracy of Sentinel-3 SLSTR Sea-Surface Temperature Retrievals Using High Accuracy Infrared Radiometers on Ships of Opportunity

    NASA Astrophysics Data System (ADS)

    Minnett, P. J.; Izaguirre, M. A.; Szcszodrak, M.; Williams, E.; Reynolds, R. M.

    2015-12-01

    The assessment of errors and uncertainties in satellite-derived SSTs can be achieved by comparisons with independent measurements of skin SST of high accuracy. Such validation measurements are provided by well-calibrated infrared radiometers mounted on ships. The second generation of Marine-Atmospheric Emitted Radiance Interferometers (M-AERIs) have recently been developed and two are now deployed on cruise ships of Royal Caribbean Cruise Lines that operate in the Caribbean Sea, North Atlantic and Mediterranean Sea. In addition, two Infrared SST Autonomous Radiometers (ISARs) are mounted alternately on a vehicle transporter of NYK Lines that crosses the Pacific Ocean between Japan and the USA. Both M-AERIs and ISARs are self-calibrating radiometers having two internal blackbody cavities to provide at-sea calibration of the measured radiances, and the accuracy of the internal calibration is periodically determined by measurements of a NIST-traceable blackbody cavity in the laboratory. This provides SI-traceability for the at-sea measurements. It is anticipated that these sensors will be deployed during the next several years and will be available for the validation of the SLSTRs on Sentinel-3a and -3b.

  15. Mind-reading accuracy in intimate relationships: assessing the roles of the relationship, the target, and the judge.

    PubMed

    Thomas, Geoff; Fletcher, Garth J O

    2003-12-01

    Using a video-review procedure, multiple perceivers carried out mind-reading tasks for multiple targets at different levels of acquaintanceship (50 dating couples, friends of the dating partners, and strangers). As predicted, the authors found that mind-reading accuracy was (a) higher as a function of increased acquaintanceship, (b) relatively unaffected by target effects, (c) influenced by individual differences in perceivers' ability, and (d) higher for female than male perceivers. In addition, superior mind-reading accuracy (for dating couples and friends) was related to higher relationship satisfaction, closeness, and more prior disclosure about the problems discussed, but only under moderating conditions related to sex and relationship length. The authors conclude that the nature of the relationship between the perceiver and the target occupies a pivotal role in determining mind-reading accuracy.

  16. Using Generalizability Theory to Examine the Accuracy and Validity of Large-Scale ESL Writing Assessment

    ERIC Educational Resources Information Center

    Huang, Jinyan

    2012-01-01

    Using generalizability (G-) theory, this study examined the accuracy and validity of the writing scores assigned to secondary school ESL students in the provincial English examinations in Canada. The major research question that guided this study was: Are there any differences between the accuracy and construct validity of the analytic scores…

  17. Using allometric procedures to substantiate the plastochrone method for eelgrass leaf growth assessments.

    PubMed

    Echavarría-Heras, Héctor; Solana-Arellano, Elena; Leal-Ramírez, Cecilia; Castillo, Oscar

    2013-01-01

    Estimation of leaf productivity in eelgrass (Zostera marina L.) is crucial for evaluating the ecological role of this important seagrass species. Although leaf marking techniques are widely used to obtain estimates of leaf productivity, the accuracy of these assessments has been questioned, mainly because they fail to account for leaf growth below the reference mark and also because they apparently disregard the contribution of mature leaf tissues to the growth rate of leaves. On the other hand, the plastochrone method is a simpler technique that has been considered to effectively capture growth in a more realistic way, thereby providing more accurate assessments of both above- and below-ground productivities. But since the actual values of eelgrass growth rates are difficult to obtain, the worth of the plastochrone method has been largely vindicated because it produces assessments that overestimate productivity as compared to estimates obtained by leaf marking. Additionally, whenever eelgrass leaf biomass can be allometrically scaled in terms of matching leaf length in a consistent way, the associated leaf growth rates can also be projected allometrically. In this contribution, we used that approach to derive an authentication of the plastochrone method and formally demonstrate that, as has been claimed to occur for leaf marking approaches, the plastochrone method itself underestimates actual values of eelgrass leaf growth rates. We also show that this unavoidable bias is mainly due to the inadequacy of single-leaf biomass assessments in providing a proxy for the growth of all leaf tissue in a shoot over a given interval. Moreover, the derived formulae give conditions under which assessments of leaf growth rates using the plastochrone method would systematically underestimate matching values obtained by leaf marking procedures. And, assessments of leaf growth rates obtained by using the present data show that plastochrone method estimations underestimated

  18. Using allometric procedures to substantiate the plastochrone method for eelgrass leaf growth assessments

    PubMed Central

    2013-01-01

    Estimation of leaf productivity in eelgrass (Zostera marina L.) is crucial for evaluating the ecological role of this important seagrass species. Although leaf marking techniques are widely used to obtain estimates of leaf productivity, the accuracy of these assessments has been questioned, mainly because they fail to account for leaf growth below the reference mark and also because they apparently disregard the contribution of mature leaf tissues to the growth rate of leaves. On the other hand, the plastochrone method is a simpler technique that has been considered to effectively capture growth in a more realistic way, thereby providing more accurate assessments of both above- and below-ground productivities. But since the actual values of eelgrass growth rates are difficult to obtain, the worth of the plastochrone method has been largely vindicated because it produces assessments that overestimate productivity as compared to estimates obtained by leaf marking. Additionally, whenever eelgrass leaf biomass can be allometrically scaled in terms of matching leaf length in a consistent way, the associated leaf growth rates can also be projected allometrically. In this contribution, we used that approach to derive an authentication of the plastochrone method and formally demonstrate that, as has been claimed to occur for leaf marking approaches, the plastochrone method itself underestimates actual values of eelgrass leaf growth rates. We also show that this unavoidable bias is mainly due to the inadequacy of single-leaf biomass assessments in providing a proxy for the growth of all leaf tissue in a shoot over a given interval. Moreover, the derived formulae give conditions under which assessments of leaf growth rates using the plastochrone method would systematically underestimate matching values obtained by leaf marking procedures. And, assessments of leaf growth rates obtained by using the present data show that plastochrone method estimations underestimated

  19. Accuracy Assessment of Mobile Mapping Point Clouds Using the Existing Environment as Terrestrial Reference

    NASA Astrophysics Data System (ADS)

    Hofmann, S.; Brenner, C.

    2016-06-01

    Mobile mapping data is widely used in various applications, which makes it especially important for data users to get a statistically verified quality statement on the geometric accuracy of the acquired point clouds or their processed products. The accuracy of point clouds can be divided into an absolute and a relative quality, where the absolute quality describes the position of the point cloud in a world coordinate system such as WGS84 or UTM, whereas the relative accuracy describes the accuracy within the point cloud itself. Furthermore, the quality of processed products such as segmented features depends on the global accuracy of the point cloud but mainly on the quality of the processing steps. Several data sources with different characteristics and quality can be thought of as potential reference data, such as cadastral maps, orthophotos, artificial control objects or terrestrial surveys using a total station. In this work a test field in a selected residential area was acquired as reference data in a terrestrial survey using a total station. In order to reach high accuracy, the stationing of the total station was based on a newly established geodetic network with a local accuracy of less than 3 mm. The global position of the network was determined using a long-term GNSS survey, reaching an accuracy of 8 mm. Based on this geodetic network, a 3D test field with facades and street profiles was measured with a total station, each point with a two-dimensional position and altitude. In addition, the surfaces of poles of street lights, traffic signs and trees were acquired using the scanning mode of the total station. By comparing this reference data to the mobile mapping point clouds acquired in several measurement campaigns, a detailed quality statement on the accuracy of the point cloud data is made. Additionally, the advantages and disadvantages of the described reference data sources concerning availability, cost, accuracy and applicability are discussed.

  20. A General Factor-Analytic Procedure for Assessing Response Bias in Questionnaire Measures

    ERIC Educational Resources Information Center

    Ferrando, Pere J.; Lorenzo-Seva, Urbano; Chico, Eliseo

    2009-01-01

    This article proposes procedures for simultaneously assessing and controlling acquiescence and social desirability in questionnaire items. The procedures are based on a semi-restricted factor-analytic tridimensional model, and can be used with binary, graded-response, or more continuous items. We discuss procedures for fitting the model (item…

  1. Designing a Multi-Objective Multi-Support Accuracy Assessment of the 2001 National Land Cover Data (NLCD 2001) of the Conterminous United States

    EPA Science Inventory

    The database design and diverse application of NLCD 2001 pose significant challenges for accuracy assessment because numerous objectives are of interest, including accuracy of land cover, percent urban imperviousness, percent tree canopy, land-cover composition, and net change. ...

  2. 78 FR 46905 - Tobacco Transition Program; Final Assessment Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ...; ] DEPARTMENT OF AGRICULTURE Commodity Credit Corporation Tobacco Transition Program; Final Assessment... information about the final quarterly assessments for the Tobacco Transition Program (TTP). Through the Tobacco Transition Payment Program (TTPP), which is part of the TTP, eligible former tobacco quota...

  3. A procedure for merging land cover/use data from Landsat, aerial photography, and map sources - Compatibility, accuracy and cost

    NASA Technical Reports Server (NTRS)

    Enslin, W. R.; Tilmann, S. E.; Hill-Rowley, R.; Rogers, R. H.

    1977-01-01

    A method is developed to merge land cover/use data from Landsat, aerial photography and map sources into a grid-based geographic information system. The method basically involves computer-assisted categorization of Landsat data to provide certain user-specified land cover categories; manual interpretation of aerial photography to identify other selected land cover/use categories that cannot be obtained from Landsat data; identification of special features from aerial photography or map sources; merging of the interpreted data from all the sources into a computer compatible file under a standardized coding structure; and the production of land cover/use maps, thematic maps, and tabular data. The specific tasks accomplished in producing the merged land cover/use data file and subsequent output products are identified and discussed. It is shown that effective implementation of the merging method is critically dependent on selecting the 'best' data source for each user-specified category in terms of accuracy and time/cost tradeoffs.

  4. Formative Assessment in HL Teaching: Purposes, Procedures, and Practices

    ERIC Educational Resources Information Center

    Carreira, Maria M.

    2012-01-01

    Discussions surrounding assessment in the foreign languages generally focus on the two ends of the teaching/learning process: diagnostic assessment, typically used for placement purposes and administered prior to the start of instruction, and summative assessment, which evaluates learning after instruction for purposes of assigning a grade or…

  5. 30 CFR 723.18 - Procedures for assessment conference.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Office shall arrange for a conference to review the proposed assessment or reassessment, upon written... Office and by the person assessed; or (ii) Affirm, raise, lower, or vacate the penalty. (4) An increase... conference officer shall promptly serve the person assessed with a notice of his or her action in the...

  6. Accuracy Assessment of Crown Delineation Methods for the Individual Trees Using LIDAR Data

    NASA Astrophysics Data System (ADS)

    Chang, K. T.; Lin, C.; Lin, Y. C.; Liu, J. K.

    2016-06-01

    Forest canopy density and height are used as variables in a number of environmental applications, including the estimation of biomass, forest extent and condition, and biodiversity. Airborne Light Detection and Ranging (LiDAR) is very useful for estimating forest canopy parameters from the generated canopy height models (CHMs). The purpose of this work is to introduce an algorithm that delineates crown parameters, e.g. tree height and crown radii, from rasterized CHMs, and to assess the accuracy of the extracted volumetric parameters of single trees against manual measurements on corresponding aerial photo pairs. A LiDAR dataset of a golf course acquired by a Leica ALS70-HP is used in this study. Two algorithms are first used to generate the CHMs: a traditional approach that subtracts a digital elevation model (DEM) from a digital surface model (DSM), and a pit-free approach. Two further algorithms, a multilevel morphological active-contour (MMAC) and a variable window filter (VWF), are then implemented for individual tree delineation. Finally, the results of the two automatic estimation methods are evaluated against manually measured stand-level parameters, i.e. tree height and crown diameter. The CHM generated by simple subtraction is full of empty pixels ("pits") that strongly affect the subsequent individual tree delineation. The experimental results indicate that after the pit-free process, more individual trees can be extracted and tree crown shapes become more complete in the CHM data.
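
    The "traditional" CHM computation referred to above is a per-cell subtraction of the DEM from the DSM. A minimal sketch, with grids as nested lists (illustrative only, not the authors' implementation; the published pit-free algorithm is considerably more involved than a simple clamp):

```python
def canopy_height_model(dsm, dem):
    """Traditional CHM generation: per-cell subtraction of the ground
    elevation (DEM) from the first-return surface (DSM). Heights are
    clamped at zero, since noise can put the surface below the ground.
    Cells where a return penetrated the canopy remain as 'pits'."""
    return [[max(s - g, 0.0) for s, g in zip(srow, grow)]
            for srow, grow in zip(dsm, dem)]
```

    The pits this leaves in dense canopy are exactly what degrades the subsequent MMAC/VWF tree delineation, which is why the pit-free CHM performs better in the experiments.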

  7. Accuracy of Panoramic Radiograph in Assessment of the Relationship Between Mandibular Canal and Impacted Third Molars

    PubMed Central

    Tantanapornkul, Weeraya; Mavin, Darika; Prapaiphittayakun, Jaruthai; Phipatboonyarat, Natnicha; Julphantong, Wanchanok

    2016-01-01

    Background: The relationship between impacted mandibular third molar and mandibular canal is important for removal of this tooth. Panoramic radiography is one of the commonly used diagnostic tools for evaluating the relationship of these two structures. Objectives: To evaluate the accuracy of panoramic radiographic findings in predicting direct contact between mandibular canal and impacted third molars on 3D digital images, and to define panoramic criterion in predicting direct contact between the two structures. Methods: Two observers examined panoramic radiographs of 178 patients (256 impacted mandibular third molars). Panoramic findings of interruption of mandibular canal wall, isolated or with darkening of third molar root, diversion of mandibular canal and narrowing of third molar root were evaluated for 3D digital radiography. Direct contact between mandibular canal and impacted third molars on 3D digital images was then correlated with panoramic findings. Panoramic criterion was also defined in predicting direct contact between the two structures. Results: Panoramic findings of interruption of mandibular canal wall, isolated or with darkening of third molar root were statistically significantly correlated with direct contact between mandibular canal and impacted third molars on 3D digital images (p < 0.005), and were defined as panoramic criteria in predicting direct contact between the two structures. Conclusion: Interruption of mandibular canal wall, isolated or with darkening of third molar root observed on panoramic radiographs were effective in predicting direct contact between mandibular canal and impacted third molars on 3D digital images. Panoramic radiography is one of the efficient diagnostic tools for pre-operative assessment of impacted mandibular third molars. PMID:27398105

  8. How Nonrecidivism Affects Predictive Accuracy: Evidence from a Cross-Validation of the Ontario Domestic Assault Risk Assessment (ODARA)

    ERIC Educational Resources Information Center

    Hilton, N. Zoe; Harris, Grant T.

    2009-01-01

    Prediction effect sizes such as ROC area are important for demonstrating a risk assessment's generalizability and utility. How a study defines recidivism might affect predictive accuracy. Nonrecidivism is problematic when predicting specialized violence (e.g., domestic violence). The present study cross-validates the ability of the Ontario…

  9. A TECHNIQUE FOR ASSESSING THE ACCURACY OF SUB-PIXEL IMPERVIOUS SURFACE ESTIMATES DERIVED FROM LANDSAT TM IMAGERY

    EPA Science Inventory

    We developed a technique for assessing the accuracy of sub-pixel derived estimates of impervious surface extracted from LANDSAT TM imagery. We utilized spatially coincident sub-pixel derived impervious surface estimates, high-resolution planimetric GIS data, vector-to-r...

  10. Diagnostic Accuracy of Computer-Aided Assessment of Intranodal Vascularity in Distinguishing Different Causes of Cervical Lymphadenopathy.

    PubMed

    Ying, Michael; Cheng, Sammy C H; Ahuja, Anil T

    2016-08-01

    Ultrasound is useful in assessing cervical lymphadenopathy. Advancement of computer science technology allows accurate and reliable assessment of medical images. The aim of the study described here was to evaluate the diagnostic accuracy of computer-aided assessment of the intranodal vascularity index (VI) in differentiating the various common causes of cervical lymphadenopathy. Power Doppler sonograms of 347 patients (155 with metastasis, 23 with lymphoma, 44 with tuberculous lymphadenitis, 125 reactive) with palpable cervical lymph nodes were reviewed. Ultrasound images of cervical nodes were evaluated, and the intranodal VI was quantified using a customized computer program. The diagnostic accuracy of using the intranodal VI to distinguish different disease groups was evaluated and compared. Metastatic and lymphomatous lymph nodes tend to be more vascular than tuberculous and reactive lymph nodes. The intranodal VI had the highest diagnostic accuracy in distinguishing metastatic and tuberculous nodes with a sensitivity of 80%, specificity of 73%, positive predictive value of 91%, negative predictive value of 51% and overall accuracy of 68% when a cutoff VI of 22% was used. Computer-aided assessment provides an objective and quantitative way to evaluate intranodal vascularity. The intranodal VI is a useful parameter in distinguishing certain causes of cervical lymphadenopathy and is particularly useful in differentiating metastatic and tuberculous lymph nodes. However, it has limited value in distinguishing lymphomatous nodes from metastatic and reactive nodes.
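
    The sensitivity, specificity, PPV, NPV, and overall accuracy reported above all follow from the 2x2 table obtained at a given VI cutoff. A generic sketch of those standard definitions (illustrative; not the study's customized software):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic-accuracy metrics from a 2x2 table:
    tp/fp/tn/fn are counts of true/false positives/negatives
    at a chosen classification cutoff (e.g. intranodal VI > 22%)."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }
```

    Sweeping the cutoff and recomputing these metrics is how a value such as VI = 22% is selected as the operating point.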

  11. Classification Accuracy of Oral Reading Fluency and Maze in Predicting Performance on Large-Scale Reading Assessments

    ERIC Educational Resources Information Center

    Decker, Dawn M.; Hixson, Michael D.; Shaw, Amber; Johnson, Gloria

    2014-01-01

    The purpose of this study was to examine whether using a multiple-measure framework yielded better classification accuracy than oral reading fluency (ORF) or maze alone in predicting pass/fail rates for middle-school students on a large-scale reading assessment. Participants were 178 students in Grades 7 and 8 from a Midwestern school district.…

  12. 45 CFR 5.44 - Procedures for assessing and collecting fees.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... have a history of prompt payment. We may also, at our discretion, aggregate the charges for certain... 45 Public Welfare 1 2010-10-01 2010-10-01 false Procedures for assessing and collecting fees. 5.44... INFORMATION REGULATIONS Fees § 5.44 Procedures for assessing and collecting fees. (a) Agreement to pay....

  13. First Language of Test Takers and Fairness Assessment Procedures

    ERIC Educational Resources Information Center

    Sinharay, Sandip; Dorans, Neil J.; Liang, Longjuan

    2011-01-01

    Over the past few decades, those who take tests in the United States have exhibited increasing diversity with respect to native language. Standard psychometric procedures for ensuring item and test fairness that have existed for some time were developed when test-taking groups were predominantly native English speakers. A better understanding of…

  14. Assessing Children's Implicit Attitudes Using the Affect Misattribution Procedure

    ERIC Educational Resources Information Center

    Williams, Amanda; Steele, Jennifer R.; Lipman, Corey

    2016-01-01

    In the current research, we examined whether the Affect Misattribution Procedure (AMP) could be successfully adapted as an implicit measure of children's attitudes. We tested this possibility in 3 studies with 5- to 10-year-old children. In Study 1, we found evidence that children misattribute affect elicited by attitudinally positive (e.g., cute…

  15. Cognitive Styles in Admission Procedures for Assessing Candidates of Architecture

    ERIC Educational Resources Information Center

    Casakin, Hernan; Gigi, Ariela

    2016-01-01

    Cognitive style has a strong predictive power in academic and professional success. This study investigated the cognitive profile of candidates studying architecture. Specifically, it explored the relation between visual and verbal cognitive styles, and the performance of candidates in admission procedures. The cognitive styles of candidates who…

  16. Report of the Project on Contracting, Review, and Assessment Procedures.

    ERIC Educational Resources Information Center

    Vermont Community Colleges, Montpelier.

    The broad purpose of this project was to examine all aspects of the Community College of Vermont (CCV) review and contracting process and to recommend changes to the CCV Review Board and Decision Team for implementation. Examination focused on the present program, the contract format, Local Review Committee procedures, the counseling process, and…

  17. 77 FR 39572 - Assessment of Mediation and Arbitration Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-03

    ... Arbitration Procedures, 75 FR 52,054. The Board received input and issued a decision proposing new regulations..., 77 FR 19,591. The Board sought comments on the proposed regulations by May 17, 2012, and replies by... their positions. BOARD RELEASES AND LIVE VIDEO STREAMING AVAILABLE VIA THE INTERNET: Decisions...

  18. Generalized Procedure for Improved Accuracy of Thermal Contact Resistance Measurements for Materials With Arbitrary Temperature-Dependent Thermal Conductivity

    DOE PAGES

    Sayer, Robert A.

    2014-06-26

    Thermal contact resistance (TCR) is most commonly measured using one-dimensional steady-state calorimetric techniques. In the experimental methods we utilized, a temperature gradient is applied across two contacting beams and the temperature drop at the interface is inferred from the temperature profiles of the rods that are measured at discrete points. During data analysis, thermal conductivity of the beams is typically taken to be an average value over the temperature range imposed during the experiment. Our generalized theory is presented and accounts for temperature-dependent changes in thermal conductivity. The procedure presented enables accurate measurement of TCR for contacting materials whose thermal conductivity is any arbitrary function of temperature. For example, it is shown that the standard technique yields TCR values that are about 15% below the actual value for two specific examples of copper and silicon contacts. Conversely, the generalized technique predicts TCR values that are within 1% of the actual value. The method is exact when thermal conductivity is known exactly and no other errors are introduced to the system.

  19. Generalized Procedure for Improved Accuracy of Thermal Contact Resistance Measurements for Materials With Arbitrary Temperature-Dependent Thermal Conductivity

    SciTech Connect

    Sayer, Robert A.

    2014-06-26

    Thermal contact resistance (TCR) is most commonly measured using one-dimensional steady-state calorimetric techniques. In the experimental methods we utilized, a temperature gradient is applied across two contacting beams and the temperature drop at the interface is inferred from the temperature profiles of the rods that are measured at discrete points. During data analysis, thermal conductivity of the beams is typically taken to be an average value over the temperature range imposed during the experiment. Our generalized theory is presented and accounts for temperature-dependent changes in thermal conductivity. The procedure presented enables accurate measurement of TCR for contacting materials whose thermal conductivity is any arbitrary function of temperature. For example, it is shown that the standard technique yields TCR values that are about 15% below the actual value for two specific examples of copper and silicon contacts. Conversely, the generalized technique predicts TCR values that are within 1% of the actual value. The method is exact when thermal conductivity is known exactly and no other errors are introduced to the system.
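
    The standard constant-k procedure both records describe can be sketched as: fit a linear temperature profile to each rod's thermocouple readings, extrapolate both fits to the interface, and divide the temperature jump by the heat flux. This is an illustrative reconstruction under the constant-conductivity assumption, not the paper's generalized method:

```python
def linear_fit(x, y):
    """Least-squares slope and intercept for thermocouple
    positions x (m) and temperatures y (K)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def tcr_standard(x_hot, t_hot, x_cold, t_cold, k, x_interface):
    """Constant-k TCR estimate: extrapolate each rod's linear
    temperature profile to the interface, take the temperature jump,
    and divide by the heat flux q = -k * dT/dx (hot side)."""
    s1, b1 = linear_fit(x_hot, t_hot)
    s2, b2 = linear_fit(x_cold, t_cold)
    delta_t = (s1 * x_interface + b1) - (s2 * x_interface + b2)
    q = -k * s1                     # 1-D steady-state heat flux, W/m^2
    return delta_t / q              # contact resistance per unit area
```

    The generalized procedure in the abstract replaces the single averaged k with the material's actual k(T), which is what removes the roughly 15% bias noted for copper and silicon contacts.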

  20. 49 CFR 1540.205 - Procedures for security threat assessment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) TRANSPORTATION SECURITY ADMINISTRATION, DEPARTMENT OF HOMELAND SECURITY CIVIL AVIATION SECURITY CIVIL AVIATION... TSA determines that the applicant meets the security threat assessment standards in 49 CFR 1540.201(c... the applicant does not meet the security threat assessment standards in 49 CFR 1540.201(c)....

  1. A priori evaluation of two-stage cluster sampling for accuracy assessment of large-area land-cover maps

    USGS Publications Warehouse

    Wickham, J.D.; Stehman, S.V.; Smith, J.H.; Wade, T.G.; Yang, L.

    2004-01-01

    Two-stage cluster sampling reduces the cost of collecting accuracy assessment reference data by constraining sample elements to fall within a limited number of geographic domains (clusters). However, because classification error is typically positively spatially correlated, within-cluster correlation may reduce the precision of the accuracy estimates. The detailed population information to quantify a priori the effect of within-cluster correlation on precision is typically unavailable. Consequently, a convenient, practical approach to evaluate the likely performance of a two-stage cluster sample is needed. We describe such an a priori evaluation protocol focusing on the spatial distribution of the sample by land-cover class across different cluster sizes and costs of different sampling options, including options not imposing clustering. This protocol also assesses the two-stage design's adequacy for estimating the precision of accuracy estimates for rare land-cover classes. We illustrate the approach using two large-area, regional accuracy assessments from the National Land-Cover Data (NLCD), and describe how the a priori evaluation was used as a decision-making tool when implementing the NLCD design.
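
    The precision penalty from within-cluster correlation mentioned above is often summarized by the textbook design effect, deff = 1 + (m - 1) * rho. This is the classic approximation, not the paper's a priori protocol:

```python
def design_effect(cluster_size, icc):
    """Classic design-effect approximation for cluster sampling:
    deff = 1 + (m - 1) * rho, where m is the average number of
    sample elements per cluster and rho the intracluster
    correlation of classification error."""
    return 1.0 + (cluster_size - 1) * icc

def effective_sample_size(n, cluster_size, icc):
    """Number of independent-equivalent samples after clustering:
    a clustered sample of n elements carries as much information
    as n / deff simple-random elements."""
    return n / design_effect(cluster_size, icc)
```

    For example, 10 elements per cluster with rho = 0.1 nearly doubles the variance (deff = 1.9), which is the trade-off against the reduced data-collection cost that the protocol evaluates.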

  2. Ecological risk assessment and natural resource damage assessment: synthesis of assessment procedures.

    PubMed

    Gala, William; Lipton, Joshua; Cernera, Phil; Ginn, Thomas; Haddad, Robert; Henning, Miranda; Jahn, Kathryn; Landis, Wayne; Mancini, Eugene; Nicoll, James; Peters, Vicky; Peterson, Jennifer

    2009-10-01

    The Society of Environmental Toxicology and Chemistry (SETAC) convened an invited workshop (August 2008) to address coordination between ecological risk assessment (ERA) and natural resource damage assessment (NRDA). Although ERA and NRDA activities are performed under a number of statutory and regulatory authorities, the primary focus of the workshop was on ERA and NRDA as currently practiced in the United States under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). This paper presents the findings and conclusions of the Synthesis Work Group, 1 of 3 work groups convened at the workshop. The Synthesis Work Group concluded that the different programmatic objectives and legal requirements of the 2 processes preclude development of a single, integrated ERA/NRDA process. However, although institutional and programmatic impediments exist to integration of the 2 processes, parties are capitalizing on opportunities to coordinate technical and scientific elements of the assessments at a number of locations. Although it is important to recognize and preserve the distinctions between ERA and NRDA, opportunities for data sharing exist, particularly for the characterization of environmental exposures and derivation of ecotoxicological information. Thus, effective coordination is not precluded by the underlying science. Rather, willing participants, accommodating schedules, and recognition of potential efficiencies associated with shared data collection can lead to enhanced coordination and consistency between ERA and NRDA. PMID:19545186

  3. Development of a Haptic Elbow Spasticity Simulator (HESS) for Improving Accuracy and Reliability of Clinical Assessment of Spasticity

    PubMed Central

    Park, Hyung-Soon; Kim, Jonghyun; Damiano, Diane L.

    2013-01-01

    This paper presents the framework for developing a robotic system to improve accuracy and reliability of clinical assessment. Clinical assessment of spasticity tends to have poor reliability because of the nature of the in-person assessment. To improve accuracy and reliability of spasticity assessment, a haptic device, named the HESS (Haptic Elbow Spasticity Simulator) has been designed and constructed to recreate the clinical “feel” of elbow spasticity based on quantitative measurements. A mathematical model representing the spastic elbow joint was proposed based on clinical assessment using the Modified Ashworth Scale (MAS) and quantitative data (position, velocity, and torque) collected on subjects with elbow spasticity. Four haptic models (HMs) were created to represent the haptic feel of MAS 1, 1+, 2, and 3. The four HMs were assessed by experienced clinicians; three clinicians performed both in-person and haptic assessments, and had 100% agreement in MAS scores; and eight clinicians who were experienced with MAS assessed the four HMs without receiving any training prior to the test. Inter-rater reliability among the eight clinicians had substantial agreement (κ = 0.626). The eight clinicians also rated the level of realism (7.63 ± 0.92 out of 10) as compared to their experience with real patients. PMID:22562769

  4. 30 CFR 845.18 - Procedures for assessment conference.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... assessed with a notice of his or her action in the manner provided in 30 CFR 845.17(b) and shall include a worksheet if the penalty has been raised or lowered. The reasons for the conference officer's action...

  5. 30 CFR 845.18 - Procedures for assessment conference.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... assessed with a notice of his or her action in the manner provided in 30 CFR 845.17(b) and shall include a worksheet if the penalty has been raised or lowered. The reasons for the conference officer's action...

  6. 30 CFR 845.18 - Procedures for assessment conference.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... assessed with a notice of his or her action in the manner provided in 30 CFR 845.17(b) and shall include a worksheet if the penalty has been raised or lowered. The reasons for the conference officer's action...

  7. 30 CFR 845.18 - Procedures for assessment conference.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... assessed with a notice of his or her action in the manner provided in 30 CFR 845.17(b) and shall include a worksheet if the penalty has been raised or lowered. The reasons for the conference officer's action...

  8. Accuracy of audio computer-assisted self-interviewing (ACASI) and self-administered questionnaires for the assessment of sexual behavior.

    PubMed

    Morrison-Beedy, Dianne; Carey, Michael P; Tu, Xin

    2006-09-01

    This study examined the accuracy of two retrospective methods and assessment intervals for recall of sexual behavior and assessed predictors of recall accuracy. Using a 2 (mode: audio computer-assisted self-interview [ACASI] vs. self-administered questionnaire [SAQ]) x 2 (frequency: monthly vs. quarterly) design, young women (N = 102) were randomly assigned to one of four conditions. Participants completed baseline measures, monitored their behavior with a daily diary, and returned monthly (or quarterly) for assessments. A mixed pattern of accuracy between the four assessment methods was identified. Monthly assessments yielded more accurate recall for protected and unprotected vaginal sex, but quarterly assessments yielded more accurate recall for unprotected oral sex. Mode differences were not strong, and hypothesized predictors of accuracy tended not to be associated with recall accuracy. Choice of assessment mode and frequency should be based upon the research question(s), population, resources, and context in which data collection will occur. PMID:16721506

  9. Assessment of fine motor skill in musicians and nonmusicians: differences in timing versus sequence accuracy in a bimanual fingering task.

    PubMed

    Kincaid, Anthony E; Duncan, Scott; Scott, Samuel A

    2002-08-01

    While professional musicians are generally considered to possess better control of finger movements than nonmusicians, relatively few reports have experimentally addressed the nature of this discrepancy in fine motor skills. For example, it is unknown whether musicians perform with greater skill than control subjects in all aspects of different types of fine motor activities. More specifically, it is not known whether musicians perform better than control subjects on a fine motor task that is similar, but not identical, to the playing of their primary instrument. The purpose of this study was to examine the accuracy of finger placement and accuracy of timing in professional musicians and nonmusicians using a simple, rhythmical, bilateral fingering pattern and the technology that allowed separate assessment of these two parameters. Professional musicians (other than pianists) and nonmusicians were given identical, detailed and explicit instructions but not allowed physically to practice the finger pattern. After verbally repeating the correct pattern for the investigator, subjects performed the task on an electric keyboard with both hands simultaneously. Each subject's performance was then converted to a numerical score. While musicians clearly demonstrated better accuracy in timing, no significant difference was found between the groups in their finger placement scores. These findings were not correlated with subjects' age, sex, limb dominance, or primary instrument (for the professional musicians). This study indicates that professional musicians perform better in timing accuracy but not spatial accuracy while executing a simple, novel, bimanual motor sequence. PMID:12365261

  10. Interrater Reliability Estimators Commonly Used in Scoring Language Assessments: A Monte Carlo Investigation of Estimator Accuracy

    ERIC Educational Resources Information Center

    Morgan, Grant B.; Zhu, Min; Johnson, Robert L.; Hodge, Kari J.

    2014-01-01

    Common estimators of interrater reliability include Pearson product-moment correlation coefficients, Spearman rank-order correlations, and the generalizability coefficient. The purpose of this study was to examine the accuracy of estimators of interrater reliability when varying the true reliability, number of scale categories, and number of…

  11. Comparative analysis of Worldview-2 and Landsat 8 for coastal saltmarsh mapping accuracy assessment

    NASA Astrophysics Data System (ADS)

    Rasel, Sikdar M. M.; Chang, Hsing-Chung; Diti, Israt Jahan; Ralph, Tim; Saintilan, Neil

    2016-05-01

    Coastal saltmarshes and their constituent components and processes are of scientific interest due to their ecological function and services. However, the heterogeneity and seasonal dynamics of the coastal wetland system make it challenging to map saltmarshes with remotely sensed data. This study selected four important saltmarsh species, Phragmites australis, Sporobolus virginicus, Ficinia nodosa and Schoenoplectus sp., as well as a mangrove and a pine tree species, Avicennia and Casuarina sp. respectively. High spatial resolution Worldview-2 data and coarse spatial resolution Landsat 8 imagery were selected for this study. Among the selected vegetation types, some patches were fragmented and close to the spatial resolution of the Worldview-2 data, while some patches were larger than the 30 m resolution of the Landsat 8 data. This study aims to test the effectiveness of different classifiers on imagery with various spatial and spectral resolutions. Three classification algorithms, Maximum Likelihood Classifier (MLC), Support Vector Machine (SVM) and Artificial Neural Network (ANN), were tested and compared in terms of the mapping accuracy of the results derived from both satellite images. For the Worldview-2 data, SVM gave the highest overall accuracy (92.12%, kappa = 0.90), followed by ANN (90.82%, kappa = 0.89) and MLC (90.55%, kappa = 0.88). For the Landsat 8 data, MLC (82.04%) showed the highest classification accuracy compared to SVM (77.31%) and ANN (75.23%). The producer accuracies of the classification results are also presented in the paper.
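
    The overall accuracy and kappa figures quoted above are computed from the classification confusion matrix. A generic sketch of those two statistics (illustrative; not the study's code):

```python
def overall_accuracy_and_kappa(cm):
    """cm: square confusion matrix as nested lists
    (rows = reference classes, columns = predicted classes).
    Returns (overall accuracy, Cohen's kappa), the two figures
    used to compare classifiers such as MLC, SVM and ANN."""
    n = sum(sum(row) for row in cm)
    # observed agreement: fraction of samples on the diagonal
    po = sum(cm[i][i] for i in range(len(cm))) / n
    # chance agreement: product of matching row and column totals
    pe = sum(sum(row) * sum(col)
             for row, col in zip(cm, zip(*cm))) / (n * n)
    return po, (po - pe) / (1 - pe)
```

    Producer's accuracy for class i is the diagonal entry divided by the row (reference) total, computed from the same matrix.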

  12. Applying Signal-Detection Theory to the Study of Observer Accuracy and Bias in Behavioral Assessment

    ERIC Educational Resources Information Center

    Lerman, Dorothea C.; Tetreault, Allison; Hovanetz, Alyson; Bellaci, Emily; Miller, Jonathan; Karp, Hilary; Mahmood, Angela; Strobel, Maggie; Mullen, Shelley; Keyl, Alice; Toupard, Alexis

    2010-01-01

    We evaluated the feasibility and utility of a laboratory model for examining observer accuracy within the framework of signal-detection theory (SDT). Sixty-one individuals collected data on aggression while viewing videotaped segments of simulated teacher-child interactions. The purpose of Experiment 1 was to determine if brief feedback and…

  13. Portable device to assess dynamic accuracy of global positioning systems (GPS) receivers used in agricultural aircraft

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A device was designed to test the dynamic accuracy of Global Positioning System (GPS) receivers used in aerial vehicles. The system works by directing a sun-reflected light beam from the ground to the aircraft using mirrors. A photodetector is placed pointing downward from the aircraft and circuitry...

  14. ESA ExoMars: Pre-launch PanCam Geometric Modeling and Accuracy Assessment

    NASA Astrophysics Data System (ADS)

    Li, D.; Li, R.; Yilmaz, A.

    2014-08-01

    ExoMars is the flagship mission of the European Space Agency (ESA) Aurora Programme. The mobile scientific platform, or rover, will carry a drill and a suite of instruments dedicated to exobiology and geochemistry research. As the ExoMars rover is designed to travel kilometres over the Martian surface, high-precision rover localization and topographic mapping will be critical for traverse path planning and safe planetary surface operations. For such purposes, the ExoMars rover Panoramic Camera system (PanCam) will acquire images that are processed into an imagery network providing vision information for photogrammetric algorithms to localize the rover and generate 3-D mapping products. Since the design of the ExoMars PanCam will influence localization and mapping accuracy, quantitative error analysis of the PanCam design will improve scientists' awareness of the achievable level of accuracy, and enable the PanCam design team to optimize its design to achieve the highest possible level of localization and mapping accuracy. Based on photogrammetric principles and uncertainty propagation theory, we have developed a method to theoretically analyze how mapping and localization accuracy would be affected by various factors, such as length of stereo hard-baseline, focal length, and pixel size, etc.
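
    As one example of the uncertainty propagation described above, the first-order effect of stereo baseline and focal length on depth error for a rectified stereo pair is sigma_Z = Z^2 * sigma_d / (B * f). A sketch with illustrative symbols and values (standard stereo error propagation, not the PanCam design figures):

```python
def stereo_depth_sigma(depth_m, baseline_m, focal_px, disparity_sigma_px):
    """First-order propagation of disparity (matching) error into depth
    for a rectified stereo pair: sigma_Z = Z^2 * sigma_d / (B * f).
    Depth error grows quadratically with range and shrinks with a
    longer hard-baseline B or longer focal length f."""
    return (depth_m ** 2) * disparity_sigma_px / (baseline_m * focal_px)
```

    This is why the hard-baseline length, focal length, and pixel size appear among the design factors whose influence on localization and mapping accuracy the paper analyzes.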

  15. An Accuracy--Response Time Capacity Assessment Function that Measures Performance against Standard Parallel Predictions

    ERIC Educational Resources Information Center

    Townsend, James T.; Altieri, Nicholas

    2012-01-01

    Measures of human efficiency under increases in mental workload or attentional limitations are vital in studying human perception, cognition, and action. Assays of efficiency as workload changes have typically been confined to either reaction times (RTs) or accuracy alone. Within the realm of RTs, a nonparametric measure called the "workload…

  16. Accuracy, Confidence, and Calibration: How Young Children and Adults Assess Credibility

    ERIC Educational Resources Information Center

    Tenney, Elizabeth R.; Small, Jenna E.; Kondrad, Robyn L.; Jaswal, Vikram K.; Spellman, Barbara A.

    2011-01-01

    Do children and adults use the same cues to judge whether someone is a reliable source of information? In 4 experiments, we investigated whether children (ages 5 and 6) and adults used information regarding accuracy, confidence, and calibration (i.e., how well an informant's confidence predicts the likelihood of being correct) to judge informants'…

  17. Accuracy Assessment of Direct Georeferencing for Photogrammetric Applications on Small Unmanned Aerial Platforms

    NASA Astrophysics Data System (ADS)

    Mian, O.; Lutes, J.; Lipa, G.; Hutton, J. J.; Gavelle, E.; Borghini, S.

    2016-03-01

    Microdrones md4-1000 quad-rotor VTOL UAV. The Sony A7R and each lens combination were focused and calibrated terrestrially using the Applanix camera calibration facility, and then integrated with the APX-15 GNSS-Inertial system using a custom mount specifically designed for UAV applications. The mount is constructed in such a way as to maintain the stability of both the interior orientation and the IMU boresight calibration over shock and vibration, thus turning the Sony A7R into a metric imaging solution. In July and August 2015, Applanix and Avyon carried out a series of test flights of this system. The goal of these test flights was to assess the performance of the DMS APX-15 direct georeferencing system under various scenarios. Furthermore, an examination of how the DMS APX-15 can be used to produce accurate map products without the use of ground control points and with reduced sidelap was also carried out. Reducing the sidelap for survey missions performed by small UAVs can significantly increase the mapping productivity of these platforms. The area mapped during the first flight campaign was a 250 m x 300 m block and a 775 m long railway corridor in a rural setting in Ontario, Canada. The second area mapped was a 450 m long corridor over a dam known as Fryer Dam (over the Richelieu River in Quebec, Canada). Several ground control points were distributed within both test areas. The flight over the block area included 8 North-South lines and 1 cross strip flown at 80 m AGL, resulting in a ~1 cm GSD. The flight over the railway corridor included 2 North-South lines also flown at 80 m AGL. Similarly, the flight over the dam corridor included 2 North-South lines flown at 50 m AGL. The focus of this paper was to analyse the results obtained from the two corridors. Test results from both areas were processed using direct georeferencing techniques, and then compared for accuracy against the known positions of the ground control points in each test area. The GNSS-Inertial data collected by the APX-15 was…
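
    The accuracy comparison described above reduces to differencing the directly georeferenced coordinates against the surveyed control points and summarizing each axis separately. A minimal sketch with hypothetical coordinates (not the paper's data):

```python
import math

def checkpoint_rmse(estimated, surveyed):
    """Per-axis RMSE (East, North, Up) between directly georeferenced
    check points and their surveyed coordinates, both given as
    (e, n, u) tuples in metres."""
    sums = [0.0, 0.0, 0.0]
    for est, ref in zip(estimated, surveyed):
        for i in range(3):
            sums[i] += (est[i] - ref[i]) ** 2
    return tuple(math.sqrt(s / len(estimated)) for s in sums)

# Hypothetical check-point coordinates in a local grid (metres):
est = [(100.02, 200.01, 50.03), (149.98, 250.04, 52.01)]
ref = [(100.00, 200.00, 50.00), (150.00, 250.00, 52.00)]
print(checkpoint_rmse(est, ref))
```

    Reporting the horizontal and vertical components separately, as the paper does, matters because GNSS-Inertial solutions typically have different error budgets in plan and in height.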

  19. Recent Developments in Assessment and Examination Procedures in France.

    ERIC Educational Resources Information Center

    Broadfoot, Patricia

    Recent changes in educational assessment in France reflect pressures to modernize the French educational system to align it with prevailing democratic and egalitarian values and to respond to the economy's vocational training needs. After providing background on the French educational system, this paper discusses two areas of secondary school…

  20. An evaluation of procedures for assessing competency to stand trial.

    PubMed

    Schreiber, J; Roesch, R; Golding, S

    1987-01-01

    In a field experiment involving 120 defendants at Bridgewater State Hospital in Massachusetts, the authors evaluated three instruments for assessing competency to stand trial: the Competency Screening Test (CST), Competency Assessment Instrument (CAI), and Interdisciplinary Fitness Interview (IFI). The CST (a paper-and-pencil test) was administered by a research assistant and scored by trained graduate students. Lawyers, psychologists, and social workers were recruited and trained in the use of the other instruments, then assigned as individuals (CAI) or teams (IFI) to conduct interviews and assess subjects. The performance of the project interviewers was compared against two yardsticks: (1) actual decisions reached by the regular Bridgewater staff, and (2) a consensus of two nationally respected experts who reviewed the cases and formed independent competency judgments. Both the CAI and IFI performed well under these conditions, indicating that one-time interviews by well-trained persons can lead to accurate competency decisions in the majority of cases. The authors conclude that hospitalization for competency assessment is rarely necessary.

  1. 77 FR 23208 - Assessment of Mediation and Arbitration Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-18

    ... FR 19,591). The Board favors the resolution of disputes through the use of mediation and arbitration... of information described below and in greater detail at 77 FR 19,591 is necessary for the proper... Surface Transportation Board 49 CFR Parts 1108 and 1109 Assessment of Mediation and Arbitration...

  2. Do Intervention-Embedded Assessment Procedures Successfully Measure Student Growth in Reading?

    ERIC Educational Resources Information Center

    Begeny, John C.; Whitehouse, Mary H.; Methe, Scott A.; Codding, Robin S.; Stage, Scott A.; Nuepert, Shevaun

    2015-01-01

    Effective intervention delivery requires ongoing assessment to determine whether students are learning at the desired rate. Intervention programs with embedded assessment procedures (i.e., assessment that occurs naturally "during" the process of delivering intervention) can potentially enhance instructional decisions. However, there is…

  3. Assessing Dimensionality in Complex Data Structures: A Performance Comparison of DETECT and NOHARM Procedures

    ERIC Educational Resources Information Center

    Svetina, Dubravka

    2011-01-01

    The purpose of this study was to investigate the effect of complex structure on dimensionality assessment in compensatory and noncompensatory multidimensional item response models (MIRT) of assessment data using dimensionality assessment procedures based on conditional covariances (i.e., DETECT) and a factor analytical approach (i.e., NOHARM). …

  4. Acceptability of Functional Behavioral Assessment Procedures to Special Educators and School Psychologists

    ERIC Educational Resources Information Center

    O'Neill, Robert E.; Bundock, Kaitlin; Kladis, Kristin; Hawken, Leanne S.

    2015-01-01

    This survey study assessed the acceptability of a variety of functional behavioral assessment (FBA) procedures (i.e., functional assessment interviews, rating scales/questionnaires, systematic direct observations, functional analysis manipulations) to a national sample of 123 special educators and a state sample of 140 school psychologists.…

  5. Moving beyond standard procedures to assess spontaneous recognition memory.

    PubMed

    Ameen-Ali, K E; Easton, A; Eacott, M J

    2015-06-01

    This review will consider how spontaneous tasks have been applied alongside neuroscientific techniques to test complex forms of recognition memory for objects and their environmental features, e.g. the spatial location of an object or the context in which it is presented. We discuss studies that investigate the roles of the perirhinal cortex and the hippocampus in recognition memory using standard testing paradigms, and consider how these findings contribute to the ongoing debate about whether recognition memory is a single unitary process or multiple processes that can be dissociated anatomically and functionally. Due to the wide use of spontaneous tasks, the need for improved procedures that reduce animal use is acknowledged, with multiple trial paradigms discussed as a novel way of reducing variability and animal numbers in these tasks. The importance of improving translation of animal models to humans is highlighted, with emphasis on a shift away from relying on the phenomenological experience of human subjects.

  6. Assessing the accuracy of the Second Military Survey for the Doren Landslide (Vorarlberg, Austria)

    NASA Astrophysics Data System (ADS)

    Zámolyi, András; Székely, Balázs; Biszak, Sándor

    2010-05-01

    Reconstruction of the early and long-term evolution of landslide areas is especially important for determining the proportion of anthropogenic influence on the evolution of the region affected by mass movements. The recent geologic and geomorphological setting of the prominent Doren landslide in Vorarlberg (Western Austria) has been studied extensively by various research groups and civil engineering companies. Civil aerial imaging of the area dates back to the 1950s. Modern monitoring techniques include aerial imaging as well as airborne and terrestrial laser scanning (LiDAR), providing an almost yearly assessment of the changing geomorphology of the area. However, the landslide most probably initiated before these methods became available, since there is evidence that it was already active in the 1930s. For studying the initial phase of landslide formation, one possibility is to draw on information recorded in historic photographs or historic maps. In this case study we integrated topographic information from the map sheets of the Second Military Survey of the Habsburg Empire, conducted in Vorarlberg during the years 1816-1821 (Kretschmer et al., 2004), into a comprehensive GIS. The region of interest around the Doren landslide was georeferenced using the method of Timár et al. (2006), refined by Molnár (2009), thus providing geodetically correct positioning and the possibility of matching the topographic features from the historic map with features recognized in the LiDAR DTM. The landslide of Doren is clearly visible in the historic map. Additionally, prominent geomorphologic features such as morphological scarps, rills and gullies, mass movement lobes and the course of the Weißach rivulet can be matched. Not only can the shape and character of these elements be recognized and matched, but the positional accuracy is also adequate for geomorphological studies. Since the settlement structure is very stable in the…

  7. Accuracy assessment of airborne photogrammetrically derived high-resolution digital elevation models in a high mountain environment

    NASA Astrophysics Data System (ADS)

    Müller, Johann; Gärtner-Roer, Isabelle; Thee, Patrick; Ginzler, Christian

    2014-12-01

    High-resolution digital elevation models (DEMs) generated by airborne remote sensing are frequently used to analyze landform structures (monotemporal) and geomorphological processes (multitemporal) in remote areas or areas of extreme terrain. In order to assess and quantify such structures and processes it is necessary to know the absolute accuracy of the available DEMs. This study assesses the absolute vertical accuracy of DEMs generated by the High Resolution Stereo Camera-Airborne (HRSC-A), the Leica Airborne Digital Sensors 40/80 (ADS40 and ADS80) and the analogue camera system RC30. The study area is located in the Turtmann valley, Valais, Switzerland, a glacially and periglacially formed hanging valley stretching from 2400 m to 3300 m a.s.l. The photogrammetrically derived DEMs are evaluated against geodetic field measurements and an airborne laser scan (ALS). Traditional and robust global and local accuracy measurements are used to describe the vertical quality of the DEMs, which show a non-Gaussian distribution of errors. The results show that all four sensor systems produce DEMs with similar accuracy despite their different setups and generations. The ADS40 and ADS80 (both with a ground sampling distance of 0.50 m) generate the most accurate DEMs in complex high mountain areas, with an RMSE of 0.8 m and an NMAD of 0.6 m. They also show the highest accuracy relative to flying height (0.14‰). The pushbroom scanning system HRSC-A produces an RMSE of 1.03 m and an NMAD of 0.83 m (0.21‰ of the flying height and 10 times the ground sampling distance). The analogue camera system RC30 produces DEMs with a vertical accuracy of 1.30 m RMSE and 0.83 m NMAD (0.17‰ of the flying height and two times the ground sampling distance). It is also shown that the performance of the DEMs strongly depends on the inclination of the terrain. The RMSE for areas with an inclination <40° is better than 1 m. In more inclined areas the error and outlier occurrence…
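
    The two accuracy measures used in the study are straightforward to compute from DEM-minus-reference elevation differences. A minimal sketch using the common definitions (RMSE, and NMAD = 1.4826 * median(|dh - median(dh)|), a robust scale estimate suited to the non-Gaussian, outlier-prone error distributions described above):

```python
import statistics

def rmse(dh):
    """Root-mean-square error of elevation differences dh (metres)."""
    return (sum(d * d for d in dh) / len(dh)) ** 0.5

def nmad(dh):
    """Normalized median absolute deviation: 1.4826 * median(|dh - median(dh)|).
    Comparable to the standard deviation for Gaussian errors, but robust
    to outliers."""
    med = statistics.median(dh)
    return 1.4826 * statistics.median(abs(d - med) for d in dh)

# Hypothetical DEM-minus-GNSS differences (metres), with one outlier:
diffs = [0.1, -0.2, 0.4, -0.1, 2.5]
print(f"RMSE = {rmse(diffs):.2f} m, NMAD = {nmad(diffs):.2f} m")
```

    On this toy sample the single outlier inflates the RMSE far more than the NMAD, which is exactly why studies of this kind report both measures.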

  8. Assessing the accuracy of software predictions of mammalian and microbial metabolites

    EPA Science Inventory

    New chemical development and hazard assessments benefit from accurate predictions of mammalian and microbial metabolites. Fourteen biotransformation libraries encoded in eight software packages that predict metabolite structures were assessed for their sensitivity (proportion of ...

  9. An assessment of accuracy, error, and conflict with support values from genome-scale phylogenetic data.

    PubMed

    Taylor, Derek J; Piel, William H

    2004-08-01

    Despite the importance of molecular phylogenetics, few of its assumptions have been tested with real data. It is commonly assumed that nonparametric bootstrap values are an underestimate of the actual support, Bayesian posterior probabilities are an overestimate of the actual support, and among-gene phylogenetic conflict is low. We directly tested these assumptions by using a well-supported yeast reference tree. We found that bootstrap values were not significantly different from accuracy. Bayesian support values were, however, significant overestimates of accuracy but still had low false-positive error rates (0% to 2.8%) at the highest values (>99%). Although we found evidence for a branch-length bias contributing to conflict, there was little evidence for widespread, strongly supported among-gene conflict from bootstraps. The results demonstrate that caution is warranted concerning conclusions of conflict based on the assumption of underestimation for support values in real data. PMID:15140947
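
    Nonparametric bootstrap support, as discussed above, is in essence the fraction of resampled datasets that reproduce a feature of interest. A domain-neutral sketch (resampling observations with replacement and re-evaluating a simple predicate, rather than an actual phylogenetic tree search):

```python
import random

def bootstrap_support(data, feature_holds, n_reps=1000, seed=42):
    """Fraction of bootstrap resamples (drawn with replacement, same
    size as the original data) for which `feature_holds` is true."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_reps):
        resample = [rng.choice(data) for _ in data]
        if feature_holds(resample):
            hits += 1
    return hits / n_reps

# Toy example: support for the claim "the mean is positive".
data = [0.8, -0.3, 1.2, 0.5, -0.1, 0.9]
print(bootstrap_support(data, lambda s: sum(s) / len(s) > 0))
```

    In phylogenetics the resampled units are alignment columns and the predicate is "the tree inferred from the resample contains this clade", but the support value is computed the same way.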

  10. Accuracy in Student Self-Assessment: Directions and Cautions for Research

    ERIC Educational Resources Information Center

    Brown, Gavin T. L.; Andrade, Heidi L.; Chen, Fei

    2015-01-01

    Student self-assessment is a central component of current conceptions of formative and classroom assessment. The research on self-assessment has focused on its efficacy in promoting both academic achievement and self-regulated learning, with little concern for issues of validity. Because reliability of testing is considered a sine qua non for the…

  11. Accuracy assessment of high frequency 3D ultrasound for digital impression-taking of prepared teeth

    NASA Astrophysics Data System (ADS)

    Heger, Stefan; Vollborn, Thorsten; Tinschert, Joachim; Wolfart, Stefan; Radermacher, Klaus

    2013-03-01

    Silicone based impression-taking of prepared teeth followed by plaster casting is well-established but potentially less reliable, error-prone and inefficient, particularly in combination with emerging techniques like computer aided design and manufacturing (CAD/CAM) of dental prosthesis. Intra-oral optical scanners for digital impression-taking have been introduced but until now some drawbacks still exist. Because optical waves can hardly penetrate liquids or soft-tissues, sub-gingival preparations still need to be uncovered invasively prior to scanning. High frequency ultrasound (HFUS) based micro-scanning has been recently investigated as an alternative to optical intra-oral scanning. Ultrasound is less sensitive against oral fluids and in principal able to penetrate gingiva without invasively exposing of sub-gingival preparations. Nevertheless, spatial resolution as well as digitization accuracy of an ultrasound based micro-scanning system remains a critical parameter because the ultrasound wavelength in water-like media such as gingiva is typically smaller than that of optical waves. In this contribution, the in-vitro accuracy of ultrasound based micro-scanning for tooth geometry reconstruction is being investigated and compared to its extra-oral optical counterpart. In order to increase the spatial resolution of the system, 2nd harmonic frequencies from a mechanically driven focused single element transducer were separated and corresponding 3D surface models were calculated for both fundamentals and 2nd harmonics. Measurements on phantoms, model teeth and human teeth were carried out for evaluation of spatial resolution and surface detection accuracy. Comparison of optical and ultrasound digital impression taking indicate that, in terms of accuracy, ultrasound based tooth digitization can be an alternative for optical impression-taking.

  12. Future dedicated Venus-SGG flight mission: Accuracy assessment and performance analysis

    NASA Astrophysics Data System (ADS)

    Zheng, Wei; Hsu, Houtse; Zhong, Min; Yun, Meijuan

    2016-01-01

    This study concentrates principally on the systematic requirements analysis for the future dedicated Venus-SGG (spacecraft gravity gradiometry) flight mission in China in respect of the matching measurement accuracies of the spacecraft-based scientific instruments and the orbital parameters of the spacecraft. Firstly, we created and proved the single and combined analytical error models of the cumulative Venusian geoid height influenced by the gravity gradient error of the spacecraft-borne atom-interferometer gravity gradiometer (AIGG) and the orbital position error and orbital velocity error tracked by the deep space network (DSN) on the Earth station. Secondly, the ultra-high-precision spacecraft-borne AIGG is propitious to making a significant contribution to globally mapping the Venusian gravitational field and modeling the geoid with unprecedented accuracy and spatial resolution through weighing the advantages and disadvantages among the electrostatically suspended gravity gradiometer, the superconducting gravity gradiometer and the AIGG. Finally, the future dedicated Venus-SGG spacecraft had better adopt the optimal matching accuracy indices consisting of 3 × 10-13/s2 in gravity gradient, 10 m in orbital position and 8 × 10-4 m/s in orbital velocity and the preferred orbital parameters comprising an orbital altitude of 300 ± 50 km, an observation time of 60 months and a sampling interval of 1 s.

  13. 43 CFR 11.33 - What types of assessment procedures are available?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... procedures: a procedure for coastal or marine environments, which incorporates the Natural Resource Damage... Lakes environments, which incorporates the Natural Resource Damage Assessment Model for Great Lakes... available? 11.33 Section 11.33 Public Lands: Interior Office of the Secretary of the Interior...

  14. 43 CFR 11.33 - What types of assessment procedures are available?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... procedures: a procedure for coastal or marine environments, which incorporates the Natural Resource Damage... Lakes environments, which incorporates the Natural Resource Damage Assessment Model for Great Lakes... available? 11.33 Section 11.33 Public Lands: Interior Office of the Secretary of the Interior...

  15. A Procedural Skills OSCE: Assessing Technical and Non-Technical Skills of Internal Medicine Residents

    ERIC Educational Resources Information Center

    Pugh, Debra; Hamstra, Stanley J.; Wood, Timothy J.; Humphrey-Murto, Susan; Touchie, Claire; Yudkowsky, Rachel; Bordage, Georges

    2015-01-01

    Internists are required to perform a number of procedures that require mastery of technical and non-technical skills, however, formal assessment of these skills is often lacking. The purpose of this study was to develop, implement, and gather validity evidence for a procedural skills objective structured clinical examination (PS-OSCE) for internal…

  16. Unrestricted Factor Analytic Procedures for Assessing Acquiescent Responding in Balanced, Theoretically Unidimensional Personality Scales

    ERIC Educational Resources Information Center

    Ferrando, Pere J.; Lorenzo-Seva, Urbano; Chico, Eliseo

    2003-01-01

    This article describes and proposes an unrestricted factor analytic procedure to: (a) assess the dimensionality and structure of a balanced personality scale taking into account the potential effects of acquiescent responding, and (b) correct the individual trait estimates for acquiescence. The procedure can be considered as an extension of ten…

  17. Newborn and Four-Week Retest on a Normative Population Using the Brazelton Newborn Assessment Procedure.

    ERIC Educational Resources Information Center

    Horowitz, Frances Degan; And Others

    A survey of assessment procedures of the newborn and of the infant during the first month of life was conducted; the survey indicated that there were instruments for evaluating the newborn and for evaluating the four-week-old infant, but there was no single procedure which included an evaluation of both the newborn and the four-week-old infant.…

  18. Assessing the Item Response Theory with Covariate (IRT-C) Procedure for Ascertaining Differential Item Functioning

    ERIC Educational Resources Information Center

    Tay, Louis; Vermunt, Jeroen K.; Wang, Chun

    2013-01-01

    We evaluate the item response theory with covariates (IRT-C) procedure for assessing differential item functioning (DIF) without preknowledge of anchor items (Tay, Newman, & Vermunt, 2011). This procedure begins with a fully constrained baseline model, and candidate items are tested for uniform and/or nonuniform DIF using the Wald statistic.…

  19. Increased Throwing Accuracy Improves Children's Catching Performance in a Ball-Catching Task from the Movement Assessment Battery (MABC-2)

    PubMed Central

    Dirksen, Tim; De Lussanet, Marc H. E.; Zentgraf, Karen; Slupinski, Lena; Wagner, Heiko

    2016-01-01

    The Movement Assessment Battery for Children (MABC-2) is a functional test for identifying deficits in the motor performance of children. The test contains a ball-catching task that requires the children to catch a self-thrown ball with one hand. As the task can be executed with a variety of different catching strategies, it is assumed that task success can also vary considerably. Even though it is not clear whether the performance merely depends on the catching skills or also to some extent on the throwing skills, the MABC-2 takes into account only the movement outcome. Therefore, the purpose of the current study was to examine (1) to what extent the throwing accuracy has an effect on the children's catching performance and (2) to what extent the throwing accuracy influences their choice of catching strategy. In line with the test manual, the children's catching performance was quantified on the basis of the number of correctly caught balls. The throwing accuracy and the catching strategy were quantified by applying a kinematic analysis to the ball's trajectory and the hand movements. Based on linear regression analyses, we then investigated the relation between throwing accuracy, catching performance and catching strategy. The results show that an increased throwing accuracy is significantly correlated with an increased catching performance. Moreover, a higher throwing accuracy is significantly correlated with a longer duration of the hand on the ball's parabola, which indicates that throwing the ball more accurately could enable the children to effectively reduce the requirements on temporal precision. As the children's catching performance and their choice of catching strategy in the ball-catching task of the MABC-2 are substantially determined by their throwing accuracy, the test evaluation should not be based on the movement outcome alone, but should also take into account the children's throwing performance. Our findings could be of particular value for the…
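
    The linear regression step described above amounts to a simple least-squares fit of catching performance on throwing accuracy. A self-contained sketch with made-up scores (not the study's data):

```python
def ols_fit(x, y):
    """Least-squares fit y ~ a + b*x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

# Made-up scores: throwing accuracy (0-1) vs. balls caught out of 10.
throw_acc = [0.2, 0.4, 0.5, 0.7, 0.9]
caught = [2.0, 4.0, 5.0, 6.0, 9.0]
intercept, slope = ols_fit(throw_acc, caught)
print(f"caught ~ {intercept:.2f} + {slope:.2f} * accuracy")
```

    A positive fitted slope on real data would correspond to the study's finding that more accurate throws go with better catching performance.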

  1. Validation of selected analytical methods using accuracy profiles to assess the impact of a Tobacco Heating System on indoor air quality.

    PubMed

    Mottier, Nicolas; Tharin, Manuel; Cluse, Camille; Crudo, Jean-René; Lueso, María Gómez; Goujon-Ginglinger, Catherine G; Jaquier, Anne; Mitova, Maya I; Rouget, Emmanuel G R; Schaller, Mathieu; Solioz, Jennifer

    2016-09-01

    Studies in environmentally controlled rooms have been used over the years to assess the impact of environmental tobacco smoke on indoor air quality. As new tobacco products are developed, it is important to determine their impact on air quality when used indoors. Before such an assessment can take place it is essential that the analytical methods used to assess indoor air quality are validated and shown to be fit for their intended purpose. Consequently, for this assessment, an environmentally controlled room was built and seven analytical methods, representing eighteen analytes, were validated. The validations were carried out with smoking machines using a matrix-based approach applying the accuracy profile procedure. The performances of the methods were compared for all three matrices under investigation: background air samples, the environmental aerosol of Tobacco Heating System THS 2.2, a heat-not-burn tobacco product developed by Philip Morris International, and the environmental tobacco smoke of a cigarette. The environmental aerosol generated by the THS 2.2 device did not have any appreciable impact on the performances of the methods. The comparison between the background and THS 2.2 environmental aerosol samples generated by smoking machines showed that only five compounds were higher when THS 2.2 was used in the environmentally controlled room. Regarding environmental tobacco smoke from cigarettes, the yields of all analytes were clearly above those obtained with the other two air sample types. PMID:27343591
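
    The accuracy profile procedure referenced above combines, at each concentration level, the relative bias and the precision of replicate measurements into a beta-expectation tolerance interval. The sketch below computes only the per-level bias and relative standard deviation (the full profile involves additional variance components; the replicate values and spiking level are hypothetical):

```python
import statistics

def level_bias_and_rsd(measured, nominal):
    """Relative bias (%) and relative standard deviation (%) of
    replicate measurements at one nominal concentration level."""
    mean = statistics.mean(measured)
    bias_pct = 100.0 * (mean - nominal) / nominal
    rsd_pct = 100.0 * statistics.stdev(measured) / mean
    return bias_pct, rsd_pct

# Hypothetical replicates at a 10 ug/m3 spiking level:
bias, rsd = level_bias_and_rsd([9.8, 10.3, 10.1, 9.9, 10.2], 10.0)
print(f"bias = {bias:.1f}%, RSD = {rsd:.1f}%")
```

    A method passes validation at a level when the tolerance interval built from these quantities stays inside the predefined acceptance limits.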

  3. Operant procedures for assessing behavioral flexibility in rats.

    PubMed

    Brady, Anne Marie; Floresco, Stan B

    2015-02-15

    Executive functions consist of multiple high-level cognitive processes that drive rule generation and behavioral selection. An emergent property of these processes is the ability to adjust behavior in response to changes in one's environment (i.e., behavioral flexibility). These processes are essential to normal human behavior, and may be disrupted in diverse neuropsychiatric conditions, including schizophrenia, alcoholism, depression, stroke, and Alzheimer's disease. Understanding of the neurobiology of executive functions has been greatly advanced by the availability of animal tasks for assessing discrete components of behavioral flexibility, particularly strategy shifting and reversal learning. While several types of tasks have been developed, most are non-automated, labor intensive, and allow testing of only one animal at a time. The recent development of automated, operant-based tasks for assessing behavioral flexibility streamlines testing, standardizes stimulus presentation and data recording, and dramatically improves throughput. Here, we describe automated strategy shifting and reversal tasks, using operant chambers controlled by custom-written software programs. Using these tasks, we have shown that the medial prefrontal cortex governs strategy shifting but not reversal learning in the rat, similar to the dissociation observed in humans. Moreover, animals with a neonatal hippocampal lesion, a neurodevelopmental model of schizophrenia, are selectively impaired on the strategy shifting task but not the reversal task. The strategy shifting task also allows the identification of separate types of performance errors, each of which is attributable to distinct neural substrates. The availability of these automated tasks, and the evidence supporting the dissociable contributions of separate prefrontal areas, makes them particularly well-suited assays for the investigation of basic neurobiological processes as well as drug discovery and screening in disease models.

  4. Operant procedures for assessing behavioral flexibility in rats.

    PubMed

    Brady, Anne Marie; Floresco, Stan B

    2015-01-01

    Executive functions consist of multiple high-level cognitive processes that drive rule generation and behavioral selection. An emergent property of these processes is the ability to adjust behavior in response to changes in one's environment (i.e., behavioral flexibility). These processes are essential to normal human behavior, and may be disrupted in diverse neuropsychiatric conditions, including schizophrenia, alcoholism, depression, stroke, and Alzheimer's disease. Understanding of the neurobiology of executive functions has been greatly advanced by the availability of animal tasks for assessing discrete components of behavioral flexibility, particularly strategy shifting and reversal learning. While several types of tasks have been developed, most are non-automated, labor intensive, and allow testing of only one animal at a time. The recent development of automated, operant-based tasks for assessing behavioral flexibility streamlines testing, standardizes stimulus presentation and data recording, and dramatically improves throughput. Here, we describe automated strategy shifting and reversal tasks, using operant chambers controlled by custom-written software programs. Using these tasks, we have shown that the medial prefrontal cortex governs strategy shifting but not reversal learning in the rat, similar to the dissociation observed in humans. Moreover, animals with a neonatal hippocampal lesion, a neurodevelopmental model of schizophrenia, are selectively impaired on the strategy shifting task but not the reversal task. The strategy shifting task also allows the identification of separate types of performance errors, each of which is attributable to distinct neural substrates. The availability of these automated tasks, and the evidence supporting the dissociable contributions of separate prefrontal areas, makes them particularly well-suited assays for the investigation of basic neurobiological processes as well as drug discovery and screening in disease models.

  5. Assessing the impacts of precipitation bias on distributed hydrologic model calibration and prediction accuracy

    NASA Astrophysics Data System (ADS)

    Looper, Jonathan P.; Vieux, Baxter E.; Moreno, Maria A.

    2012-02-01

    Physics-based distributed (PBD) hydrologic models predict runoff throughout a basin using the laws of conservation of mass and momentum, and benefit from more accurate and representative precipitation input. Vflo™ is a gridded distributed hydrologic model that predicts runoff and continuously updates soil moisture. As a participating model in the second Distributed Model Intercomparison Project (DMIP2), Vflo™ is applied to the Illinois and Blue River basins in Oklahoma. Model parameters are derived from geospatial data for initial setup, and then adjusted to reproduce the observed flow under continuous time-series simulations and on an event basis. Simulation results demonstrate that certain runoff events are governed by saturation-excess processes, while in others, infiltration-rate-excess processes dominate. Streamflow prediction accuracy is enhanced when multi-sensor precipitation estimates (MPE) are bias corrected through re-analysis of the MPE provided in the DMIP2 experiment, resulting in gauge-corrected precipitation estimates (GCPE). Model calibration identified a set of parameters that minimized objective functions for errors in runoff volume and instantaneous discharge. Simulated streamflow for the Blue and Illinois River basins has Nash-Sutcliffe efficiency coefficients of 0.61 and 0.68, respectively, for the 1996-2002 period using GCPE. The streamflow prediction accuracy improves by 74% in terms of Nash-Sutcliffe efficiency when GCPE is used during the calibration period. Without model calibration, excellent agreement between hourly simulated and observed discharge is obtained for the Illinois, whereas in the Blue River, adjustment of parameters affecting both saturation-excess and infiltration-rate-excess processes was necessary. During the 1996-2002 period, GCPE input was more important than model calibration for the Blue River, while model calibration proved more important for the Illinois River. During the verification period (2002
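
The Nash-Sutcliffe efficiency used above to score the streamflow simulations compares squared prediction errors against the variance of the observations; a value of 1 is a perfect fit, and values below 0 mean the model is worse than predicting the mean. A minimal sketch with made-up discharge values (not data from the DMIP2 basins):

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of the observations."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / ss_tot

# Illustrative hourly discharges (m^3/s), not values from the study.
obs = [10.0, 12.0, 15.0, 11.0, 9.0]
sim = [9.5, 12.5, 14.0, 11.5, 9.5]
nse = nash_sutcliffe(obs, sim)  # about 0.91
```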

  6. Creating a Standard Set of Metrics to Assess Accuracy of Solar Forecasts: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Banunarayanan, V.; Brockway, A.; Marquis, M.; Haupt, S. E.; Brown, B.; Fowler, T.; Jensen, T.; Hamann, H.; Lu, S.; Hodge, B.; Zhang, J.; Florita, A.

    2013-12-01

    The U.S. Department of Energy (DOE) SunShot Initiative, launched in 2011, seeks to reduce the cost of solar energy systems by 75% from 2010 to 2020. In support of the SunShot Initiative, the DOE Office of Energy Efficiency and Renewable Energy (EERE) is partnering with the National Oceanic and Atmospheric Administration (NOAA) and solar energy stakeholders to improve solar forecasting. Through a funding opportunity announcement issued in April 2012, DOE is funding two teams - one led by the National Center for Atmospheric Research (NCAR) and one by IBM - to perform three key activities in order to improve solar forecasts. The teams will: (1) with DOE and NOAA's leadership and significant stakeholder input, develop a standardized set of metrics to evaluate forecast accuracy, and determine the baseline and target values for these metrics; (2) conduct research that yields a transformational improvement in weather models and methods for forecasting solar irradiance and power; and (3) incorporate solar forecasts into the system operations of the electric power grid, and evaluate the impact of forecast accuracy on the economics and reliability of operations using the defined, standard metrics. This paper will present preliminary results on the first activity: the development of a standardized set of metrics, baselines, and target values. The results will include a proposed framework for metrics development, key categories of metrics, descriptions of each of the proposed set of specific metrics to measure forecast accuracy, feedback gathered from a range of stakeholders on the metrics, and processes to determine baselines and target values for each metric. The paper will also analyze the temporal and spatial resolutions under which these metrics would apply, and conclude with a summary of the work in progress on solar forecasting activities funded by DOE.
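
Typical members of such a forecast-accuracy metrics set are the mean bias error, mean absolute error, and root-mean-square error of irradiance or power forecasts. A minimal sketch of the arithmetic, with invented forecast/observation pairs (the actual SunShot metric definitions may differ):

```python
import math

def forecast_metrics(forecast, observed):
    """Mean bias, mean absolute, and root-mean-square forecast errors."""
    errors = [f - o for f, o in zip(forecast, observed)]
    n = len(errors)
    return {
        "MBE": sum(errors) / n,                             # systematic bias
        "MAE": sum(abs(e) for e in errors) / n,             # typical magnitude
        "RMSE": math.sqrt(sum(e * e for e in errors) / n),  # penalizes outliers
    }

# Illustrative irradiance values in W/m^2.
metrics = forecast_metrics([500.0, 650.0, 700.0], [480.0, 700.0, 690.0])
```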

  7. Accuracy assessment of the ERP prediction method based on analysis of 100-year ERP series

    NASA Astrophysics Data System (ADS)

    Malkin, Z.; Tissen, V. M.

    2012-12-01

    A new method has been developed at the Siberian Research Institute of Metrology (SNIIM) for highly accurate prediction of UT1 and Pole motion (PM). In this study, a detailed comparison was made of real-time UT1 predictions made in 2006-2011 and PM predictions made in 2009-2011 using the SNIIM method with simultaneous predictions computed at the International Earth Rotation and Reference Systems Service (IERS), USNO. The obtained results show that the proposed method provides better accuracy at different prediction lengths.

  8. Assessment of accuracy of in-situ methods for measuring building-envelope thermal resistance

    SciTech Connect

    Fang, J.B.; Grot, R.A.; Park, H.S.

    1986-03-01

    A series of field and laboratory tests was conducted to evaluate the accuracy of in-situ thermal-resistance measurement techniques. The results of the thermal-performance evaluation of the exterior walls of six thermal-mass test houses situated in Gaithersburg, Maryland, are presented. The wall construction of these one-room houses includes insulated lightweight wood frame, uninsulated lightweight wood frame, insulated masonry with outside mass, uninsulated masonry, log, and insulated masonry with inside mass. In-situ measurements of heat transfer through building envelopes were made with heat flux transducers and portable calorimeters.
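
With heat flux transducer data of the kind described, the in-situ thermal resistance is commonly estimated by an averaging method: accumulated temperature difference across the envelope divided by accumulated heat flux. A minimal sketch with fabricated readings (not measurements from the test houses):

```python
def r_value(delta_t, heat_flux):
    """Averaging-method estimate of thermal resistance (m^2*K/W):
    accumulated temperature difference divided by accumulated heat flux."""
    return sum(delta_t) / sum(heat_flux)

# Fabricated hourly readings: ~20 K across the wall, ~10 W/m^2 through it.
delta_t = [20.0, 20.0, 20.0, 20.0]  # inside minus outside temperature (K)
flux = [10.0, 9.5, 10.5, 10.0]      # heat flux transducer output (W/m^2)
r = r_value(delta_t, flux)          # 2.0 m^2*K/W
```

In practice the averages are taken over many days so that storage effects in the wall cancel out.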

  9. Assessing Accuracy of Exchange-Correlation Functionals for the Description of Atomic Excited States

    NASA Astrophysics Data System (ADS)

    Makowski, Marcin; Hanas, Martyna

    2016-09-01

    The performance of exchange-correlation functionals for the description of atomic excitations is investigated. A benchmark set of excited states is constructed, and experimental data are compared to Time-Dependent Density Functional Theory (TDDFT) calculations. The benchmark results show that good accuracy may be achieved for the selected group of functionals, and that the quality of the predictions is competitive with computationally more demanding coupled-cluster approaches. Apart from testing the standard TDDFT approaches, some insight is also given into the role of the self-interaction error plaguing DFT calculations and of the adiabatic approximation to the exchange-correlation kernels.

  10. Methods in Use for Sensitivity Analysis, Uncertainty Evaluation, and Target Accuracy Assessment

    SciTech Connect

    G. Palmiotti; M. Salvatores; G. Aliberti

    2007-10-01

    Sensitivity coefficients can be used for different objectives, such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluation of the representativity of an experiment with respect to a reference design configuration. In this paper the theory, based on the adjoint approach, that is implemented in the ERANOS fast reactor code system is presented, along with some unique tools and features related to specific types of problems, as is the case for nuclide transmutation, reactivity loss during the cycle, decay heat, the neutron source associated with fuel fabrication, and experiment representativity.
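
In the "sandwich rule" that underlies such uncertainty estimates, the variance of a response R is obtained from the sensitivity vector S and the input-parameter covariance matrix C as var(R) = SᵀCS. A minimal sketch with invented numbers (not ERANOS data):

```python
def response_variance(s, cov):
    """Sandwich rule: var(R) = s^T C s for sensitivity vector s, covariance C."""
    n = len(s)
    return sum(s[i] * cov[i][j] * s[j] for i in range(n) for j in range(n))

# Two illustrative parameters: relative sensitivities and relative covariances.
s = [0.5, 1.0]
cov = [[0.04, 0.01],
       [0.01, 0.09]]
variance = response_variance(s, cov)  # 0.11, i.e. ~33% relative standard deviation
```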

  11. Assessment of the accuracy of infrared and electromagnetic navigation using an industrial robot: Which factors are influencing the accuracy of navigation?

    PubMed

    Liodakis, Emmanouil; Chu, Kongfai; Westphal, Ralf; Krettek, Christian; Citak, Musa; Gosling, Thomas; Kenawey, Mohamed

    2011-10-01

    Our objectives were to detect factors that influence the accuracy of surgical navigation (magnitude of deformity, plane of deformity, position of the navigation bases) and to compare the accuracy of infrared with electromagnetic navigation. Human cadaveric femora were used. A robot connected to a computer moved one of the bony fragments in a desired direction. The bases of the infrared navigation system (BrainLab) and the receivers of the electromagnetic device (Fastrak, Polhemus) were attached to the proximal and distal parts of the bone. For the first part of the study, deformities were classified into eight groups (e.g., 0° to 5°). For the second part, the bases were initially placed near the osteotomy and then far away. The mean absolute differences between both navigation system measurements and the robotic angles were significantly affected by the magnitude of angulation, with better accuracy for smaller angulations (p < 0.001). The accuracy of infrared navigation was significantly better in the frontal and sagittal planes. Changing the position of the navigation bases near and far away from the deformity apex had no significant effect on the accuracy of infrared navigation; however, it influenced the accuracy of electromagnetic navigation in the frontal plane (p < 0.001). In conclusion, the use of infrared navigation systems for correction of small angulation deformities in the frontal or sagittal plane provides the most accurate results, irrespective of the positioning of the navigation bases.

  12. Assessing the accuracy of the International Classification of Diseases codes to identify abusive head trauma: a feasibility study

    PubMed Central

    Berger, Rachel P; Parks, Sharyn; Fromkin, Janet; Rubin, Pamela; Pecora, Peter J

    2016-01-01

    Objective To assess the accuracy of an International Classification of Diseases (ICD) code-based operational case definition for abusive head trauma (AHT). Methods Subjects were children <5 years of age evaluated for AHT by a hospital-based Child Protection Team (CPT) at a tertiary care paediatric hospital with a completely electronic medical record (EMR) system. Subjects were designated as non-AHT traumatic brain injury (TBI) or AHT based on whether the CPT determined that the injuries were due to AHT. The sensitivity and specificity of the ICD-based definition were calculated. Results There were 223 children evaluated for AHT: 117 AHT and 106 non-AHT TBI. The sensitivity and specificity of the ICD-based operational case definition were 92% (95% CI 85.8 to 96.2) and 96% (95% CI 92.3 to 99.7), respectively. All errors in sensitivity and three of the four specificity errors were due to coder error; one specificity error was a physician error. Conclusions In a paediatric tertiary care hospital with an EMR system, the accuracy of an ICD-based case definition for AHT was high. Additional studies are needed to assess the accuracy of this definition in all types of hospitals in which children with AHT are cared for. PMID:24167034
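
The sensitivity and specificity quoted above are simple proportions of the CPT-confirmed cases, each with a confidence interval. A minimal sketch using a normal-approximation interval and hypothetical true/false counts chosen only to give proportions of the same order (the paper's exact 2x2 counts are not reproduced here):

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical counts: 108 of 117 AHT cases flagged, 102 of 106 controls cleared.
sens, sens_lo, sens_hi = proportion_ci(108, 117)  # ~0.92
spec, spec_lo, spec_hi = proportion_ci(102, 106)  # ~0.96
```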

  13. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
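
Step (c) can be illustrated with a toy Monte Carlo that samples a lognormal demand and a lognormal component capacity and counts how often demand exceeds capacity; for this simple case a closed-form lognormal result provides a check. This is a sketch of the general technique, not the paper's NPP model:

```python
import math
import random
from statistics import NormalDist

def mc_damage_probability(dem_med, dem_beta, cap_med, cap_beta, trials=100_000):
    """Fraction of trials in which a sampled demand exceeds a sampled capacity."""
    rng = random.Random(42)  # fixed seed for repeatability
    failures = 0
    for _ in range(trials):
        demand = math.exp(rng.gauss(math.log(dem_med), dem_beta))
        capacity = math.exp(rng.gauss(math.log(cap_med), cap_beta))
        if demand > capacity:
            failures += 1
    return failures / trials

# Illustrative drift demand/capacity medians and lognormal dispersions.
p_mc = mc_damage_probability(0.3, 0.4, 0.5, 0.3)
# Closed form for two lognormals: Phi(ln(dm/cm) / sqrt(bd^2 + bc^2)).
p_exact = NormalDist().cdf(math.log(0.3 / 0.5) / math.hypot(0.4, 0.3))
```

The full procedure samples correlated component responses from the response-history analyses instead of a single lognormal demand, which is what lets response correlation enter the risk estimate directly.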

  14. Standardisation of platelet counting accuracy in blood banks by reference to an automated immunoplatelet procedure: comparative evaluation of Cell-Dyn CD4000 impedance and optical platelet counts.

    PubMed

    Johannessen, B; Haugen, T; Scott, C S

    2001-10-01

    were RBC-free, that an inappropriate correction factor was applied. Consequently, the CD4000 impedance platelet count will provide reliable platelet counts, irrespective of the day of platelet unit storage, when a factor of 1.25 is applied to the system-reported result. By comparison, optical methods are more likely to be affected by subtle morphological changes that may result from anticoagulants or progressive storage time. The method limitations documented by this study may well affect many other analysers and mean that the implementation of process control statistics related to platelet counts may be less reliable than previously assumed. It is suggested that standardisation could be much better achieved if there was some form of system cross-calibration that was referenced to an independent method such as an immunoplatelet assay. It is proposed that studies of this type should be extended to a wide assessment of platelet count accuracy of blood bank instruments in order to standardise data within national organisations. If consistent inter-instrument correction factors such as those documented here can be identified, it would considerably increase the relevance of determining platelet counts in production control processes.

  15. Accuracy assessment of land cover/land use classifiers in dry and humid areas of Iran.

    PubMed

    Yousefi, Saleh; Khatami, Reza; Mountrakis, Giorgos; Mirzaee, Somayeh; Pourghasemi, Hamid Reza; Tazeh, Mehdi

    2015-10-01

    Land cover/land use (LCLU) maps are essential inputs for environmental analysis. Remote sensing provides an opportunity to construct LCLU maps of large geographic areas in a timely fashion. Knowing the most accurate classification method for producing LCLU maps based on site characteristics is necessary for environmental managers. The aim of this research is to examine the performance of various classification algorithms for LCLU mapping in dry and humid climates (from June to August). Testing is performed in three case studies from each of the two climates in Iran. The reference dataset of each image was randomly selected from the entire image and was randomly divided into training and validation sets. Training sets included 400 pixels, and validation sets included 200 pixels, for each LCLU class. Results indicate that the support vector machine (SVM) and neural network methods can achieve higher overall accuracy (86.7% and 86.6%) than the other examined algorithms, with a slight advantage for the SVM. Dry areas exhibit higher classification difficulty, as man-made features often have spectral responses that overlap with soil. A further observation is that spatial segregation and lower mixture of LCLU classes can increase overall classification accuracy.

  16. Assessment of Classification Accuracies of SENTINEL-2 and LANDSAT-8 Data for Land Cover / Use Mapping

    NASA Astrophysics Data System (ADS)

    Hale Topaloğlu, Raziye; Sertel, Elif; Musaoğlu, Nebiye

    2016-06-01

    This study aims to compare classification accuracies of land cover/use maps created from Sentinel-2 and Landsat-8 data. Istanbul, a metropolitan city of Turkey with a population of around 14 million and diverse landscape characteristics, was selected as the study area. Water, forest, agricultural areas, grasslands, transport network, urban, airport-industrial units, and barren land-mine land cover/use classes adapted from the CORINE nomenclature were used as the main land cover/use classes to identify. To fulfil the aims of this research, recently acquired Sentinel-2 (dated 08/02/2016) and Landsat-8 (dated 22/02/2016) images of Istanbul were obtained, and image pre-processing steps such as atmospheric and geometric correction were employed. Both Sentinel-2 and Landsat-8 images were resampled to 30 m pixel size after geometric correction, and similar spectral bands for both satellites were selected to create a similar base for these multi-sensor data. Maximum Likelihood (MLC) and Support Vector Machine (SVM) supervised classification methods were applied to both data sets to accurately identify eight different land cover/use classes. An error matrix was created using the same reference points for the Sentinel-2 and Landsat-8 classifications. After the classification, accuracy results were compared to find the best approach for creating a current land cover/use map of the region. The results of the MLC and SVM classification methods were compared for both images.
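
An error matrix of the kind described cross-tabulates reference classes against mapped classes; overall, producer's, and user's accuracies follow directly from it. A minimal sketch with an invented 3-class matrix (not the study's eight-class results):

```python
def accuracies(matrix):
    """Overall, producer's (rows = reference), and user's (columns = mapped)
    accuracies from a square error matrix of pixel counts."""
    k = len(matrix)
    total = sum(sum(row) for row in matrix)
    overall = sum(matrix[i][i] for i in range(k)) / total
    producers = [matrix[i][i] / sum(matrix[i]) for i in range(k)]
    users = [matrix[i][i] / sum(row[i] for row in matrix) for i in range(k)]
    return overall, producers, users

# Invented counts: reference class per row, mapped class per column.
matrix = [[50, 3, 2],
          [4, 45, 1],
          [2, 2, 41]]
overall, producers, users = accuracies(matrix)  # overall ~0.907
```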

  17. Accuracy Assessment of Geostationary-Earth-Orbit with Simplified Perturbations Models

    NASA Astrophysics Data System (ADS)

    Ma, Lihua; Xu, Xiaojun; Pang, Feng

    2016-06-01

    A two-line element set (TLE) is a data format encoding the orbital elements of an Earth-orbiting object for a given epoch. Using a suitable prediction formula, the motion state of the object can be obtained at any time. The TLE data representation is specific to the simplified perturbations models, so any algorithm using a TLE as a data source must implement one of these models to correctly compute the state at a specific time. Accurate adjustment of the antenna direction at the earth station is key to satellite communications. With the TLE set, topocentric elevation and azimuth direction angles can be calculated. The accuracy of the perturbations models directly affects communication signal quality. Therefore, finding the error variations of the satellite orbits is meaningful. In this paper, the authors investigate the accuracy of the Geostationary Earth Orbit (GEO) with simplified perturbations models. The coordinate residuals of the simplified perturbations models in this paper can serve as references for engineers predicting satellite orbits with TLEs.
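
Once a satellite's Earth-fixed position has been propagated from the TLE (e.g. with SGP4 plus an Earth-rotation transform), the antenna-pointing angles follow by rotating the station-to-satellite vector into local east-north-up axes. A minimal sketch of that last step, with a spherical-Earth station position used purely for illustration:

```python
import math

def topocentric_az_el(sat_ecef, site_ecef, lat_deg, lon_deg):
    """Azimuth/elevation (degrees) of a satellite seen from a ground station.
    Positions are Earth-fixed (ECEF) vectors in km; lat/lon are in degrees."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    dx, dy, dz = (s - g for s, g in zip(sat_ecef, site_ecef))
    # Rotate the range vector into local east-north-up components.
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy
          + math.sin(lat) * dz)
    rng = math.sqrt(east * east + north * north + up * up)
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    elevation = math.degrees(math.asin(up / rng))
    return azimuth, elevation

# A GEO satellite directly over a station at 0 deg N, 0 deg E (spherical Earth).
az, el = topocentric_az_el([42164.0, 0.0, 0.0], [6378.0, 0.0, 0.0], 0.0, 0.0)
```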

  18. Hazard identification and risk assessment procedure for genetically modified plants in the field--GMHAZID.

    PubMed

    Koivisto, Raija A; Törmäkangas, Kirsi M; Kauppinen, Veli S

    2002-01-01

    The safe application of genetically modified organisms (GMOs) requires a risk assessment prior to their proposed use. Based on methods from the chemical industry, we developed a hazard identification procedure for the risk assessment of field tests with genetically modified plants. This risk assessment method, GMHAZID, is carried out in the form of guided brainstorm sessions. GMHAZID was tested with a case study for which a risk assessment had previously been made, and the results of the assessments were compared. The results showed that some new hazards potentially leading to uncontrolled spreading, in addition to those from the previous assessment, were identified using GMHAZID. GMHAZID also recognised some hazards leading to failures in the field experiments. We suggest that GMHAZID brings systematics, reliability, and transparency to the risk assessment procedure.

  19. Assessing the accuracy and repeatability of automated photogrammetrically generated digital surface models from unmanned aerial system imagery

    NASA Astrophysics Data System (ADS)

    Chavis, Christopher

    Using commercial digital cameras in conjunction with Unmanned Aerial Systems (UAS) to generate 3-D Digital Surface Models (DSMs) and orthomosaics is emerging as a cost-effective alternative to Light Detection and Ranging (LiDAR). Powerful software applications such as Pix4D and APS can automate the generation of DSM and orthomosaic products from a handful of inputs. However, the accuracy of these models is relatively untested. The objectives of this study were to generate multiple DSM and orthomosaic pairs of the same area using Pix4D and APS from flights of imagery collected with a lightweight UAS. The accuracy of each individual DSM was assessed in addition to the consistency of the method to model one location over a period of time. Finally, this study determined if the DSMs automatically generated using lightweight UAS and commercial digital cameras could be used for detecting changes in elevation and at what scale. Accuracy was determined by comparing DSMs to a series of reference points collected with survey grade GPS. Other GPS points were also used as control points to georeference the products within Pix4D and APS. The effectiveness of the products for change detection was assessed through image differencing and observance of artificially induced, known elevation changes. The vertical accuracy with the optimal data and model is ≈ 25 cm and the highest consistency over repeat flights is a standard deviation of ≈ 5 cm. Elevation change detection based on such UAS imagery and DSM models should be viable for detecting infrastructure change in urban or suburban environments with little dense canopy vegetation.
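
Vertical accuracy against survey-grade GPS checkpoints is typically reported as the RMSE of the elevation differences, scaled by 1.96 for an NSSDA-style statement at the 95% confidence level. A minimal sketch with invented checkpoint residuals (not the study's data):

```python
import math

def vertical_accuracy(dsm_z, gps_z):
    """RMSE of DSM-minus-checkpoint elevations and the NSSDA-style
    vertical accuracy at 95% confidence (1.9600 * RMSE)."""
    errors = [d - g for d, g in zip(dsm_z, gps_z)]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return rmse, 1.9600 * rmse

# Invented elevations (m) at five checkpoints.
dsm = [101.20, 99.70, 102.25, 100.80, 98.30]
gps = [101.00, 100.00, 102.00, 101.00, 98.00]
rmse, acc95 = vertical_accuracy(dsm, gps)  # rmse ~0.25 m
```

Repeat-flight consistency can be summarized the same way, using the standard deviation of per-checkpoint elevations across flights instead of differences against GPS.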

  20. Assessment of accuracy of adopted centre of mass corrections for the Etalon geodetic satellites

    NASA Astrophysics Data System (ADS)

    Appleby, Graham; Dunn, Peter; Otsubo, Toshimichi; Rodriguez, Jose

    2016-04-01

    Accurate centre-of-mass corrections are key parameters in the analysis of satellite laser ranging observations. In order to meet current accuracy requirements, the vector from the reflection point of a laser retroreflector array to the centre of mass of the orbiting spacecraft must be known with mm-level accuracy. In general, the centre-of-mass correction will be dependent on the characteristics of the target (geometry, construction materials, type of retroreflectors), the hardware employed by the tracking station (laser system, detector type), the intensity of the returned laser pulses, and the post-processing strategy employed to reduce the observations [1]. For the geodetic targets used by the ILRS to produce the SLR contribution to the ITRF, the LAGEOS and Etalon satellite pairs, there are centre-of-mass correction tables available for each tracking station [2]. These values are based on theoretical considerations, empirical determination of the optical response functions of each satellite, and knowledge of the tracking technology and return intensity employed [1]. Here we present results that put into question the accuracy of some of the current values for the centre-of-mass corrections of the Etalon satellites. We have computed weekly reference frame solutions using LAGEOS and Etalon observations for the period 1996-2014, estimating range bias parameters for each satellite type along with station coordinates. Analysis of the range bias time series reveals an unexplained, cm-level positive bias for the Etalon satellites in the case of most stations operating at high energy return levels. The time series of tracking stations that have undergone a transition from different modes of operation provide the evidence pointing to an inadequate centre-of-mass modelling. [1] Otsubo, T., and G.M. Appleby, System-dependent centre-of-mass correction for spherical geodetic satellites, J Geophys. Res., 108(B4), 2201, 2003 [2] Appleby, G.M., and T. Otsubo, Centre of Mass

  1. Accuracy assessment of photogrammetric digital elevation models generated for the Schultz Fire burn area

    NASA Astrophysics Data System (ADS)

    Muise, Danna K.

    This paper evaluates the accuracy of two digital photogrammetric software programs (ERDAS Imagine LPS and PCI Geomatica OrthoEngine) with respect to high-resolution terrain modeling in a complex topographic setting affected by fire and flooding. The site investigated is the 2010 Schultz Fire burn area, situated on the eastern edge of the San Francisco Peaks approximately 10 km northeast of Flagstaff, Arizona. Here, the fire coupled with monsoon rains typical of northern Arizona drastically altered the terrain of the steep mountainous slopes and residential areas below the burn area. To quantify these changes, high resolution (1 m and 3 m) digital elevation models (DEMs) were generated of the burn area using color stereoscopic aerial photographs taken at a scale of approximately 1:12000. Using a combination of pre-marked and post-marked ground control points (GCPs), I first used ERDAS Imagine LPS to generate a 3 m DEM covering 8365 ha of the affected area. This data was then compared to a reference DEM (USGS 10 m) to evaluate the accuracy of the resultant DEM. Findings were then divided into blunders (errors) and bias (slight differences) and further analyzed to determine if different factors (elevation, slope, aspect and burn severity) affected the accuracy of the DEM. Results indicated that both blunders and bias increased with an increase in slope, elevation and burn severity. It was also found that southern facing slopes contained the highest amount of bias while northern facing slopes contained the highest proportion of blunders. Further investigations compared a 1 m DEM generated using ERDAS Imagine LPS with a 1 m DEM generated using PCI Geomatica OrthoEngine for a specific region of the burn area. This area was limited to the overlap of two images due to OrthoEngine requiring at least three GCPs to be located in the overlap of the imagery. Results indicated that although LPS produced a less accurate DEM, it was much more flexible than OrthoEngine. 
It was also

  2. A SUB-PIXEL ACCURACY ASSESSMENT FRAMEWORK FOR DETERMINING LANDSAT TM DERIVED IMPERVIOUS SURFACE ESTIMATES.

    EPA Science Inventory

    The amount of impervious surface in a watershed is a landscape indicator integrating a number of concurrent interactions that influence a watershed's hydrology. Remote sensing data and techniques are viable tools to assess anthropogenic impervious surfaces. However a fundamental ...

  3. Assessment of surgical wounds in the home health patient: definitions and accuracy with OASIS-C.

    PubMed

    Trexler, Rhonda A

    2011-10-01

    The number of surgical patients receiving home care continues to grow as hospitals discharge patients sooner. Home health clinicians must gain knowledge of the wound healing stages and surgical wound classification to collect accurate data in the Outcome and Assessment Information Set-C (OASIS-C). This article provides the information clinicians need to accurately assess surgical wounds and implement best practices for improving surgical wounds in the home health patient.

  4. Florida Statewide Assessment Program 1971-72 Technical Report; Section 1: Introduction, Procedures, and Program Recommendations.

    ERIC Educational Resources Information Center

    Haynes, Judy L.; Impara, James C.

    The first section of a four-part technical report of Florida's statewide program for assessing reading-related skills in grades 2 and 4 provides an introduction to the program, a description of procedures used, and recommendations regarding program operation. Program background, design, and responsibility for assessment activities are discussed in…

  5. The Implicit Relational Assessment Procedure as a Measure of Self-Esteem

    ERIC Educational Resources Information Center

    Timko, C. Alix; England, Erica L.; Herbert, James D.; Forman, Evan M.

    2010-01-01

    Two studies were conducted to pilot the Implicit Relational Assessment Procedure (IRAP) in measuring attitudes toward the self: one related to body image specifically and another assessing the broader construct of self-esteem. Study 1 utilized the IRAP with female college students to examine self-referential beliefs regarding body image. Results…

  6. The Importance of Assessment Procedures to Student Learning Outcomes in Religious Education.

    ERIC Educational Resources Information Center

    Cox, Philip; Godfrey, John R.

    In Perth, Western Australia, summative assessment has not been a teaching tool in the teaching of religious education courses in the Catholic schools. This study investigated whether the use of formal assessment procedures in the teaching of religion had an effect on student learning outcomes. Subjects were 128 students (4 classes) in year 8 of an…

  7. Needs Assessment Procedure: Mainstreaming Handicapped. Volume II. A Manual for Vocational Education Administrators. Final Report.

    ERIC Educational Resources Information Center

    Hughes, James H.; Rice, Eric

    Intended to assist local vocational education administrators in needs assessment and planning procedures for mainstreaming handicapped students, this manual presents a five-step process: (1) needs assessment and barrier identification (includes instruction in the nominal group technique process); (2), goal, objective, and strategy development (a…

  8. Considering Consistency: Conceptual and Procedural Guidance for Reliability in a Local Assessment System.

    ERIC Educational Resources Information Center

    Maine Department of Education, 2004

    2004-01-01

    The purpose of "Considering Consistency" is to provide conceptual and procedural guidance regarding reliability within the context of a local assessment system. Some of this document's contents echo "Measured Measures." For instance, methods for estimating the reliability of individual assessments, particularly as appropriate for…

  9. Assessment of the labelling accuracy of spanish semipreserved anchovies products by FINS (forensically informative nucleotide sequencing).

    PubMed

    Velasco, Amaya; Aldrey, Anxela; Pérez-Martín, Ricardo I; Sotelo, Carmen G

    2016-06-01

    Anchovies have been traditionally captured and processed for human consumption for millennia. In Spain, ripened and salted anchovies are a delicacy which, in some cases, can reach high commercial value. Although a number of studies have presented DNA methodologies for the identification of anchovies, this is one of the first studies investigating the level of mislabelling of this kind of product in Europe. Sixty-three commercial semipreserved anchovy products were collected in different types of food markets in four Spanish cities to check labelling accuracy. Species determination in these commercial products was performed by sequencing two different cyt-b mitochondrial DNA fragments. Results revealed mislabelling levels higher than 15%, which the authors consider relatively high given the importance of the product. The most frequent substitute species was the Argentine anchovy, Engraulis anchoita, which can be interpreted as economic fraud.

  10. Assessment of Required Accuracy of Digital Elevation Data for Hydrologic Modeling

    NASA Technical Reports Server (NTRS)

    Kenward, T.; Lettenmaier, D. P.

    1997-01-01

    The effect of the vertical accuracy of Digital Elevation Models (DEMs) on hydrologic models is evaluated by comparing three DEMs and the resulting hydrologic model predictions applied to a 7.2 sq km USDA-ARS watershed at Mahantango Creek, PA. The high resolution (5 m) DEM was resampled to a 30 m resolution using a method that constrained the spatial structure of the elevations to be comparable with the USGS and SIR-C DEMs. The resulting 30 m DEM was used as the reference product for subsequent comparisons. Spatial fields of directly derived quantities, such as elevation differences, slope, and contributing area, were compared to the reference product, as were hydrologic model output fields derived using each of the three DEMs at the common 30 m spatial resolution.
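
The slope fields compared in this record are simple derivatives of the gridded elevations. As a rough illustration (not the authors' code), slope can be computed from a DEM with central differences; the 30 m cell size below matches the resampled product:

```python
import numpy as np

def slope_degrees(dem, cell_size):
    """Slope magnitude (degrees) of a gridded DEM via central differences."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)  # elevation gradients per axis
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# A plane rising 30 m per 30 m cell has a 45-degree slope everywhere.
dem = np.tile(np.arange(5, dtype=float) * 30.0, (5, 1))
print(slope_degrees(dem, 30.0)[2, 2])  # approximately 45.0
```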

  11. On the Accuracy of the Propagation Theory and the Quality of Background Observations in a Schumann Resonance Inversion Procedure (Mushtak and Williams, Parsons Laboratory, MIT)

    NASA Astrophysics Data System (ADS)

    Mushtak, V. C.

    2009-12-01

    Observations of electromagnetic fields in the Schumann resonance (SR) frequency range (5 to 40 Hz) contain information about both the major source of the electromagnetic radiation (repeatedly confirmed to be global lightning activity) and the source-to-observer propagation medium (the Earth-ionosphere waveguide). While the electromagnetic signatures from individual lightning discharges provide preferable experimental material for exploring the medium, the properties of the world-wide lightning process are best reflected in background spectral SR observations. In the latter, electromagnetic contributions from thousands of lightning discharges are accumulated in intervals of about 10-15 minutes - long enough to present a statistically significant (and so theoretically treatable) ensemble of individual flashes, and short enough to reflect the spatial-temporal dynamics of global lightning activity. Thanks to the small (well below 1 dB/Mm) attenuation in the SR range and the accumulated nature of background SR observations, the latter present globally integrated information about lightning activity not available via other (satellite, meteorological) techniques. The most interesting characteristics to be extracted in an inversion procedure are the rates of vertical charge moment change (and their temporal variations) in the major global lightning “chimneys”. The success of such a procedure depends critically on the accuracy of the propagation theory (used to carry out “direct” calculations for the inversion) and the quality of experimental material. Due to the nature of the problem, both factors - the accuracy and the quality - can only be estimated indirectly, which requires specific approaches to assure that the estimates are realistic and more importantly, that the factors could be improved. For the first factor, simulations show that the widely exploited theory of propagation in a uniform (spherically symmetrical) waveguide provides unacceptable (up to

  12. Prostate Localization on Daily Cone-Beam Computed Tomography Images: Accuracy Assessment of Similarity Metrics

    SciTech Connect

    Kim, Jinkoo; Hammoud, Rabih; Pradhan, Deepak; Zhong Hualiang; Jin, Ryan Y.; Movsas, Benjamin; Chetty, Indrin J.

    2010-07-15

    Purpose: To evaluate different similarity metrics (SM) using natural calcifications and observation-based measures to determine the most accurate prostate and seminal vesicle localization on daily cone-beam CT (CBCT) images. Methods and Materials: CBCT images of 29 patients were retrospectively analyzed; 14 patients with prostate calcifications (calcification data set) and 15 patients without calcifications (no-calcification data set). Three groups of test registrations were performed. Test 1: 70 CT/CBCT pairs from the calcification data set were registered using 17 SMs (6,580 registrations) and compared using the calcification mismatch error as an endpoint. Test 2: Using the four best SMs from Test 1, 75 CT/CBCT pairs in the no-calcification data set were registered (300 registrations). Accuracy of contour overlays was ranked visually. Test 3: For the best SM from Tests 1 and 2, accuracy was estimated using 356 CT/CBCT registrations. Additionally, target expansion margins were investigated for generating registration regions of interest. Results: Test 1: Incremental sign correlation (ISC), gradient correlation (GC), gradient difference (GD), and normalized cross correlation (NCC) showed the smallest errors (μ ± σ: 1.6 ± 0.9 to 2.9 ± 2.1 mm). Test 2: Two of the three reviewers ranked GC higher. Test 3: Using GC, 96% of registrations showed <3 mm error when calcifications were filtered. Errors were left/right: 0.1 ± 0.5 mm, anterior/posterior: 0.8 ± 1.0 mm, and superior/inferior: 0.5 ± 1.1 mm. The existence of calcifications increased the success rate to 97%. Expansion margins of 4-10 mm were equally successful. Conclusion: Gradient-based SMs were most accurate. Estimated error was found to be <3 mm (1.1 mm SD) in 96% of the registrations. Results suggest that the contour expansion margin should be no less than 4 mm.
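
Normalized cross correlation, one of the four best-performing metrics in Test 1, reduces to a short formula. A minimal sketch (illustrative only; the study's registration pipeline is not reproduced here):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation between two same-shape image arrays."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)

# NCC is invariant to brightness offset and contrast scaling, which is
# why it is popular for CT/CBCT intensity registration.
img = np.random.default_rng(0).random((32, 32))
print(round(ncc(img, 2.0 * img + 5.0), 6))  # -> 1.0
```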

  13. Accuracy of task recall for epidemiological exposure assessment to construction noise

    PubMed Central

    Reeb-Whitaker, C; Seixas, N; Sheppard, L; Neitzel, R

    2004-01-01

    Aims: To validate the accuracy of construction worker recall of task and environment based information; and to evaluate the effect of task recall on estimates of noise exposure. Methods: A cohort of 25 construction workers recorded tasks daily and had dosimetry measurements weekly for six weeks. Worker recall of tasks reported on the daily activity cards was validated with research observations and compared directly to task recall at a six month interview. Results: The mean LEQ noise exposure level (dBA) from dosimeter measurements was 89.9 (n = 61) and 83.3 (n = 47) for carpenters and electricians, respectively. The percentage time at tasks reported during the interview was compared to that calculated from daily activity cards; only 2/22 tasks were different at the nominal 5% significance level. The accuracy, based on bias and precision, of percentage time reported for tasks from the interview was 53–100% (median 91%). For carpenters, the difference in noise estimates derived from activity cards (mean 91.9 dBA) was not different from those derived from the questionnaire (mean 91.7 dBA). This trend held for electricians as well. For all subjects, noise estimates derived from the activity card and the questionnaire were strongly correlated with dosimetry measurements. The average difference between the noise estimate derived from the questionnaire and dosimetry measurements was 2.0 dBA, and was independent of the actual exposure level. Conclusions: Six months after tasks were performed, construction workers were able to accurately recall the percentage time they spent at various tasks. Estimates of noise exposure based on long term recall (questionnaire) were no different from estimates derived from daily activity cards and were strongly correlated with dosimetry measurements, overestimating the level on average by 2.0 dBA. PMID:14739379
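
Dosimetry-based LEQ values like those above combine on an energy basis, not arithmetically, because decibels are logarithmic. A small sketch of time-weighted dBA averaging (the 4-hour durations are hypothetical, not from the study):

```python
import math

def combine_leq(levels_dba, durations):
    """Time-weighted equivalent continuous level from per-task dBA values:
    convert to relative energy, time-average, convert back to dB."""
    total = sum(durations)
    energy = sum(t * 10 ** (level / 10) for level, t in zip(levels_dba, durations))
    return 10 * math.log10(energy / total)

# Half a shift at the carpenters' mean (89.9 dBA) and half at the
# electricians' mean (83.3 dBA): the louder task dominates.
print(round(combine_leq([89.9, 83.3], [4, 4]), 1))  # -> 87.7
```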

  14. Assessment of the relationship between lesion segmentation accuracy and computer-aided diagnosis scheme performance

    NASA Astrophysics Data System (ADS)

    Zheng, Bin; Pu, Jiantao; Park, Sang Cheol; Zuley, Margarita; Gur, David

    2008-03-01

    In this study we randomly selected 250 malignant and 250 benign mass regions as a training dataset. The boundary contours of these regions were manually identified and marked. Twelve image features were computed for each region. An artificial neural network (ANN) was trained as a classifier. To select a specific testing dataset, we applied a topographic multi-layer region growth algorithm to detect boundary contours of 1,903 mass regions in an initial pool of testing regions. All processed regions were sorted based on a size difference ratio between manual and automated segmentation. We selected a testing dataset involving 250 malignant and 250 benign mass regions with larger size difference ratios. Using the area under the ROC curve (AZ value) as a performance index, we investigated the relationship between the accuracy of mass segmentation and the performance of a computer-aided diagnosis (CAD) scheme. CAD performance degrades as the size difference ratio increases. We then developed and tested a hybrid region growth algorithm that combined topographic region growth with an active contour approach. In this hybrid algorithm, the boundary contour detected by topographic region growth is used as the initial contour of the active contour algorithm. The algorithm iteratively searches for the optimal region boundaries. A CAD likelihood score of the growth region being a true-positive mass is computed in each iteration. The region growth is automatically terminated once the first maximum CAD score is reached. This hybrid region growth algorithm reduces the size difference ratios between the two areas segmented automatically and manually to less than ±15% for all testing regions, and the testing AZ value increases from 0.63 to 0.90. The results indicate that CAD performance depends heavily on the accuracy of mass segmentation. In order to achieve robust CAD performance, reducing lesion segmentation error is important.
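
The abstract does not define the size difference ratio explicitly; a plausible signed relative-difference definition, offered purely as an assumption, is:

```python
def size_difference_ratio(area_auto, area_manual):
    """Assumed definition: signed relative difference between the
    automatically and manually segmented region areas."""
    return (area_auto - area_manual) / area_manual

# A region segmented at 1150 px automatically vs 1000 px manually
# would sit exactly at the +15% boundary reported for the hybrid method.
print(size_difference_ratio(1150.0, 1000.0))  # -> 0.15
```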

  15. Qualitative and quantitative procedures for health risk assessment.

    PubMed

    Lohman, P H

    1999-07-16

    Numerous reactive mutagenic electrophiles are present in the environment or are formed in the human body through metabolizing processes. These electrophiles can directly react with DNA and are considered to be ultimate carcinogens. In the past decades more than 200 in vitro and in vivo genotoxic tests have been described to identify, monitor, and characterize the exposure of humans to such agents. When the responses of such genotoxic tests are quantified by a weight-of-evidence analysis, it is found that the intrinsic potency of electrophiles as mutagens does not differ much across the majority of the agents studied. Considering that under normal environmental circumstances humans are exposed to low concentrations of about a million electrophiles, the relation between exposure to such agents and adverse health effects (e.g., cancer) becomes a 'Pandora's box'. For quantitative risk assessment it is necessary not only to detect whether an agent is genotoxic, but also to understand the mechanism of its interaction with the DNA in target cells. Examples are given for a limited group of important environmental and carcinogenic agents for which such an approach is feasible. The groups identified are agents that form cross-links with DNA or mono-alkylating agents that react with base moieties in the DNA strands. Quantitative hazard ranking of the mutagenic potency of these groups of chemicals can be performed, and there is ample evidence that such a ranking corresponds with the individual carcinogenic potency of those agents in rodents. Still, in practice, with the exception of certain occupational or accidental exposure situations, these approaches have not been successful in preventing cancer death in the human population. However, this is not only due to the described 'Pandora's box' situation. At least three other factors are described. Firstly, in the industrial world the medical treatment of cancer in patients

  16. Clinical skills assessment of procedural and advanced communication skills: performance expectations of residency program directors

    PubMed Central

    Langenau, Erik E.; Zhang, Xiuyuan; Roberts, William L.; DeChamplain, Andre F.; Boulet, John R.

    2012-01-01

    Background High stakes medical licensing programs are planning to augment and adapt current examinations to be relevant for a two-decision point model for licensure: entry into supervised practice and entry into unsupervised practice. Therefore, identifying which skills should be assessed at each decision point is critical for informing examination development, and gathering input from residency program directors is important. Methods Using data from previously developed surveys and expert panels, a web-delivered survey was distributed to 3,443 residency program directors. For each of the 28 procedural and 18 advanced communication skills, program directors were asked which clinical skills should be assessed, by whom, when, and how. Descriptive statistics were collected, and Intraclass Correlations (ICC) were conducted to determine consistency across different specialties. Results Among 347 respondents, program directors reported that all advanced communication and some procedural tasks are important to assess. The following procedures were considered ‘important’ or ‘extremely important’ to assess: sterile technique (93.8%), advanced cardiovascular life support (ACLS) (91.1%), basic life support (BLS) (90.0%), interpretation of electrocardiogram (89.4%) and blood gas (88.7%). Program directors reported that most clinical skills should be assessed at the end of the first year of residency (or later) and not before graduation from medical school. A minority were considered important to assess prior to the start of residency training: demonstration of respectfulness (64%), sterile technique (67.2%), BLS (68.9%), ACLS (65.9%) and phlebotomy (63.5%). Discussion Results from this study support that assessing procedural skills such as cardiac resuscitation, sterile technique, and phlebotomy would be amenable to assessment at the end of medical school, but most procedural and advanced communications skills would be amenable to assessment at the end of the first

  17. Assessment of Geometrical Accuracy of Multimodal Images Used for Treatment Planning in Stereotactic Radiotherapy and Radiosurgery: CT, MRI and PET

    SciTech Connect

    Garcia-Garduno, O. A.; Larraga-Gutierrez, J. M.; Celis, M. A.; Suarez-Campos, J. J.; Rodriguez-Villafuerte, M.; Martinez-Davalos, A.

    2006-09-08

    An acrylic phantom was designed and constructed to assess the geometrical accuracy of CT, MRI and PET images for stereotactic radiotherapy (SRT) and radiosurgery (SRS) applications. The phantom was suited for each image modality with a specific tracer and compared with CT images to measure the radial deviation between the reference marks in the phantom. It was found that for MRI the maximum mean deviation is 1.9 ± 0.2 mm compared to 2.4 ± 0.3 mm reported for PET. These results will be used for margin outlining in SRS and SRT treatment planning.

  18. Assessing the Accuracy of the Tracer Dilution Method with Atmospheric Dispersion Modeling

    NASA Astrophysics Data System (ADS)

    Taylor, D.; Delkash, M.; Chow, F. K.; Imhoff, P. T.

    2015-12-01

    Landfill methane emissions are difficult to estimate due to limited observations and data uncertainty. The mobile tracer dilution method is a widely used and cost-effective approach for predicting landfill methane emissions. The method uses a tracer gas released on the surface of the landfill and measures the concentrations of both methane and the tracer gas downwind. Mobile measurements are conducted with a gas analyzer mounted on a vehicle to capture transects of both gas plumes. The idea behind the method is that if the measurements are performed far enough downwind, the methane plume from the large area source of the landfill and the tracer plume from a small number of point sources will be sufficiently well mixed to behave similarly, and the ratio between the concentrations will be a good estimate of the ratio between the two emissions rates. The mobile tracer dilution method is sensitive to different factors of the setup, such as placement of the tracer release locations and distance from the landfill to the downwind measurements, which have not been thoroughly examined. In this study, numerical modeling is used as an alternative to field measurements to study the sensitivity of the tracer dilution method and provide estimates of measurement accuracy. Using topography and wind conditions for an actual landfill, a landfill emissions rate is prescribed in the model and compared against the emissions rate predicted by application of the tracer dilution method. Two different methane emissions scenarios are simulated: homogeneous emissions over the entire surface of the landfill, and heterogeneous emissions with a hot spot containing 80% of the total emissions where the daily cover area is located. Factors tested include the number of tracers, the distance between tracers, and the distance from the landfill to the transect. Numerical modeling of the tracer dilution method is a useful tool for evaluating the method without the expense and labor commitment of multiple field campaigns.
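
The core of the tracer dilution method is the ratio of plume-integrated concentrations scaled by the known tracer release rate. A hedged sketch, assuming a uniformly spaced transect, background-corrected mixing ratios, and acetylene (molar mass 26.04 g/mol) as the tracer; none of these specifics come from the abstract:

```python
import numpy as np

def ch4_emission_rate(c_ch4, c_tracer, q_tracer_kg_h,
                      mw_ch4=16.04, mw_tracer=26.04):
    """Tracer dilution estimate: plume-integrated methane-to-tracer
    mixing-ratio ratio, scaled by the known tracer release rate and the
    molar-mass ratio (uniform transect spacing assumed)."""
    ratio = c_ch4.sum() / c_tracer.sum()
    return q_tracer_kg_h * ratio * (mw_ch4 / mw_tracer)

# Identical, fully mixed plume shapes: the CH4 rate equals the tracer
# release rate times the CH4/C2H2 molar-mass ratio.
x = np.linspace(-3.0, 3.0, 61)
plume = np.exp(-x**2)                    # idealized Gaussian transect
print(round(ch4_emission_rate(plume, plume, 1.0), 3))  # -> 0.616
```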

  19. Estimating orientation using magnetic and inertial sensors and different sensor fusion approaches: accuracy assessment in manual and locomotion tasks.

    PubMed

    Bergamini, Elena; Ligorio, Gabriele; Summa, Aurora; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria

    2014-10-09

    Magnetic and inertial measurement units are an emerging technology to obtain 3D orientation of body segments in human movement analysis. In this respect, sensor fusion is used to limit the drift errors resulting from the gyroscope data integration by exploiting accelerometer and magnetic aiding sensors. The present study aims at investigating the effectiveness of sensor fusion methods under different experimental conditions. Manual and locomotion tasks, differing in time duration, measurement volume, presence/absence of static phases, and out-of-plane movements, were performed by six subjects, and recorded by one unit located on the forearm or the lower trunk, respectively. Two sensor fusion methods, representative of the stochastic (Extended Kalman Filter) and complementary (Non-linear observer) filtering, were selected, and their accuracy was assessed in terms of attitude (pitch and roll angles) and heading (yaw angle) errors using stereophotogrammetric data as a reference. The sensor fusion approaches provided significantly more accurate results than gyroscope data integration. Accuracy improved mostly for heading and when the movement exhibited stationary phases, evenly distributed 3D rotations, it occurred in a small volume, and its duration was greater than approximately 20 s. These results were independent from the specific sensor fusion method used. Practice guidelines for improving the outcome accuracy are provided.
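
The filters compared in this study (Extended Kalman Filter and non-linear observer) are more elaborate than can be shown briefly, but the complementary idea, blending drifting gyro integration with a drift-free yet noisy aiding measurement, can be sketched for a single pitch angle. This is illustrative only, not the study's filters:

```python
def complementary_pitch(gyro_rates, accel_pitches, dt, alpha=0.98, pitch0=0.0):
    """First-order complementary filter: trust the integrated gyro rate at
    short time scales and the noisy but drift-free accelerometer-derived
    pitch at long time scales."""
    pitch, history = pitch0, []
    for g, a in zip(gyro_rates, accel_pitches):
        pitch = alpha * (pitch + g * dt) + (1.0 - alpha) * a
        history.append(pitch)
    return history

# A gyro with a constant 0.01 rad/s bias during a stationary phase: pure
# integration would drift to 0.2 rad after 20 s, but the aiding sensor
# keeps the estimate bounded near zero.
est = complementary_pitch([0.01] * 2000, [0.0] * 2000, dt=0.01)
print(abs(est[-1]) < 0.01)  # -> True
```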

  20. Assessment of the accuracy of plasma shape reconstruction by the Cauchy condition surface method in JT-60SA

    SciTech Connect

    Miyata, Y.; Suzuki, T.; Takechi, M.; Urano, H.; Ide, S.

    2015-07-15

    For the purpose of stable plasma equilibrium control and detailed analysis, it is essential to reconstruct an accurate plasma boundary on the poloidal cross section in tokamak devices. The Cauchy condition surface (CCS) method is a numerical approach for calculating the spatial distribution of the magnetic flux outside a hypothetical surface and reconstructing the plasma boundary from the magnetic measurements located outside the plasma. The accuracy of the plasma shape reconstruction has been assessed by comparing the CCS method and an equilibrium calculation in JT-60SA with a high elongation and triangularity of plasma shape. The CCS, on which both Dirichlet and Neumann conditions are unknown, is defined as a hypothetical surface located inside the real plasma region. The accuracy of the plasma shape reconstruction is sensitive to the CCS free parameters such as the number of unknown parameters and the shape in JT-60SA. It is found that the optimum number of unknown parameters and the size of the CCS that minimizes errors in the reconstructed plasma shape are in proportion to the plasma size. Furthermore, it is shown that the accuracy of the plasma shape reconstruction is greatly improved using the optimum number of unknown parameters and shape of the CCS, and the reachable reconstruction errors in plasma shape and locations of strike points are within the target ranges in JT-60SA.

  1. Estimating Orientation Using Magnetic and Inertial Sensors and Different Sensor Fusion Approaches: Accuracy Assessment in Manual and Locomotion Tasks

    PubMed Central

    Bergamini, Elena; Ligorio, Gabriele; Summa, Aurora; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria

    2014-01-01

    Magnetic and inertial measurement units are an emerging technology to obtain 3D orientation of body segments in human movement analysis. In this respect, sensor fusion is used to limit the drift errors resulting from the gyroscope data integration by exploiting accelerometer and magnetic aiding sensors. The present study aims at investigating the effectiveness of sensor fusion methods under different experimental conditions. Manual and locomotion tasks, differing in time duration, measurement volume, presence/absence of static phases, and out-of-plane movements, were performed by six subjects, and recorded by one unit located on the forearm or the lower trunk, respectively. Two sensor fusion methods, representative of the stochastic (Extended Kalman Filter) and complementary (Non-linear observer) filtering, were selected, and their accuracy was assessed in terms of attitude (pitch and roll angles) and heading (yaw angle) errors using stereophotogrammetric data as a reference. The sensor fusion approaches provided significantly more accurate results than gyroscope data integration. Accuracy improved mostly for heading and when the movement exhibited stationary phases, evenly distributed 3D rotations, it occurred in a small volume, and its duration was greater than approximately 20 s. These results were independent from the specific sensor fusion method used. Practice guidelines for improving the outcome accuracy are provided. PMID:25302810

  2. Assessing the accuracy of auralizations computed using a hybrid geometrical-acoustics and wave-acoustics method

    NASA Astrophysics Data System (ADS)

    Summers, Jason E.; Takahashi, Kengo; Shimizu, Yasushi; Yamakawa, Takashi

    2001-05-01

    When based on geometrical acoustics, computational models used for auralization of auditorium sound fields are physically inaccurate at low frequencies. To increase accuracy while keeping computation tractable, hybrid methods using computational wave acoustics at low frequencies have been proposed and implemented in small enclosures such as simplified models of car cabins [Granier et al., J. Audio Eng. Soc. 44, 835-849 (1996)]. The present work extends such an approach to an actual 2400 m³ auditorium using the boundary-element method for frequencies below 100 Hz. The effect of including wave acoustics at low frequencies is assessed by comparing the predictions of the hybrid model with those of the geometrical-acoustics model and comparing both with measurements. Conventional room-acoustical metrics are used together with new methods based on two-dimensional distance measures applied to time-frequency representations of impulse responses. Despite in situ measurements of boundary impedance, uncertainties in input parameters limit the accuracy of the computed results at low frequencies. However, aural perception ultimately defines the required accuracy of computational models. An algorithmic method for making such evaluations is proposed based on correlating listening-test results with distance measures between time-frequency representations derived from auditory models of the ear-brain system. Preliminary results are presented.

  3. Do Students Know What They Know? Exploring the Accuracy of Students' Self-Assessments

    ERIC Educational Resources Information Center

    Lindsey, Beth A.; Nagel, Megan L.

    2015-01-01

    We have conducted an investigation into how well students in introductory science classes (both physics and chemistry) are able to predict which questions they will or will not be able to answer correctly on an upcoming assessment. An examination of the data at the level of students' overall scores reveals results consistent with the…

  4. Disease severity estimates - effects of rater accuracy and assessments methods for comparing treatments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Assessment of disease is fundamental to the discipline of plant pathology, and estimates of severity are often made visually. However, it is established that visual estimates can be inaccurate and unreliable. In this study estimates of Septoria leaf blotch on leaves of winter wheat from non-treated ...

  5. Exploring Writing Accuracy and Writing Complexity as Predictors of High-Stakes State Assessments

    ERIC Educational Resources Information Center

    Edman, Ellie Whitner

    2012-01-01

    The advent of No Child Left Behind led to increased teacher accountability for student performance and placed strict sanctions in place for failure to meet a certain level of performance each year. With instructional time at a premium, it is imperative that educators have brief academic assessments that accurately predict performance on…

  6. Image intensifier distortion correction for fluoroscopic RSA: the need for independent accuracy assessment.

    PubMed

    Kedgley, Angela E; Fox, Anne-Marie V; Jenkyn, Thomas R

    2012-01-01

    Fluoroscopic images suffer from multiple modes of image distortion. Therefore, the purpose of this study was to compare the effects of correction using a range of two-dimensional polynomials and a global approach. The primary measure of interest was the average error in the distances between four beads of an accuracy phantom, as measured using RSA. Secondary measures of interest were the root mean squared errors of the fit of the chosen polynomial to the grid of beads used for correction, and the errors in the corrected distances between the points of the grid in a second position. Based upon the two-dimensional measures, a polynomial of order three in the axis of correction and two in the perpendicular axis was preferred. However, based upon the RSA reconstruction, a polynomial of order three in the axis of correction and one in the perpendicular axis was preferred. The use of a calibration frame for these three-dimensional applications most likely tempers the effects of distortion. This study suggests that distortion correction should be validated for each of its applications with an independent "gold standard" phantom.
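
The two-dimensional polynomial correction described above can be fit by ordinary least squares over the bead grid. A sketch assuming a polynomial of order three in the correction axis and one in the perpendicular axis (the order the study preferred for RSA reconstruction); the synthetic distortion is invented for illustration:

```python
import numpy as np

def fit_distortion_poly(x, y, target, order_x=3, order_y=1):
    """Least-squares 2D polynomial (order_x in x, order_y in y) mapping
    distorted bead coordinates to their true positions."""
    A = np.column_stack([x**i * y**j
                         for i in range(order_x + 1)
                         for j in range(order_y + 1)])
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return coef, A @ coef

# Synthetic bead grid with a cubic distortion along x: the model family
# contains the true mapping, so residuals sit at machine precision.
gx, gy = np.meshgrid(np.linspace(-1, 1, 10), np.linspace(-1, 1, 10))
x, y = gx.ravel(), gy.ravel()
true_x = x + 0.05 * x**3 + 0.01 * y
coef, fitted = fit_distortion_poly(x, y, true_x)
print(np.abs(fitted - true_x).max() < 1e-10)  # -> True
```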

  7. Accuracy assessment of building point clouds automatically generated from iphone images

    NASA Astrophysics Data System (ADS)

    Sirmacek, B.; Lindenbergh, R.

    2014-06-01

    Low-cost sensor generated 3D models can be useful for quick 3D urban model updating, yet the quality of the models is questionable. In this article, we evaluate the reliability of an automatic point cloud generation method using multi-view iPhone images or an iPhone video file as input. We register such an automatically generated point cloud onto a terrestrial laser scanning (TLS) point cloud of the same object to discuss the accuracy, advantages, and limitations of the iPhone-generated point clouds. For the chosen example showcase, we classified 1.23% of the iPhone point cloud points as outliers and calculated the mean of the point-to-point distances to the TLS point cloud as 0.11 m. Since a TLS point cloud might also include measurement errors and noise, we computed local noise values for the point clouds from both sources. The means (μ) and standard deviations (σ) of the roughness histograms are (μ1 = 0.44 m, σ1 = 0.071 m) and (μ2 = 0.025 m, σ2 = 0.037 m) for the iPhone and TLS point clouds, respectively. Our experimental results indicate possible usage of the proposed automatic 3D model generation framework for 3D urban map updating, fusion and detail enhancing, and quick and real-time change detection purposes. However, further insight is needed into the circumstances required to guarantee successful point cloud generation from smartphone images.
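
The comparison against the TLS reference amounts to a mean nearest-neighbor point-to-point distance with an outlier cut. A brute-force sketch (the authors' outlier criterion is not specified; the threshold parameter here is a placeholder):

```python
import numpy as np

def mean_nn_distance(cloud_a, cloud_b, outlier_threshold=None):
    """Mean nearest-neighbor distance from each point of cloud_a to
    cloud_b (brute force; use a k-d tree for large clouds). Distances
    above outlier_threshold, if given, are dropped as outliers."""
    diffs = cloud_a[:, None, :] - cloud_b[None, :, :]
    nn = np.sqrt((diffs ** 2).sum(axis=2)).min(axis=1)
    if outlier_threshold is not None:
        nn = nn[nn <= outlier_threshold]
    return float(nn.mean())

rng = np.random.default_rng(1)
tls = rng.random((500, 3))                              # reference cloud
phone = tls + rng.normal(scale=0.01, size=tls.shape)    # noisy copy
print(mean_nn_distance(phone, tls) < 0.02)  # -> True
```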

  8. Accuracy Assessments of Cloud Droplet Size Retrievals from Polarized Reflectance Measurements by the Research Scanning Polarimeter

    NASA Technical Reports Server (NTRS)

    Alexandrov, Mikhail Dmitrievic; Cairns, Brian; Emde, Claudia; Ackerman, Andrew S.; vanDiedenhove, Bastiaan

    2012-01-01

    We present an algorithm for the retrieval of cloud droplet size distribution parameters (effective radius and variance) from Research Scanning Polarimeter (RSP) measurements. The RSP is an airborne prototype for the Aerosol Polarimetry Sensor (APS), which was on board the NASA Glory satellite. This instrument measures both polarized and total reflectance in 9 spectral channels with central wavelengths ranging from 410 to 2260 nm. The cloud droplet size retrievals use the polarized reflectance in the scattering angle range between 135° and 165°, where it exhibits the sharply defined structure known as the rainbow or cloud bow. The shape of the rainbow is determined mainly by the single scattering properties of cloud particles. This significantly simplifies both forward modeling and inversions, while also substantially reducing uncertainties caused by aerosol loading and the possible presence of undetected clouds nearby. In this study we present the accuracy evaluation of our algorithm based on the results of sensitivity tests performed using realistic simulated cloud radiation fields.

  9. Assessment of the accuracy of density functional theory for first principles simulations of water

    NASA Astrophysics Data System (ADS)

    Grossman, J. C.; Schwegler, E.; Draeger, E.; Gygi, F.; Galli, G.

    2004-03-01

    We present a series of Car-Parrinello (CP) molecular dynamics simulations carried out in order to better understand the accuracy of density functional theory for the calculation of the properties of water [1]. Through 10 separate ab initio simulations, each for 20 ps of ``production'' time, a number of approximations are tested by varying the density functional employed, the fictitious electron mass, μ, in the CP Lagrangian, the system size, and the ionic mass, M (we considered both H_2O and D_2O). We present the impact of these approximations on properties such as the radial distribution function [g(r)], structure factor [S(k)], diffusion coefficient and dipole moment. Our results show that structural properties may artificially depend on μ, and that with an accurate description of the electronic ground state and in the absence of proton quantum effects, we obtain an oxygen-oxygen correlation function that is over-structured compared to experiment, and a diffusion coefficient approximately 10 times smaller. ^1 J.C. Grossman et al., J. Chem. Phys. (in press, 2004).

  10. Assessing inter-sensor variability and sensible heat flux derivation accuracy for a large aperture scintillometer.

    PubMed

    Rambikur, Evan H; Chávez, José L

    2014-01-01

    The accuracy in determining sensible heat flux (H) of three Kipp and Zonen large aperture scintillometers (LAS) was evaluated with reference to an eddy covariance (EC) system over relatively flat and uniform grassland near Timpas (CO, USA). Other tests have revealed inherent variability between Kipp and Zonen LAS units and bias to overestimate H. Average H fluxes were compared between LAS units and between LAS and EC. Despite good correlation, inter-LAS biases in H were found between 6% and 13% in terms of the linear regression slope. Physical misalignment was observed to result in increased scatter and bias between H solutions of a well-aligned and poorly-aligned LAS unit. Comparison of LAS and EC H showed little bias for one LAS unit, while the other two units overestimated EC H by more than 10%. A detector alignment issue may have caused the inter-LAS variability, supported by the observation in this study of differing power requirements between LAS units. It is possible that the LAS physical misalignment may have caused edge-of-beam signal noise as well as vulnerability to signal noise from wind-induced vibrations, both having an impact on the solution of H. In addition, there were some uncertainties in the solutions of H from the LAS and EC instruments, including lack of energy balance closure with the EC unit. However, the results obtained do not show clear evidence of inherent bias for the Kipp and Zonen LAS to overestimate H as found in other studies.

  11. Accuracy of Cameriere's third molar maturity index in assessing legal adulthood on Serbian population.

    PubMed

    Zelic, Ksenija; Galic, Ivan; Nedeljkovic, Nenad; Jakovljevic, Aleksandar; Milosevic, Olga; Djuric, Marija; Cameriere, Roberto

    2016-02-01

    At the moment, a large number of asylum seekers from the Middle East are passing through Serbia. Most of them do not have identification documents. Also, the past wars in the Balkan region have left many unidentified victims and missing persons. From a legal point of view, it is crucial to determine whether a person is a minor or an adult (≥18 years of age). In recent years, methods based on third molar development have been used for this purpose. The present article aims to verify the third molar maturity index (I3M), based on the correlation between chronological age and normalized measures of the open apices and height of the third mandibular molar. The sample consisted of 598 panoramic radiographs (290 males and 299 females) from 13 to 24 years of age. The cut-off value of I3M=0.08 was used to discriminate between adults and minors. The results demonstrated high sensitivity (0.96, 0.86) and specificity (0.94, 0.98) in males and females, respectively. The proportion of correctly classified individuals was 0.95 in males and 0.91 in females. In conclusion, the suggested value of I3M=0.08 can be used in the Serbian population with high accuracy.
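
    The cut-off classification above (I3M below 0.08 implies legal adulthood) and the reported sensitivity/specificity can be reproduced on data of this shape. The sample records below are invented for illustration, not the study's radiographic measurements:

```python
def classify_adult(i3m, cutoff=0.08):
    """Subjects with I3M below the cutoff are classified as adults (>= 18 y)."""
    return i3m < cutoff

def sensitivity_specificity(records, cutoff=0.08):
    """records: (i3m_value, is_adult) pairs with known chronological age."""
    tp = sum(1 for i3m, adult in records if adult and classify_adult(i3m, cutoff))
    fn = sum(1 for i3m, adult in records if adult and not classify_adult(i3m, cutoff))
    tn = sum(1 for i3m, adult in records if not adult and not classify_adult(i3m, cutoff))
    fp = sum(1 for i3m, adult in records if not adult and classify_adult(i3m, cutoff))
    sens = tp / (tp + fn)          # correctly identified adults
    spec = tn / (tn + fp)          # correctly identified minors
    accuracy = (tp + tn) / len(records)
    return sens, spec, accuracy

# Hypothetical (I3M, is_adult) pairs.
records = [(0.02, True), (0.05, True), (0.50, False), (0.04, False)]
sens, spec, accuracy = sensitivity_specificity(records)
```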

  13. Assessing diagnostic accuracy of Haemoglobin Colour Scale in real-life setting.

    PubMed

    Shah, Pankaj P; Desai, Shrey A; Modi, Dhiren K; Shah, Shobha P

    2014-03-01

    The study was undertaken to determine the diagnostic accuracy of the Haemoglobin Colour Scale (HCS) in the hands of village-based community health workers (CHWs) in a real-life community setting in India. Participants (501 women) were randomly selected from 8 villages belonging to a project area of SEWA-Rural, a voluntary organization located in India. After receiving a brief training, CHWs and a research assistant obtained haemoglobin readings using HCS and HemoCue (reference), respectively. Sensitivity, specificity, positive and negative predictive values, and likelihood ratios were calculated, and a Bland–Altman plot was constructed. Mean haemoglobin values using HCS and HemoCue were 11.02 g/dL (CI 10.9-11.2) and 11.07 g/dL (CI 10.9-11.2), respectively. The mean difference between haemoglobin readings was 0.95 g/dL. The sensitivity of HCS was 0.74 (CI 0.65-0.81) and 0.84 (CI 0.8-0.87), whereas specificity was 0.84 (CI 0.51-0.98) and 0.99 (CI 0.97-0.99), using haemoglobin cutoff limits of 10 g/dL and 7 g/dL respectively. CHWs can accurately diagnose severe and moderately severe anaemia using HCS in real-life field conditions after a brief training.
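
    The agreement analysis above (mean difference between methods, summarized in a Bland–Altman plot) follows this pattern; the paired haemoglobin readings below are made up for illustration:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement for paired readings."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical haemoglobin readings (g/dL) from HCS and HemoCue.
hcs = [10.0, 12.0, 11.0, 9.0]
hemocue = [9.5, 12.5, 11.0, 9.0]
bias, (lo, hi) = bland_altman(hcs, hemocue)
```

    In a full Bland–Altman analysis each difference is also plotted against the pair's mean to check whether the bias depends on the haemoglobin level.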

  14. Assessment of Completeness and Positional Accuracy of Linear Features in Volunteered Geographic Information (vgi)

    NASA Astrophysics Data System (ADS)

    Eshghi, M.; Alesheikh, A. A.

    2015-12-01

    Recent advances in spatial data collection technologies and online services have dramatically increased the contribution of ordinary people to producing, sharing, and using geographic information. The collection of spatial data by citizens, and its dissemination on the internet, has led to a huge source of spatial data termed Volunteered Geographic Information (VGI) by Mike Goodchild. Although VGI has produced previously unavailable data assets and enriched existing ones, its quality can be highly variable and hard to guarantee. This presents several challenges to potential end users who are concerned about the validation and quality assurance of the collected data. Almost all existing research assesses VGI quality either by (a) comparing the VGI data with accurate official data, or (b) where no reference data are available, by seeking an alternative means of determining the quality of the VGI data. In this paper we attempt to develop a useful method toward this goal, analysing the positional accuracy of linear features in the OSM data of Tehran, Iran.

  15. Accuracy Assessment of Lidar-Derived Digital Terrain Model (dtm) with Different Slope and Canopy Cover in Tropical Forest Region

    NASA Astrophysics Data System (ADS)

    Salleh, M. R. M.; Ismail, Z.; Rahman, M. Z. A.

    2015-10-01

    Airborne Light Detection and Ranging (LiDAR) technology has been widely used in recent years, especially for generating high-accuracy Digital Terrain Models (DTMs). High-density, good-quality airborne LiDAR data promise a high-quality DTM. This study focuses on analysing the error associated with the density of vegetation cover (canopy cover) and terrain slope in a LiDAR-derived DTM in a tropical forest environment in Bentong, State of Pahang, Malaysia. The airborne LiDAR data, captured by a Riegl system mounted on an aircraft, can be considered low density. The ground filtering procedure used the adaptive triangulated irregular network (ATIN) algorithm to produce ground points. Ground control points (GCPs) were then used to generate the reference DTM, which was used for slope classification, while the non-ground point clouds were used to determine the relative percentage of canopy cover. The results show that terrain slope is highly correlated with the RMSE of the LiDAR-derived DTM for both study areas (0.993 and 0.870). This is similar to canopy cover, where high correlation values (0.989 and 0.924) were obtained. This indicates that the accuracy of an airborne LiDAR-derived DTM is significantly affected by the terrain slope and canopy cover of the study area.
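
    The error analysis above reduces to an RMSE of DTM elevation errors and its Pearson correlation with slope (or canopy cover). A minimal stdlib sketch, with invented per-class numbers rather than the study's measurements:

```python
import math

def rmse(errors):
    """Root-mean-square of elevation errors (reference DTM minus LiDAR DTM)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative only: per-class slope (degrees) and the DTM RMSE (m) in that class.
slope_classes = [5.0, 15.0, 25.0, 35.0]
rmse_per_class = [0.10, 0.18, 0.27, 0.35]
r = pearson_r(slope_classes, rmse_per_class)
```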

  16. Accuracy assessment of planimetric large-scale map data for decision-making

    NASA Astrophysics Data System (ADS)

    Doskocz, Adam

    2016-06-01

    This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases useful for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four sets of large-scale map data. Errors in the map data were used for a risk assessment of decisions about the localization of objects, e.g. for land-use planning in the realization of investments. An analysis was performed on a large statistical sample of shift vectors of control points, which were identified with the position errors of these points (errors of the map data). Empirical cumulative distribution function models for decision-making risk assessment were established, expressed as polynomial equations fitted to the empirical cumulative distribution functions of the shift vectors of control points. The degree of compatibility of each polynomial with the empirical data was evaluated by the convergence coefficient and by the indicator of the mean relative compatibility of the model. Applying an empirical cumulative distribution function allows estimation of the probability of the occurrence of position errors of points in a database, so the estimated decision-making risk is represented by the probability of the errors of the points stored in the database.
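
    The core idea, estimating decision-making risk as the probability that a point's position error exceeds a tolerance via an empirical cumulative distribution function, can be sketched like this. The paper fits polynomial models to the ECDF; here we simply evaluate the raw step function on invented shift-vector lengths:

```python
def empirical_cdf(sample):
    """Step-function ECDF built from observed shift-vector lengths."""
    xs = sorted(sample)
    n = len(xs)
    def cdf(x):
        # Fraction of observed errors less than or equal to x.
        return sum(1 for v in xs if v <= x) / n
    return cdf

def exceedance_risk(sample, tolerance):
    """Estimated probability that a position error exceeds the tolerance."""
    return 1.0 - empirical_cdf(sample)(tolerance)

# Hypothetical shift-vector lengths (metres) of control points.
shifts = [0.05, 0.10, 0.15, 0.20, 0.40]
risk = exceedance_risk(shifts, tolerance=0.25)
```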

  18. In vitro assessment of the accuracy of extraoral periapical radiography in root length determination

    PubMed Central

    Nazeer, Muhammad Rizwan; Khan, Farhan Raza; Rahman, Munawwar

    2016-01-01

    Objective: To determine the accuracy of extraoral periapical radiography in obtaining root length by comparing it with radiographs obtained using a standard intraoral approach and an extended-distance intraoral approach. Materials and Methods: This was an in vitro comparative study conducted at the dental clinics of Aga Khan University Hospital. ERC exemption was obtained for this work, ref number 3407Sur-ERC-14. We included premolars and molars of a standard phantom head mounted with metal and radiopaque teeth. Radiographs were exposed using three approaches: standard intraoral, extended-length intraoral, and extraoral. Since the unit of analysis was the individual root, we had a total of 24 images. The images were stored in VixWin software, and root lengths were determined using the scale function of the software's built-in measuring tool. Data were analyzed using SPSS version 19.0 and GraphPad software. The Pearson correlation coefficient and Bland–Altman test were applied to determine whether the tooth-length readings obtained from the three approaches were correlated. P = 0.05 was taken as statistically significant. Results: The correlation between standard intraoral and extended intraoral was 0.97; between standard intraoral and extraoral, 0.82; and between extended intraoral and extraoral, 0.76. The Bland–Altman test showed that the average discrepancy between these methods was not large enough to be considered significant. Conclusions: It appears that the extraoral radiographic method can be used for root length determination in subjects for whom intraoral radiography is not possible. PMID:27011737

  19. Accuracy and quality assessment of 454 GS-FLX Titanium pyrosequencing

    PubMed Central

    2011-01-01

    Background The rapid evolution of 454 GS-FLX sequencing technology has not been accompanied by a reassessment of the quality and accuracy of the sequences obtained. Current strategies for decision-making and error-correction are based on an initial analysis of experimental sequences by Huse et al. in 2007 for the older GS20 system. We analyze here the quality of 454 sequencing data and identify factors playing a role in sequencing error, through the use of an extensive dataset for Roche control DNA fragments. Results We obtained a mean error rate for 454 sequences of 1.07%. More importantly, the error rate is not randomly distributed; it occasionally rose to more than 50% in certain positions, and its distribution was linked to several experimental variables. The main factors related to error are the presence of homopolymers, position in the sequence, size of the sequence and spatial localization in PT plates for insertion and deletion errors. These factors can be described by considering seven variables. No single variable can account for the error rate distribution, but most of the variation is explained by the combination of all seven variables. Conclusions The pattern identified here calls for the use of internal controls and error-correcting base callers, to correct for errors, when available (e.g. when sequencing amplicons). For shotgun libraries, the use of both sequencing primers and deep coverage, combined with the use of random sequencing primer sites, should partly compensate for even high error rates, although it may prove more difficult than previously thought to distinguish between low-frequency alleles and errors. PMID:21592414
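
    The per-position error analysis described above reduces to counting mismatches against the known control sequence at each read position. A toy version with invented reads, not the Roche control-fragment data:

```python
def per_position_error_rate(reads, reference):
    """Fraction of reads that mismatch the reference at each position.

    Assumes reads are already aligned and have the reference's length.
    """
    n_reads = len(reads)
    return [
        sum(1 for read in reads if read[i] != base) / n_reads
        for i, base in enumerate(reference)
    ]

reference = "ACGTACGT"
reads = ["ACGTACGT", "ACGTACGA", "ACCTACGT", "ACGTACGT"]
rates = per_position_error_rate(reads, reference)
mean_error = sum(rates) / len(rates)
```

    Real 454 error analysis must also handle insertions and deletions (the dominant homopolymer-related errors), which require an alignment step this sketch omits.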

  20. Accuracy and feasibility of video analysis for assessing hamstring flexibility and validity of the sit-and-reach test.

    PubMed

    Mier, Constance M

    2011-12-01

    The accuracy of video analysis of the passive straight-leg raise test (PSLR) and the validity of the sit-and-reach test (SR) were tested in 60 men and women. Computer software measured static hip-joint flexion accurately. High within-session reliability of the PSLR was demonstrated (R > .97). Test-retest (separate days) reliability for the SR was high in men (R = .97) and women (R = .98), and moderate for the PSLR in men (R = .79) and women (R = .89). SR validity (PSLR as criterion) was higher in women (Day 1, r = .69; Day 2, r = .81) than men (Day 1, r = .64; Day 2, r = .66). In conclusion, video analysis is accurate and feasible for assessing static joint angles, the PSLR and SR tests are very reliable methods for assessing flexibility, and SR validity for hamstring flexibility was found to be moderate in women and low in men.

  1. The diagnostic accuracy of pharmacological stress echocardiography for the assessment of coronary artery disease: a meta-analysis

    PubMed Central

    Picano, Eugenio; Molinaro, Sabrina; Pasanisi, Emilio

    2008-01-01

    Background Recent American Heart Association/American College of Cardiology guidelines state that "dobutamine stress echo has substantially higher sensitivity than vasodilator stress echo for detection of coronary artery stenosis" while the European Society of Cardiology guidelines and the European Association of Echocardiography recommendations conclude that "the two tests have very similar applications". Who is right? Aim To evaluate the diagnostic accuracy of dobutamine versus dipyridamole stress echocardiography through an evidence-based approach. Methods From a PubMed search, we identified all papers with coronary angiographic verification and head-to-head comparison of dobutamine stress echo (40 mcg/kg/min ± atropine) versus dipyridamole stress echo performed with state-of-the-art protocols (either 0.84 mg/kg in 10' plus atropine, or 0.84 mg/kg in 6' without atropine). A total of 5 papers were found. A pooled weighted meta-analysis was performed. Results The 5 analyzed papers recruited 435 patients, 299 with and 136 without angiographically assessed coronary artery disease (quantitatively assessed stenosis > 50%). Dipyridamole and dobutamine showed similar accuracy (87%, 95% confidence intervals, CI, 83–90, vs. 84%, CI, 80–88, p = 0.48), sensitivity (85%, CI 80–89, vs. 86%, CI 78–91, p = 0.81) and specificity (89%, CI 82–94 vs. 86%, CI 75–89, p = 0.15). Conclusion When state-of-the-art protocols are considered, dipyridamole and dobutamine stress echo have similar accuracy, specificity and – most importantly – sensitivity for detection of CAD. European recommendations concluding that "dobutamine and vasodilators (at appropriately high doses) are equally potent ischemic stressors for inducing wall motion abnormalities in presence of a critical coronary artery stenosis" are evidence-based. PMID:18565214
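
    Pooled estimates like the sensitivities above can be approximated by combining per-study counts and attaching a normal-approximation 95% CI. The study counts below are invented, and a full meta-analysis would typically use a weighted fixed- or random-effects model rather than this simple pooling:

```python
import math

def pooled_proportion_ci(counts):
    """Pool (successes, total) pairs across studies; return p and a Wald 95% CI."""
    successes = sum(s for s, n in counts)
    total = sum(n for s, n in counts)
    p = successes / total
    se = math.sqrt(p * (1 - p) / total)   # standard error of the pooled proportion
    return p, (p - 1.96 * se, p + 1.96 * se)

# Hypothetical per-study (true positives, diseased patients) for one test.
sens_counts = [(40, 50), (55, 60), (80, 90)]
p, (lo, hi) = pooled_proportion_ci(sens_counts)
```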

  2. Accuracy of dual energy X-ray absorptiometry (DXA) in assessing carcass composition from different pig populations.

    PubMed

    Soladoye, O P; López Campos, Ó; Aalhus, J L; Gariépy, C; Shand, P; Juárez, M

    2016-11-01

    The accuracy of dual energy X-ray absorptiometry (DXA) in assessing carcass composition from pigs with diverse characteristics was examined in the present study. A total of 648 pigs from three different sire breeds, two sexes, two slaughter weights and three different diets were employed. DXA estimations were used to predict the dissected/chemical yield for lean and fat of carcass sides and primal cuts. The accuracy of the predictions was assessed based on coefficient of determination (R(2)) and residual standard deviation (RSD). The linear relationships for dissected fat and lean for all the primal cuts and carcass sides were high (R(2)>0.94, P<0.01), with low RSD (<1.9%). Relationships between DXA and chemical fat and lean of pork bellies were also high (R(2)>0.94, P<0.01), with RSD <2.9%. These linear relationships remained high over the full range of variation in the pig population, except for sire breed, where the coefficient of determination decreased when carcasses were classified based on this variable. PMID:27395824

  3. Cascade impactor (CI) mensuration--an assessment of the accuracy and precision of commercially available optical measurement systems.

    PubMed

    Chambers, Frank; Ali, Aziz; Mitchell, Jolyon; Shelton, Christopher; Nichols, Steve

    2010-03-01

    Multi-stage cascade impactors (CIs) are the preferred measurement technique for characterizing the aerodynamic particle size distribution of an inhalable aerosol. Stage mensuration is the recommended pharmacopeial method for monitoring CI "fitness for purpose" within a GxP environment. The Impactor Sub-Team of the European Pharmaceutical Aerosol Group has undertaken an inter-laboratory study to assess both the precision and accuracy of a range of makes and models of instruments currently used for optical inspection of impactor stages. Measurement of two Andersen 8-stage 'non-viable' cascade impactor "reference" stages that were representative of jet sizes for this instrument type (stages 2 and 7) confirmed that all instruments evaluated were capable of reproducible jet measurement, with the overall capability being within the current pharmacopeial stage specifications for both stages. In the assessment of absolute accuracy, small but consistent differences (ca. 0.6% of the certified value) were observed between 'dots' and 'spots' of a calibrated chromium-plated reticule, most likely the result of the treatment of partially lit pixels along the circumference of this calibration standard. Measurements of three certified ring gauges, the smallest having a nominal diameter of 1.0 mm, were consistent with the observation that the treatment of partially illuminated pixels at the periphery of the projected image can result in undersizing. However, the bias was less than 1% of the certified diameter. The optical inspection instruments evaluated are fully capable of confirming cascade impactor suitability in accordance with pharmacopeial practice.

  4. Accuracy of forced oscillation technique to assess lung function in geriatric COPD population

    PubMed Central

    Tse, Hoi Nam; Tseng, Cee Zhung Steven; Wong, King Ying; Yee, Kwok Sang; Ng, Lai Yun

    2016-01-01

    Introduction Performing lung function test in geriatric patients has never been an easy task. With well-established evidence indicating impaired small airway function and air trapping in patients with geriatric COPD, utilizing forced oscillation technique (FOT) as a supplementary tool may aid in the assessment of lung function in this population. Aims To study the use of FOT in the assessment of airflow limitation and air trapping in geriatric COPD patients. Study design A cross-sectional study in a public hospital in Hong Kong. ClinicalTrials.gov ID: NCT01553812. Methods Geriatric patients who had spirometry-diagnosed COPD were recruited, with both FOT and plethysmography performed. “Resistance” and “reactance” FOT parameters were compared to plethysmography for the assessment of air trapping and airflow limitation. Results In total, 158 COPD subjects with a mean age of 71.9±0.7 years and percentage of forced expiratory volume in 1 second of 53.4±1.7 L were recruited. FOT values had a good correlation (r=0.4–0.7) to spirometric data. In general, X values (reactance) were better than R values (resistance), showing a higher correlation with spirometric data in airflow limitation (r=0.07–0.49 vs 0.61–0.67), small airway (r=0.05–0.48 vs 0.56–0.65), and lung volume (r=0.12–0.29 vs 0.43–0.49). In addition, resonance frequency (Fres) and frequency dependence (FDep) could well identify the severe type (percentage of forced expiratory volume in 1 second <50%) of COPD with high sensitivity (0.76, 0.71) and specificity (0.72, 0.64) (area under the curve: 0.8 and 0.77, respectively). Moreover, X values could stratify different severities of air trapping, while R values could not. Conclusion FOT may act as a simple and accurate tool in the assessment of severity of airflow limitation, small and central airway function, and air trapping in patients with geriatric COPD who have difficulties performing conventional lung function test. Moreover, reactance

  5. An accuracy assessment of different rigid body image registration methods and robotic couch positional corrections using a novel phantom

    SciTech Connect

    Arumugam, Sankar; Xing Aitang; Jameson, Michael G.; Holloway, Lois

    2013-03-15

    Purpose: Image guided radiotherapy (IGRT) using cone beam computed tomography (CBCT) images greatly reduces interfractional patient positional uncertainties. An understanding of uncertainties in the IGRT process itself is essential to ensure appropriate use of this technology. The purpose of this study was to develop a phantom capable of assessing the accuracy of IGRT hardware and software, including a 6 degrees of freedom patient positioning system, and to investigate the accuracy of the Elekta XVI system in combination with the HexaPOD robotic treatment couch top. Methods: The constructed phantom enabled verification of the three automatic rigid body registrations (gray value, bone, seed) available in the Elekta XVI software and includes an adjustable mount that introduces known rotational offsets to the phantom from its reference position. Repeated positioning of the phantom was undertaken to assess phantom rotational accuracy. Using this phantom the accuracy of the XVI registration algorithms was assessed considering CBCT hardware factors and image resolution together with the residual error in the overall image guidance process when positional corrections were performed through the HexaPOD couch system. Results: The phantom positioning was found to be within 0.04° (σ = 0.12°), 0.02° (σ = 0.13°), and -0.03° (σ = 0.06°) in the X, Y, and Z directions, respectively, enabling assessment of IGRT with a 6 degrees of freedom patient positioning system. The gray value registration algorithm showed the least error in calculated offsets, with a maximum mean difference of -0.2 mm (σ = 0.4 mm) in translational and -0.1° (σ = 0.1°) in rotational directions for all image resolutions. Bone and seed registration were found to be sensitive to CBCT image resolution. Seed registration was found to be most sensitive, demonstrating a maximum mean error of -0.3 mm (σ = 0.9 mm) and -1.4° (σ = 1.7°) in translational

  6. Effect of training, education, professional experience, and need for cognition on accuracy of exposure assessment decision-making.

    PubMed

    Vadali, Monika; Ramachandran, Gurumurthy; Banerjee, Sudipto

    2012-04-01

    Results are presented from a study that investigated the effect of characteristics of occupational hygienists relating to educational and professional experience and task-specific experience on the accuracy of occupational exposure judgments. A total of 49 occupational hygienists from six companies participated in the study and 22 tasks were evaluated. Participating companies provided monitoring data on specific tasks. Information on nine educational and professional experience determinants (e.g. educational background, years of occupational hygiene and exposure assessment experience, professional certifications, statistical training and experience, and the 'need for cognition (NFC)', which is a measure of an individual's motivation for thinking) and four task-specific determinants was also collected from each occupational hygienist. Hygienists had a wide range of educational and professional backgrounds for tasks across a range of industries with different workplace and task characteristics. The American Industrial Hygiene Association exposure assessment strategy was used to make exposure judgments on the probability of the 95th percentile of the underlying exposure distribution being located in one of four exposure categories relative to the occupational exposure limit. After reviewing all available job/task/chemical information, hygienists were asked to provide their judgment in probabilistic terms. Both qualitative (judgments without monitoring data) and quantitative judgments (judgments with monitoring data) were recorded. Ninety-three qualitative judgments and 2142 quantitative judgments were obtained. Data interpretation training, with simple rules of thumb for estimating the 95th percentiles of lognormal distributions, was provided to all hygienists. A data interpretation test (DIT) was also administered and judgments were elicited before and after training. General linear models and cumulative logit models were used to analyze the relationship between

  7. Increasing the Rigor of Procedural Fidelity Assessment: An Empirical Comparison of Direct Observation and Permanent Product Review Methods

    ERIC Educational Resources Information Center

    Sanetti, Lisa M. Hagermoser; Collier-Meek, Melissa A.

    2014-01-01

    Although it is widely accepted that procedural fidelity data are important for making valid decisions about intervention effectiveness, there is little empirical guidance for researchers and practitioners regarding how to assess procedural fidelity. A first step in moving procedural fidelity assessment research forward is to develop a…

  8. Validating the Accuracy of Reaction Time Assessment on Computer-Based Tablet Devices.

    PubMed

    Schatz, Philip; Ybarra, Vincent; Leitner, Donald

    2015-08-01

    Computer-based assessment has evolved to tablet-based devices. Despite the availability of tablets and "apps," there is limited research validating their use. We documented timing delays between stimulus presentation and (simulated) touch response on iOS devices (3rd- and 4th-generation Apple iPads) and Android devices (Kindle Fire, Google Nexus, Samsung Galaxy) at response intervals of 100, 250, 500, and 1,000 milliseconds (ms). Results showed significantly greater timing error on Google Nexus and Samsung tablets (81-97 ms), than Kindle Fire and Apple iPads (27-33 ms). Within Apple devices, iOS 7 obtained significantly lower timing error than iOS 6. Simple reaction time (RT) trials (250 ms) on tablet devices represent 12% to 40% error (30-100 ms), depending on the device, which decreases considerably for choice RT trials (3-5% error at 1,000 ms). Results raise implications for using the same device for serial clinical assessment of RT using tablets, as well as the need for calibration of software and hardware. PMID:25612627
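
    The "percent error" figures quoted above are simply the measured timing delay expressed relative to the nominal response interval, e.g. the reported 30-100 ms delays on a 250 ms simple-RT trial:

```python
def relative_timing_error(delay_ms, nominal_interval_ms):
    """Device timing delay as a percentage of the nominal response interval."""
    return 100.0 * delay_ms / nominal_interval_ms

# Using the ranges reported above for a 250 ms simple-RT trial.
low_end = relative_timing_error(30, 250)    # 12%
high_end = relative_timing_error(100, 250)  # 40%
```

    The same delays against a 1,000 ms choice-RT interval shrink to 3-10%, which is why the abstract reports much lower relative error for longer trials.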

  10. Sequences assessed by declarative and procedural tests of memory in amnesic patients with hippocampal damage.

    PubMed

    Hopkins, Ramona O; Waldram, Kellie; Kesner, Raymond P

    2004-01-01

    Previous research indicates that amnesic subjects tested on sequential learning or serial reaction time tasks can learn a repeated procedural sequence but are unable to explicitly recall the correct sequence when asked to generate it. Rats with hippocampal lesions are also able to learn and remember procedural or implicit sequences but are impaired for declarative sequences. We used procedures analogous to those used in rats to assess the role of the hippocampus in the acquisition of declarative and procedural sequences in amnesic and control participants. Amnesic participants with damage restricted to the hippocampus and control participants were administered analogous tasks of declarative and procedural sequential learning using a computer version of the radial arm maze. The amnesic participants had slower response times during the acquisition of procedural sequences but were not impaired compared to controls when switched to a random sequence, suggesting that both groups learned the sequence. In contrast, the amnesic but not the control participants were significantly impaired on the declarative sequence task. Our findings provide support for evolutionary continuity in the cognitive function of the hippocampus in rats and humans and for the dissociation between declarative and procedural sequential learning. The performance differences on the two sequence learning tasks are likely due to the use of different strategies associated with learning sequences based on procedural versus declarative knowledge.

  11. Accuracy Assessment for PPP by Comparing Various Online PPP Service Solutions with Bernese 5.2 Network Solution

    NASA Astrophysics Data System (ADS)

    Ozgur Uygur, Sureyya; Aydin, Cuneyt; Demir, Deniz Oz; Cetin, Seda; Dogan, Ugur

    2016-04-01

    The GNSS precise point positioning (PPP) technique is frequently used for geodetic applications such as monitoring of reference stations and estimation of tropospheric parameters. The technique uses undifferenced GNSS observations together with IGS products to reach a high level of positioning accuracy. The achievable accuracy depends on the quality of the GNSS data, the length of the observation session, and the quality of the external products. With PPP, the desired positioning accuracy can be reached in the reference frame of the satellite coordinates using data from a single receiver. PPP is available to users through scientific GNSS processing software packages (such as GIPSY of NASA-JPL and the Bernese Processing Software of AIUB) as well as several online PPP services: Auto-GIPSY provided by JPL California Institute of Technology, CSRS-PPP provided by Natural Resources Canada, GAPS provided by the University of New Brunswick, and Magic-PPP provided by GMV. In this study, we assess the accuracy of PPP by comparing the solutions from the online PPP services with Bernese 5.2 network solutions. Seven days (DoY 256-262 in 2015) of GNSS observations with 24-hour session durations, collected at 14 stations of the CORS-TR network in Turkey, were processed in static mode using the above-mentioned PPP services. The averages of the daily coordinates from a Bernese 5.2 static network solution tied to 12 IGS stations were taken as the true coordinates. Our results indicate that the distributions of the north, east and up daily position differences are characterized by means and RMS of 1.9±0.5, 2.1±0.7, 4.7±2.1 mm for CSRS, 1.6±0.6, 1.4±0.8, 5.5±3.9 mm for Auto-GIPSY, 3.0±0.8, 3.0±1.2, 6.0±3.2 mm for Magic GNSS, and 2.1±1.3, 2.8±1.7, 5.0±2.3 mm for GAPS, with respect to the Bernese 5.2 network solution. Keywords: PPP, Online GNSS Service, Bernese, Accuracy
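
    The "mean ± RMS" summaries above can be reproduced from a series of daily coordinate differences with a short sketch. Computing the spread as RMS about the mean is our assumption; the abstract does not state whether the RMS is taken about the mean or about zero:

```python
import math

def mean_and_spread(diffs_mm):
    """Mean and RMS-about-the-mean of daily position differences
    (in mm) relative to a reference network solution."""
    n = len(diffs_mm)
    mean = sum(diffs_mm) / n
    spread = math.sqrt(sum((d - mean) ** 2 for d in diffs_mm) / n)
    return mean, spread

# Seven daily north-component differences would yield one
# "mean ± RMS" pair per service and per component.
```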

  12. The OPAL Project: Open source Procedure for Assessment of Loss using Global Earthquake Modelling software

    NASA Astrophysics Data System (ADS)

    Daniell, James

    2010-05-01

    This paper provides a comparison between Earthquake Loss Estimation (ELE) software packages and their application using an "Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software" (OPAL). The OPAL procedure has been developed to provide a framework for optimisation of a Global Earthquake Modelling process through: 1) Overview of current and new components of earthquake loss assessment (vulnerability, hazard, exposure, specific cost and technology); 2) Preliminary research, acquisition and familiarisation with all available ELE software packages; 3) Assessment of these 30+ software packages in order to identify the advantages and disadvantages of the ELE methods used; and 4) Loss analysis for a deterministic earthquake (Mw7.2) for the Zeytinburnu district, Istanbul, Turkey, by applying 3 software packages (2 new and 1 existing): a modified displacement-based method based on DBELA (Displacement Based Earthquake Loss Assessment), a capacity spectrum based method HAZUS (HAZards United States) and the Norwegian HAZUS-based SELENA (SEismic Loss EstimatioN using a logic tree Approach) software which was adapted for use in order to compare the different processes needed for the production of damage, economic and social loss estimates. The modified DBELA procedure was found to be more computationally expensive, yet had less variability, indicating the need for multi-tier approaches to global earthquake loss estimation. Similar systems planning and ELE software produced through the OPAL procedure can be applied to worldwide applications, given exposure data. Keywords: OPAL, displacement-based, DBELA, earthquake loss estimation, earthquake loss assessment, open source, HAZUS

  13. Do students know what they know? Exploring the accuracy of students' self-assessments

    NASA Astrophysics Data System (ADS)

    Lindsey, Beth A.; Nagel, Megan L.

    2015-12-01

    We have conducted an investigation into how well students in introductory science classes (both physics and chemistry) are able to predict which questions they will or will not be able to answer correctly on an upcoming assessment. An examination of the data at the level of students' overall scores reveals results consistent with the Dunning-Kruger effect, in which low-performing students tend to overestimate their abilities, while high-performing students estimate their abilities more accurately. Similar results have been widely reported in the science education literature. Breaking results out by students' responses to individual questions, however, reveals that students of all ability levels have difficulty distinguishing questions which they are able to answer correctly from those that they are not able to answer correctly. These results have implications for the future study and reporting of students' metacognitive abilities.

  14. Accuracy of field methods in assessing body fat in collegiate baseball players.

    PubMed

    Loenneke, Jeremy P; Wray, Mandy E; Wilson, Jacob M; Barnes, Jeremy T; Kearney, Monica L; Pujol, Thomas J

    2013-01-01

    When assessing the fitness levels of athletes, body composition is usually estimated, as it may play a role in athletic performance. Therefore, the purpose of this study was to determine the validity of bioelectrical impedance analysis (BIA) and skinfold (SKF) methods compared with dual-energy X-ray absorptiometry (DXA) for estimating percent body fat (%BF) in Division 1 collegiate baseball players (n = 35). The results of this study indicate that the field methods investigated were not valid compared with DXA for estimating %BF. In conclusion, this study does not support the use of the TBF-350, HBF-306, HBF-500, or SKF thickness for estimating %BF in collegiate baseball players. The reliability of these BIA devices remains unknown; therefore, it is currently uncertain if they may be used to track changes over time.

  15. Malignant mesothelioma, airborne asbestos, and the need for accuracy in chrysotile risk assessments.

    PubMed

    Meisenkothen, Christopher

    2013-01-01

    A man diagnosed with pleural mesothelioma sought legal representation with the author's law firm. He worked 33 years in a wire and cable factory in the northeastern United States (Connecticut) that exclusively used chrysotile asbestos in its manufacturing process. This is the first report of mesothelioma arising from employees of this factory. This report provides additional support for the proposition that chrysotile asbestos can cause malignant mesothelioma in humans. If chrysotile risk assessments are to be accurate, then the literature should contain an accurate accounting of all mesotheliomas alleged to be caused by chrysotile asbestos. This is important not just for public health professionals but also for individuals and companies involved in litigation over asbestos-related diseases. If reports such as these remain unknown, it is probable that cases of mesothelioma among chrysotile-exposed cohorts would go unrecognized and chrysotile-using factories would be incorrectly cited as having no mesotheliomas among their employees.

  16. Accuracy Assessment of Three-dimensional Surface Reconstructions of In vivo Teeth from Cone-beam Computed Tomography

    PubMed Central

    Sang, Yan-Hui; Hu, Hong-Cheng; Lu, Song-He; Wu, Yu-Wei; Li, Wei-Ran; Tang, Zhi-Hui

    2016-01-01

    Background: The accuracy of three-dimensional (3D) reconstructions from cone-beam computed tomography (CBCT) has been particularly important in dentistry, which will affect the effectiveness of diagnosis, treatment plan, and outcome in clinical practice. The aims of this study were to assess the linear, volumetric, and geometric accuracy of 3D reconstructions from CBCT and to investigate the influence of voxel size and CBCT system on the reconstructions results. Methods: Fifty teeth from 18 orthodontic patients were assigned to three groups as NewTom VG 0.15 mm group (NewTom VG; voxel size: 0.15 mm; n = 17), NewTom VG 0.30 mm group (NewTom VG; voxel size: 0.30 mm; n = 16), and VATECH DCTPRO 0.30 mm group (VATECH DCTPRO; voxel size: 0.30 mm; n = 17). The 3D reconstruction models of the teeth were segmented from CBCT data manually using Mimics 18.0 (Materialise Dental, Leuven, Belgium), and the extracted teeth were scanned by 3Shape optical scanner (3Shape A/S, Denmark). Linear and volumetric deviations were separately assessed by comparing the length and volume of the 3D reconstruction model with physical measurement by paired t-test. Geometric deviations were assessed by the root mean square value of the imposed 3D reconstruction and optical models by one-sample t-test. To assess the influence of voxel size and CBCT system on 3D reconstruction, analysis of variance (ANOVA) was used (α = 0.05). Results: The linear, volumetric, and geometric deviations were −0.03 ± 0.48 mm, −5.4 ± 2.8%, and 0.117 ± 0.018 mm for NewTom VG 0.15 mm group; −0.45 ± 0.42 mm, −4.5 ± 3.4%, and 0.116 ± 0.014 mm for NewTom VG 0.30 mm group; and −0.93 ± 0.40 mm, −4.8 ± 5.1%, and 0.194 ± 0.117 mm for VATECH DCTPRO 0.30 mm group, respectively. There were statistically significant differences between groups in terms of linear measurement (P < 0.001), but no significant difference in terms of volumetric measurement (P = 0.774). No statistically significant difference were
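
    The geometric deviation reported above is a root-mean-square of surface distances between the superimposed CBCT reconstruction and the optical reference model. A minimal sketch of that metric, using point-wise distances as a stand-in for the actual surface-to-surface comparison:

```python
import math

def rms_deviation(distances_mm):
    """Root-mean-square of signed point-to-surface distances (mm)
    between a 3D reconstruction and a reference scan."""
    return math.sqrt(sum(d * d for d in distances_mm) / len(distances_mm))
```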

  17. Comparison of the predictive validity and consistency among preference assessment procedures: a review of the literature.

    PubMed

    Kang, Soyeon; O'Reilly, Mark; Lancioni, Giulio; Falcomata, Terry S; Sigafoos, Jeff; Xu, Ziwei

    2013-04-01

    We reviewed 14 experimental studies comparing different preference assessments for individuals with developmental disabilities published in peer-reviewed journals between 1985 and 2012. Studies were summarized based on six variables: (a) the number of participants, (b) the type of disability, (c) the number and type of stimuli, (d) the average duration of administration, (e) the procedures compared, and (f) the results. Studies were also classified in terms of the predictive validity and consistency of the preference assessment results. The results suggest which preference assessment procedures may produce more accurate predictions of the reinforcing effects of identified stimuli and more consistent preference results. The findings are discussed in relation to the previous literature. Evidence-based modifications of the most efficient preference assessment are also discussed. PMID:23357675

  18. Reliability of the direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia.

    PubMed

    Chuan, A; Thillainathan, S; Graham, P L; Jolly, B; Wong, D M; Smith, N; Barrington, M J

    2016-03-01

    The Direct Observation of Procedural Skills (DOPS) form is used as a workplace-based assessment tool in the current Australian and New Zealand College of Anaesthetists curriculum. The objective of this study was to evaluate the reliability of DOPS when used to score trainees performing ultrasound-guided regional anaesthesia. Reliability of an assessment tool is defined as the reproducibility of scores given by different assessors viewing the same trainee. Forty-nine anaesthetists were recruited to score two scripted videos of trainees performing a popliteal sciatic nerve block and an axillary brachial plexus block. Reliability, as measured by intraclass correlation coefficients, was -0.01 to 0.43 for the individual items in DOPS, and 0.15 for the 'Overall Performance for this Procedure' item. Assessors demonstrated consistency of scoring within DOPS, with significant correlation of sum of individual item scores with the 'Overall Performance for this Procedure' item (r=0.78 to 0.80, P<0.001), and with yes versus no responses to the 'Was the procedure completed satisfactorily?' item (W=24, P=0.0004, Video 1, and W=65, P=0.003, Video 2). While DOPS demonstrated a good degree of internal consistency in this setting, inter-rater reliability did not reach levels generally recommended for formative assessment tools. Feasibility of the form could be improved by removing the 'Was the procedure completed satisfactorily?' item without loss of information.
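
    The intraclass correlation coefficients above quantify how much score variance is attributable to the trainees being rated rather than to the raters. The abstract does not specify which ICC form was computed; a one-way random-effects ICC(1,1) for a subjects-by-raters table is sketched here as one common choice:

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for a ratings table:
    rows = rated subjects/items, columns = raters.
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    n = len(ratings)        # number of subjects
    k = len(ratings[0])     # raters per subject
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    # Between-subjects and within-subjects mean squares
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfect between-rater agreement yields ICC = 1; values near 0
# (as for most DOPS items here) mean rater noise dominates.
```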

  19. Accuracy of qualitative analysis for assessment of skilled baseball pitching technique.

    PubMed

    Nicholls, Rochelle; Fleisig, Glenn; Elliott, Bruce; Lyman, Stephen; Osinski, Edmund

    2003-07-01

    Baseball pitching must be performed with correct technique if injuries are to be avoided and performance maximized. High-speed video analysis is accepted as the most accurate and objective method for evaluating baseball pitching mechanics. The aim of this research was to develop an equivalent qualitative analysis method for use with standard video equipment. A qualitative analysis protocol (QAP) was developed for 24 kinematic variables identified as important to pitching performance. Twenty male baseball pitchers were videotaped using 60 Hz camcorders, and their technique was evaluated using the QAP by two independent raters. Each pitcher was also assessed using a 6-camera 200 Hz Motion Analysis system (MAS). Four QAP variables (22%) showed significant similarity with MAS results. Inter-rater reliability showed agreement on 33% of QAP variables. It was concluded that a complete and accurate profile of an athlete's pitching mechanics cannot be made using the QAP in its current form, but it is possible that such simple forms of biomechanical analysis could yield accurate results before 3-D methods become obligatory. PMID:14737929

  20. An assessment of coefficient accuracy in linear regression models with spatially varying coefficients

    NASA Astrophysics Data System (ADS)

    Wheeler, David C.; Calder, Catherine A.

    2007-06-01

    The realization in the statistical and geographical sciences that a relationship between an explanatory variable and a response variable in a linear regression model is not always constant across a study area has led to the development of regression models that allow for spatially varying coefficients. Two competing models of this type are geographically weighted regression (GWR) and Bayesian regression models with spatially varying coefficient processes (SVCP). In the application of these spatially varying coefficient models, marginal inference on the regression coefficient spatial processes is typically of primary interest. In light of this fact, there is a need to assess the validity of such marginal inferences, since these inferences may be misleading in the presence of explanatory variable collinearity. In this paper, we present the results of a simulation study designed to evaluate the sensitivity of the spatially varying coefficients in the competing models to various levels of collinearity. The simulation study results show that the Bayesian regression model produces more accurate inferences on the regression coefficients than does GWR. In addition, the Bayesian regression model is overall fairly robust in terms of marginal coefficient inference to moderate levels of collinearity, and degrades less substantially than GWR with strong collinearity.

  2. Developing best practices teaching procedures for skinfold assessment: observational examination using the Think Aloud method.

    PubMed

    Holmstrup, Michael E; Verba, Steven D; Lynn, Jeffrey S

    2015-12-01

    Skinfold assessment is valid and economical; however, it has a steep learning curve, and many programs only include one exposure to the technique. Increasing the number of exposures to skinfold assessment within an undergraduate curriculum would likely increase skill proficiency. The present study combined observational and Think Aloud methodologies to quantify procedural and cognitive characteristics of skinfold assessment. It was hypothesized that 1) increased curricular exposure to skinfold assessment would improve proficiency and 2) the combination of an observational and Think Aloud analysis would provide quantifiable areas of emphasis for instructing skinfold assessment. Seventy-five undergraduates with varied curricular exposure performed a seven-site skinfold assessment on a test subject while expressing their thoughts aloud. A trained practitioner recorded procedural observations, with transcripts generated from audio recordings to capture cognitive information. Skinfold measurements were compared with a criterion value, and bias scores were generated. Participants whose total bias fell within ±3.5% of the criterion value were proficient, with the remainder nonproficient. An independent-samples t-test was used to compare procedural and cognitive observations across experience and proficiency groups. Additional curricular exposure improved performance of skinfold assessment in areas such as the measurement of specific sites (e.g., chest, abdomen, and thigh) and procedural (e.g., landmark identification) and cognitive skills (e.g., complete site explanation). Furthermore, the Think Aloud method is a valuable tool for determining curricular strengths and weaknesses with skinfold assessment and as a pedagogical tool for individual instruction and feedback in the classroom. PMID:26628650
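
    The proficiency rule above reduces to comparing each participant's total bias against a fixed tolerance. Treating the ±3.5% tolerance as absolute percentage points of body fat is our reading of the abstract; the function names are illustrative:

```python
def total_bias(measured_pbf, criterion_pbf):
    """Signed bias, in percentage points of body fat (%BF),
    between a participant's measurement and the criterion."""
    return measured_pbf - criterion_pbf

def is_proficient(measured_pbf, criterion_pbf, tol=3.5):
    """Proficient if total bias falls within +/-3.5 %BF points
    of the criterion value (assumed interpretation)."""
    return abs(total_bias(measured_pbf, criterion_pbf)) <= tol
```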

  4. Constraining OCT with Knowledge of Device Design Enables High Accuracy Hemodynamic Assessment of Endovascular Implants

    PubMed Central

    Brown, Jonathan; Lopes, Augusto C.; Kunio, Mie; Kolachalama, Vijaya B.; Edelman, Elazer R.

    2016-01-01

    Background Stacking cross-sectional intravascular images permits three-dimensional rendering of endovascular implants, yet introduces between-frame uncertainties that limit characterization of device placement and the hemodynamic microenvironment. In a porcine coronary stent model, we demonstrate enhanced OCT reconstruction with preservation of between-frame features through fusion with angiography and a priori knowledge of stent design. Methods and Results Strut positions were extracted from sequential OCT frames. Reconstruction with standard interpolation generated discontinuous stent structures. By computationally constraining interpolation to known stent skeletons fitted to 3D ‘clouds’ of OCT-Angio-derived struts, implant anatomy was resolved, accurately rendering features from implant diameter and curvature (n = 1 vessel, r2 = 0.91, 0.90, respectively) to individual strut-wall configurations (average displacement error ~15 μm). This framework facilitated hemodynamic simulation (n = 1 vessel), showing the critical importance of accurate anatomic rendering in characterizing both quantitative and basic qualitative flow patterns. Discontinuities with standard approaches systematically introduced noise and bias, poorly capturing regional flow effects. In contrast, the enhanced method preserved multi-scale (local strut to regional stent) flow interactions, demonstrating the impact of regional contexts in defining the hemodynamic consequence of local deployment errors. Conclusion Fusion of planar angiography and knowledge of device design permits enhanced OCT image analysis of in situ tissue-device interactions. Given emerging interests in simulation-derived hemodynamic assessment as surrogate measures of biological risk, such fused modalities offer a new window into patient-specific implant environments. PMID:26906566

  5. PAIN--perception and assessment of painful procedures in the NICU.

    PubMed

    Britto, Carl Denis; Rao Pn, Suman; Nesargi, Saudamini; Nair, Sitara; Rao, Shashidhar; Thilagavathy, Theradian; Ramesh, Armugam; Bhat, Swarnarekha

    2014-12-01

    This prospective cross-sectional study was undertaken to determine the frequency of procedural pain among 101 neonates in the first 14 days of admission to a neonatal intensive care unit (NICU) in South India and to study the perception of health-care professionals (HCP) about newborn procedural pain. The total number of painful procedures was 8.09 ± 5.53 per baby per day and 68.32 ± 64.78 per baby during hospital stay. The most common procedure was heel prick (30%). The HCP were administered a questionnaire to assess their perception of pain for various procedures. Procedures were perceived as more painful by nurses than by doctors. Chest tube placements and lumbar puncture were considered most painful. This study shows that the neonates in the NICU in developing countries experience many painful procedures. The awareness about this intensity of pain should provide a valuable tool in formulating pain-reduction protocols for management in low resource settings.

  6. A Classical Conditioning Procedure for the Hearing Assessment of Multiply Handicapped Persons.

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; And Others

    1989-01-01

    Hearing assessments of multiply handicapped children/adolescents were conducted using classical conditioning (with an air puff as unconditioned stimulus) and operant conditioning (with a modified visual reinforcement audiometry procedure or edible reinforcement). Findings indicate that classical conditioning was successful with 21 of the 23…

  7. Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software (OPAL)

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.

    2011-07-01

    This paper provides a comparison between Earthquake Loss Estimation (ELE) software packages and their application using an "Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software" (OPAL). The OPAL procedure was created to provide a framework for optimisation of a Global Earthquake Modelling process through: 1. overview of current and new components of earthquake loss assessment (vulnerability, hazard, exposure, specific cost, and technology); 2. preliminary research, acquisition, and familiarisation for available ELE software packages; 3. assessment of these software packages in order to identify the advantages and disadvantages of the ELE methods used; and 4. loss analysis for a deterministic earthquake (Mw = 7.2) for the Zeytinburnu district, Istanbul, Turkey, by applying 3 software packages (2 new and 1 existing): a modified displacement-based method based on DBELA (Displacement Based Earthquake Loss Assessment, Crowley et al., 2006), a capacity spectrum based method HAZUS (HAZards United States, FEMA, USA, 2003) and the Norwegian HAZUS-based SELENA (SEismic Loss EstimatioN using a logic tree Approach, Lindholm et al., 2007) software which was adapted for use in order to compare the different processes needed for the production of damage, economic, and social loss estimates. The modified DBELA procedure was found to be more computationally expensive, yet had less variability, indicating the need for multi-tier approaches to global earthquake loss estimation. Similar systems planning and ELE software produced through the OPAL procedure can be applied to worldwide applications, given exposure data.

  8. Proposed Planning Procedures: Gaming-Simulation as a Method for Early Assessment.

    ERIC Educational Resources Information Center

    Smit, Peter H.

    1982-01-01

    Examines the use of simulation gaming as a research tool in the early assessment of proposed planning procedures in urban renewal projects. About one-half of the citations in the 36-item bibliography are in Dutch; the remainder are in English. (Author/JJD)

  9. 46 CFR 502.603 - Assessment of civil penalties: Procedure; criteria for determining amount; limitations; relation...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 9 2010-10-01 2010-10-01 false Assessment of civil penalties: Procedure; criteria for determining amount; limitations; relation to compromise. 502.603 Section 502.603 Shipping FEDERAL MARITIME... policies for deterrence and future compliance with the Commission's rules and regulations and...

  10. Meeting on Common Ground: Assessing Parent-Child Relationships through the Joint Painting Procedure

    ERIC Educational Resources Information Center

    Gavron, Tami

    2013-01-01

    A basic assumption in psychotherapy with children is that the parent-child relationship is central to the child's development. This article describes the Joint Painting Procedure, an art-based assessment for evaluating relationships with respect to the two main developmental tasks of middle childhood: (a) the parent's ability to monitor and…

  11. EVALUATION OF ENVIRONMENTAL HAZARD ASSESSMENT PROCEDURES FOR NEAR-COASTAL AREAS OF THE GULF OF MEXICO

    EPA Science Inventory

    Lewis, Michael A. In press. Evaluation of Environmental Hazard Assessment Procedures for Near-Coastal Areas of the Gulf of Mexico (Abstract). To be presented at the Annual Meeting of the the Australasian Society of Ecotoxicology, July 2004, Gold Coast, Australia. 1 p. (ERL,GB R98...

  12. VAP-CAP: A Procedure to Assess the Visual Functioning of Young Visually Impaired Children.

    ERIC Educational Resources Information Center

    Blanksby, D. C.; Langford, P. E.

    1993-01-01

    This article describes a visual assessment procedure (VAP) which evaluates capacity, attention, and processing (CAP) of infants and preschool children with visual impairments. The two-level battery considers, first, visual capacity and basic visual attention and, second, visual perceptual and cognitive abilities. A theoretical analysis of the…

  13. The Implicit Relational Assessment Procedure (IRAP) as a Measure of Spider Fear

    ERIC Educational Resources Information Center

    Nicholson, Emma; Barnes-Holmes, Dermot

    2012-01-01

    A greater understanding of implicit cognition can provide important information regarding the etiology and maintenance of psychological disorders. The current study sought to determine the utility of the Implicit Relational Assessment Procedure (IRAP) as a measure of implicit aversive bias toward spiders in two groups of known variation, high fear…

  14. The Risky Situation: A Procedure for Assessing the Father-Child Activation Relationship

    ERIC Educational Resources Information Center

    Paquette, Daniel; Bigras, Marc

    2010-01-01

    Initial validation data are presented for the Risky Situation (RS), a 20-minute observational procedure designed to assess the father-child activation relationship with children aged 12-18 months. The coding grid, which is simple and easy to use, allows parent-child dyads to be classified into three categories and provides an activation score. By…

  15. 19 CFR 163.11 - Compliance assessment and other audit procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Compliance assessment and other audit procedures. 163.11 Section 163.11 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND... Border Protection, Washington, DC 20229. Upon receipt of such a request, the Director shall provide...

  16. 19 CFR 163.11 - Compliance assessment and other audit procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 2 2011-04-01 2011-04-01 false Compliance assessment and other audit procedures. 163.11 Section 163.11 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND... Border Protection, Washington, DC 20229. Upon receipt of such a request, the Director shall provide...

  17. 40 CFR 63.1107 - Equipment leaks: applicability assessment procedures and methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... assessment procedures and methods. 63.1107 Section 63.1107 Protection of Environment ENVIRONMENTAL PROTECTION... Categories: Generic Maximum Achievable Control Technology Standards § 63.1107 Equipment leaks: applicability... 40 CFR part 60, appendix A shall be used. (b) An owner or operator may use good engineering...

  18. A Simultaneous Screening/Assessment Procedure for Identifying the Gifted Student.

    ERIC Educational Resources Information Center

    Linn, Margaret; Lopatin, Edward

    1990-01-01

    Used four-subtest short form of Wechsler Intelligence Scale for Children-Revised to develop efficient, simultaneous screening/assessment procedure for identifying gifted students for special programs. Estimated statistical relationship between subtest and full-test scores using test score data from 203 students. Found subtest score of 52 both…

  19. Intonation Features of the Expression of Emotions in Spanish: Preliminary Study for a Prosody Assessment Procedure

    ERIC Educational Resources Information Center

    Martinez-Castilla, Pastora; Peppe, Susan

    2008-01-01

    This study aimed to find out what intonation features reliably represent the emotions of "liking" as opposed to "disliking" in the Spanish language, with a view to designing a prosody assessment procedure for use with children with speech and language disorders. 18 intonationally different prosodic realisations (tokens) of one word (limon) were…

  20. The Stoplight Task: A Procedure for Assessing Risk Taking in Humans

    ERIC Educational Resources Information Center

    Reilly, Mark P.; Greenwald, Mark K.; Johanson, Chris-Ellyn

    2006-01-01

    The Stoplight Task, a procedure involving a computer analog of a stoplight, was evaluated for assessing risk taking in humans. Seventeen participants earned points later exchangeable for money by completing a response requirement before the red light appeared on a simulated traffic light. The green light signaled to start responding; it changed to…

  1. Statistics, Measures, and Quality Standards for Assessing Digital Reference Library Services: Guidelines and Procedures.

    ERIC Educational Resources Information Center

    McClure, Charles R.; Lankes, R. David; Gross, Melissa; Choltco-Devlin, Beverly

    This manual is a first effort to identify, describe, and develop procedures for assessing various aspects of digital reference service. Its overall purpose is to improve the quality of digital reference services and assist librarians to design and implement better digital reference services. More specifically, its aim is to: assist…

  2. A Choice Procedure to Assess the Aversive Effects of Drugs in Rodents

    ERIC Educational Resources Information Center

    Podlesnik, Christopher A.; Jimenez-Gomez, Corina; Woods, James H.

    2010-01-01

    The goal of this series of experiments was to develop an operant choice procedure to examine rapidly the punishing effects of intravenous drugs in rats. First, the cardiovascular effects of experimenter-administered intravenous histamine, a known aversive drug, were assessed to determine a biologically active dose range. Next, rats responded on…

  3. Standardised Observation Analogue Procedure (SOAP) for Assessing Parent and Child Behaviours in Clinical Trials

    ERIC Educational Resources Information Center

    Johnson, Cynthia R.; Butter, Eric M.; Handen, Benjamin L.; Sukhodolsky, Denis G.; Mulick, James; Lecavalier, Luc; Aman, Michael G.; Arnold, Eugene L.; Scahill, Lawrence; Swiezy, Naomi; Sacco, Kelley; Stigler, Kimberly A.; McDougle, Christopher J.

    2009-01-01

    Background: Observational measures of parent and child behaviours have a long history in child psychiatric and psychological intervention research, including the field of autism and developmental disability. We describe the development of the Standardised Observational Analogue Procedure (SOAP) for the assessment of parent-child behaviour before…

  4. Assessment of Breast Specimens With or Without Calcifications in Diagnosing Malignant and Atypia for Mammographic Breast Microcalcifications Without Mass: A STARD-Compliant Diagnostic Accuracy Article.

    PubMed

    Cheung, Yun-Chung; Juan, Yu-Hsiang; Ueng, Shir-Hwa; Lo, Yung-Feng; Huang, Pei-Chin; Lin, Yu-Ching; Chen, Shin-Cheh

    2015-10-01

    The presence of microcalcifications within the specimens frequently signifies a successful attempt of stereotactic vacuum-assisted breast biopsy (VABB) to obtain a pathologic diagnosis of breast microcalcifications. In this study, the authors aimed to assess and compare the accuracy and consistency of calcified and noncalcified specimens obtained from the same sampling sites on isolated microcalcifications without mass in diagnosing high-risk and malignant lesions. To the best of our knowledge, an individual case-based prospective comparison has not been reported. With approval from the institutional review board of our hospital (Chang Gung Memorial Hospital), the authors retrospectively reviewed all clinical cases of stereotactic VABB on isolated breast microcalcifications without mass from our database. The authors included for analysis those cases that either had surgery performed or had clinical follow-up of at least 3 years. All the obtained specimens, with or without calcification, were identified using specimen radiographs and separately submitted for pathologic evaluation. The concordance of diagnosis was assessed for both atypia and malignant lesions. A total of 390 stereotactic VABB procedures (1206 calcified and 1456 noncalcified specimens) were collected and reviewed. The concordance rates between calcified and noncalcified specimens were low for atypia and malignant microcalcifications (44.44% in flat epithelial atypia, 46.51% in atypical ductal hyperplasia, 55.73% in ductal carcinoma in situ, and 71.42% in invasive ductal carcinoma). The discordance in VABB diagnoses indicated that 41.33% of malignant lesions would be misdiagnosed by noncalcified specimens. Furthermore, calcified specimens showed higher diagnostic accuracy for breast cancer than noncalcified specimens (91.54% versus 69.49%, respectively). The evaluation of both noncalcified specimens and calcified specimens did not show improvement of diagnostic accuracy as compared with

  5. Accuracy assessment of land cover dynamic in hill land on integration of DEM data and TM image

    NASA Astrophysics Data System (ADS)

    Li, Yunmei; Wang, Xin; Wang, Qiao; Wu, Chuanqing; Huang, Jiazhu

    2010-04-01

    To accurately assess the area of land cover in hill land, we integrated DEM data and remote sensing imagery in the Lihe River Valley, China. First, the DEM data were incorporated into a decision tree to increase the accuracy of land cover classification. Second, a slope-correction model was built to convert projected area to surface area using the DEM data. Finally, the area of each land cover class was calculated and the dynamics of land cover in the Lihe River Valley were analyzed from 1998 to 2003. The results show that the area of forestland increased by more than 10% under the slope-correction model, which indicates that area correction is very important in hill land, and that the accuracy of classification, especially for forestland and garden plots, is enhanced by integrating DEM data; it can be greater than 85%. The indexes of land use extent were 266.2 in 1998, 273.1 in 2001, and 276.7 in 2003. The change rates of land use extent were 2.59 during 1998 to 2001 and 1.34 during 2001 to 2003.
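The slope correction described in this abstract follows the standard geometric relation between map-projected and terrain-surface area; the sketch below assumes the usual cosine correction and is an illustration, not the authors' model:

```python
import math

def surface_area(projected_area, slope_deg):
    """Convert a projected (map) area to a terrain surface area using the
    slope derived from a DEM: A_surface = A_projected / cos(slope)."""
    return projected_area / math.cos(math.radians(slope_deg))

# A 100 m^2 map-projected cell on a 30-degree slope covers roughly 115.5 m^2
# of actual terrain, which is why flat-map areas understate hill-land cover.
```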

  6. Pareto-based evolutionary algorithms for the calculation of transformation parameters and accuracy assessment of historical maps

    NASA Astrophysics Data System (ADS)

    Manzano-Agugliaro, F.; San-Antonio-Gómez, C.; López, S.; Montoya, F. G.; Gil, C.

    2013-08-01

    When historical map data are compared with modern cartography, the old map coordinates must be transformed to the current system. However, historical data often exhibit heterogeneous quality. In calculating the transformation parameters between the historical and modern maps, it is often necessary to discard highly uncertain data. An optimal balance between the objectives of minimising the transformation error and eliminating as few points as possible can be achieved by generating a Pareto front of solutions using evolutionary genetic algorithms. The aim of this paper is to assess the performance of evolutionary algorithms in determining the accuracy of historical maps in regard to modern cartography. When applied to the 1787 Tomas Lopez map, the use of evolutionary algorithms reduces the linear error by 40% while eliminating only 2% of the data points. The main conclusion of this paper is that evolutionary algorithms provide a promising alternative for the transformation of historical map coordinates and determining the accuracy of historical maps in regard to modern cartography, particularly when the positional quality of the data points used cannot be assured.
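The transformation between historical and modern map coordinates referred to above is commonly a four-parameter similarity (Helmert) transform. The closed-form least-squares fit below is a hedged illustration of that transformation only, not the paper's Pareto-based evolutionary algorithm (the function name and point format are assumptions):

```python
def fit_helmert(src, dst):
    """Least-squares 2D similarity (4-parameter Helmert) transform mapping
    src points onto dst: u = a*x - b*y + tx, v = b*x + a*y + ty."""
    n = len(src)
    mx = sum(p[0] for p in src) / n
    my = sum(p[1] for p in src) / n
    mu = sum(p[0] for p in dst) / n
    mv = sum(p[1] for p in dst) / n
    s = sxu = sxv = 0.0
    for (x, y), (u, v) in zip(src, dst):
        xc, yc, uc, vc = x - mx, y - my, u - mu, v - mv
        s += xc * xc + yc * yc          # normalizer over centered coordinates
        sxu += xc * uc + yc * vc        # numerator for the scale/rotation term a
        sxv += xc * vc - yc * uc        # numerator for the rotation term b
    a = sxu / s
    b = sxv / s
    tx = mu - a * mx + b * my           # translation recovered from the means
    ty = mv - b * mx - a * my
    return a, b, tx, ty
```

On noise-free control points the fit recovers the generating parameters exactly; with uncertain historical data, the residuals of this fit are what a Pareto front would trade off against the number of discarded points.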

  7. Assessing the prediction accuracy of cure in the Cox proportional hazards cure model: an application to breast cancer data.

    PubMed

    Asano, Junichi; Hirakawa, Akihiro; Hamada, Chikuma

    2014-01-01

    A cure rate model is a survival model incorporating the cure rate with the assumption that the population contains both uncured and cured individuals. It is a powerful statistical tool for prognostic studies, especially in cancer. The cure rate is important for making treatment decisions in clinical practice. The proportional hazards (PH) cure model can predict the cure rate for each patient. This contains a logistic regression component for the cure rate and a Cox regression component to estimate the hazard for uncured patients. A measure for quantifying the predictive accuracy of the cure rate estimated by the Cox PH cure model is required, as there has been a lack of previous research in this area. We used the Cox PH cure model for the breast cancer data; however, the area under the receiver operating characteristic curve (AUC) could not be estimated because many patients were censored. In this study, we used imputation-based AUCs to assess the predictive accuracy of the cure rate from the PH cure model. We examined the precision of these AUCs using simulation studies. The results demonstrated that the imputation-based AUCs were estimable and their biases were negligibly small in many cases, although ordinary AUC could not be estimated. Additionally, we introduced the bias-correction method of imputation-based AUCs and found that the bias-corrected estimate successfully compensated the overestimation in the simulation studies. We also illustrated the estimation of the imputation-based AUCs using breast cancer data.
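For fully observed (uncensored) outcomes, the ordinary AUC that the authors could not estimate reduces to the Mann-Whitney statistic. The sketch below illustrates the AUC itself under that no-censoring assumption; it is not the paper's imputation-based estimator:

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney formulation: the probability that a randomly
    chosen positive case scores higher than a negative one (ties count 1/2)."""
    wins = 0.0
    for p in scores_pos:
        for q in scores_neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

Censoring breaks this computation because the true class of censored patients is unknown; the paper's imputation-based approach fills in those unknown statuses before averaging AUCs.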

  8. A procedure for NEPA assessment of selenium hazards associated with mining.

    PubMed

    Lemly, A Dennis

    2007-02-01

    This paper gives step-by-step instructions for assessing aquatic selenium hazards associated with mining. The procedure was developed to provide the U.S. Forest Service with a proactive capability for determining the risk of selenium pollution when it reviews mine permit applications in accordance with the National Environmental Policy Act (NEPA). The procedural framework is constructed in a decision-tree format in order to guide users through the various steps, provide a logical sequence for completing individual tasks, and identify key decision points. There are five major components designed to gather information on operational parameters of the proposed mine as well as key aspects of the physical, chemical, and biological environment surrounding it--geological assessment, mine operation assessment, hydrological assessment, biological assessment, and hazard assessment. Validation tests conducted at three mines where selenium pollution has occurred confirmed that the procedure will accurately predict ecological risks. In each case, it correctly identified and quantified selenium hazard, and indicated the steps needed to reduce this hazard to an acceptable level. By utilizing the procedure, NEPA workers can be confident in their ability to understand the risk of aquatic selenium pollution and take appropriate action. Although the procedure was developed for the Forest Service it should also be useful to other federal land management agencies that conduct NEPA assessments, as well as regulatory agencies responsible for issuing coal mining permits under the authority of the Surface Mining Control and Reclamation Act (SMCRA) and associated Section 401 water quality certification under the Clean Water Act. Mining companies will also benefit from the application of this procedure because priority selenium sources can be identified in relation to specific mine operating parameters. The procedure will reveal the point(s) at which there is a need to modify operating

  9. A probabilistic seismic risk assessment procedure for nuclear power plants: (II) Application

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    This paper presents the procedures and results of intensity- and time-based seismic risk assessments of a sample nuclear power plant (NPP) to demonstrate the risk-assessment methodology proposed in its companion paper. The intensity-based assessments include three sets of sensitivity studies to identify the impact of the following factors on the seismic vulnerability of the sample NPP, namely: (1) the description of fragility curves for primary and secondary components of NPPs, (2) the number of simulations of NPP response required for risk assessment, and (3) the correlation in responses between NPP components. The time-based assessment is performed as a series of intensity-based assessments. The studies illustrate the utility of the response-based fragility curves and the inclusion of the correlation in the responses of NPP components directly in the risk computation. ?? 2011 Published by Elsevier B.V.
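Fragility curves such as those discussed above are commonly modeled as lognormal functions of the ground-motion intensity measure. The sketch below uses that common assumption and is illustrative only, not the paper's response-based fragility curves:

```python
import math

def lognormal_fragility(im, median, beta):
    """Probability of component failure at intensity measure `im`, for a
    lognormal fragility curve with the given median capacity and
    logarithmic standard deviation beta (standard normal CDF via erf)."""
    z = math.log(im / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

At the median capacity the failure probability is 0.5 by construction, and it rises toward 1 as the intensity measure grows.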

  10. An epidemiologic critique of current microbial risk assessment practices: the importance of prevalence and test accuracy data.

    PubMed

    Gardner, Ian A

    2004-09-01

    Data deficiencies are impeding the development and validation of microbial risk assessment models. One such deficiency is the failure to adjust test-based (apparent) prevalence estimates to true prevalence estimates by correcting for the imperfect accuracy of tests that are used. Such adjustments will facilitate comparability of data from different populations and from the same population over time as tests change and the unbiased quantification of effects of mitigation strategies. True prevalence can be estimated from apparent prevalence using frequentist and Bayesian methods, but the latter are more flexible and can incorporate uncertainty in test accuracy and prior prevalence data. Both approaches can be used for single or multiple populations, but the Bayesian approach can better deal with clustered data, inferences for rare events, and uncertainty in multiple variables. Examples of prevalence inferences based on results of Salmonella culture are presented. The opportunity to adjust test-based prevalence estimates is predicated on the availability of sensitivity and specificity estimates. These estimates can be obtained from studies using archived gold standard (reference) samples, by screening with the new test and follow-up of test-positive and test-negative samples with a gold standard test, and by use of latent class methods, which make no assumptions about the true status of each sampling unit. Latent class analysis can be done with maximum likelihood and Bayesian methods, and an example of their use in the evaluation of tests for Toxoplasma gondii in pigs is presented. Guidelines are proposed for more transparent incorporation of test data into microbial risk assessments.
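The frequentist adjustment from apparent to true prevalence described above has a standard closed form, the Rogan-Gladen estimator; a minimal sketch (truncation to [0, 1] is the usual practical convention, and the estimator assumes an informative test, i.e. Se + Sp > 1):

```python
def true_prevalence(apparent, sensitivity, specificity):
    """Rogan-Gladen estimator: corrects a test-based (apparent) prevalence
    for imperfect sensitivity and specificity, truncated to [0, 1]."""
    tp = (apparent + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(tp, 0.0), 1.0)

# With Se = 0.90 and Sp = 0.95, an apparent Salmonella prevalence of 20%
# corresponds to a true prevalence of about 17.6%; an apparent prevalence
# below the false-positive rate (1 - Sp) truncates to zero.
```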

  11. Deriving bio-equivalents from in vitro bioassays: assessment of existing uncertainties and strategies to improve accuracy and reporting.

    PubMed

    Wagner, Martin; Vermeirssen, Etiënne L M; Buchinger, Sebastian; Behr, Maximilian; Magdeburg, Axel; Oehlmann, Jörg

    2013-08-01

    Bio-equivalents (e.g., 17β-estradiol or dioxin equivalents) are commonly employed to quantify the in vitro effects of complex human or environmental samples. However, there is no generally accepted data analysis strategy for estimating and reporting bio-equivalents. Therefore, the aims of the present study are to 1) identify common mathematical models for the derivation of bio-equivalents from the literature, 2) assess the ability of those models to correctly predict bio-equivalents, and 3) propose measures to reduce uncertainty in their calculation and reporting. We compiled a database of 234 publications that report bio-equivalents. From the database, we extracted 3 data analysis strategies commonly used to estimate bio-equivalents. These models are based on linear or nonlinear interpolation, and the comparison of effect concentrations (ECX ). To assess their accuracy, we employed simulated data sets in different scenarios. The results indicate that all models lead to a considerable misestimation of bio-equivalents if certain mathematical assumptions (e.g., goodness of fit, parallelism of dose-response curves) are violated. However, nonlinear interpolation is most suitable to predict bio-equivalents from single-point estimates. Regardless of the model, subsequent linear extrapolation of bio-equivalents generates additional inaccuracy if the prerequisite of parallel dose-response curves is not met. When all these factors are taken into consideration, it becomes clear that data analysis introduces considerable uncertainty in the derived bio-equivalents. To improve accuracy and transparency of bio-equivalents, we propose a novel data analysis strategy and a checklist for reporting Minimum Information about Bio-equivalent ESTimates (MIBEST).
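As a simplified illustration of the interpolation-based strategies the authors compare, a bio-equivalent can be derived from the ratio of effect concentrations of a reference compound and a sample. The helper below locates an EC50 by log-linear interpolation between bracketing data points; it is a hypothetical sketch, not the authors' proposed strategy, and its validity rests on the parallel dose-response assumption discussed above:

```python
import math

def ec50(concs, effects):
    """EC50 by log-linear interpolation between the two measured points
    bracketing 50% effect (concs ascending, effects as fractions of max)."""
    for i in range(len(effects) - 1):
        lo, hi = effects[i], effects[i + 1]
        if lo <= 0.5 <= hi:
            frac = (0.5 - lo) / (hi - lo)
            return 10 ** (math.log10(concs[i])
                          + frac * (math.log10(concs[i + 1]) - math.log10(concs[i])))
    raise ValueError("50% effect not bracketed by the data")

def bio_equivalent(sample_ec50, standard_ec50):
    """Bio-equivalent concentration: how much of the reference standard
    one unit of sample behaves like (ratio of effect concentrations)."""
    return standard_ec50 / sample_ec50
```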

  12. Measures of Diagnostic Accuracy: Basic Definitions

    PubMed Central

    Šimundić, Ana-Maria

    2009-01-01

    Diagnostic accuracy relates to the ability of a test to discriminate between the target condition and health. This discriminative potential can be quantified by measures of diagnostic accuracy such as sensitivity and specificity, predictive values, likelihood ratios, the area under the ROC curve, Youden's index and the diagnostic odds ratio. Different measures of diagnostic accuracy relate to different aspects of the diagnostic procedure: while some measures are used to assess the discriminative property of the test, others are used to assess its predictive ability. Measures of diagnostic accuracy are not fixed indicators of test performance: some are very sensitive to disease prevalence, while others are sensitive to the spectrum and definition of the disease. Furthermore, measures of diagnostic accuracy are extremely sensitive to the design of the study. Studies not meeting strict methodological standards usually over- or under-estimate the indicators of test performance and limit the applicability of the results of the study. The STARD initiative was a very important step toward improving the quality of reporting of studies of diagnostic accuracy. The STARD statement should be included in the Instructions to Authors by scientific journals, and authors should be encouraged to use the checklist whenever reporting their studies on diagnostic accuracy. Such efforts could make a substantial difference in the quality of reporting of studies of diagnostic accuracy and provide the best possible evidence for patient care. This brief review outlines some basic definitions and characteristics of the measures of diagnostic accuracy.
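All of the measures named in this review can be computed from a single 2x2 contingency table of test results against the true condition; a minimal sketch (function and key names are illustrative):

```python
def diagnostic_measures(tp, fp, fn, tn):
    """Standard measures of diagnostic accuracy from a 2x2 table of
    true/false positives and negatives."""
    se = tp / (tp + fn)                 # sensitivity
    sp = tn / (tn + fp)                 # specificity
    return {
        "sensitivity": se,
        "specificity": sp,
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "lr_pos": se / (1 - sp),        # positive likelihood ratio
        "lr_neg": (1 - se) / sp,        # negative likelihood ratio
        "youden_j": se + sp - 1,        # Youden's index
        "dor": (tp * tn) / (fp * fn),   # diagnostic odds ratio
    }
```

Note that sensitivity, specificity, and the likelihood ratios are prevalence-free, whereas the predictive values shift with disease prevalence, which is exactly the sensitivity-to-prevalence distinction the review draws.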

  13. Use of Selected Goodness-of-Fit Statistics to Assess the Accuracy of a Model of Henry Hagg Lake, Oregon

    NASA Astrophysics Data System (ADS)

    Rounds, S. A.; Sullivan, A. B.

    2004-12-01

    Assessing a model's ability to reproduce field data is a critical step in the modeling process. For any model, some method of determining goodness-of-fit to measured data is needed to aid in calibration and to evaluate model performance. Visualizations and graphical comparisons of model output are an excellent way to begin that assessment. At some point, however, model performance must be quantified. Goodness-of-fit statistics, including the mean error (ME), mean absolute error (MAE), root mean square error, and coefficient of determination, typically are used to measure model accuracy. Statistical tools such as the sign test or Wilcoxon test can be used to test for model bias. The runs test can detect phase errors in simulated time series. Each statistic is useful, but each has its limitations. None provides a complete quantification of model accuracy. In this study, a suite of goodness-of-fit statistics was applied to a model of Henry Hagg Lake in northwest Oregon. Hagg Lake is a man-made reservoir on Scoggins Creek, a tributary to the Tualatin River. Located on the west side of the Portland metropolitan area, the Tualatin Basin is home to more than 450,000 people. Stored water in Hagg Lake helps to meet the agricultural and municipal water needs of that population. Future water demands have caused water managers to plan for a potential expansion of Hagg Lake, doubling its storage to roughly 115,000 acre-feet. A model of the lake was constructed to evaluate the lake's water quality and estimate how that quality might change after raising the dam. The laterally averaged, two-dimensional, U.S. Army Corps of Engineers model CE-QUAL-W2 was used to construct the Hagg Lake model. Calibrated for the years 2000 and 2001 and confirmed with data from 2002 and 2003, modeled parameters included water temperature, ammonia, nitrate, phosphorus, algae, zooplankton, and dissolved oxygen. Several goodness-of-fit statistics were used to quantify model accuracy and bias. Model
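The goodness-of-fit statistics named above (ME, MAE, RMSE, and the coefficient of determination) can be computed directly from paired observed and simulated values; a minimal sketch, not the CE-QUAL-W2 calibration code:

```python
def fit_statistics(observed, simulated):
    """Mean error, mean absolute error, RMSE, and coefficient of
    determination (R^2) between field observations and model output."""
    n = len(observed)
    errors = [s - o for o, s in zip(observed, simulated)]
    me = sum(errors) / n                               # signed bias
    mae = sum(abs(e) for e in errors) / n              # typical error size
    rmse = (sum(e * e for e in errors) / n) ** 0.5     # penalizes outliers
    mean_obs = sum(observed) / n
    ss_res = sum(e * e for e in errors)
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    r2 = 1.0 - ss_res / ss_tot
    return me, mae, rmse, r2
```

A uniform simulated offset illustrates the limits the abstract notes: ME exposes the bias, but R^2 alone cannot distinguish it from random scatter of the same magnitude.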

  14. NASA Safety Standard: Guidelines and Assessment Procedures for Limiting Orbital Debris

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Collision with orbital debris is a hazard of growing concern as historically accepted practices and procedures have allowed man-made objects to accumulate in orbit. To limit future debris generation, NASA Management Instruction (NMI) 1700.8, 'Policy to Limit Orbital Debris Generation,' was issued in April of 1993. The NMI requires each program to conduct a formal assessment of the potential to generate orbital debris. This document serves as a companion to NMI 1700.8 and provides each NASA program with specific guidelines and assessment methods to assure compliance with the NMI. Each main debris assessment issue (e.g., Post Mission Disposal) is developed in a separate chapter.

  15. A procedure for assessing intervention fidelity in experiments testing educational and behavioral interventions.

    PubMed

    Nelson, Michael C; Cordray, David S; Hulleman, Chris S; Darrow, Catherine L; Sommer, Evan C

    2012-10-01

    An intervention's effectiveness is judged by whether it produces positive outcomes for participants, with the randomized experiment being the gold standard for determining intervention effects. However, the intervention-as-implemented in an experiment frequently differs from the intervention-as-designed, making it unclear whether unfavorable results are due to an ineffective intervention model or the failure to implement the model fully. It is therefore vital to accurately and systematically assess intervention fidelity and, where possible, incorporate fidelity data in the analysis of outcomes. This paper elaborates a five-step procedure for systematically assessing intervention fidelity in the context of randomized controlled trials (RCTs), describes the advantages of assessing fidelity with this approach, and uses examples to illustrate how this procedure can be applied.

  16. Pitfalls at the root of facial assessment on photographs: a quantitative study of accuracy in positioning facial landmarks.

    PubMed

    Cummaudo, M; Guerzoni, M; Marasciuolo, L; Gibelli, D; Cigada, A; Obertovà, Z; Ratnayake, M; Poppa, P; Gabriel, P; Ritz-Timme, S; Cattaneo, C

    2013-05-01

    In recent years, facial analysis has also gained great interest in forensic anthropology. The application of facial landmarks may bring about relevant advantages for the analysis of 2D images by measuring distances and extracting quantitative indices. However, this is a complex task which depends upon the variability in positioning facial landmarks. In addition, the literature provides only general indications concerning the reliability of positioning facial landmarks on photographic material, and no study is available concerning the specific errors which may be encountered in such an operation. The aim of this study is to analyze the inter- and intra-observer error in defining facial landmarks on photographs by using software specifically developed for this purpose. Twenty-four operators were requested to define 22 facial landmarks on frontal view photographs and 11 on lateral view images; in addition, three operators repeated the procedure on the same photographs 20 times (at a distance of 24 h). In the frontal view, the landmarks with the least dispersion were the pupil, cheilion, endocanthion, and stomion (sto), and the landmarks with the highest dispersion were gonion, zygion, frontotemporale, tragion, and selion (se). In the lateral view, the landmarks with the least dispersion were se, pronasale, subnasale, and sto, whereas the landmarks with the highest dispersion were gnathion, pogonion, and tragion. Results confirm that few anatomical points can be defined with the highest accuracy and show the importance of the preliminary investigation of reliability in positioning facial landmarks. PMID:23515681

  18. A Proposed Model for Selecting Measurement Procedures for the Assessment and Treatment of Problem Behavior.

    PubMed

    LeBlanc, Linda A; Raetz, Paige B; Sellers, Tyra P; Carr, James E

    2016-03-01

    Practicing behavior analysts frequently assess and treat problem behavior as part of their ongoing job responsibilities. Effective measurement of problem behavior is critical to success in these activities because some measures of problem behavior provide more accurate and complete information about the behavior than others. However, not every measurement procedure is appropriate for every problem behavior and therapeutic circumstance. We summarize the most commonly used measurement procedures, describe the contexts for which they are most appropriate, and propose a clinical decision-making model for selecting measurement procedures given certain features of the behavior and constraints of the therapeutic environment. PMID:27606232

  19. Positional Accuracy Assessment of the Openstreetmap Buildings Layer Through Automatic Homologous Pairs Detection: the Method and a Case Study

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.; Zamboni, G.

    2016-06-01

    OpenStreetMap (OSM) is currently the largest openly licensed collection of geospatial data. As OSM is increasingly exploited in a variety of applications, research has paid great attention to the assessment of its quality. This work focuses on assessing the quality of OSM buildings. While most of the studies available in the literature are limited to the evaluation of OSM building completeness, this work proposes an original approach to assess the positional accuracy of OSM buildings based on comparison with a reference dataset. The comparison relies on a quasi-automated detection of homologous pairs on the two datasets. Based on the homologous pairs found, warping algorithms such as affine transformations and multi-resolution splines can be applied to the OSM buildings to generate a new version having an optimal local match to the reference layer. A quality assessment of the OSM buildings of Milan Municipality (Northern Italy), covering an area of about 180 km2, is then presented. After computing some measures of completeness, the algorithm based on homologous points is run using the building layer of the official vector cartography of Milan Municipality as the reference dataset. Approximately 100000 homologous points are found, which show a systematic translation of about 0.4 m in both the X and Y directions and a mean distance of about 0.8 m between the datasets. Besides its efficiency and high degree of automation, the algorithm generates a warped version of OSM buildings which, having by definition a closest match to the reference buildings, can eventually be integrated in the OSM database.
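The systematic translation and mean distance reported above are simple statistics over the detected homologous pairs; a hedged sketch of how such offsets could be computed (the pair format below is an assumption for illustration, not taken from the paper):

```python
def offset_statistics(pairs):
    """Mean X/Y translation and mean Euclidean distance between homologous
    point pairs, given as ((x_osm, y_osm), (x_ref, y_ref)) tuples."""
    n = len(pairs)
    dx = sum(r[0] - o[0] for o, r in pairs) / n   # systematic shift in X
    dy = sum(r[1] - o[1] for o, r in pairs) / n   # systematic shift in Y
    mean_dist = sum(((r[0] - o[0]) ** 2 + (r[1] - o[1]) ** 2) ** 0.5
                    for o, r in pairs) / n        # mean point-to-point distance
    return dx, dy, mean_dist
```

A nonzero (dx, dy) indicates a systematic translation of the OSM layer relative to the reference, of the kind the 0.4 m X/Y offset above describes, while mean_dist also captures the non-systematic scatter.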

  20. Building-In Quality Rather than Assessing Quality Afterwards: A Technological Solution to Ensuring Computational Accuracy in Learning Materials

    ERIC Educational Resources Information Center

    Dunn, Peter

    2008-01-01

    Quality encompasses a very broad range of ideas in learning materials, yet the accuracy of the content is often overlooked as a measure of quality. Various aspects of accuracy are briefly considered, and the issue of computational accuracy is then considered further. When learning materials are produced containing the results of mathematical…

  1. WebRASP: a server for computing energy scores to assess the accuracy and stability of RNA 3D structures

    PubMed Central

    Norambuena, Tomas; Cares, Jorge F.; Capriotti, Emidio; Melo, Francisco

    2013-01-01

    Summary: The understanding of the biological role of RNA molecules has changed. Although it is widely accepted that RNAs play important regulatory roles without necessarily coding for proteins, the functions of many of these non-coding RNAs are unknown. Thus, determining or modeling the 3D structure of RNA molecules as well as assessing their accuracy and stability has become of great importance for characterizing their functional activity. Here, we introduce a new web application, WebRASP, that uses knowledge-based potentials for scoring RNA structures based on distance-dependent pairwise atomic interactions. This web server allows the users to upload a structure in PDB format, select several options to visualize the structure and calculate the energy profile. The server contains online help, tutorials and links to other related resources. We believe this server will be a useful tool for predicting and assessing the quality of RNA 3D structures. Availability and implementation: The web server is available at http://melolab.org/webrasp. It has been tested on the most popular web browsers and requires Java plugin for Jmol visualization. Contact: fmelo@bio.puc.cl PMID:23929030
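Distance-dependent pairwise potentials of the kind applied by such scoring servers can be sketched generically: atom-pair distances are binned and looked up in a precomputed energy table. The `pair_potential` table and atom typing below are hypothetical placeholders, not the actual RASP potential:

```python
import math
from itertools import combinations

def pairwise_energy(atoms, pair_potential, bin_width=1.0, cutoff=10.0):
    """Score a structure with a distance-dependent pairwise potential.

    atoms: list of (atom_type, (x, y, z)) tuples.
    pair_potential: dict mapping (type_a, type_b, distance_bin) -> energy,
    e.g. precomputed as -log(observed/expected) from known structures.
    """
    total = 0.0
    for (ta, pa), (tb, pb) in combinations(atoms, 2):
        d = math.dist(pa, pb)
        if d < cutoff:
            # Order the type pair canonically so (C, P) and (P, C) match.
            key = (min(ta, tb), max(ta, tb), int(d // bin_width))
            total += pair_potential.get(key, 0.0)
    return total

# Three toy atoms; only the C-P pair falls inside the 10 A cutoff.
atoms = [("C", (0.0, 0.0, 0.0)), ("P", (3.2, 0.0, 0.0)), ("C", (20.0, 0.0, 0.0))]
energy = pairwise_energy(atoms, {("C", "P", 3): -1.2})   # -1.2
```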

  2. Accuracy of the third molar index for assessing the legal majority of 18 years in Turkish population.

    PubMed

    Gulsahi, Ayse; De Luca, Stefano; Cehreli, S Burcak; Tirali, R Ebru; Cameriere, Roberto

    2016-09-01

    In the last few years, forced and unregistered child marriage has widely increased in Turkey. The aim of this study was to test the accuracy of the cut-off value of 0.08 for the third molar index (I3M) in assessing the legal adult age of 18 years. Digital panoramic images of 293 Turkish children and young adults (165 girls and 128 boys), aged between 14 and 22 years, were analysed. Age distribution gradually decreases as I3M increases in both girls and boys. For girls, the sensitivity was 85.9% (95% CI 77.1-92.8%) and specificity was 100%. The proportion of correctly classified individuals was 92.7%. For boys, the sensitivity was 94.6% (95% CI 88.1-99.8%) and specificity was 100%. The proportion of correctly classified individuals was 97.6%. The cut-off value of 0.08 is a useful method for assessing whether or not a subject is older than 18 years of age.
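The sensitivity, specificity, and proportion-correct figures quoted above all derive from a 2×2 classification table; a minimal sketch with hypothetical counts (not the study's raw data):

```python
def diagnostic_accuracy(tp, fn, tn, fp):
    """Sensitivity, specificity, and proportion correctly classified
    from a 2x2 table (positive = classified as adult by the cut-off)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Hypothetical counts for illustration:
# 85 adults correctly flagged, 15 adults missed, 100 minors, none misflagged.
sens, spec, acc = diagnostic_accuracy(tp=85, fn=15, tn=100, fp=0)
# sens = 0.85, spec = 1.0, acc = 0.925
```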

  4. Considering the normative, systemic and procedural dimensions in indicator-based sustainability assessments in agriculture

    SciTech Connect

    Binder, Claudia R.; Feola, Giuseppe; Steinberger, Julia K.

    2010-02-15

    This paper develops a framework for evaluating sustainability assessment methods by separately analyzing their normative, systemic and procedural dimensions as suggested by Wiek and Binder [Wiek, A, Binder, C. Solution spaces for decision-making - a sustainability assessment tool for city-regions. Environ Impact Asses Rev 2005, 25: 589-608.]. The framework is then used to characterize indicator-based sustainability assessment methods in agriculture. For a long time, sustainability assessment in agriculture has focused mostly on environmental and technical issues, thus neglecting the economic and, above all, the social aspects of sustainability, the multi-functionality of agriculture and the applicability of the results. In response to these shortcomings, several integrative sustainability assessment methods have been developed for the agricultural sector. This paper reviews seven of these, chosen to represent the diversity of tools developed in this area. The reviewed assessment methods can be categorized into three types: (i) top-down farm assessment methods; (ii) top-down regional assessment methods with some stakeholder participation; (iii) bottom-up, integrated participatory or transdisciplinary methods with stakeholder participation throughout the process. The results readily show the trade-offs encountered when selecting an assessment method. A clear, standardized, top-down procedure potentially allows for benchmarking and comparing results across regions and sites. However, this comes at the cost of system specificity. As the top-down methods often have low stakeholder involvement, the application and implementation of the results might be difficult. Our analysis suggests that to include the aspects mentioned above in agricultural sustainability assessment, the bottom-up, integrated participatory or transdisciplinary methods are the most suitable ones.

  5. AN ACCURACY ASSESSMENT OF 1992 LANDSAT-MSS DERIVED LAND COVER FOR THE UPPER SAN PEDRO WATERSHED (U.S./MEXICO)

    EPA Science Inventory

    The utility of Digital Orthophoto Quads (DOQS) in assessing the classification accuracy of land cover derived from Landsat MSS data was investigated. Initially, the suitability of DOQs in distinguishing between different land cover classes was assessed using high-resolution airbo...

  6. The accuracy of a patient or parent-administered bleeding assessment tool administered in a paediatric haematology clinic.

    PubMed

    Lang, A T; Sturm, M S; Koch, T; Walsh, M; Grooms, L P; O'Brien, S H

    2014-11-01

    Classifying and describing bleeding symptoms is essential in the diagnosis and management of patients with mild bleeding disorders (MBDs). There has been increased interest in the use of bleeding assessment tools (BATs) to more objectively quantify the presence and severity of bleeding symptoms. To date, the administration of BATs has been performed almost exclusively by clinicians; the accuracy of a parent-proxy BAT has not been studied. Our objective was to determine the accuracy of a parent-administered BAT by measuring the level of agreement between parent and clinician responses to the Condensed MCMDM-1VWD Bleeding Questionnaire. Our cross-sectional study included children 0-21 years presenting to a haematology clinic for initial evaluation of a suspected MBD or follow-up evaluation of a previously diagnosed MBD. The parent/caregiver completed a modified version of the BAT; the clinician separately completed the BAT through interview. The mean parent-report bleeding score (BS) was 6.09 (range: -2 to 25); the mean clinician report BS was 4.54 (range: -1 to 17). The mean percentage of agreement across all bleeding symptoms was 78% (mean κ = 0.40; Gwet's AC1 = 0.74). Eighty percent of the population had an abnormal BS (defined as ≥2) when rated by parents and 76% had an abnormal score when rated by clinicians (86% agreement, κ = 0.59, Gwet's AC1 = 0.79). While parents tended to over-report bleeding as compared to clinicians, overall, BSs were similar between groups. These results lend support for further study of a modified proxy-report BAT as a clinical and research tool.
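The agreement statistics reported above (percent agreement and Cohen's κ) can be computed from paired ratings as follows; the toy symptom ratings are illustrative, not study data, and Gwet's AC1 (which uses a different chance-agreement term) is omitted:

```python
from collections import Counter

def agreement_and_kappa(rater_a, rater_b):
    """Observed percent agreement and Cohen's kappa for paired ratings."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the two raters' marginal frequencies.
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))
    return p_o, (p_o - p_e) / (1 - p_e)

# Toy symptom-present (1) / symptom-absent (0) ratings for 8 items.
parent = [1, 1, 0, 0, 1, 0, 1, 1]
clinician = [1, 0, 0, 0, 1, 0, 1, 1]
p_o, kappa = agreement_and_kappa(parent, clinician)   # 0.875, 0.75
```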

  7. Assessment of the accuracy of coupled cluster perturbation theory for open-shell systems. II. Quadruples expansions

    NASA Astrophysics Data System (ADS)

    Eriksen, Janus J.; Matthews, Devin A.; Jørgensen, Poul; Gauss, Jürgen

    2016-05-01

    We extend our assessment of the potential of perturbative coupled cluster (CC) expansions for a test set of open-shell atoms and organic radicals to the description of quadruple excitations. Namely, the second- through sixth-order models of the recently proposed CCSDT(Q-n) quadruples series [J. J. Eriksen et al., J. Chem. Phys. 140, 064108 (2014)] are compared to the prominent CCSDT(Q) and ΛCCSDT(Q) models. From a comparison of the models in terms of their recovery of total CC singles, doubles, triples, and quadruples (CCSDTQ) energies, we find that the performance of the CCSDT(Q-n) models is independent of the reference used (unrestricted or restricted (open-shell) Hartree-Fock), in contrast to the CCSDT(Q) and ΛCCSDT(Q) models, for which the accuracy is strongly dependent on the spin of the molecular ground state. By further comparing the ability of the models to recover relative CCSDTQ total atomization energies, the discrepancy between them is found to be even more pronounced, stressing how a balanced description of both closed- and open-shell species—as found in the CCSDT(Q-n) models—is indeed of paramount importance if any perturbative CC model is to be of chemical relevance for high-accuracy applications. In particular, the third-order CCSDT(Q-3) model is found to offer an encouraging alternative to the existing choices of quadruples models used in modern computational thermochemistry, since the model is still only of moderate cost, albeit markedly more costly than, e.g., the CCSDT(Q) and ΛCCSDT(Q) models.

  9. [CONTROVERSIES REGARDING THE ACCURACY AND LIMITATIONS OF FROZEN SECTION IN THYROID PATHOLOGY: AN EVIDENCE-BASED ASSESSMENT].

    PubMed

    Stanciu-Pop, C; Pop, F C; Thiry, A; Scagnol, I; Maweja, S; Hamoir, E; Beckers, A; Meurisse, M; Grosu, F; Delvenne, Ph

    2015-12-01

    Palpable thyroid nodules are present clinically in 4-7% of the population and their prevalence increases to 50-67% when using high-resolution neck ultrasonography. By contrast, thyroid carcinoma (TC) represents only 5-20% of these nodules, which underlines the need for an appropriate approach to avoid unnecessary surgery. Frozen section (FS) has been used for more than 40 years in thyroid surgery to establish the diagnosis of malignancy. However, a controversy persists regarding the accuracy of FS, and its place in thyroid pathology has changed with the emergence of fine-needle aspiration (FNA). A PubMed Medline and SpringerLink search was made covering the period from January 2000 to June 2012 to assess the accuracy of FS, its limitations and indications for the diagnosis of thyroid nodules. Twenty publications encompassing 8,567 subjects were included in our study. The average rate of TC among thyroid nodules in the analyzed studies was 15.5%. The ability of FS to detect cancer, expressed by its sensitivity (Ss), was 67.5%. More than two thirds of the authors considered FS useful exclusively in the presence of doubtful FNA and for guiding the surgical extension in cases confirmed as malignant by FNA; however, only 33% accepted FS as a routine examination for the management of thyroid nodules. The influence of FS on the surgical reintervention rate in nodular thyroid pathology was considered to be negligible by most studies, whereas 31% of the authors thought that FS has a favorable benefit by decreasing the number of surgical re-interventions. In conclusion, the role of FS in thyroid pathology evolved from a mandatory component of thyroid surgery to an optional examination after a pre-operative FNA cytology. The accuracy of FS seems to provide no sufficient additional benefit and most experts support its use only in the presence of equivocal or suspicious cytological features, for guiding the surgical extension in cases confirmed as malignant by FNA and for the

  10. User guide for WIACX: A transonic wind-tunnel wall interference assessment and correction procedure for the NTF

    NASA Technical Reports Server (NTRS)

    Garriz, Javier A.; Haigler, Kara J.

    1992-01-01

    A three-dimensional transonic Wind-tunnel Interference Assessment and Correction (WIAC) procedure developed specifically for use in the National Transonic Facility (NTF) at NASA Langley Research Center is discussed. This report is a user manual for the codes comprising the correction procedure. It also includes listings of sample procedures and input files for running a sample case and plotting the results.

  11. Assessing the effect of data pretreatment procedures for principal components analysis of chromatographic data.

    PubMed

    McIlroy, John W; Smith, Ruth Waddell; McGuffin, Victoria L

    2015-12-01

    Following publication of the National Academy of Sciences report "Strengthening Forensic Science in the United States: A Path Forward", there has been increasing interest in the application of multivariate statistical procedures for the evaluation of forensic evidence. However, prior to statistical analysis, variance from sources other than the sample must be minimized through application of data pretreatment procedures. This is necessary to ensure that subsequent statistical analysis of the data provides meaningful results. The purpose of this work was to evaluate the effect of pretreatment procedures on multivariate statistical analysis of chromatographic data obtained for a reference set of diesel fuels. Diesel was selected due to its chemical complexity and forensic relevance, both for fire debris and environmental forensic applications. Principal components analysis (PCA) was applied to the untreated chromatograms to assess association of replicates and discrimination among the different diesel samples. The chromatograms were then pretreated by sequentially applying the following procedures: background correction, smoothing, retention-time alignment, and normalization. The effect of each procedure on association and discrimination was evaluated based on the association of replicates in the PCA scores plot. For these data, background correction and smoothing offered minimal improvement, whereas alignment and normalization offered the greatest improvement in the association of replicates and discrimination among highly similar samples. Further, prior to pretreatment, the first principal component accounted for only non-sample sources of variance. Following pretreatment, these sources were minimized and the first principal component accounted for significant chemical differences among the diesel samples. These results highlight the need for pretreatment procedures and provide a metric to assess the effect of pretreatment on subsequent multivariate statistical
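The four pretreatment steps named above can be sketched as a simple pipeline on a synthetic trace; the specific filters here (constant-baseline subtraction, moving-average smoothing, cross-correlation alignment, total-area normalization) are illustrative stand-ins for the authors' methods:

```python
import numpy as np

def baseline_correct(x):
    """Subtract a constant baseline estimated from the trace minimum."""
    return x - x.min()

def smooth(x, w=5):
    """Moving-average smoothing (a stand-in for e.g. Savitzky-Golay)."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def align_to(x, ref):
    """Shift x to maximize its cross-correlation with a reference trace."""
    lag = np.argmax(np.correlate(x, ref, mode="full")) - (len(ref) - 1)
    return np.roll(x, -lag)

def normalize(x):
    """Total-area normalization so traces are comparable in scale."""
    return x / x.sum()

# Synthetic "chromatogram": a Gaussian peak shifted by 10 points and
# riding on a constant offset, versus an unshifted reference.
t = np.arange(200.0)
ref = np.exp(-((t - 100) ** 2) / 50)
raw = 2 * np.exp(-((t - 110) ** 2) / 50) + 0.5
trace = normalize(align_to(smooth(baseline_correct(raw)), ref))
# The peak is now aligned with the reference and the total area is 1.
```

With variance from baseline, shift, and scale removed, a subsequent PCA of many such traces captures chemical rather than instrumental differences.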

  13. Assessment of bleeding during minor oral surgical procedures and extraction in patients on anticoagulant therapy

    PubMed Central

    Jimson, S.; Amaldhas, Julius; Jimson, Sudha; Kannan, I.; Parthiban, J.

    2015-01-01

    Introduction: The risk of postoperative hemorrhage from oral surgical procedures has been a concern in the treatment of patients who are receiving long-term anticoagulation therapy. A study was undertaken in our institution to address questions about the amount and severity of bleeding associated with minor outpatient oral surgery procedures by assessing bleeding in patients who did not alter their anticoagulant regimen. Subjects and Methods: Eighty-three patients receiving long-term anticoagulant therapy visited the Department of Oral and Maxillofacial Surgery from May 2010 to October 2011 for extractions and minor oral surgical procedures. Each patient was required to undergo preoperative assessment of prothrombin time (PT) and measurement of the international normalized ratio. Fifty-six patients with preoperative PT values within the therapeutic range 3–4 were included in the study. The patients' ages ranged between 30 and 75 years. Surgispon was applied following the procedure. Extraction of teeth was performed with minimal trauma to the surrounding tissues, the socket margins were sutured, and sutures were removed after 5 days. Results: There was no significant incidence of prolonged or excessive hemorrhage or wound infection, and the healing process was normal. PMID:26015691

  14. A procedure of landscape services assessment based on mosaics of patches and boundaries.

    PubMed

    Martín de Agar, Pilar; Ortega, Marta; de Pablo, Carlos L

    2016-09-15

    We develop a procedure for assessing the environmental value of landscape mosaics that simultaneously considers the values of land use patches and the values of the boundaries between them. These boundaries indicate the ecological interactions between the patches. A landscape mosaic is defined as a set of patches and the boundaries between them and corresponds to a spatial pattern of ecological interactions. The procedure is performed in two steps: (i) an environmental assessment of land use patches by means of a function that integrates values based on the goods and services the patches provide, and (ii) an environmental valuation of mosaics using a function that integrates the environmental values of their patches and the types and frequencies of the boundaries between them. This procedure allows us to measure how changes in land uses or in their spatial arrangement cause variations in the environmental value of landscape mosaics and therefore in that of the whole landscape. The procedure was tested in the Sierra Norte of Madrid (central Spain). The results show that the environmental values of the landscape depend not only on the land use patches but also on the values associated with the pattern of the boundaries within the mosaics. The results also highlight the importance of the boundaries between land use patches as determinants of the goods and services provided by the landscape.
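The two-step procedure lends itself to a simple sketch; the aggregation functions and weights below are hypothetical placeholders, since the paper's actual valuation functions are not reproduced here:

```python
def patch_value(service_scores):
    """Step (i): integrate a patch's goods-and-services scores (here a mean)."""
    return sum(service_scores.values()) / len(service_scores)

def mosaic_value(patch_values, boundary_counts, boundary_weights, alpha=0.5):
    """Step (ii): combine mean patch value with the frequency-weighted value
    of the boundary types between patches; alpha balances the two terms."""
    pv = sum(patch_values) / len(patch_values)
    n = sum(boundary_counts.values())
    bv = sum(boundary_weights[b] * c for b, c in boundary_counts.items()) / n
    return alpha * pv + (1 - alpha) * bv

# Hypothetical two-patch mosaic with two boundary types:
v1 = patch_value({"water_regulation": 1.0, "habitat": 3.0})   # 2.0
v2 = patch_value({"water_regulation": 4.0, "habitat": 4.0})   # 4.0
mv = mosaic_value([v1, v2], {"abrupt": 1, "gradual": 3},
                  {"abrupt": 0.0, "gradual": 1.0})            # 1.875
```

Recomputing `mosaic_value` after changing a land use or a boundary frequency quantifies how the spatial rearrangement alters the landscape's value, which is the comparison the procedure enables.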

  15. Vertical Accuracy Assessment of 30-m Resolution ALOS, ASTER, and SRTM Global DEMs over Northeastern Mindanao, Philippines

    NASA Astrophysics Data System (ADS)

    Santillan, J. R.; Makinano-Santillan, M.

    2016-06-01

    The ALOS World 3D - 30 m (AW3D30), ASTER Global DEM Version 2 (GDEM2), and SRTM-30 m are Digital Elevation Models (DEMs) that have been made available to the general public free of charge. An important feature of these DEMs is their unprecedented horizontal resolution of 30-m and almost global coverage. The very recent release of these DEMs, particularly AW3D30 and SRTM- 30 m, calls for opportunities for the conduct of localized assessment of the DEM's quality and accuracy to verify their suitability for a wide range of applications in hydrology, geomorphology, archaelogy, and many others. In this study, we conducted a vertical accuracy assessment of these DEMs by comparing the elevation of 274 control points scattered over various sites in northeastern Mindanao, Philippines. The elevations of these control points (referred to the Mean Sea Level, MSL) were obtained through 3rd order differential levelling using a high precision digital level, and their horizontal positions measured using a global positioning system (GPS) receiver. These control points are representative of five (5) land-cover classes namely brushland (45 points), built-up (32), cultivated areas (97), dense vegetation (74), and grassland (26). Results showed that AW3D30 has the lowest Root Mean Square Error (RMSE) of 5.68 m, followed by SRTM-30 m (RMSE = 8.28 m), and ASTER GDEM2 (RMSE = 11.98 m). While all the three DEMs overestimated the true ground elevations, the mean and standard deviations of the differences in elevations were found to be lower in AW3D30 compared to SRTM-30 m and ASTER GDEM2. The superiority of AW3D30 over the other two DEMS was also found to be consistent even under different landcover types, with AW3D30's RMSEs ranging from 4.29 m (built-up) to 6.75 m (dense vegetation). For SRTM-30 m, the RMSE ranges from 5.91 m (built-up) to 10.42 m (brushland); for ASTER
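The per-class RMSE comparison described above reduces to grouping DEM-minus-reference elevation differences by land cover; a minimal sketch with hypothetical control points:

```python
import math
from collections import defaultdict

def rmse_by_class(records):
    """RMSE of (DEM minus reference elevation) grouped by land-cover class.

    records: iterable of (land_cover, dem_elev_m, reference_elev_m) tuples.
    """
    sq = defaultdict(list)
    for cover, dem, ref in records:
        sq[cover].append((dem - ref) ** 2)
    return {c: math.sqrt(sum(v) / len(v)) for c, v in sq.items()}

# Hypothetical levelling control points (cover, DEM elev, reference elev):
pts = [("built-up", 12.0, 10.0), ("built-up", 11.0, 10.0),
       ("grassland", 25.0, 22.0)]
rmse = rmse_by_class(pts)   # {"built-up": ~1.58, "grassland": 3.0}
```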

  16. Mild traumatic brain injury (MTBI): assessment and treatment procedures used by speech-language pathologists (SLPs).

    PubMed

    Duff, Melissa C; Proctor, Adele; Haley, Katarina

    2002-09-01

    The purposes of this study were to identify how individuals with MTBI are assessed, to determine the referral process to and from speech-language pathologists (SLPs), to describe the frequency, structure, and nature of treatment, to identify how individuals with MTBI and their families are educated about the injury and counselled, and to assess current follow-up procedures. One-hundred and forty-three hospital and rehabilitation centre based SLPs from North Carolina and Illinois responded to a survey developed to address these areas of interest. Findings indicated current diagnostic tools used by SLPs lack the sensitivity to detect the subtle cognitive communication deficits associated with MTBI, referral and follow-up procedures are not sufficiently implemented in facilities to meet the growing needs of individuals with MTBI, and SLPs would benefit from increased training regarding the management of individuals with MTBI including educating and counselling patients and their families.

  17. A comparative assessment of commonly employed staining procedures for the diagnosis of cryptosporidiosis.

    PubMed

    Moodley, D; Jackson, T F; Gathiram, V; van den Ende, J

    1991-03-16

    Following an increase in the number of reports of Cryptosporidium infections and the problems encountered in detecting these organisms in faecal smears, a comparative assessment of a modification of the Sheather's flotation technique and other commonly employed staining procedures proved the modified Sheather's technique to be most useful in identifying Cryptosporidium oocysts in diarrhoeal stools. This technique not only detected the parasite in the highest number of stools but also proved to be cost-effective and the least time-consuming. Other staining techniques assessed were the modified Ziehl-Neelsen, safranin-methylene blue and auramine-phenol fluorescence. Both the modified Ziehl-Neelsen and the auramine-phenol fluorescence procedures produced nonspecific staining, while the safranin-methylene blue method was found to be the least sensitive technique.

  18. Assessment System for Aircraft Noise (ASAN) citation database. Volume 3: New citation review procedures

    NASA Astrophysics Data System (ADS)

    Reddingius, Nicolaas; Kugler, Andrew B.

    1989-12-01

    The Assessment System for Aircraft Noise (ASAN) includes a database of several thousand references to the literature on the impact of noise and sonic booms on humans, animals and structures. Bibliographic data, abstracts and critical reviews of key documents can be retrieved. A systematic methodology for the selection and evaluation of new citations to be added to the database consistent with the procedures used in CITASAN is described.

  19. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    SciTech Connect

    Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L.

    1995-10-01

    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all of the 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population that actually failed, and found a statistically significant factor-of-two bias on average.
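The first bias test described above compares the mean predicted HEP with the observed failure fraction (45 of 4071 tasks); a sketch with a normal-approximation confidence interval, where the mean-HEP value is hypothetical and chosen only to illustrate a factor-of-two bias (the report gives the factor, not the underlying mean):

```python
import math

def failure_fraction_ci(failures, total, z=1.96):
    """Observed failure fraction with a normal-approximation 95% CI."""
    p = failures / total
    se = math.sqrt(p * (1 - p) / total)
    return p, (p - z * se, p + z * se)

def bias_factor(mean_predicted_hep, observed_p):
    """Ratio of predicted to observed failure probability (>1 = conservative)."""
    return mean_predicted_hep / observed_p

# Counts from the study: 45 failures among 4071 critical tasks.
p, (lo, hi) = failure_fraction_ci(45, 4071)
# Hypothetical mean ASEP HEP, for illustration only.
bias = bias_factor(0.0221, p)   # roughly 2
```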

  20. 34 CFR 668.149 - Special provisions for the approval of assessment procedures for special populations for whom no...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Special provisions for the approval of assessment procedures for special populations for whom no tests are reasonably available. 668.149 Section 668.149... the approval of assessment procedures for special populations for whom no tests are...

  1. Sacramento City College Assessment Center Research Report: Assessment Procedures, Fall 1983 - Fall 1984.

    ERIC Educational Resources Information Center

    Haase, M.; Caffrey, Patrick

    Studies and analyses conducted by the Assessment Center at Sacramento City College (SCC) between fall 1983 and fall 1984 provided the data on SCC's students and services which are presented in this report. Following an overview of the significant findings of the year's research efforts, part I sets forth the purpose of the report and part II…

  2. Benchmarking an operational procedure for rapid flood mapping and risk assessment in Europe

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Salamon, Peter; Kalas, Milan; Bianchi, Alessandra; Feyen, Luc

    2016-04-01

    The development of real-time methods for rapid flood mapping and risk assessment is crucial to improve emergency response and mitigate flood impacts. This work describes the benchmarking of an operational procedure for rapid flood risk assessment based on the flood predictions issued by the European Flood Awareness System (EFAS). The daily forecasts produced for the major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations, based on the hydro-meteorological dataset of EFAS. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in near real-time in terms of flood prone areas, potential economic damage, affected population, infrastructures and cities. Extensive testing of the operational procedure is carried out using the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-derived flood footprints, while ground-based estimates of economic damage and affected population are compared against modelled estimates. We evaluated the skill of flood hazard and risk estimations derived from EFAS flood forecasts with different lead times and combinations. The assessment includes a comparison of several alternative approaches to produce and present the information content, in order to meet the requests of EFAS users. The tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management.
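The near-real-time impact step, overlaying an event-based hazard map with exposure grids, can be sketched on toy rasters (the grids and threshold are illustrative, not EFAS data):

```python
import numpy as np

def rapid_impact(depth, population, damage, threshold=0.0):
    """Overlay a forecast flood-depth grid with co-registered exposure grids.

    Returns (flooded cell fraction, affected population, potential damage).
    """
    flooded = depth > threshold
    return flooded.mean(), population[flooded].sum(), damage[flooded].sum()

# Toy 2x2 rasters standing in for hazard and exposure layers.
depth = np.array([[0.0, 1.0], [2.0, 0.0]])   # forecast water depth (m)
pop = np.array([[10, 20], [30, 40]])          # residents per cell
dmg = np.array([[1.0, 2.0], [3.0, 4.0]])      # potential damage per cell
frac, affected, loss = rapid_impact(depth, pop, dmg)   # 0.5, 50, 5.0
```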

  3. Accuracy Assessment for the U.S. Geological Survey Regional Land-Cover Mapping Program: New York and New Jersey Region

    USGS Publications Warehouse

    Zhu, Zhi-Liang; Yang, Limin; Stehman, Stephen V.; Czaplewski, Raymond L.

    2000-01-01

    The U.S. Geological Survey, in cooperation with other government and private organizations, is producing a conterminous U.S. land-cover map using Landsat Thematic Mapper 30-meter data for the Federal regions designated by the U.S. Environmental Protection Agency. Accuracy assessment is to be conducted for each Federal region to estimate overall and class-specific accuracies. In Region 2, consisting of New York and New Jersey, the accuracy assessment was completed for 15 land-cover and land-use classes, using interpreted 1:40,000-scale aerial photographs as reference data. The methodology used for Region 2 features a two-stage, geographically stratified approach, with a general sample of all classes (1,033 sample sites) and a separate sample for rare classes (294 sample sites). A confidence index was recorded for each land-cover interpretation on the 1:40,000-scale aerial photography. The estimated overall accuracy for Region 2 was 63 percent (standard error 1.4 percent) using all sample sites, and 75.2 percent (standard error 1.5 percent) using only reference sites with a high-confidence index. User's and producer's accuracies for the general sample and user's accuracy for the sample of rare classes, as well as variance for the estimated accuracy parameters, were also reported. Narrowly defined land-use classes and heterogeneous conditions of land cover are the major causes of misclassification errors. Recommendations for modifying the accuracy assessment methodology for use in the other nine Federal regions are provided.
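
    The overall, user's and producer's accuracies reported above are all derived from an error (confusion) matrix. A minimal sketch, using an invented 3-class matrix rather than the Region 2 data:

```python
# Rows = mapped class, columns = reference class; entries are sample counts.
# The matrix below is invented for illustration only.

def accuracy_measures(matrix):
    """Overall accuracy, user's accuracy (per row) and producer's
    accuracy (per column) from a square error matrix."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    overall = sum(matrix[i][i] for i in range(k)) / n
    users = [matrix[i][i] / sum(matrix[i]) for i in range(k)]
    producers = [matrix[i][i] / sum(row[i] for row in matrix) for i in range(k)]
    return overall, users, producers

matrix = [[40, 5, 5],
          [4, 30, 6],
          [1, 5, 24]]
overall, users, producers = accuracy_measures(matrix)
print(round(overall, 3))
print([round(u, 2) for u in users], [round(p, 2) for p in producers])
```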

  4. Mass evolution of Mediterranean, Black, Red, and Caspian Seas from GRACE and altimetry: accuracy assessment and solution calibration

    NASA Astrophysics Data System (ADS)

    Loomis, B. D.; Luthcke, S. B.

    2016-09-01

    We present new measurements of mass evolution for the Mediterranean, Black, Red, and Caspian Seas as determined by the NASA Goddard Space Flight Center (GSFC) GRACE time-variable global gravity mascon solutions. These new solutions are compared to sea surface altimetry measurements of sea level anomalies with steric corrections applied. To assess their accuracy, the GRACE- and altimetry-derived solutions are applied to the set of forward models used by GSFC for processing the GRACE Level-1B datasets, with the resulting inter-satellite range-acceleration residuals providing a useful metric for analyzing solution quality. We also present a differential correction strategy to calibrate the time series of mass change for each of the seas by establishing the strong linear relationship between differences in the forward modeled mass and the corresponding range-acceleration residuals between the two solutions. These calibrated time series of mass change are directly determined from the range-acceleration residuals, effectively providing regionally-tuned GRACE solutions without the need to form and invert normal equations. Finally, the calibrated GRACE time series are discussed and combined with the steric-corrected sea level anomalies to provide new measurements of the unmodeled steric variability for each of the seas over the span of the GRACE observation record. We apply ensemble empirical mode decomposition (EEMD) to adaptively sort the mass and steric components of sea level anomalies into seasonal, non-seasonal, and long-term temporal scales.
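
    The calibration step above rests on a linear relationship between forward-modelled mass differences and the resulting range-acceleration residuals. A hedged sketch of the idea using an ordinary least-squares fit; all data pairs are synthetic, not GSFC values.

```python
# Fit y = slope * x + intercept by ordinary least squares, then invert the
# slope to translate a residual difference into a mass correction.

def fit_slope_intercept(x, y):
    """Ordinary least-squares line fit for equal-length sequences x, y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

mass_diff = [-2.0, -1.0, 0.0, 1.0, 2.0]    # imposed mass differences (Gt)
resid_diff = [-4.1, -1.9, 0.0, 2.1, 3.9]   # resulting residual differences
slope, intercept = fit_slope_intercept(mass_diff, resid_diff)
# Inverting: a residual difference r maps to a mass correction of r / slope.
print(round(slope, 2), round(1.0 / slope, 2))
```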

  5. Accuracy of a Low-Cost Novel Computer-Vision Dynamic Movement Assessment: Potential Limitations and Future Directions

    NASA Astrophysics Data System (ADS)

    McGroarty, M.; Giblin, S.; Meldrum, D.; Wetterling, F.

    2016-04-01

    The aim of the study was to perform a preliminary validation of a low-cost markerless motion capture system (CAPTURE) against an industry gold standard (Vicon). Measurements of knee valgus and flexion during the performance of a countermovement jump (CMJ) were compared between CAPTURE and Vicon. After correction algorithms were applied to the raw CAPTURE data, acceptable levels of accuracy and precision were achieved. The knee flexion angle measured for three trials using CAPTURE deviated by -3.8° ± 3° (left) and 1.7° ± 2.8° (right) compared to Vicon. The findings suggest that low-cost markerless motion capture has the potential to provide an objective method for assessing lower limb jump and landing mechanics in an applied sports setting. Furthermore, the outcome of the study warrants future research to examine more fully the potential implications of the use of low-cost markerless motion capture in the evaluation of dynamic movement for injury prevention.
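
    Agreement figures of the form quoted above (mean deviation ± SD between two measurement systems) can be computed Bland-Altman style. A minimal sketch; the angle values below are invented, not the study's data.

```python
import math

def bias_and_sd(system_a, system_b):
    """Mean difference (a - b) and its sample standard deviation."""
    diffs = [a - b for a, b in zip(system_a, system_b)]
    mean = sum(diffs) / len(diffs)
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1))
    return mean, sd

vicon = [92.0, 95.5, 90.0, 97.2]     # hypothetical knee-flexion angles (deg)
capture = [89.0, 94.0, 88.5, 93.0]   # hypothetical markerless readings
bias, sd = bias_and_sd(capture, vicon)
print(round(bias, 2), round(sd, 2))  # -2.55 1.31
```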

  6. Accuracy and uncertainty assessment on geostatistical simulation of soil salinity in a coastal farmland using auxiliary variable.

    PubMed

    Yao, R J; Yang, J S; Shao, H B

    2013-06-01

    Understanding the spatial distribution of soil salinity helps farmers and researchers identify areas in the field where special management practices are required. Apparent electrical conductivity, which can be measured fairly quickly with an electromagnetic induction instrument, has been widely used to estimate spatial soil salinity. However, methods used for this purpose are mostly a series of interpolation algorithms. In this study, sequential Gaussian simulation (SGS) and sequential Gaussian co-simulation (SGCS) algorithms were applied to assess the prediction accuracy and uncertainty of soil salinity with apparent electrical conductivity as an auxiliary variable. Results showed that the spatial patterns of soil salinity generated by the SGS and SGCS algorithms were consistent with the measured values. The profile distribution of soil salinity was characterized by an increase with depth, with medium salinization (ECe 4-8 dS/m) as the predominant salinization class. The SGCS algorithm outperformed the SGS algorithm, yielding a smaller root mean square error across the generated realizations. In addition, the SGCS algorithm had larger proportions of true values falling within probability intervals and narrower probability intervals than the SGS algorithm. We concluded that the SGCS algorithm performed better in modeling local uncertainty and propagating spatial uncertainty. The inclusion of a densely sampled auxiliary variable as the covariate improved prediction capability and uncertainty modeling for the sparsely sampled target variable.
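
    The root mean square error comparison between the two simulation algorithms can be sketched as follows; the ECe observations and the two sets of predictions are invented for illustration, not the study's data.

```python
import math

def rmse(predicted, observed):
    """Root mean square error between predictions and measured values."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))

observed = [4.0, 6.5, 8.0, 5.0]      # measured ECe (dS/m), invented
sgs_pred = [5.0, 5.0, 9.5, 4.0]      # hypothetical SGS realization means
sgcs_pred = [4.4, 6.0, 8.6, 5.3]     # hypothetical SGCS realization means
print(round(rmse(sgs_pred, observed), 2), round(rmse(sgcs_pred, observed), 2))
```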

  7. Accuracy Assessment of a Canal-Tunnel 3D Model by Comparing Photogrammetry and Laserscanning Recording Techniques

    NASA Astrophysics Data System (ADS)

    Charbonnier, P.; Chavant, P.; Foucher, P.; Muzet, V.; Prybyla, D.; Perrin, T.; Grussenmeyer, P.; Guillemin, S.

    2013-07-01

    With recent developments in the field of technology and computer science, conventional surveying methods are being supplanted by laser scanning and digital photogrammetry. These two techniques generate 3-D models of real-world objects or structures. In this paper, we consider the application of terrestrial laser scanning (TLS) and photogrammetry to the surveying of canal tunnels. The inspection of such structures requires time, safe access, specific processing and professional operators. Therefore, a French partnership proposes to develop dedicated equipment based on image processing for the visual inspection of canal tunnels. A 3D model of the vault and side walls of the tunnel is constructed from images recorded onboard a boat moving inside the tunnel. To assess the accuracy of this photogrammetric model (PM), a reference model is built using static TLS. Here we address the problem of comparing the resulting point clouds. Difficulties arise because of the highly differentiated acquisition processes, which result in very different point densities. We propose a new tool designed to compute differences between pairs of point clouds or surfaces (triangulated meshes). Moreover, dealing with huge datasets requires the implementation of appropriate structures and algorithms. Several techniques are presented: point-to-point, cloud-to-cloud and cloud-to-mesh. In addition, farthest-point resampling, an octree structure and the Hausdorff distance are adopted and described. Experimental results are shown for a 475 m long canal tunnel located in France.
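
    The Hausdorff distance named above can be illustrated with a brute-force toy sketch on two tiny point clouds (real tunnel-survey clouds would need the octree or k-d tree structures the paper describes). The clouds below are invented.

```python
import math

def hausdorff(cloud_a, cloud_b):
    """Symmetric Hausdorff distance between two 3-D point sets,
    computed by brute force (O(n*m), fine only for small clouds)."""
    def directed(src, dst):
        return max(min(math.dist(p, q) for q in dst) for p in src)
    return max(directed(cloud_a, cloud_b), directed(cloud_b, cloud_a))

a = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
b = [(0, 0, 0.1), (1, 0, 0.1), (0, 1, 2.1)]
print(round(hausdorff(a, b), 3))  # 2.1 (driven by the outlier in cloud b)
```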

  8. A flexible alternative to the Cox proportional hazards model for assessing the prognostic accuracy of hospice patient survival.

    PubMed

    Miladinovic, Branko; Kumar, Ambuj; Mhaskar, Rahul; Kim, Sehwan; Schonwetter, Ronald; Djulbegovic, Benjamin

    2012-01-01

    Prognostic models are often used to estimate the length of patient survival. The Cox proportional hazards model has traditionally been applied to assess the accuracy of prognostic models. However, it may be suboptimal due to the inflexibility to model the baseline survival function and when the proportional hazards assumption is violated. The aim of this study was to use internal validation to compare the predictive power of a flexible Royston-Parmar family of survival functions with the Cox proportional hazards model. We applied the Palliative Performance Scale on a dataset of 590 hospice patients at the time of hospice admission. The retrospective data were obtained from the Lifepath Hospice and Palliative Care center in Hillsborough County, Florida, USA. The criteria used to evaluate and compare the models' predictive performance were the explained variation statistic R², scaled Brier score, and the discrimination slope. The explained variation statistic demonstrated that overall the Royston-Parmar family of survival functions provided a better fit (R² = 0.298; 95% CI: 0.236-0.358) than the Cox model (R² = 0.156; 95% CI: 0.111-0.203). The scaled Brier scores and discrimination slopes were consistently higher under the Royston-Parmar model. Researchers involved in prognosticating patient survival are encouraged to consider the Royston-Parmar model as an alternative to Cox. PMID:23082220
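
    One of the comparison criteria above, the scaled Brier score, rescales the Brier score against a no-information prediction at the event prevalence. A minimal sketch with invented probabilities and outcomes (not the hospice data):

```python
# Scaled Brier score = 1 - Brier / Brier_max, where Brier_max is the score
# of always predicting the event prevalence. Higher is better.

def scaled_brier(probs, outcomes):
    n = len(outcomes)
    brier = sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / n
    prevalence = sum(outcomes) / n
    brier_max = sum((prevalence - y) ** 2 for y in outcomes) / n
    return 1 - brier / brier_max

probs = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1]   # predicted event probabilities
outcomes = [1, 1, 1, 0, 0, 0]            # observed event indicators
print(round(scaled_brier(probs, outcomes), 3))  # 0.813
```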

  9. Does diagnosis affect the predictive accuracy of risk assessment tools for juvenile offenders: Conduct Disorder and Attention Deficit Hyperactivity Disorder.

    PubMed

    Khanna, Dinesh; Shaw, Jenny; Dolan, Mairead; Lennox, Charlotte

    2014-10-01

    Studies have suggested an increased risk of criminality in juveniles if they suffer from co-morbid Attention Deficit Hyperactivity Disorder (ADHD) along with Conduct Disorder. The Structured Assessment of Violence Risk in Youth (SAVRY), the Psychopathy Checklist Youth Version (PCL:YV), and Youth Level of Service/Case Management Inventory (YLS/CMI) have been shown to be good predictors of violent and non-violent re-offending. The aim was to compare the accuracy of these tools to predict violent and non-violent re-offending in young people with co-morbid ADHD and Conduct Disorder and Conduct Disorder only. The sample included 109 White-British adolescent males in secure settings. Results revealed no significant differences between the groups for re-offending. SAVRY factors had better predictive values than PCL:YV or YLS/CMI. Tools generally had better predictive values for the Conduct Disorder only group than the co-morbid group. Possible reasons for these findings have been discussed along with limitations of the study. PMID:25173178

  10. Use of measurement uncertainty analysis to assess accuracy of carbon mass balance closure for a cellulase production process.

    PubMed

    Schell, Daniel J; Sáez, Juan Carlos; Hamilton, Jenny; Tholudur, Arun; McMillan, James D

    2002-01-01

    Closing carbon mass balances is a critical and necessary step for verifying the performance of any conversion process. We developed a methodology for calculating carbon mass balance closures for a cellulase production process and then applied measurement uncertainty analysis to calculate 95% confidence limits to assess the accuracy of the results. Cellulase production experiments were conducted in 7-L fermentors using Trichoderma reesei grown on pure cellulose (Solka-floc), glucose, or lactose. All input and output carbon-containing streams were measured and carbon dioxide in the exhaust gas was quantified using a mass spectrometer. On Solka-floc, carbon mass balances ranged from 90 to 100% closure for the first 48 h but increased to 101 to 135% closure from 72 h to the end of the cultivation at 168 h. Carbon mass balance closures for soluble sugar substrates ranged from 92 to 127% over the entire course of the cultivations. The 95% confidence intervals (CIs) for carbon mass balance closure were typically +/-11 to 12 percentage points after 48 h of cultivation. Many of the carbon mass balance results did not bracket 100% closure within the 95% CIs. These results suggest that measurement problems with the experimental or analytical methods may exist. This work shows that uncertainty analysis can be a useful diagnostic tool for identifying measurement problems in complex biochemical systems.
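
    The closure percentage and its propagated 95% confidence interval can be sketched with first-order uncertainty propagation, assuming independent measurement errors per stream. Stream values, uncertainties and stream names below are invented for illustration.

```python
import math

def closure_with_ci(inputs, u_inputs, outputs, u_outputs, k=1.96):
    """Percent carbon closure (out/in * 100) and the half-width of its
    95% CI, propagated from per-stream standard uncertainties assuming
    independent errors."""
    cin, cout = sum(inputs), sum(outputs)
    u_in = math.sqrt(sum(u ** 2 for u in u_inputs))
    u_out = math.sqrt(sum(u ** 2 for u in u_outputs))
    closure = 100.0 * cout / cin
    u_rel = math.sqrt((u_in / cin) ** 2 + (u_out / cout) ** 2)
    return closure, k * closure * u_rel

closure, half_ci = closure_with_ci(
    inputs=[100.0, 20.0], u_inputs=[2.0, 1.0],              # e.g. cellulose, nutrients (g C)
    outputs=[60.0, 45.0, 9.0], u_outputs=[2.5, 1.5, 0.5],   # e.g. biomass, CO2, solubles (g C)
)
print(round(closure, 1), round(half_ci, 1))  # 95.0 5.9
```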

  11. Accuracy of CBCT images in the assessment of buccal marginal alveolar peri-implant defects: effect of field of view

    PubMed Central

    Murat, S; Kılıç, C; Yüksel, S; Avsever, H; Farman, A; Scarfe, W C

    2014-01-01

    Objectives: To investigate the reliability and accuracy of cone beam CT (CBCT) images obtained at different fields of view in detecting and quantifying simulated buccal marginal alveolar peri-implant defects. Methods: Simulated buccal defects were prepared in 69 implants inserted into cadaver mandibles. CBCT images at three different fields of view were acquired: 40 × 40, 60 × 60 and 100 × 100 mm. The presence or absence of defects was assessed on three sets of images using a five-point scale by three observers. Observers also measured the depth, width and volume of defects on CBCT images, which were compared with physical measurements. The kappa value was calculated to assess intra- and interobserver agreement. Six-way repeated analysis of variance was used to evaluate treatment effects on the diagnosis. Pairwise comparisons of median true-positive and true-negative rates were calculated by the χ2 test. Pearson's correlation coefficient was used to determine the relationship between measurements. The significance level was set at p < 0.05. Results: All observers had excellent intra-observer agreement. Defect status (p < 0.001) and defect size (p < 0.001) factors were statistically significant. Pairwise interactions were found between defect status and defect size (p = 0.001). No differences between median true-positive or true-negative values were found between CBCT fields of view (p > 0.05). Significant correlations were found between physical and CBCT measurements (p < 0.001). Conclusions: All CBCT images performed similarly for the detection of simulated buccal marginal alveolar peri-implant defects. Depth, width and volume measurements of the defects from various CBCT images correlated highly with physical measurements. PMID:24645965
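
    The kappa statistic used above for observer agreement can be sketched for two raters (Cohen's kappa); the defect-present/absent ratings below are invented, not the study's data.

```python
# Cohen's kappa: chance-corrected agreement between two raters.

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    categories = sorted(set(rater1) | set(rater2))
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_expected = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                     for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

# Two observers scoring 10 implants as defect present (1) or absent (0).
r1 = [1, 1, 1, 0, 0, 1, 0, 0, 1, 1]
r2 = [1, 1, 0, 0, 0, 1, 0, 1, 1, 1]
print(round(cohens_kappa(r1, r2), 2))  # 0.58
```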

  12. A novel procedure for the assessment of the antioxidant capacity of food components.

    PubMed

    Yoshimura, Toshihiro; Harashima, Mai; Kurogi, Katsuhisa; Suiko, Masahito; Liu, Ming-Cheh; Sakakibara, Yoichi

    2016-08-15

    Carbonylation, an oxidative modification of the amino group of arginine and lysine residues caused by reactive oxygen species, has emerged as a new type of oxidative damage. Protein carbonylation has been shown to exert adverse effects on various protein functions. Recently, the role of food components in the attenuation of oxidative stress has been the focus of many studies. Most of these studies focused on the chemical properties of food components. However, it is also important to determine their effects on protein functions via post-translational modifications. In this study, we developed a novel procedure for evaluating the antioxidant capacity of food components. Hydrogen peroxide (H2O2)-induced protein carbonylation in HL-60 cells was quantitatively analyzed by using fluorescent dyes (Cy5-hydrazide dye and IC3-OSu dye), followed by sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) and fluorescence determination. Among a panel of food components tested, quinic acid, kaempferol, saponin, squalene, trigonelline, and mangiferin were shown to be capable of suppressing protein carbonylation in HL-60 cells. Our results demonstrated that this fluorescence labeling/SDS-PAGE procedure allows for the detection of oxidative stress-induced protein carbonylation with high sensitivity and quantitative accuracy. This method should be useful for the screening of new antioxidant food components as well as the analysis of their suppression mechanism. PMID:27184074

  13. PLÉIADES Project: Assessment of Georeferencing Accuracy, Image Quality, Pansharpening Performance and DSM/DTM Quality

    NASA Astrophysics Data System (ADS)

    Topan, Hüseyin; Cam, Ali; Özendi, Mustafa; Oruç, Murat; Jacobsen, Karsten; Taşkanat, Talha

    2016-06-01

    Pléiades 1A and 1B are twin optical satellites of the Optical and Radar Federated Earth Observation (ORFEO) program jointly run by France and Italy. They are the first European satellites with sub-meter resolution. Airbus DS (formerly Astrium Geo) runs a MyGIC (formerly Pléiades Users Group) program to validate Pléiades images worldwide for various application purposes. The authors conduct three projects: one within this program, a second supported by the BEU Scientific Research Project Program, and a third supported by TÜBİTAK. These projects investigate georeferencing accuracy, image quality, pansharpening performance and Digital Surface Model/Digital Terrain Model (DSM/DTM) quality. For these purposes, triplet panchromatic (50 cm Ground Sampling Distance (GSD)) and VNIR (2 m GSD) Pléiades 1A images were investigated over the Zonguldak test site (Turkey), which is urbanised, mountainous and covered by dense forest. The georeferencing accuracy was estimated with a standard deviation in X and Y (SX, SY) in the range of 0.45 m by bias-corrected Rational Polynomial Coefficient (RPC) orientation, using ~170 Ground Control Points (GCPs). 3D standard deviations of ±0.44 m in X, ±0.51 m in Y, and ±1.82 m in Z have been reached in spite of the very narrow angle of convergence by bias-corrected RPC orientation. The image quality was also investigated with respect to effective resolution, Signal to Noise Ratio (SNR) and blur coefficient. The effective resolution was estimated with a factor slightly below 1.0, meaning that the image quality corresponds to the nominal resolution of 50 cm. The blur coefficients were between 0.39 and 0.46 for the triplet panchromatic images, indicating satisfying image quality. SNR is in the range of other comparable spaceborne images, which may be caused by de-noising of Pléiades images. The pansharpened images were generated by various methods, and are validated by most common statistical

  14. Development of a robust procedure for assessing powder flow using a commercial avalanche testing instrument.

    PubMed

    Hancock, Bruno C; Vukovinsky, Kim E; Brolley, Barry; Grimsey, Ian; Hedden, David; Olsofsky, Angela; Doherty, Rebecca A

    2004-09-01

    The objectives of this work were to develop a robust procedure for assessing powder flow using a commercial avalanche testing instrument and to define the limits of its performance. To achieve this, a series of powdered pharmaceutical excipients with a wide range of flow properties was characterized using such an instrument (Aeroflow, TSI Inc., St. Paul, MN, USA). The experimental conditions (e.g., sample size, rotation speed) were rationally selected and systematically evaluated so that an optimal standard operating procedure could be identified. To evaluate the inherent variability of the proposed methodology, samples were tested at multiple sites, using different instruments and operators. The ranking of the flow properties of the powders was also compared with that obtained using a conventional shear-cell test. As a result of these experiments, a quick, simple, and rugged procedure for determining the flow properties of pharmaceutical powders in their dilated state was developed. This procedure gave comparable results when performed at four different testing sites and was able to reproducibly rank the flow properties of a series of common pharmaceutical excipient powders. The limits of the test method to discriminate between different powder samples were determined, and a positive correlation with the results of a benchmark method (the simplified shear cell) was obtained.

  15. A procedure for incorporating spatial variability in ecological risk assessment of Dutch river floodplains.

    PubMed

    Kooistra, L; Leuven, R S; Nienhuis, P H; Wehrens, R; Buydens, L M

    2001-09-01

    Floodplain soils along the river Rhine in the Netherlands show a large spatial variability in pollutant concentrations. For an accurate ecological risk characterization of the river floodplains, this heterogeneity has to be included in the ecological risk assessment. In this paper a procedure is presented that incorporates spatial components of exposure into the risk assessment by linking geographical information systems (GIS) with models that estimate exposure for the most sensitive species of a floodplain. The procedure uses readily available site-specific data and is applicable to a wide range of locations and floodplain management scenarios. The procedure is applied to estimate exposure risks to metals for a typical foodweb in the Afferdensche and Deestsche Waarden floodplain along the river Waal, the main branch of the Rhine in the Netherlands. Spatial variability of pollutants is quantified by overlaying appropriate topographic and soil maps, resulting in the definition of homogeneous pollution units. In addition, GIS is used to include the foraging behavior of the exposed terrestrial organisms. Risk estimates from a probabilistic exposure model were used to construct site-specific risk maps for the floodplain. Based on these maps, recommendations can be made for future management of the floodplain that aim at both ecological rehabilitation and optimal flood defense.

  16. Assessment of ecological risks at former landfill site using TRIAD procedure and multicriteria analysis.

    PubMed

    Sorvari, Jaana; Schultz, Eija; Haimi, Jari

    2013-02-01

    Old industrial landfills are important sources of environmental contamination in Europe, including Finland. In this study, we demonstrated the combination of the TRIAD procedure, multicriteria decision analysis (MCDA), and statistical Monte Carlo analysis for assessing the risks to terrestrial biota in a former landfill site contaminated by petroleum hydrocarbons (PHCs) and metals. First, we generated hazard quotients by dividing the concentrations of metals and PHCs in soil by the corresponding risk-based ecological benchmarks. Then we conducted ecotoxicity tests using five plant species, earthworms, and potworms, and determined the abundance and diversity of soil invertebrates from additional samples. We aggregated the results in accordance with the methods used in the TRIAD procedure, rated the assessment methods based on their performance in terms of specific criteria, and weighted the criteria using two alternative weighting techniques to produce performance scores for each method. We faced problems in using the TRIAD procedure; for example, the results from the animal counts had to be excluded from the calculation of integrated risk estimates (IREs) because our reference soil sample showed the lowest biodiversity and abundance of soil animals. In addition, hormesis hampered the use of the results from the ecotoxicity tests. The final probabilistic IREs imply significant risks at all sampling locations. Although linking MCDA with TRIAD provided a useful means to study and consider the performance of the alternative methods in predicting ecological risks, some uncertainties involved still remained outside the quantitative analysis. PMID:22762796

  17. Framework and operational procedure for implementing Strategic Environmental Assessment in China

    SciTech Connect

    Bao Cunkuan; Lu Yongsen; Shang Jincheng

    2004-01-01

    Over the last 20 years, Environmental Impact Assessment (EIA) has been implemented and become an important instrument for decision-making in development projects in China. The Environmental Impact Assessment Law of the P.R. China was promulgated on 28 October 2002 and will be put into effect on 1 September 2003. The law provides that Strategic Environmental Assessment (SEA) is required in regional and sector plans and programs. This paper introduces the research achievements and practice of SEA in China, discusses the relationship between SEA and 'integrating environment and development in decision-making' (IEDD), and the relevant political and legal basis of SEA. The framework and operational procedures of SEA administration and enforcement are presented. Nine cases are analyzed and some proposals are given.

  18. Assessment of the accuracy of PPP for very-high-frequency dynamic, satellite positioning and earthquake modeling

    NASA Astrophysics Data System (ADS)

    Moschas, F.; Avallone, A.; Moschonas, N.; Saltogianni, V.; Stiros, S.

    2012-04-01

    With the advent of various GPS/GNSS Point Positioning techniques, it became possible to model the dynamic displacement history of specific points during large and rather moderate earthquakes using satellite positioning with 1 Hz, and occasionally 10 Hz, sampling data. While there is evidence that the obtained data are precise, experience from monitoring of engineering structures like bridges indicates that GPS/GNSS records are contaminated by coloured noise (mostly background noise), even in the case of differential-type analysis of the satellite signals. This made it necessary to assess the results of different PPP processing approaches using supervised learning techniques. Our work was based on a modification of an experiment first designed to assess the potential of GPS to measure oscillations of civil engineering structures. A 10 Hz GNSS antenna-receiver unit was mounted on the top of a vertical rod, fixed on the ground and forced to controlled oscillations. Oscillations were also recorded by a robotic theodolite and an accelerometer, and the whole experiment was video-recorded. A second 10 Hz GNSS antenna-receiver unit was left on stable ground in a nearby position. The rod was forced to semi-static motion (bending) and then was left to oscillate freely until still, and the whole movement was recorded by all sensors. GNSS data were analyzed both in kinematic mode and in PPP mode, using GIPSY-OASIS II (http://gipsy-oasis.jpl.nasa.gov) (GPS only) and the PPP CRCS facility (GPS + GLONASS). Recorded PPP and differential kinematic processing coordinates (apparent displacements) were found to follow the real motion, but to be contaminated by long-period noise. In contrast, the short-period component of the apparent PPP displacements, obtained using high-pass filtering, was highly consistent with the real motion, with sub-mm mean deviation, though occasionally contaminated by clipping.
The assessment of the very-high frequency GPS noise will provide useful information

  19. The assessment of cognitive procedural learning in amnesia: why the tower of Hanoi has fallen down.

    PubMed

    Winter, W E; Broman, M; Rose, A L; Reber, A S

    2001-02-01

    The Tower of Hanoi has been widely accepted as an evaluation of cognitive procedural learning in amnesia but inconsistent findings have raised questions about the nature of the learning process involved in this task. This article presents the performance of a hippocampal amnesic, MS, who, showing poor learning across daily sessions of a formal evaluation, subsequently solved the puzzle through spontaneous use of a declarative-level strategy (the odd-even rule), suggesting that his primary approach to the task was the deployment of declarative solution-searching strategies. The presented data suggest normal learning within daily sessions, but subnormal learning across daily sessions due to the forgetting of acquired declarative information. It is suggested that tasks that are potentially solvable by an algorithm or rule, as is the Tower of Hanoi, be regarded as inappropriate for use in cognitive procedural assessments.

  20. Diagnostic accuracy of refractometer and Brix refractometer to assess failure of passive transfer in calves: protocol for a systematic review and meta-analysis.

    PubMed

    Buczinski, S; Fecteau, G; Chigerwe, M; Vandeweerd, J M

    2016-06-01

    Calves are highly dependent on colostrum (and antibody) intake because they are born agammaglobulinemic. The transfer of passive immunity in calves can be assessed directly by measuring immunoglobulin G (IgG) or indirectly by refractometry or Brix refractometry. The latter are easier to perform routinely in the field. This paper presents a protocol for a systematic review and meta-analysis to assess the diagnostic accuracy of refractometry or Brix refractometry versus IgG measurement as the reference standard test. With this review protocol we aim to report refractometer and Brix refractometer accuracy in terms of sensitivity and specificity, as well as to quantify the impact of any study characteristic on test accuracy. PMID:27427188
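
    The two accuracy measures the review targets, sensitivity and specificity, come from a 2x2 table of test results against the IgG reference standard. A hedged sketch; the calf readings and the 8.4% Brix cut-off below are purely illustrative.

```python
# Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).

def sens_spec(test_positive, disease_present):
    tp = sum(t and d for t, d in zip(test_positive, disease_present))
    fn = sum((not t) and d for t, d in zip(test_positive, disease_present))
    tn = sum((not t) and (not d) for t, d in zip(test_positive, disease_present))
    fp = sum(t and (not d) for t, d in zip(test_positive, disease_present))
    return tp / (tp + fn), tn / (tn + fp)

brix = [7.9, 8.5, 8.6, 8.3, 8.2, 8.9, 7.5, 8.8]              # % Brix, invented
fpt = [True, True, False, False, True, False, True, False]   # IgG-defined failure
test_pos = [b < 8.4 for b in brix]  # test calls failure below the cut-off
sensitivity, specificity = sens_spec(test_pos, fpt)
print(sensitivity, specificity)  # 0.75 0.75
```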

  1. Variable selection procedures before partial least squares regression enhance the accuracy of milk fatty acid composition predicted by mid-infrared spectroscopy.

    PubMed

    Gottardo, P; Penasa, M; Lopez-Villalobos, N; De Marchi, M

    2016-10-01

    Mid-infrared spectroscopy is a high-throughput technique that allows the prediction of milk quality traits on a large scale. The accuracy of prediction achievable using partial least squares (PLS) regression is usually high for fatty acids (FA) that are more abundant in milk, whereas it decreases for FA that are present in low concentrations. Two variable selection methods, uninformative variable elimination and a genetic algorithm, each combined with PLS regression, were used in the present study to investigate their effect on the accuracy of prediction equations for milk FA profile expressed either as a concentration on total identified FA or as a concentration in milk. For FA expressed on total identified FA, the coefficient of determination of cross-validation from PLS alone was low (0.25) for the prediction of polyunsaturated FA and medium (0.70) for saturated FA. The coefficient of determination increased to 0.54 and 0.95 for polyunsaturated and saturated FA, respectively, when FA were expressed on a milk basis and PLS alone was used. Both algorithms applied before PLS regression improved the accuracy of prediction, especially for FA that are usually difficult to predict; the improvement with respect to PLS regression alone ranged from 9 to 80%. In general, FA were better predicted when their concentrations were expressed on a milk basis. These results might favor the use of prediction equations in the dairy industry for genetic purposes and payment systems. PMID:27522434
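
    The coefficient of determination quoted throughout is 1 − SS_res/SS_tot. A tiny sketch with invented observed and predicted fatty acid concentrations (not the study's data):

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

observed = [2.0, 3.0, 4.0, 5.0]    # e.g. reference FA (g/100 g milk), invented
predicted = [2.2, 2.9, 4.3, 4.6]   # e.g. spectroscopy-predicted values, invented
print(round(r_squared(observed, predicted), 2))  # 0.94
```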

  2. Accuracy of Ultrasound-Based (BAT) Prostate-Repositioning: A Three-Dimensional On-Line Fiducial-Based Assessment With Cone-Beam Computed Tomography

    SciTech Connect

    Boda-Heggemann, Judit Koehler, Frederick Marc; Kuepper, Beate; Wolff, Dirk; Wertz, Hansjoerg; Mai, Sabine; Hesser, Juergen; Lohr, Frank; Wenz, Frederik

    2008-03-15

    Purpose: To assess the accuracy of ultrasound-based repositioning (BAT) before prostate radiation with fiducial-based three-dimensional matching with cone-beam computed tomography (CBCT). Patients and Methods: Fifty-four positionings in 8 patients with ¹²⁵I seeds/intraprostatic calcifications as fiducials were evaluated. Patients were initially positioned according to skin marks and after this according to bony structures based on CBCT. Prostate position correction was then performed with BAT. Residual error after repositioning based on skin marks, bony anatomy, and BAT was estimated by a second CBCT based on user-independent automatic fiducial registration. Results: Overall mean (± SD) residual error after BAT based on fiducial registration by CBCT was 0.7 ± 1.7 mm in x (group systematic error M = 0.5 mm; SD of systematic error Σ = 0.8 mm; SD of random error σ = 1.4 mm), 0.9 ± 3.3 mm in y (M = 0.5 mm, Σ = 2.2 mm, σ = 2.8 mm), and -1.7 ± 3.4 mm in z (M = -1.7 mm, Σ = 2.3 mm, σ = 3.0 mm), whereas residual error relative to positioning based on skin marks was 2.1 ± 4.6 mm in x (M = 2.6 mm, Σ = 3.3 mm, σ = 3.9 mm), -4.8 ± 8.5 mm in y (M = -4.4 mm, Σ = 3.7 mm, σ = 6.7 mm), and -5.2 ± 3.6 mm in z (M = -4.8 mm, Σ = 1.7 mm, σ = 3.5 mm), and relative to positioning based on bony anatomy was 0 ± 1.8 mm in x (M = 0.2 mm, Σ = 0.9 mm, σ = 1.1 mm), -3.5 ± 6.8 mm in y (M = -3.0 mm, Σ = 1.8 mm, σ = 3.7 mm), and -1.9 ± 5.2 mm in z (M = -2.0 mm, Σ = 1.3 mm, σ = 4.0 mm). Conclusions: BAT improved the daily repositioning accuracy over skin marks or even bony anatomy. The results obtained with BAT are within the precision of extracranial stereotactic procedures and represent values that can be achieved by several users with different education levels. If sonographic visibility is insufficient
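
    The error statistics reported above follow the usual setup-error decomposition: M is the group mean of per-patient mean errors (systematic error), the SD of systematic error is the SD of those per-patient means, and the SD of random error is the root-mean-square of the per-patient SDs. A sketch with invented residuals for four hypothetical patients:

```python
import math
import statistics

def setup_error_stats(residuals_per_patient):
    """Return (M, SD of systematic error, SD of random error) from lists
    of per-fraction residuals, one list per patient."""
    means = [statistics.mean(r) for r in residuals_per_patient]
    sds = [statistics.stdev(r) for r in residuals_per_patient]
    m = statistics.mean(means)                                  # group mean
    sigma_sys = statistics.stdev(means)                         # SD of means
    sigma_rand = math.sqrt(statistics.mean([s ** 2 for s in sds]))  # RMS of SDs
    return m, sigma_sys, sigma_rand

# Residual y-errors (mm) over several fractions for 4 invented patients.
patients = [
    [0.5, 1.5, 1.0],
    [-1.0, 0.0, -0.5],
    [2.0, 3.0, 2.5],
    [0.0, 1.0, 0.5],
]
m, sigma_sys, sigma_rand = setup_error_stats(patients)
print(round(m, 2), round(sigma_sys, 2), round(sigma_rand, 2))  # 0.88 1.25 0.5
```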

  3. Assessing the Intraoperative Accuracy of Pedicle Screw Placement by Using a Bone-Mounted Miniature Robot System through Secondary Registration

    PubMed Central

    Wu, Chieh-Hsin; Tsai, Cheng-Yu; Chang, Chih-Hui; Lin, Chih-Lung; Tsai, Tai-Hsin

    2016-01-01

    Introduction: Pedicle screws are commonly employed to restore spinal stability and correct deformities. The Renaissance robotic system was developed to improve the accuracy of pedicle screw placement. Purpose: In this study, we developed an intraoperative classification system for evaluating the accuracy of pedicle screw placements through secondary registration. Furthermore, we evaluated the benefits of using the Renaissance robotic system in pedicle screw placement and postoperative evaluations. Finally, we examined the factors affecting the accuracy of pedicle screw implantation. Results: With the Renaissance robotic system, the proportion of Kirschner-wire (K-wire) placements deviating <3 mm from the planned trajectory was 98.74%. According to our classification system, robot-guided pedicle screw implantation attained an accuracy of 94.00% before repositioning and 98.74% after repositioning. The malposition rate before repositioning was 5.99%; among these placements, 4.73% were immediately repositioned using the robot system and 1.26% were manually repositioned after a failed robot repositioning attempt. Most K-wire entry points deviated caudally and laterally. Conclusion: The Renaissance robotic system offers high accuracy in pedicle screw placement. Secondary registration improves accuracy by increasing the precision of the positioning; moreover, intraoperative evaluation enables immediate repositioning. Furthermore, the K-wire tends to deviate caudally and laterally from the entry point because of skiving, which is characteristic of robot-assisted pedicle screw placement. PMID:27054360

  4. Diagnostic accuracy of emergency-performed focused assessment with sonography for trauma (FAST) in blunt abdominal trauma

    PubMed Central

    Ghafouri, Hamed Basir; Zare, Morteza; Bazrafshan, Azam; Modirian, Ehsan; Farahmand, Shervin; Abazarian, Niloofar

    2016-01-01

    Introduction: Intra-abdominal hemorrhage due to blunt abdominal trauma is a major cause of trauma-related mortality. Therefore, any measure that facilitates the diagnosis of intra-abdominal hemorrhage could save patients' lives more effectively. The aim of this study was to determine the accuracy of focused assessment with sonography for trauma (FAST) performed by emergency physicians. Methods: In this cross-sectional study, conducted from February 2011 to January 2012 at 7th Tir Hospital in Tehran (Iran), 120 patients with blunt abdominal trauma were enrolled and evaluated for free abdominal fluid. FAST sonography was performed on all subjects by emergency residents and radiologists, each blinded to the other's findings. Abdominal CT, the gold standard, was performed in all cases. SPSS 20.0 was used to analyze the results. Results: During the study, 120 patients with blunt abdominal trauma were evaluated; the mean age of the patients was 33.0 ± 16.6 years and the male-to-female ratio was 3:1. FAST sonography by emergency physicians showed free fluid in the abdominal or pelvic spaces in 33 patients (27.5%), six of which were not confirmed by CT; sensitivity and specificity were 93.1% and 93.4%, respectively. For examinations performed by radiology residents, sensitivity was slightly higher (96.5%) with lower specificity (92.3%). Conclusion: The results suggest that emergency physicians can use ultrasonography as a safe and reliable method for evaluating blunt abdominal trauma. PMID:27790349
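The sensitivity and specificity quoted above follow directly from a 2×2 confusion table against the CT gold standard. The counts below are reconstructed to be consistent with the reported figures for the emergency physicians (33 FAST-positive scans, 6 unconfirmed by CT); they are an inference, not values stated explicitly in the record.

```python
# Sensitivity/specificity from a 2x2 confusion table.
# Counts inferred to match the reported 93.1%/93.4% figures.
TP, FP = 27, 6    # FAST positive: 33 scans, 6 not confirmed by CT
FN, TN = 2, 85    # FAST negative: remaining 87 of 120 patients

sensitivity = TP / (TP + FN)  # share of CT-positive cases detected
specificity = TN / (TN + FP)  # share of CT-negative cases cleared

print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```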

  5. A procedural evaluation of an analytic-deliberative process: the Columbia River Comprehensive Impact Assessment.

    PubMed

    Kinney, Aimee Guglielmo; Leschine, Thomas M

    2002-02-01

    The U.S. Department of Energy's Columbia River Comprehensive Impact Assessment (CRCIA) was an ambitious attempt to direct its cleanup of the Hanford Nuclear Reservation toward the most significant risks to the Columbia River resulting from past plutonium production. DOE's approach was uncommonly open, including tribal, regulatory agency, and other Hanford interest group representatives on the board that was to develop the assessment approach. The CRCIA process had attributes of the "analytic-deliberative" process for risk assessment recommended by the National Research Council. Nevertheless, differences between DOE and the other participants over what was meant by the term "comprehensive" in the group's charge, coupled with differing perceptions of the likely effectiveness of remediation efforts in reducing risks, were never resolved. The CRCIA effort became increasingly fragmented, and the role its products were to play in influencing future cleanup decisions became increasingly ambiguous. A procedural evaluation of the CRCIA process, based on Thomas Webler's procedural normative model of public participation, reveals numerous instances in which theoretical-normative discourse disconnects occurred. These had negative implications for both basic procedural dimensions of Webler's model: fairness and competence. Tribal and other interest group representatives lacked the technical resources necessary to make or challenge what the philosopher Jürgen Habermas terms cognitive validity claims, while DOE and its contractors did not challenge normative claims made by tribal representatives. The results are cautionary for implementation of the analytic-deliberative process. They highlight the importance of bringing rigor to the evaluation of the quality of the deliberation component of risk characterization via the analytic-deliberative process, as well as to the analytic component.

  6. An assessment of the near-surface accuracy of the international geomagnetic reference field 1980 model of the main geomagnetic field

    USGS Publications Warehouse

    Peddie, N.W.; Zunde, A.K.

    1985-01-01

    The new International Geomagnetic Reference Field (IGRF) model of the main geomagnetic field for 1980 is based heavily on measurements from the MAGSAT satellite survey. Assessment of the accuracy of the new model, as a description of the main field near the Earth's surface, is important because the accuracy of models derived from satellite data can be adversely affected by the magnetic field of electric currents in the ionosphere and the auroral zones. Until now, statements about its accuracy have been based on the 6 published assessments of the 2 proposed models from which it was derived. However, those assessments were either regional in scope or were based mainly on preliminary or extrapolated data. Here we assess the near-surface accuracy of the new model by comparing it with values for 1980 derived from annual means from 69 magnetic observatories, and by comparing it with WC80, a model derived from near-surface data. The comparison with observatory-derived data shows that the new model describes the field at the 69 observatories about as accurately as would a model derived solely from near-surface data. The comparison with WC80 shows that the 2 models agree closely in their description of D and I near the surface. These comparisons support the proposition that the new IGRF 1980 main-field model is a generally accurate description of the main field near the Earth's surface in 1980. © 1985.

  7. The Role of Some Selected Psychological and Personality Traits of the Rater in the Accuracy of Self- and Peer-Assessment

    ERIC Educational Resources Information Center

    AlFallay, Ibrahim

    2004-01-01

    This paper investigates the role of some selected psychological and personality traits of learners of English as a foreign language in the accuracy of self- and peer-assessments. The selected traits were motivation types, self-esteem, anxiety, motivational intensity, and achievement. 78 students of English as a foreign language participated in…

  8. Assessment of the accuracy of portion size reports using computer-based food photographs aids in the development of an automated self-administered 24-hour recall

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of the study is to assess the accuracy of portion-size estimates and participant preferences using various presentations of digital images. Two observational feeding studies were conducted. In both, each participant selected and consumed foods for breakfast and lunch, buffet style, se...

  9. Incorporating mesh-insensitive structural stress into the fatigue assessment procedure of common structural rules for bulk carriers

    NASA Astrophysics Data System (ADS)

    Kim, Seong-Min; Kim, Myung-Hyun

    2015-01-01

    This study introduces a fatigue assessment procedure using the mesh-insensitive structural stress method based on the Common Structural Rules (CSR) for Bulk Carriers, considering important factors such as the mean stress and thickness effects. The fatigue assessment results of the mesh-insensitive structural stress method have been compared with those of the CSR procedure based on equivalent notch stress at major hot-spot points in the area near the ballast hold of a 180 K bulk carrier. The possibility of implementing the mesh-insensitive structural stress method in the fatigue assessment procedure for ship structures is discussed.

  10. Accuracy of Population Validity and Cross-Validity Estimation: An Empirical Comparison of Formula-Based, Traditional Empirical, and Equal Weights Procedures.

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Bilgic, Reyhan; Edwards, Jack E.; Fleer, Paul F.

    1999-01-01

    Performed an empirical Monte Carlo study using predictor and criterion data from 84,808 U.S. Air Force enlistees. Compared formula-based, traditional empirical, and equal-weights procedures. Discusses issues for basic research on validation and cross-validation. (SLD)

  11. Position Paper on the Potential Use of Computerized Testing Procedures for the National Assessment of Educational Progress.

    ERIC Educational Resources Information Center

    Reckase, Mark D.

    The current technology of computerized testing is discussed, and a few comments are made on how such technology might be used for assessing school-related skills as part of the National Assessment of Educational Progress (NAEP). The critical feature of computerized assessment procedures is that the test items are presented in interactive fashion,…

  12. A formal expert judgment procedure for performance assessments of the Waste Isolation Pilot Plant

    SciTech Connect

    Trauth, K.M.; Guzowski, R.V.; Hora, S.C.

    1993-12-31

    The Waste Isolation Pilot Plant (WIPP) is an experimental facility located in southeastern New Mexico. It has been designed to determine the feasibility of the geologic disposal of defense-generated transuranic waste in a deep bedded-salt formation. The WIPP was also designed for disposal and will operate in that capacity if approved. The WIPP Performance Assessment Department at Sandia National Laboratories has been conducting analyses to assess the long-term performance of the WIPP. These analyses sometimes require the use of expert judgment. This Department has convened several expert-judgment panels and from that experience has developed an internal quality-assurance procedure to guide the formal elicitation of expert judgment. This protocol is based on the principles found in the decision-analysis literature.

  13. Assessment of metabolic bone disease: review of new nuclear medicine procedures

    SciTech Connect

    Wahner, H.W.

    1985-12-01

    In the management of patients with metabolic bone disease, nuclear medicine laboratories offer two nontraumatic procedures of potential clinical importance: bone mineral measurements and bone scintigraphy. Bone mineral measurements from the radius, lumbar spine, and hip obtained with use of absorptiometry or computed tomography can be used to predict the risk of fracture at these skeletal sites, can determine the severity of bone loss for the assessment of a benefit-versus-risk ratio on which appropriate therapy can be based, and can substantiate the effectiveness of therapy over time. Bone scintigraphy with use of labeled diphosphonate allows assessment of focal and, in defined circumstances, of total skeletal bone turnover in patients with normal kidney function. Both of these techniques have been used successfully in studies of population groups for the evaluation of trends. Their application to the management of individual patients is currently being evaluated. 41 references.

  14. Statistical methods to assess the reliability of measurements in the procedures for forensic age estimation.

    PubMed

    Ferrante, L; Cameriere, R

    2009-07-01

    In forensic science, anthropology, and archaeology, several techniques have been developed to estimate chronological age in both children and adults, using the relationship between age and morphological changes in the structure of teeth. Before implementing a statistical model to describe age as a function of the measured morphological variables, the reliability of the measurements of these variables must be evaluated using suitable statistical methods. This paper introduces some commonly used statistical methods for assessing the reliability of procedures for age estimation in the forensic field. The use of the concordance correlation coefficient and the intraclass correlation coefficient are explained. Finally, some pitfalls in the choice of the statistical methods to assess reliability of the measurements in age estimation are discussed.
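One of the reliability statistics the abstract above discusses, Lin's concordance correlation coefficient (CCC), can be sketched in a few lines. The observer data below are hypothetical repeated measurements, not values from the paper.

```python
# Lin's concordance correlation coefficient (CCC):
# CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2),
# using population (1/n) variances and covariance.
def ccc(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

obs1 = [2.1, 3.4, 1.8, 4.0, 2.9]  # hypothetical first readings
obs2 = [2.0, 3.6, 1.7, 4.1, 3.0]  # second readings of the same items

print(f"CCC = {ccc(obs1, obs2):.3f}")
```

Unlike the plain Pearson correlation, the CCC penalizes systematic shifts between observers through the `(mean(x) - mean(y))**2` term, which is why it is preferred for agreement studies.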

  15. A formal expert judgment procedure for performance assessments of the Waste Isolation Pilot Plant

    SciTech Connect

    Trauth, K.M.; Guzowski, R.V.; Hora, S.C.

    1994-09-01

    The Waste Isolation Pilot Plant (WIPP) is an experimental facility located in southeastern New Mexico. It has been designed to determine the feasibility of the geologic disposal of defense-generated transuranic waste in a deep bedded-salt formation. The WIPP was also designed for disposal and will operate in that capacity if approved. The WIPP Performance Assessment Department at Sandia National Laboratories has been conducting analyses to assess the long-term performance of the WIPP. These analyses sometimes require the use of expert judgment. This Department has convened several expert-judgment panels and from that experience has developed an internal quality-assurance procedure to guide the formal elicitation of expert judgment. This protocol is based on the principles found in the decision-analysis literature.

  16. High-temperature flaw assessment procedure: A state-of-the-art survey

    SciTech Connect

    Ruggles, M.B.; Takahashi, Y.

    1989-05-01

    High-temperature crack growth under cyclic, static, and combined loading is reviewed, with an emphasis on fracture mechanics aspects. Experimental studies of the effects of loading history, microstructure, temperature, and environment on crack growth behavior are described and interpreted. The experimental evidence is used to examine crack growth parameters and theoretical models for fatigue, creep, and creep-fatigue crack propagation at elevated temperatures. The limitations of both elastic and elastic-plastic fracture mechanics for high-temperature subcritical crack growth are assessed. Existing techniques for modeling critical crack growth/ligament instability failure are also presented. Related topics of defect modeling and engineering flaw assessment procedures, nondestructive evaluation methods, and probabilistic failure analysis are briefly discussed. 142 refs., 33 figs.

  17. Laboratory guidelines and procedures for coal analysis: Volume 1, Assessing the cleanability of fine coal

    SciTech Connect

    Bosold, R.C.; Glessner, D.M.

    1988-05-01

    The conventional laboratory static bath float/sink method of measuring the theoretical limits of coal cleaning is unreliable for ultra-fine (minus 100M topsize) coal particles because of their long and erratic settling rates. Developing a reliable method to assess the theoretical cleanability of ultra-fine coal has been given impetus by the increased emphasis on reducing sulfur dioxide emissions from power plants, greater quantities of fines created by mechanized mining methods, and the development of advanced physical coal cleaning processes that grind coal to ultra-fine sizes in an effort to achieve high coal impurities liberation. EPRI, therefore, commissioned researchers at the Homer City Coal Laboratory in western Pennsylvania to develop and demonstrate a float/sink procedure for ultra-fine sizes. Based on test work performed on two ultra-fine size fractions (100M x 200M and 200M x 0), a detailed laboratory procedure using a centrifugal device was established. Results obtained using the guideline presented in this report are as accurate as those obtained using the static bath float/sink method, and for 200M x 0 material, more accurate. In addition, the centrifugal procedure is faster and less costly than the conventional static bath float/sink method. 12 refs., 32 figs., 1 tab.

  18. Procedure for estimating orbital debris risks

    NASA Technical Reports Server (NTRS)

    Crafts, J. L.; Lindberg, J. P.

    1985-01-01

    A procedure for estimating the potential orbital debris risk to the world's populace from payloads or spent stages left in orbit on future missions is presented. This approach provides a consistent, but simple, procedure to assess the risk due to random reentry with an adequate accuracy level for making programmatic decisions on planned low Earth orbit missions.

  19. Comparability of river quality assessment using macrophytes: a multi-step procedure to overcome biogeographical differences.

    PubMed

    Aguiar, F C; Segurado, P; Urbanič, G; Cambra, J; Chauvin, C; Ciadamidaro, S; Dörflinger, G; Ferreira, J; Germ, M; Manolaki, P; Minciardi, M R; Munné, A; Papastergiadou, E; Ferreira, M T

    2014-04-01

    This paper presents a new methodological approach to the problem of intercalibrating national river quality assessment methods when a common metric is lacking and most of the countries share the same Water Framework Directive (WFD) assessment method. We provide recommendations for similar future work on assessing ecological accuracy and highlight the importance of a good common ground for making the scientific work beyond the intercalibration feasible. The approach presented herein was applied to the highly seasonal rivers of the Mediterranean Geographical Intercalibration Group for the Biological Quality Element Macrophytes. The Mediterranean river macrophyte group involved seven countries and two assessment methods with similar data acquisition and assessment concepts: the Macrophyte Biological Index for Rivers (IBMR) for Cyprus, France, Greece, Italy, Portugal and Spain, and the River Macrophyte Index (RMI) for Slovenia. The database included 318 sites, of which 78 were considered benchmarks. Boundary harmonization was performed for the common WFD assessment method (all countries except Slovenia) using the median of the Good/Moderate and High/Good boundaries of all countries. Then, whenever possible, the Slovenian method (RMI) was computed for the entire database. The IBMR was also computed for the Slovenian sites and was regressed against the RMI to check the relatedness of the methods (R² = 0.45; p < 0.00001) and to convert the RMI boundaries to the IBMR scale. The boundary bias of the RMI was computed by direct comparison of classifications and the median boundary values following boundary harmonization. The average absolute class difference after harmonization is 26%, and the percentage of classifications differing by half a quality class is also small (16.4%). This multi-step approach to the intercalibration was endorsed by the WFD Regulatory Committee.
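The two harmonization steps described in the abstract above can be sketched as: (1) take the median of the national Good/Moderate boundaries as the common boundary, and (2) convert an RMI boundary to the IBMR scale via a fitted linear regression. All boundary values and regression coefficients below are illustrative assumptions, not the study's actual numbers.

```python
# Boundary harmonization sketch: median of national boundaries, then
# regression-based conversion of the Slovenian RMI boundary to the
# IBMR scale. All values are illustrative, not from the paper.
import statistics

# step 1: hypothetical national Good/Moderate boundaries (IBMR scale)
gm_boundaries = {"CY": 0.83, "FR": 0.79, "GR": 0.81,
                 "IT": 0.85, "PT": 0.78, "ES": 0.82}
common_gm = statistics.median(gm_boundaries.values())

# step 2: suppose regression on the Slovenian sites gave IBMR = a*RMI + b
a, b = 0.92, 0.05          # illustrative regression coefficients
rmi_gm_boundary = 0.80     # illustrative RMI Good/Moderate boundary
ibmr_equivalent = a * rmi_gm_boundary + b

print(f"common G/M boundary = {common_gm:.3f}")
print(f"RMI boundary on IBMR scale = {ibmr_equivalent:.3f}")
```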

  20. Integration of small run-of-river and solar power: The hydrological regime prediction/assessment accuracy

    NASA Astrophysics Data System (ADS)

    Francois, Baptiste; Creutin, Jean-Dominique; Hingray, Benoit; Zoccatelli, Davide

    2014-05-01

    We analyzed how water discharge prediction accuracy controls the quality of assessments of run-of-river and solar power interaction. In particular, we sought to determine in which hydro-meteorological contexts a simple water discharge prediction method is able to produce a pertinent assessment of run-of-river and solar power interaction. We considered three degrees of complexity for estimating water discharges: i) model-based estimation using parameters calibrated on the watershed itself, ii) model-based estimation using parameters from a nearby watershed, and iii) a scaling law. This work was performed for a set of watersheds along a climate transect running from the Alpine crests to the Veneto plains in northeastern Italy, where observed run-of-river power generation presents different degrees of complementarity with solar power. The work presented is part of the FP7 project COMPLEX (Knowledge based climate mitigation systems for a low carbon economy; http://www.complex.ac.uk/).
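The simplest of the three discharge-estimation approaches listed above, a scaling law, is commonly a drainage-area ratio transfer from a gauged to an ungauged watershed. The function below is a minimal sketch of that idea; the exponent and the discharge/area values are illustrative assumptions, not the study's formulation.

```python
# Drainage-area scaling-law sketch: Q_target = Q_gauged * (A_t / A_g)**b.
# Exponent b = 1 corresponds to discharge proportional to drainage area.
def scaled_discharge(q_gauged, area_gauged, area_target, exponent=1.0):
    """Transfer discharge (m3/s) between watersheds by area ratio (km2)."""
    return q_gauged * (area_target / area_gauged) ** exponent

q = scaled_discharge(12.0, 300.0, 150.0)
print(q)  # half the drainage area -> half the discharge when b = 1
```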