Sample records for CryoSat validation experiment

  1. CryoSat Processing Prototype, how to generate LRM like echoes with SAR data and a Comparison to DUACS SLA over high latitudes

    NASA Astrophysics Data System (ADS)

    Picot, N.; Boy, F.; Desjonqueres, J.

    2012-12-01

    Like CryoSat, Sentinel-3 carries a Doppler altimeter. While there is long experience with LRM processing, nadir-looking SAR data are new and will need in-depth validation. Thanks to CryoSat data, the processing of SAR data can be exercised in orbit. Continuity with the current altimeter data set (based on LRM acquisitions) also has to be analysed in detail. A CryoSat Processing Prototype (C2P) has been developed at CNES to prepare the CNES SAR ocean retracking study. This prototype processes SAR data to generate LRM-like echoes on the ground. These CryoSat ocean products are routinely processed at CNES and ingested into the SALP/DUACS system. CryoSat data have proved to be very accurate and very valuable for the ocean user community in recent months. For example, they have largely reduced the impact of the loss of the ESA Envisat mission as well as the long unavailability of Jason-1 data. This paper will describe the system put in place in early 2012 to feed CryoSat data into the SALP/DUACS products and will present the routine data analysis. C2P CryoSat products will be compared with DUACS SLA estimates, with a specific focus on high latitudes, since CryoSat has been the only mission providing sea surface estimates above 66 degrees latitude since the loss of the ESA Envisat mission.

  2. The impact of the snow cover on sea-ice thickness products retrieved by Ku-band radar altimeters

    NASA Astrophysics Data System (ADS)

    Ricker, R.; Hendricks, S.; Helm, V.; Perovich, D. K.

    2015-12-01

    Snow on sea ice is a relevant polar climate parameter related to ocean-atmosphere interactions and surface albedo. It also remains an important factor for sea-ice thickness products retrieved from Ku-band satellite radar altimeters like Envisat or CryoSat-2, which is currently in operation and the subject of many recent studies. Such satellites sense the height of the sea-ice surface above sea level, which is called sea-ice freeboard. By assuming hydrostatic equilibrium and that the main scattering horizon is given by the snow-ice interface, the freeboard can be transformed into sea-ice thickness. Therefore, information about the snow load on a hemispheric scale is crucial. Due to the lack of sufficient satellite products, only climatological values are used in current studies. Since such values do not represent the high variability of snow distribution in the Arctic, they can be a substantial contributor to the total sea-ice thickness uncertainty budget. Moreover, recent studies suggest that the snow layer cannot be considered homogeneous, but may rather feature a complex stratigraphy due to wind compaction and/or ice lenses. The Ku-band radar signal can therefore be scattered at internal layers, causing a shift of the main scattering horizon towards the snow surface. This alters the freeboard and thickness retrieval, as the assumption that the main scattering horizon is given by the snow-ice interface is no longer valid, and introduces a bias. Here, we present estimates of the impact of snow depth uncertainties and snow properties on CryoSat-2 sea-ice thickness retrievals. We therefore compare CryoSat-2 freeboard measurements with field data from ice mass-balance buoys and aircraft campaigns from the CryoSat Validation Experiment. This unique validation dataset includes airborne laser scanner and radar altimeter measurements in spring, coincident with CryoSat-2 overflights, and allows us to evaluate how the main scattering horizon is altered by the presence of a complex snow stratigraphy.
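    The freeboard-to-thickness conversion under the assumptions stated above (hydrostatic equilibrium, main scattering at the snow-ice interface) can be written compactly. The sketch below is a minimal illustration only, using assumed nominal densities rather than the values of the study, and it omits the correction for the slower radar propagation through the snow layer.

    ```python
    import numpy as np

    # Assumed nominal densities in kg/m^3 (illustrative, not the study's values)
    RHO_WATER = 1024.0  # sea water
    RHO_ICE = 917.0     # sea ice
    RHO_SNOW = 300.0    # snow

    def freeboard_to_thickness(ice_freeboard, snow_depth,
                               rho_w=RHO_WATER, rho_i=RHO_ICE, rho_s=RHO_SNOW):
        """Sea-ice thickness (m) from ice freeboard (m) and snow depth (m),
        assuming hydrostatic equilibrium and scattering at the snow-ice interface."""
        return (rho_w * ice_freeboard + rho_s * snow_depth) / (rho_w - rho_i)

    # Example: 0.15 m freeboard under 0.20 m of snow gives roughly 2 m of ice.
    print(freeboard_to_thickness(np.array([0.15]), np.array([0.20])))
    ```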

  3. Comparing IceBridge and CryoSat-2 sea ice observations over the Arctic and the Southern Ocean

    NASA Astrophysics Data System (ADS)

    Yi, D.; Kurtz, N. T.; Harbeck, J.; Hofton, M. A.; Manizade, S.; Cornejo, H.

    2016-12-01

    From 2009 to 2015, CryoSat-2 and IceBridge had 34 coincident lines over sea ice: 23 over the Arctic (20 with ATM, 2 with LVIS, and 1 with both ATM and LVIS) and 11 over the Southern Ocean (9 with ATM and 2 with both ATM and LVIS). In this study, we will compare both surface elevation and sea ice freeboard from CryoSat-2, ATM, and LVIS. We will apply identical ellipsoid, geoid, tide models, and atmospheric corrections to the CryoSat-2, ATM, and LVIS data. For CryoSat-2, we will use the surface elevation and sea ice freeboard both from the standard CryoSat-2 data product and calculated through a waveform fitting method. For ATM and LVIS, we will use the surface elevation and sea ice freeboard from the OIB data product and the elevation and sea ice freeboard calculated through a Gaussian waveform fitting method. The results of this study are important for using ATM and LVIS to calibrate/validate CryoSat-2 results and for bridging the data gap between ICESat and ICESat-2.
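    As a schematic of the Gaussian waveform fitting mentioned above (not the mission processors' actual implementation), a single-peak return can be retracked by fitting a Gaussian plus a noise floor and taking the fitted peak position, which maps to range and hence surface elevation. Function and variable names are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(t, amp, t0, sigma, noise_floor):
        """Single-peak Gaussian return plus a constant noise floor."""
        return amp * np.exp(-0.5 * ((t - t0) / sigma) ** 2) + noise_floor

    def retrack_waveform(gate_time_ns, power):
        """Fit a Gaussian to a recorded waveform and return the fitted peak
        position (ns); the caller converts this to range and elevation."""
        p0 = [power.max() - power.min(),          # crude initial amplitude
              gate_time_ns[np.argmax(power)],     # peak position guess
              2.0,                                # width guess (ns)
              power.min()]                        # noise-floor guess
        popt, _ = curve_fit(gaussian, gate_time_ns, power, p0=p0)
        return popt[1]
    ```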

  4. Validating Cryosat-2 elevation estimates with airborne laser scanner data for the Greenland ice sheet, Austfonna and Devon ice caps

    NASA Astrophysics Data System (ADS)

    Simonsen, Sebastian B.; Sandberg Sørensen, Louise; Nilsson, Johan; Helm, Veit; Langley, Kirsty A.; Forsberg, Rene; Hvidegaard, Sine M.; Skourup, Henriette

    2015-04-01

    The ESA CryoSat-2 satellite, launched in April 2010, carries a new type of radar altimeter especially designed for monitoring changes of sea and land ice. The radar signal may penetrate into the snow pack, and the depth of the radar reflecting surface depends on the ratio between the surface and the volume backscatter, which is a function of several different properties such as snow density, crystal structure and surface roughness. In the case of large volume scattering, the radar waveforms become broad and the determination of the range (surface elevation) becomes more difficult. Different algorithms (retrackers) are used for the range determination, and the estimated surface penetration is highly dependent on the applied retracker. As part of the ESA-CryoVEx/CryoVal-Land Ice projects, DTU Space has gathered accurate airborne laser scanner elevation measurements. Sites on the Greenland ice sheet and the Austfonna and Devon ice caps have been surveyed repeatedly, aligned with CryoSat-2 ground tracks and surface experiments. Here, we utilize elevation estimates from available CryoSat-2 retrackers (ESA level-2 retracker, DTU retracker, etc.) and validate the elevation measurements against the ESA-CryoVEx campaigns. A difference between laser and radar elevations is expected due to radar penetration; however, an inter-comparison between retrackers will shed light on individual performances and biases. Additionally, the geolocation of the radar return will also be a determining factor for the precision. Ultimately, the use of multiple retrackers can provide information about subsurface conditions and utilize more of the waveform information than is presently used in radar altimetry.

  5. CryoSat-2: Post Launch Performance of SIRAL-2 and its Calibration/Validation

    NASA Astrophysics Data System (ADS)

    Cullen, Robert

    1. INTRODUCTION The main payload of CryoSat-2 [1], SIRAL (Synthetic Interferometric Radar Altimeter), is a Ku-band pulse-width limited radar altimeter which transmits pulses at a high pulse repetition frequency, thus making received echoes phase coherent and suitable for azimuth processing [2]. The azimuth processing, in conjunction with correction for slant range, improves along-track resolution to about 250 meters, which is a significant improvement over traditional pulse-width limited systems such as Envisat RA-2 [3]. CryoSat-2 will be launched on 25th February 2010, and this paper describes the pre- and post-launch measures of CryoSat/SIRAL performance and the status of mission validation planning. 2. SIRAL PERFORMANCE: INTERNAL AND EXTERNAL CALIBRATION Phase-coherent pulse-width limited radar altimeters such as SIRAL-2 pose a new challenge when considering a strategy for calibration. Along with the need to generate the well-understood corrections for transfer function amplitude with respect to frequency, gain and instrument path delay, there is also a need to provide corrections for transfer function phase with respect to frequency and AGC setting, and for phase variation across bursts of pulses. Furthermore, since some components of these radars are temperature sensitive, one needs to be careful when deciding how often calibrations are performed whilst not impacting mission performance. Several internal calibration ground processors have been developed to model imperfections within the CryoSat-2 radar altimeter (SIRAL-2) hardware and reduce their effect on the science data stream via the use of calibration correction auxiliary products within the ground segment. We present the methods and results used to model and remove imperfections and describe the baseline for usage of SIRAL-2 calibration modes during the commissioning and operational exploitation phases of the mission. Additionally, we present early results derived from external calibration of SIRAL via the use of ocean calibration zones and radar transponders. 3. CRYOSAT-2 OVERALL PERFORMANCE & VALIDATION PLANNING Validating retrievals derived from a phase-coherent pulse-width limited polar observing radar altimeter such as SIRAL is not a simple task [4]. In order to fully understand all the respective error covariances it is necessary to acquire many different types of in-situ measurements (GPR, neutron probe density profiles, drilled and electromagnetically derived sea-ice thicknesses, for example) in highly inhospitable regions of the cryosphere at key times of the year. In order to correlate retrievals from CryoSat with the in-situ data, it was decided early in the CryoSat development that an aircraft-borne radar altimeter with similar functionality to SIRAL would provide the necessary link, albeit on a smaller scale, and provide pre-launch insight into expected performances and issues. In 2001 ESA commenced the development of its own prototype radar altimeter that mimics the functionality of SIRAL. Similar to SIRAL, but with subtle functional differences, the airborne SAR/Interferometric Radar Altimeter System (ASIRAS) has now been the centrepiece instrument for a number of large-scale land and sea ice field campaigns in the Arctic during spring and autumn 2004, 2006 and 2008. Additional smaller science/test campaigns have taken place in March 2003 (Svalbard), March 2005 (Bay of Bothnia), March 2006 (Western Greenland) and April 2007 (CryoVEx 2007 in Svalbard).
    Credit is due to all parties that constitute the CryoSat Validation and Retrieval Team (CVRT) for the coordination, planning and acquisition of in-situ and airborne measurements and the subsequent processing and distribution of the data for analysis. CVRT has a robust infrastructure in place for validating the level 2 products derived from an operational CryoSat-2. 4. REFERENCES [1] http://www.esa.int/livingplanet/cryosat [2] Wingham, D. J., Francis, C. R., Baker, S., Bouzinac, C., Cullen, R., de Chateau-Thierry, P., Laxon, S. W., Mallow, U., Mavrocordatos, C., Phalippou, L., Ratier, G., Rey, L., Rostan, F., Viau, P. and Wallis, D., 'CryoSat: A Mission to Determine the Fluctuations in Earth's Land and Marine Ice Fields', Advances in Space Research, 37 (2006), pp. 841-871. doi:10.1016/j.asr.2005.07.027. [3] Mission and data description, CS-RP-ESA-SY-0059, issue 3, 2nd January 2007. http://esamultimedia.esa.int/docs/Cryosat/Mission_and_Data_Descrip.pdf [4] CryoSat calibration and validation concept, http://esamultimedia.esa.int/docs/Cryosat/CVC_14Nov01.pdf

  6. CryoSat-2: Post launch performance of SIRAL-2 and its calibration/validation

    NASA Astrophysics Data System (ADS)

    Cullen, Robert; Francis, Richard; Davidson, Malcolm; Wingham, Duncan

    2010-05-01

    1. INTRODUCTION The main payload of CryoSat-2 [1], SIRAL (Synthetic Interferometric Radar Altimeter), is a Ku-band pulse-width limited radar altimeter which transmits pulses at a high pulse repetition frequency, thus making received echoes phase coherent and suitable for azimuth processing [2]. The azimuth processing, in conjunction with correction for slant range, improves along-track resolution to about 250 meters, which is a significant improvement over traditional pulse-width limited systems such as Envisat RA-2 [3]. CryoSat-2 will be launched on 25th February 2010, and this paper describes the pre- and post-launch measures of CryoSat/SIRAL performance and the status of mission validation planning. 2. SIRAL PERFORMANCE: INTERNAL AND EXTERNAL CALIBRATION Phase-coherent pulse-width limited radar altimeters such as SIRAL-2 pose a new challenge when considering a strategy for calibration. Along with the need to generate the well-understood corrections for transfer function amplitude with respect to frequency, gain and instrument path delay, there is also a need to provide corrections for transfer function phase with respect to frequency and AGC setting, and for phase variation across bursts of pulses. Furthermore, since some components of these radars are temperature sensitive, one needs to be careful when deciding how often calibrations are performed whilst not impacting mission performance. Several internal calibration ground processors have been developed to model imperfections within the CryoSat-2 radar altimeter (SIRAL-2) hardware and reduce their effect on the science data stream via the use of calibration correction auxiliary products within the ground segment. We present the methods and results used to model and remove imperfections and describe the baseline for usage of SIRAL-2 calibration modes during the commissioning and operational exploitation phases of the mission. Additionally, we present early results derived from external calibration of SIRAL via the use of ocean calibration zones and radar transponders. 3. CRYOSAT-2 OVERALL PERFORMANCE & VALIDATION PLANNING Validating retrievals derived from a phase-coherent pulse-width limited polar observing radar altimeter such as SIRAL is not a simple task [4]. In order to fully understand all the respective error covariances it is necessary to acquire many different types of in-situ measurements (GPR, neutron probe density profiles, drilled and electromagnetically derived sea-ice thicknesses, for example) in highly inhospitable regions of the cryosphere at key times of the year. In order to correlate retrievals from CryoSat with the in-situ data, it was decided early in the CryoSat development that an aircraft-borne radar altimeter with similar functionality to SIRAL would provide the necessary link, albeit on a smaller scale, and provide pre-launch insight into expected performances and issues. In 2001 ESA commenced the development of its own prototype radar altimeter that mimics the functionality of SIRAL. Similar to SIRAL, but with subtle functional differences, the airborne SAR/Interferometric Radar Altimeter System (ASIRAS) has now been the centrepiece instrument for a number of large-scale land and sea ice field campaigns in the Arctic during spring and autumn 2004, 2006 and 2008. Additional smaller science/test campaigns have taken place in March 2003 (Svalbard), March 2005 (Bay of Bothnia), March 2006 (Western Greenland) and April 2007 (CryoVEx 2007 in Svalbard).
    Credit is due to all parties that constitute the CryoSat Validation and Retrieval Team (CVRT) for the coordination, planning and acquisition of in-situ and airborne measurements and the subsequent processing and distribution of the data for analysis. CVRT has a robust infrastructure in place for validating the level 2 products derived from an operational CryoSat-2. 4. REFERENCES [1] http://www.esa.int/livingplanet/cryosat [2] Wingham, D. J., Francis, C. R., Baker, S., Bouzinac, C., Cullen, R., de Chateau-Thierry, P., Laxon, S. W., Mallow, U., Mavrocordatos, C., Phalippou, L., Ratier, G., Rey, L., Rostan, F., Viau, P. and Wallis, D., 'CryoSat: A Mission to Determine the Fluctuations in Earth's Land and Marine Ice Fields', Advances in Space Research, 37 (2006), pp. 841-871. doi:10.1016/j.asr.2005.07.027. [3] Mission and data description, CS-RP-ESA-SY-0059, issue 3, 2nd January 2007. http://esamultimedia.esa.int/docs/Cryosat/Mission_and_Data_Descrip.pdf [4] CryoSat calibration and validation concept, http://esamultimedia.esa.int/docs/Cryosat/CVC_14Nov01.pdf

  7. Improved Oceanographic Measurements from SAR Altimetry: Results and Scientific Roadmap from ESA CryoSat Plus for Oceans Project

    NASA Astrophysics Data System (ADS)

    Cotton, P. D.; Andersen, O.; Stenseng, L.; Boy, F.; Cancet, M.; Cipollini, P.; Gommenginger, C.; Dinardo, S.; Egido, A.; Fernandes, M. J.; Garcia, P. N.; Moreau, T.; Naeije, M.; Scharroo, R.; Lucas, B.; Benveniste, J.

    2016-08-01

    The ESA CryoSat mission is the first space mission to carry a radar altimeter that can operate in Synthetic Aperture Radar (SAR) mode. Although the prime objective of the CryoSat mission is dedicated to monitoring land and marine ice, the SAR mode capability of the CryoSat SIRAL altimeter also presents significant potential benefits for ocean applications, including improved range precision and finer along-track spatial resolution. The "CryoSat Plus for Oceans" (CP4O) project, supported by the ESA Support to Science Element (STSE) Programme and by CNES, was dedicated to the exploitation of CryoSat-2 data over the open and coastal ocean. The general objectives of the CP4O project were: to build a sound scientific basis for new oceanographic applications of CryoSat-2 data; to generate and evaluate new methods and products that will enable the full exploitation of the capabilities of the CryoSat-2 SIRAL altimeter; and to ensure that the scientific return of the CryoSat-2 mission is maximised. This task was addressed within four specific themes: Open Ocean Altimetry, High Resolution Coastal Zone Altimetry, High Resolution Polar Ocean Altimetry, and High Resolution Sea-Floor Bathymetry, with further work on developing improved geophysical corrections. The CP4O consortium brought together a uniquely strong team of key European experts to develop and validate new algorithms and products to enable users to fully exploit the novel capabilities of the CryoSat-2 mission for observations over the ocean. The consortium was led by SatOC (UK), and included CLS (France), Delft University of Technology (The Netherlands), DTU Space (Denmark), isardSat (Spain), National Oceanography Centre (UK), Noveltis (France), Starlab (Spain) and the University of Porto (Portugal). This paper presents an overview of the major results and outlines a proposed roadmap for the further development and exploitation of these results in operational and scientific applications.

  8. Calibration And Validation Of CryoSat-2 Low Resolution Mode Data

    NASA Astrophysics Data System (ADS)

    Naeije, M.; Schrama, E.; Scharroo, R.

    2011-02-01

    Running ahead of the continuously growing need for operational use of sea level products, TU Delft started the Radar Altimeter Database System (RADS) many years ago. This system serves as a global international sea-level service. It supports, on the one hand, science, such as studies of ocean circulation, El Niño, sea level change, and ice topography, and on the other hand (offshore) operations, such as the delivery of ocean current information, wind and wave statistics, and ice detection and classification. At present, the database is used by a large scientific community throughout the world, and is maintained and developed daily by Altimetrics LLC, TU Delft and NOAA. It contains all historic altimeter data, and now has to be updated with the data from ESA's ice mission CryoSat-2, which was launched successfully in April 2010. These new data are important to augment the data set and thereby to improve the estimates of sea level change and its contributors. For this, the data have to be validated and calibrated, the necessary corrections added and improved (including modelling of corrections that are not directly available from the CryoSat-2 platform), and the orbit accuracy verified and, if possible, the orbits refined. Subsequently, value-added ocean and ice products need to be developed in synergy with all the other satellite altimeter data. During the commissioning phase we primarily looked at the sanity of the available level-1b and level-2 Low Resolution Mode (LRM) data. Here, for the 2011 CryoSat Validation Workshop, we present the results of our calibration and validation of LRM L2 data by internal comparison of CryoSat-2 and external comparison with other satellites. We have established a range bias of 3.77 (measurement range too long) and a timing bias of 8.2 ms (measurement timing too late).

  9. Unmanned Aircraft Systems For CryoSat-2 Validation

    NASA Astrophysics Data System (ADS)

    Crocker, Roger Ian; Maslanik, James A.

    2011-02-01

    A suite of sensors has been assembled to map surface elevation at fine resolution from small unmanned aircraft systems (UAS). The sensor package consists of a light detection and ranging (lidar) instrument, an inertial measurement unit (IMU), a GPS module, and digital still and video cameras. It has been utilized to map ice sheet topography in Greenland and to measure sea ice freeboard and roughness in Fram Strait. Data collected during these campaigns illustrate its potential to complement ongoing CryoSat-2 (CS-2) calibration and validation efforts.

  10. CryoSat SAR/SARin Level1b products: assessment of BaselineC and improvements towards BaselineD

    NASA Astrophysics Data System (ADS)

    Scagliola, Michele; Fornari, Marco; Bouffard, Jerome; Parrinello, Tommaso

    2017-04-01

    CryoSat was launched on 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This allows a significantly improved along-track resolution to be reached with respect to traditional pulse-width limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline C, was released into operation in April 2015, and the second CryoSat reprocessing campaign was initiated jointly, taking advantage of the upgrades implemented in the IPF1 processing chain as well as of some specific configurations for the calibration corrections. In particular, the CryoSat Level1b Baseline C products generated in the framework of the second reprocessing campaign include refined information on the mispointing angles and the calibration corrections. This poster will detail the evolutions that are currently planned for the CryoSat Baseline D SAR/SARin Level1b products and the corresponding quality improvements that are expected.

  11. Airborne surveys in the Arctic and Antarctic for geophysics, sea-ice thickness, and CryoSat validation

    NASA Astrophysics Data System (ADS)

    Forsberg, R.; Olesen, A. V.; Hvidegaard, S.; Skourup, H.

    2010-12-01

    Airborne laser and radar measurements over the Greenland ice sheet, Svalbard, and adjacent parts of the Arctic Ocean have been carried out by DTU Space in a number of recent Danish/Greenlandic and European project campaigns, with the purpose of monitoring ice sheet and sea-ice changes, supporting Greenland societal needs (oil exploration and hydropower), and supporting CryoSat pre-launch calibration and validation campaigns. The Arctic campaigns have been carried out using a Twin Otter aircraft carrying laser scanners and various radars. Since 2009 a new programme of long-range gravity and magnetic surveys has been initiated using a Basler DC3 aircraft for large-scale surveys in the Arctic Ocean and Antarctica, with the 2010 cooperative Danish-Argentinean-Chilean-US ICEGRAV survey of the Antarctic Peninsula additionally including a UTIG 60 MHz ice-penetrating radar. In this paper we outline the recent and upcoming airborne survey activities, outline the usefulness of the airborne data for satellite validation (CryoSat and GOCE), and give examples of measurements and comparisons with satellite and in-situ data.

  12. Evaluation of CryoSat-2 Measurements for the Monitoring of Large River Water Levels

    NASA Astrophysics Data System (ADS)

    Bercher, Nicolas; Calmant, Stephane; Picot, Nicolas; Seyler, Frederique; Fleury, Sara

    2013-09-01

    In this study, perhaps for the first time, we explore the ability of the CryoSat-2 satellite to monitor the water level of large rivers. We focus on a 500 km section of the Madeira River (Amazon basin), around the town of Manicore (cf. Fig. 1). Due to the drifting orbit of the mission, the usual concept of "virtual station" vanishes, and data are instead extracted within polygons that delineate the riverbed. This results in spatio-temporal time series of the river water level, expressed as a function of both space (distance to the ocean) and time. We use CryoSat-2 low resolution mode (LRM) data processed with an Ice2 retracker, i.e., the content of the upcoming IOP/GOP ocean product from ESA [1]. For this study, we use demonstration samples (year 2011 over our validation area), processed by the so-called CryoSat Processing Prototype developed by CNES in the framework of the Sentinel-3 project from ESA [5] [4]. At the time of this study, the product came with no geophysical corrections ("solid earth tide", atmosphere, etc.). Validation is performed on (1) river water level pseudo time series and (2) the river pseudo profile. An overview of the spatio-temporal time series is also given in 2D and 3D plots. Despite the lack of geophysical corrections, the results are very promising (std 0.51 m) and rival those obtained by Envisat (std 0.43 m) and Jason-2 (std 0.47 m) at nearby virtual stations. We also demonstrate the potential of CryoSat-2 and the appropriateness of its drifting orbit for mapping river topography and deriving water levels "at any time and anywhere", a major topic of interest for hydrological propagation models and the preparation of the SWOT mission.

  13. Dramatic and long-term lake level changes in the Qinghai-Tibet Plateau from Cryosat-2 altimeter: validation and augmentation by results from repeat altimeter missions and satellite imagery

    NASA Astrophysics Data System (ADS)

    Hwang, Cheinway; Huang, YongRuei; Cheng, Ys; Shen, WenBin; Pan, Yuanjin

    2017-04-01

    The mean elevation of the Qinghai-Tibet Plateau (QTP) exceeds 4000 m. Lake levels in the QTP are less affected by human activities than elsewhere, and may better reflect the state of contemporary climate change. Ground-based lake level measurements there are rare. Repeat altimeter missions, particularly the TOPEX and ERS series of altimetry, have provided long-term lake level observations in the QTP, but their large cross-track distances allow only a few lakes to be monitored. In contrast, the CryoSat-2 altimeter, equipped with the new sensor SIRAL (interferometric/synthetic aperture radar altimeter), provides a much better ranging accuracy and a finer spatial coverage than these repeat missions, and can detect water level changes over a large number of lakes in the QTP. In this study, CryoSat-2 data are used to determine lake level changes over 75°E-100°E and 28°N-37.5°N, where CryoSat-2 covers 60 lakes and SARAL/AltiKa covers 32 lakes from 2013 to 2016. Over a given lake, CryoSat-2 passes over different spots in different cycles, making the number of observations non-uniform and requiring corrections for lake slopes. Four cases are investigated to cope with these situations: (1) neglecting inconsistency in data volume and lake slopes, (2) considering data volume only, (3) considering lake slopes only, and (4) considering both data volume and lake slopes. The CryoSat-2 result is then compared with the result from SARAL to determine the best case. Because CryoSat-2 is available from 2010 to 2016, Jason-2 data are used to fill gaps between the time series of CryoSat-2 and ICESat (2003-2009) to obtain more than 10 years of lake level series. The CryoSat-2 result shows dramatic lake level rises in Lakes Kusai, Zhuoaihu and Salt in 2011, caused by floods. Landsat satellite imagery assists the determination and interpretation of such rises.
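    As a purely illustrative sketch of why lake-slope corrections matter for a drifting-track altimeter (this is not the authors' algorithm), one simple approach is to fit a static planar surface slope jointly with a temporal term, so that heights observed at different spots of a lake can be referenced to a common point before building the level time series.

    ```python
    import numpy as np

    def correct_lake_slope(lon, lat, h, t):
        """Remove a static planar lake-surface slope from drifting-track heights.

        lon, lat: observation positions (deg); h: heights (m); t: time (yr).
        Fits h = a + b*dlon + c*dlat + d*dt so the temporal trend is not
        absorbed by the spatial terms, then subtracts only the spatial part.
        """
        dlon, dlat, dt = lon - lon.mean(), lat - lat.mean(), t - t.mean()
        A = np.column_stack([np.ones_like(h), dlon, dlat, dt])
        (a, b, c, d), *_ = np.linalg.lstsq(A, h, rcond=None)
        return h - (b * dlon + c * dlat)  # heights referenced to the lake centroid
    ```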

  14. Fine Ice Sheet margins topography from swath processing of CryoSat SARIn mode data

    NASA Astrophysics Data System (ADS)

    Gourmelen, N.; Escorihuela, M. J.; Shepherd, A.; Foresta, L.; Muir, A.; Briggs, K.; Hogg, A. E.; Roca, M.; Baker, S.; Drinkwater, M. R.

    2014-12-01

    Reference and repeat observations of Glacier and Ice Sheet Margin (GISM) topography are critical to identify changes in ice thickness, provide estimates of mass gain or loss and thus quantify the contribution of the cryosphere to sea level change. The lack of such sustained observations was identified in the Integrated Global Observing Strategy (IGOS) Cryosphere Theme Report as a major shortcoming. Conventional altimetry measurements over GISMs exist, but coverage has been sparse and characterized by coarse ground resolution. Additionally, and more importantly, they have proved ineffective in the presence of steep slopes, a typical feature of GISM areas. Since the majority of Antarctic and Greenland ice sheet mass loss is estimated to lie within 100 km of the coast, but only about 10% of this area is surveyed, there is a need for more robust and dense observations of GISMs, in both time and space. The ESA altimetry mission CryoSat aims to gain better insight into the evolution of the cryosphere. CryoSat's revolutionary design features a Synthetic Interferometric Radar Altimeter (SIRAL) with two antennas for interferometry. The corresponding SAR Interferometer (SARIn) mode of operation increases spatial resolution while resolving the angular origin of off-nadir echoes occurring over sloping terrain. The SARIn mode is activated over GISMs, and the elevation at the Point Of Closest Approach (POCA) is a standard product of the CryoSat mission. Here we present an approach for more comprehensively exploiting the SARIn mode of CryoSat and producing an ice elevation product with enhanced spatial resolution compared to standard CryoSat-2 height products. In this so-called L2 swath processing approach, the full CryoSat waveform is exploited under specific conditions of signal and surface characteristics. We will present the rationale, validation exercises and preliminary results from the European Space Agency's STSE CryoTop study over selected test regions of the margins of the Greenland and Antarctic Ice Sheets.

  15. A data assimilation system combining CryoSat-2 data and hydrodynamic river models

    NASA Astrophysics Data System (ADS)

    Schneider, Raphael; Ridler, Marc-Etienne; Godiksen, Peter Nygaard; Madsen, Henrik; Bauer-Gottwein, Peter

    2018-02-01

    There are numerous hydrologic studies using satellite altimetry data over rivers from repeat-orbit missions such as Envisat or Jason. This study is one of the first examples of combining altimetry from a drifting-ground-track satellite mission, namely CryoSat-2, with a river model. CryoSat-2 SARIn Level 2 data are used to improve a 1D hydrodynamic model of the Brahmaputra River in South Asia, which is based on the Saint-Venant equations for unsteady flow and set up in the MIKE HYDRO River software. After calibration against discharge and water level, the hydrodynamic model represents the spatio-temporal variations of water levels accurately and without bias. A data assimilation framework has been developed and linked with the model. It is a flexible framework that can assimilate water level data which are arbitrarily distributed in time and space. The setup has been used to assimilate CryoSat-2 water level observations over the Assam valley for the years 2010-2015, using an Ensemble Transform Kalman Filter (ETKF). The resulting improvement in discharge forecasting skill was then evaluated. For experiments with synthetic CryoSat-2 data the continuous ranked probability score (CRPS) was improved by up to 32%, whilst for experiments assimilating real data it improved by up to 10%. The developed methods are expected to be transferable to other rivers and altimeter missions. The model setup and calibration are based almost entirely on globally available remote sensing data.
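    For reference, the continuous ranked probability score used above to quantify forecast skill can be estimated directly from an ensemble forecast and a single observation. A minimal sketch with illustrative values (this is not the study's assimilation code):

    ```python
    import numpy as np

    def crps_ensemble(ensemble, obs):
        """Ensemble estimate of the CRPS (lower is better):
        CRPS = E|X - y| - 0.5 * E|X - X'|."""
        x = np.asarray(ensemble, dtype=float)
        term1 = np.mean(np.abs(x - obs))
        term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
        return term1 - term2

    # Example: ensemble of simulated water levels (m) against an observed level.
    print(crps_ensemble([10.2, 10.5, 10.1, 10.7, 10.4], 10.3))
    ```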

  16. Validation of Cryosat-2 SAR Wind and Wave Products

    NASA Astrophysics Data System (ADS)

    Abdalla, Saleh; Dinardo, Salvatore; Benveniste, Jerome; Janssen, Peter

    2016-08-01

    Significant wave height (SWH) and surface wind speed (WS) products from the CryoSat-2 Synthetic Aperture Radar (SAR) mode are validated against operational ECMWF atmospheric and wave model results, in addition to available observations from buoys, platforms and other altimeters. The SAR data were processed with the SAMOSA ocean model in the ESRIN G-POD service using the SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation (SARvatore). The data cover two geographic boxes: one in the northeast Atlantic Ocean extending from 32°N to 70°N and from 20°W to the prime meridian (NE Atlantic Box), for the period from 6 September 2010 to 30 June 2014, and the other in the eastern Pacific extending from 2.5°S to 25.5°S and from 160°W to 85°W (Pacific Box), for the period from 7 May 2012 to 30 June 2014. The amount of data is limited by the CryoSat SAR-mode acquisition capability over the ocean but is high enough to ensure the robustness and significance of the results (Sentinel-3 will operate in SAR mode over the whole ocean). The results show that the quality of both the SWH and WS products is very high.

  17. Comparison of CryoSat-2 and ENVISAT radar freeboard over Arctic sea ice: toward an improved Envisat freeboard retrieval

    NASA Astrophysics Data System (ADS)

    Guerreiro, Kevin; Fleury, Sara; Zakharova, Elena; Kouraev, Alexei; Rémy, Frédérique; Maisongrande, Philippe

    2017-09-01

    Over the past decade, sea-ice freeboard has been monitored with various satellite altimetric missions with the aim of producing long-term time series of ice thickness. While recent studies have demonstrated the capacity of the CryoSat-2 mission (2010-present) to provide accurate freeboard measurements, the current estimates obtained with the Envisat mission (2002-2012) still require large improvements. In this study, we first estimate Envisat and CryoSat-2 radar freeboard by using the exact same processing algorithms. We then analyse the freeboard difference between the two estimates over the common winter periods (November 2010-April 2011 and November 2011-March 2012). The analysis of along-track data and gridded radar freeboard in conjunction with Envisat pulse-peakiness (PP) maps suggests that the discrepancy between the two sensors is related to the surface properties of sea-ice floes and to the use of a threshold retracker. Based on the relation between the Envisat pulse peakiness and the radar freeboard difference between Envisat and CryoSat-2, we produce a monthly CryoSat-2-like version of the Envisat freeboard. The improved Envisat freeboard data set displays a spatial distribution similar to that of CryoSat-2 (RMSD = 1.5 cm) during the two ice growth seasons and for all months of the period of study. The comparison of the altimetric data sets with in situ ice draught measurements during the common flight period shows that the improved Envisat data set (RMSE = 12-28 cm) is as accurate as CryoSat-2 (RMSE = 15-21 cm) and much more accurate than the uncorrected Envisat data set (RMSE = 178-179 cm). The comparison of the improved Envisat radar freeboard data set is then extended to the rest of the Envisat mission to demonstrate the validity of the PP correction derived from the calibration period. The good agreement between the improved Envisat data set and the in situ ice draught data set (RMSE = 13-32 cm) demonstrates the potential of the PP correction to produce accurate freeboard estimates over the entire Envisat mission lifetime.
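    The PP-based correction itself is an empirical, monthly relation derived in the paper; the sketch below only illustrates the general idea with a common pulse-peakiness convention and a simple linear regression over the calibration (overlap) period. Both the PP definition and the linear form are assumptions for illustration.

    ```python
    import numpy as np

    def pulse_peakiness(waveform):
        """Pulse peakiness: peak power over total integrated power, scaled by
        the number of range bins (one common convention)."""
        w = np.asarray(waveform, dtype=float)
        return w.size * w.max() / w.sum()

    def fit_pp_correction(pp_cal, fb_envisat_cal, fb_cryosat2_cal):
        """Regress the Envisat minus CryoSat-2 freeboard difference on
        Envisat pulse peakiness over the calibration period."""
        diff = fb_envisat_cal - fb_cryosat2_cal
        slope, intercept = np.polyfit(pp_cal, diff, 1)
        return slope, intercept

    def apply_pp_correction(fb_envisat, pp, slope, intercept):
        """Produce a 'CryoSat-2-like' Envisat freeboard by removing the
        PP-predicted difference."""
        return fb_envisat - (slope * pp + intercept)
    ```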

  18. CryoSat Plus for Oceans - analysis of the state-of-the-art

    NASA Astrophysics Data System (ADS)

    Naeije, Marc; Gommenginger, Christine; Moreau, Thomas; Cotton, David; Benveniste, Jerome; Dinardo Dinardo, Salvatore

    2013-04-01

    The CryoSat Plus for Oceans (CP4O) project is an ESA initiative carried out by a Europe-wide consortium of altimetry experts. It aims to build a sound scientific basis for new scientific and operational applications of data coming from CryoSat-2 over the open ocean, polar ocean and coastal seas, and for seafloor mapping. It also generates and evaluates new methods and products that will enable the full exploitation of the capabilities of the CryoSat-2 SIRAL altimeter and extend their application beyond the initial mission objectives. It therefore also acts as a preparation for the upcoming Sentinel and Jason SAR-enabled altimetry missions. In this paper we review the CryoSat state of the art: the current initiatives, algorithms, models and Earth Observation based products and datasets that are relevant to the CryoSat+ ocean theme. Compared to conventional (pulse-limited) altimeter missions, CryoSat-2 is not a dedicated platform for ocean research: it lacks the microwave radiometer (MWR) typically used for wet tropospheric corrections, as well as the direct measurement of the first-order ionospheric effect by means of a dual-frequency altimeter. Also, the orbit of CryoSat-2 has a rather long repeat period, unsuited for collinear track analyses. These three particular features were already studied in the HERACLES project on the eve of the first CryoSat launch. We revisit the outcome of that study, update it to current understanding, and ultimately lay out what was, is and will be proposed in these problem areas. Clearly, we question the standard ionosphere corrections, the wet troposphere corrections and the accuracy of the mean sea surface (MSS), which underlie the accuracy of derived sea level anomalies. In addition, CryoSat-2 provides the first innovative altimeter with SAR and SARIn modes. This raises the direct problem of how to process these data, simply because this has not been done before. Compared to pulse-limited altimetry it is an entirely different discipline. In our CP4O project we try to answer this. We build on the results of the SAMOSA study, which was initiated to investigate the improvements that SAR mode altimetry can offer in measurements over ocean, coastal and inland water surfaces, developing practical implementations of new theoretical models for the SAR echo waveform. It is clear that having specific processing for SAR and SARIn raises a number of new issues to be studied, such as RDSAR (reducing SAR to pseudo-LRM data), sea state bias (SSB) in SAR mode, and land contamination, to name a few. The outcome of the analysis of the state of the art culminates in the delivery of the Preliminary Analysis Report and the Development and Validation Plan (DVP). We present a summary of these documents.

  19. An Evaluation of CryoSat-2 SAR Mode Performance Around the UK Coasts

    NASA Astrophysics Data System (ADS)

    Cipollini, P.; Gommenginger, C.; Snaith, H. M.; Cotton, D.; Dinardo, S.; Benveniste, J.

    2014-12-01

    One of the objectives of ESA's CryoSat Plus for Oceans (CP4O) project is to demonstrate the excellent retrieval of Level 2 ocean geophysical parameters from CryoSat-2. Within CP4O we have carried out a comparison of sea surface height from CryoSat-2, reprocessed by ESRIN, against tide gauges from the UK Tide Gauge Network. This work has the specific objective of assessing the performance in the coastal zone, and complements validation work over the open ocean (both for height and significant wave height) done elsewhere. We took updated corrections from the state-of-the-art RADS archive, computed the TWLE (total water level envelope, i.e. the sea level inclusive of ocean tides and atmospheric pressure and wind effects, a desirable quantity for validation), and then subset all segments of each pass within 50 km of a tide gauge, interpolating the tide gauge height (effectively a TWLE) to the time of the altimeter overpass to create match-up pairs. We first present the results of our attempt to correlate TWLE data over these multiple segments with the measurements from each nearby tide gauge, taking into account the distance of the altimetric measurements from the coastline. The results are dominated by large offsets, variable from match-up to match-up. Screening the data further based on the retracking misfit does not remove this bias, whose causes and possible impact are discussed. We then present an independent verification of the noise level in 20-Hz CryoSat-2 TWLEs and its variation as a function of distance from the coast. The noise level is estimated by computing the absolute value of the difference between consecutive TWLE values, as done in Passaro et al., Remote Sensing of Environment, 2014. Remarkably, the median of that difference remains at ~5 cm up to 5 km from the coast, suggesting a noise level of that order for the 20-Hz data, which would correspond to ~1.1 cm for the 1-Hz data. At 3 km the median abs(diff) is ~7.3 cm. Finally, we repeated the same analysis for only those points with a retracking misfit below a threshold of 3. The median stays virtually flat at ~5 cm all the way to the coast, but the fraction of points passing the misfit condition obviously decreases quickly (it is about 60% at 5 km from the coast, and less than 25% at 3 km). These results clearly demonstrate that CryoSat-2 maintains an excellent measurement performance well into the coastal zone.
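    The noise estimate described above can be reproduced schematically: take the median absolute difference of consecutive 20-Hz TWLE values along track (as in Passaro et al., 2014) and, assuming uncorrelated noise, scale to 1 Hz by the square root of the number of samples per second. Function names are illustrative.

    ```python
    import numpy as np

    def noise_from_consecutive_diffs(twle_20hz, samples_per_second=20):
        """Median absolute difference of consecutive 20-Hz TWLE values, and the
        implied 1-Hz noise level assuming uncorrelated measurement noise."""
        noise_20hz = np.median(np.abs(np.diff(twle_20hz)))
        noise_1hz = noise_20hz / np.sqrt(samples_per_second)
        return noise_20hz, noise_1hz

    # Example: a 20-Hz noise of ~5 cm corresponds to ~1.1 cm at 1 Hz (5 / sqrt(20)).
    ```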

  20. Arctic and Antarctic Sea-Ice Freeboard and Thickness Retrievals from CryoSat-2 and EnviSat

    NASA Astrophysics Data System (ADS)

    Ricker, Robert; Hendricks, Stefan; Schwegmann, Sandra; Helm, Veit; Rinne, Eero

    2016-04-01

    The CryoSat-2 satellite is now in its 6th year of data acquisition. With its synthetic aperture radar altimeter, CryoSat-2 achieves great improvements in along-track resolution compared to previous radar altimeter missions like ERS or Envisat. The latitudinal coverage includes major parts of the Arctic marine ice fields, where previous missions left a large data gap around the North Pole, especially over the multiyear ice zone north of Greenland. With this unique data set, changes in sea-ice thickness can be investigated in the context of the rapid reduction of the Arctic sea-ice cover that has been observed during recent decades. We present the current state of the CryoSat-2 Arctic sea-ice thickness retrieval that is processed at the Alfred Wegener Institute and available via seaiceportal.de (originally: meereisportal.de). Though biases in sea-ice thickness may occur due to the interpretation of waveforms, airborne and ground-based validation measurements give confidence that the retrieval algorithm enables us to capture the actual distributions of sea-ice regimes. Nevertheless, long time series of data retrievals are essential to estimate trends in sea-ice thickness and volume. Today, more than 20 years of radar altimeter data are potentially available and suitable for deriving sea-ice thickness. However, these data originate from satellites with different sensor characteristics. Therefore, it is crucial to study the consistency between individual sensors in order to derive long and consistent time series. We present results on the consistency between Antarctic freeboard measurements from the radar altimeters on board Envisat and CryoSat-2 for their overlap period in 2011.

  1. Seasonal sea surface and sea ice signal in the fjords of Eastern Greenland from CryoSat-2 SARin altimetry

    NASA Astrophysics Data System (ADS)

    Abulaitijiang, Adili; Baltazar Andersen, Ole; Stenseng, Lars

    2014-05-01

    CryoSat-2 offers the first ever possibility to perform coastal altimetric studies using SAR interferometry. This enables qualified measurements of sea surface height (SSH) in the fjords of Greenland. Scoresbysund fjord on the east coast of Greenland is the largest fjord in the world and is covered by the CryoSat-2 SARIn mask, making it a good test region. In addition, the tide gauge operated by DTU Space in Scoresbysund bay provides solid ground-based sea level variation records throughout the year. We investigate the sea surface height variation since the start of the CryoSat-2 mission using SARIn L1b data processed with Baseline B. We have employed a newly developed method for projecting all SARIn observations in the fjord onto a centerline along the fjord. Hereby we can make solid estimates of the annual and semi-annual signal in sea level and sea-ice freeboard within the fjord. These seasonal height variations enable us to derive sea-ice freeboard changes in the fjord from satellite altimetry. The derived sea level and sea-ice freeboard can be validated by comparison with the tide gauge observations for sea level and with the microwave-radiometer-derived observations of sea-ice freeboard developed at the Danish Meteorological Institute.
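    The annual and semi-annual signal in the centerline-projected heights can be estimated with an ordinary least-squares harmonic fit; the sketch below is one straightforward way to do this (illustrative, not the authors' exact processing).

    ```python
    import numpy as np

    def fit_seasonal_signal(t_years, height):
        """Fit mean, linear trend, and annual plus semi-annual harmonics to a
        height time series; returns the coefficients and the two amplitudes."""
        t = np.asarray(t_years, dtype=float)
        w = 2.0 * np.pi  # one cycle per year
        A = np.column_stack([
            np.ones_like(t), t,
            np.cos(w * t), np.sin(w * t),          # annual
            np.cos(2 * w * t), np.sin(2 * w * t),  # semi-annual
        ])
        coeffs, *_ = np.linalg.lstsq(A, height, rcond=None)
        annual_amp = np.hypot(coeffs[2], coeffs[3])
        semiannual_amp = np.hypot(coeffs[4], coeffs[5])
        return coeffs, annual_amp, semiannual_amp
    ```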

  2. Arctic and Antarctic sea-ice thickness from CryoSat and Envisat radar altimetry

    NASA Astrophysics Data System (ADS)

    Hendricks, S.; Rinne, E. J.; Paul, S.; Ricker, R.; Skourup, H.; Kern, S.; Sandven, S.

    2017-12-01

    One objective of the ESA Climate Change Initiative (CCI) on Sea Ice is the generation of a climate data record of sea-ice thickness from satellite radar altimetry in both hemispheres. We report on the results of the second phase of the CCI project, which are based on the 15-year (2002-2017) monthly data record built from Envisat and CryoSat-2 radar altimeter data. The data record needs to maintain consistency in the freeboard retrieval, the freeboard-to-thickness conversion and the uncertainty estimation over the full observational period. The main challenge has been to maintain consistency in the sea-ice freeboard retrieval, owing to the different radar altimeter concepts and footprints of Envisat and CryoSat-2. We have developed a novel empirical algorithm for both missions to minimize inter-mission biases in surface type classification as well as freeboard retrieval, based on CryoSat reference data for the overlap period from November 2010 to March 2012. The parametrization takes differences between sea-ice surface properties in the two hemispheres and the seasonal cycle into account. We report on the changes in sea-ice thickness in the Arctic winter seasons since 2002 and on the comparison with independent freeboard and thickness observations. Far less validation data exist for the southern hemisphere, and we provide an overview of the changes and the expected skill of Antarctic sea-ice thickness over the full seasonal cycle.

  3. Comparing elevation and freeboard from IceBridge and four different CryoSat-2 retrackers for coincident sea ice observations

    NASA Astrophysics Data System (ADS)

    Yi, D.; Kurtz, N. T.; Harbeck, J.

    2017-12-01

    The airborne IceBridge and spaceborne CryoSat-2 missions observe polar sea ice from different altitudes, with different footprint sizes, and often at different times and locations. Many studies use different retrackers to derive CryoSat-2 surface elevation, which we find causes large differences in the elevation and freeboard comparisons between IceBridge and CryoSat-2. In this study, we compare sea ice surface elevation and freeboard using 8 coincident CryoSat-2, ATM, and LVIS observations, with the IceBridge aircraft underflying the CryoSat-2 ground tracks. We apply identical ellipsoid, geoid model, tide model, and atmospheric corrections to the CryoSat-2 and IceBridge data to reduce elevation biases due to their differences. IceBridge's ATM and LVIS elevation and freeboard and Snow Radar snow depth are averaged at each CryoSat-2 footprint for comparison. The four different CryoSat-2 retrackers (ESA, GSFC, AWI, and JPL) show distinct differences in mean elevation of up to 0.35 m over leads and over floes, which suggests that systematic elevation biases exist between the retrackers. The mean IceBridge elevation over leads is within the mean elevation distribution of the four CryoSat-2 retrackers. The mean IceBridge elevation over floes is above the mean elevation distribution of the four CryoSat-2 retrackers. After removing the snow depth from the IceBridge elevation over floes, the mean IceBridge elevation is within the mean elevation distribution of the four CryoSat-2 retrackers. By identifying the strengths and weaknesses of the retrackers, this study provides a mechanism to improve freeboard retrievals from existing methods.

  4. CryoSat Level1b SAR/SARin BaselineC: Product Format and Algorithm Improvements

    NASA Astrophysics Data System (ADS)

    Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso

    2015-04-01

    CryoSat was launched on 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This allows a significantly improved along-track resolution to be reached with respect to traditional pulse-width limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline B, was released into operation in February 2012. A reprocessing campaign followed, in order to reprocess the data acquired since July 2010. After more than 2 years of development, the release into operations of Baseline C is expected in the first half of 2015. Baseline C Level1b products will be distributed in an updated format, including for example the attitude information (roll, pitch and yaw) and, for SAR/SARIn, a waveform length doubled with respect to Baseline B. Moreover, various algorithm improvements have been identified: • a datation bias of about -0.5195 ms will be corrected (SAR/SARIn) • a range bias of about 0.6730 m will be corrected (SAR/SARIn) • a roll bias of 0.1062 deg and a pitch bias of 0.0520 deg will be corrected • surface sample stack weighting will be applied to filter out the single-look echoes acquired at the highest look angles, which results in a sharpening of the 20 Hz waveforms. With the operational release of Baseline C, the second CryoSat reprocessing campaign will be initiated, taking advantage of the upgrades implemented in the IPF1 processing chain as well as at IPF2 level. The reprocessing campaign will cover the full CryoSat mission starting on 16th July 2010. This poster details the new information that will be added to the CryoSat Baseline C Level1b SAR/SARin products, and the main quality improvements will be described.

  5. CryoSat Ice Processor: Known Processor Anomalies and Potential Future Product Evolutions

    NASA Astrophysics Data System (ADS)

    Mannan, R.; Webb, E.; Hall, A.; Bouffard, J.; Femenias, P.; Parrinello, T.; Brockley, D.; Baker, S.; Scagliola, M.; Urien, S.

    2016-08-01

    Launched in 2010, CryoSat was designed to measure changes in polar sea ice thickness and ice sheet elevation. To reach this goal the CryoSat data products have to meet the highest performance standards and are subjected to a continual cycle of improvement achieved through upgrades to the Instrument Processing Facilities (IPFs). Following the switch to the Baseline-C Ice IPFs there are already planned evolutions for the next processing Baseline, based on recommendations from the Scientific Community, Expert Support Laboratory (ESL), Quality Control (QC) Centres and Validation campaigns. Some of the proposed evolutions, to be discussed with the scientific community, include the activation of freeboard computation in SARin mode, the potential operation of SARin mode over flat-to-slope transitory land ice areas, further tuning of the land ice retracker, the switch to NetCDF format and the resolution of anomalies arising in Baseline-C. This paper describes some of the anomalies known to affect Baseline-C in addition to potential evolutions that are planned and foreseen for Baseline-D.

  6. A Toolkit For CryoSat Investigations By The ESRIN EOP-SER Altimetry Team

    NASA Astrophysics Data System (ADS)

    Dinardo, Salvatore; Bruno, Lucas; Benveniste, Jerome

    2013-12-01

    The scope of this work is to present a new tool for the exploitation of CryoSat data, designed and developed entirely by the Altimetry Team at ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The tool framework is composed of two separate components: the first handles data collection and management, the second is the processing toolkit. The CryoSat FBR (Full Bit Rate) data are downlinked uncompressed from the satellite and contain un-averaged individual echoes. These data are made available on the Kiruna CalVal server in a 10-day rolling archive. Every day at ESRIN, all the CryoSat FBR data in SAR and SARin mode (around 30 gigabytes) are downloaded, catalogued and archived on local ESRIN EOP-SER workstations. As of March 2013, the total amount of FBR data is over 9 terabytes, with CryoSat acquisition dates spanning January 2011 to February 2013 (with some gaps). This archive was built by merging partial datasets available at ESTEC and NOAA, which were kindly made available to the EOP-SER team. On-demand access to this low-level data is restricted to expert users with validated ESA P.I. credentials. Currently the main users of the archiving functionality are the team members of the CP4O project (STSE CryoSat Plus for Oceans), CNES and NOAA. The second component of the service is the processing toolkit. The EOP-SER workstations host internally and independently developed software that is able to process the FBR data in SAR/SARin mode to generate multi-looked echoes (Level 1B) and subsequently to retrack them in SAR and SARin mode (Level 2) over the open ocean, exploiting the SAMOSA model and other internally developed models. The processing segment is used for research and development purposes: supporting awarded development contracts by cross-checking their deliverables to ESA, on-site demonstrations and training for selected users, cross-comparison against third-party products (CLS/CNES CPP products, for instance), preparation for the Sentinel-3 mission, publications, etc. Samples of these experimental SAR/SARin L1b/L2 products can be provided to the scientific community on request, for comparison with self-processed data. So far, the processing has been designed and optimized for open-ocean studies and is fully functional only over this kind of surface, but there are plans to extend this processing capability to coastal zones, inland waters and land, with a view to maximizing the exploitation of the upcoming Sentinel-3 topographic mission over all surfaces. There are also plans to make the toolkit fully accessible through software "gridification" so that it can run in the ESRIN G-POD (Grid Processing on Demand) service, and to extend the tool's functionalities to support the Sentinel-3 mission (both simulated and real data). Graphs and statistics about the spatial coverage and amount of FBR data currently archived on the EOP-SER workstations, along with some scientific results, will be shown in this paper, together with the tests that have been designed and performed to validate the products (tests against CryoSat Kiruna PDGS products and against transponder data).

  7. CryoSat Plus For Oceans: an ESA Project for CryoSat-2 Data Exploitation Over Ocean

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Cotton, D.; Clarizia, M.; Roca, M.; Gommenginger, C. P.; Naeije, M. C.; Labroue, S.; Picot, N.; Fernandes, J.; Andersen, O. B.; Cancet, M.; Dinardo, S.; Lucas, B. M.

    2012-12-01

    The ESA CryoSat-2 mission is the first space mission to carry a radar altimeter that is able to operate both in the conventional pulse-width limited (LRM) mode and in the novel Synthetic Aperture Radar (SAR) mode. Although the prime objective of the CryoSat-2 mission is dedicated to monitoring land and marine ice, the SAR mode capability of the CryoSat-2 SIRAL altimeter also presents the possibility of demonstrating significant potential benefits of SAR altimetry for ocean applications, based on expected performance enhancements which include improved range precision and finer along-track spatial resolution. With this scope in mind, the "CryoSat Plus for Oceans" (CP4O) project, dedicated to the exploitation of CryoSat-2 data over the ocean and supported by the ESA STSE (Support To Science Element) programme, brings together an expert European consortium comprising DTU Space, isardSAT, the National Oceanography Centre, Noveltis, SatOC, Starlab, TU Delft, the University of Porto and CLS (supported by CNES). The objectives of CP4O are: to build a sound scientific basis for new scientific and operational applications of CryoSat-2 data over the open ocean, polar ocean and coastal seas, and for sea-floor mapping; to generate and evaluate new methods and products that will enable the full exploitation of the capabilities of the CryoSat-2 SIRAL altimeter, and extend their application beyond the initial mission objectives; and to ensure that the scientific return of the CryoSat-2 mission is maximised. In particular, four themes will be addressed. Open Ocean Altimetry: combining the GOCE geoid model with CryoSat oceanographic LRM products for the retrieval of a CryoSat MSS/MDT model over open ocean surfaces and for the analysis of prominent mesoscale and large-scale open ocean features; under this priority the project will also foster the exploitation of the finer resolution and higher SNR of the novel CryoSat SAR data to detect short-spatial-scale open ocean features. High Resolution Polar Ocean Altimetry: combining the GOCE geoid model with CryoSat oceanographic SAR products over polar oceans for the retrieval of the CryoSat MSS/MDT and current circulation systems, improving polar tide models and studying the coupling between wind and current patterns. High Resolution Coastal Zone Altimetry: exploiting the finer resolution and higher SNR of the novel CryoSat SAR data to bring radar altimetry closer to the shore, using the SARIn mode to discriminate off-nadir land targets (e.g. steep cliffs) in the radar footprint from the nadir sea return. High Resolution Sea-Floor Altimetry: exploiting the finer resolution and higher SNR of the novel CryoSat SAR data to resolve the weak short-wavelength sea surface signals caused by sea-floor topography and to map uncharted sea-mounts and trenches. One of the first project activities is the consolidation of preliminary scientific requirements for the four themes under investigation. This paper will present the CP4O project content and objectives and will address the first initial results from the on-going work to define the scientific requirements.

  8. The performance of CryoSat-2 as an ocean altimeter

    NASA Astrophysics Data System (ADS)

    Scharroo, R.; Smith, W. H.; Leuliette, E. W.; Lillibridge, J. L.

    2012-12-01

    Two years after the launch of CryoSat-2, oceanographic uses of the CryoSat-2 data have taken off, after several institutes, NOAA included, spent a dedicated effort to upgrade the official CryoSat-2 data products to a level that is suitable for monitoring mesoscale phenomena, as well as wind speed and wave height. In coastal areas, however, this is much less the case, mostly because CryoSat-2 is running in SAR or SARIn mode in many of the focus areas, like the Mediterranean Sea. We have shown, however, that the CryoSat data are intrinsically of high quality, and for over a year now we have been producing "IGDR"-type data through FTP and through RADS. The steps involved include: ● Combine final (LRM) and fast-delivery (FDM) products and split the segmented files into pass files. ● Divide the 369-day repeat cycle into subcycles of 29 or 27 days. ● Retrack the conventional low-rate data to determine range, significant wave height, backscatter (and off-nadir angle). ● Add or replace the usual corrections for ionospheric and atmospheric delays, tides, dynamic atmospheric correction, sea state bias, mean sea surface. ● Update orbits and corrections whenever they become available. This way NOAA produces an "IGDR" product from the fast-delivery FDM data and the CNES MOE orbit about 2 days after real time, and a "GDR" product from the final LRM data and the CNES POE orbit with a delay of about 1 month. In order to extend the data products to the coastal regime, we have developed a process in which the SAR data are first combined into "Pseudo-LRM" or "reduced SAR" waveforms that are similar to the conventional low-rate waveforms. The reduced SAR data are then retracked and combined with the conventional data to form a harmonised product. Although this sounds relatively straightforward, many steps were needed to get this done: ● Combine the SAR waveforms into conventional waveforms, without loss of information. ● Reconstruct backscatter and significant wave height in a meaningful way, consistent with low-rate data. ● Cross-calibrate the conventional and SAR mode data. ● Validate the data quality of conventional and SAR mode data through crossover and collinear track analyses. In this presentation we will demonstrate how the CryoSat-2 data quality compares to other altimeters (Envisat, Jason-1 and Jason-2) by means of data distribution maps, histograms and crossover comparisons.
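
    The "reduced SAR" step described above amounts to detecting the individual SAR echoes and averaging them incoherently so that the result resembles a conventional low-rate waveform. Below is a minimal Python/NumPy sketch of that idea; the array shapes and the number of echoes averaged per waveform are illustrative assumptions, not the NOAA processing settings.

      import numpy as np

      def pseudo_lrm_waveforms(burst_echoes, echoes_per_waveform=20):
          """Incoherently average individual SAR echoes into pseudo-LRM waveforms.

          burst_echoes: complex array of shape (n_echoes, n_range_bins).
          Returns averaged power waveforms of shape (n_waveforms, n_range_bins).
          """
          power = np.abs(burst_echoes) ** 2                    # detect each echo
          n_groups = power.shape[0] // echoes_per_waveform
          power = power[: n_groups * echoes_per_waveform]
          # average consecutive echoes, mimicking conventional on-board averaging
          return power.reshape(n_groups, echoes_per_waveform, -1).mean(axis=1)

      # Example with synthetic data: 200 echoes of 128 range bins
      rng = np.random.default_rng(0)
      echoes = rng.standard_normal((200, 128)) + 1j * rng.standard_normal((200, 128))
      print(pseudo_lrm_waveforms(echoes).shape)                # (10, 128)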

  9. Evaluation of multi-mode CryoSat-2 altimetry data over the Po River against in situ data and a hydrodynamic model

    NASA Astrophysics Data System (ADS)

    Schneider, Raphael; Tarpanelli, Angelica; Nielsen, Karina; Madsen, Henrik; Bauer-Gottwein, Peter

    2018-02-01

    Coverage of in situ observations to monitor surface waters is insufficient on the global scale, and decreasing across the globe. Satellite altimetry has become an increasingly important monitoring technology for continental surface waters. The ESA CryoSat-2 altimetry mission, launched in 2010, has two novel features. (i) The radar altimeter instrument on board CryoSat-2 is operated in three modes, two of which reduce the altimeter footprint by using Delay-Doppler processing. (ii) CryoSat-2 is placed on a distinct orbit with a repeat cycle of 369 days, leading to a drifting ground track pattern. The drifting ground track pattern challenges many common methods of processing satellite altimetry data over rivers. This study evaluates the observation error of CryoSat-2 water level observations over the Po River, Italy, against in situ observations. The average RMSE between CryoSat-2 and in situ observations was found to be 0.38 meters. CryoSat-2 was also shown to be useful for channel roughness calibration in a hydrodynamic model of the Po River. The small across-track spacing of CryoSat-2 ground tracks means that observations are distributed almost continuously along the river. This allowed resolving channel roughness with higher spatial resolution than possible with in situ or virtual-station altimetry data. Despite the Po River being extensively monitored, CryoSat-2 still provides added value thanks to its unique spatio-temporal sampling pattern.
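
    The reported 0.38 m figure is a root-mean-square error between co-located altimetric and in situ water levels. A minimal Python sketch of that metric follows; the sample values are placeholders, not Po River data.

      import numpy as np

      def rmse(altimetry_wl, in_situ_wl):
          """Root-mean-square error between co-located water-level series (metres)."""
          diff = np.asarray(altimetry_wl, float) - np.asarray(in_situ_wl, float)
          return float(np.sqrt(np.mean(diff ** 2)))

      # Placeholder values for illustration only
      print(rmse([12.10, 11.85, 12.42], [11.95, 12.05, 12.20]))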

  10. Generation and evaluation of Cryosat-2 SARIn L1b Interferometric elevation

    NASA Astrophysics Data System (ADS)

    DONG, Y.; Zhang, K.; Liu, Q.; MA, J.; WANG, J.

    2016-12-01

    CryoSat-2 radar altimeter data have been used successfully to map surface elevations of ice caps and ice sheets and to detect changes in surface height in polar areas. The SARIn mode of the Synthetic Aperture Interferometric Radar Altimeter (SIRAL), which works in a way similar to traditional Interferometric Synthetic Aperture Radar (IFSAR), can improve the across- and along-track resolution through an IFSAR processing algorithm. In this study, three L1b Baseline-C SARIn tracks over the Filchner ice shelf are used to generate the location and height of ground points in sloping glacial terrain. The elevation data are mapped and validated against IceBridge Airborne Topographic Mapper (ATM) data acquired on 2 November 2012. The comparison with the ATM data shows a mean difference of -1.91 m with a standard deviation of 4.04 m.
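
    In SARIn mode the phase difference between the two receive antennas gives the across-track angle to the point of closest approach, which is used to relocate the echo and correct its height. The Python sketch below illustrates that geometry under a flat-Earth approximation, ignoring roll; the wavelength and baseline are nominal values and the input numbers are illustrative, so this is not the processing chain used in the study.

      import numpy as np

      # Nominal instrument values (assumptions): Ku-band wavelength and SIRAL baseline
      WAVELENGTH_M = 0.0221        # ~13.575 GHz carrier
      BASELINE_M = 1.17            # antenna baseline

      def sarin_poca(phase_diff_rad, range_m, altitude_m):
          """Relocate the point of closest approach (POCA) from the interferometric phase.

          Flat-Earth sketch, ignoring roll and slant-range refinements:
            look angle          theta = arcsin(lambda * dphi / (2 * pi * B))
            across-track offset       = R * sin(theta)
            surface elevation         = H - R * cos(theta)
          """
          theta = np.arcsin(WAVELENGTH_M * phase_diff_rad / (2.0 * np.pi * BASELINE_M))
          across_track_m = range_m * np.sin(theta)
          elevation_m = altitude_m - range_m * np.cos(theta)
          return across_track_m, elevation_m

      # Illustrative numbers only
      print(sarin_poca(phase_diff_rad=0.5, range_m=730e3, altitude_m=731e3))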

  11. CryoSat-2 SAR and SARin Inland Water Heights from the CRUCIAL project

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Restano, M.; Ambrózio, A.; Moore, P.; Birkinshaw, S.

    2017-12-01

    CRUCIAL was an ESA/STSE funded project investigating innovative land and inland water applications from CryoSat-2 with a forward-look component to the Sentinel-3 and Jason-CS/Sentinel-6 missions. The high along-track sampling of CryoSat-2 in its SAR and SARin modes offers the opportunity to recover high frequency signals over inland waters. A methodology was developed to process the FBR L1A Doppler beams to form a waveform product using ground cell gridding, beam steering and beam stacking. Inland water heights from CryoSat-2 are derived by using a set of empirical retrackers formulated for inland water applications. Results of the processing strategy include a comparison of waveforms and heights from the burst echoes (80 m along-track) and from multi-look waveforms (320 m along-track). SAR and SARin FBR data are available for the Amazon, Brahmaputra and Mekong for 2011-2015. FBR SAR results are compared against stage data from the nearest gauge. Heights from Tonlé Sap are also compared against Jason-2 data from the United States Department of Agriculture. A strategy to select the number of multi-looks over rivers was designed based on the rms of heights across Tonlé Sap. Comparisons include results from the empirical retrackers and from waveforms and heights obtained via ESA's Grid Processing on Demand (G-POD/SARvatore) using the SAMOSA2 retracker. Results of FBR SARin processing for the Amazon and Brahmaputra are presented including comparison of heights from the two antennae, extraction of slope of the ground surface and validation against ground data where appropriate.
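
    Once a retracking gate has been estimated from the waveform, the water surface height follows from the usual altimetric relation (height = satellite altitude minus corrected range). The sketch below is a generic illustration of that step using the nominal CryoSat chirp bandwidth of 320 MHz; the reference gate, corrections and input values are assumptions, and the empirical retrackers developed in CRUCIAL are not reproduced here.

      # Generic reconstruction of water surface height from a retracked gate.
      C = 299_792_458.0                      # speed of light, m/s
      BANDWIDTH_HZ = 320e6                   # nominal CryoSat chirp bandwidth
      RANGE_BIN_M = C / (2 * BANDWIDTH_HZ)   # ~0.47 m per range bin

      def water_surface_height(altitude_m, window_delay_range_m, retracked_gate,
                               reference_gate=64, corrections_m=0.0):
          """Height = altitude - (range to reference gate + retracker offset) - corrections."""
          range_m = window_delay_range_m + (retracked_gate - reference_gate) * RANGE_BIN_M
          return altitude_m - range_m - corrections_m

      # Illustrative numbers only
      print(water_surface_height(730_120.0, 730_000.0, retracked_gate=66.3))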

  12. Improved Oceanographic Measurements with CryoSat SAR Altimetry: Applications to the Coastal Zone and Arctic

    NASA Astrophysics Data System (ADS)

    Cotton, D.; Garcia, P. N.; Cancet, M.; Andersen, O.; Stenseng, L.; Martin, F.; Cipollini, P.; Calafat, F. M.; Passaro, M.; Restano, M.; Ambrozio, A.; Benveniste, J.

    2016-08-01

    The ESA CryoSat-2 mission is the first space mission to carry a radar altimeter that can operate in Synthetic Aperture Radar (SAR) mode. Although the prime objective of the CryoSat-2 mission is dedicated to monitoring land and marine ice, the SAR mode capability of the CryoSat-2 SIRAL altimeter also presents significant potential benefits for ocean applications, including improved range precision and finer along-track spatial resolution. The "CryoSat Plus for Oceans" (CP4O) project, supported by the ESA Support to Science Element (STSE) Programme and by CNES, was dedicated to the exploitation of CryoSat-2 data over the open and coastal ocean. The general objectives of the CP4O project were: to build a sound scientific basis for new oceanographic applications of CryoSat-2 data; to generate and evaluate new methods and products that will enable the full exploitation of the capabilities of the CryoSat-2 SIRAL altimeter; and to ensure that the scientific return of the CryoSat-2 mission is maximised. Cotton et al. (2015) is the final report on this work. However, whilst the results from CP4O were highly promising and confirmed the potential of SAR altimetry to support new scientific and operational oceanographic applications, it was also apparent that further work was needed in some key areas to fully realise the original project objectives. Thus additional work in four areas has been supported by ESA under a Contract Change Notice: • Developments in SARin data processing for Coastal Altimetry (isardSAT). • Implementation of a Regional Tidal Atlas for the Arctic Ocean (Noveltis and DTU Space). • Improvements to the SAMOSA re-tracker: Implementation and Evaluation - Optimised Thermal Noise Estimation (Starlab and SatOC). • Extended evaluation of CryoSat-2 SAR data for Coastal Applications (NOC). This work was managed by SatOC. The results of this work are summarized here. Detailed information regarding the CP4O project can be found at: http://www.satoc.eu/projects/CP4O/

  13. Estimation of Arctic Sea Ice Freeboard and Thickness Using CryoSat-2

    NASA Astrophysics Data System (ADS)

    Lee, S.; Im, J.; Kim, J. W.; Kim, M.; Shin, M.

    2014-12-01

    Arctic sea ice is one of the significant components of the global climate system, as it plays a significant role in driving global ocean circulation. Sea ice extent has constantly declined since the 1980s, and Arctic sea ice thickness has also been diminishing along with the decreasing extent. Extent and thickness, the two main characteristics of sea ice, are important indicators of the polar response to on-going climate change. Sea ice thickness has been measured with numerous field techniques such as surface drilling and deploying buoys. These techniques provide sparse and discontinuous data in the spatiotemporal domain. Spaceborne radar and laser altimeters can overcome these limitations and have been used to estimate sea ice thickness. The Ice, Cloud and land Elevation Satellite (ICESat), a laser altimetry mission, provided data to detect polar elevation change between 2003 and 2009. CryoSat-2, launched in April 2010 with the SAR/Interferometric Radar Altimeter (SIRAL), provides data to estimate a time series of Arctic sea ice thickness. In this study, Arctic sea ice freeboard and thickness between 2011 and 2014 were estimated using CryoSat-2 SAR and SARIn mode data that contain sea ice surface height relative to the WGS84 reference ellipsoid. In order to estimate sea ice thickness, the freeboard, i.e., the elevation difference between the top of the sea ice surface and the local sea surface, should be calculated. Freeboard can be estimated by detecting leads. We proposed a novel lead detection approach. CryoSat-2 waveform parameters such as pulse peakiness, backscatter sigma-0, stack standard deviation, skewness and kurtosis were examined to distinguish leads from sea ice. Near-real-time cloud-free MODIS images corresponding to the CryoSat-2 measurements were used to visually identify leads. Rule-based machine learning approaches such as See5.0 and random forest were used to identify leads. The proposed lead detection approach distinguished leads from sea ice better than the existing approaches. With the freeboard height calculated using the lead detection approach, sea ice thickness was finally estimated using Archimedes' buoyancy principle. The estimated sea ice freeboard and thickness were validated using ESA airborne Ku-band interferometric radar and Airborne Electromagnetic (AEM) data.
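
    The final conversion from freeboard to thickness rests on hydrostatic (Archimedes) equilibrium. A minimal Python sketch is given below; the density values are typical literature assumptions rather than the parameters of this study, and the formula assumes the freeboard refers to the ice (snow-ice interface) surface with a known snow depth.

      # Hydrostatic (Archimedes) conversion from ice freeboard to thickness.
      RHO_WATER = 1024.0   # kg/m^3, typical literature value (assumption)
      RHO_ICE = 915.0      # kg/m^3, typical literature value (assumption)
      RHO_SNOW = 320.0     # kg/m^3, typical literature value (assumption)

      def sea_ice_thickness(ice_freeboard_m, snow_depth_m):
          """Thickness from ice freeboard and snow depth under hydrostatic equilibrium:
          rho_i*T + rho_s*h_s = rho_w*(T - F_i)  =>  T = (rho_w*F_i + rho_s*h_s)/(rho_w - rho_i)."""
          return (RHO_WATER * ice_freeboard_m + RHO_SNOW * snow_depth_m) / (RHO_WATER - RHO_ICE)

      print(sea_ice_thickness(0.30, 0.20))   # ~3.4 m for these illustrative inputs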

  14. Operational CryoSat Product Quality Assessment

    NASA Astrophysics Data System (ADS)

    Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine

    2013-12-01

    The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.

  15. The 2012 Arctic Field Season of the NRL Sea-Ice Measurement Program

    NASA Astrophysics Data System (ADS)

    Gardner, J. M.; Brozena, J. M.; Hagen, R. A.; Liang, R.; Ball, D.

    2012-12-01

    The U.S. Naval Research Laboratory (NRL) is beginning a five-year study of the changing Arctic with a particular focus on ice thickness and distribution variability, with the intent of optimizing state-of-the-art computer models which are currently used to predict sea ice changes. An important part of our study is to calibrate/validate CryoSat-2 ice thickness data prior to its incorporation into new ice forecast models. NRL Code 7420 collected data coincident with the CryoSat-2 satellite in both 2011 and 2012, using a LiDAR (Riegl Q560) to measure combined snow and ice thickness and a 10 GHz pulse-limited precision radar altimeter to measure sea-ice freeboard. These measurements were coordinated with the Seasonal Ice Zone Observing Network (SIZONet) group, who conducted surface-based ice thickness surveys using a Geonics EM-31 along hunter trails on the landfast ice near Barrow as well as on drifting ice offshore during helicopter landings. On two sorties, a Twin Otter carrying the NRL LiDAR and radar altimeter flew in tandem with the helicopter carrying the EM-31 to achieve synchronous data acquisition. Data from these flights are shown here along with a digital elevation map. The LiDAR and radar altimeter were also flown on grid patterns over the ice that were synchronous with 5 CryoSat-2 satellite passes. These grids were intended to cover roughly 10 km long segments of CryoSat-2 tracks with widths similar to the footprint of the satellite (~2 km). Reduction of these grids is challenging because of ice drift, which can amount to many hundreds of meters over the 1-2 hour collection period of each grid. Relocation of the individual scanning LiDAR tracks is done by means of tie-points observed in the overlapping swaths. Data from these grids are shown here and will be used to examine the relationship of the tracked satellite waveform data to the actual surface across the footprint.

  16. The 2013 Arctic Field Season of the NRL Sea-Ice Measurement Program

    NASA Astrophysics Data System (ADS)

    Gardner, J. M.; Brozena, J. M.; Ball, D.; Hagen, R. A.; Liang, R.; Stoudt, C.

    2013-12-01

    The U.S. Naval Research Laboratory (NRL) is conducting a five-year study of the changing Arctic with a particular focus on ice thickness and distribution variability, with the intent of optimizing state-of-the-art computer models which are currently used to predict sea ice changes. An important part of our study is to calibrate/validate CryoSat-2 ice thickness data prior to its incorporation into new ice forecast models. NRL Code 7420 collected data coincident with the CryoSat-2 satellite in 2011 and 2012, using a LiDAR (Riegl Q560) to measure combined snow and ice thickness and a 10 GHz pulse-limited precision radar altimeter to measure sea-ice freeboard. This field season, LiDAR data were collected using the Riegl Q680, which permitted higher-density operation and data collection. Coincident radar data were collected using an improved version of the NRL 10 GHz pulse-limited radar that was used for the 2012 fieldwork. Eight coincident tracks of CryoSat-2 satellite data were collected. Additionally, a series of grids (7 in total) of adjacent tracks were flown coincident with CryoSat-2 satellite overpasses. These grids cover the approximate footprint of the satellite on the ice as it passes overhead. Data from these grids are shown here and will be used to examine the relationship of the tracked satellite waveform data to the actual surface across the footprint. We also coordinated with the Seasonal Ice Zone Observing Network (SIZONet) group, who conducted surface-based ice thickness surveys using a Geonics EM-31 along hunter trails on the landfast ice near Barrow as well as on drifting ice offshore during helicopter landings. On two sorties, a Twin Otter carrying the NRL LiDAR and radar altimeter flew in tandem with the helicopter carrying the EM-31 to achieve synchronous data acquisition. Data from these flights are shown here along with a digital elevation map.

  17. CryoSat-2 Processing and Model Interpretation of Greenland Ice Sheet Volume Changes

    NASA Astrophysics Data System (ADS)

    Nilsson, J.; Gardner, A. S.; Sandberg Sorensen, L.

    2015-12-01

    CryoSat-2 was launched in late 2010, tasked with monitoring the changes of the Earth's land and sea ice. It carries a novel radar altimeter allowing the satellite to monitor changes in highly complex terrain, such as smaller ice caps, glaciers and the marginal areas of the ice sheets. Here we present the development and validation of an independent elevation retrieval processing chain, and the respective elevation changes, based on ESA's L1B data. Overall we find a large improvement in both accuracy and precision over Greenland relative to ESA's L2 product when comparing against both airborne data and crossover analysis. The seasonal component and spatial sampling of the surface elevation changes were also compared against ICESat-derived changes from 2003-2009. The comparison showed good agreement between the two products on a local scale. However, a global sampling bias was detected in the seasonal signal due to the clustering of CryoSat-2 data in higher-elevation areas. The retrieval processing chain presented here does not correct for changes in surface scattering conditions and appears to be insensitive to the 2012 melt event (Nilsson et al., 2015). This is in contrast to the elevation changes derived from ESA's L2 elevation product, which were found to be sensitive to the effects of the melt event. The positive elevation bias created by the event introduced a discrepancy between the two products with a magnitude of roughly 90 km3/year. This difference can be directly attributed to the differences in retracking procedure, pointing to the importance of the retracking of the radar waveforms for altimetric volume change studies. Reference: Nilsson, J., Vallelonga, P. T., Simonsen, S. B., Sørensen, L. S., Forsberg, R., Dahl-Jensen, D., Hirabayashi, M., Goto-Azuma, K., Hvidberg, C. S., Kjær, H. A., and Satow, K. (2015): Greenland 2012 melt event effects on CryoSat-2 radar altimetry.

  18. CryoSat Ice Processor: High-Level Overview of Baseline-C Data and Quality-Control

    NASA Astrophysics Data System (ADS)

    Mannan, R.; Webb, E.; Hall, A.; Bouffard, J.; Femenias, P.; Parrinello, T.; Bouffard, J.; Brockley, D.; Baker, S.; Scagliola, M.; Urien, S.

    2016-08-01

    Since April 2015, the CryoSat ice products have been generated with the new Baseline-C Instrument Processing Facilities (IPFs). This represents a major upgrade to the CryoSat ice IPFs and is the baseline for the second CryoSat Reprocessing Campaign. Baseline-C introduces major evolutions with respect to Baseline-B, most notably the release of freeboard data within the L2 SAR products, following optimisation of the SAR retracker. Additional L2 improvements include a new Arctic Mean Sea Surface (MSS) in SAR; a new tuneable land ice retracker in LRM; and a new Digital Elevation Model (DEM) in SARIn. At L1B, new attitude fields have been introduced and existing datation and range biases reduced. This paper provides a high-level overview of the changes and evolutions implemented at Baseline-C in order to improve CryoSat L1B and L2 data characteristics and exploitation over polar regions. An overview of the main Quality Control (QC) activities performed on operational Baseline-C products is also presented.

  19. Evaluation of CryoSat-2 SARIn vs. SAR Arctic Sea Ice Freeboard

    NASA Astrophysics Data System (ADS)

    Di Bella, A.; Skourup, H.; Forsberg, R.

    2017-12-01

    The Earth's climate is a complex system whose behaviour is dictated by the interaction among many components. Sea ice, one of these fundamental components, interacts directly with the oceans and the atmosphere, playing an important role in defining heat exchange processes and thus impacting weather patterns on a global scale. Sea ice thickness estimates have improved notably in the last couple of decades; however, the uncertainty of such estimates is still significant. For the past 7 years, the ESA CryoSat-2 (CS2) mission has provided a unique opportunity to observe the polar regions due to its extended coverage up to 88° N/S. The SIRAL radar altimeter on board CS2 enables the sea ice community to estimate sea ice thickness by measuring the sea ice freeboard. Studies by Armitage and Davidson [2014] and Di Bella et al. [submitted] showed that the interferometric capabilities of SIRAL can be used to retrieve an increased number of valid sea surface heights in sea-ice-covered regions and thus reduce the random uncertainty of the estimated freeboards, especially in areas with a sparse lead distribution. This study focuses on the comparison between sea ice freeboard estimates obtained by processing L1B SARIn data inside the Wingham box - an area in the Arctic Ocean where SIRAL has acquired SARIn data for 4 years - and those obtained by processing L1B SAR data in the area surrounding the box. This comparison evaluates CS2 performance on Arctic sea ice from a statistical perspective by analysing the continuity of freeboard estimates in areas where SIRAL switches between SAR and SARIn acquisition modes. Data collected during the Operation IceBridge and CryoVEx field campaigns are included in the study as an additional validation. Besides investigating the possibility of including the phase information from SIRAL in currently available freeboard estimates, these results provide valuable information for a possible SARIn CryoSat follow-on mission.

  20. SAR Altimetry Processing on Demand Service for CryoSat-2 and Sentinel-3 at ESA G-POD

    NASA Astrophysics Data System (ADS)

    Dinardo, Salvatore; Lucas, Bruno; Benveniste, Jerome

    2015-12-01

    The scope of this work is to feature the new ESA service (SARvatore) for the exploitation of CryoSat-2 data, designed and developed entirely by the Altimetry Team at ESA-ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The G-POD service SARvatore (SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation) for CryoSat-2 is a web platform that provides the capability to process CryoSat-2 SAR/SARin data on-line and on-demand, from L1a (FBR) data products up to SAR/SARin Level-2 geophysical data products. The processor makes use of the G-POD (Grid Processing On Demand) distributed computing platform to deliver the output data products in a timely manner. These output data products are generated in standard NetCDF format (using the CF convention) and are compatible with BRAT (Basic Radar Altimetry Toolbox) and other NetCDF tools. Using the G-POD graphic interface, it is easy to select the geographical area of interest along with the time frame of interest, based on the availability of CryoSat-2 SAR/SARin FBR data products in the service's catalogue. After task submission, users can follow the status of the processing task in real time. The processor prototype is versatile in the sense that users can customize and adapt the processing according to their specific requirements by setting a list of configurable options. The processing service is meant to be used for research and development experiments, to support the development contracts awarded by ESA and their deliverables, for on-site demonstrations and training in courses and workshops, for cross-comparison against third-party products (CLS/CNES CPP products, for instance), for preparation for the Sentinel-3 Topographic mission, for producing data and graphics for publications, etc. So far, the processing has been designed and optimized for open-ocean studies and is fully functional only over this kind of surface, but there are plans to augment this processing capacity over coastal zones, inland waters and land, in view of maximizing the exploitation of the upcoming Sentinel-3 Topographic mission over all surfaces. The service is open and free of charge.

  1. Assimilation of CryoSat-2 altimetry to a hydrodynamic model of the Brahmaputra river

    NASA Astrophysics Data System (ADS)

    Schneider, Raphael; Nygaard Godiksen, Peter; Ridler, Marc-Etienne; Madsen, Henrik; Bauer-Gottwein, Peter

    2016-04-01

    Remote sensing provides valuable data for the parameterization and updating of hydrological models, for example water level measurements of inland water bodies from satellite radar altimeters. Satellite altimetry data from repeat-orbit missions such as Envisat, ERS or Jason have been used in many studies, as have synthetic wide-swath altimetry data of the kind expected from the SWOT mission. This study is one of the first hydrologic applications of altimetry data from a drifting-orbit satellite mission, namely CryoSat-2. CryoSat-2 is equipped with the SIRAL instrument, a new type of radar altimeter similar to SRAL on Sentinel-3. CryoSat-2 SARIn Level 2 data are used to improve a 1D hydrodynamic model of the Brahmaputra river basin in South Asia set up in the DHI MIKE 11 software. CryoSat-2 water levels were extracted over river masks derived from Landsat imagery. After discharge calibration, simulated water levels were fitted to the CryoSat-2 data along the Assam valley by adapting cross section shapes and datums. The resulting hydrodynamic model shows an accurate spatio-temporal representation of water levels, which is a prerequisite for real-time model updating by assimilation of CryoSat-2 altimetry or multi-mission data in general. For this task, a data assimilation framework has been developed and linked with the MIKE 11 model. It is a flexible framework that can assimilate water level data which are arbitrarily distributed in time and space. Different types of error models, data assimilation methods, etc. can easily be used and tested. Furthermore, it is possible to update not only the water level of the hydrodynamic model, but also the states of the rainfall-runoff models providing the forcing of the hydrodynamic model. The setup has been used to assimilate CryoSat-2 observations over the Assam valley for the years 2010 to 2013. Different data assimilation methods and localizations were tested, together with different model error representations. Furthermore, the impact of different filtering and clustering methods and error descriptions of the CryoSat-2 observations was evaluated. The performance improvement in terms of discharge and water level forecasts due to the assimilation of satellite altimetry data was then evaluated. The model forecasts were also compared to climatology and persistence forecasts. Using ensemble-based filters, the evaluation was done not only based on performance criteria for the central forecast such as the root-mean-square error (RMSE) and the Nash-Sutcliffe model efficiency (NSE), but also based on the sharpness, reliability and continuous ranked probability score (CRPS) of the ensemble of probabilistic forecasts.
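
    The deterministic and probabilistic scores named above are standard; a minimal Python sketch of the Nash-Sutcliffe efficiency and of an empirical ensemble CRPS is shown below with placeholder numbers (not results from this study).

      import numpy as np

      def nse(sim, obs):
          """Nash-Sutcliffe efficiency of a deterministic (central) forecast."""
          sim, obs = np.asarray(sim, float), np.asarray(obs, float)
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def crps_ensemble(ensemble, obs):
          """Empirical CRPS of an ensemble forecast for a single observation:
          E|X - y| - 0.5 * E|X - X'|."""
          ens = np.asarray(ensemble, float)
          return np.mean(np.abs(ens - obs)) - 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))

      # Placeholder values for illustration only
      print(nse([2.1, 3.0, 4.2], [2.0, 3.1, 4.0]))
      print(crps_ensemble([2.0, 2.3, 2.6, 1.9], obs=2.2))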

  2. The Concept Of A Potential Operational CryoSat Follow-on Mission

    NASA Astrophysics Data System (ADS)

    Cullen, R.

    2016-12-01

    CryoSat was planned as a 3-year mission with clear mission objectives: to allow the assessment of rates of change of thickness in the land and marine ice fields with reduced uncertainties relative to other, non-dedicated missions. Although CryoSat suffered a launch failure in October 2005, the mission was recovered with the launch of CryoSat-2 in April 2010. The nominal mission has now been completed, all mission requirements have been fulfilled, and CryoSat has been shown to be most successful as a dedicated polar ice sheet measurement system, as demonstrated by nearly 200 peer-reviewed publications within the first four years after launch. Following the completion of the nominal mission in October 2013, the platform was shown to be in good health and, with scientific backing provided by the ESA Earth Science Advisory Committee (ESAC), the mission has been extended until February 2017 by the ESA Programme Board for Earth Observation. Though not designed to provide data for science and operational services beyond its original mission requirements, a number of services have been developed for exploitation and these are expected to increase over the next few years. The services cover a number of aspects of the land and marine ice fields, in addition to complementary activities covering glacier monitoring, inland water, and coastal and open-ocean surface topography science, in which CryoSat has demonstrated world-leading advances. This paper will present the overall concept for a potential low-cost continuation of the CryoSat mission, with the objective of providing continuity of the existing CryoSat-based data sets, i.e., longer-term science and operational services that cannot be provided by the existing Copernicus complement of satellites. This is, in part, due to the high-inclination (92°) drifting orbit and the state-of-the-art Synthetic Aperture Interferometric Radar Altimeter (SIRAL). In addition, further improvements in performance are expected from the use of improved modes of operation over land and marine ice fields as well as the open and coastal ocean. The mission could also provide complementary data for global ocean services. With the current planning, a consolidation phase has taken place in 2016, expected to be followed by a potential preparation phase in 2017, a start to Phase C/D implementation in 2018 and a launch in the 2021 timeframe.

  3. A Potential Operational CryoSat Follow-on Mission Concept and Design

    NASA Astrophysics Data System (ADS)

    Cullen, R.

    2015-12-01

    CryoSat was planned as a 3-year mission with clear mission objectives: to allow the assessment of rates of change of thickness in the land and marine ice fields with reduced uncertainties relative to other, non-dedicated missions. Although CryoSat suffered a launch failure in October 2005, the mission was recovered with the launch of CryoSat-2 in April 2010. The nominal mission has now been completed, all mission requirements have been fulfilled, and CryoSat has been shown to be most successful as a dedicated polar ice sheet measurement system, as demonstrated by nearly 200 peer-reviewed publications within the first four years after launch. Following the completion of the nominal mission in October 2013, the platform was shown to be in good health and, with scientific backing provided by the ESA Earth Science Advisory Committee (ESAC), the mission has been extended until February 2017 by the ESA Programme Board for Earth Observation. Though not designed to provide data for science and operational services beyond its original mission requirements, a number of services have been developed for exploitation and these are expected to increase over the next few years. The services cover a number of aspects of the land and marine ice fields, in addition to complementary activities covering glacier monitoring, inland water, and coastal and open-ocean surface topography science, in which CryoSat has demonstrated world-leading advances. This paper will present the overall concept for a potential low-cost follow-on to the CryoSat mission, with the objective of providing continuity of the existing CryoSat-based data sets, i.e., longer-term science and operational services that cannot be provided by the existing Copernicus complement of satellites. This is, in part, due to the high-inclination (92°) drifting orbit and the state-of-the-art Synthetic Aperture Interferometric Radar Altimeter (SIRAL). In addition, further improvements in performance are expected from the use of the instrument timing and digital hardware developments used in the Sentinel-6/Jason-CS Poseidon-4 design. It is expected that the mission will also provide data for global ocean services complementary to those of the Sentinel-3 and Sentinel-6 missions. With the current planning, the development of the potential follow-on mission is expected to commence during 2016, with a launch in the 2021 time frame.

  4. CryoSat-2 altimetry derived Arctic bathymetry map: first results and validation

    NASA Astrophysics Data System (ADS)

    Andersen, O. B.; Abulaitijiang, A.; Cancet, M.; Knudsen, P.

    2017-12-01

    DTU Space at the Technical University of Denmark (DTU) has been developing high-quality, high-resolution gravity fields, including the new, highly accurate CryoSat-2 radar altimetry data, which extend the global coverage of altimetry data up to latitude 88°. With its Synthetic Aperture Radar (SAR) mode operating throughout the Arctic Ocean, returns from leads, i.e., openings in the sea ice that expose the ocean surface, are used to retrieve the sea surface height with centimetre-level range precision. Combined with the long repeat cycle (369 days) and hence dense cross-track coverage, the high-resolution Arctic marine gravity field can be modelled using CryoSat-2 altimetry. Further, the polar gap can be filled with the available ArcGP product, thus yielding complete coverage for the Arctic bathymetry map. In this presentation, we make use of the most recent DTU17 marine gravity model to derive an Arctic bathymetry map using an inversion constrained by the best available hydrographic maps. With the support of ESA, a recent evaluation of existing Arctic Ocean bathymetry models (RTopo, GEBCO, IBCAO, etc.) was carried out; various inconsistencies were identified and rectified prior to performing the inversion using altimetry. At the same time, DTU Space has placed great effort into Arctic data screening, filtering and de-noising using various altimetry retracking solutions and classifications. All this pre-processing contributed to the fine modelling of the Arctic gravity map. Thereafter, the Arctic marine gravity grids are translated (via a downward-continuation operation) into a new, altimetry-enhanced Arctic bathymetry map using appropriate band-pass filtering.

  5. Cross-calibrating ALES Envisat and CryoSat-2 Delay-Doppler: A coastal altimetry study in the Indonesian Seas

    NASA Astrophysics Data System (ADS)

    Passaro, Marcello; Dinardo, Salvatore; Quartly, Graham D.; Snaith, Helen M.; Benveniste, Jérôme; Cipollini, Paolo; Lucas, Bruno

    2016-08-01

    A regional cross-calibration between the first Delay-Doppler altimetry dataset from CryoSat-2 and a retracked Envisat dataset is here presented, in order to test the benefits of the Delay-Doppler processing and to expand the Envisat time series in the coastal ocean. The Indonesian Seas are chosen for the calibration, since the availability of altimetry data in this region is particularly beneficial due to the lack of in situ measurements and its importance for global ocean circulation. The Envisat data in the region are retracked with the Adaptive Leading Edge Subwaveform (ALES) retracker, which has been previously validated and applied successfully to coastal sea level research. The study demonstrates that CryoSat-2 is able to decrease the 1-Hz noise of sea level estimations by 0.3 cm within 50 km of the coast, when compared to the ALES-reprocessed Envisat dataset. It also shows that Envisat can be confidently used for detailed oceanographic research after the orbit change of October 2010. Cross-calibration at the crossover points indicates that in the region of study a sea state bias correction equal to 5% of the significant wave height is an acceptable approximation for Delay-Doppler altimetry. The analysis of the joint sea level time series reveals the geographic extent of the semiannual signal caused by Kelvin waves during the monsoon transitions, the larger amplitudes of the annual signal due to the Java Coastal Current and the impact of the strong La Niña event of 2010 on rising sea level trends.

  6. Improved Oceanographic Measurements from SAR Altimetry: Results and Scientific Roadmap from the ESA Cryosat Plus for Oceans Project

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Cotton, D.; Andersen, O. B.; Boy, F.; Cancet, M.; Dinardo, S.; Gommenginger, C.; Egido, A.; Fernandes, J.; Garcia, P. N.; Lucas, B.; Moreau, T.; Naeije, M.; Scharroo, R.; Stenseng, L.

    2014-12-01

    The ESA CryoSat mission is the first space mission to carry a radar altimeter that can operate in Synthetic Aperture Radar (SAR) mode. It thus provides the first opportunity to test and evaluate, using real data, the significant potential benefits of SAR altimetry for ocean applications. The objective of the CryoSat Plus for Oceans (CP4O) project is to develop and evaluate new ocean products from CryoSat data and so maximize the scientific return of CryoSat over oceans. The main focus of CP4O has been on the additional measurement capabilities that are offered by the SAR mode of the SIRAL altimeter, with further work in developing improved geophysical corrections. CP4O has developed SAR based ocean products for application in four themes: Open Oceans, Coastal Oceans, Polar Oceans and Sea Floor Topography. The team has developed a number of new processing schemes and compared and evaluated the resultant data products. This work has clearly demonstrated the improved ocean measuring capability offered by SAR mode altimetry and has also added significantly to our understanding of the issues around the processing and interpretation of SAR altimeter echoes. The project finishes in the summer of 2014, so this paper presents an overview of the major results and outlines a proposed roadmap for the further development and exploitation of these results in operational and scientific applications. The results are of course also highly relevant to support the planning for future missions, including Sentinel-3 and Jason-CS. The "CryoSat Plus for Oceans" (CP4O) project has been supported by ESA (Support To Science Element) and CNES.

  7. A New High Resolution Tidal Model in the Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Cancet, M.; Andersen, O.; Lyard, F.; Schulz, A.; Cotton, D.; Benveniste, J.

    2016-08-01

    The Arctic Ocean is a challenging region for tidal modelling. The accuracy of the global tidal models decreases by several centimeters in the Polar Regions, which has a large impact on the quality of the satellite altimeter sea surface heights and the altimetry-derived products. NOVELTIS and DTU Space have developed a regional, high-resolution tidal atlas in the Arctic Ocean, in the framework of an extension of the CryoSat Plus for Oceans (CP4O) ESA STSE (Support to Science Element) project. In particular, this atlas benefits from the assimilation of the most complete satellite altimetry dataset ever used in this region, including Envisat data up to 82°N and CryoSat-2 data between 82°N and 88°N. The combination of these satellite altimetry missions gives the best possible coverage of altimetry-derived tidal constituents. The available tide gauge data were also used for data assimilation and validation. This paper presents the implementation methodology and the performance of this new regional tidal model in the Arctic Ocean, compared to the existing global tidal models.

  8. High resolution tidal modeling in the Arctic Ocean: needs and upcoming developments

    NASA Astrophysics Data System (ADS)

    Cancet, Mathilde; Baltazar Andersen, Ole; Cotton, David; Lyard, Florent; Benveniste, Jerome

    2015-04-01

    The Arctic Ocean is a challenging region for tidal modeling, because of its complex and not well-documented bathymetry, combined with the intermittent presence of sea ice and the fact that in situ tidal observations are rather scarce at high latitudes. As a consequence, the accuracy of the global tidal models decreases by several centimeters in the Polar Regions. This in turn degrades the quality of the satellite altimeter sea surface heights in these regions (from ERS-1/2, Envisat, CryoSat-2, SARAL/AltiKa and the future Sentinel-3 mission). Better knowledge of the tides would improve the quality of the high-latitude altimeter sea surface heights and of all derived products, such as the altimetry-derived geostrophic currents, the mean sea surface and the mean dynamic topography. In addition, accurate tidal models are highly strategic information for the ever-growing maritime and industrial activities in this region. NOVELTIS and DTU Space are currently working on the development of a regional, high-resolution tidal atlas in the Arctic Ocean. In particular, this atlas will benefit from the assimilation of the most complete satellite altimetry dataset ever used in this region, including Envisat and SARAL/AltiKa data up to 82°N and the CryoSat-2 reprocessed data between 82°N and 88°N. The combination of all these satellites will give the best possible coverage of altimetry-derived tidal constituents. The available tide gauge data will also be used either for assimilation or for validation. This paper presents the deficiencies and needs of the global tidal models in the Arctic Ocean as identified using the CryoSat altimetry data, and the on-going work to develop an improved regional tidal atlas in this region.

  9. A new high resolution tidal model in the Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Cancet, Mathilde; Andersen, Ole; Lyard, Florent; Cotton, David; Benveniste, Jérôme

    2016-04-01

    The Arctic Ocean is a challenging region for tidal modeling, because of its complex and not well-documented bathymetry, combined with the intermittent presence of sea ice and the fact that in situ tidal observations are scarce at such high latitudes. As a consequence, the accuracy of the global tidal models decreases by several centimeters in the Polar Regions. This has a large impact on the quality of the satellite altimeter sea surface heights in these regions (ERS-1/2, Envisat, CryoSat-2, SARAL/AltiKa and the future Sentinel-3 mission), but also on the end-users' applications that need accurate tidal information. Better knowledge of the tides will improve the quality of the high-latitude altimeter sea surface heights and of all derived products, such as the altimetry-derived geostrophic currents, the mean sea surface and the mean dynamic topography. In addition, accurate tidal models are highly strategic information for the ever-growing maritime and industrial activities in this region. NOVELTIS and DTU Space have recently developed a regional, high-resolution tidal atlas in the Arctic Ocean, in the framework of an extension of the CryoSat Plus for Oceans (CP4O) project funded by ESA (STSE programme). In particular, this atlas benefits from the assimilation of the most complete satellite altimetry dataset ever used in this region, including the Envisat data up to 82°N and the CryoSat-2 reprocessed data between 82°N and 88°N. The combination of all these satellites gives the best possible coverage of altimetry-derived tidal constituents. Tide gauge data have also been used either for assimilation or for validation. This paper presents the methodology followed to develop the model and the performance of this new regional tidal model in the Arctic Ocean.

  10. The Research on Elevation Change of Antarctic Ice Sheet Based on CRYOSAT-2 Altimeter

    NASA Astrophysics Data System (ADS)

    Sun, Q.; Wan, J.; Liu, S.; Li, Y.

    2018-04-01

    In this paper, CryoSat-2 altimeter data distributed by ESA are processed to extract information on the elevation change of the Antarctic ice sheet from 2010 to 2017. Firstly, the main preprocessing applied to the CryoSat-2 altimetry data consists of crossover adjustment and the elimination of gross errors. Then a gridded DEM of the Antarctic ice sheet was constructed using the kriging interpolation method, and the spatial and temporal characteristics of the Antarctic ice sheet were analysed. A latitude-weighted mean elevation can be obtained from the elevation data of each cycle, from which the general trend of the Antarctic ice sheet elevation variation can be seen.
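
    The abstract does not define the latitude weighting; for a regular latitude-longitude grid a common choice is area weighting by the cosine of latitude, sketched below in Python with toy values (an illustrative assumption, not necessarily the authors' scheme).

      import numpy as np

      def latitude_weighted_mean(elev_grid, lats_deg):
          """Area-weighted mean of a regular lat-lon elevation grid (NaN = no data).
          Cells on a regular lat-lon grid shrink with cos(latitude), hence the weights."""
          weights = np.cos(np.deg2rad(lats_deg))[:, None] * np.ones_like(elev_grid)
          valid = ~np.isnan(elev_grid)
          return np.nansum(elev_grid * weights * valid) / np.sum(weights * valid)

      # Toy grid: 3 latitude bands x 4 longitude cells (metres)
      grid = np.array([[100.0, 101.0, 99.0, np.nan],
                       [200.0, 201.0, 199.0, 200.0],
                       [300.0, 299.0, 301.0, 300.0]])
      print(latitude_weighted_mean(grid, np.array([-80.0, -75.0, -70.0])))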

  11. Enhancing the Arctic Mean Sea Surface and Mean Dynamic Topography with CryoSat-2 Data

    NASA Astrophysics Data System (ADS)

    Stenseng, Lars; Andersen, Ole B.; Knudsen, Per

    2014-05-01

    A reliable mean sea surface (MSS) is essential to derive a good mean dynamic topography (MDT) and for the estimation of short- and long-term changes in the sea surface. The lack of satellite radar altimetry observations above 82 degrees latitude means that existing mean sea surface models have been unreliable in the Arctic Ocean. We here present the latest DTU mean sea surface and mean dynamic topography models, which include CryoSat-2 data to improve the reliability in the Arctic Ocean. In an attempt to extrapolate across the gap above 82 degrees latitude, previous models included ICESat data, gravimetric geoids, ocean circulation models and various combinations thereof. Unfortunately, cloud cover and the short periods of operation have a negative effect on the number of ICESat sea surface observations. DTU13MSS and DTU13MDT are the new generation of state-of-the-art global high-resolution models that include CryoSat-2 data to extend the satellite radar altimetry coverage up to 88 degrees latitude. Furthermore, the SAR and SARin capability of CryoSat-2 dramatically increases the amount of useable sea surface returns in sea-ice-covered areas compared to conventional radar altimeters like ENVISAT and ERS-1/2. With the inclusion of CryoSat-2 data, the new mean sea surface is improved by more than 20 cm above 82 degrees latitude compared with the previous generation of mean sea surfaces.

  12. An Improved Cryosat-2 Sea Ice Freeboard Retrieval Algorithm Through the Use of Waveform Fitting

    NASA Technical Reports Server (NTRS)

    Kurtz, Nathan T.; Galin, N.; Studinger, M.

    2014-01-01

    We develop an empirical model capable of simulating the mean echo power cross product of CryoSat-2 SAR and SARIn mode waveforms over sea-ice-covered regions. The model simulations are used to show the importance of variations in the radar backscatter coefficient with incidence angle and surface roughness for the retrieval of surface elevation of both sea ice floes and leads. The numerical model is used to fit CryoSat-2 waveforms to enable retrieval of surface elevation through the use of look-up tables and a bounded trust-region Newton least-squares fitting approach. The use of a model to fit returns from sea ice regions offers advantages over currently used threshold retracking methods, which are here shown to be sensitive to the combined effect of bandwidth-limited range resolution and surface roughness variations. Laxon et al. (2013) have compared ice thickness results from CryoSat-2 and IceBridge and found good agreement; however, consistent assumptions about the snow depth and density of sea ice were not used in the comparisons. To address this issue, we directly compare ice freeboard and thickness retrievals from the waveform fitting and threshold retracker methods of CryoSat-2 to Operation IceBridge data using a consistent set of parameterizations. For three IceBridge campaign periods from March 2011-2013, mean differences (CryoSat-2 minus IceBridge) of 0.144 m and 1.351 m are respectively found between the freeboard and thickness retrievals using a 50% sea ice floe threshold retracker, while mean differences of 0.019 m and 0.182 m are found when using the waveform fitting method. This suggests the waveform fitting technique is capable of better reconciling the sea ice thickness data record from laser and radar altimetry data sets through the use of consistent physical assumptions.
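
    For reference, a threshold retracker of the kind discussed above places the retracking gate where the leading edge first crosses a chosen fraction (here 50%) of the peak power above the noise floor. The Python sketch below is a generic illustration, not the exact CryoSat-2 baseline implementation; the noise-window length and toy waveform are assumptions.

      import numpy as np

      def threshold_retrack(power, fraction=0.5, noise_bins=10):
          """Gate where the leading edge first crosses noise + fraction*(peak - noise),
          with linear sub-bin interpolation."""
          p = np.asarray(power, dtype=float)
          noise = p[:noise_bins].mean()
          level = noise + fraction * (p.max() - noise)
          i = int(np.argmax(p >= level))        # first bin at or above the level
          if i == 0:
              return 0.0
          return (i - 1) + (level - p[i - 1]) / (p[i] - p[i - 1])

      # Toy waveform: noise floor, sharp leading edge near bin 40, decaying tail
      wf = np.concatenate([np.full(40, 0.02),
                           np.linspace(0.02, 1.0, 8),
                           np.exp(-np.linspace(0.0, 4.0, 80))])
      print(threshold_retrack(wf))              # ~43.5 for this toy waveform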

  13. Combining Envisat type and CryoSat-2 altimetry to inform hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Schneider, Raphael; Nygaard Godiksen, Peter; Villadsen, Heidi; Madsen, Henrik; Bauer-Gottwein, Peter

    2015-04-01

    Hydrological models are developed and used for flood forecasting and water resources management. Such models rely on a variety of input and calibration data. In general, and especially in data-scarce areas, remote sensing provides valuable data for the parameterization and updating of such models. Satellite radar altimeters provide water level measurements of inland water bodies. So far, many studies making use of satellite altimeters have been based on data from repeat-orbit missions such as Envisat, ERS or Jason, or on synthetic wide-swath altimetry data as expected from the SWOT mission. This work represents one of the first hydrologic applications of altimetry data from a drifting-orbit satellite mission, using data from CryoSat-2. We present an application where CryoSat-2 data are used to improve a hydrodynamic model of the Ganges and Brahmaputra river basins in South Asia set up in the DHI MIKE 11 software. The model's parameterization and forcing are mainly based on remote sensing data, for example the TRMM 3B42 precipitation product and the SRTM DEM for river and subcatchment delineation. CryoSat-2 water levels were extracted over a river mask derived from Landsat 7 and 8 imagery. After calibrating the hydrological-hydrodynamic model against observed discharge, simulated water levels were fitted to the CryoSat-2 data, with a focus on the Brahmaputra river in the Assam valley: the average simulated water level in the hydrodynamic model was fitted to the average water level along the river's course as observed by CryoSat-2 over the years 2011-2013 by adjusting the river bed elevation. In a second step, the cross section shapes were adjusted so that the simulated water level dynamics matched those obtained from Envisat virtual station time series. The discharge calibration resulted in Nash-Sutcliffe coefficients of 0.86 and 0.94 for the Ganges and Brahmaputra. Using the Landsat river mask, the CryoSat-2 water levels show consistency along the river and are in good agreement with other products, such as the SRTM DEM. The adjusted hydrodynamic model reproduced the average water level profile along the river channel with a higher accuracy than a model based on the SRTM DEM. Furthermore, the amplitudes observed in Envisat virtual station time series could be reproduced by fitting simple triangular cross section shapes. A hydrodynamic model prepared in such a way provides water levels at any point along the river and any point in time, which are consistent with the multi-mission altimetric dataset. This means it can, for example, be updated by assimilation of near-real-time water level measurements from CryoSat-2, improving its flood forecasting capability.

  14. Trends in Arctic Sea Ice Volume 2010-2013 from CryoSat-2

    NASA Astrophysics Data System (ADS)

    Tilling, R.; Ridout, A.; Wingham, D.; Shepherd, A.; Haas, C.; Farrell, S. L.; Schweiger, A. J.; Zhang, J.; Giles, K.; Laxon, S.

    2013-12-01

    Satellite records show a decline in Arctic sea ice extent over the past three decades with a record minimum in September 2012, and results from the Pan-Arctic Ice-Ocean Modelling and Assimilation System (PIOMAS) suggest that this has been accompanied by a reduction in volume. We use three years of measurements recorded by the European Space Agency CryoSat-2 (CS-2) mission, validated with in situ data, to generate estimates of seasonal variations and inter-annual trends in Arctic sea ice volume between 2010 and 2013. The CS-2 estimates of sea ice thickness agree with in situ estimates derived from upward looking sonar measurements of ice draught and airborne measurements of ice thickness and freeboard to within 0.1 metres. Prior to the record minimum in summer 2012, autumn and winter Arctic sea ice volume had fallen by ~1300 km3 relative to the previous year. Using the full 3-year period of CS-2 observations, we estimate that winter Arctic sea ice volume has decreased by ~700 km3/yr since 2010, approximately twice the average rate since 1980 as predicted by the PIOMAS.
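
    The inter-annual figure quoted above is a linear trend through the seasonal volume record; a minimal Python sketch of such a least-squares trend fit is shown below, with synthetic winter-mean values chosen only for illustration (not the CS-2 record).

      import numpy as np

      # Synthetic winter-mean Arctic sea ice volumes (km^3), illustration only
      years = np.array([2010.9, 2011.9, 2012.9])
      volume_km3 = np.array([16500.0, 15200.0, 15100.0])

      slope, intercept = np.polyfit(years, volume_km3, deg=1)   # km^3 per year
      print(f"linear trend: {slope:.0f} km^3/yr")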

  15. Enhanced Arctic Mean Sea Surface and Mean Dynamic Topography including retracked CryoSat-2 Data

    NASA Astrophysics Data System (ADS)

    Andersen, O. B.; Jain, M.; Stenseng, L.; Knudsen, P.

    2014-12-01

    A reliable mean sea surface (MSS) is essential to derive a good mean dynamic topography (MDT) and for the estimation of short- and long-term changes in the sea surface. The lack of satellite radar altimetry observations above 82 degrees latitude means that existing mean sea surface models have been unreliable in the Arctic Ocean. We here present the latest DTU mean sea surface and mean dynamic topography models, combining conventional altimetry with retracked CryoSat-2 data to improve the reliability in the Arctic Ocean. For the derivation of the mean dynamic topography, the ESA GOCE-derived geoid model has been used to constrain the longer wavelengths. We present the retracking of CryoSat-2 SAR data using various retrackers and how we have been able to combine data from the various retrackers under different sea ice conditions. DTU13MSS and DTU13MDT are the newest state-of-the-art global high-resolution models including CryoSat-2 data to extend the satellite radar altimetry coverage up to 88 degrees latitude and, through combination with a GOCE geoid model, to complete the coverage all the way to the North Pole. Furthermore, the SAR and SARin capability of CryoSat-2 dramatically increases the amount of useable sea surface returns in sea-ice-covered areas compared to conventional radar altimeters like ENVISAT and ERS-1/2. With the inclusion of CryoSat-2 data, the new mean sea surface is improved by more than 20 cm above 82 degrees latitude compared with the previous generation of mean sea surfaces.

  16. Informing a hydrological model of the Ogooué with multi-mission remote sensing data

    NASA Astrophysics Data System (ADS)

    Kittel, Cecile M. M.; Nielsen, Karina; Tøttrup, Christian; Bauer-Gottwein, Peter

    2018-02-01

    Remote sensing provides a unique opportunity to inform and constrain a hydrological model and to increase its value as a decision-support tool. In this study, we applied a multi-mission approach to force, calibrate and validate a hydrological model of the ungauged Ogooué river basin in Africa with publicly available and free remote sensing observations. We used a rainfall-runoff model based on the Budyko framework coupled with a Muskingum routing approach. We parametrized the model using the Shuttle Radar Topography Mission digital elevation model (SRTM DEM) and forced it using precipitation from two satellite-based rainfall estimates, FEWS-RFE (Famine Early Warning System rainfall estimate) and the Tropical Rainfall Measuring Mission (TRMM) 3B42 v.7, and temperature from ECMWF ERA-Interim. We combined three different datasets to calibrate the model using an aggregated objective function with contributions from (1) historical in situ discharge observations from the period 1953-1984 at six locations in the basin, (2) radar altimetry measurements of river stages by Envisat and Jason-2 at 12 locations in the basin and (3) GRACE (Gravity Recovery and Climate Experiment) total water storage change (TWSC). Additionally, we extracted CryoSat-2 observations throughout the basin using a Sentinel-1 SAR (synthetic aperture radar) imagery water mask and used the observations for validation of the model. The use of new satellite missions, including Sentinel-1 and CryoSat-2, increased the spatial characterization of river stage. Throughout the basin, we achieved good agreement between observed and simulated discharge and the river stage, with an RMSD between simulated and observed water amplitudes at virtual stations of 0.74 m for the TRMM-forced model and 0.87 m for the FEWS-RFE-forced model. The hydrological model also captures overall total water storage change patterns, although the amplitude of storage change is generally underestimated. By combining hydrological modeling with multi-mission remote sensing from 10 different satellite missions, we obtain new information on an otherwise unstudied basin. The proposed model is the best current baseline characterization of hydrological conditions in the Ogooué in light of the available observations.
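
    The routing component mentioned above follows the classical Muskingum scheme; a minimal Python sketch is given below, where the travel-time parameter K, the weighting factor X, the time step and the inflow series are illustrative assumptions rather than the authors' calibrated values.

      def muskingum_route(inflow, k_hours, x, dt_hours, outflow0=None):
          """Classical Muskingum channel routing: O[t+1] = C0*I[t+1] + C1*I[t] + C2*O[t]."""
          denom = 2.0 * k_hours * (1.0 - x) + dt_hours
          c0 = (dt_hours - 2.0 * k_hours * x) / denom
          c1 = (dt_hours + 2.0 * k_hours * x) / denom
          c2 = (2.0 * k_hours * (1.0 - x) - dt_hours) / denom
          outflow = [inflow[0] if outflow0 is None else outflow0]
          for t in range(len(inflow) - 1):
              outflow.append(c0 * inflow[t + 1] + c1 * inflow[t] + c2 * outflow[t])
          return outflow

      # Routing a small flood wave (m^3/s) with illustrative parameters
      print(muskingum_route([100, 300, 700, 500, 300, 200, 150, 120],
                            k_hours=12.0, x=0.2, dt_hours=6.0))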

  17. The Sentinel-3 Surface Topography Mission (S-3 STM): Level 2 SAR Ocean Retracker

    NASA Astrophysics Data System (ADS)

    Dinardo, S.; Lucas, B.; Benveniste, J.

    2015-12-01

    The SRAL radar altimeter, on board the ESA Sentinel-3 (S-3) mission, has the capacity to operate either in the Pulse-Limited Mode (also known as LRM) or in the novel Synthetic Aperture Radar (SAR) mode. Thanks to the initial SAR altimetry results obtained by exploiting CryoSat-2 data, the interest of the scientific community in this new technology has lately increased significantly, and consequently the definition of accurate processing methodologies (along with validation strategies) has assumed capital importance. In this paper, we present the algorithm proposed to retrieve the standard ocean geophysical parameters (ocean topography, wave height and sigma nought) from S-3 STM SAR return waveforms, together with the validation results achieved so far by exploiting CryoSat-2 data as well as simulated data. The inversion method (retracking) used to extract the geophysical information from the return waveform is a curve best-fitting scheme based on the bounded Levenberg-Marquardt Least-Squares Estimation Method (LEVMAR-LSE). The S-3 STM SAR ocean retracking algorithm adopts, as the return-waveform model, the “SAMOSA” model [Ray et al., 2014], named after the R&D project SAMOSA (led by Satoc and funded by ESA) in which it was initially developed. The SAMOSA model is a physically-based model that offers a complete description of a SAR altimeter return waveform from the ocean surface, expressed in the form of maps of reflected power in Delay-Doppler space (also known as a stack) or expressed as multilooked echoes. SAMOSA is able to account for an elliptical antenna pattern, mispointing errors in roll and yaw, the surface scattering pattern, non-linear ocean wave statistics and spherical Earth surface effects. In spite of its truly comprehensive character, the SAMOSA model comes with a compact analytical formulation expressed in terms of modified Bessel functions. The specifications of the retracking algorithm have been gathered in a technical document (DPM) and delivered as the baseline for industrial implementation. For operational needs, thanks to fine tuning of the fitting library parameters and the use of a look-up table for the Bessel function computation, the CPU execution time was reduced by more than a factor of 100, making execution on par with real time. In the course of the ESA-funded project CryoSat+ for Ocean (CP4O), new technical evolutions of the algorithm have been proposed (such as the use of a PTR width look-up table and the application of a stack masking). One of the main outcomes of the CP4O project was that, with these latest evolutions, the SAMOSA SAR retracking gives results equivalent to the CNES CPP retracking prototype, which was built with a completely different approach, reinforcing the validation results. Work is currently underway to align the industrial implementation with the latest evolutions. Further, in order to test the algorithm with a dataset as realistic as possible, a set of simulated test data (generated by the S-3 STM End-to-End Simulator) has been created by CLS following the specifications described in a test data set requirements document drafted by ESA. In this work, we will show the baseline algorithm details, the evolutions, the impact of the evolutions and the results obtained by processing the CryoSat-2 data and the simulated test data set.
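
    As a rough illustration of the retracking step described above, the sketch below fits a generic parametric waveform model to a multilooked echo with a bounded least-squares solver. The simple error-function waveform used here is only a stand-in for the full SAMOSA formulation, the parameter bounds and initial guess are hypothetical, and SciPy's bounded solver uses a trust-region-reflective scheme rather than the bounded Levenberg-Marquardt method named in the abstract; the structure of the inversion is, however, the same.

    ```python
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.special import erf

    def waveform_model(gates, epoch, sigma_c, amplitude):
        """Stand-in ocean waveform model: error-function leading edge with a
        slowly decaying trailing edge (the operational retracker uses SAMOSA)."""
        leading_edge = 0.5 * (1.0 + erf((gates - epoch) / (np.sqrt(2.0) * sigma_c)))
        trailing_edge = np.exp(-0.005 * np.clip(gates - epoch, 0.0, None))
        return amplitude * leading_edge * trailing_edge

    def retrack(waveform):
        """Fit epoch (range), composite sigma (wave-height proxy) and amplitude
        (sigma0 proxy) to one waveform with a bounded least-squares solver."""
        gates = np.arange(waveform.size, dtype=float)
        residuals = lambda p: waveform_model(gates, *p) - waveform
        x0 = [waveform.size / 2.0, 2.0, waveform.max()]            # initial guess
        bounds = ([0.0, 0.1, 0.0], [waveform.size, 50.0, np.inf])  # hypothetical
        fit = least_squares(residuals, x0, bounds=bounds)
        return dict(zip(["epoch_gate", "sigma_c", "amplitude"], fit.x))
    ```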

  18. SAR Altimetry Processing on Demand Service for Cryosat-2 and Sentinel-3 at ESA G-Pod

    NASA Astrophysics Data System (ADS)

    Dinardo, Salvatore; Benveniste, Jérôme; Ambrózio, Américo; Restano, Marco

    2016-07-01

    The G-POD SARvatore service for the exploitation of CryoSat-2 data was designed and developed by the Altimetry Team at ESA-ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The G-POD service, coined SARvatore (SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation), is a web platform that allows any scientist to process CryoSat-2 SAR/SARin data on-line, on demand and with a user-selectable configuration, from L1a (FBR) data products up to SAR/SARin Level-2 geophysical data products. The processor takes advantage of the G-POD (Grid Processing On Demand) distributed computing platform (350 CPUs in ~70 working nodes) to deliver output data products in a timely manner and to interface with the ESA-ESRIN FBR data archive (155'000 SAR passes and 41'000 SARin passes). The output data products are generated in standard NetCDF format (using the CF Convention), therefore being compatible with the Multi-Mission Radar Altimetry Toolbox (BRAT) and other NetCDF tools. Using the G-POD graphical interface, it is straightforward to select a geographical area of interest within the time frame of CryoSat-2 SAR/SARin FBR data product availability in the service catalogue. The processor prototype is versatile, allowing users to customize and adapt the processing according to their specific requirements by setting a list of configurable options. After task submission, users can follow the status of the processing in real time; the processing can be lengthy due to the intensive number-crunching inherent in SAR processing. From the web interface, users can choose to generate experimental SAR data products such as stack data and RIP (Range Integrated Power) waveforms. The processing service, initially developed to support the awarded development contracts by comparing the deliverables against ESA's prototype, is now made available to the worldwide SAR altimetry community for research and development experiments, for on-site demonstrations and training in courses and workshops, for cross-comparison with third-party products (e.g. CLS/CNES CPP or ESA SAR COP data products), for the preparation of the Sentinel-3 Surface Topography Mission, for producing data and graphics for publications, etc. Initially, the processing was designed and uniquely optimized for open-ocean studies. It was based on the SAMOSA model developed for the Sentinel-3 Ground Segment using CryoSat data (Cotton et al., 2008; Ray et al., 2014). However, since June 2015, a new retracker (SAMOSA+) is offered within the service as a dedicated retracker for the coastal zone, inland water and sea ice/ice sheets. In view of the Sentinel-3 launch, a new flavor of the service will be initiated, exclusively dedicated to the processing of Sentinel-3 mission data products. The scope of this new service will be to maximize the exploitation of the upcoming Sentinel-3 Surface Topography Mission's data over all surfaces. The service is open and free of charge (supported by the ESA SEOM Programme Element) for worldwide scientific applications and is available at https://gpod.eo.esa.int/services/CRYOSAT_SAR/

  19. Sea ice thickness derived from radar altimetry: achievements and future plans

    NASA Astrophysics Data System (ADS)

    Ricker, R.; Hendricks, S.; Paul, S.; Kaleschke, L.; Tian-Kunze, X.

    2017-12-01

    The retrieval of Arctic sea ice thickness is one of the major objectives of the European CryoSat-2 radar altimeter mission, and its 7-year period of operation has produced an unprecedented record of monthly sea ice thickness information. We present CryoSat-2 results that show changes and variability of Arctic sea ice from the winter season 2010/2011 until fall 2017. CryoSat-2, however, was designed to observe thick perennial sea ice, while an accurate retrieval of thin seasonal sea ice is more challenging. We have therefore developed a method of completing and improving Arctic sea ice thickness information within the ESA SMOS+ Sea Ice project by merging CryoSat-2 and SMOS sea ice thickness retrievals. Using these satellite missions together overcomes several issues of single-mission retrievals and provides a more accurate and comprehensive view of the state of Arctic sea-ice thickness at higher temporal resolution. In addition, stand-alone CryoSat-2 observations can be used as reference data for the exploitation of older pulse-limited radar altimetry data sets over sea ice. In order to observe trends in sea ice thickness, it is necessary to minimize inter-mission biases between subsequent satellite missions. Within the ESA Climate Change Initiative (CCI) on Sea Ice, a climate data record of sea ice thickness derived from satellite radar altimetry has been developed for both hemispheres, based on 15 years (2002-2017) of monthly retrievals from Envisat and CryoSat-2 and calibrated over the 2010-2012 overlap period. The next step in promoting the utilization of sea ice thickness information from radar altimetry is to provide products through a service that meets the requirements of climate applications and operational systems. This task will be pursued within a Copernicus Climate Change Service (C3S) project. This framework also aims to include additional sensors, such as the altimeter on board Sentinel-3, and we will show first results of Sentinel-3 Arctic sea-ice thickness. These developments are the basis for preserving the continuity of the sea ice thickness data record and for the transformation from research-oriented products into an operational service.
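
    Conceptually, the merging of the two retrievals is a weighted combination in which each product contributes according to its uncertainty, SMOS being more reliable over thin ice and CryoSat-2 over thick ice. The sketch below shows a minimal inverse-variance weighting of two gridded fields; the operational merged product is based on an optimal-interpolation scheme, so this is only an illustration of the weighting idea.

    ```python
    import numpy as np

    def merge_thickness(t_cs2, sigma_cs2, t_smos, sigma_smos):
        """Inverse-variance weighted merge of two gridded sea-ice thickness fields.

        t_*     : thickness estimates [m] (NaN where no retrieval is available)
        sigma_* : corresponding uncertainty estimates [m]
        """
        w_cs2 = 1.0 / np.square(sigma_cs2)
        w_smos = 1.0 / np.square(sigma_smos)
        # Where only one product is valid, fall back to that product alone.
        merged = np.where(np.isnan(t_cs2), t_smos,
                 np.where(np.isnan(t_smos), t_cs2,
                          (w_cs2 * t_cs2 + w_smos * t_smos) / (w_cs2 + w_smos)))
        return merged
    ```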

  20. The CryoSat Interferometer after 6 years in orbit: calibration and achievable performance

    NASA Astrophysics Data System (ADS)

    Scagliola, Michele; Fornari, Marco; De Bartolomei, Maurizio; Bouffard, Jerome; Parrinello, Tommaso

    2016-04-01

    The main payload of CryoSat is a Ku-band pulse-width-limited radar altimeter, called SIRAL (Synthetic Aperture Interferometric Radar Altimeter). When commanded in SARIn (synthetic aperture radar interferometry) mode, through coherent along-track processing of the returns received from two antennas, the interferometric phase related to the first arrival of the echo is used to retrieve the angle of arrival of the scattering in the across-track direction. In fact, the across-track echo direction can be derived by exploiting the precise knowledge of the baseline vector (i.e. the vector between the phase centres of the two antennas) and simple geometry. The end-to-end calibration strategy for the CryoSat interferometer consists of in-orbit calibration campaigns following the approach described in [1]. From the beginning of the CryoSat mission, interferometer calibration campaigns have been performed about once a year by rolling the spacecraft left and right by about ±0.4 deg. This abstract presents our analysis of the calibration parameters and of the achievable performance of the CryoSat interferometer over the 6 years of the mission. Additionally, some further studies have been performed to assess the accuracy of the roll angle computed on ground as a function of the aberration correction (aberration being the apparent displacement of a celestial object from its true position, caused by the relative motion of the observer and the object) applied to the attitude quaternions provided by the star tracker mounted on board. Since the roll information is crucial to obtain an accurate estimate of the angle of arrival, the data from the interferometer calibration campaigns have been used to verify how the application of the aberration correction affects the roll information and, in turn, the measured angle of arrival. [1] Galin, N.; Wingham, D.J.; Cullen, R.; Fornari, M.; Smith, W.H.F.; Abdalla, S., "Calibration of the CryoSat-2 Interferometer and Measurement of Across-Track Ocean Slope," IEEE Transactions on Geoscience and Remote Sensing, vol. 51, no. 1, pp. 57-72, Jan. 2013
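
    The across-track angle of arrival follows from the measured interferometric phase, the Ku-band wavelength and the baseline length, with the spacecraft roll then added to refer the angle to nadir. The sketch below illustrates this geometry; the wavelength and baseline values are indicative assumptions, not calibration results.

    ```python
    import numpy as np

    def angle_of_arrival(phase_rad, roll_rad,
                         wavelength=0.0221,   # approx. Ku-band wavelength [m]
                         baseline=1.17):      # approx. SIRAL baseline length [m]
        """Across-track angle of arrival from the interferometric phase.

        phase_rad : unwrapped interferometric phase of the first echo arrival [rad]
        roll_rad  : platform roll angle from the star trackers [rad]
        Returns the angle of the scattering point with respect to nadir [rad].
        """
        angle_wrt_baseline = np.arcsin(wavelength * phase_rad /
                                       (2.0 * np.pi * baseline))
        return angle_wrt_baseline + roll_rad   # roll refers the angle to nadir
    ```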

  1. Airborne Grid Sea-Ice Surveys for Comparison with CryoSat-2

    NASA Astrophysics Data System (ADS)

    Brozena, J. M.; Gardner, J. M.; Liang, R.; Hagen, R. A.; Ball, D.

    2014-12-01

    The U.S. Naval Research Laboratory is engaged in a study of the changing Arctic with a particular focus on ice thickness and distribution variability. The purpose is to optimize computer models used to predict sea ice changes. An important part of our study is to calibrate/validate CryoSat-2 ice thickness data prior to its incorporation into new ice forecast models. The large footprint of the CryoSat-2 altimeter over sea-ice is a significant issue in any attempt to ground-truth the data. Along-track footprints are reduced to ~ 300 m by SAR processing of the returns. However, the cross-track footprint is determined by the topography of the surface. Further, the actual return is the sum of the returns from individual reflectors within the footprint, making it difficult to interpret the return and to optimize the waveform tracker. We therefore collected a series of grids of airborne scanning lidar and nadir-pointing radar on sub-satellite tracks over sea-ice that would extend far enough cross-track to capture the illuminated area. One difficulty in the collection of grids composed of adjacent overlapping tracks is that the ice moves as much as 300 m over the duration of a single track (~ 10 min). With a typical lidar swath width of 500 m, we needed to adjust the survey tracks in near real-time for the ice motion. This was accomplished by a photogrammetric method of ice velocity determination (RTIME) reported in another presentation. Post-processing refinements resulted in typical track-to-track miss-ties of ~ 1-2 m, much of which could be attributed to ice deformation over the period of the survey. An important factor is that we were able to reconstruct the ice configuration at the time of the satellite overflight, resulting in an accurate representation of the surface illuminated by CryoSat-2. Our intention is to develop a model of the ice surface from the lidar grid that includes both snow and ice, using the radar profiles to determine snow thickness. In 2013 a set of 6 usable grids, 5-20 km wide (cross-track) by 10-30 km long, was collected north of Barrow, AK. In 2014 a further 5 narrower grids (~5 km) were collected. Data from these grids are shown here and will be used to examine the relationship of the tracked satellite waveform data to the actual surface.

  2. Retrieving improved multi-temporal CryoSat elevations over ice caps and glaciers - a case study of Barnes ice cap

    NASA Astrophysics Data System (ADS)

    Nilsson, Johan; Burgess, David

    2014-05-01

    The CryoSat mission was launched in 2010 to observe the Earth's cryosphere. In contrast to previous satellite radar altimeters, this mission is expected to monitor the elevation of small ice caps and glaciers, which according to the IPCC will be the largest contributor to 21st century sea level rise. To date the ESA CryoSat SARiN level-2 (L2) elevation product is not yet fully optimized for use over these types of glaciated regions, as it is processed with a more universal algorithm. The aim of this study is thus to demonstrate that, with improved processing, CryoSat SARiN data can be used for more accurate topography mapping and elevation change detection for ice caps and glaciers. To demonstrate this, elevations and elevation changes over Barnes ice cap, located on Baffin Island in the Canadian Arctic, have been estimated from the available data from the years 2010-2013. ESA's CryoSat level-1b (L1b) SARiN baseline "B" data product was used and processed in-house to estimate surface elevations. The resulting product is referred to as DTU-L2. The processing focused on improving the retracker, reducing phase noise and correcting phase ambiguities. The accuracy of the DTU-L2 and the ESA-L2 products was determined by comparing the measured elevations against NASA's IceBridge Airborne Topographic Mapper (ATM) elevations from May 2011, and the resulting difference in accuracy was determined by comparing their associated errors. From the multi-temporal measurements spanning the period 2010-2013, elevation changes were estimated and compared to ICESat-derived changes from 2003-2009. The results of the study show good agreement between the NASA ATM elevations and the DTU-L2 data. They also show that the pattern of elevation change is similar to that derived from ICESat data. The accuracy of the DTU-L2 elevations is on average several times better than that of the ESA-L2 elevation product. These preliminary results demonstrate that CryoSat elevation data, using improved processing, can be used for accurate topographic mapping and elevation change detection on ice caps and glaciers. Future work would entail extending this processing to other regions of this type to support these results.

  3. A Review Of CryoSat-2/SIRAL Applications For The Monitoring Of River Water Levels

    NASA Astrophysics Data System (ADS)

    Bercher, Nicolas; Dinardo, Salvatore; Lucas, Bruno Manuel; Fleury, Sara; Calmant, Stephane; Femenias, Pierre; Boy, Francois; Picot, Nicolas; Benveniste, Jerome

    2013-12-01

    Regarding hydrology applications, and particularly the monitoring of river water levels from space, the CryoSat-2 ice mission has two main valuable characteristics: (1) its geodetic orbit and (2) the altimeter's SAR and SARin modes. The benefits of the geodetic orbit of the satellite were illustrated in the frame of the "20 years of progress in radar altimetry" symposium (Venice, 2012) [2]. It was shown that, with such an orbit, the way river water level was monitored using conventional altimeters had to be revisited. In particular, using LRM mode only, CryoSat-2 allowed us to build spatio-temporal time series of river water level, to map the rivers' topography and eventually to derive pseudo-time series and pseudo-profiles of the river. This paper focuses on the new ways to use altimetry for the monitoring of river water levels. The SAR and SARin modes of SIRAL (the CryoSat-2 altimeter) have the ability to deliver surface heights with an unprecedented along-track resolution of about 300 m. Moreover, using the SARin mode (involving the satellite's two antennas), the cross-track angle of the retracked echo is also routinely available. These two aspects of the SARin mode (high resolution and cross-track angle) make it a new tool to distinguish whether the retracked echo came from the surface of interest (e.g. a river) or from any other reflective object near the surface of interest (e.g. another river section, lakes, temporary lakes after flooding events or any other specular surfaces). We introduce the multiple benefits of using the intermediate multi-look matrix (also known as the stack matrix), among them: (1) to refine and select among the multiple Doppler-beam waveforms before averaging and retracking them, and (2) to be able to study the surfaces' response according to their view angle. Custom products processed at ESA (ESRIN) by Dinardo et al. [7], in the perspective of Sentinel-3, as well as official CryoSat-2 L1b and L2 products, were used to illustrate these perspectives. The paper mainly introduces the potential new applications brought by SIRAL's SAR and SARin modes. Finally, combined with its very dense geodetic orbit, CryoSat-2 can be seen as a topography mission that paves the way toward the SWOT mission.

  4. Development of a multi-sensor elevation time series pole-ward of 86°S in support of altimetry validation and ice sheet mass balance studies

    NASA Astrophysics Data System (ADS)

    Studinger, M.; Brunt, K. M.; Casey, K.; Medley, B.; Neumann, T.; Manizade, S.; Linkswiler, M. A.

    2015-12-01

    In order to produce a cross-calibrated long-term record of ice-surface elevation change for input into ice sheet models and mass balance studies, it is necessary to "link the measurements made by airborne laser altimeters, satellite measurements of ICESat, ICESat-2, and CryoSat-2" [IceBridge Level 1 Science Requirements, 2012] and to determine the biases and the spatial variations between radar altimeters and laser altimeters using different wavelengths. The convergence zones of all ICESat tracks (86°S) and all ICESat-2 and CryoSat-2 tracks (88°S) are in regions of relatively low accumulation, making them ideal for satellite altimetry calibration. In preparation for ICESat-2 validation, the IceBridge and ICESat-2 science teams have designed IceBridge data acquisitions around 86°S and 88°S. Several aspects need to be considered when comparing and combining elevation measurements from different radar and laser altimeters, including: a) footprint size and spatial sampling pattern; b) accuracy and precision of each data set; c) varying signal penetration into the snow; and d) changes in geodetic reference frames over time, such as the International Terrestrial Reference Frame (ITRF). The presentation will focus on the analysis of several IceBridge flights around 86 and 88°S with the LVIS and ATM airborne laser altimeters and will evaluate the accuracy and precision of these data sets. To properly interpret the observed elevation change (dh/dt) as mass change, however, the various processes that control surface elevation fluctuations must be quantified; future work will therefore quantify the spatial variability in snow accumulation rates pole-ward of 86°S and in particular around 88°S. Our goal is to develop a cross-validated multi-sensor time series of surface elevation change pole-ward of 86°S that, in combination with measured accumulation rates, will support ICESat-2 calibration and validation and ice sheet mass balance studies.

  5. Analysis of Waveform Retracking Methods in Antarctic Ice Sheet Based on CRYOSAT-2 Data

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Li, F.; Zhang, S.; Hao, W.; Yuan, L.; Zhu, T.; Zhang, Y.; Zhu, C.

    2017-09-01

    Satellite altimetry plays an important role in many geoscientific and environmental studies of the Antarctic ice sheet. The ranging accuracy is degraded near coasts or over non-ocean surfaces due to waveform contamination. A post-processing technique, known as waveform retracking, can be used to retrack the corrupted waveforms and in turn improve the ranging accuracy. In 2010, the CryoSat-2 satellite was launched with the Synthetic Aperture Interferometric Radar Altimeter (SIRAL) on board. Satellite altimetry waveform retracking methods are discussed in this paper. Six retracking methods, including the OCOG method, the threshold method with 10%, 25% and 50% threshold levels, and the linear and exponential 5-β parametric methods, are used to retrack CryoSat-2 waveforms over the transect from Zhongshan Station to Dome A. The results show that the threshold retracker performs best when both the waveform retracking success rate and the RMS of the retracking range corrections are considered. The linear 5-β parametric retracker gives the best waveform retracking precision, but cannot make full use of the waveform data.
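
    The OCOG and threshold retrackers compared above are simple amplitude-based schemes: OCOG estimates a waveform amplitude from the samples' centre of gravity, and the threshold retracker locates the first gate whose power exceeds a chosen fraction of that amplitude. A minimal sketch of both steps is shown below; noise-floor handling and sub-gate interpolation details vary between implementations.

    ```python
    import numpy as np

    def ocog_amplitude(waveform):
        """OCOG amplitude estimate from the waveform power samples."""
        p = np.asarray(waveform, dtype=float)
        return np.sqrt(np.sum(p**4) / np.sum(p**2))

    def threshold_retrack(waveform, threshold=0.5):
        """Threshold retracker: gate where the power first exceeds
        threshold * OCOG amplitude, with linear sub-gate interpolation."""
        p = np.asarray(waveform, dtype=float)
        level = threshold * ocog_amplitude(p)
        above = np.nonzero(p >= level)[0]
        if above.size == 0:
            return np.nan                     # retracking failure
        k = above[0]
        if k == 0:
            return 0.0
        # linear interpolation between gates k-1 and k
        return (k - 1) + (level - p[k - 1]) / (p[k] - p[k - 1])
    ```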

  6. Estimation of Arctic Sea Ice Freeboard and Thickness Using CryoSat-2

    NASA Astrophysics Data System (ADS)

    Lee, Sanggyun; Im, Jungho; yoon, Hyeonjin; Shin, Minso; Kim, Miae

    2014-05-01

    Arctic sea ice is one of the significant components of the global climate system, as it plays a significant role in driving global ocean circulation, provides a continuous insulating layer at the air-sea interface, and reflects a large portion of the incoming solar radiation in the polar regions. Sea ice extent has constantly declined since the 1980s. On 16 September 2012 its area reached the lowest value recorded since the satellite record began in 1979. Arctic sea ice thickness has also been diminishing along with the decreasing sea ice extent. Because extent and thickness, two main characteristics of sea ice, are important indicators of the polar response to on-going climate change, there has been a great effort to quantify them using various approaches. Sea ice thickness has been measured with numerous field techniques such as surface drilling and deploying buoys. These techniques provide sparse and discontinuous data in the spatiotemporal domain. Spaceborne radar and laser altimeters can overcome these limitations and have been used to estimate sea ice thickness. The Ice, Cloud and land Elevation Satellite (ICESat), a laser altimeter from the National Aeronautics and Space Administration (NASA), provided data to detect polar elevation change between 2003 and 2009. CryoSat-2, launched in April 2010 with the Synthetic Aperture Radar (SAR)/Interferometric Radar Altimeter (SIRAL) on board, can provide data to estimate a time series of Arctic sea ice thickness. In this study, Arctic sea ice freeboard and thickness in 2012 and 2013 were estimated using CryoSat-2 SAR mode data, which give the sea ice surface height relative to the WGS84 reference ellipsoid. In order to estimate sea ice thickness, the freeboard height, i.e. the elevation difference between the top of the sea ice surface and leads, must first be calculated. CryoSat-2 waveform parameters such as pulse peakiness, backscatter sigma-0, number of echoes, and significant wave height were examined to distinguish leads from sea ice. Several near-real-time cloud-free MODIS images coincident with the CryoSat-2 data were used to identify leads. Rule-based machine learning approaches such as random forest and See5.0, as well as human-derived decision trees, were used to produce rules to identify leads. With the freeboard height calculated from the lead analysis, sea ice thickness was finally estimated using Archimedes' buoyancy principle with the densities of sea ice and sea water and the freeboard height. The results were compared with the Arctic sea ice thickness distribution retrieved from CryoSat-2 data by the Alfred Wegener Institute.
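
    The freeboard-to-thickness conversion referred to above rests on hydrostatic (Archimedes) equilibrium: the buoyancy of the submerged ice balances the weight of the ice and its snow load. A minimal sketch is given below; the bulk densities are commonly used illustrative values, not those adopted in the study.

    ```python
    def ice_thickness_from_freeboard(freeboard, snow_depth,
                                     rho_water=1024.0,  # sea water density [kg/m3]
                                     rho_ice=917.0,     # sea ice density   [kg/m3]
                                     rho_snow=300.0):   # snow density      [kg/m3]
        """Hydrostatic conversion of sea-ice freeboard to sea-ice thickness.

        freeboard  : height of the ice surface above local sea level [m]
        snow_depth : snow load on top of the ice [m]
        Density values are typical bulk figures used for illustration only.
        """
        return (rho_water * freeboard + rho_snow * snow_depth) / (rho_water - rho_ice)
    ```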

  7. A revised calibration of the interferometric mode of the CryoSat-2 radar altimeter improves ice height and height change measurements in western Greenland

    NASA Astrophysics Data System (ADS)

    Gray, Laurence; Burgess, David; Copland, Luke; Dunse, Thorben; Langley, Kirsty; Moholdt, Geir

    2017-05-01

    We compare geocoded heights derived from the interferometric mode (SARIn) of CryoSat to surface heights from calibration-validation sites on Devon Ice Cap and western Greenland. Comparisons are included for both the heights derived from the first return (the point-of-closest-approach or POCA) and heights derived from delayed waveform returns (swath processing). While swath-processed heights are normally less precise than edited POCA heights, e.g. standard deviations of ˜ 3 and ˜ 1.5 m respectively for the western Greenland site, the increased coverage possible with swath data complements the POCA data and provides useful information for both system calibration and improving digital elevation models (DEMs). We show that the pre-launch interferometric baseline coupled with an additional roll correction ( ˜ 0.0075° ± 0.0025°), or equivalent phase correction ( ˜ 0.0435 ± 0.0145 radians), provides an improved calibration of the interferometric SARIn mode. We extend the potential use of SARIn data by showing the influence of surface conditions, especially melt, on the return waveforms and that it is possible to detect and measure the height of summer supraglacial lakes in western Greenland. A supraglacial lake can provide a strong radar target in the waveform, stronger than the initial POCA return, if viewed at near-normal incidence. This provides an ideal situation for swath processing and we demonstrate a height precision of ˜ 0.5 m for two lake sites, one in the accumulation zone and one in the ablation zone, which were measured every year from 2010 or 2011 to 2016. Each year the lake in the ablation zone was viewed in June by ascending passes and then 5.5 days later by descending passes, which allows an approximate estimate of the filling rate. The results suggest that CryoSat waveform data and measurements of supraglacial lake height change could complement the use of optical satellite imagery and be helpful as proxy indicators for surface melt around Greenland.

  8. Ice shelf thickness change from 2010 to 2017

    NASA Astrophysics Data System (ADS)

    Hogg, A.; Shepherd, A.; Gilbert, L.; Muir, A. S.

    2017-12-01

    Floating ice shelves fringe 74% of Antarctica's coastline, providing a direct link between the ice sheet and the surrounding oceans. Over the last 25 years, ice shelves have retreated, thinned, and collapsed catastrophically. While change in the mass of floating ice shelves has only a modest steric impact on the rate of sea-level rise, their loss can affect the mass balance of the grounded ice sheet by influencing the rate of ice flow inland, due to the buttressing effect. Here we use CryoSat-2 altimetry data to map the detailed pattern of ice shelf thickness change in Antarctica. We exploit the dense spatial sampling and repeat coverage provided by the CryoSat-2 synthetic aperture radar interferometric mode (SARIn) to investigate data acquired from 2010 to the present day. We find that ice shelf thinning rates can exhibit large fluctuations over short time periods, and that the improved spatial resolution of CryoSat-2 enables us to resolve the spatial pattern of thinning in Antarctica in ever greater detail. In the Amundsen Sea, ice shelves at the terminus of the Pine Island and Thwaites glaciers have thinned at rates in excess of 5 meters per year for more than two decades. We observe the highest rates of basal melting near the ice sheet grounding line, reinforcing the importance of high-resolution datasets. On the Antarctic Peninsula, in contrast to the 3.8 m per decade of thinning observed since 1992, we measure an increase in the surface elevation of the Larsen-C Ice Shelf during the CryoSat-2 period.

  9. Launch and Early Orbit Operations for CryoSat-2

    NASA Astrophysics Data System (ADS)

    Mardel, Nic; Marchese, Franco

    2010-12-01

    CryoSat-2 was launched from Baikonur on 8 April 2010 aboard a modified Dnepr ICBM, the so-called SS-18 Satan. Following the ascent and separation from the launch vehicle, the Flight Operations Segment (FOS) at ESOC, Darmstadt, started the operations to configure the satellite into the correct mode to acquire science: switching on units, configuring software and ensuring that the satellite's health and performance were as expected. This paper will describe the operations performed by the FOS during the first weeks in orbit, including the unexpected problems encountered, their implications and their solutions.

  10. Comparison of Freeboard Retrieval and Ice Thickness Calculation From ALS, ASIRAS, and CryoSat-2 in the Norwegian Arctic to Field Measurements Made During the N-ICE2015 Expedition

    NASA Astrophysics Data System (ADS)

    King, Jennifer; Skourup, Henriette; Hvidegaard, Sine M.; Rösel, Anja; Gerland, Sebastian; Spreen, Gunnar; Polashenski, Chris; Helm, Veit; Liston, Glen E.

    2018-02-01

    We present freeboard measurements from an airborne laser scanner (ALS), the Airborne Synthetic Aperture and Interferometric Radar Altimeter System (ASIRAS), and the CryoSat-2 SIRAL radar altimeter; ice thickness measurements from both helicopter-borne and ground-based electromagnetic sounding; and point measurements of ice properties. This case study was carried out in April 2015 during the N-ICE2015 expedition in the area of the Arctic Ocean north of Svalbard. The region was characterized by deep snow, up to 1.12 m, and a widespread presence of negative freeboards. The main scattering surfaces from both CryoSat-2 and ASIRAS are shown to be closer to the snow freeboard obtained by the ALS than to the ice freeboard measured in situ. This case study documents the complexity of freeboard retrievals from radar altimetry. We show that even under cold (below -15°C) conditions the radar freeboard can be close to the snow freeboard on a regional scale of tens of kilometers. We derived a modal sea-ice thickness for the study region from CryoSat-2 of 3.9 m, compared to a measured total thickness of 1.7 m, resulting in an overestimation of sea-ice thickness on the order of a factor of 2. Our results also highlight the importance of year-to-year regional-scale information about the depth and density of the snowpack, as this influences the sea-ice freeboard and the radar penetration, and is a key component of the hydrostatic balance equations used to convert radar freeboard to sea-ice thickness.

  11. Concepts for Cost-Effective Enhanced Cryosat Continuity: Opportunity in the Iridium PRIME Context

    NASA Astrophysics Data System (ADS)

    Le Roy, Y.; Caubet, E.; Silverstrin, P.; Legrand, C.

    2016-08-01

    The Iridium-PRIME offer, recently initiated by the Iridium company, consists of hosting payloads on customized low-cost Iridium-NEXT platforms from which the main telecom mission antenna (L-band) is removed. This leaves significant resources in terms of mass, volume and power consumption to host up to three payloads on these customized platforms. The Iridium-PRIME satellites will be inserted into the Iridium-NEXT constellation to benefit from the low-cost operations service (command, control and data telemetry throughout the lifetime of the Iridium-PRIME mission). Given the synergy between the schedules of the Iridium-PRIME program (launches starting around 2020) and of a possible CryoSat Follow-On (FO) mission (launch around 2022), and the adequacy of the available on-board resources for such a mission, ESA tasked Thales Alenia Space, as the company responsible for the SIRAL radar instrument of the currently in-orbit CryoSat mission, to study the feasibility of a concept for enhanced continuity of CryoSat on an Iridium-PRIME satellite as a potential low-cost fast-track solution. The study aimed to define a cost-effective topographic payload including not only the SIRAL radar but also the sub-systems necessary to retrieve the SIRAL antenna baseline attitude with high accuracy (star trackers) and to perform Precise Orbit Determination (POD). All these aspects are presented in this paper. In addition, possible evolutions/improvements of the Ku-band radar instrument were analysed and are presented: adding a Ka-band nadir measurement capability and a Ku-band or Ka-band wide-swath measurement capability. The transmission issue for the SIRAL science data is also discussed in the paper.

  12. Integrated Airborne and In-Situ Measurements Over Land-Fast Ice Near Barrow, AK.

    NASA Astrophysics Data System (ADS)

    Gardner, J. M.; Brozena, J. M.; Richter-Menge, J.; Abelev, A.; Liang, R.; Ball, D.; Claffey, K. J.; Hebert, D. A.; Jones, K.

    2015-12-01

    The Naval Research Laboratory has collected two field seasons of integrated airborne and in-situ measurements over multiple sites of floating but land-fast ice north of Barrow, AK. During the first season, in March 2014, the Cold Regions Research and Engineering Laboratory led the on-ice group, which included NRL personnel and Naval Academy midshipmen. The second season (March 2015) included only NRL scientists and midshipmen. The in-situ data provided ground-truth for airborne measurements from a scanning LiDAR (Riegl Q 560i), digital photogrammetry (Applanix DSS-439), a low-frequency SAR (P-band in 2014, P- and L-band in 2015) and a snow/Ku radar procured from the Center for Remote Sensing of Ice Sheets of the University of Kansas. The CReSIS radar was updated in 2015 to integrate the snow and Ku radars into a single continuous chirp, thus improving resolution. The objective of the survey was to aid our understanding of the use of the airborne data to calibrate/validate CryoSat-2 data. Sampling size or "footprint" plays a critical role in any attempt to compare in-situ measurements with airborne (or satellite) measurements; the in-situ data were therefore arranged to minimize aliasing. Ground measurements were collected along transects at sites generally consisting of a 2-km-long profile of Magnaprobe and EM31 measurements with periodic boreholes. A 60 m x 400 m swath of Magnaprobe measurements was centered on this profile. Airborne data were collected on multiple overflights of the transect areas. The LiDAR measured total freeboard (ice + snow) referenced to leads in the ice, and produced swaths 200-300 m wide. The SAR imaged the ice beneath the snow and the snow/Ku radar measured snow thickness. The freeboard measurements and snow thickness are used to estimate ice thickness via isostasy and density estimates. Comparisons and the processing methodology will be shown. The results of this ground-truth experiment will inform our analysis of grids of airborne data collected over areas of sea ice illuminated by CryoSat-2.

  13. Ice elevation change from Swath Processing of CryoSat SARIn Mode Data

    NASA Astrophysics Data System (ADS)

    Foresta, Luca; Gourmelen, Noel; Shepherd, Andrew; Muir, Alan; Nienow, Pete

    2015-04-01

    Reference and repeat-observations of Glacier and Ice Sheet Margin (GISM) topography are critical to identify changes in ice elevation, provide estimates of mass gain or loss and thus quantify the contribution of the cryosphere to sea level rise (e.g. McMillan et al., 2014). The Synthetic Interferometric Radar Altimeter (SIRAL) onboard the ESA radar altimetry CryoSat (CS) mission has collected ice elevation measurements since 2010. The corresponding SARIn mode of operation, activated over GISM areas, provides high spatial resolution in the along-track direction while resolving the angular origin of echoes (i.e. across-track). The current ESA SARIn processor calculates the elevation of the Point Of Closest Approach (POCA) within each waveform, and maps of elevation change in Antarctica and Greenland have been produced using the regular CS height product (McMillan et al., 2014; Helm et al., 2014). Data from the CS-SARIn mode have also been used to produce measurements of ice elevation beyond the POCA, also known as swath elevation (Hawley et al., 2009; Gray et al., 2013; ESA-STSE CryoTop project). Here we use the swath processing approach to generate maps of ice elevation change from selected regions around the margins of the Greenland and Antarctic Ice Sheets. We discuss the impact of the swath processing on the spatial resolution and precision of the resulting ice elevation field and compare our results to current dh/dt estimates. References: ESA STSE CryoTop project - http://www.stse-cryotop.org/ Gray L., Burgess D., Copland L., Cullen R., Galin N., Hawley R. and Helm V. Interferometric swath processing of Cryosat data for glacial ice topography. The Cryosphere, 7(6):1857-1867, December 2013. Hawley R.L., Shepherd A., Cullen R., Helm V. and Wingham D.J. Ice-sheet elevations from across-track processing of airborne interferometric radar altimetry. Geophysical Research Letters, 36(22):L22501, November 2009. Helm V., Humbert A. and Miller H. Elevation and elevation change of Greenland and Antarctica derived from CryoSat-2. The Cryosphere, 8(4):1539-1559, August 2014. McMillan M., Shepherd A., Sundal A., Briggs K., Muir A., Ridout A., Hogg A. and Wingham D. Increased ice losses from Antarctica detected by CryoSat-2. Geophysical Research Letters, pages 3899-3905, 2014.

  14. Comparison of AltiKa and CryoSat-2 Elevation and Elevation Rates over the Amundsen Sea Sector

    NASA Astrophysics Data System (ADS)

    Otosaka, I.; Shepherd, A.; Hogg, A.

    2017-12-01

    Altimeters have been successfully used for more than two decades to observe changes in the ice sheet surface and to estimate the contribution of ice sheets to sea level rise. The Satellite for Argos and AltiKa (SARAL) was launched in February 2013 as a joint mission between the French space agency (CNES) and the Indian Space Research Organisation (ISRO). While the altimeters previously launched into space operate at Ku-band (13.6 GHz), the altimeter on board SARAL, AltiKa, is the first instrument to operate at Ka-band (36.8 GHz). The higher frequency of AltiKa is expected to lead to reduced penetration of the radar signal into the snowpack compared to Ku-band. A comparison of ice sheet elevation measurements recorded at the two frequencies may therefore provide useful information on the surface and its scattering properties. In this study, we compare elevation and elevation rates recorded by AltiKa and CryoSat-2 between March 2013 and April 2017 over the Amundsen Sea Sector (ASS), one of the most rapidly changing sectors of West Antarctica. Elevation and elevation rates are computed within 5 km grid cells using a plane-fit method, taking into account the contributions of topography and fluctuations in elevation and backscatter. The drifting orbit and imaging modes of CryoSat-2 result in 78.7% sampling of the study area, whereas AltiKa samples 39.7% due to its sparser orbit pattern and the loss of signal in steeply sloping coastal margins. Over the study period, the root mean square differences between elevation and elevation change recorded at Ka-band and Ku-band were 40.3 m and 0.54 m/yr, respectively. While the broad spatial pattern of elevation change is well resolved by both satellites, data gaps along the Getz coastline may be partly responsible for the lower elevation change rate observed at Ka-band. We also compared CryoSat-2 and AltiKa to coincident airborne data from NASA's Operation IceBridge (OIB). The mean differences in elevation rate between the spaceborne and IceBridge data are -0.09 m/yr and -0.08 m/yr at Ka- and Ku-band respectively, highlighting the good capability of both CryoSat-2 and AltiKa to accurately map ice sheet elevation change.
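
    The plane-fit method mentioned above models all elevation measurements falling within a grid cell as a planar surface plus a linear trend in time; the trend coefficient is the elevation rate for that cell. A minimal least-squares sketch is given below, omitting the backscatter and mode-dependent corrections applied in practice.

    ```python
    import numpy as np

    def plane_fit_dhdt(x, y, t, h):
        """Elevation rate in one grid cell from the least-squares model
        h = h0 + a*x + b*y + dhdt*t  (topographic plane plus linear trend).

        x, y : coordinates of each measurement relative to the cell centre [m]
        t    : measurement time [yr];  h : measured elevation [m]
        Returns (dhdt, residual RMS).
        """
        A = np.column_stack([np.ones_like(x), x, y, t])
        coeffs, *_ = np.linalg.lstsq(A, h, rcond=None)
        residuals = h - A @ coeffs
        return coeffs[3], np.sqrt(np.mean(residuals**2))
    ```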

  15. Study of elevation changes along a profile crossing the Greenland Ice Sheet

    NASA Astrophysics Data System (ADS)

    Hvidegaard, S. M.; Sandberg, L.

    2009-04-01

    In recent years much research has focused on determining how the Greenland Ice Sheet is responding to the observed climate changes. There is wide agreement that the Ice Sheet is currently losing mass, and studies have shown that the mass loss occurs near the ice edge while no significant changes are found in the central part of the Ice Sheet. As part of the European Space Agency's CryoSat Validation Experiment (CryoVEx), running from 2004 to 2008, the National Space Institute (DTU Space) measured the elevations along a profile crossing the Greenland Ice Sheet. The elevation observations were carried out in 2004, 2006 and 2008 using airborne laser altimetry from a Twin Otter aircraft. The observed profile follows the old EGIG line (Expédition Glaciologique Internationale au Groenland, measured in the 1950's), situated between 69-71N and heading nearly east-west. This unique dataset gives the opportunity to study elevation changes along a profile crossing the ice sheet. In this work, we outline the observed elevation changes in the different zones of the ice sheet. We furthermore compare elevation changes based on coincident ICESat and airborne laser altimeter data.

  16. 25 years of elevation changes of the Greenland Ice Sheet from ERS, Envisat, and CryoSat-2 radar altimetry

    NASA Astrophysics Data System (ADS)

    Sandberg Sørensen, Louise; Simonsen, Sebastian B.; Forsberg, René; Khvorostovsky, Kirill; Meister, Rakia; Engdahl, Marcus E.

    2018-08-01

    The shape of the large ice sheets responds rapidly to climate change, making the elevation changes of these ice-covered regions an essential climate variable. Consistent, long time series of these elevation changes are of great scientific value. Here, we present a newly developed data product of 25 years of elevation changes of the Greenland Ice Sheet, derived from satellite radar altimetry. The data product is made publicly available within the Greenland Ice Sheet project as part of the ESA Climate Change Initiative programme. Analysis of repeated elevation measurements from radar altimetry is widely used for monitoring changes of ice-covered regions. The Greenland Ice Sheet has been mapped by conventional radar altimetry since the launch of ERS-1 in 1991, which was followed by ERS-2, Envisat and currently CryoSat-2. The recently launched Sentinel-3A will provide a continuation of the radar altimetry time series. Since 2010, CryoSat-2 has for the first time measured the changes in the coastal regions of the ice sheet with radar altimetry, with its novel SAR Interferometric (SARIn) mode, which provides improved measurements over regions with steep slopes. Here, we apply a mission-specific combination of cross-over, along-track and plane-fit elevation change algorithms to radar data from the ERS-1, ERS-2, Envisat and CryoSat-2 missions, resulting in 25 years of nearly continuous elevation change estimates (1992-2016) of the Greenland Ice Sheet. This analysis has been made possible through the recent reprocessing, in the REAPER project, of data from the ERS-1 and ERS-2 missions, making them consistent with Envisat data. The 25 years of elevation changes are evaluated as 5-year running means, shifted by one year. A clear acceleration in thinning is evident in the 5-year elevation maps following 2003, while only small elevation changes are observed in the maps from the 1990s.
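
    Of the three algorithm families combined above, the cross-over method is the simplest to illustrate: where an ascending and a descending ground track intersect, the elevations interpolated to the crossover point are differenced, and the time-tagged difference samples the elevation change between the two passes. The sketch below forms such a difference once the crossover location is known; the crossover search itself (track interpolation and intersection) is omitted.

    ```python
    import numpy as np

    def crossover_dh(track_a, track_b, xover_lon, xover_lat):
        """Elevation difference and time separation at a known crossover point.

        track_a, track_b : dicts with 1-D arrays 'lon', 'lat', 'h', 't'
                           for an ascending and a descending pass.
        xover_lon/lat    : previously computed crossover coordinates.
        Elevations and times are interpolated between the two records of each
        track that lie closest to the crossover (inverse-distance weighting).
        """
        def interp_at_xover(track):
            d = np.hypot(track['lon'] - xover_lon, track['lat'] - xover_lat)
            i = np.argsort(d)[:2]                  # two nearest records
            w = d[i][::-1] / d[i].sum()            # inverse-distance weights
            return np.dot(w, track['h'][i]), np.dot(w, track['t'][i])

        h_a, t_a = interp_at_xover(track_a)
        h_b, t_b = interp_at_xover(track_b)
        return h_b - h_a, t_b - t_a                # dh and dt between the passes
    ```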

  17. Surface melt effects on Cryosat-2 elevation retrievals in the ablation zone of the Greenland ice sheet

    NASA Astrophysics Data System (ADS)

    Slater, T.; McMillan, M.; Shepherd, A.; Leeson, A.; Cornford, S. L.; Hogg, A.; Gilbert, L.; Muir, A. S.; Briggs, K.

    2017-12-01

    Over the past two decades, there has been an acceleration in the rate of mass losses from the Greenland ice sheet. This acceleration is, in part, attributed to an increasingly negative surface mass balance (SMB), linked to increasing melt water runoff rates due to enhanced surface melting. Understanding the past, present and future evolution in surface melting is central to ongoing monitoring of ice sheet mass balance and, in turn, to building realistic future projections. Currently, regional climate models are commonly used for this purpose, because direct in-situ observations are spatially and temporally sparse due to the logistics and resources required to collect such data. In particular, modelled SMB is used to estimate the extent and magnitude of surface melting, which influences (1) many geodetic mass balance estimates, and (2) snowpack microwave scattering properties. The latter is poorly understood and introduces uncertainty into radar altimeter estimates of ice sheet evolution. Here, we investigate the changes in CryoSat-2 waveforms and elevation measurements caused by the onset of surface melt in the summer months over the ablation zone of the Greenland ice sheet. Specifically, we use CryoSat-2 SARIn mode data acquired between 2011 and 2016, to characterise the effect of high variability in surface melt during this period, and to assess the associated impact on estimates of ice mass balance.

  18. High Resolution Tidal Modelling in the Arctic Ocean: Needs and Upcoming Developments

    NASA Astrophysics Data System (ADS)

    Cancet, M.; Andersen, O.; Stenseng, L.; Lyard, F.; Cotton, D.; Benveniste, J.; Schulz, A.

    2015-12-01

    The Arctic Ocean is a challenging region for tidal modelling because of its complex and poorly documented bathymetry, combined with the intermittent presence of sea ice and the scarcity of in situ tidal observations at such high latitudes. As a consequence, the accuracy of the global tidal models decreases by several centimetres in the polar regions. In particular, this has a large impact on the quality of the satellite altimeter sea surface heights in these regions (ERS-1/2, Envisat, CryoSat-2, SARAL/AltiKa and the future Sentinel-3 mission). Better knowledge of the tides would improve the quality of the high-latitude altimeter sea surface heights and of all derived products, such as the altimetry-derived geostrophic currents, the mean sea surface and the mean dynamic topography. In addition, accurate tidal models are highly strategic information for the ever-growing maritime and industrial activities in this region. NOVELTIS and DTU Space are currently working on the development of a regional, high-resolution tidal atlas of the Arctic Ocean. In particular, this atlas will benefit from the assimilation of the most complete satellite altimetry dataset ever used in this region, including Envisat data up to 82°N and reprocessed CryoSat-2 data between 82°N and 88°N. The combination of all these satellites will give the best possible coverage of altimetry-derived tidal constituents. The available tide gauge data will also be used, either for assimilation or for validation. This paper presents the performances of the available global tidal models in the Arctic Ocean and the on-going development of an improved regional tidal atlas for this region.

  19. From CryoSat-2 to Sentinel-3 and Beyond

    NASA Astrophysics Data System (ADS)

    Francis, R.

    2011-12-01

    CryoSat-2 carried into Earth orbit the first altimeter using SAR principles, although similar techniques had been used on earlier Venusian missions. Furthermore, it carries a second antenna and receive chain, and has been very carefully calibrated, allowing interferometry between these antennas. The results of the SAR mode and of the interferometer have met all expectations, with handsome margins. Even before the launch of CryoSat-2, the further development of this concept was underway with the radar for the oceanography mission Sentinel-3. While this radar, named SRAL (SAR Radar Altimeter), does not have the interferometer capability of CryoSat-2's SIRAL (SAR Interferometric Radar Altimeter), it does have a second frequency to enable direct measurement of the delay induced by the ionospheric electron content. Sentinel-3 will have a sun-synchronous orbit, like ERS and Envisat, and will have a similar latitudinal range: about 82° north and south, compared to CryoSat's 88°. Sentinel-3 will operate its radar altimeter in the high-resolution SAR mode over coastal oceans and inland water, and will revert to the more classical pulse-width-limited mode over the open oceans. The SAR mode generates data at a high rate, so the major limiting factor is the amount of on-board storage; the power consumption is also higher, although this imposes a less critical constraint. For sizing purposes the coastal oceans are defined as waters within 300 km of the continental shorelines. Sentinel-3 is expected to be launched in 2013 and to be followed 18 months later by a second satellite of the same design. The next step in the development of this family of radar altimeters is Jason-CS, which will provide Continuity of Service to the existing Jason series of operational oceanography missions. Jason-CS has a very strong heritage from CryoSat but will fly the traditional Jason orbit, which covers latitudes up to 66° from a high altitude of 1330 km. The new radar is called Poseidon-4, to emphasise the connection to Jason, but its concept owes more to Sentinel-3's SRAL. It retains SRAL's dual frequencies and its SAR mode, but adds some further refinements. Most notably, an operating mode in which SAR operation and full-performance pulse-width-limited mode are available simultaneously is under study. This would enable the benefits of SAR mode to be achieved over all ocean areas, provided the volume of data generated could be stored and downlinked to the ground. This problem only becomes tractable if an on-board processing system can be introduced to perform the first level of SAR processing, reducing the data volume by several orders of magnitude; this is also under study. The architecture of the radar has a further improvement, in the extension of digital technology further into the domain of analog radio-frequency electronics. While this is essentially invisible to the scientific user, it will yield an instrument with higher quality and markedly superior stability. The Jason-CS missions (at least two satellites are planned) are currently in a study phase, with an implementation decision expected at the end of 2012. The planned launch date for the first mission is 2017.

  20. Airborne Grid Sea-Ice Surveys for Comparison with Cryosat-2

    NASA Astrophysics Data System (ADS)

    Brozena, J. M.; Gardner, J. M.; Liang, R.; Hagen, R. A.; Ball, D.; Newman, T.

    2015-12-01

    The Naval Research Laboratory is studying the changing Arctic with a focus on ice thickness and distribution variability. The goal is the optimization of computer models used to predict sea ice changes. An important part of our study is to calibrate/validate CryoSat-2 ice thickness data prior to its incorporation into new ice forecast models. The footprint of the altimeter over sea ice is a significant issue in any attempt to ground-truth the data. Along-track footprints are reduced to ~ 300 m by SAR processing of the returns. However, the cross-track footprint is determined by the topography of the surface. Further, the actual return is the sum of the returns from individual reflectors within the footprint, making it difficult to interpret the return and to optimize the waveform tracker. We therefore collected a series of grids of scanning LiDAR and radar on sub-satellite tracks over sea ice that would extend far enough cross-track to capture the illuminated area. The difficulty in collecting such grids, which are composed of adjacent overlapping tracks, is ice motion of as much as 300 m over the duration of a single flight track (~ 20 km) of data collection. With a typical LiDAR swath width of < 500 m, adjustment of the survey tracks in near real time for the ice motion is necessary for a coherent data set. This was accomplished by an NRL-devised photogrammetric method of ice velocity determination. Post-processing refinements resulted in typical track-to-track miss-ties of ~ 1-2 m, much of which could be attributed to ice deformation over the period of the survey. This allows us to reconstruct the ice configuration at the time of the satellite overflight, resulting in a good picture of the surface actually illuminated by the radar. The detailed 2-D LiDAR image is of the snow surface, not of the underlying ice presumably illuminated by the radar. Our hope is that the 1-D radar profiles collected along the LiDAR swath centerlines will be sufficient to correct the grid for snow thickness. A total of 15 grids, 5-20 km wide (cross-track) by 10-30 km long (along-track), centered on ice illuminated by CryoSat-2, was collected north of Barrow, AK, over three field seasons from 2013 to 2015. Data from the grids are shown here and are being used to examine the relationship of the tracked satellite waveform data to the actual surface.

  1. Stack Characterization in CryoSat Level1b SAR/SARin Baseline C

    NASA Astrophysics Data System (ADS)

    Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso

    2015-04-01

    CryoSat was launched on 8 April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat is the first altimetry mission operating in SAR mode and it carries an innovative radar altimeter called the Synthetic Aperture Interferometric Radar Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, making the received echoes phase-coherent and suitable for azimuth processing. The current CryoSat IPF (Instrument Processing Facility), Baseline B, was released into operations in February 2012. After more than 2 years of development, the release into operations of Baseline C is expected in the first half of 2015. It is worth recalling here that the CryoSat SAR/SARin IPF1 generates 20 Hz waveforms at an approximately equally spaced set of ground locations on the Earth's surface, i.e. surface samples, and that a surface sample gathers a collection of single-look echoes coming from the processed bursts during the time of visibility. Thus, for a given surface sample, the stack can be defined as the collection of all the single-look echoes pointing to the current surface sample, after applying all the necessary range corrections. The L1B product contains the power average of all the single-look echoes in the stack: the multi-looked L1B waveform. This reduces the data volume, while removing some information contained in the single looks that is useful for characterizing the surface and modelling the L1B waveform. To recover such information, a set of parameters has been added to the L1B product: the stack characterization or beam behaviour parameters. The stack characterization, already included in previous Baselines, has been reviewed and expanded in Baseline C. This poster describes all the stack characterization parameters, detailing what they represent and how they have been computed. In detail, these parameters can be summarized as: stack statistical parameters, such as skewness and kurtosis; the look angle (i.e. the angle at which the surface sample is seen with respect to the nadir direction of the satellite) and the Doppler angle (i.e. the angle at which the surface sample is seen with respect to the normal to the velocity vector) for the first and last single-look echoes in the stack; and the number of single looks averaged in the stack (in Baseline C a stack weighting has been applied that reduces the number of looks). With the correct use of these parameters, users will be able to retrieve some of the 'lost' information contained within the stack and fully exploit the L1B product.
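
    The stack statistical parameters listed above are simply moments of the single-look powers gathered for one surface sample. The sketch below shows how skewness and kurtosis could be computed from such a stack; the exact definitions and weighting used in the Baseline C processor may differ.

    ```python
    import numpy as np

    def stack_statistics(stack_power):
        """Standard deviation, skewness and excess kurtosis of the per-look power
        of one stack (the single-look echoes contributing to a surface sample).

        stack_power : 1-D array with the integrated power of each single look.
        """
        p = np.asarray(stack_power, dtype=float)
        mean, std = p.mean(), p.std()
        skewness = np.mean(((p - mean) / std) ** 3)
        kurtosis = np.mean(((p - mean) / std) ** 4) - 3.0   # excess kurtosis
        return {"std": std, "skewness": skewness, "kurtosis": kurtosis}
    ```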

  2. SAR Altimetry Processing on Demand Service for CryoSat-2 and Sentinel-3 at ESA G-POD

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Dinardo, S.; Lucas, B.

    2014-12-01

    The scope of this work is to show the new ESA service (SARvatore) for the exploitation of CryoSat-2 data and upcoming Sentinel-3 data, designed and developed entirely by the Altimetry Team at ESRIN EOP-SER. The G-POD (Grid-Processing On Demand) service SARvatore (SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation) for CryoSat-2 is a web platform that provides the capability to process CryoSat-2 SAR data on-line and on demand, starting from L1a (FBR) data up to SAR Level-2 geophysical data products. The service is based on the SARvatore Processor Prototype. The output data products are generated in standard NetCDF format (using the CF Convention) and are compatible with BRAT (Basic Radar Altimetry Toolbox), its successor, the upcoming Sentinel-3 Altimetry Toolbox, and other NetCDF tools. Using the G-POD graphic interface, it is possible to easily select the geographical area of interest along with the time period of interest. As of August 2014 the service allows the user to select data for most of 2013 and part of 2014, with no geographical restriction on these data. It is expected that before Fall 2014 the whole mission data set (when available) will be at the disposal of the users. The processor prototype is versatile in the sense that users can customize and adapt the processing according to their specific requirements by setting a list of configurable options. The processing service is meant to be used for research and development purposes, supporting development contracts, on-site demonstrations/training for selected users, cross-comparison against third-party products, preparation for the Sentinel-3 mission, publications, etc. So far, the processing has been designed and optimized for open ocean studies and is fully functional only over this kind of surface, but there are plans to extend this processing capacity over coastal zones, inland waters and land, with a view to maximizing the exploitation of the upcoming Sentinel-3 Topographic mission over all surfaces.
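
    Because the output products are standard CF NetCDF, they can be inspected with any NetCDF tool; the short sketch below uses the netCDF4 Python library with a placeholder file name and placeholder variable names, which are assumptions since the actual SARvatore variable naming is not given here.

      from netCDF4 import Dataset

      # Hypothetical file name; the variable names below are placeholders and
      # must be replaced by those actually present in the product.
      with Dataset("sarvatore_l2_example.nc") as nc:
          print(nc.variables.keys())            # list available variables
          lat = nc.variables["latitude"][:]
          lon = nc.variables["longitude"][:]
          ssh = nc.variables["sea_surface_height"][:]
          print(lat.shape, lon.shape, ssh.shape)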

  3. SAR Altimetry for Mean Sea Surface Determination in the Arctic DTU15MSS

    NASA Astrophysics Data System (ADS)

    Piccioni, G.; Andersen, O. B.; Stenseng, L.

    2015-12-01

    A reliable MSS that includes high-latitude regions beyond the 82nd parallel is required for the Sentinel-3 data processing. In this paper we present the new DTU15MSS, which is an update of the DTU13MSS with more years of CryoSat-2 data. CryoSat-2 offers a unique dataset in the Arctic Ocean for testing SAR altimetry, with nearly five years of high-resolution SAR altimetry. In the Arctic Ocean, older conventional altimetry satellites (ERS-1/ERS-2/Envisat) have only been able to provide sparse data for the past 20 years. Here we present the development of the DTU15MSS in the Arctic, the latest release of the global high-resolution mean sea surface from DTU Space, based on 4 years of repeated CryoSat-2 data. The analysis shows that laser altimetry from the ICESat satellite, the basis of DTU10 and DTU13MSS between 82 and 86N, is now obsolete for mean sea surface determination. The study also highlights the problems of integrating altimetry from various modes (LRM, SAR and SAR-in) as well as the problems relating to the fact that the averaging period of CryoSat-2 is adjacent to the 20-year period (1993-2012) used to develop DTU13MSS. The new MSS is evaluated and compared with existing MSS models to assess the impact of these updates on the MSS computation.

  4. Marine geophysics. New global marine gravity model from CryoSat-2 and Jason-1 reveals buried tectonic structure.

    PubMed

    Sandwell, David T; Müller, R Dietmar; Smith, Walter H F; Garcia, Emmanuel; Francis, Richard

    2014-10-03

    Gravity models are powerful tools for mapping tectonic structures, especially in the deep ocean basins where the topography remains unmapped by ships or is buried by thick sediment. We combined new radar altimeter measurements from satellites CryoSat-2 and Jason-1 with existing data to construct a global marine gravity model that is two times more accurate than previous models. We found an extinct spreading ridge in the Gulf of Mexico, a major propagating rift in the South Atlantic Ocean, abyssal hill fabric on slow-spreading ridges, and thousands of previously uncharted seamounts. These discoveries allow us to understand regional tectonic processes and highlight the importance of satellite-derived gravity models as one of the primary tools for the investigation of remote ocean basins. Copyright © 2014, American Association for the Advancement of Science.

  5. Altimeter detection of elevation changes over coastal plains of northern Alaska and Hudson Bay

    NASA Astrophysics Data System (ADS)

    Hwang, C.; Cheng, Y. S.; Han, J.; Chen, J. Y.

    2017-12-01

    This presentation shows how satellite radar altimeters are used to detect elevation changes over flat, coastal regions in northern Alaska and Hudson Bay, in connection with permafrost thawing and glacial isostatic adjustment (GIA). We use a data selection criterion to identify usable waveforms over land, which are then retracked by the subwaveform retracker to improve the ranging accuracy. The altimeter datasets are from the Envisat (2003-2010), CryoSat-2 (2010-2016), TOPEX/Poseidon (T/P), Jason-1 (J1) and Jason-2 (J2, 1992-2016) missions. The result indicates a rapid decline of elevations over the sloping, thaw-lake-covered area of northern Alaska, with rates up to -20 cm/year. The rapid decline is probably due to favorable conditions for fast draining of meltwater. The level of Teshekpuk Lake declined at a mean rate of -5 cm/year until 2010 (from Envisat), and then rose steadily at about the same rate (from CryoSat-2). Around the coastal plains of Hudson Bay, we constructed long-term elevation time series from T/P, J1 and J2, and short-term ones from CryoSat-2. In the flat region southwest of Hudson Bay, most altimeter-derived rates are close to those from the GIA model ICE-6G. Near two GPS stations west and east of Hudson Bay, the Jason-2-derived rates range from 1.0 to 1.5 cm/year, close to the rates from GPS. Other convincing results of elevation changes from altimetry will be presented.
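
    A minimal sketch of how an elevation-change rate (in cm/year) can be estimated from a retracked elevation time series by ordinary least squares; the synthetic time series is purely illustrative and does not reproduce the waveform selection or retracking steps of the study.

      import numpy as np

      # Hypothetical time series: decimal years and retracked elevations (m).
      t = np.arange(2003.0, 2016.0, 0.1)
      rng = np.random.default_rng(1)
      h = 10.0 - 0.20 * (t - t[0]) + rng.normal(0.0, 0.05, t.size)  # -20 cm/yr trend plus noise

      # Fit h = a + b*t; the slope b is the elevation-change rate in m/yr.
      A = np.vstack([np.ones_like(t), t]).T
      (a, b), *_ = np.linalg.lstsq(A, h, rcond=None)
      print(f"elevation trend: {100.0 * b:.1f} cm/year")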

  6. Reassessment of the mass balance of the Abbot and Getz sectors of West Antarctica

    NASA Astrophysics Data System (ADS)

    Chuter, Stephen; Martín-Español, Alba; Wouters, Bert; Bamber, Jonathan

    2017-04-01

    Large discrepancies exist in mass balance estimates for the Getz and Abbot drainage basins, primarily due to previously poor knowledge of ice thickness at the grounding line, poor coverage by previous altimetry missions and signal leakage issues for GRACE. This is particularly the case for the Abbot region, where previously there have been contrasting positive ice sheet basin elevation rates from altimetry and negative mass budget estimates. Large errors arise when using ice thickness measurements derived from ERS-1 and/or ICESat altimetry data due to poor track spacing, 'loss of lock' issues near the grounding line and the complex morphology of these shelves, which require fine resolution to derive robust and accurate elevations close to the grounding line. This was exemplified by the manual adjustments of up to 100 m required at the grounding line during the creation of Bedmap2. However, the advent of CryoSat-2, with its unique orbit and SARIn mode of operation, has overcome these issues and enabled the determination of ice shelf thickness at a much higher accuracy than possible from previous satellites, particularly within the grounding zone. We present a reassessment of mass balance estimates for the 2007-2009 epoch using improved CryoSat-2 ice thicknesses. We find that CryoSat-2 ice thickness estimates are systematically thinner by 30% and 16.5% for the Abbot and Getz sectors respectively. Our new mass balance estimate of 8 ± 6 Gt yr-1 for the Abbot region resolves the previous discrepancy with altimetry. Over the Getz region, the new mass balance estimate of 7.56 ± 16.6 Gt yr-1 is in better agreement with other geodetic techniques. We also find there has been an increase in grounding line velocity of up to 20% since the 2007-2009 epoch, coupled with mean ice sheet thinning rates of -0.67 ± 0.13 m yr-1 derived from CryoSat-2 in fast-flow regions. This is in addition to mean snowfall trends of -0.33 m yr-1 w.e. since 2006. This suggests the onset of a dynamic instability in the region and the possibility of grounding line retreat, driven by both surface processes and ice dynamics.

  7. ESA airborne campaigns in support of Earth Explorers

    NASA Astrophysics Data System (ADS)

    Casal, Tania; Davidson, Malcolm; Schuettemeyer, Dirk; Perrera, Andrea; Bianchi, Remo

    2013-04-01

    In the framework of its Earth Observation Programmes, the European Space Agency (ESA) carries out ground-based and airborne campaigns to support geophysical algorithm development, calibration/validation, simulation of future spaceborne earth observation missions, and applications development related to land, oceans and atmosphere. ESA has been conducting airborne and ground measurement campaigns since 1981 by deploying a broad range of active and passive instrumentation in both the optical and microwave regions of the electromagnetic spectrum, such as lidars, limb/nadir sounding interferometers/spectrometers, high-resolution spectral imagers, advanced synthetic aperture radars, altimeters and radiometers. These campaigns take place inside and outside Europe in collaboration with national research organisations in the ESA member states as well as with international organisations harmonising European campaign activities. ESA campaigns address all phases of a spaceborne mission, from the very beginning of the design phase, during which exploratory or proof-of-concept campaigns are carried out, to the post-launch exploitation phase for calibration and validation. We present four recent campaigns illustrating the objectives and implementation of such campaigns. Wavemill Proof Of Concept, an exploratory campaign to demonstrate the feasibility of a future Earth Explorer (EE) mission, took place in October 2011 in the Liverpool Bay area in the UK. The main objectives, successfully achieved, were to test the capability of Astrium UK's new airborne X-band SAR instrument to obtain high-resolution ocean current and topology retrievals. Results showed that the new airborne instrument is able to retrieve ocean currents to an accuracy of ± 10 cm s-1. The IceSAR2012 campaign was set up in support of ESA's EE Candidate 7, BIOMASS. Its main objective was to document P-band radiometric signatures over ice sheets, by upgrading ESA's airborne POLARIS P-band radar ice sounder with SAR capability. The campaign comprised three airborne deployments in Greenland from April to June 2012, separated by roughly one month, and preliminary results showed the capability of the instrument to detect ice motion. CryoVEx 2012 was a large collaborative effort to help ensure the accuracy of ESA's ice mission CryoSat. The aim of this large-scale Arctic campaign was to record sea-ice thickness and the conditions of the ice exactly below the CryoSat-2 path. The range of sensors installed on different aircraft included simple cameras to get a visual record of the sea ice, laser scanners to map the height of the ice, an ice-thickness sensor (EM-Bird), ESA's radar altimeter (ASIRAS) and NASA's snow and Ku-band radars, which mimic CryoSat's measurements but at a higher resolution. Preliminary results reveal the ability to detect centimetre differences between sea ice and thin ice/water, which in turn allows for the estimation of actual sea ice thickness. In support of two currently operating EE missions, SMOS (Soil Moisture and Ocean Salinity) and GOCE (Gravity field and steady-state Ocean Circulation Explorer), the DOMECair airborne campaign will take place in Antarctica, in the Dome C region, during the middle of January 2013. The two main objectives are to quantify and document the spatial variability in the Dome C area, important for establishing long-term cross-calibrated multi-mission L-band measurement time series (SMOS), and to fill in the gap in the high-quality gravity anomaly maps of Antarctica, since airborne gravity measurements are sparse (GOCE). Key airborne instruments in the campaign are the EMIRAD-2 L-band radiometer, designed and operated by DTU, and a gravimeter from AWI. ESA campaigns have been a fundamental and essential part of the preparation of new Earth Observation missions, as well as of the independent validation of their measurements and the quantification of error sources. For the different activities, a rich variety of datasets has been recorded and archived, and users can access campaign data through the EOPI web portal [http://eopi.esa.int].

  8. Integrated Airborne and In-Situ Measurements over Land-Fast Ice near Barrow, AK.

    NASA Astrophysics Data System (ADS)

    Brozena, J. M.; Gardner, J. M.; Liang, R.; Ball, D.; Richter-Menge, J.; Claffey, K. J.; Abelev, A.; Hebert, D. A.; Jones, K.

    2014-12-01

    During March of 2014, the Naval Research Laboratory and the Cold Regions Research and Engineering Laboratory collected an integrated set of airborne and in-situ measurements over two areas of floating, but land-fast, ice near the coast of Barrow, AK. The near-shore site was just north of Point Barrow, and the "offshore" site was ~20 km east of Point Barrow. The in-situ data provided ground-truth for airborne measurements from a scanning LiDAR (Riegl Q 560i), digital photogrammetry (Applanix DSS-439) and a snow radar procured from the Center for Remote Sensing of Ice Sheets of the University of Kansas. The objective of the survey was to aid our understanding of the use of the airborne data to calibrate/validate CryoSat-2 data. Sampling size or "footprint" plays a critical role in any attempt to compare in-situ measurements with airborne (or satellite) measurements; thus the in-situ data were arranged to minimize aliasing. Ground measurements were collected along transects at both sites, consisting of a 2 km long profile of snow depth and ice thickness measurements with periodic boreholes. A 60 m x 400 m swath of snow depth measurements was centered on this profile. Airborne data were collected on five overflights of the two transect areas. The LiDAR measured total freeboard (ice + snow) referenced to leads in the ice, and produced swaths 200-300 m wide. The radar measured snow thickness. The freeboard and snow thickness measurements are used to estimate ice thickness via isostasy and density estimates. The central swath of in-situ snow depth data allows examination of the effects of cross-track variations, considering the relatively large footprint of the snow radar. Assuming a smooth, flat surface, the radar range resolution in air is < 4 cm, but the along-track sampling distance is ~3 m after unfocussed SAR processing. The width of the footprint varies from ~9 m up to about 40 m (beam-limited) for uneven surfaces. However, the radar could not resolve snow thickness except in areas of relatively flat snow and ice. The LiDAR had a ground point spacing of ~25-50 cm (depending on survey altitude) and so easily encompassed all other data. Comparisons and processing methodology will be shown. The results of this ground-truth experiment will inform our analysis of grids of airborne data collected over areas of sea ice illuminated by CryoSat-2.
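
    The freeboard-to-thickness conversion mentioned above follows from hydrostatic equilibrium; a minimal sketch with commonly used (but here assumed) densities is shown below, and is not the authors' specific processing.

      # Hydrostatic (isostatic) conversion of total (snow) freeboard to sea-ice
      # thickness: rho_i*h_i + rho_s*h_s = rho_w*(h_i - f_i), with f_i = F - h_s.
      RHO_W = 1024.0   # sea water density (kg/m^3), assumed
      RHO_I = 917.0    # sea ice density (kg/m^3), assumed
      RHO_S = 320.0    # snow density (kg/m^3), assumed

      def ice_thickness(total_freeboard_m, snow_depth_m):
          """Sea-ice thickness (m) from LiDAR total freeboard and radar snow depth."""
          return (RHO_W * total_freeboard_m
                  - (RHO_W - RHO_S) * snow_depth_m) / (RHO_W - RHO_I)

      # Example: 0.45 m total freeboard with 0.25 m of snow gives roughly 2.7 m of ice.
      print(ice_thickness(0.45, 0.25))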

  9. Arctic lead detection using a waveform unmixing algorithm from CryoSat-2 data

    NASA Astrophysics Data System (ADS)

    Lee, S.; Im, J.

    2016-12-01

    Sea ice areas in the Arctic consist of ice floes, leads, and polynyas. While leads and polynyas account for small parts of the Arctic Ocean, they play a key role in exchanging heat flux, moisture, and momentum between the atmosphere and ocean in wintertime because of their large temperature difference. In this study, a linear waveform unmixing approach was proposed to detect lead fraction. CryoSat-2 waveforms for pure leads, sea ice, and ocean were used as end-members, based on visual interpretation of MODIS images coincident with CryoSat-2 data. The unmixing model produced lead, sea ice, and ocean abundances, and a threshold (> 0.7) was applied to make a binary classification between lead and sea ice. The unmixing model produced better results than the existing models in the literature, which are based on simple thresholding approaches. The results were also comparable with our previous research using machine learning based models (i.e., decision trees and random forest). A monthly lead fraction was calculated by dividing the number of detected leads by the total number of measurements. The lead fraction around the Beaufort Sea and Fram Strait was high due to the anti-cyclonic rotation of the Beaufort Gyre and the outflow of sea ice to the Atlantic. The lead fraction maps produced in this study matched well with monthly lead fraction maps in the literature. The areas with thin sea ice identified in our previous research correspond to the high lead fraction areas in the present study. Furthermore, sea ice roughness from the ASCAT scatterometer was compared to a lead fraction map to examine the relationship between surface roughness and lead distribution.
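
    A minimal sketch of the linear unmixing idea under assumptions: three end-member waveforms (lead, sea ice, ocean) stacked as columns, an observed waveform decomposed by non-negative least squares, and the abundances normalized to sum to one. The 0.7 threshold follows the text; the synthetic waveform shapes and the use of scipy's nnls are illustrative, not the authors' implementation.

      import numpy as np
      from scipy.optimize import nnls

      # Hypothetical end-member waveforms (128 range bins x 3): lead, sea ice, ocean.
      bins = np.arange(128)
      lead = np.exp(-0.5 * ((bins - 60) / 1.5) ** 2)                       # very peaky return
      ice = np.exp(-0.5 * ((bins - 60) / 6.0) ** 2)                        # broader return
      ocean = np.exp(-np.clip(bins - 60, 0, None) / 25.0) * (bins >= 60)   # diffuse trailing edge
      E = np.column_stack([lead, ice, ocean])

      # Observed waveform: a synthetic mixture plus noise, for this sketch only.
      obs = 0.8 * lead + 0.2 * ice + np.random.default_rng(2).normal(0, 0.01, bins.size)

      coeffs, _ = nnls(E, obs)              # non-negative mixing coefficients
      abund = coeffs / coeffs.sum()         # fractional abundances
      is_lead = abund[0] > 0.7              # threshold used in the study
      print(abund, is_lead)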

  10. Precision orbit determination performance for CryoSat-2

    NASA Astrophysics Data System (ADS)

    Schrama, Ernst

    2018-01-01

    In this paper we discuss our efforts to perform precision orbit determination (POD) of CryoSat-2, which depends on Doppler and satellite laser ranging tracking data. A dynamic orbit model is set up and the residuals between the model and the tracking data are evaluated. The average r.m.s. of the 10 s averaged Doppler tracking pass residuals is approximately 0.39 mm/s, and the average of the laser tracking pass residuals is 1.42 cm. Among a number of other tests to verify the quality of the orbit solution, we compare our computed orbits against three independent external trajectories provided by CNES. The CNES products are part of the CryoSat-2 products distributed by ESA. The radial differences of our solution relative to the CNES precision orbits show an average r.m.s. of 1.25 cm between Jun-2010 and Apr-2017. The SIRAL altimeter crossover difference statistics demonstrate that the quality of our orbit solution is comparable to that of the POE solution computed by CNES. In this paper we discuss three important changes in our POD activities that have brought the orbit performance to this level: the way we implement temporal gravity accelerations observed by GRACE, the implementation of ITRF2014 coordinates and velocities for the DORIS beacons and the SLR tracking sites, and an adjustment of the SLR retroreflector position within the satellite reference frame. An unexpected result is that the median of the 10 s Doppler tracking residuals displays a statistically significant pattern in the South Atlantic Anomaly (SAA) area, where the median of the velocity residuals varies in the range of -0.15 to +0.15 mm/s.

  11. ICESat-2, its retrievals of ice sheet elevation change and sea ice freeboard, and potential synergies with CryoSat-2

    NASA Astrophysics Data System (ADS)

    Neumann, Thomas; Markus, Thorsten; Smith, Benjamin; Kwok, Ron

    2017-04-01

    Understanding the causes and magnitudes of changes in the cryosphere remains a priority for Earth science research. Over the past decade, NASA's and ESA's Earth-observing satellites have documented a decrease in both the areal extent and thickness of Arctic sea ice, and an ongoing loss of grounded ice from the Greenland and Antarctic ice sheets. Understanding the pace and mechanisms of these changes requires long-term observations of ice-sheet mass, sea-ice thickness, and sea-ice extent. NASA's ICESat-2 mission is the next-generation space-borne laser altimeter mission and will use three pairs of beams, each pair separated by about 3 km across-track with a pair spacing of 90 m. The spot size is 17 m with an along-track sampling interval of 0.7 m. This measurement concept is a result of the lessons learned from the original ICESat mission. The multi-beam approach is critical for removing the effects of ice sheet surface slope from the elevation change measurements of most interest. For sea ice, the dense spatial sampling (eliminating along-track gaps) and the small footprint size are especially useful for sea surface height measurements in the often narrow leads needed for sea ice freeboard and ice thickness retrievals. Currently, algorithms are being developed to calculate ice sheet elevation change and sea ice freeboard from ICESat-2 data. The orbits of ICESat-2 and CryoSat-2 both converge at 88 degrees of latitude, though the orbit altitude differences result in different ground track patterns between the two missions. This presentation will give an overview of algorithm approaches and of how ICESat-2 and CryoSat-2 data may augment each other.
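
    As a worked illustration of why the beam pairs matter, the sketch below estimates the local cross-track surface slope from the elevation difference of a hypothetical beam pair separated by the 90 m pair spacing; the elevation values are invented for the example.

      import numpy as np

      PAIR_SPACING_M = 90.0      # across-track separation within one beam pair

      # Hypothetical surface elevations measured by the two beams of one pair (m).
      h_beam_1 = 1502.35
      h_beam_2 = 1502.80

      # Cross-track slope removed from the along-track elevation-change signal.
      cross_track_slope = (h_beam_2 - h_beam_1) / PAIR_SPACING_M
      print(f"cross-track slope: {cross_track_slope:.4f} m/m "
            f"({np.degrees(np.arctan(cross_track_slope)):.3f} deg)")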

  12. Basic Radar Altimetry Toolbox: Tools and Tutorial to Use Cryosat Data

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.; Niemeijer, S.

    2011-12-01

    Radar altimetry is very much a technique with expanding applications. Even if quite a lot of effort has been invested for oceanography users, the use of altimetry data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious for new users of altimetry data products. ESA and CNES therefore developed the Basic Radar Altimetry Toolbox a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral mission, and is ready for adaptation to Sentinel-3 products, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL, - as processing/extraction routines, through the on-line command mode, - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. It has been available since April 2007, and has been demonstrated during training courses and scientific meetings. About 2000 people had downloaded it by Summer 2011, with many "newcomers" to altimetry among them, including teachers and professors, worldwide. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in the recent version release (v3.0.1). Others are under discussion for future development. Data use cases on CryoSat data will be presented. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  13. CryoSat-2: From SAR to LRM (FBR) for quantitative precision comparison over identical sea state

    NASA Astrophysics Data System (ADS)

    Martin-Puig, Cristina; Ruffini, Giulio; Raney, R. Keith; Gommenginger, Christine

    The use of Synthetic Aperture Radar (SAR) techniques in conventional altimetry, i.e., Delay Doppler Altimetry (DDA), was first introduced by R. K. Raney in 1998 [1]. This technique provides an improved solution for water surface altimetry observations due to two major innovations: the addition of along-track processing for increased resolution, and multi-look processing for improved SNR. CryoSat-2 (scheduled for launch in 2010) will be the first satellite to operate a SAR altimetry mode. Although its main focus will be the cryosphere, this instrument will also operate sporadically over water surfaces, thus providing an opportunity to test and refine the improved capabilities of DDA. Moreover, the work presented here is of interest to ESA's Sentinel-3 mission. This mission will be devoted to the provision of operational oceanographic services within Global Monitoring for Environment and Security (GMES), and will include a DDA altimeter on board. SAMOSA, an ESA funded project, has studied over the last two years the potential of advanced DDA over water surfaces. Its extension aims to better quantify the improvement of DDA over conventional altimetry for the characterization of water surfaces. CryoSat-2's altimeter (SIRAL) has three operating modes: the Low Resolution Mode (LRM), the SAR mode and the SARIn mode. The first two are of interest for the work to be done. In LRM the altimeter performs as a conventional pulse-limited altimeter (PRF of 1970 Hz); in SAR mode the pulses are transmitted in bursts (64 pulses per burst). In the latter, correlation between echoes is desired [1], thus the PRF within a burst is higher than in LRM (PRF of 17.8 kHz). After transmission the altimeter waits for the returns, and then transmits the next burst (burst repetition frequency of 85.7 Hz). These acquisition modes provide different data products: level 1 or full bit rate (FBR) data, level 1b or multi-looked waveform data, and level 2 or geophysical data products. This paper only addresses FBR data for LRM and SAR mode. In LRM the FBR data correspond to echoes incoherently multi-looked on board the satellite at a rate of 20 Hz, while in SAR mode the FBR data correspond to individual complex echoes (I and Q), telemetered before the IFFT block [2]. Given that the CryoSat-2 operational modes are exclusive, one task within the SAMOSA extension aims to reduce SAR FBR data such that it emulates LRM FBR data, allowing for a quantitative comparison of the measurement precision over identical sea state. In working toward this aim, three methodologies were implemented in the SAMOSA contract; the results achieved and detailed discussions with JHU/APL identified a revised approach (to be implemented in the SAMOSA extension), which should allow the team to meet the task goal. The different approaches will be presented in this paper. ACKNOWLEDGEMENT The authors of this paper would like to acknowledge the European Space Agency for funding the work presented in this paper, with special thanks to J. Benveniste and S. Dinardo (ESA), and the SAMOSA team: D. Cotton (SatOC, UK), L. Stenseng (DTU, DK) and P. Berry (DMU, UK). REFERENCES [1] R. K. Raney, "The Delay/Doppler Radar Altimeter", IEEE Trans. Geosci. Remote Sensing, vol. 36, pp. 1578-1588, Sep 1998. [2] CryoSat Mission and Data Description, Doc No. CS-RP-ESA-SY-0059, 2007.
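
    The core of reducing SAR FBR data to LRM-like data is to average the power of the individual complex echoes incoherently instead of applying Doppler (azimuth) processing; the sketch below shows only that step on a synthetic burst and is not one of the SAMOSA methodologies themselves.

      import numpy as np

      # Hypothetical SAR FBR burst: 64 complex pulses x 128 range gates
      # (assumed already range-compressed), generated at random for illustration.
      rng = np.random.default_rng(3)
      burst = rng.normal(size=(64, 128)) + 1j * rng.normal(size=(64, 128))

      # LRM-like (pseudo-LRM) waveform: incoherent (power) average over the pulses
      # of the burst, i.e. no coherent azimuth/Doppler processing is applied.
      pseudo_lrm_waveform = np.mean(np.abs(burst) ** 2, axis=0)
      print(pseudo_lrm_waveform.shape)   # one 128-gate power waveform per burst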

  14. Numerical experiments of dynamical processes during the 2011-2013 surge of the Bering-Bagley Glacier System, using a full-Stokes finite element model

    NASA Astrophysics Data System (ADS)

    Trantow, Thomas

    The Bering-Bagley Glacial System (BBGS) is the largest glacier system outside of the Greenland and Antarctic ice sheets, and is the Earth's largest surge-type glacier. Surging is one of three types of glacial acceleration and the least understood one. Understanding glacial acceleration is paramount when trying to explain ice discharge to the oceans and the glacial contribution to sea-level rise, yet there are currently no numerical glacier models that account for surging. The recent 2011-2013 surge of the BBGS provides a rare opportunity to study the surge process through observations and the subsequent data analysis and numerical modeling. Using radar, altimeter, and image data collected from airborne and satellite missions, various descriptions of the ice geometry are created at different times throughout the surge. Using geostatistical estimation techniques, including variography and ordinary kriging, surface and bedrock digital elevation models (DEMs) are derived. A time series analysis of elevation change during the current surge is then conducted and validated using a complete error analysis along with airborne observations. The derived DEMs are then used as inputs to a computer-simulated model of glacier dynamics in the BBGS. Using the finite element software Elmer/Ice, a full-Stokes simulation, with Glen's flow law for temperate ice, is created for numerical experiments. With consideration of free surface evolution, glacial hydrology and surface mass balance, the model is able to predict a variety of field variables including velocity, stress, strain rate, pressure and surface elevation change at any point forward in time. These outputs are compared and validated using observational data such as CryoSat-2 altimetry, airborne field data, imagery and previous detailed analysis of the BBGS. Preliminary results reveal that certain surge phenomena, such as surface elevation changes, surge progression and the locations at which the surge starts, can be recreated using the current model. Documentation of the effects that altering glaciological parameters and boundary conditions have on ice rheology in a large, complex glacial system comes as a secondary result. Simulations have yet to reveal any quasi-cyclic behavior or natural surge initiation.

  15. Toward variational assimilation of SARAL/Altika altimeter data in a North Atlantic circulation model at eddy-permitting resolution: assessment of a NEMO-based 4D-VAR system

    NASA Astrophysics Data System (ADS)

    Bouttier, Pierre-Antoine; Brankart, Jean-Michel; Candille, Guillem; Vidard, Arthur; Blayo, Eric; Verron, Jacques; Brasseur, Pierre

    2015-04-01

    In this project, the response of a variational data assimilation system based on NEMO and its tangent linear and adjoint models is investigated using a 4DVAR algorithm in a North Atlantic model at eddy-permitting resolution. The assimilated data consist of Jason-2 and SARAL/AltiKa datasets collected during the 2013-2014 period. The main objective is to explore the robustness of the 4DVAR algorithm in the context of a realistic turbulent oceanic circulation at mid-latitude constrained by multi-satellite altimetry missions. This work relies on two previous studies. First, a study with similar objectives was performed based on an academic double-gyre turbulent model and synthetic SARAL/AltiKa data, using the same DA experimental framework. Its main goal was to investigate the impact of turbulence on the performance of variational DA methods. The comparison with this previous work will bring to light the methodological and physical issues encountered by variational DA algorithms in a realistic context at a similar, eddy-permitting spatial resolution. We have also demonstrated how a dataset mimicking future SWOT observations improves 4DVAR incremental performance at eddy-permitting resolution. Then, in the context of the OSTST and FP7 SANGOMA projects, an ensemble DA experiment based on the same model and observational datasets has been realized (see poster by Brasseur et al.). This work offers the opportunity to compare the efficiency, pros and cons of both DA methods in the context of Ka-band altimetric data, at a spatial resolution commonly used today for research and operational applications. In this poster we will present the validation plan proposed to evaluate the skill of the variational experiment versus ensemble assimilation experiments covering the same period, using independent observations (e.g. from the CryoSat-2 mission).

  16. Warm winter, thin ice?

    NASA Astrophysics Data System (ADS)

    Stroeve, Julienne C.; Schroder, David; Tsamados, Michel; Feltham, Daniel

    2018-05-01

    Winter 2016/2017 saw record warmth over the Arctic Ocean, leading to the lowest number of freezing degree days north of 70° N since at least 1979. The impact of this warmth was evaluated using model simulations from the Los Alamos sea ice model (CICE) and CryoSat-2 thickness estimates from three different data providers. While CICE simulations show a broad region of anomalously thin ice in April 2017 relative to the 2011-2017 mean, analysis of three CryoSat-2 products shows more limited regions with thin ice, and the products do not always agree with each other, in both the magnitude and direction of thickness anomalies. CICE is further used to diagnose feedback processes driving the observed anomalies, showing 11-13 cm reduced thermodynamic ice growth over the Arctic domain used in this study compared to the 2011-2017 mean, and dynamical contributions of +1 to +4 cm. Finally, CICE model simulations from 1985 to 2017 indicate that the negative feedback relationship between ice growth and winter air temperatures may be starting to weaken, showing decreased winter ice growth since 2012 as winter air temperatures have increased and the freeze-up has been further delayed.
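
    A minimal sketch of the freezing-degree-day metric referenced above, computed from a synthetic daily 2 m air temperature series; the 0 °C reference and the toy data are assumptions for illustration only.

      import numpy as np

      # Hypothetical daily-mean 2 m air temperatures (deg C) for one winter season.
      rng = np.random.default_rng(4)
      t2m = rng.normal(loc=-20.0, scale=8.0, size=181)   # roughly Oct-Mar, 181 days

      # Freezing degree days: accumulated degrees below the freezing point.
      T_FREEZE = 0.0
      fdd = np.sum(np.clip(T_FREEZE - t2m, 0.0, None))
      print(f"freezing degree days: {fdd:.0f} deg C day")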

  17. A new digital elevation model of Antarctica derived from CryoSat-2 altimetry

    NASA Astrophysics Data System (ADS)

    Slater, Thomas; Shepherd, Andrew; McMillan, Malcolm; Muir, Alan; Gilbert, Lin; Hogg, Anna E.; Konrad, Hannes; Parrinello, Tommaso

    2018-05-01

    We present a new digital elevation model (DEM) of the Antarctic ice sheet and ice shelves based on 2.5 × 108 observations recorded by the CryoSat-2 satellite radar altimeter between July 2010 and July 2016. The DEM is formed from spatio-temporal fits to elevation measurements accumulated within 1, 2, and 5 km grid cells, and is posted at the modal resolution of 1 km. Altogether, 94 % of the grounded ice sheet and 98 % of the floating ice shelves are observed, and the remaining grid cells north of 88° S are interpolated using ordinary kriging. The median and root mean square difference between the DEM and 2.3 × 107 airborne laser altimeter measurements acquired during NASA Operation IceBridge campaigns are -0.30 and 13.50 m, respectively. The DEM uncertainty rises in regions of high slope, especially where elevation measurements were acquired in low-resolution mode; taking this into account, we estimate the average accuracy to be 9.5 m - a value that is comparable to or better than that of other models derived from satellite radar and laser altimetry.

  18. Arctic lead detection using a waveform mixture algorithm from CryoSat-2 data

    NASA Astrophysics Data System (ADS)

    Lee, Sanggyun; Kim, Hyun-cheol; Im, Jungho

    2018-05-01

    We propose a waveform mixture algorithm to detect leads from CryoSat-2 data, which is novel and different from the existing threshold-based lead detection methods. The waveform mixture algorithm adopts the concept of spectral mixture analysis, which is widely used in the field of hyperspectral image analysis. This lead detection method was evaluated with high-resolution (250 m) MODIS images and showed comparable and promising performance in detecting leads when compared to the previous methods. The robustness of the proposed approach also lies in the fact that it does not require the rescaling of parameters (i.e., stack standard deviation, stack skewness, stack kurtosis, pulse peakiness, and backscatter σ0), as it directly uses L1B waveform data, unlike the existing threshold-based methods. Monthly lead fraction maps were produced by the waveform mixture algorithm, which shows interannual variability of recent sea ice cover during 2011-2016, excluding the summer season (i.e., June to September). We also compared the lead fraction maps to other lead fraction maps generated from previously published data sets, resulting in similar spatiotemporal patterns.

  19. Orbit Determination of KOMPSAT-1 and Cryosat-2 Satellites Using Optical Wide-field Patrol Network (OWL-Net) Data with Batch Least Squares Filter

    NASA Astrophysics Data System (ADS)

    Lee, Eunji; Park, Sang-Young; Shin, Bumjoon; Cho, Sungki; Choi, Eun-Jung; Jo, Junghyun; Park, Jang-Hyun

    2017-03-01

    The optical wide-field patrol network (OWL-Net) is a Korean optical surveillance system that tracks and monitors domestic satellites. In this study, a batch least squares algorithm was developed for optical measurements and verified by Monte Carlo simulation and covariance analysis. Potential error sources of OWL-Net, such as noise, bias, and clock errors, were analyzed. There is a linear relation between the estimation accuracy and the noise level, and the accuracy significantly depends on the declination bias. In addition, the time-tagging error significantly degrades the observation accuracy, while the time-synchronization offset corresponds to the orbital motion. The Cartesian state vector and measurement bias were determined using the OWL-Net tracking data of the KOMPSAT-1 and Cryosat-2 satellites. The comparison with known orbital information based on two-line elements (TLE) and the consolidated prediction format (CPF) shows that the orbit determination accuracy is similar to that of TLE. Furthermore, the precision and accuracy of OWL-Net observation data were determined to be tens of arcsec and sub-degree level, respectively.
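
    A minimal sketch of the batch least-squares step under simplified assumptions: a linearized observation model H, an observation vector y and a weight matrix W combined into normal equations to estimate a state correction. It ignores the orbit dynamics, measurement partials and bias states of the actual OWL-Net estimator.

      import numpy as np

      # Hypothetical linearized problem: y = H @ dx + noise, with weights W.
      rng = np.random.default_rng(5)
      H = rng.normal(size=(100, 6))             # partials of observations w.r.t. state
      dx_true = np.array([10.0, -5.0, 3.0, 0.1, -0.2, 0.05])
      y = H @ dx_true + rng.normal(0, 0.01, 100)
      W = np.eye(100) / 0.01**2                 # weight = 1 / sigma^2

      # Batch least squares via the normal equations.
      N = H.T @ W @ H                           # normal matrix
      b = H.T @ W @ y
      dx_hat = np.linalg.solve(N, b)            # estimated state correction
      cov = np.linalg.inv(N)                    # formal covariance
      print(dx_hat, np.sqrt(np.diag(cov)))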

  20. Parameters Comparison of Leads Detection in Arctic Sea Ice Using CRYOSAT-2 Waveform Data

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, S.; Xiao, F.; Zhu, C.; Zhang, Y.; Zhu, T.; Yuan, L.

    2018-04-01

    Leads are only a small part of the polar sea ice structure, but they play a dominant role in the turbulent exchange between the ocean and the atmosphere, and they are also important factors for sea ice thickness inversion. Since the early 2000s, satellite altimetry has been applied to monitor Arctic sea ice thickness, and satellite altimetry data can be used to distinguish leads from sea ice. In this paper, four parameters, pulse peakiness (PP), stack standard deviation (SSD), stack kurtosis (SKU) and stack skewness (SSK), are extracted from CryoSat-2 satellite altimetry waveform data. The four parameters are combined into five combinations (PP, PP&SSD, PP&SSD&SKU, PP&SSD&SSK, PP&SSD&SSK&SKU) with constraint conditions to detect the leads. The results of the five methods are compared with MODIS (Moderate Resolution Imaging Spectroradiometer) images and show that the combination of PP&SSD is better than PP alone, while the remaining combinations perform the same as PP&SSD. It turns out that there is no improvement when SSK and SKU are added, successively or simultaneously.
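
    A minimal sketch of the kind of threshold combination compared in the study; the threshold values and the synthetic parameter values are placeholders, since the paper's actual constraint conditions are not reproduced here.

      import numpy as np

      # Hypothetical waveform parameters for a handful of CryoSat-2 records.
      pp  = np.array([0.90, 0.10, 0.70, 0.05])   # pulse peakiness
      ssd = np.array([2.0, 9.0, 3.0, 12.0])      # stack standard deviation
      sku = np.array([40.0, 5.0, 25.0, 3.0])     # stack kurtosis
      ssk = np.array([6.0, 1.0, 4.0, 0.5])       # stack skewness

      # Placeholder thresholds: leads give peaky, specular returns with low SSD.
      PP_MIN, SSD_MAX, SKU_MIN, SSK_MIN = 0.3, 4.0, 20.0, 3.0

      lead_pp         = pp > PP_MIN
      lead_pp_ssd     = lead_pp & (ssd < SSD_MAX)
      lead_pp_ssd_sku = lead_pp_ssd & (sku > SKU_MIN)
      lead_pp_ssd_ssk = lead_pp_ssd & (ssk > SSK_MIN)
      print(lead_pp, lead_pp_ssd, lead_pp_ssd_sku, lead_pp_ssd_ssk)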

  1. Coastal Sea Level along the North Eastern Atlantic Shelf from Delay Doppler Altimetry

    NASA Astrophysics Data System (ADS)

    Fenoglio-Marc, L.; Benveniste, J.; Andersen, O. B.; Gravelle, M.; Dinardo, S.; Uebbing, B.; Scharroo, R.; Kusche, J.; Kern, M.; Buchhaupt, C.

    2017-12-01

    Satellite altimetry data of the CryoSat-2 and Sentinel-3 missions processed with the Delay Doppler methodology (DDA) provide improved coastal sea level measurements up to 2-4 km from the coast, thanks to an along-track resolution of about 300 m and a higher signal-to-noise ratio. We investigate the 10 km strip along the North-Eastern Atlantic shelf from Lisbon to Bergen to detect the possible impacts of this enhanced dataset in sea level change studies. We consider SAR CryoSat-2 and Sentinel-3 altimetry products from the ESA GPOD processor and in-house reduced SAR altimetry (RDSAR) products. Improved processing in RDSAR includes the application of enhanced retrackers to the RDSAR waveform. Improved processing in SAR includes modifications both in the generation of the SAR waveforms (such as a Hamming weighting window on the burst data prior to the azimuth FFT, zero-padding prior to the range FFT, and a doubling of the extension of the radar range swath) and in the SAMOSA2 retracker. Data cover the full lifetime of CryoSat-2 (6 years) and Sentinel-3 (1 year). Conventional altimetry data are from the Sea Level CCI database. First we analyse the impact of these SAR altimeter data on the sea level trend and on the estimation of vertical land motion from the altimeter minus tide gauge differences. VLM along the North-Eastern Atlantic shelf is generally small compared to the North-Western Atlantic coast VLM, with a smaller signal-to-noise ratio. Second we investigate the impact on the coastal mean sea surface and the mean dynamic topography. We evaluate a mean surface from the new altimeter data to be combined with state-of-the-art geoid models to derive the mean dynamic topography. We compare the results to existing oceanographic and geodetic mean dynamic topography solutions, both on a grid and pointwise at the tide gauge stations. This study is supported by ESA through the Sea Level CCI and the GOCE++DYCOT projects.
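
    Two of the SAR processing modifications listed above (Hamming weighting of the burst before the azimuth FFT, and zero-padding before the range FFT) can be illustrated in a few lines; the burst dimensions and data below are synthetic, and this is a sketch of the general idea rather than the GPOD processor code.

      import numpy as np

      rng = np.random.default_rng(6)
      burst = rng.normal(size=(64, 128)) + 1j * rng.normal(size=(64, 128))

      # Hamming weighting along the burst (azimuth) dimension before the azimuth
      # FFT reduces Doppler side lobes.
      weighted = burst * np.hamming(64)[:, None]
      az_beams = np.fft.fft(weighted, axis=0)

      # Zero-padding by a factor of 2 before the range FFT doubles the sampling
      # of the range waveform (finer range bins, unchanged resolution).
      range_spectrum = np.fft.fft(az_beams, n=256, axis=1)
      print(range_spectrum.shape)    # (64, 256)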

  2. Mapping the Antarctic grounding line with CryoSat-2 radar altimetry

    NASA Astrophysics Data System (ADS)

    Bamber, J. L.; Dawson, G. J.

    2017-12-01

    The grounding line, where grounded ice begins to float, is the boundary at which the ocean has the greatest influence on the ice sheet. Its position and dynamics are critical in assessing the stability of the ice sheet, for mass budget calculations and as an input into numerical models. The most reliable approaches to map the grounding line remotely are to measure the limit of tidal flexure of the ice shelf using differential synthetic aperture radar interferometry (DInSAR) or ICESat repeat-track measurements. However, these methods are yet to provide satisfactory spatial and temporal coverage of the whole of the Antarctic grounding zone. It has not been possible to use conventional radar altimetry to map the limit of tidal flexure of the ice shelf because it performs poorly near breaks in slope, commonly associated with the grounding zone. The synthetic aperture radar interferometric (SARIn) mode of CryoSat-2 performs better over the steeper margins of the ice sheet and allows us to achieve this. The SARIn mode combines "delay Doppler" processing with a cross-track interferometer, and enables us to use elevations based on the first return (point of closest approach or POCA) and "swath processed" elevations derived from the time-delayed waveform beyond the first return, to significantly improve coverage. Here, we present a new method to map the limit of tidal motion from a combination of POCA and swath data. We test this new method on the Siple Coast region of the Ross Ice Shelf, and the mapped grounding line is in good agreement with previous observations from DInSAR and ICESat measurements. There is, however, an approximately constant seaward offset between these methods and ours, which we believe is due to the poorer precision of CryoSat-2. This new method has improved the coverage of the grounding zone across the Siple Coast, and can be applied to the rest of Antarctica.

  3. CryoSat: ready to launch (again)

    NASA Astrophysics Data System (ADS)

    Francis, R.; Wingham, D.; Cullen, R.

    2009-12-01

    Over the last ten years the relationship between climate change and the cryosphere has become increasingly important. Evidence of change in the polar regions is widespread, and the subject of public discussion. During this same ten years ESA has been preparing its CryoSat mission, specifically designed to provide measurements to determine the overall change in the mass balance of all of the ice caps and the change in the volume of sea ice (rather than simply its extent). In fact the mission was ready for launch in October 2005, but a failure in the launch vehicle led to the loss of the satellite some 6 minutes after launch. The determination to rebuild the satellite and complete the mission was widespread in the relevant scientific, industrial and political entities, and the decision to redirect financial resources to the rebuild was sealed by a scientific report confirming that the mission was even more important in 2005 than at its original selection in 1999. The evolution of the cryosphere since then has emphasised that conclusion. In order to make a meaningful measurement of the secular change of the surface elevation of ice caps and the thickness of sea ice, the accuracy required has been specified as about half of the variation expected due to natural variability, over reasonable scales for the surfaces concerned. The selected technique is radar altimetry. Previous altimeter missions have pioneered the method: the CryoSat instrument has been modified to provide the enhanced capabilities needed to significantly extend the spatial coverage of these earlier missions. Thus the radar includes a synthetic aperture mode which enables the along-track resolution to be improved to about 250 m. This will allow detection of leads in sea ice which are narrower than those detected hitherto, so that operation deeper into pack ice can be achieved with a consequent reduction in errors due to omission. Altimetry over the steep edges of ice caps is hampered by the irregular topography which, since the radar ranging is performed to the closest reflector rather than the point directly below, introduces uncertainty into the exactitude of repeat measurements. CryoSat's radar includes a second antenna and receiver chain so that interferometry may be used to determine the arrival angle of the echo and so improve localisation of the reflection. The new satellite was approved in late February 2006, less than 6 months after the failure, and development started almost immediately. In September 2009 the development was completed and the satellite placed into storage awaiting a launch vehicle: the launch, using a Dnepr vehicle (a converted SS-18 ICBM), is anticipated in late February 2010.

  4. CryoSat swath altimetry to measure ice cap and glacier surface elevation change

    NASA Astrophysics Data System (ADS)

    Tepes, P.; Gourmelen, N.; Escorihuela, M. J.; Wuite, J.; Nagler, T.; Foresta, L.; Brockley, D.; Baker, S.; Roca, M.; Shepherd, A.; Plummer, S.

    2016-12-01

    Satellite altimetry has been used extensively in the past few decades to observe changes affecting large and remote regions covered by land ice, such as the Greenland and Antarctic ice sheets. Glaciers and ice caps have been studied less extensively due to the limitations of altimetry over complex topography. However, their role in current sea-level budgets is significant and is expected to continue over the next century and beyond (Gardner et al., 2011), particularly in the Arctic, where mean annual surface temperatures have recently been increasing twice as fast as the global average (Screen and Simmonds, 2010). Radar altimetry is well suited to monitoring elevation changes over land ice due to its all-weather, year-round capability of observing ice surfaces. Since 2010, the Synthetic Interferometric Radar Altimeter (SIRAL) on board the European Space Agency (ESA) radar altimetry CryoSat (CS) mission has been collecting ice elevation measurements over glaciers and ice caps. Its Synthetic Aperture Radar Interferometric (SARIn) processing feature reduces the size of the footprint along-track and locates the across-track origin of a surface reflector in the presence of a slope. This offers new perspectives for the measurement of regions marked by complex topography. More recently, data from the CS SARIn mode have been used to infer elevation beyond the point of closest approach (POCA) with a novel approach known as "swath processing" (Hawley et al., 2009; Gray et al., 2013; Christie et al., 2016; Smith et al., 2016). Together with the denser ground-track spacing of the CS mission, the swath processing technique provides unprecedented spatial coverage and resolution for spaceborne altimetry, enabling the study of key processes that underlie current changes of ice caps and glaciers. In this study, we use CS swath observations to generate maps of ice elevation change for selected ice caps and glaciers. We present a validation exercise and discuss the benefit of swath processing for assessing changes in glaciers and ice caps and their contribution to changes in sea level.

  5. Variability of Decimetre and Centimetre Scale Ice Surface Roughness and the Potential Consequences on the CryoSat Radar Altimeter Signal

    NASA Astrophysics Data System (ADS)

    Cawkwell, F. G.; Burgess, D. O.; Sharp, M. J.; Demuth, M.

    2004-12-01

    Snow and ice surface roughness affect the backscatter of the pulse emitted by a radar altimeter, and hence the accuracy of the surface elevation calculated from the waveform echo, but the influence of surface roughness has not been quantified. As part of the CryoSat calibration/validation field campaigns on the Devon Ice Cap in 2004, surface roughness measurements were made at 0.1-7 km intervals along a 48 km transect from near the summit to the southern margin. Measurements were made at the decimetre scale by surveying and at the centimetre scale using digital photography. The data collected were subjected to wavelet analysis to define characteristic roughness wavelengths, and the fractal dimension associated with each of these was calculated using the semi-variogram method. Vario functions were calculated for the photographic data. The survey results show that the wavelength scales depend on orientation and distance from the ice cap summit, that the fractal dimension depends on the wavelength scale and the orientation, and that both are significantly affected by storm events. Profiles aligned with the easterly prevailing wind direction, and thus perpendicular to the predicted satellite track, proved to be more sensitive to meteorological events than those normal to the dominant winds. Wavelet and fractal analysis of the photographic data was less conclusive, potentially due to the 'noisier' nature of the data at this scale, where 'noise' is actually the superimposition of small-scale wavelengths onto larger ones. Vario analysis showed the characteristic wavelengths at the centimetre scale to increase with distance from the summit, although the abrading effect of storm events caused a decrease in wavelength. The amplitude of the roughness also increases with distance from the summit, although following a period of calm this value is significantly decreased along the transect. Orientation with respect to the prevailing wind direction is also a significant factor. Analysis of the return waveforms acquired by an airborne radar altimeter concurrently with the ground data will allow the impact of the different roughness scales and orientations to be assessed.
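
    A minimal sketch of the empirical semi-variogram calculation used in this kind of fractal-dimension analysis, applied to a synthetic 1-D roughness profile; the profile, lag spacing and self-affine assumption are illustrative, not the campaign data or processing.

      import numpy as np

      # Hypothetical 1-D surface roughness profile sampled every 0.1 m.
      dx = 0.1
      rng = np.random.default_rng(8)
      z = np.cumsum(rng.normal(0.0, 0.01, 1000))       # synthetic rough surface (m)

      # Empirical semi-variogram: gamma(h) = 0.5 * mean((z(x+h) - z(x))^2).
      lags = np.arange(1, 101)                         # lags from 0.1 m to 10 m
      gamma = np.array([0.5 * np.mean((z[k:] - z[:-k]) ** 2) for k in lags])

      # For a self-affine profile, log(gamma) vs. log(h) is linear with slope 2H,
      # and the fractal dimension of the 1-D profile is D = 2 - H.
      slope, _ = np.polyfit(np.log(lags * dx), np.log(gamma), 1)
      H = slope / 2.0
      print(f"Hurst exponent {H:.2f}, fractal dimension {2.0 - H:.2f}")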

  6. Airborne and ground based measurements in McMurdo Sound, Antarctica, for the validation of satellite derived ice thickness

    NASA Astrophysics Data System (ADS)

    Rack, Wolfgang; Haas, Christian; Langhorne, Pat; Leonard, Greg; Price, Dan; Barnsdale, Kelvin; Soltanzadeh, Iman

    2014-05-01

    Melting and freezing processes in the ice shelf cavities of the Ross and McMurdo Ice Shelves significantly influence sea ice formation in McMurdo Sound. Between 2009 and 2013 we used a helicopter-borne laser and electromagnetic induction sounder (EM bird) to measure thickness and freeboard profiles across the ice shelf and the landfast sea ice, accompanied by extensive field validation and coordinated with satellite altimeter overpasses. Using freeboard and thickness, the bulk density of all ice types was calculated assuming hydrostatic equilibrium. Significant density steps were detected between first-year and multi-year sea ice, with higher values for the younger sea ice. Values are overestimated in areas with an abundance of sub-ice platelets because of overestimation in both ice thickness and freeboard. On the ice shelf, bulk ice densities were sometimes higher than that of pure ice, which can be explained by both the accretion of marine ice and glacial sediments. For thin ice, the freeboard-to-thickness conversion critically depends on knowledge of the snow properties. Our measurements allow tuning and validation of snow cover simulations using the Weather Research and Forecasting (WRF) model. The simulated snow cover is used to calculate ice thickness from satellite-derived freeboard. The results of our measurements, which are supported by the New Zealand Antarctic programme, draw a picture of how oceanographic processes influence the ice shelf morphology and sea ice formation in McMurdo Sound, and how satellite-derived freeboard from ICESat and CryoSat, together with information on snow cover, can potentially capture the signature of these processes.

  7. Seasonal sea ice predictions for the Arctic based on assimilation of remotely sensed observations

    NASA Astrophysics Data System (ADS)

    Kauker, F.; Kaminski, T.; Ricker, R.; Toudal-Pedersen, L.; Dybkjaer, G.; Melsheimer, C.; Eastwood, S.; Sumata, H.; Karcher, M.; Gerdes, R.

    2015-10-01

    The recent thinning and shrinking of the Arctic sea ice cover has increased interest in seasonal sea ice forecasts. Typical tools for such forecasts are numerical models of the coupled ocean-sea ice system, such as the North Atlantic/Arctic Ocean Sea Ice Model (NAOSIM). The model uses as input the initial state of the system and the atmospheric boundary conditions over the forecasting period. This study investigates the potential of remotely sensed ice thickness observations in constraining the initial model state. For this purpose it employs a variational assimilation system around NAOSIM and the Alfred Wegener Institute's CryoSat-2 ice thickness product, in conjunction with the University of Bremen's snow depth product and the OSI SAF ice concentration and sea surface temperature products. We investigate the skill of predictions of the summer ice conditions starting in March for three different years. Straightforward assimilation of the above combination of data streams results in slight improvements over some regions (especially in the Beaufort Sea) but degrades the overall fit to independent observations. A considerable enhancement of forecast skill is demonstrated for a bias correction scheme for the CryoSat-2 ice thickness product that uses a spatially varying scaling factor.

  8. Ku/Ka band observations over polar ice sheets

    NASA Astrophysics Data System (ADS)

    Thibaut, Pierre; Lasne, Yannick; Guillot, Amandine; Picot, Nicolas; Rémy, Frédérique

    2015-04-01

    For the first time, comparisons between Ku- and Ka-band altimeter measurements are possible thanks to the new AltiKa instrument embarked on board the Saral mission launched on February 25, 2013. This comparison is of particular interest when dealing with ice sheet observations because the two frequencies have different penetration characteristics. We propose in this paper to revisit the estimation of the ice sheet topography (and other related parameters) with altimeter systems and to present illustrations of the differences observed in Ku and Ka bands using AltiKa, Envisat/RA-2 and also CryoSat-2 measurements. Working on AltiKa waveforms in the frame of the PEACHI project has allowed us to better understand the impact of the penetration depth on the echo shape, to improve the estimation algorithm and to compare its output with historical results obtained from the Envisat and ERS missions. In particular, analyses at cross-overs of the CryoSat-2 and Saral data will be presented. The Sentinel-3 mission should be launched during 2015. Since it will operate in Ku band and in delay/Doppler mode, it will be crucial to account for penetration effects in order to accurately derive the ice sheet heights and trends. The results of the work presented here will benefit the Sentinel-3 mission.

  9. Glacier mass variations from recent ITSG-Grace solutions: Experiences with the point-mass modeling technique in the framework of project SPICE.

    NASA Astrophysics Data System (ADS)

    Reimond, S.; Klinger, B.; Krauss, S.; Mayer-Gürr, T.; Eicker, A.; Zemp, M.

    2017-12-01

    In recent years, remotely sensed observations have become one of the most ubiquitous and valuable sources of information for glacier monitoring. In addition to altimetry and interferometry data (as observed, e.g., by the CryoSat-2 and TanDEM-X satellites), time-variable gravity field data from the GRACE satellite mission has been used by several authors to assess mass changes in glacier systems. The main challenges in this context are i) the limited spatial resolution of GRACE, ii) the attenuation of the gravity signal in space and iii) the problem of isolating the glaciological signal from the gravitational signatures detected by GRACE. In order to tackle challenges i) and ii), we thoroughly investigate the point-mass modeling technique to represent the local gravity field. Instead of simply evaluating global spherical harmonics, we operate on the normal equation level and make use of GRACE K-band ranging data (available since April 2002) processed at the Graz University of Technology. Assessing such small-scale mass changes from space-borne gravimetric data is an ill-posed problem, which we aim to stabilize by utilizing a Tikhonov regularization based on a genetic algorithm. Concerning issue iii), we evaluate three different hydrology models (i.e. GLDAS, LSDM and WGHM) for validation purposes and for the derivation of error bounds. The non-glaciological signal is calculated for each region of interest and reduced from the GRACE results. We present mass variations of several alpine glacier systems (e.g. the European Alps, Svalbard or Iceland) and compare our results to glaciological observations provided by the World Glacier Monitoring Service (WGMS) and to alternative inversion methods (surface density modeling).
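
    A minimal sketch of a Tikhonov-regularized least-squares solution of the kind used to stabilize such ill-posed inversions; the design matrix is random here and the regularization parameter is fixed rather than chosen by the genetic-algorithm search described in the abstract.

      import numpy as np

      rng = np.random.default_rng(7)
      A = rng.normal(size=(60, 30))            # observations vs. point-mass parameters (hypothetical)
      m_true = rng.normal(size=30)
      y = A @ m_true + rng.normal(0, 0.5, 60)  # noisy synthetic observations

      # Tikhonov regularization: minimize ||A m - y||^2 + alpha * ||m||^2.
      alpha = 1.0                              # fixed here; GA-selected in the study
      m_hat = np.linalg.solve(A.T @ A + alpha * np.eye(30), A.T @ y)
      print(np.linalg.norm(m_hat - m_true))    # misfit of the regularized estimate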

  10. First Assessments of Predicted ICESat-2 Performance Using Aircraft Data

    NASA Technical Reports Server (NTRS)

    Neumann, Thomas; Markus, Thorsten; Cook, William; Hancock, David; Brenner, Anita; Kelly, Brunt; DeMarco, Eugenia; Reed, Daniel; Walsh, Kaitlin

    2012-01-01

    The Ice, Cloud, and land Elevation Satellite-2 (ICESat-2) is a next-generation laser altimeter designed to continue key observations of ice sheet elevation change, sea ice freeboard, vegetation canopy height, earth surface elevation, and sea surface height. Scheduled for launch in mid-2016, ICESat-2 will use a high repetition rate (10 kHz), small footprint (10 m nominal ground diameter) laser, and a single-photon-sensitive detection strategy (photon counting) to measure precise range to the earth's surface. Using green light (532 nm), the six beams of ICESat-2 will provide improved spatial coverage compared with the single beam of ICESat, while the differences in transmit energy among the beams provide a large dynamic range. The six beams are arranged into three pairs of beams, which allow slopes to be measured on an orbit-by-orbit basis. In order to evaluate models of predicted ICESat-2 performance and provide ICESat-2-like data for algorithm development, an airborne ICESat-2 simulator was developed and first flown in 2010. This simulator, the Multiple Altimeter Beam Experimental Lidar (MABEL), was most recently deployed to Iceland in April 2012 and collected approx 85 hours of science data over land ice, sea ice, and calibration targets. MABEL uses a similar photon-counting measurement strategy to what will be used on ICESat-2. MABEL collects data in 16 green channels and an additional 8 channels in the infrared aligned across the direction of flight. By using NASA's ER-2 aircraft flying at 20 km altitude, MABEL flies as close to space as is practical, and collects data through approx 95% of the atmosphere. We present background on the MABEL instrument, and data from the April 2012 deployment to Iceland. Among the 13 MABEL flights, we collected data over the Greenland ice sheet interior and outlet glaciers in southwest and western Greenland, sea ice data over the Nares Strait and Greenland Sea, and a number of small glaciers and ice caps in Iceland and Svalbard. Several of the flights were coincident in time and space with NASA's Operation IceBridge, which provides an independent data set for validation. MABEL also collected data along CryoSat track 10482 in north central Greenland approximately one month after CryoSat passed overhead.

  11. Characterising and improving the performance of the Sentinel-3 SRAL altimeter: A Report from SCOOP, SHAPE & SPICE Projects

    NASA Astrophysics Data System (ADS)

    Restano, Marco; Ambrózio, Américo; Cotton, David; Scoop Team; Fabry, Pierre; Shape Team; McMillan, Malcolm; Spice Team; Benveniste, Jérôme

    2017-04-01

    Under the ESA Scientific Exploitation of Operational Missions (SEOM) Programme, three projects are currently underway to accurately characterise and improve the performance of the Sentinel-3 SRAL SAR mode altimeter. They are: 1) SCOOP (SAR Altimetry Coastal & Open Ocean Performance Exploitation and Roadmap Study) for Coastal and Open Ocean; 2) SHAPE (Sentinel-3 Hydrologic Altimetry PrototypE) for Inland Water; 3) SPICE (Sentinel-3 Performance improvement for ICE sheets) for Ice Sheets. As the projects started before the launch of Sentinel-3 (a full SAR mission), calibrated CryoSat-2 data have been used as input to a processor replicating the Sentinel-3 baseline processing. For the SCOOP project, a first test dataset, including data from 10 regions of interest, has been released to end users. The successful SAMOSA retracker, adopted in the previous CP4O Project (CryoSat Plus for Oceans), has been readapted to re-track Sentinel-3 waveforms. An improved version of SAMOSA will be released at the end of the project. The SHAPE project is working towards the design and assessment of alternative/innovative techniques not implemented in the Sentinel-3 ground segment (which performs no dedicated inland water processing). Both rivers and lakes will be studied. The Amazon, Brahmaputra and Danube have been selected as rivers, whereas Titicaca and Vanern have been chosen as lakes. The study will include the assimilation of output products into hydrological models for all regions of interest. A final dataset will be provided to end users. The SPICE project is addressing four high level objectives: 1) Assess and improve the Delay-Doppler altimeter processing for ice sheets. 2) Assess and develop SAR waveform retrackers for ice sheets. 3) Evaluate the performance of SAR altimetry relative to conventional pulse-limited altimetry. 4) Assess the impact on SAR altimeter measurements of radar wave interaction with the snowpack. Datasets used for validation include ICESat and IceBridge products. Vostok, Dome C and the Spirit Sector (all located in Antarctica) have been selected, along with the Russell Glacier in Greenland, as regions of interest. In the frame of both the SCOOP and SHAPE projects, improved wet troposphere corrections will be estimated for all regions of interest.

  12. DPOD2014: a new DORIS extension of ITRF2014 for Precise Orbit Determination

    NASA Astrophysics Data System (ADS)

    Moreaux, G.; Willis, P.; Lemoine, F. G.; Zelensky, N. P.

    2016-12-01

    As one of the tracking systems used to determine the orbits of altimeter mission satellites (such as TOPEX/Poseidon, Envisat, Jason-1/2/3 & CryoSat-2), the positions of the DORIS tracking stations provide a fundamental reference for the estimation of the precise orbits and so, by extension, for the quality of the altimeter data and derived products. Therefore, the time evolution of the position of both the existing and the newest DORIS stations must be precisely modeled and regularly updated. To satisfy operational requirements for precise orbit determination and routine delivery of geodetic products, the International DORIS Service maintains the so-called DPOD solutions, which can be seen as extensions of the latest available ITRF solution from the International Earth Rotation and Reference Systems Service (IERS). In mid-2016, the IDS agreed to change the processing strategy of the DPOD solution. The new solution from the IDS Combination Center (CC) consists of a DORIS cumulative position and velocity solution using the latest IDS combined weekly solutions. The first objective of this study is to describe the new DPOD elaboration scheme and to show the IDS CC internal validation steps. The second purpose is to present the external validation process performed by an external team before the new DPOD is made available to all users. The elaboration and validation procedures will be illustrated by the presentation of the first version of the DPOD2014 (ITRF2014 DORIS extension), and focus will be given to the update of the position and velocity of two DORIS sites: Everest (after the M7.8 Gorkha earthquake in April 2015) and Thule (Greenland).

  13. Early 21st-Century Mass loss of the North-Atlantic Glaciers and Ice Caps (Arne Richter Award for Outstanding Young Scientists Lecture)

    NASA Astrophysics Data System (ADS)

    Wouters, Bert; Ligtenberg, Stefan; Moholdt, Geir; Gardner, Alex S.; Noel, Brice; Kuipers Munneke, Peter; van den Broeke, Michiel; Bamber, Jonathan L.

    2016-04-01

    Historically, ice loss from mountain glaciers and ice caps has been one of the largest contributors to sea level rise over the last century. Of particular interest are the glaciers and ice caps in the North-Atlantic region of the Arctic. Despite the cold climate in this area, considerable melting and runoff occurs in summer. A small increase in temperature will have an immediate effect on these processes, so that a large change in the Arctic ice volume can be expected in response to the anticipated climate change in the coming century. Unfortunately, direct observations of glaciers are sparse and are biased toward glacier systems in accessible, mostly maritime, climate conditions. Remote sensing is therefore essential to monitor the state of the North-Atlantic glaciers and ice caps. In this presentation, we will discuss the progress that has been made in estimating the ice mass balance of these regions, with a particular focus on measurements made by ESA's Cryosat-2 radar altimeter mission (2010-present). Compared to earlier altimeter missions, Cryosat-2 provides unprecedented coverage of the cryosphere, with a resolution down to 1 km or better and sampling at monthly intervals. Combining the Cryosat-2 measurements with the laser altimetry data from ICESat (2003-2009) gives us a 12-year time series of glacial mass loss in the North Atlantic. We find excellent agreement between the altimetry measurements and independent observations by the GRACE mission, which directly 'weighs' the ice caps, albeit at a much lower resolution. Mass loss in the region has increased from 120 Gigatonnes per year in 2003-2009 to roughly 140 Gt/yr in 2010-2014, with an important contribution from Greenland's peripheral glaciers and ice caps. Importantly, the mass loss is not stationary, but shows large regional interannual variability, with mass loss shifting between eastern and western regions from year to year. Comparison with regional climate models shows that these shifts can be explained by changes in surface mass balance processes, highlighting the sensitivity of the glaciers and ice caps to changes in the atmospheric circulation and underscoring the need for long-term observations of the region.

  14. An updated 26-year (1991-2017) sea level record from the Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Kildegaard Rose, Stine; Baltazar Andersen, Ole; Passaro, Marcello; Benveniste, Jerome

    2017-04-01

    In recent years, there has been a large focus on the Arctic due to the rapid changes in the region. The sea level of the Arctic Ocean is an important climate indicator. Arctic sea ice is decreasing, and the decrease has steepened since 1997. Arctic sea level determination is challenging due to the seasonal to permanent sea ice cover, the limited regional coverage of satellites, the satellite instruments' ability to measure ice, insufficient geophysical models, residual orbit errors, and the challenging retracking of satellite altimeter data. We present the DTU/TUM 26-year sea level record based on satellite altimetry data in the Arctic Ocean from the ERS1 (1991) to CryoSat-2 (present) satellites. The sea level record is compared with several tide gauges and other available partial sea level records contributing to the ESA CCI Sea Level Initiative. We use updated geophysical corrections and a combination of altimeter data: REAPER (ERS1), the ALES+ retracker (ERS2, Envisat), and combined RADS and DTU's in-house retracker LARS (CryoSat-2). ALES+ is an upgraded version of the Adaptive Leading Edge Subwaveform Retracker that has been developed to improve data quality and quantity in the coastal ocean, without degrading the results in the open ocean. ALES+ aims at retracking peaky waveforms typical of lead reflections without modifying the fitting model used in the open ocean.
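
    For readers unfamiliar with retracking, the Python sketch below shows the simplest possible approach to locating a retracking point on a peaky, lead-type waveform: a threshold retracker. It is not the ALES+ algorithm (which fits a subwaveform with an ocean model); the gate values and threshold are illustrative.

      import numpy as np

      # Simple threshold retracker for a peaky (lead-like) waveform: find where
      # the leading edge first crosses a fraction of the peak power and
      # interpolate between gates. Illustrative only; not the ALES+ algorithm.
      def threshold_retrack(waveform, threshold=0.5):
          wf = np.asarray(waveform, dtype=float)
          level = threshold * wf.max()
          i = np.nonzero(wf >= level)[0][0]     # first gate at or above the level
          if i == 0:
              return 0.0
          return (i - 1) + (level - wf[i - 1]) / (wf[i] - wf[i - 1])

      # synthetic specular waveform: narrow peak around gate 40
      gates = np.arange(128)
      wf = np.exp(-0.5 * ((gates - 40) / 1.5) ** 2)
      print(threshold_retrack(wf))              # retracking point near gate 38.2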

  15. Dynamic ocean topography from CryoSat-2: examining recent changes in ice-ocean stress and advancing a theory for Beaufort Gyre stabilization

    NASA Astrophysics Data System (ADS)

    Dewey, S.; Morison, J.; Kwok, R.; Dickinson, S.; Morison, D.; Andersen, R.

    2017-12-01

    Model results and sparse observational evidence have shown the ocean current speed in the Beaufort Gyre to have increased and recently stabilized. However, full-basin altimetric observations of dynamic ocean topography (DOT) and ocean surface currents have yet to be applied to the dynamics of gyre stabilization. DOT fields from retracked CryoSat-2 retrievals in Arctic Ocean leads have enabled us to calculate 2-month average ocean geostrophic currents. These currents are crucial to accurately computing ice-ocean stress, especially because they have accelerated so that their speed rivals that of the overlying sea ice. Given these observations, we can shift our view of the Beaufort Gyre as a system in which the wind drives the ice and the ice drives a passive ocean to a system with the following feedback: after initial input of energy by wind, ice velocity decreases due to water drag and internal ice stress and the ocean drives the ice, reversing Ekman pumping and decelerating the gyre. This reversal changes the system from a persistently convergent regime to one in which freshwater is released from the gyre and doming of the gyre decreases, without any change in long-term average wind stress curl. Through these processes, the ice-ocean stress provides a key feedback in Beaufort Gyre stabilization.
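
    The step from DOT to surface geostrophic currents follows standard geostrophic balance; the Python sketch below applies it to a toy gridded DOT dome. Grid spacing, latitude and the DOT field are illustrative, not the 2-month CryoSat-2 fields used in the study.

      import numpy as np

      # Surface geostrophic currents from a gridded DOT field:
      #   u = -(g/f) dDOT/dy,  v = (g/f) dDOT/dx
      # Grid spacing, latitude and the toy DOT dome are illustrative.
      g = 9.81                       # m s-2
      omega = 7.2921e-5              # Earth rotation rate, rad s-1

      def geostrophic_currents(dot, dx, dy, lat_deg):
          f = 2.0 * omega * np.sin(np.radians(lat_deg))     # Coriolis parameter
          ddot_dy, ddot_dx = np.gradient(dot, dy, dx)
          return -(g / f) * ddot_dy, (g / f) * ddot_dx      # u, v

      x = np.arange(0.0, 500e3, 25e3)                       # 25 km grid
      X, Y = np.meshgrid(x, x)
      dot = 0.3 * np.exp(-((X - 250e3) ** 2 + (Y - 250e3) ** 2) / 150e3 ** 2)
      u, v = geostrophic_currents(dot, 25e3, 25e3, 75.0)
      print(float(np.abs(u).max()), float(np.abs(v).max()))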

  16. Grounding Line Detection Using Landsat-8 OLI and CryoSat-2 Data Fusion

    NASA Astrophysics Data System (ADS)

    Li, F.; Guo, Y.; Zhang, Y.; Zhang, S.

    2018-04-01

    The grounding zone is the region where ice transitions from grounded ice sheet to freely floating ice shelf; the grounding line is in fact a zone, typically several kilometers wide. Mass loss from Antarctica is strongly linked to changes in the ice shelves and their grounding lines, since variations in the grounding line can result in very rapid changes in glacier and ice-shelf behavior. Based on remote sensing observations, five Antarctic-wide grounding line products have been released internationally, including MOA, ASAID, ICESat, MEaSUREs, and the Synthesized grounding lines. However, none of these products provides annual grounding lines for the whole of Antarctica, some are no longer updated, and this limits time-series analysis of the Antarctic mass balance to a certain extent. Besides, the accuracy of grounding line products based on a single remote-sensing data source is far from satisfactory. Therefore, we use algorithms to extract grounding lines from SAR and CryoSat-2 data respectively and combine the two sets of results into a new product. This yields a mature grounding line extraction workflow that will allow annual Antarctic grounding lines to be derived in the future. The comparison between the fusion results and the MOA product indicates a maximum deviation of 188.67 meters between the two.

  17. Re-assessment of the mass balance of the Abbot and Getz sectors of West Antarctica

    NASA Astrophysics Data System (ADS)

    Chuter, S.; Bamber, J. L.

    2016-12-01

    Large discrepancies exist in mass balance estimates for the Getz and Abbot drainage basins, primarily due to previous poor knowledge of ice thickness at the grounding line, poor coverage by previous altimetry missions and signal leakage issues for GRACE. Large errors arise when using ice thickness measurements derived from ERS-1 and/or ICESat altimetry data due to poor track spacing, `loss of lock' issues near the grounding line and the complex morphology of these shelves, requiring fine resolution to derive robust and accurate elevations close to the grounding line. However, the advent of CryoSat-2 with its unique orbit and SARIn mode of operation has overcome these issues and enabled the determination of ice shelf thickness at a much higher accuracy than possible from previous satellites, particularly within the grounding zone. Here we present a contemporary estimate of ice sheet mass balance for both the Getz and Abbot drainage basins. This is achieved through the use of contemporary velocity data derived from Landsat feature tracking and of CryoSat-2 derived ice thickness measurements. Additionally, we use this new ice thickness dataset to reassess mass balance estimates from 2008/2009, where there were large disparities between results from radar altimetry and Input-Output methodologies over the Abbot region in particular. These contemporary results are compared with other present-day estimates from gravimetry and altimetry elevation changes.

  18. Towards decadal time series of Arctic and Antarctic sea ice thickness from radar altimetry

    NASA Astrophysics Data System (ADS)

    Hendricks, S.; Rinne, E. J.; Paul, S.; Ricker, R.; Skourup, H.; Kern, S.; Sandven, S.

    2016-12-01

    The CryoSat-2 mission has demonstrated the value of radar altimetry to assess the interannual variability and short-term trends of Arctic sea ice over the existing observational record of 6 winter seasons. CryoSat-2 is a particularly successful mission for sea ice mass balance assessment due to its novel radar altimeter concept and orbit configuration, but radar altimetry data have been available since 1993 from the ERS-1/2 and Envisat missions. Combining these datasets promises a decadal climate data record of sea ice thickness, but inter-mission biases must be taken into account due to the evolution of radar altimeters and the impact of changing sea ice conditions on retrieval algorithm parametrizations. The ESA Climate Change Initiative on Sea Ice aims to extend the list of data records for Essential Climate Variables (ECVs) with a consistent time series of sea ice thickness from available radar altimeter data. We report on the progress of the algorithm development and choices for auxiliary data sets for sea ice thickness retrieval in the Arctic and Antarctic Oceans. Particular challenges are the classification of surface types and freeboard retrieval based on radar waveforms with significantly varying footprint sizes. In addition, auxiliary data sets, e.g. for snow depth, are far less developed in the Antarctic, and we will discuss the expected skill of the sea ice thickness ECVs in both hemispheres.

  19. Mesoscale resolution capability of altimetry: Present and future

    NASA Astrophysics Data System (ADS)

    Dufau, Claire; Orsztynowicz, Marion; Dibarboure, Gérald; Morrow, Rosemary; Le Traon, Pierre-Yves

    2016-07-01

    Wavenumber spectra of along-track Sea Surface Height from the most recent satellite radar altimetry missions (Jason-2, CryoSat-2, and SARAL/AltiKa) are used to determine the size of ocean dynamical features observable with the present altimetry constellation. A global analysis of the along-track 1-D mesoscale resolution capability of the present-day altimeter missions is proposed, based on a joint analysis of the spectral slopes in the mesoscale band and the error levels observed for horizontal wavelengths shorter than 20 km. The global sea level spectral slope distribution provided by Xu and Fu with Jason-1 data is revisited with more recent altimeter missions, and maps of altimeter error levels are provided and discussed for each mission. Seasonal variations of both spectral slopes and altimeter error levels are also analyzed for Jason-2. SARAL/AltiKa, with its lower error levels, is shown to detect smaller structures everywhere. All missions show substantial geographical and temporal variations in their mesoscale resolution capabilities, with variations depending mostly on the error level change but also on slight regional changes in the spectral slopes. In western boundary currents, where the signal-to-noise ratio is favorable, the along-track mesoscale resolution is approximately 40 km for SARAL/AltiKa, 45 km for CryoSat-2, and 50 km for Jason-2. Finally, a prediction of the future 2-D mesoscale sea level resolution capability of the Surface Water and Ocean Topography (SWOT) mission is given using a simulated error level.
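
    The spectral diagnostics described above reduce to a few standard steps; the Python sketch below computes an along-track wavenumber spectrum for a synthetic SSH series, fits a mesoscale spectral slope and estimates a noise floor from wavelengths shorter than 20 km. The sampling, band limits and synthetic signal are assumptions for illustration only.

      import numpy as np

      # Along-track wavenumber spectrum, mesoscale slope and noise-floor estimate
      # for a synthetic SSH series. Sampling, band limits and the signal are
      # illustrative assumptions only.
      dx_km = 7.0                               # ~1 Hz along-track spacing, km
      n = 512
      rng = np.random.default_rng(1)

      k = np.fft.rfftfreq(n, d=dx_km)           # wavenumber, cycles per km
      amp = np.zeros_like(k)
      amp[1:] = k[1:] ** (-11.0 / 6.0)          # red signal (~k^-11/3 in power)
      phase = rng.uniform(0.0, 2.0 * np.pi, k.size)
      ssh = np.fft.irfft(amp * np.exp(1j * phase), n) * 1e-3
      ssh += 0.01 * rng.normal(size=n)          # ~1 cm white measurement noise

      psd = (np.abs(np.fft.rfft(ssh)) ** 2) * dx_km / n   # periodogram

      band = (k > 1.0 / 250.0) & (k < 1.0 / 70.0)         # 70-250 km mesoscale band
      slope = np.polyfit(np.log(k[band]), np.log(psd[band]), 1)[0]
      noise_floor = psd[k > 1.0 / 20.0].mean()            # wavelengths < 20 km
      print(f"spectral slope ~ {slope:.2f}, noise floor ~ {noise_floor:.2e}")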

  20. Sea ice roughness: the key for predicting Arctic summer ice albedo

    NASA Astrophysics Data System (ADS)

    Landy, J.; Ehn, J. K.; Tsamados, M.; Stroeve, J.; Barber, D. G.

    2017-12-01

    Although melt ponds on Arctic sea ice evolve in stages, ice with smoother surface topography typically allows the pond water to spread over a wider area, reducing the ice-albedo and accelerating further melt. Building on this theory, we simulated the distribution of meltwater on a range of statistically-derived topographies to develop a quantitative relationship between premelt sea ice surface roughness and summer ice albedo. Our method, previously applied to ICESat observations of the end-of-winter sea ice roughness, could account for 85% of the variance in AVHRR observations of the summer ice-albedo [Landy et al., 2015]. Consequently, an Arctic-wide reduction in sea ice roughness over the ICESat operational period (from 2003 to 2008) explained a drop in ice-albedo that resulted in a 16% increase in solar heat input to the sea ice cover. Here we will review this work and present new research linking pre-melt sea ice surface roughness observations from Cryosat-2 to summer sea ice albedo over the past six years, examining the potential of winter roughness as a significant new source of sea ice predictability. We will further evaluate the possibility for high-resolution (kilometre-scale) forecasts of summer sea ice albedo from waveform-level Cryosat-2 roughness data in the landfast sea ice zone of the Canadian Arctic. Landy, J. C., J. K. Ehn, and D. G. Barber (2015), Albedo feedback enhanced by smoother Arctic sea ice, Geophys. Res. Lett., 42, 10,714-10,720, doi:10.1002/2015GL066712.

  1. A Preliminary Assessment of the S-3A SRAL Performances in SAR Mode

    NASA Astrophysics Data System (ADS)

    Dinardo, Salvatore; Scharroo, Remko; Bonekamp, Hans; Lucas, Bruno; Loddo, Carolina; Benveniste, Jerome

    2016-08-01

    The present work aims to assess and characterize the S3-A SRAL altimeter performance in closed-loop tracking mode and in open ocean conditions. We have processed the Sentinel-3 SAR data products from L0 to L2 using an adaptation of the ESRIN GPOD CryoSat-2 Processor SARvatore. During the Delay-Doppler processing, we have chosen to activate the range zero-padding option. The L2 altimetric geophysical parameters to be validated are the sea surface height above the ellipsoid (SSH), the sea level anomaly (SLA), the significant wave height (SWH) and the wind speed (U10), all estimated at 20 Hz. The orbit files are the POD MOE, while the geophysical corrections are extracted from the RADS database. In order to assess the accuracy of the wave and wind products, we have used ocean wave and wind speed model output (wind speed at 10 meters above the sea surface) from the ECMWF. We have made a first-order approximation of the sea state bias as -4.7% of the SWH. In order to assess the precision performance of SRAL SAR mode, we compute the level of instrumental noise (range, wave height and wind speed) for different sea state conditions.
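
    Two of the simpler steps above can be written down directly; the Python sketch below applies the first-order sea state bias approximation quoted in the abstract and estimates a 20 Hz range-noise level as the standard deviation within a one-second block. The numerical values are synthetic, not S3-A measurements.

      import numpy as np

      def first_order_ssb(swh):
          """Sea state bias approximated as -4.7% of the significant wave height (m)."""
          return -0.047 * np.asarray(swh, dtype=float)

      def range_noise_20hz(ranges_1s_block):
          """Instrumental range-noise proxy: std of the 20 ranges in a 1 s block."""
          return np.std(np.asarray(ranges_1s_block, dtype=float), ddof=1)

      print("SSB for 2 m SWH:", first_order_ssb(2.0), "m")          # -0.094 m

      rng = np.random.default_rng(2)
      block = 800000.0 + 0.015 * rng.normal(size=20)                # 20 Hz ranges, ~1.5 cm noise
      print("20 Hz range noise:", round(range_noise_20hz(block), 4), "m")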

  2. Evaluation of the Sentinel-3 Hydrologic Altimetry Processor prototypE (SHAPE) methods.

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Garcia-Mondéjar, A.; Bercher, N.; Fabry, P. L.; Roca, M.; Varona, E.; Fernandes, J.; Lazaro, C.; Vieira, T.; David, G.; Restano, M.; Ambrózio, A.

    2017-12-01

    Inland water scenes are highly variable, both in space and time, which leads to a much broader range of radar signatures than ocean surfaces. This applies to both LRM and "SAR" mode (SARM) altimetry. Nevertheless, the enhanced along-track resolution of SARM altimeters should help improve the accuracy and precision of inland water height measurements from satellite. The SHAPE project - Sentinel-3 Hydrologic Altimetry Processor prototypE - which is funded by ESA through the Scientific Exploitation of Operational Missions Programme Element (contract number 4000115205/15/I-BG), aims at preparing for the exploitation of Sentinel-3 data over the inland water domain. The SHAPE processor implements all of the steps necessary to derive river and lake water levels and discharge from Delay-Doppler altimetry and to perform their validation against in situ data. The processor uses FBR CryoSat-2 and L1A Sentinel-3A data as input, as well as various ancillary data (proc. param., water masks, L2 corrections, etc.), to produce surface water levels. At a later stage, water level data are assimilated into hydrological models to derive river discharge. This poster presents the improvements obtained with the new methods and algorithms over the regions of interest (Amazon and Danube rivers, Vanern and Titicaca lakes).

  3. Antarctic ice shelf thickness from CryoSat-2 radar altimetry

    NASA Astrophysics Data System (ADS)

    Chuter, Stephen; Bamber, Jonathan

    2016-04-01

    The Antarctic ice shelves provide buttressing to the inland grounded ice sheet, and therefore play a controlling role in regulating ice dynamics and mass imbalance. Accurate knowledge of ice shelf thickness is essential for input-output method mass balance calculations, sub-ice shelf ocean models and buttressing parameterisations in ice sheet models. Ice shelf thickness has previously been inferred from satellite altimetry elevation measurements using the assumption of hydrostatic equilibrium, as direct measurements of ice thickness do not provide the spatial coverage necessary for these applications. The sensor limitations of previous radar altimeters have led to poor data coverage and a lack of accuracy, particularly in the grounding zone, where a break in slope exists. We present a new ice shelf thickness dataset using four years (2011-2014) of CryoSat-2 elevation measurements, with its SARIn dual antennae mode of operation alleviating the issues affecting previous sensors. These improvements and the dense across-track spacing of the satellite have resulted in ˜92% coverage of the ice shelves, with substantial improvements, for example, of over 50% across the Venable and Totten Ice Shelves in comparison to the previous dataset. Significant improvements in coverage and accuracy are also seen south of 81.5° for the Ross and Filchner-Ronne Ice Shelves. Validation of the surface elevation measurements, used to derive ice thickness, against NASA ICESat laser altimetry data shows a mean bias of less than 1 m (equivalent to less than 9 m in ice thickness) and a fourfold decrease in standard deviation in comparison to the previous continental dataset. Importantly, the most substantial improvements are found in the grounding zone. Validation of the derived thickness data has been carried out using multiple Radio Echo Sounding (RES) campaigns across the continent. Over the Amery Ice Shelf, where extensive RES measurements exist, the mean difference between the datasets is 3.3% and 4.7% across the whole shelf and within 10 km of the grounding line, respectively. These represent a twofold to threefold improvement in accuracy when compared to the previous data product. The impact of these improvements on Input-Output estimates of mass balance is illustrated for the Abbot Ice Shelf. Our new product shows a mean reduction of 29% in thickness at the grounding line when compared to the previous dataset, as well as the elimination of non-physical 'data spikes' that were prevalent in the previous product in areas of complex terrain. The reduction in grounding line thickness equates to a change in mass balance for the area from -14±9 Gt yr-1 to -4±9 Gt yr-1. We show examples from other sectors including the Getz and George VI ice shelves. The updated estimate is more consistent with the positive surface elevation rate in this region obtained from satellite altimetry. The new thickness dataset will greatly reduce the uncertainty in Input-Output estimates of mass balance for the ˜30% of the grounding line of Antarctica where direct ice thickness measurements do not exist.
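
    The hydrostatic conversion underlying such a thickness product can be summarised in a few lines; the Python sketch below turns a surface elevation into an ice thickness using nominal densities and a nominal firn-air correction. These values are generic assumptions, not the ones used to build the dataset described here.

      RHO_WATER = 1028.0     # sea water density, kg m-3 (nominal)
      RHO_ICE = 917.0        # meteoric ice density, kg m-3 (nominal)

      def shelf_thickness(elevation_m, firn_air_m=15.0):
          """Ice thickness (m) from surface elevation above mean sea level (m),
          after removing an equivalent firn-air thickness, assuming hydrostatic
          equilibrium. All values are generic, not those of the dataset above."""
          freeboard = elevation_m - firn_air_m
          return freeboard * RHO_WATER / (RHO_WATER - RHO_ICE)

      print(shelf_thickness(50.0))   # ~324 m for a 50 m surface elevation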

  4. A Factor of 2-4 Improvement in Marine Gravity and Predicted Bathymetry from CryoSat, Jason-1, and Envisat Radar Altimetry: Arctic and Coastal Regions

    DTIC Science & Technology

    2013-09-30

    dsandwell@ucsd.edu Award Number: N00014-12-1-0111 http://topex.ucsd.edu LONG-TERM GOALS • Improve our understanding of the ocean basins for...scientific research and Naval operations. OBJECTIVES • Improve global marine gravity maps by a factor of 2 in deep ocean areas and a factor of 4 in...arcsecond bathymetry model (SRTM30_PLUS). • Prepare the next generation of scientists for ocean research. APPROACH 1. Modify waveform

  5. A Factor of 2-4 Improvement in Marine Gravity and Predicted Bathymetry from CryoSat, Jason-1, and Envisat Radar Altimetry: Arctic and Coastal Regions

    DTIC Science & Technology

    2012-09-30

    ucsd.edu Award Number: N00014-12-1-0111 http://topex.ucsd.edu LONG-TERM GOALS • Improve our understanding of the ocean basins for...scientific research and Naval operations. OBJECTIVES • Improve global marine gravity maps by a factor of 2 in deep ocean areas and a factor of 4 in the...arcsecond bathymetry model (SRTM30_PLUS). • Prepare the next generation of scientists for ocean research. APPROACH 1. Modify waveform retracking

  6. GPD+ wet tropospheric corrections for eight altimetric missions for the Sea Level ECV generation

    NASA Astrophysics Data System (ADS)

    Fernandes, Joana; Lázaro, Clara; Benveniste, Jérôme

    2016-04-01

    Due to its large spatio-temporal variability, the delay induced by the water vapour and liquid water content of the atmosphere in the altimeter signal, or wet tropospheric correction (WTC), is still one of the largest sources of uncertainty in satellite altimetry. In the scope of the Sea Level (SL) Climate Change Initiative (cci) project, the University of Porto (UPorto) has been developing methods to improve the WTC (Fernandes et al., 2015). Started as a coastal algorithm to remove land effects in the microwave radiometers (MWR) on board altimeter missions, the GNSS-derived Path Delay (GPD) methodology evolved to cover the open ocean, including high latitudes, correcting for invalid observations due to land, ice and rain contamination, and instrument malfunction. The most recent version of the algorithm, GPD Plus (GPD+), computes wet path delays based on: i) WTC from the on-board MWR measurements, whenever they exist and are valid; ii) new WTC values estimated through space-time objective analysis of all available data sources, whenever the former are considered invalid. In the estimation of the new WTC values, the following data sets are used: valid measurements from the on-board MWR, water vapour products derived from a set of 17 scanning imaging radiometers (SI-MWR) on board various remote sensing satellites and tropospheric delays derived from Global Navigation Satellite Systems (GNSS) coastal and island stations. In the estimation process, WTC derived from an atmospheric model such as the European Centre for Medium-range Weather Forecasts (ECMWF) ReAnalysis (ERA) Interim or the operational (Op) model are used as first guess, which is the adopted value in the absence of measurements. The corrections are provided for all missions used to generate the SL Essential Climate Variable (ECV): TOPEX/Poseidon (T/P), Jason-1, Jason-2, ERS-1, ERS-2, Envisat, CryoSat-2 and SARAL/AltiKa. To ensure consistency and long-term stability of the WTC datasets, the radiometers used in the GPD+ estimations have been inter-calibrated against the stable and independently-calibrated Special Sensor Microwave Imager (SSM/I) and SSM/I Sounder (SSM/IS) sensors on board the Defense Meteorological Satellite Program satellite series (F10, F11, F13, F14, F16 and F17). The new products reduce the sea level anomaly variance, both along-track and at crossovers, with respect to previous non-calibrated versions and to other WTC data sets such as the AVISO Composite (Comp) correction and atmospheric models. Improvements are particularly significant for T/P and all ESA missions, especially in the coastal regions and at high latitudes. In comparison with previous GPD versions, the main impacts are on the sea level trends at decadal time scales and on regional sea level trends. For CryoSat-2, the GPD+ WTC improves the SL ECV when compared to the baseline correction from the ECMWF Op model. With a view to obtaining the best WTC for use in version 2 of the SL_cci ECV, new products are under development, based on recently released on-board MWR WTC for missions such as Jason-1, Envisat and SARAL. Fernandes, M.J., Lázaro, C., Ablain, M., Pires, N. (2015). Improved wet path delays for all ESA and reference altimetric missions. Remote Sensing of Environment, 169, 50-74, ISSN 0034-4257, http://dx.doi.org/10.1016/j.rse.2015.07.023
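
    The estimation step for invalid-MWR cases can be illustrated with a one-dimensional objective analysis: a first-guess WTC profile is updated with scattered valid observations using assumed Gaussian covariances. The Python sketch below is generic; the correlation scale, error levels and toy data are illustrative, and the operational GPD+ scheme works in space and time with several data sources.

      import numpy as np

      # One-dimensional objective analysis: update a first-guess WTC profile with
      # scattered valid observations using Gaussian covariances. Correlation
      # scale, error levels and the toy data are illustrative assumptions.
      def objective_analysis(x_obs, y_obs, x_grid, background,
                             corr_len=100.0, sigma_obs=0.01, sigma_bkg=0.02):
          d_oo = np.abs(x_obs[:, None] - x_obs[None, :])
          d_go = np.abs(x_grid[:, None] - x_obs[None, :])
          C_oo = sigma_bkg**2 * np.exp(-(d_oo / corr_len) ** 2)
          C_go = sigma_bkg**2 * np.exp(-(d_go / corr_len) ** 2)
          R = sigma_obs**2 * np.eye(x_obs.size)
          innovation = y_obs - np.interp(x_obs, x_grid, background)
          weights = np.linalg.solve(C_oo + R, innovation)
          return background + C_go @ weights

      x_grid = np.linspace(0.0, 500.0, 51)              # along-track distance, km
      background = np.full_like(x_grid, -0.20)          # first-guess WTC, m
      x_obs = np.array([60.0, 180.0, 330.0])            # valid MWR/GNSS samples, km
      y_obs = np.array([-0.17, -0.22, -0.25])           # observed WTC, m
      print(objective_analysis(x_obs, y_obs, x_grid, background)[::10])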

  7. Feasibility of synthetic aperture altimeter data in ice charting

    NASA Astrophysics Data System (ADS)

    Rinne, Eero; Kangas, Antti

    We demonstrate the possibility of utilising synthetic aperture altimeter data in operational ice charting. Different waveform parameters from CryoSat-2 SIRAL measurements are compared to AARI ice charts over the Barents and Kara seas. It is shown that polygons of different ice types are distinguishable in the altimeter data. The most important sea ice application of satellite altimeters today is measuring the thickness of Arctic winter sea ice. However, the use of altimeters to support ice mapping was suggested more than 30 years ago. Due to the advent of imaging instruments more suitable for ice charting, most notably SAR, altimeters have remained tools for sea ice science. They are, however, used operationally to determine sea surface height anomaly and significant wave height. Our input data is the SAR mode Level 1B data of CryoSat-2. We only consider the waveform data and calculate simple parameters describing the shape of the waveform, such as the pulse peakiness and the backscatter coefficient sigma_0. We compare these to the ice stages of development given in the ice chart. As expected, the ice edge is clearly visible in the altimeter data. More promising for operational use, areas of old ice can be distinguished from areas of young ice and nilas. Altimeters provide an independent source of sea ice information to complement SAR and passive microwave data. Albeit of low resolution, altimeter data may prove valuable at times and locations where other data sources are unavailable. SAR data is frequently available for our study area, but our methods are applicable to areas where SAR data is scarce, such as the southern ice-covered seas. Furthermore, our results are directly applicable to the future Sentinel-3 altimeter data.
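
    As an illustration of the waveform parameters mentioned above, the Python sketch below computes pulse peakiness (here defined simply as the waveform maximum over its mean) for a diffuse and a specular synthetic waveform and applies a toy threshold classification. The definition variant and the thresholds are illustrative, not the operational values used for ice charting.

      import numpy as np

      # Pulse peakiness (waveform maximum over mean) for a diffuse and a
      # specular synthetic waveform, with a toy threshold classification.
      # Definition variant and thresholds are illustrative, not operational values.
      def pulse_peakiness(waveform):
          wf = np.asarray(waveform, dtype=float)
          return wf.max() / wf.mean()

      def classify(pp, lead_threshold=30.0, ice_threshold=10.0):
          if pp > lead_threshold:
              return "lead / specular"
          if pp > ice_threshold:
              return "sea ice"
          return "open water / diffuse"

      diffuse = np.concatenate([np.zeros(30), np.linspace(0.0, 1.0, 20), np.full(78, 0.8)])
      specular = np.zeros(128)
      specular[60:63] = [0.2, 1.0, 0.2]
      for name, wf in [("diffuse", diffuse), ("specular", specular)]:
          pp = pulse_peakiness(wf)
          print(name, round(pp, 1), classify(pp))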

  8. Sea surface height and dynamic topography of the ice-covered oceans from CryoSat-2: 2011-2014

    NASA Astrophysics Data System (ADS)

    Kwok, Ron; Morison, James

    2016-01-01

    We examine 4 years (2011-2014) of sea surface heights (SSH) from CryoSat-2 (CS-2) over the ice-covered Arctic and Southern Oceans. Results are from a procedure that identifies and determines the heights of sea surface returns. Along 25 km segments of satellite ground tracks, variability in the retrieved SSHs is between ˜2 and 3 cm (standard deviation) in the Arctic and is slightly higher (˜3 cm) in the summer and the Southern Ocean. Average sea surface tilts (along these 25 km segments) are 0.01 ± 3.8 cm/10 km in the Arctic, and slightly lower (0.01 ± 2.0 cm/10 km) in the Southern Ocean. Intra-seasonal variability of CS-2 dynamic ocean topography (DOT) in the ice-covered Arctic is nearly twice as high as that of the Southern Ocean. In the Arctic, we find a correlation of 0.92 between 3 years of DOT and dynamic heights (DH) from hydrographic stations. Further, correlation of 4 years of area-averaged CS-2 DOT near the North Pole with time-variable ocean-bottom pressure from a pressure gauge and from GRACE, yields coefficients of 0.83 and 0.77, with corresponding differences of <3 cm (RMS). These comparisons contrast the length scale of baroclinic and barotropic features and reveal the smaller amplitude barotropic signals in the Arctic Ocean. Broadly, the mean DOT from CS-2 for both poles compares well with those from the ICESat campaigns and the DOT2008A and DTU13MDT fields. Short length scale topographic variations, due to oceanographic signals and geoid residuals, are especially prominent in the Arctic Basin but less so in the Southern Ocean.

  9. A Decade of Arctic Sea Ice Thickness Change from Airborne and Satellite Altimetry (Invited)

    NASA Astrophysics Data System (ADS)

    Farrell, S. L.; Richter-Menge, J.; Kurtz, N. T.; McAdoo, D. C.; Newman, T.; Zwally, H.; Ruth, J.

    2013-12-01

    Altimeters on both airborne and satellite platforms provide direct measurements of sea ice freeboard from which sea ice thickness may be calculated. Satellite altimetry observations of Arctic sea ice from ICESat and CryoSat-2 indicate a significant decline in ice thickness, and volume, over the last decade. During this time the ice pack has experienced a rapid change in its composition, transitioning from predominantly thick, multi-year ice to thinner, increasingly seasonal ice. We will discuss the regional trends in ice thickness derived from ICESat and IceBridge altimetry between 2003 and 2013, contrasting observations of the multi-year ice pack with seasonal ice zones. ICESat ceased operation in 2009, and the final, reprocessed data set became available recently. We extend our analysis to April 2013 using data from the IceBridge airborne mission, which commenced operations in 2009. We describe our current efforts to more accurately convert from freeboard to ice thickness, with a modified methodology that corrects for range errors, instrument biases, and includes an enhanced treatment of snow depth, with respect to ice type. With the planned launch by NASA of ICESat-2 in 2016 we can expect continuity of the sea ice thickness time series through the end of this decade. Data from the ICESat-2 mission, together with ongoing observations from CryoSat-2, will allow us to understand both the decadal trends and inter-annual variability in the Arctic sea ice thickness record. We briefly present the status of planned ICESat-2 sea ice data products, and demonstrate the utility of micro-pulse, photon-counting laser altimetry over sea ice.

  10. Comparing coastal ocean wavenumber spectra for surface currents and sea level from observations by HF-radar (CODAR) and CryoSat-2 satellite altimetry

    NASA Astrophysics Data System (ADS)

    Wilkin, J.; Hunter, E. J.

    2016-12-01

    An extensive CODAR HF-radar network has been acquiring observations of surface currents in the Mid Atlantic Bight (MAB) continental shelf ocean for several years. The fundamental CODAR observation is the component of velocity in the radial direction of view from a single antenna, geo-located by range and azimuth. Surface velocity vectors can be computed by combining radials observed by multiple sites. We exploit the concave geometry of the MAB coastline and the many possible radial views from numerous antennae to select transects that are substantially along or across isobaths, and compute wavenumber spectra for both along-shelf and across-shelf components of velocity. Comparing spectra computed from radial velocities to spectra for the same vector component extracted from the total vectors we find that the optimal interpolation combiner significantly damps energy for wavenumbers exceeding 0.03 km^(-1). This has ramifications for our error model in 4DVAR assimilation of CODAR total velocity. We further computed wavenumber spectra for altimeter SSHA from CryoSat-2 for ensembles of tracks in the same region of the MAB that were predominantly across- or along-shelf. Velocity spectra exhibit power law dependence close to k^(-5/3) down to the limit of resolution, while SSHA spectra are somewhat steeper. The constraint that bathymetry exerts on circulation on this broad, shallow shelf could influence the spectral characteristics of variability, as could winter well-mixed versus summer strongly stratified conditions. Velocity and SSHA spectra are being compared to similar spectral estimates from model simulations as an assessment of convergence of the model resolution, and to explore theories of surface quasi-geostrophic turbulence that might explain the observed spectral characteristics.

  11. Determining Coastal Mean Dynamic Topography by Geodetic Methods

    NASA Astrophysics Data System (ADS)

    Huang, Jianliang

    2017-11-01

    In geodesy, coastal mean dynamic topography (MDT) was traditionally determined by the spirit leveling technique. Advances in navigation satellite positioning (e.g., GPS) and geoid determination enable space-based leveling with an accuracy of about 3 cm at tide gauges. The recent CryoSat-2 satellite altimetry mission, with synthetic aperture radar (SAR) and SAR interferometric measurements, extends space-based leveling to the coastal ocean with the same accuracy. However, barriers remain in applying the two space-based geodetic methods for MDT determination over the coastal ocean because current geoid modeling focuses primarily on land, as a substitute for spirit leveling to realize the vertical datum.
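
    For orientation, the two levelling routes implied above can be written as simple height differences; the symbols are generic assumptions (h_GNSS the GNSS-derived ellipsoidal height of local mean sea level at the tide gauge, h_alt the altimetric mean sea surface height, N the geoid height), not the notation of the paper:

      \[
        \mathrm{MDT}_{\mathrm{tide\ gauge}} = h_{\mathrm{GNSS}} - N,
        \qquad
        \mathrm{MDT}_{\mathrm{altimetry}} = h_{\mathrm{alt}} - N .
      \]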

  12. Measuring sea surface height with a GNSS-Wave Glider

    NASA Astrophysics Data System (ADS)

    Morales Maqueda, Miguel Angel; Penna, Nigel T.; Foden, Peter R.; Martin, Ian; Cipollini, Paolo; Williams, Simon D.; Pugh, Jeff P.

    2017-04-01

    A GNSS-Wave Glider is a novel technique to measure sea surface height autonomously using the Global Navigation Satellite System (GNSS). It consists of an unmanned surface vehicle manufactured by Liquid Robotics, a Wave Glider, and a geodetic-grade GNSS antenna-receiver system, with the antenna installed on a mast on the vehicle's deck. The Wave Glider uses the differential wave motion through the water column for propulsion, thus guaranteeing, in principle, indefinite autonomy. Solar energy is collected to power all on-board instrumentation, including the GNSS system. The GNSS-Wave Glider was first tested in Loch Ness in 2013, demonstrating that the technology is capable of mapping geoid heights within the loch with an accuracy of a few centimetres. The trial in Loch Ness did not conclusively confirm the reliability of the technique because, during the tests, the state of the water surface was much more benign than would normally be expected in the open ocean. We now report on a first deployment of a GNSS-Wave Glider in the North Sea. The deployment took place in August 2016 and lasted thirteen days, during which the vehicle covered a distance of about 350 nautical miles in the north-western North Sea off Great Britain. During the experiment, the GNSS-Wave Glider experienced sea states between 1 (0-0.1 m wave heights) and 5 (2.5-4 m wave heights). The GNSS-Wave Glider data, recorded at 5 Hz frequency, were analysed using a post-processed kinematic GPS-GLONASS precise point positioning (PPP) approach, which was quality controlled using double-difference GPS kinematic processing with respect to onshore reference stations. Filtered with a 900 s moving-average window, the PPP heights reveal geoid patterns in the survey area that are very similar to the EGM2008 geoid model, thus demonstrating the potential use of a GNSS-Wave Glider for marine geoid determination. The residual of subtracting the modelled or measured marine geoid from the PPP signal combines information about dynamic topography and sea state. GNSS-Wave Glider data will next be validated against concurrent and co-located satellite altimetry data from the Jason-1, Jason-2, CryoSat-2 and AltiKa missions.
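
    The 900 s moving-average filtering mentioned above is straightforward; the Python sketch below smooths a synthetic 5 Hz height series (a slow geoid-like signal plus wave motion) with a 4500-sample window. The synthetic series is illustrative only.

      import numpy as np

      # Smooth a synthetic 5 Hz height series (slow geoid-like signal plus wave
      # motion) with a 900 s moving-average window (4500 samples). Illustrative only.
      fs = 5.0                                    # sampling rate, Hz
      window = int(900 * fs)                      # 900 s -> 4500 samples

      def moving_average(x, n):
          return np.convolve(x, np.ones(n) / n, mode="same")

      t = np.arange(0.0, 3 * 3600.0, 1.0 / fs)                     # 3 h of data
      geoid_like = 0.5 * np.sin(2.0 * np.pi * t / 10800.0)         # slow, metre scale
      waves = 1.0 * np.sin(2.0 * np.pi * t / 8.0)                  # 8 s swell
      heights = 45.0 + geoid_like + waves

      smoothed = moving_average(heights, window)
      print(heights[len(heights) // 2], smoothed[len(smoothed) // 2])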

  13. About uncertainties in sea ice thickness retrieval from satellite radar altimetry: results from the ESA-CCI Sea Ice ECV Project Round Robin Exercise

    NASA Astrophysics Data System (ADS)

    Kern, S.; Khvorostovsky, K.; Skourup, H.; Rinne, E.; Parsakhoo, Z. S.; Djepa, V.; Wadhams, P.; Sandven, S.

    2014-03-01

    One goal of the European Space Agency Climate Change Initiative sea ice Essential Climate Variable project is to provide a quality-controlled, 20-year-long data set of the Arctic Ocean winter-time sea ice thickness distribution. An important step to achieve this goal is to assess the accuracy of sea ice thickness retrieval based on satellite radar altimetry. For this purpose, a database is created comprising sea ice freeboard derived from satellite radar altimetry between 1993 and 2012 and collocated observations of snow and sea ice freeboard from Operation IceBridge (OIB) and CryoSat Validation Experiment (CryoVEx) airborne campaigns, of sea ice draft from moored and submarine Upward Looking Sonar (ULS), and of snow depth from OIB campaigns, the Advanced Microwave Scanning Radiometer aboard EOS (AMSR-E) and the Warren climatology (Warren et al., 1999). An inter-comparison of the snow depth data sets stresses the limited usefulness of the Warren climatology snow depth for freeboard-to-thickness conversion under current Arctic Ocean conditions, as reported in other studies. This is confirmed by a comparison of snow freeboard measured during OIB and CryoVEx and snow freeboard computed from radar altimetry. For first-year ice, the agreement between OIB and AMSR-E snow depth to within 0.02 m suggests AMSR-E snow depth as an appropriate alternative. Different freeboard-to-thickness and freeboard-to-draft conversion approaches are realized. The mean observed ULS sea ice draft agrees with the mean sea ice draft computed from radar altimetry within the uncertainty bounds of the data sets involved. However, none of the realized approaches is able to reproduce the seasonal cycle in sea ice draft observed by moored ULS satisfactorily. A sensitivity analysis of the freeboard-to-thickness conversion suggests that, in order to obtain sea ice thickness accurate to 0.5 m from radar altimetry, an ice-type-dependent sea ice density and a snow depth with centimetre accuracy are as essential as a freeboard estimate with centimetre accuracy.
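
    For reference, the freeboard-to-thickness conversion whose sensitivity is analysed above can be sketched as below, assuming the radar freeboard equals the ice freeboard (i.e. the main scattering horizon is the snow-ice interface). The densities and snow depth are illustrative values, not those of the study.

      RHO_WATER = 1024.0     # kg m-3
      RHO_SNOW = 320.0       # kg m-3
      RHO_ICE_FYI = 917.0    # first-year ice
      RHO_ICE_MYI = 882.0    # multi-year ice

      def thickness_from_radar_freeboard(freeboard_m, snow_depth_m, rho_ice):
          """Hydrostatic conversion: T = (rho_w*F + rho_s*S) / (rho_w - rho_i),
          assuming the radar freeboard F is the ice freeboard."""
          return (RHO_WATER * freeboard_m + RHO_SNOW * snow_depth_m) / (RHO_WATER - rho_ice)

      # 0.15 m radar freeboard and 0.20 m snow: the ice-type choice alone matters
      print(thickness_from_radar_freeboard(0.15, 0.20, RHO_ICE_FYI))   # ~2.0 m
      print(thickness_from_radar_freeboard(0.15, 0.20, RHO_ICE_MYI))   # ~1.5 m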

  14. EM Bias-Correction for Ice Thickness and Surface Roughness Retrievals over Rough Deformed Sea Ice

    NASA Astrophysics Data System (ADS)

    Li, L.; Gaiser, P. W.; Allard, R.; Posey, P. G.; Hebert, D. A.; Richter-Menge, J.; Polashenski, C. M.

    2016-12-01

    Very rough, ridged sea ice accounts for a significant percentage of the total ice area and an even larger percentage of the total volume. The commonly used radar altimeter surface detection techniques are empirical in nature and work well only over level/smooth sea ice. Rough sea ice surfaces can modify the return waveforms, resulting in significant electromagnetic (EM) bias in the estimated surface elevations, and thus large errors in the ice thickness retrievals. To understand and quantify such sea ice surface roughness effects, a combined EM rough surface and volume scattering model was developed to simulate radar returns from the rough sea ice `layer cake' structure. A waveform matching technique was also developed to fit observed waveforms to a physically-based waveform model and subsequently correct the roughness-induced EM bias in the estimated freeboard. This new EM Bias Corrected (EMBC) algorithm was able to better retrieve surface elevations and estimate the surface roughness parameter simultaneously. In situ data from multi-instrument airborne and ground campaigns were used to validate the ice thickness and surface roughness retrievals. For the surface roughness retrievals, we applied this EMBC algorithm to coincident LiDAR/radar measurements collected during a CryoSat-2 underflight by the NASA IceBridge mission. Results show that not only does the waveform model fit very well to the measured radar waveform, but also the roughness parameters derived independently from the LiDAR and radar data agree very well for both level and deformed sea ice. For sea ice thickness retrievals, validation based on in-situ data from the coordinated CRREL/NRL field campaign demonstrates that the physically-based EMBC algorithm performs fundamentally better than the empirical algorithm over very rough deformed sea ice, suggesting that sea ice surface roughness effects can be modeled and corrected based solely on the radar return waveforms.
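
    The waveform-matching idea can be illustrated with a toy parametric fit: the Python sketch below fits an error-function leading edge with an exponential trailing-edge decay to a noisy synthetic waveform and reads off the retracking gate and a width parameter that stands in for surface roughness. This toy model is not the combined surface/volume scattering model of the EMBC algorithm.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.special import erf

      # Toy waveform-matching: fit an error-function leading edge with an
      # exponential trailing-edge decay to a noisy synthetic waveform, and read
      # off the retracking gate (t0) and a width parameter standing in for
      # surface roughness. NOT the combined surface/volume model of EMBC.
      def toy_waveform(gate, t0, width, amp, decay):
          leading = 0.5 * (1.0 + erf((gate - t0) / (np.sqrt(2.0) * width)))
          trailing = np.exp(-np.clip(gate - t0, 0.0, None) * decay)
          return amp * leading * trailing

      gates = np.arange(128, dtype=float)
      rng = np.random.default_rng(3)
      observed = toy_waveform(gates, t0=52.0, width=3.5, amp=1.0, decay=0.01)
      observed = observed + 0.03 * rng.normal(size=gates.size)

      popt, _ = curve_fit(toy_waveform, gates, observed, p0=[55.0, 3.0, 1.0, 0.02])
      print("retracking gate ~ %.1f, width (roughness proxy) ~ %.2f" % (popt[0], popt[1]))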

  15. Mapping Ross Ice Shelf with ROSETTA-Ice airborne laser altimetry

    NASA Astrophysics Data System (ADS)

    Becker, M. K.; Fricker, H. A.; Padman, L.; Bell, R. E.; Siegfried, M. R.; Dieck, C. C. M.

    2017-12-01

    The Ross Ocean and ice Shelf Environment and Tectonic setting Through Aerogeophysical surveys and modeling (ROSETTA-Ice) project combines airborne glaciological, geological, and oceanographic observations to enhance our understanding of the history and dynamics of the large (~500,000 square km) Ross Ice Shelf (RIS). Here, we focus on the Light Detection And Ranging (LiDAR) data collected in 2015 and 2016. This data set represents a significant advance in resolution: whereas the last attempt to systematically map RIS (the surface-based RIGGS program in the 1970s) was at 55 km grid spacing, the ROSETTA-Ice grid has 10-20 km line spacing and much higher along-track resolution. We discuss two different strategies for processing the raw LiDAR data: one that requires proprietary software (Riegl's RiPROCESS package), and one that employs open-source programs and libraries. With the processed elevation data, we are able to resolve fine-scale ice-shelf features such as the "rampart-moat" ice-front morphology, which has previously been observed on and modeled for icebergs. This feature is also visible in the ROSETTA-Ice shallow-ice radar data; comparing the laser data with radargrams provides insight into the processes leading to its formation. Near-surface firn state and total firn air content can also be investigated through combined analysis of laser altimetry and radar data. By performing similar analyses with data from the radar altimeter aboard CryoSat-2, we demonstrate the utility of the ROSETTA-Ice LiDAR data set in satellite validation efforts. The incorporation of the LiDAR data from the third and final field season (December 2017) will allow us to construct a DEM and an ice thickness map of RIS for the austral summers of 2015-2017. These products will be used to validate and extend observations of height changes from satellite radar and laser altimetry, as well as to update regional models of ocean circulation and ice dynamics.

  16. Changes of Arctic Marine Glaciers and Ice Caps from CryoSat Swath Altimetry

    NASA Astrophysics Data System (ADS)

    Tepes, P.; Gourmelen, N.; Weissgerber, F.; Escorihuela, M. J.; Wuite, J.; Nagler, T.; Foresta, L.; Brockley, D.; Baker, S.; Roca, M.; Shepherd, A.; Plummer, S.

    2017-12-01

    Glaciers and ice caps (GICs) are major contributors to the current budget of global mean sea level change. Ice losses from GICs are expected to increase over the next century and beyond (Gardner et al., 2011), particularly in the Arctic, where mean annual surface temperatures have recently been increasing twice as fast as the global average (Screen and Simmonds, 2010). Investigating cryospheric changes over GICs from space-based observations has proven to be challenging due in large part to the limited spatial and temporal resolution of present-day observation techniques compared to the relatively small size and the steep and complex terrain that often define GICs. As a result, not much is known about modern changes in ice mass in most of these smaller glaciated regions of the Arctic (Moholdt et al., 2012; Carr et al., 2014). Radar altimetry is well suited to monitoring elevation changes over land ice due to its all-weather, year-round capability of observing ice surfaces. Since 2010, the Synthetic Interferometric Radar Altimeter (SIRAL) on board the European Space Agency (ESA) radar altimetry CryoSat (CS) mission has been collecting ice elevation measurements over GICs. Data from the CS-SARIn mode have been used to infer high resolution elevation and elevation change rates using "swath processing" (Hawley et al., 2009; Gray et al., 2013; Christie et al., 2016; Foresta et al., 2016; Smith et al., 2016). Together with the denser ground-track spacing of the CS mission, swath processing provides measurements at unprecedented spatial coverage and resolution, enabling the study of key processes that underlie current changes of GICs in the Arctic. In this study, we use CS swath observations to identify patterns of change of marine- versus land-terminating glaciers across the Arctic. We generate maps of ice elevation change rates and present estimates of volumetric changes for GICs outside of Greenland. We then compare marine- versus land-terminating glaciers in terms of their relative contribution to changes in sea level since 2010.
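
    The elevation-change step reduces, per grid cell, to a rate fit over time; the Python sketch below estimates dh/dt by least squares from synthetic repeated swath elevations in one cell. Times, elevations and noise level are invented for illustration.

      import numpy as np

      # Elevation change rate (dh/dt) for one grid cell: a least-squares linear
      # fit to repeated swath-elevation samples. Times, elevations and noise
      # are synthetic and purely illustrative.
      rng = np.random.default_rng(4)
      t_years = np.sort(rng.uniform(2010.5, 2017.5, size=60))      # acquisition times
      true_rate = -0.8                                             # m per year (thinning)
      elev = 1200.0 + true_rate * (t_years - 2010.5) + 0.5 * rng.normal(size=60)

      rate, intercept = np.polyfit(t_years, elev, 1)               # slope = dh/dt
      print(f"estimated dh/dt ~ {rate:.2f} m/yr")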

  17. Sea Ice Mass Reconciliation Exercise (SIMRE) for altimetry derived sea ice thickness data sets

    NASA Astrophysics Data System (ADS)

    Hendricks, S.; Haas, C.; Tsamados, M.; Kwok, R.; Kurtz, N. T.; Rinne, E. J.; Uotila, P.; Stroeve, J.

    2017-12-01

    Satellite altimetry is the primary remote sensing data source for retrieval of Arctic sea-ice thickness. Observational data sets are available from current and previous missions, namely ESA's Envisat and CryoSat as well as NASA's ICESat. In addition, freeboard results have been published from the earlier ESA ERS missions, and candidates for new data products are the Sentinel-3 constellation, the CNES AltiKa mission and NASA's laser altimeter successor, ICESat-2. Given the different sensor types and orbit configurations, each mission has unique properties. In addition, thickness retrieval algorithms have evolved over time and data centers have developed different strategies. These strategies may vary in choice of auxiliary data sets, algorithm parts and product resolution and masking. The Sea Ice Mass Reconciliation Exercise (SIMRE) is a project by the sea-ice radar altimetry community to bridge the challenges of comparing data sets across missions and algorithms. The ESA Arctic+ research program facilitates this project with the objective of collecting existing data sets and deriving a reconciled estimate of Arctic sea ice mass balance. Starting with CryoSat-2 products, we compare results from different data centers (UCL, AWI, NASA JPL & NASA GSFC) at full resolution along selected orbits with independent ice thickness estimates. Three regions representative of first-year ice, multiyear ice and mixed ice conditions are used to compare the difference in thickness and thickness change between products over the seasonal cycle. We present first results and provide an outline for the further development of SIMRE activities. The methodology for comparing data sets is designed to be extendible and the project is open to contributions by interested groups. Model results of sea ice thickness will be added in a later phase of the project to extend the scope of SIMRE beyond EO products.

  18. The International DORIS Service contribution to the 2014 realization of the International Terrestrial Reference Frame

    NASA Astrophysics Data System (ADS)

    Moreaux, Guilhem; Lemoine, Frank G.; Capdeville, Hugues; Kuzin, Sergey; Otten, Michiel; Štěpánek, Petr; Willis, Pascal; Ferrage, Pascale

    2016-12-01

    In preparation for the 2014 realization of the International Terrestrial Reference Frame (ITRF2014), the International DORIS Service delivered to the International Earth Rotation and Reference Systems Service a set of 1140 weekly solution files including station coordinates and Earth orientation parameters, covering the time period from 1993.0 to 2015.0. The data come from eleven DORIS satellites: TOPEX/Poseidon, SPOT2, SPOT3, SPOT4, SPOT5, Envisat, Jason-1, Jason-2, Cryosat-2, Saral and HY-2A. In their processing, the six analysis centers that contributed to the DORIS combined solution used the latest time-variable gravity models and estimated DORIS ground beacon frequency variations. Furthermore, all but one of the analysis centers included phase center variations for ground antennas in their processing. The main objective of this study is to present the combination process and to analyze the impact of the new modeling on the performance of the new combined solution. Comparisons with the IDS contribution to ITRF2008 show that (i) the application of the DORIS ground phase center variations in the data processing shifts the combined scale upward by nearly 7-11 mm and (ii) thanks to the estimation of DORIS ground beacon frequency variations, the new combined solution no longer shows any scale discontinuity in early 2002 and does not present unexplained vertical discontinuities in any station position time series. However, analysis of the new series with respect to ITRF2008 exhibits a scale increase in late 2011 which is not yet explained. A new DORIS Terrestrial Reference Frame was computed to evaluate the intrinsic quality of the new combined solution. That evaluation shows that the addition of data from the new missions equipped with the latest generation of DORIS receiver (Jason-2, Cryosat-2, HY-2A, Saral) results in an internal position consistency of 10 mm or better after mid-2008.

  19. New Radar Altimeter Missions are Providing a Dramatically Sharper Image of Global Marine Tectonics

    NASA Astrophysics Data System (ADS)

    Sandwell, D. T.; Müller, D.; Garcia, E.; Matthews, K. J.; Smith, W. H. F.; Zaron, E.; Zhang, S.; Bassett, D.; Francis, R.

    2015-12-01

    Marine gravity, derived from satellite radar altimetry, is a powerful tool for mapping tectonic structures, especially in the deep ocean basins where the topography remains unmapped by ships or is buried by thick sediment. The ability to infer seafloor tectonics from space was first demonstrated in 1978 using Seasat altimeter data, but the spatial coverage was incomplete because of the short three-month lifetime of the satellite. Most ocean altimeters have repeat ground tracks with spacings of hundreds of kilometers, so they do not resolve tectonic structures. Adequate altimeter coverage became available in 1995 when the United States Navy declassified the Geosat radar altimeter data and the ERS-1 altimeter completed a 1-year mapping phase. These mid-1990s altimeter-derived images of the ocean basins remained static for 15 years because there were no new non-repeat altimeter missions. This situation changed dramatically in 2010 when CryoSat-2, with its advanced radar altimeter, was launched into a non-repeat orbit; it continues to collect data, perhaps until 2020. In addition, the Jason-1 altimeter was placed into a 14-month geodetic phase at the end of its lifetime. More recently, the 1.5-times-higher-precision measurements from the AltiKa altimeter aboard the SARAL spacecraft began to drift away from its 35-day repeat trackline. The Chinese HY-2 altimeter is scheduled to begin a dense mapping phase in early 2016. Moreover, in 2020 we may enjoy significantly higher resolution maps of the ocean basins from the planned SWOT altimeter mission with its advanced swath mapping ability. All of these new data will provide a much sharper image of the tectonics of the deep ocean basins and continental margins. During this talk we will tour the new tectonic structures revealed by CryoSat-2 and Jason-1 and speculate on the tectonic views of the ocean basins in 2020 and beyond.

  20. Accurate numerical forward model for optimal retracking of SIRAL2 SAR echoes over open ocean

    NASA Astrophysics Data System (ADS)

    Phalippou, L.; Demeestere, F.

    2011-12-01

    The SAR mode of SIRAL-2 on board Cryosat-2 has been designed to measure primarily sea ice and continental ice (Wingham et al. 2005). In 2005, K. Raney (KR, 2005) pointed out the improvements brought by SAR altimetry over the open ocean. KR's results were mostly based on 'rule of thumb' considerations of speckle noise reduction due to the higher PRF and to speckle decorrelation after SAR processing. In 2007, Phalippou and Enjolras (PE, 2007) provided the theoretical background for optimal retracking of SAR echoes over ocean with a focus on forward modelling of the power waveforms. The accuracies of geophysical parameters (range, significant wave height, and backscattering coefficient) retrieved from SAR altimeter data were derived accounting for accurate modelling of the SAR echo shape and speckle noise. The step forward to optimal retracking using a numerical forward model (NFM) was also pointed out. An NFM of the power waveform avoids analytical approximation, a guarantee of minimising geophysically dependent biases in the retrieval. NFMs have been used for many years, in operational meteorology in particular, for retrieving temperature and humidity profiles from IR and microwave radiometers, as the radiative transfer function is complex (Eyre, 1989). So far this technique has not been used in the field of conventional ocean altimetry, as analytical models (e.g. Brown's model) were found to give sufficient accuracy. However, although an NFM seems desirable even for conventional nadir altimetry, it becomes inevitable if one wishes to process SAR altimeter data, as the transfer function is too complex to be approximated by a simple analytical function. This was clearly demonstrated in PE 2007. The paper describes the background to SAR data retracking over open ocean. Since PE 2007, improvements have been made to the forward model, and it is shown that the altimeter on-ground and in-flight characterisation (e.g. antenna pattern, range impulse response, azimuth impulse response, altimeter transfer function) can be accurately accounted for in order to minimise systematic errors in the retrieval. The paper presents the retrieval of range and SWH for several Cryosat-2 orbit arcs spanning different sea state conditions. The retrieval results are found to be in excellent agreement with the noise expectations derived from the Cramer-Rao bounds (see PE 2007). The improvement upon the conventional Low Resolution Mode is about a factor of two in range. Improvements in SWH accuracy are also discussed. Comparisons with the MSL and conventional LRM-like retracking are also shown. Finally, the paper gives some insights for future oceanic altimetry missions. References: Wingham et al., 2005: CryoSat: A mission to determine the fluctuations in Earth's land and marine ice fields. Advances in Space Research 37 (2006) 841-871. Raney, R.K., 2005: Resolution and precision of a delay-Doppler radar altimeter. Proc. IEEE OCEANS 2005. Phalippou, L., V. Enjolras, 2007: Re-tracking of SAR altimeter ocean power waveforms and related accuracies of sea surface height, significant wave height and wind speed. Proc. IEEE IGARSS 2007. Eyre, J., 1989: Inversion of cloudy satellite radiances by nonlinear estimation: Theory and simulation for TOVS. Quarterly Journal of the Royal Meteorological Society, 115, pp. 1001-1026.
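
    For readers less familiar with forward-model retracking, the following is a minimal, hedged sketch of the idea in Python: a numerically evaluated waveform model is fitted to an echo by least squares. The Brown-like toy model, its parameters and the speckle noise are illustrative assumptions only; this is not the SIRAL-2 SAR forward model of Phalippou and Enjolras.

        # Toy sketch of forward-model retracking (NOT the SIRAL-2 SAR model):
        # fit a numerically evaluated waveform model to an echo by least squares.
        import numpy as np
        from scipy.optimize import least_squares
        from scipy.special import erf

        gates = np.arange(128, dtype=float)  # range-gate index

        def forward_model(params, t):
            """Amplitude x smoothed leading edge x exponential trailing-edge decay."""
            amp, epoch, width, decay = params
            leading = 0.5 * (1.0 + erf((t - epoch) / (np.sqrt(2.0) * width)))
            trailing = np.exp(-np.clip(t - epoch, 0.0, None) * decay)
            return amp * leading * trailing

        def retrack(waveform, t, p0=(1.0, 60.0, 3.0, 0.01)):
            """Estimate (amplitude, epoch, leading-edge width, decay) from one waveform."""
            return least_squares(lambda p: forward_model(p, t) - waveform, p0, method="lm").x

        # Usage: simulate a speckle-corrupted echo and retrieve its parameters.
        truth = (1.2, 64.0, 2.5, 0.02)
        rng = np.random.default_rng(0)
        echo = forward_model(truth, gates) * rng.gamma(4.0, 0.25, gates.size)  # multiplicative speckle, mean 1
        print(retrack(echo, gates))

    In an operational retracker the tabulated instrument characterisation (antenna pattern, impulse responses) would enter the forward model numerically, which is precisely what a simple analytical approximation cannot capture.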

  1. Estimation of Volume and Freshwater Flux from the Arctic Ocean using SMAP and NCEP CFSv2

    NASA Astrophysics Data System (ADS)

    Bulusu, S.

    2017-12-01

    Spatial and temporal monitoring of sea surface salinity (SSS) plays an important role globally and especially over the Arctic Ocean. Arctic ice melt has led to an influx of freshwater into the Arctic environment, a process that can be observed in SSS. The recently launched NASA Soil Moisture Active Passive (SMAP) mission is primarily designed for global monitoring of soil moisture using the L-band (1.4 GHz) frequency. SMAP also has the capability of measuring SSS and can thus extend the salinity data record of NASA's Aquarius mission (which ended June 7, 2015) with improved temporal/spatial sampling. In this research an attempt is made to investigate the retrievability of SSS over the Arctic from the SMAP satellite. The objectives of this study are to verify the use of SMAP sea surface salinity (and freshwater) variability in the Arctic Ocean and the extent to which freshwater, salt and volume fluxes from the Arctic Ocean can be estimated. Along with SMAP data we will use data from NASA's Ice, Cloud, and land Elevation Satellites (ICESat and ICESat-2), ESA's CryoSat-2, and NASA's Gravity Recovery and Climate Experiment (GRACE) satellites to estimate ice melt in the Arctic. The preliminary results from SMAP compared well with NCEP Climate Forecast System version 2 (CFSv2) salinity data in this region, capturing patterns fairly well over the Arctic.

  2. Changes in the Earth's largest surge glacier system from satellite and airborne altimetry and imagery

    NASA Astrophysics Data System (ADS)

    Trantow, T.; Herzfeld, U. C.

    2015-12-01

    The Bering-Bagley Glacier System (BBGS), Alaska, one of the largest ice systems outside of Greenland and Antarctica, has recently surged (2011-2013), providing a rare opportunity to study the surge phenomenon in a large and complex system. Understanding fast-flowing glaciers and accelerations in ice flow, of which surging is one type, is critical to understanding changes in the cryosphere and ultimately changes in sea level. It is important to distinguish between types of acceleration and their consequences, especially between reversible or quasi-cyclic and irreversible forms of glacial acceleration, but current ice-sheet models treat all accelerating ice identically. Additionally, the surge provides an exceptional opportunity to study the influence of surface roughness and water content on return signals of altimeter systems. In this presentation, we analyze radar and laser altimeter data from CryoSat-2, NASA's Operation IceBridge (OIB), the ICESat Geoscience Laser Altimeter System (GLAS), ICESat-2's predecessor the Multiple Altimeter Beam Experimental Lidar (MABEL), and airborne laser altimeter and imagery campaigns by our research group. These measurements are used to study elevation, elevation change and crevassing throughout the glacier system. Analysis of the imagery from our airborne campaigns provides comprehensive characterizations of the BBGS surface over the course of the surge. Results from the data analysis are compared to numerical modeling experiments.

  3. SPICE: Sentinel-3 Performance Improvement for Ice Sheets

    NASA Astrophysics Data System (ADS)

    McMillan, M.; Escola, R.; Roca, M.; Thibaut, P.; Aublanc, J.; Shepherd, A.; Remy, F.; Benveniste, J.; Ambrózio, A.; Restano, M.

    2017-12-01

    For the past 25 years, polar-orbiting satellite radar altimeters have provided a valuable record of ice sheet elevation change and mass balance. One of the principal challenges associated with radar altimetry comes from the relatively large ground footprint of conventional pulse-limited radars, which reduces their capacity to make measurements in areas of complex terrain. In recent years, progress has been made towards improving ground resolution through the implementation of Synthetic Aperture Radar (SAR), or Delay-Doppler, techniques. In 2010, the launch of CryoSat-2 heralded the start of a new era of SAR Interferometric (SARIn) altimetry. However, because the satellite operated in SARIn and LRM mode over the ice sheets, many of the non-interferometric SAR altimeter processing techniques have been optimized for water and sea ice surfaces only. The launch of Sentinel-3, which provides full non-interferometric SAR coverage of the ice sheets, therefore presents the opportunity to further develop these SAR processing methodologies over ice sheets. Here we present results from SPICE, a 2-year study that focuses on (1) developing and evaluating Sentinel-3 SAR altimetry processing methodologies over the polar ice sheets, and (2) investigating radar wave penetration through comparisons of Ku- and Ka-band satellite measurements. The project, which is funded by ESA's SEOM (Scientific Exploitation of Operational Missions) programme, has worked in advance of the operational phase of Sentinel-3 to emulate Sentinel-3 SAR and pseudo-LRM data from dedicated CryoSat-2 SAR acquisitions made at the Lake Vostok, Dome C and Spirit sites in East Antarctica, and from reprocessed SARIn data in Greenland. In Phase 1 of the project we have evaluated existing processing methodologies, and in Phase 2 we are investigating new evolutions to the Delay-Doppler Processing (DDP) and retracking chains. In this presentation we (1) evaluate the existing Sentinel-3 processing chain by comparing our emulated Sentinel-3 elevations to reference airborne datasets, (2) describe new developments to the DDP and retracking algorithms that are aimed at improving the certainty of retrievals over ice sheets, and (3) investigate radar wave penetration by comparing our SAR data to waveforms and elevations acquired by AltiKa at Ka-band.

  4. Surface elevation change over the Patagonia Ice Fields using CryoSat-2 swath altimetry

    NASA Astrophysics Data System (ADS)

    Foresta, Luca; Gourmelen, Noel; José Escorihuela, MarÍa; Garcia Mondejar, Albert; Wuite, Jan; Shepherd, Andrew; Roca, Mònica; Nagler, Thomas; Brockley, David; Baker, Steven; Nienow, Pete

    2017-04-01

    Satellite altimetry has traditionally been used in the past few decades to infer the elevation of land ice, quantify changes in ice topography and derive mass balance estimates over large and remote areas such as the Greenland and Antarctic ice sheets. Radar Altimetry (RA) is particularly well suited to this task due to its all-weather, year-round capability of observing the ice surface. However, monitoring of ice caps (area < 10^4 km^2) as well as mountain glaciers has proven more challenging. The large footprint of a conventional radar altimeter and relatively coarse ground track coverage are less suited to monitoring comparatively small regions with complex topography, so that mass balance estimates from RA rely on extrapolation methods to regionalize elevation change. Since 2010, the European Space Agency's CryoSat-2 (CS-2) satellite has collected ice elevation measurements over ice caps with its novel radar altimeter. CS-2 provides a higher density of observations with respect to previous satellite altimeters, reduces the along-track footprint using Synthetic Aperture Radar (SAR) processing and locates the across-track origin of a surface reflector in the presence of a slope with SAR Interferometry (SARIn). Here, we exploit CS-2 as a swath altimeter [Hawley et al., 2009; Gray et al., 2013; Christie et al., 2016; Ignéczi et al., 2016; Foresta et al., 2016] over the Southern and Northern Patagonian Ice Fields (SPI and NPI, respectively). The SPI and NPI are the two largest ice masses in the southern hemisphere outside of Antarctica and have been thinning rapidly in recent decades [e.g. Rignot et al., 2003; Willis et al., 2012]. However, studies of surface, volume and mass change in the literature covering the entire SPI and NPI are limited in number due to their remoteness, extremely complex topography and wide range of slopes. In this work, we present rates of surface elevation change for five glaciological years between 2011 and 2016 using swath-processed CS-2 SARIn heights and discuss the spatial and temporal coverage of elevation and its rate of change over the two regions.

  5. Continuous measurements of surface mass balance, firn compaction, and meltwater retention in Greenland for altimetry validation.

    NASA Astrophysics Data System (ADS)

    de la Peña, S.; Howat, I.; Behar, A.; Price, S. F.; Thanga, J.; Crowell, J. M.; Huseas, S.; Tedesco, M.

    2016-12-01

    Observations made in recent years by repeated altimetry from CryoSat-2 and NASA's Operation IceBridge reveal large fluctuations in the firn volume of the Greenland Ice Sheet. Although an order of magnitude smaller than the ice thinning rates observed in some areas at the margins of the ice sheet, short-term departures in surface elevation trends occur over most of the accumulation zone of Greenland. Changes in the thickness of the firn column are influenced by variability in surface mass balance, firn compaction, and abrupt seasonal densification near the surface caused by refreezing at depth of variable amounts of surface meltwater in the summer. These processes and dynamic thinning cannot be differentiated from each other by altimetry alone. Until recently, nearly all information on density and surface mass balance changes over the firn layer came from ice core and snow pit stratigraphy, which provided annual rates with relatively large uncertainties. Here we present direct, continuous measurements of firn density and surface mass balance, along with annual estimates of firn ice content, used to assess observed elevation change in the percolation zone of western Greenland in relation to firn processes. Since 2012, autonomous in-situ firn compaction sensors have monitored several sites in the catchment area of Jakobshavn Isbrae, and since 2015 surface mass balance and surface displacement have been measured continuously using a combination of sensors. In addition to identifying the different components in the altimetry signal, the temporal resolution of the acquired data provides a means to monitor short-term changes in the near-surface firn and to identify individual events causing surface elevation change.

  6. DORIS-based point mascons for the long term stability of precise orbit solutions

    NASA Astrophysics Data System (ADS)

    Cerri, L.; Lemoine, J. M.; Mercier, F.; Zelensky, N. P.; Lemoine, F. G.

    2013-08-01

    In recent years non-tidal Time Varying Gravity (TVG) has emerged as the most important contributor to the error budget of Precision Orbit Determination (POD) solutions for altimeter satellites' orbits. The Gravity Recovery And Climate Experiment (GRACE) mission has provided POD analysts with static and time-varying gravity models that are very accurate over the 2002-2012 time interval, but whose linear rates cannot be safely extrapolated before and after the GRACE lifespan. One such model, based on a combination of data from GRACE and Lageos from 2002-2010, is used in the dynamic POD solutions developed for the Geophysical Data Records (GDRs) of the Jason series of altimeter missions and the equivalent products from lower-altitude missions such as Envisat, Cryosat-2, and HY-2A. In order to accommodate long-term time-variable gravity variations not included in the background geopotential model, we assess the feasibility of using DORIS data to observe local mass variations using point mascons. In particular, we show that the point-mascon approach can stabilize the geographically correlated orbit errors which are of fundamental interest for the analysis of regional Mean Sea Level trends based on altimeter data, and can therefore provide an interim solution in the event of GRACE data loss. The time series of point-mass solutions for Greenland and Antarctica show good agreement with independent series derived from GRACE data, indicating mass loss at rates of 210 Gt/year and 110 Gt/year, respectively.

  7. Sea Ice Thickness Estimates from Data Collected Using Airborne Sensors and Coincident In Situ Data

    NASA Astrophysics Data System (ADS)

    Gardner, J. M.; Brozena, J. M.; Abelev, A.; Hagen, R. A.; Liang, R.; Ball, D.

    2016-12-01

    The Naval Research Laboratory collected data using airborne sensors and coincident in-situ measurements over multiple sites of floating but land-fast ice north of Barrow, AK. The in-situ data provide ground-truth for airborne measurements from a scanning LiDAR (Riegl Q 560i), digital photogrammetry (Applanix DSS-439), a low-frequency SAR (P-band in 2014 and P and L bands in 2015 and 2016) and a snow/Ku radar procured from the Center for Remote Sensing of Ice Sheets of the University of Kansas. The CReSIS radar was updated in 2015 to integrate the snow and Ku radars into a single continuous chirp, thus improving resolution. The objective of the surveys was to aid our understanding of the accuracy of ice thickness estimation via the freeboard method using the airborne sensor suite. Airborne data were collected on multiple overflights of the transect areas. The LiDAR measured total freeboard (ice + snow) referenced to leads in the ice, and produced swaths 200-300 m wide. The SAR imaged the ice beneath the snow and the snow/Ku radar measured snow thickness. The freeboard measurements and snow thickness are used to estimate ice thickness via isostasy and density estimates. Comparisons and processing methodology will be shown using data from three field seasons (2014-2016). The results of this ground-truth experiment will inform our analysis of grids of airborne data collected over areas of sea ice illuminated by CryoSat-2.
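
    For context, the isostasy step mentioned above can be written in a few lines; the densities below are common textbook values used here as assumptions, not the ones adopted by NRL.

        # Hedged sketch: total (ice + snow) freeboard and snow depth -> ice thickness,
        # assuming local hydrostatic equilibrium and illustrative densities.
        RHO_W, RHO_I, RHO_S = 1024.0, 917.0, 320.0  # sea water, sea ice, snow (kg m^-3)

        def ice_thickness(total_freeboard_m, snow_depth_m,
                          rho_w=RHO_W, rho_i=RHO_I, rho_s=RHO_S):
            """Solve rho_i*H + rho_s*h_s = rho_w*(H - F_i), with F_i = total freeboard - snow depth."""
            ice_freeboard = total_freeboard_m - snow_depth_m
            return (rho_w * ice_freeboard + rho_s * snow_depth_m) / (rho_w - rho_i)

        # Example: 0.45 m of total freeboard with 0.25 m of snow gives roughly 2.7 m of ice.
        print(ice_thickness(0.45, 0.25))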

  8. Progression of the 2011-2012 Surge of Bering Glacier and Bagley Ice Field, Alaska

    NASA Astrophysics Data System (ADS)

    Herzfeld, U. C.; McDonald, B.; Stachura, M.; Hale, R.; Trantow, T.; Weltman, A.; Chen, P.

    2012-12-01

    Bering Glacier, Alaska, started a surge in late spring 2011. The surge reached the ice front in May 2011 and extended into Bagley Ice Field by summer 2011. New surge-related crevassing was observed in July 2012. We collected aerial observations, including systematic videographic and photographic imagery, GPS data and laser altimeter data in September 2011 and in July 2012. In this talk, an analysis of surge progression and comparison to the early, mature and late stages of the 1993-1995 surge of Bering Glacier and Bagley Ice Field will be presented. A suite of approaches will be used to this end: Analysis of elevation changes based on CryoSat data, 2009 and 2010 IceBridge data and 2011 and 2012 laser altimeter data collected by our group, geostatistical classification of crevasse types based on imagery, classification of laser altimeter data and analysis of high-resolution satellite imagery (Worldview and GEOS).

  9. Satellite laser ranging using superconducting nanowire single-photon detectors at 1064  nm wavelength.

    PubMed

    Xue, Li; Li, Zhulian; Zhang, Labao; Zhai, Dongsheng; Li, Yuqiang; Zhang, Sen; Li, Ming; Kang, Lin; Chen, Jian; Wu, Peiheng; Xiong, Yaoheng

    2016-08-15

    Satellite laser ranging operating at 1064 nm wavelength using superconducting nanowire single-photon detectors (SNSPDs) is successfully demonstrated. An SNSPD with an intrinsic quantum efficiency of 80% and a dark count rate of 100 cps at 1064 nm wavelength is developed and introduced to Yunnan Observatory in China. With improved closed-loop telescope systems (field of view of about 26''), satellites including Cryosat, Ajisai, and Glonass with ranges of 1600 km, 3100 km, and 19,500 km, respectively, are experimentally ranged with mean echo rates of 1200/min, 4200/min, and 320/min, respectively. To the best of our knowledge, this is the first demonstration of laser ranging to satellites using SNSPDs at 1064 nm wavelength. Theoretical analysis of the detection efficiency and the mean echo rate for typical satellites indicates that it is possible for an SNSPD to range satellites from low Earth orbit to geostationary Earth orbit.

  10. Research Activities for the DORIS Contribution to the Next International Terrestrial Reference Frame

    NASA Technical Reports Server (NTRS)

    Soudarin, L.; Moreaux, G.; Lemoine, F.; Willis, P.; Stepanek, P.; Otten, M.; Govind, R.; Kuzin, S.; Ferrage, P.

    2012-01-01

    For the preparation of ITRF2008, the IDS processed data from 1993 to 2008, including data from TOPEX/Poseidon, the SPOT satellites and Envisat in the weekly solutions. Since the development of ITRF2008, the IDS has been engaged in a number of efforts to improve the reference frame solutions. These efforts include (i) assessing the contribution of the new DORIS satellites, Jason-2 and Cryosat-2 (2008-2011), (ii) individually analyzing the DORIS satellite contributions to geocenter and scale, and (iii) improving orbit dynamics (atmospheric loading effects, satellite surface force modeling, etc.). We report on the preliminary results from these research activities, review the status of the IDS combination, which is now routinely generated from the contributions of the IDS analysis centers, and discuss the prospects for continued improvement in the DORIS contribution to the next international reference frame.

  11. Status of DORIS contribution to ITRF2013

    NASA Astrophysics Data System (ADS)

    Moreaux, G.; Lemoine, F. G.; Soudarin, L.; Willis, P.; Stepanek, P.; Ferrage, P.; Otten, M.; Kuzin, S.

    2013-12-01

    In the context of the forthcoming realization of ITRF2013, the IDS Combination Center is involved in the estimation of DORIS station positions/velocities as well as Earth orientation parameters from DORIS data. These computations are based on the latest series of multi-satellite weekly SINEX solutions from all six IDS Analysis Centers, covering January 1993 to December 2013. Due to some technical issues remaining with Cryosat-2, the combination is delivered in two steps: 1993 to mid-2010, and mid-2010 to 2013. The first objective of this study is to analyze the preliminary version of the DORIS contribution to ITRF2013 from 1993-2010 in terms of (1) geocenter and scale solutions and (2) station positions. The second purpose is to compare this new DORIS ITRF realization to the previous one (for ITRF2008). We will then conclude by addressing the latest news on the IDS actions and the schedule for final delivery of the entire DORIS contribution to ITRF2013 to the IERS.

  12. The NRL 2011 Airborne Sea-Ice Thickness Campaign

    NASA Astrophysics Data System (ADS)

    Brozena, J. M.; Gardner, J. M.; Liang, R.; Ball, D.; Richter-Menge, J.

    2011-12-01

    In March of 2011, the US Naval Research Laboratory (NRL) performed a study focused on the estimation of sea-ice thickness from airborne radar, laser and photogrammetric sensors. The study was funded by ONR to take advantage of the Navy's ICEX2011 ice-camp/submarine exercise, and to serve as a lead-in year for NRL's five-year basic research program on the measurement and modeling of sea ice scheduled to take place from 2012-2017. Researchers from the Army Cold Regions Research and Engineering Laboratory (CRREL) and NRL worked with the Navy Arctic Submarine Lab (ASL) to emplace a 9 km-long ground-truth line near the ice-camp (see Richter-Menge et al., this session) along which ice and snow thickness were directly measured. Additionally, US Navy submarines collected ice draft measurements under the ground-truth line. Repeat passes directly over the ground-truth line were flown, and a grid surrounding the line was also flown, to collect altimeter, LiDAR and photogrammetry data. Five CryoSat-2 satellite tracks were underflown as well, coincident with satellite passage. Estimates of sea ice thickness are calculated assuming local hydrostatic balance, and require the densities of water, ice and snow, snow depth, and freeboard (defined as the elevation of sea ice, plus accumulated snow, above local sea level). Snow thickness is estimated from the difference between LiDAR and radar altimeter profiles, the latter of which is assumed to penetrate any snow cover. The concepts we used to estimate ice thickness are similar to those employed in NASA IceBridge sea-ice thickness estimation. Airborne sensors used for our experiment were a Riegl Q-560 scanning topographic LiDAR, a pulse-limited (2 ns), 10 GHz radar altimeter and an Applanix DSS-439 digital photogrammetric camera (for lead identification). Flights were conducted on a Twin Otter aircraft from Pt. Barrow, AK, and averaged ~5 hours in duration. It is challenging to directly compare results from the swath LiDAR with the pulse-limited radar altimeter, which has a footprint that varies from a few meters to a few tens of meters depending on altitude and on the roughness of the reflective surface. Intercalibration of the two instruments was accomplished at leads in the ice and by multiple over-flights of four radar corner cubes set ~2 m above the snow along the ground-truth line. Direct comparison of successive flights of the ground-truth line to flights done in a grid pattern over and adjacent to the line was complicated by the ~20-30 m drift of the ice floe between successive flight lines. This rapid ice movement required that the laser and radar data be translated into an ice-fixed, rather than geographic, reference frame. This was facilitated by geodetic GPS receiver measurements at the ice-camp and Pt. Barrow. The NRL data set, in combination with the ground-truth line and submarine upward-looking sonar data, will aid in understanding the error budgets of our systems, the IceBridge airborne measurements (also flown over the ground-truth line), and the CryoSat-2 data over a wide range of ice types.

  13. Validation and scaling of soil moisture in a semi-arid environment: SMAP Validation Experiment 2015 (SMAPVEX15)

    USDA-ARS?s Scientific Manuscript database

    The NASA SMAP (Soil Moisture Active Passive) mission conducted the SMAP Validation Experiment 2015 (SMAPVEX15) in order to support the calibration and validation activities of SMAP soil moisture data product.The main goals of the experiment were to address issues regarding the spatial disaggregation...

  14. CFD validation experiments for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building-block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given and gaps are identified where future experiments could provide new validation data.

  15. Assimilation of satellite altimetry data in hydrological models for improved inland surface water information: Case studies from the "Sentinel-3 Hydrologic Altimetry Processor prototypE" project (SHAPE)

    NASA Astrophysics Data System (ADS)

    Gustafsson, David; Pimentel, Rafael; Fabry, Pierre; Bercher, Nicolas; Roca, Mónica; Garcia-Mondejar, Albert; Fernandes, Joana; Lázaro, Clara; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme

    2017-04-01

    This communication is about the Sentinel-3 Hydrologic Altimetry Processor prototypE (SHAPE) project, with a focus on the components dealing with assimilation of satellite altimetry data into hydrological models. The SHAPE research and development project started in September 2015, within the Scientific Exploitation of Operational Missions (SEOM) programme of the European Space Agency. The objectives of the project are to further develop and assess recent improvements in altimetry data, processing algorithms and methods for assimilation in hydrological models, with the overarching goal of supporting improved scientific use of altimetry data and improved inland water information. The objective is also to take scientific steps towards a future Inland Water dedicated processor on the Sentinel-3 ground segment. The study focuses on three main variables of interest in hydrology: river stage, river discharge and lake level. The improved altimetry data from the project are used to estimate river stage, river discharge and lake level information in a data assimilation framework using the hydrological dynamic and semi-distributed model HYPE (Hydrological Predictions for the Environment). This model has been developed by SMHI and includes a data assimilation module based on the Ensemble Kalman filter method. The method will be developed and assessed for a number of case studies with available in situ reference data and satellite altimetry data, based mainly on the CryoSat-2 mission on which the new processor will be run. Results will be presented from case studies on the Amazon and Danube rivers and Lake Vänern (Sweden). The production of alti-hydro products (water level time series) is improved thanks to the use of water masks. This eases the geo-selection of the CryoSat-2 altimetric measurements, since they are acquired from a geodetic orbit and are thus spread along the river course in space and time. The specific processing of data from this geodetic-orbit space-time pattern will be discussed, as well as the subsequent possible strategies for data assimilation into models (eventually highlighting a generalized approach toward multi-mission data processing). Notably, in the case of data assimilation along the course of rivers, the river slope might be estimated and compensated for, in order to produce local water level "pseudo time series" at arbitrary locations, and specifically at the model's inlets.
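
    As background on the assimilation scheme named above, here is a generic, textbook-style stochastic Ensemble Kalman filter update in Python; it is a sketch under simplifying assumptions (linear observation operator, uncorrelated observation errors), not SMHI's HYPE implementation.

        # Generic stochastic EnKF analysis step (illustrative sketch only).
        import numpy as np

        def enkf_update(ensemble, obs, obs_err_std, H, seed=0):
            """ensemble: (n_state, n_members); obs: (n_obs,); H: (n_obs, n_state)."""
            ensemble, obs, H = np.asarray(ensemble), np.asarray(obs), np.asarray(H)
            n_obs, n_members = obs.size, ensemble.shape[1]
            rng = np.random.default_rng(seed)
            obs_pert = obs[:, None] + rng.normal(0.0, obs_err_std, (n_obs, n_members))  # perturbed observations
            A = ensemble - ensemble.mean(axis=1, keepdims=True)    # state anomalies
            HA = H @ A                                             # anomalies in observation space
            P_xy = A @ HA.T / (n_members - 1)                      # state-observation cross covariance
            P_yy = HA @ HA.T / (n_members - 1) + np.eye(n_obs) * obs_err_std**2
            K = P_xy @ np.linalg.inv(P_yy)                         # Kalman gain
            return ensemble + K @ (obs_pert - H @ ensemble)        # analysis ensemble

    In the river application, the observation operator would map modelled water levels (or discharge) at the virtual-station reaches to the altimetric heights being assimilated.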

  16. Basic Radar Altimetry Toolbox: Tools and Tutorial To Use Radar Altimetry For Cryosphere

    NASA Astrophysics Data System (ADS)

    Benveniste, J. J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.

    2010-12-01

    Radar altimetry is a technique whose applications are expanding. While considerable effort has been made for oceanography users (including easy-to-use data), the use of these data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious, especially for new users of altimetry data products. ESA and CNES thus had the Basic Radar Altimetry Toolbox developed a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL, - as processing/extraction routines, through the on-line command mode, - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. The toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 1200 people had downloaded it as of Summer 2010, with many "newcomers" to altimetry among them, including teachers and professors. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in version 2; others are under development, and some are in discussion for the future. Data use cases on cryosphere applications will be presented. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  17. On the Impact of Snow Salinity on CryoSat-2 First-Year Sea Ice Thickness Retrievals

    NASA Astrophysics Data System (ADS)

    Nandan, V.; Yackel, J.; Geldsetzer, T.; Mahmud, M.

    2017-12-01

    The European Space Agency's Ku-band altimeter CryoSat-2 (CS-2) has demonstrated its potential to provide extensive basin-scale spatial and temporal measurements of Arctic sea ice freeboard. It is assumed that CS-2 altimetric returns originate from the snow/sea ice interface (assumed to be the main scattering horizon). However, from newly formed thin ice (~0.6 m) through to thick first-year sea ice (FYI) (~2 m), upward wicking of brine into the snow cover from the underlying sea ice surface produces saline snow layers, especially in the bottom 6-8 cm of the snow cover. This in turn modifies the brine volume at or near the snow/sea ice interface, altering the dielectric and scattering properties of the snow cover and leading to strong Ku-band microwave attenuation within the upper snow volume. Such significant reductions in Ku-band penetration may substantially affect CS-2 FYI freeboard retrievals. Therefore, the goal of this study is to evaluate a theoretical approach to estimate snow-salinity-induced uncertainty in CS-2 Arctic FYI freeboard measurements. Using the freeboard-to-thickness hydrostatic equilibrium equation, we quantify the error differences between the CS-2 FYI thickness (assuming complete penetration of CS-2 radar signals to the snow/FYI interface) and the FYI thickness based on the modeled Ku-band main scattering horizon for different snow cover cases. We utilized naturally occurring saline and non-saline snow cover cases ranging between 6 cm and 32 cm from the Canadian Arctic, observed during late winter from 1993 to 2017, on newly formed ice (~0.6 m), medium (~1.5 m) and thick FYI (~2 m). Our results suggest that, irrespective of the thickness of the snow cover overlaying FYI, the thickness of brine-wetted snow layers and the actual FYI freeboard strongly influence the amount by which CS-2 FYI freeboard estimates, and thus thickness calculations, are overestimated. This effect is accentuated for increasingly thick saline snow covers overlaying newly formed ice, for which FYI thickness was overestimated by 250%, compared to 80% overestimation for thinner saline snow covers, and the error decreases with increasing FYI thickness. Our study recommends that the CS-2 sea-ice community treat snow salinity as a potential error source affecting CS-2 Arctic FYI freeboard and thickness retrievals.
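
    To make the leverage of the scattering horizon explicit, one common form of the freeboard-to-thickness conversion and the resulting sensitivity to a freeboard bias ΔF (caused by a scattering horizon raised above the snow/ice interface) can be written as follows; the densities are illustrative assumptions, not the values used in this study.

        % Hydrostatic freeboard-to-thickness conversion and freeboard-bias propagation
        % (illustrative densities; F_i: ice freeboard, h_s: snow depth)
        H = \frac{\rho_w}{\rho_w - \rho_i}\, F_i + \frac{\rho_s}{\rho_w - \rho_i}\, h_s ,
        \qquad
        \Delta H = \frac{\rho_w}{\rho_w - \rho_i}\, \Delta F \approx 9.6\, \Delta F
        \quad (\rho_w = 1024,\ \rho_i = 917~\mathrm{kg\,m^{-3}})

    The roughly tenfold amplification explains why even a few centimetres of misplaced scattering horizon in brine-wetted snow translate into large relative thickness errors over thin first-year ice.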

  18. Informing a hydrological model of the Ogooué with multi-mission remote sensing data

    NASA Astrophysics Data System (ADS)

    Kittel, Cecile; Bauer-Gottwein, Peter; Nielsen, Karina; Tøttrup, Christian

    2017-04-01

    Knowledge of the hydrological regimes of river basins is crucial for water management. However, data requirements often limit the applicability of hydrological models in basins with scarce in-situ data. Remote sensing provides a unique possibility to acquire information on hydrological variables in these basins. This study explores how multi-mission remote sensing data can inform a hydrological model. The Ogooué basin in Gabon is used as the study area. No previous modelling efforts have been conducted for the basin and only historical flow and precipitation observations are available. Publicly available remote sensing observations are used to parametrize, force, calibrate and validate a hydrological model of the Ogooué. The modelling framework used in the study is a lumped conceptual rainfall-runoff model based on the Budyko framework coupled to a Muskingum routing scheme. Precipitation is a crucial driver of the land-surface water balance; therefore two satellite-based rainfall estimates, Tropical Rainfall Measuring Mission (TRMM) product 3B42 version 7 and Famine Early Warning System - Rainfall Estimate (FEWS-RFE), are compared. The comparison shows good seasonal and spatial agreement between the products; however, TRMM consistently predicts significantly more precipitation: 1726 mm on average per year against 1556 mm for FEWS-RFE. The best modelling results are obtained with the TRMM precipitation forcing. Model calibration combines historical in-situ flow observations and GRACE total water storage observations using the Jet Propulsion Laboratory (JPL) mascon solution in a multi-objective approach. The two models are calibrated using flow duration curves and climatology benchmarks to overcome the lack of simultaneity between simulated and observed discharge. The objectives are aggregated into a global objective function, and the models are calibrated using the Shuffled Complex Evolution Algorithm. Water height observations from drifting-orbit altimetry missions are extracted along the river line, using a detailed water mask based on Sentinel-1 SAR imagery. 1399 single CryoSat-2 altimetry observations and 48 ICESat observations are acquired. Additionally, water heights have been measured by the repeat-orbit satellite missions Envisat and Jason-2 at 12 virtual stations along the river. The four missions show generally good agreement in terms of mean annual water height amplitudes. The altimetry observations are used to validate the hydrological model of the Ogooué River. By combining hydrological modelling and remote sensing, new information on an otherwise unstudied basin is obtained. The study shows the potential of using remote sensing observations to parameterize, force, calibrate and validate models of poorly gauged river basins. Specifically, the study shows how Sentinel-1 SAR imagery supports the extraction of satellite altimetry data over rivers. The model can be used to assess climate change scenarios, evaluate hydraulic infrastructure development projects and predict the impact of irrigation diversions.
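
    As an aside on the routing component named above, a minimal Muskingum routing sketch in Python is shown here; the travel time K, weighting X and time step are placeholder values, not the calibrated Ogooué parameters.

        # Muskingum routing of an inflow hydrograph through a single reach (sketch).
        def muskingum_route(inflow, K=12.0, X=0.2, dt=6.0):
            """inflow: discharges per time step; K and dt in the same time unit; 0 <= X <= 0.5."""
            denom = 2.0 * K * (1.0 - X) + dt
            c0 = (dt - 2.0 * K * X) / denom
            c1 = (dt + 2.0 * K * X) / denom
            c2 = (2.0 * K * (1.0 - X) - dt) / denom
            outflow = [inflow[0]]                      # assume an initial steady state
            for i in range(1, len(inflow)):
                outflow.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[-1])
            return outflow

        # Example: route a triangular flood wave (m^3/s) with a 6-hour step.
        print(muskingum_route([10, 30, 60, 45, 30, 20, 12, 10]))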

  19. Model-data comparisons of crevasses in accelerating glaciers exemplified for the 2011-2013 surge of Bering Glacier, Alaska

    NASA Astrophysics Data System (ADS)

    Trantow, T.; Herzfeld, U. C.

    2017-12-01

    Glacier acceleration, ubiquitous along the periphery of the major ice sheets, presents one of the main uncertainties in modeling future global sea-level rise according to the IPCC 5th Assessment Report (2013). The surge phenomenon is one type of glacial acceleration and is the least understood. During a surge, large-scale elevation change and significant crevassing occur throughout the entire ice system. Crevasses are the most obvious manifestations of the surge dynamics and provide a source of geophysical information that allows reconstruction of deformation processes. The recent surge of the Bering-Bagley Glacier System (BBGS), Alaska, in 2011-2013 provides an excellent test case to study surging through airborne and satellite observations together with numerical modeling. A 3D full-Stokes finite-element model of the BBGS has been created using the Elmer/Ice software for structural and dynamical investigations of the surge. A von Mises condition is applied to modeled surface stresses to predict where crevassing would occur during the surge. The model uses CryoSat-2-derived surface topography (Baseline-C), bedrock topography, Glen's flow law with an isothermal assumption and a uniform linear friction law at the ice/bedrock boundary to represent the surge state in early 2011 when peak velocities were observed. Additionally, geostatistical characterization applied to optical satellite imagery provides an observational data set for model-data comparisons. Observed and modeled crevasse characteristics are compared with respect to their location, magnitude and orientation. Similarity mapping applied to the modeled von Mises stress and observed surface roughness values indicates that the two quantities are correlated. Results indicate that large-scale surface crevasses resulting from a surge are connected to the bedrock topography of the glacier system. The model-data comparisons used in this analysis serve to validate the numerical model and provide insight into the quality of our model input.
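
    For readers unfamiliar with the crevassing criterion mentioned above, a minimal sketch follows: the von Mises equivalent stress is computed from a modelled Cauchy stress tensor and compared against a threshold. The threshold value here is an arbitrary illustration, not the one used in the BBGS study.

        # Von Mises stress from a 3x3 Cauchy stress tensor and a simple crevassing test (sketch).
        import numpy as np

        def von_mises(stress):
            dev = stress - np.trace(stress) / 3.0 * np.eye(3)   # deviatoric part
            return np.sqrt(1.5 * np.sum(dev * dev))

        def predicts_crevassing(stress, threshold_pa=1.0e5):    # illustrative 100 kPa threshold
            return von_mises(stress) > threshold_pa

        # Example: pure shear of 100 kPa gives an equivalent stress of ~173 kPa.
        sigma = np.array([[0.0, 1.0e5, 0.0],
                          [1.0e5, 0.0, 0.0],
                          [0.0, 0.0, 0.0]])
        print(von_mises(sigma), predicts_crevassing(sigma))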

  20. Temporal changes in surface roughness around 88°S from repeat high-resolution Airborne Topographic Mapper laser altimetry

    NASA Astrophysics Data System (ADS)

    Studinger, M.; Brunt, K. M.; Medley, B.; Casey, K.; Neumann, T.

    2017-12-01

    The southern convergence of all ICESat-2 and CryoSat-2 tracks at 88°S is in a region of relatively low accumulation and surface slope, making it ideal for satellite altimetry calibration and validation. In order to evaluate the stability and surface characteristics of the area we have analyzed repeat airborne laser altimetry measurements acquired around 88°S during 2014 and 2016 by NASA's Airborne Topographic Mapper (ATM) as part of Operation IceBridge. ATM is a conical scanner that operates at a wavelength of 532 nm, with a footprint of 1 meter and a 250-m-wide swath on the ground. The ATM Level 2 ICESSN data product includes slope and roughness estimates in 80 m × 80 m platelets across the swath. The mean surface roughness around 88°S for the 2014 data is 9.4 ± 2.0 cm, with the repeat flights in 2016 showing 8.6 ± 2.8 cm. The 2014 data reveal several areas where surface roughness doubles over very short spatial scales of only a few hundred meters. These features are several tens of km wide and appear to be oriented parallel to the main sastrugi direction visible in ATM spot elevation data and Digital Mapping System (DMS) visual imagery collected simultaneously. The rougher surface features are also present in the CReSIS snow radar data collected at the same time. These areas of increased surface roughness disappear in 2016, or are significantly reduced in amplitude with the sharpness of their edges also reduced. The combination of simultaneous altimetry, snow radar and visual imagery on a regional scale provides a unique data set to study small-scale depositional and erosional processes and their temporal variability. Our long-term goal is to quantify the spatial variability in snow accumulation rates south of 86°S in support of past, current and future altimetry measurements and surface mass balance model evaluation.
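
    To illustrate the kind of platelet statistic contained in the ICESSN product (a generic sketch, not the actual ATM processing), slope and roughness can be derived from the laser spots in a platelet by a plane fit:

        # Plane-detrended slope and RMS roughness of one platelet of laser spots (sketch).
        import numpy as np

        def platelet_slope_roughness(x, y, z):
            """x, y, z: 1-D arrays of spot coordinates (m) and elevations (m)."""
            G = np.column_stack([x, y, np.ones_like(x)])
            coef, *_ = np.linalg.lstsq(G, z, rcond=None)          # fit z = a*x + b*y + c
            residuals = z - G @ coef
            return np.hypot(coef[0], coef[1]), residuals.std()    # (slope magnitude, roughness)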

  1. Development and Validation of an Internet Use Attitude Scale

    ERIC Educational Resources Information Center

    Zhang, Yixin

    2007-01-01

    This paper describes the development and validation of a new 40-item Internet Attitude Scale (IAS), a one-dimensional inventory for measuring the Internet attitudes. The first experiment initiated a generic Internet attitude questionnaire, ensured construct validity, and examined factorial validity and reliability. The second experiment further…

  2. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so it becomes easier to validate if any of the underlying assumptions are violated.

  3. The Role of Structural Models in the Solar Sail Flight Validation Process

    NASA Technical Reports Server (NTRS)

    Johnston, John D.

    2004-01-01

    NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scaleable solar sails for future space science missions. There are six validation objectives planned for the ST9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in space structural characteristics (focus of poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.

  4. Variable Basal Melt Rates of Antarctic Peninsula Ice Shelves, 1994-2016

    NASA Astrophysics Data System (ADS)

    Adusumilli, Susheel; Fricker, Helen Amanda; Siegfried, Matthew R.; Padman, Laurie; Paolo, Fernando S.; Ligtenberg, Stefan R. M.

    2018-05-01

    We have constructed 23-year (1994-2016) time series of Antarctic Peninsula (AP) ice-shelf height change using data from four satellite radar altimeters (ERS-1, ERS-2, Envisat, and CryoSat-2). Combining these time series with output from atmospheric and firn models, we partitioned the total height-change signal into contributions from varying surface mass balance, firn state, ice dynamics, and basal mass balance. On the Bellingshausen coast of the AP, ice shelves lost 84 ± 34 Gt a⁻¹ to basal melting, compared to contributions of 50 ± 7 Gt a⁻¹ from surface mass balance and ice dynamics. Net basal melting on the Weddell coast was 51 ± 71 Gt a⁻¹. Recent changes in ice-shelf height include increases over major AP ice shelves driven by changes in firn state. Basal melt rates near Bawden Ice Rise, a major pinning point of Larsen C Ice Shelf, showed large increases, potentially leading to substantial loss of buttressing if sustained.
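
    Schematically, the partitioning described here can be summarized as below; the symbols and the conversion to a basal melt rate are written in a generic form (assuming hydrostatic equilibrium of a freely floating shelf) and are not taken verbatim from the paper.

        % Height-change partitioning and basal melt rate (schematic, assumed symbols)
        \left.\frac{\partial h}{\partial t}\right|_{\mathrm{obs}} =
        \left.\frac{\partial h}{\partial t}\right|_{\mathrm{SMB}} +
        \left.\frac{\partial h}{\partial t}\right|_{\mathrm{firn}} +
        \left.\frac{\partial h}{\partial t}\right|_{\mathrm{dyn}} +
        \left.\frac{\partial h}{\partial t}\right|_{\mathrm{basal}},
        \qquad
        m_b \approx -\frac{1}{1 - \rho_i/\rho_w}\left.\frac{\partial h}{\partial t}\right|_{\mathrm{basal}}

    where m_b is the basal melt rate in ice equivalent and the factor 1/(1 - ρ_i/ρ_w) converts a freeboard-height change into a thickness change.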

  5. Monitoring lake level changes by altimetry in the arid region of Central Asia

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Liao, J. J.; Shen, G. Z.; Zhang, X. L.

    2017-07-01

    The study of lake level changes in the arid region of Central Asia not only has important significance for the management and sustainable development of inland water resources, but also provides the basis for further study of the response of lakes to climate change and human activities. Therefore, in this paper, eleven typical lakes in Central Asia were observed. The lake edges were obtained through image interpretation using quasi-synchronous MODIS images, and then long-period (2002-2015) water level information was acquired using ENVISAT/RA-2 and CryoSat-2 satellite-borne radar altimeter data. The results show that all 11 lakes have pronounced seasonal water-level variations within the year, peaking in different months. During 2002-2015, their water levels generally show a decreasing trend, except for Sarygamysh Lake, Alakol Lake and the North Aral Sea. The alpine lakes are the most stable, while open lakes show the strongest level changes and closed lakes vary from lake to lake.

  6. Mass balance reassessment of glaciers draining into the Abbot and Getz Ice Shelves of West Antarctica

    NASA Astrophysics Data System (ADS)

    Chuter, S. J.; Martín-Español, A.; Wouters, B.; Bamber, J. L.

    2017-07-01

    We present a reassessment of input-output method ice mass budget estimates for the Abbot and Getz regions of West Antarctica using CryoSat-2-derived ice thickness estimates. The mass budget is 8 ± 6 Gt yr⁻¹ and 5 ± 17 Gt yr⁻¹ for the Abbot and Getz sectors, respectively, for the period 2006-2008. Over the Abbot region, our results resolve a previous discrepancy with elevation rates from altimetry, due to a previous 30% overestimation of ice thickness. For the Getz sector, our results are at the more positive bound of estimates from other techniques. Grounding line velocity increases of up to 20% between 2007 and 2014, alongside mean elevation rates of -0.67 ± 0.13 m yr⁻¹ between 2010 and 2013, indicate the onset of a dynamic thinning signal. Mean snowfall trends of -0.33 m yr⁻¹ water equivalent since 2006 indicate that recent mass trends are driven by both ice dynamics and surface processes.
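
    For context, the "output" side of such an input-output budget is an ice discharge summed across a grounding-line flux gate; a hedged sketch with illustrative numbers (not the Abbot or Getz values) is:

        # Ice discharge through a flux gate from thickness and velocity samples (sketch).
        RHO_ICE = 917.0  # kg m^-3, assumed ice density

        def gate_discharge_gt_per_yr(thickness_m, speed_m_per_yr, spacing_m):
            """Sum rho * H * v * dx over gate segments; returns Gt/yr (1 Gt = 1e12 kg)."""
            flux_kg_per_yr = sum(RHO_ICE * h * v * spacing_m
                                 for h, v in zip(thickness_m, speed_m_per_yr))
            return flux_kg_per_yr / 1.0e12

        # The mass budget is then the surface mass balance input minus this discharge.
        print(gate_discharge_gt_per_yr([800.0, 950.0, 700.0], [400.0, 600.0, 350.0], 2000.0))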

  7. Assessment of Glacial Isostatic Adjustment in Greenland using GPS

    NASA Astrophysics Data System (ADS)

    Khan, S. A.; Bevis, M. G.; Sasgen, I.; van Dam, T. M.; Wahr, J. M.; Wouters, B.; Bamber, J. L.; Willis, M. J.; Knudsen, P.; Helm, V.; Kuipers Munneke, P.; Muresan, I. S.

    2015-12-01

    The Greenland GPS network (GNET) was constructed to provide a new means to assess viscoelastic and elastic adjustments driven by past and present-day changes in ice mass. Here we assess existing glacial isostatic adjustment (GIA) predictions by analysing 1995-2015 data from 61 continuous GPS receivers located along the margin of the Greenland ice sheet. Since GPS receivers measure both the GIA and elastic signals, we isolate the GIA signal by removing the elastic adjustments of the lithosphere due to present-day mass changes, using high-resolution fields of ice surface elevation change derived from satellite and airborne altimetry measurements (ERS-1/2, ICESat, ATM, ENVISAT, and CryoSat-2). For most GPS stations, our observed GIA rates contradict GIA predictions; in particular, we find uplift rates in southeast Greenland of up to 14 mm/yr, while models predict rates of 0-2 mm/yr. Our results suggest possible improvements of GIA predictions, and hence of the poorly constrained ice load history and Earth structure models for Greenland.

  8. An intercomparison of a large ensemble of statistical downscaling methods for Europe: Overall results from the VALUE perfect predictor cross-validation experiment

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors - aligned with the EURO-CORDEX experiment - and 3) pseudo-reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a European-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contributions to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including both data (downscaled values) and metadata (characterizing different aspects of the downscaling methods). This constitutes the largest and most comprehensive intercomparison of statistical downscaling methods to date. Here, we present an overall validation, analyzing marginal and temporal aspects to assess the intrinsic performance and added value of statistical downscaling methods at both annual and seasonal levels. This validation takes into account the different properties/limitations of different approaches and techniques (as reported in the provided metadata) in order to perform a fair comparison. It is pointed out that this experiment alone is not sufficient to evaluate the limitations of (MOS) bias correction techniques. Moreover, it also does not fully validate PP, since it does not tell us whether we have the right predictors and whether the PP assumption is valid. These problems will be analyzed in the subsequent community-open VALUE experiments 2) and 3), which will be open for participation throughout the present year.

  9. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    PubMed

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.

  10. New approaches to observation and modeling of fast-moving glaciers and ice streams

    NASA Astrophysics Data System (ADS)

    Herzfeld, U. C.; Trantow, T.; Markle, M. J.; Medley, G.; Markus, T.; Neumann, T.

    2016-12-01

    In this paper, we will give an overview of several new approaches to remote-sensing observations and analysis and to modeling of fast glacier flow. The approaches will be applied in case studies of different types of fast-moving glaciers: (1) the Bering-Bagley Glacier System, Alaska (a surge-type glacier system), (2) Jakobshavn Isbræ, Greenland (a tide-water terminating fjord glacier and outlet of the Greenland Inland Ice), and (3) Icelandic Ice Caps (manifestations of the interaction of volcanic and glaciologic processes). On the observational side, we will compare the capabilities of lidar and radar altimeters, including ICESat's Geoscience Laser Altimeter System (GLAS), CryoSat-2's Synthetic Aperture Interferometric Radar Altimeter (SIRAL) and the future ICESat-2 Advanced Topographic Laser Altimeter System (ATLAS), especially regarding retrieval of surface heights over crevassed regions as typical of spatial and temporal acceleration. Properties that can be expected from ICESat-2 ATLAS data will be illustrated based on analyses of data from ICESat-2 simulator instruments: the Slope Imaging Multi-polarization Photon-counting Lidar (SIMPL) and the Multiple Altimeter Beam Experimental Lidar (MABEL). Information from altimeter data will be augmented by an automated surface classification based on image data, which includes satellite imagery such as LANDSAT and WorldView as well as airborne video imagery of ice surfaces. Numerical experiments using Elmer/Ice will be employed to link parameters derived from observations to physical processes during the surge of the Bering-Bagley Glacier System. This allows identification of processes that can be explained in an existing framework and processes that may require new concepts for glacier evolution. Topics include zonation of surge progression in a complex glacier system and crevassing as an indication, storage of glacial water, influence of basal topography and the role of friction laws.

  11. Improved estimate of accelerated Antarctica ice mass losses from GRACE, Altimetry and surface mass balance from regional climate model output

    NASA Astrophysics Data System (ADS)

    Velicogna, I.; Sutterley, T. C.; A, G.; van den Broeke, M. R.; Ivins, E. R.

    2016-12-01

    We use Gravity Recovery and Climate Experiment (GRACE) monthly gravity fields to determine the regional acceleration in ice mass loss in Antarctica for 2002-2016. We find that the total mass loss is controlled by only a few regions. In Antarctica, the Amundsen Sea (AS) sector and the Antarctic Peninsula account for 65% and 18%, respectively, of the total loss (186 ± 10 Gt/yr) mainly from ice dynamics. The AS sector contributes most of the acceleration in loss (9 ± 1 Gt/yr²), and Queen Maud Land, East Antarctica, is the only sector with a significant mass gain due to a local increase in SMB (57 ± 5 Gt/yr). We compare GRACE regional mass balance estimates with independent estimates from ICESat-1 and Operation IceBridge laser altimetry, CryoSat-2 radar altimetry, and surface mass balance outputs from RACMO2.3. In the Amundsen Sea Embayment of West Antarctica, an area experiencing rapid retreat and mass loss to the sea, we find good agreement between GRACE and altimetry estimates. Comparison of GRACE with these independent techniques in East Antarctica shows that GIA estimates from the new regional ice deglaciation models underestimate the GIA correction in the EAIS interior, which implies larger losses from the Antarctic ice sheet by about 70 Gt/yr. Sectors where we are observing the largest losses are closest to warm circumpolar water, and with polar constriction of the westerlies enhanced by climate warming, we expect these sectors to contribute more and more to sea level as the ice shelves that protect these glaciers melt faster in contact with more heat from the surrounding ocean.

  12. Methodology for turbulence code validation: Quantification of simulation-experiment agreement and application to the TORPEX experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, Paolo; Theiler, C.; Fasoli, A.

    A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and - finally - how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.

  13. Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation

    NASA Technical Reports Server (NTRS)

    Ross, James C.

    2016-01-01

    Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation, and as computational methods have improved, so have the validation experiments. Validation is also a moving target because computational methods improve, requiring validation for the new aspects of flow physics that the computations aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper will present various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven to be difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.

  14. Temperature and heat flux datasets of a complex object in a fire plume for the validation of fire and thermal response codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jernigan, Dann A.; Blanchat, Thomas K.

    It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object in addition to measuring the thermal response of said object located within the fire plume for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.

  15. Electrolysis Performance Improvement and Validation Experiment

    NASA Technical Reports Server (NTRS)

    Schubert, Franz H.

    1992-01-01

    Viewgraphs on electrolysis performance improvement and validation experiment are presented. Topics covered include: water electrolysis: an ever increasing need/role for space missions; static feed electrolysis (SFE) technology: a concept developed for space applications; experiment objectives: why test in microgravity environment; and experiment description: approach, hardware description, test sequence and schedule.

  16. A CFD validation roadmap for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given and gaps identified where future experiments would provide the needed validation data.

  17. A CFD validation roadmap for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1993-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given and gaps identified where future experiments would provide the needed validation data.

  18. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin

    PubMed Central

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2016-01-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a “complete mystical experience” that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. PMID:26442957
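
    As a concrete illustration of the scoring described above, the sketch below computes per-factor percent-of-maximum scores and flags a "complete mystical experience". The item-to-factor index assignments and the 60%-of-maximum threshold are placeholders and assumptions for illustration only, not the published scoring key.

      # Illustrative MEQ30 scoring sketch (not the authors' official scoring code).
      # Item-to-factor assignments below are placeholders; the published MEQ30
      # groups its 30 items into four factors, and a "complete mystical
      # experience" is often defined by a percent-of-maximum criterion on every
      # factor (assumed here to be 60%).
      import numpy as np

      # Hypothetical item indices (0-based) for each factor -- replace with the
      # published assignments before any real use.
      FACTORS = {
          "mystical": list(range(0, 15)),
          "positive_mood": list(range(15, 21)),
          "transcendence": list(range(21, 27)),
          "ineffability": list(range(27, 30)),
      }
      MAX_ITEM_SCORE = 5          # items rated 0-5
      COMPLETE_THRESHOLD = 0.60   # assumed criterion for a "complete" experience

      def score_meq30(responses):
          """Return per-factor percent-of-maximum scores and a completeness flag."""
          r = np.asarray(responses, dtype=float)
          scores = {name: r[idx].sum() / (len(idx) * MAX_ITEM_SCORE)
                    for name, idx in FACTORS.items()}
          complete = all(v >= COMPLETE_THRESHOLD for v in scores.values())
          return scores, complete

      # Example: a simulated participant
      rng = np.random.default_rng(0)
      scores, complete = score_meq30(rng.integers(0, 6, size=30))
      print(scores, "complete mystical experience:", complete)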

  19. Patient Experience and Satisfaction with Inpatient Service: Development of Short Form Survey Instrument Measuring the Core Aspect of Inpatient Experience

    PubMed Central

    Wong, Eliza L. Y.; Coulter, Angela; Hewitson, Paul; Cheung, Annie W. L.; Yam, Carrie H. K.; Lui, Siu fai; Tam, Wilson W. S.; Yeoh, Eng-kiong

    2015-01-01

    Patient experience reflects quality of care from the patients’ perspective; therefore, patients’ experiences are important data in the evaluation of the quality of health services. The development of an abbreviated, reliable and valid instrument for measuring inpatients’ experience would reflect the key aspect of inpatient care from patients’ perspective as well as facilitate quality improvement by cultivating patient engagement and allow the trends in patient satisfaction and experience to be measured regularly. The study developed a short-form inpatient instrument and tested its ability to capture a core set of inpatients’ experiences. The Hong Kong Inpatient Experience Questionnaire (HKIEQ) was established in 2010; it is an adaptation of the General Inpatient Questionnaire of the Care Quality Commission created by the Picker Institute in United Kingdom. This study used a consensus conference and a cross-sectional validation survey to create and validate a short-form of the Hong Kong Inpatient Experience Questionnaire (SF-HKIEQ). The short-form, the SF-HKIEQ, consisted of 18 items derived from the HKIEQ. The 18 items mainly covered relational aspects of care under four dimensions of the patient’s journey: hospital staff, patient care and treatment, information on leaving the hospital, and overall impression. The SF-HKIEQ had a high degree of face validity, construct validity and internal reliability. The validated SF-HKIEQ reflects the relevant core aspects of inpatients’ experience in a hospital setting. It provides a quick reference tool for quality improvement purposes and a platform that allows both healthcare staff and patients to monitor the quality of hospital care over time. PMID:25860775

  20. Forensic Uncertainty Quantification of Explosive Dispersal of Particles

    NASA Astrophysics Data System (ADS)

    Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho

    2017-06-01

    In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued with poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments to discover possible sources of uncertainty that may have been missed in initial design of experiments or under-reported. The authors' experience to date has been that, by making an analogy to crime scene investigation when looking at validation experiments, valuable insights may be gained. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator to help remove possible implicit biases and increase the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.

  1. The Development and Validation of a Life Experience Inventory for the Identification of Creative Electrical Engineers.

    ERIC Educational Resources Information Center

    Michael, William B.; Colson, Kenneth R.

    1979-01-01

    The construction and validation of the Life Experience Inventory (LEI) for the identification of creative electrical engineers are described. Using the number of patents held or pending as a criterion measure, the LEI was found to have high concurrent validity. (JKS)

  2. CFD Modeling Needs and What Makes a Good Supersonic Combustion Validation Experiment

    NASA Technical Reports Server (NTRS)

    Gaffney, Richard L., Jr.; Cutler, Andrew D.

    2005-01-01

    If a CFD code/model developer is asked what experimental data he wants to validate his code or numerical model, his answer will be: "Everything, everywhere, at all times." Since this is not possible, practical, or even reasonable, the developer must understand what can be measured within the limits imposed by the test article, the test location, the test environment and the available diagnostic equipment. At the same time, it is important for the experimentalist/diagnostician to understand what the CFD developer needs (as opposed to wants) in order to conduct a useful CFD validation experiment. If these needs are not known, it is possible to neglect easily measured quantities at locations needed by the developer, rendering the data set useless for validation purposes. It is also important for the experimentalist/diagnostician to understand what the developer is trying to validate so that the experiment can be designed to isolate (as much as possible) the effects of the particular physical phenomenon that is associated with the model to be validated. The probability of a successful validation experiment can be greatly increased if the two groups work together, each understanding the needs and limitations of the other.

  3. An Overlooked Population in Community College: International Students' (In)Validation Experiences With Academic Advising

    ERIC Educational Resources Information Center

    Zhang, Yi

    2016-01-01

    Objective: Guided by validation theory, this study aims to better understand the role that academic advising plays in international community college students' adjustment. More specifically, this study investigated how academic advising validates or invalidates their academic and social experiences in a community college context. Method: This…

  4. Validation Experiences and Persistence among Urban Community College Students

    ERIC Educational Resources Information Center

    Barnett, Elisabeth A.

    2007-01-01

    The purpose of this research was to examine the extent to which urban community college students' experiences with validation by faculty contributed to their sense of integration in college and whether this, in turn, contributed to their intent to persist in college. This study focused on urban community college students' validating experiences…

  5. The Mediterranean Forecasting System: recent developments

    NASA Astrophysics Data System (ADS)

    Tonani, Marina; Oddo, Paolo; Korres, Gerasimos; Clementi, Emanuela; Dobricic, Srdjan; Drudi, Massimiliano; Pistoia, Jenny; Guarnieri, Antonio; Romaniello, Vito; Girardi, Giacomo; Grandi, Alessandro; Bonaduce, Antonio; Pinardi, Nadia

    2014-05-01

    Recent developments of the Mediterranean Monitoring and Forecasting Centre of the EU-Copernicus marine service, the Mediterranean Forecasting System (MFS), are presented. MFS provides forecasts, analyses and reanalyses for the physical and biogeochemical parameters of the Mediterranean Sea. The different components of the system are continuously updated in order to provide users with the best available product. This work focuses on the physical component of the system. The physical core of MFS is composed of an ocean general circulation model (NEMO) coupled with a spectral wave model (Wave Watch-III). The NEMO model provides surface currents and SST fields to WW-III, while WW-III returns the neutral component of the surface drag coefficient to NEMO. Satellite Sea Level Anomaly observations and in-situ T & S vertical profiles are assimilated into this system using a variational assimilation scheme based on 3DVAR (Dobricic, 2008). Sensitivity experiments have been performed in order to assess the impact of assimilating the latest available SLA missions, Altika and Cryosat, together with the long-running Jason2 mission. The results show a significant improvement of the MFS skill due to the multi-mission along-track assimilation. The primitive equations module has recently been upgraded with the introduction of the atmospheric pressure term, and a new, explicit numerical scheme has been adopted to solve the barotropic component of the equations of motion. The SLA satellite observations used for data assimilation have consequently been modified in order to account for the new atmospheric pressure term introduced in the equations. This new system has been evaluated using tide gauges, coastal buoys and the satellite along-track data. The quality of the SSH has improved significantly, while a minor impact has been observed on the other state variables (temperature, salinity and currents). Experiments with higher-resolution NWP (numerical weather prediction) forcing provided by the COSMO-MED system of the Italian Meteorological Office have been performed, and a pre-operational 3-day forecast production system has been developed. The comparison between this system and the official one forced by the ECMWF NWP data will be discussed.
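
    The variational scheme mentioned above can be illustrated with a minimal textbook 3DVAR analysis step. The sketch below is a toy with small dense matrices and invented state, observation operator and covariances; it is not the Dobricic (2008) scheme used operationally in MFS.

      # Minimal textbook 3DVAR analysis increment, for illustration only -- the
      # operational MFS assimilation is far more elaborate than this toy.
      import numpy as np

      def three_dvar(xb, B, y, H, R):
          """Analysis xa = xb + K (y - H xb), with gain K = B H^T (H B H^T + R)^-1."""
          innovation = y - H @ xb
          S = H @ B @ H.T + R                 # innovation covariance
          K = B @ H.T @ np.linalg.inv(S)      # gain matrix
          return xb + K @ innovation

      # Toy example: 3-element state, 2 observations of its first two components.
      xb = np.array([0.0, 1.0, 2.0])          # background state (hypothetical)
      B = 0.5 * np.eye(3)                     # background error covariance (assumed)
      H = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])         # observation operator (assumed)
      R = 0.1 * np.eye(2)                     # observation error covariance (assumed)
      y = np.array([0.3, 0.8])                # observations (e.g., SLA-like data)
      print(three_dvar(xb, B, y, H, R))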

  6. Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment

    NASA Technical Reports Server (NTRS)

    Storey, Jedediah M.; Kirk, Daniel; Marsell, Brandon (Editor); Schallhorn, Paul (Editor)

    2017-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.

  7. Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment

    NASA Technical Reports Server (NTRS)

    Storey, Jed; Kirk, Daniel (Editor); Marsell, Brandon (Editor); Schallhorn, Paul (Editor)

    2017-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment, little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.

  8. You don't have to believe everything you read: background knowledge permits fast and efficient validation of information.

    PubMed

    Richter, Tobias; Schroeder, Sascha; Wöhrmann, Britta

    2009-03-01

    In social cognition, knowledge-based validation of information is usually regarded as relying on strategic and resource-demanding processes. Research on language comprehension, in contrast, suggests that validation processes are involved in the construction of a referential representation of the communicated information. This view implies that individuals can use their knowledge to validate incoming information in a routine and efficient manner. Consistent with this idea, Experiments 1 and 2 demonstrated that individuals are able to reject false assertions efficiently when they have validity-relevant beliefs. Validation processes were carried out routinely even when individuals were put under additional cognitive load during comprehension. Experiment 3 demonstrated that the rejection of false information occurs automatically and interferes with affirmative responses in a nonsemantic task (epistemic Stroop effect). Experiment 4 also revealed complementary interference effects of true information with negative responses in a nonsemantic task. These results suggest the existence of fast and efficient validation processes that protect mental representations from being contaminated by false and inaccurate information.

  9. Validation of a dye stain assay for vaginally inserted HEC-filled microbicide applicators

    PubMed Central

    Katzen, Lauren L.; Fernández-Romero, José A.; Sarna, Avina; Murugavel, Kailapuri G.; Gawarecki, Daniel; Zydowsky, Thomas M.; Mensch, Barbara S.

    2011-01-01

    Background: The reliability and validity of self-reports of vaginal microbicide use are questionable given the explicit understanding that participants are expected to comply with study protocols. Our objective was to optimize the Population Council's previously validated dye stain assay (DSA) and related procedures, and establish predictive values for the DSA's ability to identify vaginally inserted single-use, low-density polyethylene microbicide applicators filled with hydroxyethylcellulose gel. Methods: Applicators, inserted by 252 female sex workers enrolled in a microbicide feasibility study in Southern India, served as positive controls for optimization and validation experiments. Prior to validation, optimal dye concentration and staining time were ascertained. Three validation experiments were conducted to determine sensitivity, specificity, negative predictive values and positive predictive values. Results: The dye concentration of 0.05% (w/v) FD&C Blue No. 1 Granular Food Dye and staining time of five seconds were determined to be optimal and were used for the three validation experiments. There were a total of 1,848 possible applicator readings across validation experiments; 1,703 (92.2%) applicator readings were correct. On average, the DSA performed with 90.6% sensitivity, 93.9% specificity, and had a negative predictive value of 93.8% and a positive predictive value of 91.0%. No statistically significant differences between experiments were noted. Conclusions: The DSA was optimized and successfully validated for use with single-use, low-density polyethylene applicators filled with hydroxyethylcellulose (HEC) gel. We recommend including the DSA in future microbicide trials involving vaginal gels in order to identify participants who have low adherence to dosing regimens. In doing so, we can develop strategies to improve adherence as well as investigate the association between product use and efficacy. PMID:21992983
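
    The predictive values reported above follow directly from a confusion matrix of assay readings against known insertion status. A minimal sketch, with invented counts standing in for the 1,848 applicator readings:

      # Hedged sketch of the diagnostic metrics reported for the dye stain assay.
      # The counts below are made up for illustration; they are not the study's data.
      def diagnostic_metrics(tp, fp, tn, fn):
          """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
          return {
              "sensitivity": tp / (tp + fn),   # inserted applicators read as inserted
              "specificity": tn / (tn + fp),   # non-inserted applicators read as clean
              "ppv": tp / (tp + fp),           # positive readings that were truly inserted
              "npv": tn / (tn + fn),           # negative readings that were truly non-inserted
          }

      print(diagnostic_metrics(tp=453, fp=45, tn=462, fn=47))  # hypothetical counts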

  10. Goals and Status of the NASA Juncture Flow Experiment

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Morrison, Joseph H.

    2016-01-01

    The NASA Juncture Flow experiment is a new effort whose focus is attaining validation data in the juncture region of a wing-body configuration. The experiment is designed specifically for the purpose of CFD validation. Current turbulence models routinely employed by Reynolds-averaged Navier-Stokes CFD are inconsistent in their prediction of corner flow separation in aircraft juncture regions, so experimental data in the near-wall region of such a configuration will be useful both for assessment as well as for turbulence model improvement. This paper summarizes the Juncture Flow effort to date, including preliminary risk-reduction experiments already conducted and planned future experiments. The requirements and challenges associated with conducting a quality validation test are discussed.

  11. CFD validation experiments at the Lockheed-Georgia Company

    NASA Technical Reports Server (NTRS)

    Malone, John B.; Thomas, Andrew S. W.

    1987-01-01

    Information is given in viewgraph form on computational fluid dynamics (CFD) validation experiments at the Lockheed-Georgia Company. Topics covered include validation experiments on a generic fighter configuration, a transport configuration, and a generic hypersonic vehicle configuration; computational procedures; surface and pressure measurements on wings; laser velocimeter measurements of a multi-element airfoil system; the flowfield around a stiffened airfoil; laser velocimeter surveys of a circulation control wing; circulation control for high lift; and high angle of attack aerodynamic evaluations.

  12. Observations on CFD Verification and Validation from the AIAA Drag Prediction Workshops

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Kleb, Bil; Vassberg, John C.

    2014-01-01

    The authors provide observations from the AIAA Drag Prediction Workshops that have spanned over a decade and from a recent validation experiment at NASA Langley. These workshops provide an assessment of the predictive capability of forces and moments, focused on drag, for transonic transports. It is very difficult to manage the consistency of results in a workshop setting to perform verification and validation at the scientific level, but it may be sufficient to assess it at the level of practice. Observations thus far: 1) due to simplifications in the workshop test cases, wind tunnel data are not necessarily the “correct” results that CFD should match, 2) an average of core CFD data are not necessarily a better estimate of the true solution as it is merely an average of other solutions and has many coupled sources of variation, 3) outlier solutions should be investigated and understood, and 4) the DPW series does not have the systematic build up and definition on both the computational and experimental side that is required for detailed verification and validation. Several observations regarding the importance of the grid, effects of physical modeling, benefits of open forums, and guidance for validation experiments are discussed. The increased variation in results when predicting regions of flow separation and increased variation due to interaction effects, e.g., fuselage and horizontal tail, point out the need for validation data sets for these important flow phenomena. Experiences with a recent validation experiment at NASA Langley are included to provide guidance on validation experiments.

  13. Validation of Skills, Knowledge and Experience in Lifelong Learning in Europe

    ERIC Educational Resources Information Center

    Ogunleye, James

    2012-01-01

    The paper examines systems of validation of skills and experience as well as the main methods/tools currently used for validating skills and knowledge in lifelong learning. The paper uses mixed methods--a case study research and content analysis of European Union policy documents and frameworks--as a basis for this research. The selection of the…

  14. Changes and Issues in the Validation of Experience

    ERIC Educational Resources Information Center

    Triby, Emmanuel

    2005-01-01

    This article analyses the main changes in the rules for validating experience in France and of what they mean for society. It goes on to consider university validation practices. The way in which this system is evolving offers a chance to identify the issues involved for the economy and for society, with particular attention to the expected…

  15. Reliability and validity of the neurorehabilitation experience questionnaire for inpatients.

    PubMed

    Kneebone, Ian I; Hull, Samantha L; McGurk, Rhona; Cropley, Mark

    2012-09-01

    Patient-centered measures of the inpatient neurorehabilitation experience are needed to assess services. The objective of this study was to develop a valid and reliable Neurorehabilitation Experience Questionnaire (NREQ) to assess whether neurorehabilitation inpatients experience service elements important to them. Based on the themes established in prior qualitative research, adopting questions from established inventories and using a literature review, a draft version of the NREQ was generated. Focus groups and interviews were conducted with 9 patients and 26 staff from neurological rehabilitation units to establish face validity. Then, 70 patients were recruited to complete the NREQ to ascertain reliability (internal and test-retest) and concurrent validity. On the basis of the face validity testing, several modifications were made to the draft version of the NREQ. Subsequently, internal reliability (time 1 α = .76, time 2 α = .80), test-retest reliability (r = 0.70), and concurrent validity (r = 0.32 and r = 0.56) were established for the revised version. Whereas responses were associated with positive mood (r = 0.30), they appeared not to be influenced by negative mood, age, education, length of stay, sex, functional independence, or whether a participant had been a patient on a unit previously. Preliminary validation of the NREQ suggests promise for use with its target population.

  16. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
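
    A toy version of the procedure described above (treat the expected cross entropy between the model-prediction and observation distributions as a utility, and search the design space with simulated annealing) might look like the sketch below. The Gaussian response surfaces, the annealing schedule and the design bounds are all assumptions for illustration, not the paper's formulation.

      # Illustrative sketch only: choose the experiment input that optimizes an
      # expected cross-entropy utility via simulated annealing.
      import math
      import random

      def gaussian_cross_entropy(mu_q, sig_q, mu_p, sig_p):
          """H(q, p) for two 1-D Gaussians (q = experiment, p = model prediction)."""
          return 0.5 * math.log(2 * math.pi * sig_p**2) + \
                 (sig_q**2 + (mu_q - mu_p)**2) / (2 * sig_p**2)

      def utility(x):
          """Expected cross-entropy at design input x (toy response surfaces)."""
          mu_model, sig_model = 2.0 * x, 0.5 + 0.1 * abs(x)
          mu_exp,   sig_exp   = 2.0 * x + 0.3 * x**2, 0.6
          return gaussian_cross_entropy(mu_exp, sig_exp, mu_model, sig_model)

      def simulated_annealing(f, x0, lo, hi, n_iter=5000, t0=1.0, seed=1):
          rng = random.Random(seed)
          x, fx = x0, f(x0)
          best_x, best_f = x, fx
          for k in range(1, n_iter + 1):
              t = t0 / k                                       # cooling schedule (assumed)
              cand = min(hi, max(lo, x + rng.gauss(0.0, 0.2))) # random walk proposal
              fc = f(cand)
              if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
                  x, fx = cand, fc
                  if fx < best_f:
                      best_x, best_f = x, fx
          return best_x, best_f

      x_opt, u_opt = simulated_annealing(utility, x0=0.0, lo=-2.0, hi=2.0)
      print(f"design input minimizing expected cross-entropy: x = {x_opt:.3f} (U = {u_opt:.3f})")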

  17. The Grand Banks ERS-1 SAR wave spectra validation experiment

    NASA Technical Reports Server (NTRS)

    Vachon, P. W.; Dobson, F. W.; Smith, S. D.; Anderson, R. J.; Buckley, J. R.; Allingham, M.; Vandemark, D.; Walsh, E. J.; Khandekar, M.; Lalbeharry, R.

    1993-01-01

    As part of the ERS-1 validation program, the ERS-1 Synthetic Aperture Radar (SAR) wave spectra validation experiment was carried out over the Grand Banks of Newfoundland (Canada) in Nov. 1991. The principal objective of the experiment was to obtain complete sets of wind and wave data from a variety of calibrated instruments to validate SAR measurements of ocean wave spectra. The field program activities are described and the rather complex wind and wave conditions which were observed are summarized. Spectral comparisons with ERS-1 SAR image spectra are provided. The ERS-1 SAR is shown to have measured swell and range traveling wind seas, but did not measure azimuth traveling wind seas at any time during the experiment. Results of velocity bunching forward mapping and new measurements of the relationship between wind stress and sea state are also shown.

  18. Validation of RNAi Silencing Efficiency Using Gene Array Data shows 18.5% Failure Rate across 429 Independent Experiments.

    PubMed

    Munkácsy, Gyöngyi; Sztupinszki, Zsófia; Herman, Péter; Bán, Bence; Pénzváltó, Zsófia; Szarvas, Nóra; Győrffy, Balázs

    2016-09-27

    No independent cross-validation of the success rate of studies utilizing small interfering RNA (siRNA) for gene silencing has been completed before. Assessing the influence of experimental parameters like cell line, transfection technique, validation method, and type of control requires evaluating them across a large set of studies. We utilized gene chip data published for siRNA experiments to assess success rate and to compare methods used in these experiments. We searched NCBI GEO for samples with whole transcriptome analysis before and after gene silencing and evaluated the efficiency for the target and off-target genes using the array-based expression data. Wilcoxon signed-rank test was used to assess silencing efficacy and Kruskal-Wallis tests and Spearman rank correlation were used to evaluate study parameters. Altogether, 1,643 samples representing 429 experiments published in 207 studies were evaluated. The fold change (FC) of down-regulation of the target gene was above 0.7 in 18.5% and was above 0.5 in 38.7% of experiments. Silencing efficiency was lowest in MCF7 and highest in SW480 cells (FC = 0.59 and FC = 0.30, respectively, P = 9.3E-06). Studies utilizing Western blot for validation performed better than those with quantitative polymerase chain reaction (qPCR) or microarray (FC = 0.43, FC = 0.47, and FC = 0.55, respectively, P = 2.8E-04). There was no correlation between type of control, transfection method, publication year, and silencing efficiency. Although gene silencing is a robust feature successfully cross-validated in the majority of experiments, efficiency remained insufficient in a significant proportion of studies. Selection of cell line model and validation method had the highest influence on silencing proficiency.
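
    The per-experiment evaluation described above reduces to a paired comparison of target-gene expression before and after silencing. A hedged sketch with simulated (not GEO) data, using the Wilcoxon signed-rank test and the paper's FC > 0.7 failure criterion:

      # Sketch of a single silencing-efficiency check; the expression values are
      # simulated and do not come from any GEO series.
      import numpy as np
      from scipy.stats import wilcoxon

      rng = np.random.default_rng(42)
      control  = rng.normal(loc=10.0, scale=0.5, size=12)              # log2 expression, untreated
      silenced = control + rng.normal(loc=-1.2, scale=0.3, size=12)    # after siRNA knockdown

      fold_change = 2.0 ** (silenced.mean() - control.mean())          # linear-scale FC of the target
      stat, p = wilcoxon(silenced, control)                            # paired nonparametric test

      print(f"fold change = {fold_change:.2f}, Wilcoxon p = {p:.3g}")
      print("silencing 'failed' by the paper's criterion (FC > 0.7):", fold_change > 0.7)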

  19. Multi-decadal Arctic sea ice roughness.

    NASA Astrophysics Data System (ADS)

    Tsamados, M.; Stroeve, J.; Kharbouche, S.; Muller, J. P., , Prof; Nolin, A. W.; Petty, A.; Haas, C.; Girard-Ardhuin, F.; Landy, J.

    2017-12-01

    The transformation of Arctic sea ice from mainly perennial, multi-year ice to a seasonal, first-year ice is believed to have been accompanied by a reduction of the roughness of the ice cover surface. This smoothening effect has been shown to (i) modify the momentum and heat transfer between the atmosphere and ocean, (ii) alter the ice thickness distribution which in turn controls the snow and melt pond repartition over the ice cover, and (iii) bias airborne and satellite remote sensing measurements that depend on the scattering and reflective characteristics over the sea ice surface topography. We will review existing and novel remote sensing methodologies proposed to estimate sea ice roughness, ranging from airborne LIDAR measurements (i.e., Operation IceBridge), to backscatter coefficients from scatterometers (ASCAT, QUICKSCAT), to the multi-angle imaging spectroradiometer (MISR), and to laser (Icesat) and radar altimeters (Envisat, Cryosat, Altika, Sentinel-3). We will show that by comparing and cross-calibrating these different products we can offer a consistent multi-mission, multi-decadal view of the declining sea ice roughness. Implications for sea ice physics, climate and remote sensing will also be discussed.

  20. Mass loss of the Greenland peripheral glaciers and ice caps from satellite altimetry

    NASA Astrophysics Data System (ADS)

    Wouters, Bert; Noël, Brice; Moholdt, Geir; Ligtenberg, Stefan; van den Broeke, Michiel

    2017-04-01

    At its rapidly warming margins, the Greenland Ice Sheet is surrounded by (semi-)detached glaciers and ice caps (GIC). Although they cover only roughly 5% of the total glaciated area in the region, they are estimated to account for 15-20% of the total sea level rise contribution of Greenland. The spatial and temporal evolution of the mass changes of the peripheral GICs, however, remains poorly constrained. In this presentation, we use satellite altimetry from ICESat and Cryosat-2 combined with a high-resolution regional climate model to derive a 14-year time series (2003-2016) of regional elevation and mass changes. The total mass loss has been relatively constant during this period, but regionally, the GICs show marked temporal variations. Whereas thinning was concentrated along the eastern margin during 2003-2009, western GICs became the prime sea level rise contributors in recent years. Mass loss in the northern region has been steadily increasing throughout the record, due to strong atmospheric warming and a deterioration of the capacity of the firn layer to buffer the resulting melt water.

  1. Validation of Competences and Professionalisation of Teachers and Trainers = Validation des Acquis et Professionnalisation des Enseignants et Formateurs. CEDEFOP Dossier Series.

    ERIC Educational Resources Information Center

    de Blignieres-Legeraud, Anne; Bjornavold, Jens; Charraud, Anne-Marie; Gerard, Francoise; Diamanti, Stamatina; Freundlinger, Alfred; Bjerknes, Ellen; Covita, Horacio

    A workshop aimed to clarify under what conditions the validation of knowledge gained through experience can be considered a professionalizing factor for European Union teachers and trainers by creating a better link between experience and training and between vocational training and qualifications. Seven papers were presented in addition to an…

  2. Comparison of airborne passive and active L-band System (PALS) brightness temperature measurements to SMOS observations during the SMAP validation experiment 2012 (SMAPVEX12)

    USDA-ARS?s Scientific Manuscript database

    The purpose of SMAP (Soil Moisture Active Passive) Validation Experiment 2012 (SMAPVEX12) campaign was to collect data for the pre-launch development and validation of SMAP soil moisture algorithms. SMAP is a National Aeronautics and Space Administration’s (NASA) satellite mission designed for the m...

  3. SeaSat-A Satellite Scatterometer (SASS) Validation and Experiment Plan

    NASA Technical Reports Server (NTRS)

    Schroeder, L. C. (Editor)

    1978-01-01

    This plan was generated by the SeaSat-A satellite scatterometer experiment team to define the pre- and post-launch activities necessary to conduct sensor validation and geophysical evaluation. Details included are an instrument and experiment description/performance requirements, success criteria, constraints, mission requirements, data processing requirements and data analysis responsibilities.

  4. The inventory for déjà vu experiences assessment. Development, utility, reliability, and validity.

    PubMed

    Sno, H N; Schalken, H F; de Jonghe, F; Koeter, M W

    1994-01-01

    In this article the development, utility, reliability, and validity of the Inventory for Déjà vu Experiences Assessment (IDEA) are described. The IDEA is a 23-item self-administered questionnaire consisting of a general section of nine questions and a qualitative section of 14 questions. The latter questions comprise 48 topics. The questionnaire appeared to be a user-friendly instrument with satisfactory to good reliability and validity. The IDEA permits the study of quantitative and qualitative characteristics of déjà vu experiences.

  5. Results from SMAP Validation Experiments 2015 and 2016

    NASA Astrophysics Data System (ADS)

    Colliander, A.; Jackson, T. J.; Cosh, M. H.; Misra, S.; Crow, W.; Powers, J.; Wood, E. F.; Mohanty, B.; Judge, J.; Drewry, D.; McNairn, H.; Bullock, P.; Berg, A. A.; Magagi, R.; O'Neill, P. E.; Yueh, S. H.

    2017-12-01

    NASA's Soil Moisture Active Passive (SMAP) mission was launched in January 2015. The objective of the mission is global mapping of soil moisture and freeze/thaw state. Well-characterized sites with calibrated in situ soil moisture measurements are used to determine the quality of the soil moisture data products; these sites are designated as core validation sites (CVS). To support the CVS-based validation, airborne field experiments are used to provide high-fidelity validation data and to improve the SMAP retrieval algorithms. The SMAP project and NASA coordinated airborne field experiments at three CVS locations in 2015 and 2016. SMAP Validation Experiment 2015 (SMAPVEX15) was conducted around the Walnut Gulch CVS in Arizona in August, 2015. SMAPVEX16 was conducted at the South Fork CVS in Iowa and Carman CVS in Manitoba, Canada from May to August 2016. The airborne PALS (Passive Active L-band Sensor) instrument mapped all experiment areas several times, resulting in 30 coincident measurements with SMAP. The experiments included an intensive ground sampling regime consisting of manual sampling and augmentation of the CVS soil moisture measurements with temporary networks of soil moisture sensors. Analyses using the data from these experiments have produced various results regarding the SMAP validation and related science questions. The SMAPVEX15 data set has been used for calibration of a hyper-resolution model for soil moisture product validation; development of a multi-scale parameterization approach for surface roughness, and validation of disaggregation of SMAP soil moisture with optical thermal signal. The SMAPVEX16 data set has already been used for studying the spatial upscaling within a pixel with highly heterogeneous soil texture distribution; for understanding the process of radiative transfer at plot scale in relation to field scale and SMAP footprint scale over highly heterogeneous vegetation distribution; for testing a data fusion based soil moisture downscaling approach; and for investigating soil moisture impact on estimation of vegetation fluorescence from airborne measurements. The presentation will describe the collected data and showcase some of the most important results achieved so far.

  6. Validation of design procedure and performance modeling of a heat and fluid transport field experiment in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Nir, A.; Doughty, C.; Tsang, C. F.

    Validation methods that were developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no attempt to validate a specific model, but several models of increasing complexity are compared with experimental results. The outcome is interpreted as a demonstration of the paradigm proposed by van der Heijde [26] that different constituencies have different objectives for the validation process and that their acceptance criteria therefore differ also.

  7. Development and Validation of the Caring Loneliness Scale.

    PubMed

    Karhe, Liisa; Kaunonen, Marja; Koivisto, Anna-Maija

    2016-12-01

    The Caring Loneliness Scale (CARLOS) includes 5 categories derived from earlier qualitative research. This article assesses the reliability and construct validity of a scale designed to measure patient experiences of loneliness in a professional caring relationship. Statistical analysis with 4 different sample sizes included Cronbach's alpha and exploratory factor analysis with principal axis factoring extraction. The sample size of 250 gave the most useful and comprehensible structure, but all 4 samples yielded underlying content of loneliness experiences. The initial 5 categories were reduced to 4 factors with 24 items and Cronbach's alpha ranging from .77 to .90. The findings support the reliability and validity of CARLOS for the assessment of Finnish breast cancer and heart surgery patients' experiences but, as with all instruments, further validation is needed.
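
    Cronbach's alpha, the internal-consistency statistic used above, is simple to compute from an item-response matrix. The sketch below uses a simulated 250 x 24 matrix in place of the CARLOS data:

      # Minimal Cronbach's alpha; the response matrix is simulated, not CARLOS data.
      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, n_items) matrix of item scores."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
          total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
          return (k / (k - 1)) * (1.0 - item_vars / total_var)

      rng = np.random.default_rng(7)
      latent = rng.normal(size=(250, 1))                          # shared "loneliness" factor
      responses = latent + rng.normal(scale=1.0, size=(250, 24))  # 24 correlated items
      print(f"alpha = {cronbach_alpha(responses):.2f}")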

  8. Improving Generalizations from Experiments Using Propensity Score Subclassification: Assumptions, Properties, and Contexts

    ERIC Educational Resources Information Center

    Tipton, Elizabeth

    2013-01-01

    As a result of the use of random assignment to treatment, randomized experiments typically have high internal validity. However, units are very rarely randomly selected from a well-defined population of interest into an experiment; this results in low external validity. Under nonrandom sampling, this means that the estimate of the sample average…

  9. PSI-Center Simulations of Validation Platform Experiments

    NASA Astrophysics Data System (ADS)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2013-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with extended MHD simulations. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), FRX-L (Los Alamos National Laboratory), HIT-SI (U Wash - UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), PHD/ELF (UW/MSNW), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). Modifications have been made to the NIMROD, HiFi, and PSI-Tet codes to specifically model these experiments, including mesh generation/refinement, non-local closures, appropriate boundary conditions (external fields, insulating BCs, etc.), and kinetic and neutral particle interactions. The PSI-Center is exploring application of validation metrics between experimental data and simulation results. Biorthogonal decomposition is proving to be a powerful method to compare global temporal and spatial structures for validation. Results from these simulation and validation studies, as well as an overview of the PSI-Center status, will be presented.
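
    Biorthogonal decomposition of probe-array data amounts to a singular value decomposition of the time-by-space signal matrix, after which the leading spatial and temporal modes of experiment and simulation can be compared. The sketch below is illustrative only: the signals are synthetic and the plain cosine overlap is a stand-in, not the PSI-Center's actual validation metric.

      # Biorthogonal-decomposition comparison sketch using synthetic probe signals.
      import numpy as np

      def bd_modes(data, n_modes=2):
          """BD of a (time x space) matrix via SVD: temporal modes, weights, spatial modes."""
          u, s, vt = np.linalg.svd(data - data.mean(axis=0), full_matrices=False)
          return u[:, :n_modes], s[:n_modes], vt[:n_modes]

      def spatial_overlap(v_a, v_b):
          """|cosine| between corresponding unit-norm spatial modes (1 = same shape)."""
          return np.abs(np.sum(v_a * v_b, axis=1))

      # Synthetic probe-array signals: two standing modes; the "experiment" adds noise.
      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 1.0, 200)[:, None]
      theta = np.linspace(0.0, 2 * np.pi, 16, endpoint=False)[None, :]
      exp_data = (np.cos(2 * np.pi * 5 * t) * np.cos(theta)
                  + 0.5 * np.cos(2 * np.pi * 9 * t) * np.cos(2 * theta)
                  + 0.05 * rng.normal(size=(200, 16)))
      sim_data = (np.cos(2 * np.pi * 5 * t) * np.cos(theta)
                  + 0.4 * np.cos(2 * np.pi * 9 * t) * np.cos(2 * theta))

      _, s_exp, v_exp = bd_modes(exp_data)
      _, s_sim, v_sim = bd_modes(sim_data)
      print("mode weights (exp, sim):", np.round(s_exp, 2), np.round(s_sim, 2))
      print("spatial-mode overlaps  :", np.round(spatial_overlap(v_exp, v_sim), 3))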

  10. Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Barton L.

    2016-08-16

    This work consisted of the following major efforts: 1. Literature survey on validation of external natural convection; 2. Design the experiment; 3. Build the experiment; 4. Run the experiment; 5. Collect results; 6. Disseminate results; and 7. Perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.

  11. [Ethic review on clinical experiments of medical devices in medical institutions].

    PubMed

    Shuai, Wanjun; Chao, Yong; Wang, Ning; Xu, Shining

    2011-07-01

    Clinical experiments are always used to evaluate the safety and validity of medical devices. These experiments are of two types: clinical trials and clinical testing. Ethical review must be performed by the ethics committee of a medical institution qualified for clinical research, and approval must be obtained before the experiments begin. In order to ensure the safety and validity of clinical experiments on medical devices in medical institutions, the contents, process and approval criteria of the ethical review were analyzed and discussed.

  12. Mission design concepts for repeat groundtrack orbits and application to the ICESat mission

    NASA Astrophysics Data System (ADS)

    Pie, Nadege

    The primary objective of the NASA sponsored ICESat mission is to study the short and long term changes in the ice mass in the Greenland and Antarctica regions. The satellite was therefore placed into a frozen, near-polar, near-circular repeat groundtrack orbit to ensure adequate coverage of the polar regions while keeping the groundtrack periodic and reducing the variations in the orbital elements, and more specifically the semi-major axis of the ICESat orbit. After launch, a contingency plan had to be devised to compensate for a laser anomaly that dangerously compromised the lifetime of the ICESat mission. This new plan makes intensive use of the ICESat subcycles, a characteristic of repeat groundtrack orbits that is often overlooked. The subcycles of a repeat groundtrack orbit provide global coverage within a time shorter than the groundtrack repetition period. For a satellite with an off-nadir pointing capability, the subcycles provide near-repeat tracks, which represent added opportunities for altimetry measurements over a specific track. The ICESat subcycles were also used in a very innovative fashion to reposition the satellite within its repeat cycle via orbital maneuvers called phasing maneuvers. The necessary theoretical framework is provided for the subcycle analysis and the implementation of phasing maneuvers for any future repeat orbit mission. In the perspective of performing cross-validation of missions like CryoSat using the ICESat off-nadir capability, a study was conducted to determine the geolocations of crossovers between two different repeat groundtrack Keplerian orbits. The general analytical solution was applied to ICESat vs. several other repeat groundtrack orbit missions, including the future ICESat-II mission. ICESat's repeat groundtrack orbit was designed using a disturbing force model that includes only the Earth geopotential. Though the third-body effect from the Sun and the Moon was neglected in the orbit design, it does in fact disrupt the repeatability condition of the groundtrack and consequently implies orbit correction maneuvers. The perturbations of the ICESat orbit due to the third-body effect are studied as preliminary work towards including these forces in the design of the future ICESat-II repeat groundtrack orbit.
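
    The repeat-groundtrack condition that underlies this design work fixes the nodal period once the number of revolutions per repeat cycle is chosen. The sketch below sizes such an orbit with Kepler's third law only, which neglects J2 and the third-body perturbations discussed above that a real design must include; the 1344-revolution, 91-day cycle and the sidereal-day approximation of the nodal day are hypothetical example values, not ICESat's actual design parameters.

      # Back-of-the-envelope repeat-groundtrack sizing (Keplerian only, illustrative).
      import math

      MU_EARTH = 3.986004418e14          # m^3/s^2
      R_EARTH = 6378137.0                # m
      NODAL_DAY = 86164.1                # s, approximated here by the sidereal day

      def repeat_orbit(n_revs, n_days):
          """Required orbital period and Keplerian semi-major axis / altitude."""
          period = n_days * NODAL_DAY / n_revs                        # s per revolution
          sma = (MU_EARTH * (period / (2.0 * math.pi)) ** 2) ** (1.0 / 3.0)
          return period, sma, sma - R_EARTH

      period, sma, alt = repeat_orbit(n_revs=1344, n_days=91)         # hypothetical cycle
      print(f"period = {period/60:.2f} min, altitude ~ {alt/1e3:.0f} km")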

  13. The Faculty Self-Reported Assessment Survey (FRAS): Differentiating Faculty Knowledge and Experience in Assessment

    PubMed Central

    Hanauer, David I.; Bauerle, Cynthia

    2015-01-01

    Science, technology, engineering, and mathematics education reform efforts have called for widespread adoption of evidence-based teaching in which faculty members attend to student outcomes through assessment practice. Awareness about the importance of assessment has illuminated the need to understand what faculty members know and how they engage with assessment knowledge and practice. The Faculty Self-Reported Assessment Survey (FRAS) is a new instrument for evaluating science faculty assessment knowledge and experience. Instrument validation was composed of two distinct studies: an empirical evaluation of the psychometric properties of the FRAS and a comparative known-groups validation to explore the ability of the FRAS to differentiate levels of faculty assessment experience. The FRAS was found to be highly reliable (α = 0.96). The dimensionality of the instrument enabled distinction of assessment knowledge into categories of program design, instrumentation, and validation. In the known-groups validation, the FRAS distinguished between faculty groups with differing levels of assessment experience. Faculty members with formal assessment experience self-reported higher levels of familiarity with assessment terms, higher frequencies of assessment activity, increased confidence in conducting assessment, and more positive attitudes toward assessment than faculty members who were novices in assessment. These results suggest that the FRAS can reliably and validly differentiate levels of expertise in faculty knowledge of assessment. PMID:25976653
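
    The known-groups comparison described above can be sketched as a simple two-sample test on self-reported scores. The data are simulated, and the Mann-Whitney U test is a stand-in for whatever statistics the FRAS study actually reports:

      # Illustrative known-groups check: do experienced faculty score higher than novices?
      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(3)
      experienced = rng.normal(loc=4.1, scale=0.5, size=40)   # self-reported familiarity, 1-5 (simulated)
      novice      = rng.normal(loc=3.2, scale=0.6, size=55)

      u, p = mannwhitneyu(experienced, novice, alternative="greater")
      print(f"U = {u:.0f}, one-sided p = {p:.2g}")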

  14. CFD validation experiments at McDonnell Aircraft Company

    NASA Technical Reports Server (NTRS)

    Verhoff, August

    1987-01-01

    Information is given in viewgraph form on computational fluid dynamics (CFD) validation experiments at McDonnell Aircraft Company. Topics covered include a high speed research model, a supersonic persistence fighter model, a generic fighter wing model, surface grids, force and moment predictions, surface pressure predictions, forebody models with 65 degree clipped delta wings, and the low aspect ratio wing/body experiment.

  15. Cultural Adaptation of the Portuguese Version of the “Sniffin’ Sticks” Smell Test: Reliability, Validity, and Normative Data

    PubMed Central

    Ribeiro, João Carlos; Simões, João; Silva, Filipe; Silva, Eduardo D.; Hummel, Cornelia; Hummel, Thomas; Paiva, António

    2016-01-01

    The cross-cultural adaptation and validation of the Sniffin' Sticks test for the Portuguese population is described. Over 270 people participated in four experiments. In Experiment 1, 67 participants rated the familiarity of presented odors and seven descriptors of the original test were adapted to a Portuguese context. In Experiment 2, the Portuguese version of the Sniffin' Sticks test was administered to 203 healthy participants. Older age, male gender and active smoking status were confirmed as confounding factors. The third experiment showed the validity of the Portuguese version of the Sniffin' Sticks test in discriminating healthy controls from patients with olfactory dysfunction. In Experiment 4, the test-retest reliability for both the composite score (r(71) = 0.86) and the identification test (r(71) = 0.62) was established (p<0.001). Normative data for the Portuguese version of the Sniffin' Sticks test are provided, showing good validity and reliability and effectively distinguishing patients from healthy controls with high sensitivity and specificity. The Portuguese version of the Sniffin' Sticks identification test is a clinically suitable screening tool in routine outpatient Portuguese settings. PMID:26863023

  16. Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.

    PubMed

    Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B

    2018-01-01

    The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.

  17. DC-8 and ER-2 in Sweden for the Sage III Ozone Loss and Validation Experiment (SOLVE)

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This 48 second video shows Dryden's Airborne Science aircraft in Kiruna Sweden in January 2000. The DC-8 and ER-2 conducted atmospheric studies for the Sage III Ozone Loss and Validation Experiment (SOLVE).

  18. Linguistic and content validation of a German-language PRO-CTCAE-based patient-reported outcomes instrument to evaluate the late effect symptom experience after allogeneic hematopoietic stem cell transplantation.

    PubMed

    Kirsch, Monika; Mitchell, Sandra A; Dobbels, Fabienne; Stussi, Georg; Basch, Ethan; Halter, Jorg P; De Geest, Sabina

    2015-02-01

    The aim of this sequential mixed methods study was to develop a PRO-CTCAE (Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events)-based measure of the symptom experience of late effects in German-speaking long-term survivors of allogeneic stem cell transplantation (SCT), and to examine its content validity. The US National Cancer Institute's PRO-CTCAE item library was translated into German and linguistically validated. PRO-CTCAE symptoms prevalent in ≥50% of survivors (n = 15) and recognized as important by SCT experts (n = 9) were identified. Additional concepts relevant to the symptom experience and its consequences were elicited. Content validity of the PROVIVO (Patient-Reported Outcomes of long-term survivors after allogeneic SCT) instrument was assessed through an additional round of cognitive debriefing in 15 patients, and through item and scale content validity indices rated by 9 experts. PROVIVO comprises a total of 49 items capturing the experience of physical, emotional and cognitive symptoms. To improve the instrument's utility for clinical decision-making, questions soliciting limitations in activities of daily living, frequent infections, and overall well-being were added. Cognitive debriefings demonstrated that items were well understood and relevant to the SCT survivor experience. The Scale Content Validity Index (CVI) (0.94) and item CVIs (median = 1; range 0.75-1) were very high. Qualitative and quantitative data provide preliminary evidence supporting the content validity of PROVIVO and identify a PRO-CTCAE item bundle for use in SCT survivors. A study to evaluate the measurement properties of PROVIVO and to examine its capacity to improve survivorship care planning is underway. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Cloud computing and validation of expandable in silico livers.

    PubMed

    Ropella, Glen E P; Hunt, C Anthony

    2010-12-03

    In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling experiments to use more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster, we undertook the re-validation experiments using the Amazon EC2 cloud platform. So doing required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstration of results equivalency from two different wet-labs. The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters. The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing as well as provide straightforward tools for scaling simulations to encompass greater detail with no extra investment in hardware.

  20. Finite element analysis of dental implants with validation: to what extent can we expect the model to predict biological phenomena? A literature review and proposal for classification of a validation process.

    PubMed

    Chang, Yuanhan; Tambe, Abhijit Anil; Maeda, Yoshinobu; Wada, Masahiro; Gonda, Tomoya

    2018-03-08

    A literature review of finite element analysis (FEA) studies of dental implants with their model validation process was performed to establish the criteria for evaluating validation methods with respect to their similarity to biological behavior. An electronic literature search of PubMed was conducted up to January 2017 using the Medical Subject Headings "dental implants" and "finite element analysis." After accessing the full texts, the context of each article was searched using the words "valid" and "validation" and articles in which these words appeared were read to determine whether they met the inclusion criteria for the review. Of 601 articles published from 1997 to 2016, 48 that met the eligibility criteria were selected. The articles were categorized according to their validation method as follows: in vivo experiments in humans (n = 1) and other animals (n = 3), model experiments (n = 32), others' clinical data and past literature (n = 9), and other software (n = 2). Validation techniques with a high level of sufficiency and efficiency are still rare in FEA studies of dental implants. High-level validation, especially using in vivo experiments tied to an accurate finite element method, needs to become an established part of FEA studies. The recognition of a validation process should be considered when judging the practicality of an FEA study.

  1. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
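
    The idea described above (confirming a small random sample and using it to bound the reliability of the full result list) can be sketched with a simple one-sided binomial confidence bound. The counts below are hypothetical and the Clopper-Pearson bound is one reasonable choice, not necessarily the authors' exact method.

    ```python
    from scipy.stats import beta

    def lower_bound_confirmation_rate(n_sampled: int, n_confirmed: int,
                                      confidence: float = 0.95) -> float:
        """One-sided Clopper-Pearson lower bound on the proportion of results in
        the full list that would be confirmed, based on a random subsample."""
        if n_confirmed == 0:
            return 0.0
        return beta.ppf(1.0 - confidence, n_confirmed, n_sampled - n_confirmed + 1)

    # Hypothetical: 30 hits drawn at random from a list of 500 significant results,
    # 27 of them confirmed by a low-throughput assay.
    lb = lower_bound_confirmation_rate(30, 27)
    print(f"With 95% confidence, at least {lb:.0%} of the list would be confirmed")
    ```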

  2. Free Radicals and Reactive Intermediates for the SAGE III Ozone Loss and Validation Experiment (SOLVE) Mission

    NASA Technical Reports Server (NTRS)

    Anderson, James G.

    2001-01-01

    This grant provided partial support for participation in the SAGE III Ozone Loss and Validation Experiment. The NASA-sponsored SOLVE mission was conducted jointly with the European Commission-sponsored Third European Stratospheric Experiment on Ozone (THESEO 2000). Researchers examined processes that control ozone amounts at mid to high latitudes during the Arctic winter and acquired correlative data needed to validate the Stratospheric Aerosol and Gas Experiment (SAGE) III satellite measurements that are used to quantitatively assess high-latitude ozone loss. The campaign began in September 1999 with intercomparison flights out of NASA Dryden Flight Research Center in Edwards, CA, and continued through March 2000, with midwinter deployments out of Kiruna, Sweden. SOLVE was co-sponsored by the Upper Atmosphere Research Program (UARP), the Atmospheric Effects of Aviation Project (AEAP), the Atmospheric Chemistry Modeling and Analysis Program (ACMAP), and the Earth Observing System (EOS) of NASA's Earth Science Enterprise (ESE) as part of the validation program for the SAGE III instrument.

  3. Earth Radiation Budget Experiment (ERBE) validation

    NASA Technical Reports Server (NTRS)

    Barkstrom, Bruce R.; Harrison, Edwin F.; Smith, G. Louis; Green, Richard N.; Kibler, James F.; Cess, Robert D.

    1990-01-01

    During the past 4 years, data from the Earth Radiation Budget Experiment (ERBE) have been undergoing detailed examination. There is no direct source of ground truth for the radiation budget. Thus, this validation effort has had to rely heavily upon intercomparisons between different types of measurements. The ERBE Science Team chose 10 measures of agreement as validation criteria. Late in August 1988, the Team agreed that the data met these conditions. As a result, the final, monthly averaged data products are being archived. These products, their validation, and some results for January 1986 are described. Information is provided on obtaining the data from the archive.

  4. Initial Retrieval Validation from the Joint Airborne IASI Validation Experiment (JAIVEx)

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Liu, Xu; Smith, William L.; Larar, Allen M.; Taylor, Jonathan P.; Revercomb, Henry E.; Mango, Stephen A.; Schluessel, Peter; Calbet, Xavier

    2007-01-01

    The Joint Airborne IASI Validation Experiment (JAIVEx) was conducted during April 2007 mainly for validation of the Infrared Atmospheric Sounding Interferometer (IASI) on the MetOp satellite, but also included a strong component focusing on validation of the Atmospheric InfraRed Sounder (AIRS) aboard the AQUA satellite. The cross-validation of IASI and AIRS is important for the joint use of their data in the global Numerical Weather Prediction process. Initial inter-comparisons of geophysical products have been conducted from different aspects, such as using different measurements from airborne ultraspectral Fourier transform spectrometers (specifically, the NPOESS Airborne Sounder Testbed Interferometer (NAST-I) and the Scanning High-resolution Interferometer Sounder (S-HIS) aboard the NASA WB-57 aircraft), in situ instruments on the UK Facility for Airborne Atmospheric Measurements (FAAM) BAe146-301 aircraft, dedicated dropsondes, radiosondes, and ground-based Raman lidar. An overview of the JAIVEx retrieval validation plan and some initial results of this field campaign are presented.

  5. Further Validation of the Coach Identity Prominence Scale

    ERIC Educational Resources Information Center

    Pope, J. Paige; Hall, Craig R.

    2014-01-01

    This study was designed to examine select psychometric properties of the Coach Identity Prominence Scale (CIPS), including the reliability, factorial validity, convergent validity, discriminant validity, and predictive validity. Coaches (N = 338) who averaged 37 (SD = 12.27) years of age, had a mean of 13 (SD = 9.90) years of coaching experience,…

  6. Modeling the effects of argument length and validity on inductive and deductive reasoning.

    PubMed

    Rotello, Caren M; Heit, Evan

    2009-09-01

    In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were observed: Induction judgments were more affected by argument length, and deduction judgments were more affected by validity. In Experiment 2, fluency was manipulated by displaying the materials in a low-contrast font, leading to increased sensitivity to logical validity. Several variants of 1-process and 2-process models of reasoning were assessed against the results. A 1-process model that assumed the same scale of argument strength underlies induction and deduction was not successful. A 2-process model that assumed separate, continuous informational dimensions of apparent deductive validity and associative strength gave the more successful account. (c) 2009 APA, all rights reserved.

  7. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress in developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the ‘Center’) will be a resource for industry, DOE program, and academic validation efforts.

  8. Robotic suturing on the FLS model possesses construct validity, is less physically demanding, and is favored by more surgeons compared with laparoscopy.

    PubMed

    Stefanidis, Dimitrios; Hope, William W; Scott, Daniel J

    2011-07-01

    The value of robotic assistance for intracorporeal suturing is not well defined. We compared robotic suturing with laparoscopic suturing on the FLS model with a large cohort of surgeons. Attendees (n=117) at the SAGES 2006 Learning Center robotic station placed intracorporeal sutures on the FLS box-trainer model using conventional laparoscopic instruments and the da Vinci® robot. Participant performance was recorded using a validated objective scoring system, and a questionnaire regarding demographics, task workload, and suturing modality preference was completed. Construct validity for both tasks was assessed by comparing the performance scores of subjects with various levels of experience. A validated questionnaire was used for workload measurement. Of the participants, 84% had prior laparoscopic and 10% prior robotic suturing experience. Within the allotted time, 83% of participants completed the suturing task laparoscopically and 72% with the robot. Construct validity was demonstrated for both simulated tasks according to the participants' advanced laparoscopic experience, laparoscopic suturing experience, and self-reported laparoscopic suturing ability (p<0.001 for all) and according to prior robotic experience, robotic suturing experience, and self-reported robotic suturing ability (p<0.001 for all), respectively. While participants achieved higher suturing scores with standard laparoscopy compared with the robot (84±75 vs. 56±63, respectively; p<0.001), they found the laparoscopic task more physically demanding (NASA score 13±5 vs. 10±5, respectively; p<0.001) and favored the robot as their method of choice for intracorporeal suturing (62 vs. 38%, respectively; p<0.01). Construct validity was demonstrated for robotic suturing on the FLS model. Suturing scores were higher using standard laparoscopy likely as a result of the participants' greater experience with laparoscopic suturing versus robotic suturing. Robotic assistance decreases the physical demand of intracorporeal suturing compared with conventional laparoscopy and, in this study, was the preferred suturing method by most surgeons. Curricula for robotic suturing training need to be developed.

  9. In-Space Structural Validation Plan for a Stretched-Lens Solar Array Flight Experiment

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; Woods-Vedeler, Jessica A.; Jones, Thomas W.

    2001-01-01

    This paper summarizes in-space structural validation plans for a proposed Space Shuttle-based flight experiment. The test article is an innovative, lightweight solar array concept that uses pop-up, refractive stretched-lens concentrators to achieve a power/mass density of at least 175 W/kg, which is more than three times greater than current capabilities. The flight experiment will validate this new technology to retire the risk associated with its first use in space. The experiment includes structural diagnostic instrumentation to measure the deployment dynamics, static shape, and modes of vibration of the 8-meter-long solar array and several of its lenses. These data will be obtained by photogrammetry using the Shuttle payload-bay video cameras and miniature video cameras on the array. Six accelerometers are also included in the experiment to measure base excitations and small-amplitude tip motions.

  10. Development and initial validation of the Parental PELICAN Questionnaire (PaPEQu)--an instrument to assess parental experiences and needs during their child's end-of-life care.

    PubMed

    Zimmermann, Karin; Cignacco, Eva; Eskola, Katri; Engberg, Sandra; Ramelet, Anne-Sylvie; Von der Weid, Nicolas; Bergstraesser, Eva

    2015-12-01

    To develop and test the Parental PELICAN Questionnaire, an instrument to retrospectively assess parental experiences and needs during their child's end-of-life care. To offer appropriate care for dying children, healthcare professionals need to understand the illness experience from the family perspective. A questionnaire specific to the end-of-life experiences and needs of parents losing a child is needed to evaluate the perceived quality of paediatric end-of-life care. This is an instrument development study applying mixed methods based on recommendations for questionnaire design and validation. The Parental PELICAN Questionnaire was developed in four phases between August 2012-March 2014: phase 1: item generation; phase 2: validity testing; phase 3: translation; phase 4: pilot testing. Psychometric properties were assessed after applying the Parental PELICAN Questionnaire in a sample of 224 bereaved parents in April 2014. Validity testing covered the evidence based on tests of content, internal structure and relations to other variables. The Parental PELICAN Questionnaire consists of approximately 90 items in four slightly different versions accounting for particularities of the four diagnostic groups. The questionnaire's items were structured according to six quality domains described in the literature. Evidence of initial validity and reliability could be demonstrated with the involvement of healthcare professionals and bereaved parents. The Parental PELICAN Questionnaire holds promise as a measure to assess parental experiences and needs and is applicable to a broad range of paediatric specialties and settings. Future validation is needed to evaluate its suitability in different cultures. © 2015 John Wiley & Sons Ltd.

  11. Validation experiments to determine radiation partitioning of heat flux to an object in a fully turbulent fire.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricks, Allen; Blanchat, Thomas K.; Jernigan, Dann A.

    2006-06-01

    It is necessary to improve understanding of, and develop validation data for, the heat flux incident to an object located within the fire plume for the validation of SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE. One key aspect of the validation data sets is the determination of the relative contribution of the radiative and convective heat fluxes. To meet this objective, a cylindrical calorimeter with sufficient instrumentation to measure total and radiative heat flux has been designed and fabricated. This calorimeter will be tested both in the controlled radiative environment of the Penlight facility and in a fire environment in the FLAME/Radiant Heat (FRH) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. A significant question of interest to modeling heat flux incident to an object in or near a fire is the contribution of the radiation and convection modes of heat transfer. The series of experiments documented in this test plan is designed to provide data on the radiation partitioning, defined as the fraction of the total heat flux that is due to radiation.
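
    The radiation partitioning defined above is simply the ratio of the radiative heat flux to the total heat flux measured by the calorimeter; a minimal sketch with hypothetical gauge readings is shown below.

    ```python
    # Hypothetical co-located gauge readings on the cylindrical calorimeter (kW/m^2)
    total_flux = 95.0       # total (radiative + convective) heat flux
    radiative_flux = 72.0   # heat flux measured by the radiative gauge

    convective_flux = total_flux - radiative_flux
    radiation_fraction = radiative_flux / total_flux

    print(f"convective flux    = {convective_flux:.1f} kW/m^2")
    print(f"radiation fraction = {radiation_fraction:.2f}")
    ```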

  12. In-Trail Procedure Air Traffic Control Procedures Validation Simulation Study

    NASA Technical Reports Server (NTRS)

    Chartrand, Ryan C.; Hewitt, Katrin P.; Sweeney, Peter B.; Graff, Thomas J.; Jones, Kenneth M.

    2012-01-01

    In August 2007, Airservices Australia (Airservices) and the United States National Aeronautics and Space Administration (NASA) conducted a validation experiment of the air traffic control (ATC) procedures associated with the Automatic Dependent Surveillance-Broadcast (ADS-B) In-Trail Procedure (ITP). ITP is an Airborne Traffic Situation Awareness (ATSA) application designed for near-term use in procedural airspace in which ADS-B data are used to facilitate climb and descent maneuvers. NASA and Airservices conducted the experiment in the Airservices simulator in Melbourne, Australia. Twelve current operational air traffic controllers participated in the experiment, which identified aspects of the ITP that could be improved (mainly in the communication and controller approval process). Results showed that controllers viewed the ITP as valid and acceptable. This paper describes the experiment design and results.

  13. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments.

    PubMed

    Zhou, Bailing; Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu; Yang, Yuedong; Zhou, Yaoqi; Wang, Jihua

    2018-01-04

    Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with only a few hundred lncRNAs) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    PubMed Central

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

    Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with only a few hundred lncRNAs) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  15. Validation of an Instrument to Measure Community College Student Satisfaction

    ERIC Educational Resources Information Center

    Zhai, Lijuan

    2012-01-01

    This article reports the development and validation of a survey instrument to measure community college students' satisfaction with their educational experiences. The initial survey included 95 questions addressing community college student experiences. Data were collected from 558 community college students during spring of 2001. An exploratory…

  16. Validation Experiences and Persistence among Community College Students

    ERIC Educational Resources Information Center

    Barnett, Elisabeth A.

    2011-01-01

    The purpose of this correlational research was to examine the extent to which community college students' experiences with validation by faculty (Rendon, 1994, 2002) predicted: (a) their sense of integration, and (b) their intent to persist. The research was designed as an elaboration of constructs within Tinto's (1993) Longitudinal Model of…

  17. Interpreting operational altimetry signals in near-coastal areas using underwater autonomous vehicles and remotely sensed ocean colour data

    NASA Astrophysics Data System (ADS)

    Borrione, Ines; Oddo, Paolo; Russo, Aniello; Coelho, Emanuel

    2017-04-01

    During the LOGMEC16 (Long-Term Glider Mission for Environmental Characterization) sea trial, carried out in the eastern Ligurian Sea (northwestern Mediterranean Sea), two oceanographic gliders rated to a maximum depth of 1000 m operated continuously from 3 May to 27 June 2016. When possible, glider tracks were synchronized with the footprints of contemporaneous altimeters (i.e., Jason-2, AltiKa and CryoSat-2). Temperature and salinity measured by the gliders along the tracks that were co-localized with the altimeter passages were used to calculate along-track dynamic heights. The latter were then compared with the near-real-time absolute sea level CMEMS-TAPAS (Copernicus Marine Environment Monitoring Service - Tailored Product for Data Assimilation) product. TAPAS provides along-track sea level anomaly (SLA) estimates together with all the terms used in the correction and the associated Mean Dynamic Topography. Where available, the CMEMS near-real-time, 1 km resolution Aqua-MODIS ocean colour data were also used as a tracer of the main oceanographic features of the region. Comparison between SLA derived from gliders and TAPAS along common transects indicates that differences increase for larger sampling time lags between platforms, especially when time differences exceed 20 hours. In fact, contemporaneous ocean colour images reveal the presence of several mesoscale/sub-mesoscale structures (i.e., transient meanders and filaments), suggesting that the oceanographic variability of the region is likely the main cause of the differences observed between the glider and altimetry-based SLA. Results from this study provide additional evidence of the advantages of using a networked ocean observing system. In fact, the interpretation of in-situ observations obtained from a continuously operating sampling platform (also during ongoing experiments at sea) can be greatly improved when combined with other operational datasets, such as the CMEMS SLA used here.
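
    Conceptually, the along-track dynamic heights mentioned above come from vertically integrating the specific volume anomaly of each glider profile. The sketch below uses a deliberately simplified linear equation of state rather than the TEOS-10 routines an operational comparison would use, so all coefficients and profile values are illustrative assumptions.

    ```python
    import numpy as np

    def dynamic_height(temp, salt, pressure_dbar, rho0=1025.0,
                       alpha=2.0e-4, beta=7.6e-4, t_ref=13.0, s_ref=38.0):
        """Dynamic height anomaly (m^2/s^2) of a profile, integrated over its
        pressure range, using a linear equation of state (a rough sketch)."""
        rho = rho0 * (1.0 - alpha * (temp - t_ref) + beta * (salt - s_ref))
        delta = 1.0 / rho - 1.0 / rho0             # specific volume anomaly (m^3/kg)
        p_pa = np.asarray(pressure_dbar) * 1.0e4   # 1 dbar = 1e4 Pa
        return np.trapz(delta, p_pa)               # integral of delta dp

    # Hypothetical glider profile (surface to 1000 dbar)
    p = np.linspace(0.0, 1000.0, 101)
    t = 13.0 + 7.0 * np.exp(-p / 150.0)            # warm surface layer
    s = 38.4 - 0.3 * np.exp(-p / 300.0)            # fresher surface layer
    d = dynamic_height(t, s, p)
    print(f"dynamic height anomaly: {d:.2f} m^2/s^2 (~{d / 9.81:.3f} m)")
    ```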

  18. Reconceptualising the external validity of discrete choice experiments.

    PubMed

    Lancsar, Emily; Swait, Joffre

    2014-10-01

    External validity is a crucial but under-researched topic when considering using discrete choice experiment (DCE) results to inform decision making in clinical, commercial or policy contexts. We present the theory and tests traditionally used to explore external validity that focus on a comparison of final outcomes and review how this traditional definition has been empirically tested in health economics and other sectors (such as transport, environment and marketing) in which DCE methods are applied. While an important component, we argue that the investigation of external validity should be much broader than a comparison of final outcomes. In doing so, we introduce a new and more comprehensive conceptualisation of external validity, closely linked to process validity, that moves us from the simple characterisation of a model as being or not being externally valid on the basis of predictive performance, to the concept that external validity should be an objective pursued from the initial conceptualisation and design of any DCE. We discuss how such a broader definition of external validity can be fruitfully used and suggest innovative ways in which it can be explored in practice.

  19. Preparing for the Validation Visit--Guidelines for Optimizing the Experience.

    ERIC Educational Resources Information Center

    Osborn, Hazel A.

    2003-01-01

    Urges child care programs to seek accreditation from NAEYC's National Academy of Early Childhood Programs to increase program quality and provides information on the validation process. Includes information on the validation visit and the validator's role and background. Offers suggestions for preparing the director, staff, children, and families…

  20. Ego-Dissolution and Psychedelics: Validation of the Ego-Dissolution Inventory (EDI).

    PubMed

    Nour, Matthew M; Evans, Lisa; Nutt, David; Carhart-Harris, Robin L

    2016-01-01

    The experience of a compromised sense of "self", termed ego-dissolution, is a key feature of the psychedelic experience. This study aimed to validate the Ego-Dissolution Inventory (EDI), a new 8-item self-report scale designed to measure ego-dissolution. Additionally, we aimed to investigate the specificity of the relationship between psychedelics and ego-dissolution. Sixteen items relating to altered ego-consciousness were included in an internet questionnaire; eight relating to the experience of ego-dissolution (comprising the EDI), and eight relating to the antithetical experience of increased self-assuredness, termed ego-inflation. Items were rated using a visual analog scale. Participants answered the questionnaire for experiences with classical psychedelic drugs, cocaine and/or alcohol. They also answered the seven questions from the Mystical Experiences Questionnaire (MEQ) relating to the experience of unity with one's surroundings. Six hundred and ninety-one participants completed the questionnaire, providing data for 1828 drug experiences (1043 psychedelics, 377 cocaine, 408 alcohol). Exploratory factor analysis demonstrated that the eight EDI items loaded exclusively onto a single common factor, which was orthogonal to a second factor comprised of the items relating to ego-inflation (rho = -0.110), demonstrating discriminant validity. The EDI correlated strongly with the MEQ-derived measure of unitive experience (rho = 0.735), demonstrating convergent validity. EDI internal consistency was excellent (Cronbach's alpha 0.93). Three analyses confirmed the specificity of ego-dissolution for experiences occasioned by psychedelic drugs. Firstly, EDI score correlated with drug-dose for psychedelic drugs (rho = 0.371), but not for cocaine (rho = 0.115) or alcohol (rho = -0.055). Secondly, the linear regression line relating the subjective intensity of the experience to ego-dissolution was significantly steeper for psychedelics (unstandardized regression coefficient = 0.701) compared with cocaine (0.135) or alcohol (0.144). Ego-inflation, by contrast, was specifically associated with cocaine experiences. Finally, a binary Support Vector Machine classifier identified experiences occasioned by psychedelic drugs vs. cocaine or alcohol with over 85% accuracy using ratings of ego-dissolution and ego-inflation alone. Our results demonstrate the psychometric structure, internal consistency and construct validity of the EDI. Moreover, we demonstrate the close relationship between ego-dissolution and the psychedelic experience. The EDI will facilitate the study of the neuronal correlates of ego-dissolution, which is relevant for psychedelic-assisted psychotherapy and our understanding of psychosis.
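
    The binary classification result described above (psychedelic vs. cocaine/alcohol experiences from ego-dissolution and ego-inflation ratings alone) can be illustrated with a small scikit-learn sketch. The synthetic ratings below are invented for demonstration and are not the study data.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Synthetic visual-analog ratings (0-100): [ego_dissolution, ego_inflation]
    psychedelic = np.column_stack([rng.normal(70, 15, 200), rng.normal(20, 10, 200)])
    other_drugs = np.column_stack([rng.normal(25, 15, 200), rng.normal(45, 15, 200)])

    X = np.clip(np.vstack([psychedelic, other_drugs]), 0, 100)
    y = np.array([1] * 200 + [0] * 200)   # 1 = psychedelic, 0 = cocaine/alcohol

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")
    ```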

  1. Development and validation of the Consumer Quality index instrument to measure the experience and priority of chronic dialysis patients.

    PubMed

    van der Veer, Sabine N; Jager, Kitty J; Visserman, Ella; Beekman, Robert J; Boeschoten, Els W; de Keizer, Nicolette F; Heuveling, Lara; Stronks, Karien; Arah, Onyebuchi A

    2012-08-01

    Patient experience is an established indicator of quality of care. Validated tools that measure both experiences and priorities are lacking for chronic dialysis care, hampering identification of negative experiences that patients actually rate important. We developed two Consumer Quality (CQ) index questionnaires, one for in-centre haemodialysis (CHD) and the other for peritoneal dialysis and home haemodialysis (PHHD) care. The instruments were validated using exploratory factor analyses, reliability analysis of identified scales and assessing the association between reliable scales and global ratings. We investigated opportunities for improvement by combining suboptimal experience with patient priority. Sixteen dialysis centres participated in our study. The pilot CQ index for CHD care consisted of 71 questions. Based on data of 592 respondents, we identified 42 core experience items in 10 scales with Cronbach's α ranging from 0.38 to 0.88; five were reliable (α ≥ 0.70). The instrument identified information on centres' fire procedures as the aspect of care exhibiting the biggest opportunity for improvement. The pilot CQ index PHHD comprised 56 questions. The response of 248 patients yielded 31 core experience items in nine scales with Cronbach's α ranging between 0.53 and 0.85; six were reliable. Information on kidney transplantation during pre-dialysis showed most room for improvement. However, for both types of care, opportunities for improvement were mostly limited. The CQ index reliably and validly captures dialysis patient experience. Overall, most care aspects showed limited room for improvement, mainly because patients participating in our study rated their experience to be optimal. To evaluate items with high priority, but with which relatively few patients have experience, more qualitative instruments should be considered.

  2. A 25-year Record of Antarctic Ice Sheet Elevation and Mass Change

    NASA Astrophysics Data System (ADS)

    Shepherd, A.; Muir, A. S.; Sundal, A.; McMillan, M.; Briggs, K.; Hogg, A.; Engdahl, M.; Gilbert, L.

    2017-12-01

    Since 1992, the European Remote-Sensing (ERS-1 and ERS-2), ENVISAT, and CryoSat-2 satellite radar altimeters have measured the Antarctic ice sheet surface elevation repeatedly, at approximately monthly intervals. These data constitute the longest continuous record of ice-sheet-wide change. In this paper, we use these observations to determine changes in the elevation, volume and mass of the East Antarctic and West Antarctic ice sheets, and of parts of the Antarctic Peninsula ice sheet, over a 25-year period. The root mean square difference between elevation rates computed from our survey and 257,296 estimates determined from airborne laser measurements is 54 cm/yr. The longevity of the satellite altimeter record allows us to identify and chart the evolution of changes associated with meteorology and ice flow, and we estimate that 3.6% of the continental ice sheet, and 21.7% of West Antarctica, is in a state of dynamical imbalance. Based on this partitioning, we estimate the mass balance of the East and West Antarctic ice sheet drainage basins; the root mean square difference between these and independent estimates derived from satellite gravimetry is less than 5 Gt yr⁻¹.
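
    The elevation-rate comparison summarized above amounts to fitting a linear trend to each altimetry time series and taking the root mean square difference against the airborne laser estimates. The sketch below is a toy version of that calculation; the epochs, noise level and "airborne" rates are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def elevation_rate(times_yr, elevations_m):
        """Least-squares linear trend (m/yr) of an elevation time series."""
        slope, _intercept = np.polyfit(times_yr, elevations_m, 1)
        return slope

    # Hypothetical: monthly altimeter elevations at four locations also surveyed
    # by airborne laser, with the airborne rate estimates treated as reference
    t = np.arange(1992.0, 2017.0, 1.0 / 12.0)
    airborne_rates = np.array([-0.35, 0.02, -1.10, 0.15])        # m/yr
    altimeter_rates = np.array([
        elevation_rate(t, r * (t - t[0]) + rng.normal(0.0, 0.3, t.size))
        for r in airborne_rates                                  # 30 cm noise
    ])

    rms = np.sqrt(np.mean((altimeter_rates - airborne_rates) ** 2))
    print(f"RMS difference in elevation rate: {rms * 100:.0f} cm/yr")
    ```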

  3. Minimum Number of Observation Points for LEO Satellite Orbit Estimation by OWL Network

    NASA Astrophysics Data System (ADS)

    Park, Maru; Jo, Jung Hyun; Cho, Sungki; Choi, Jin; Kim, Chun-Hwey; Park, Jang-Hyun; Yim, Hong-Suh; Choi, Young-Jun; Moon, Hong-Kyu; Bae, Young-Ho; Park, Sun-Youp; Kim, Ji-Hye; Roh, Dong-Goo; Jang, Hyun-Jung; Park, Young-Sik; Jeong, Min-Ji

    2015-12-01

    Using the Optical Wide-field Patrol (OWL) network developed by the Korea Astronomy and Space Science Institute (KASI), we generated right ascension and declination angle data from optical observations of Low Earth Orbit (LEO) satellites. We performed an analysis to determine the optimum number of observation points per arc needed for successful orbit estimation. The currently functioning OWL observatories are located in Daejeon (South Korea), Songino (Mongolia), and Oukaïmeden (Morocco). The Daejeon Observatory functions as a test bed. In this study, the observed targets were Gravity Probe B, COSMOS 1455, COSMOS 1726, COSMOS 2428, SEASAT 1, ATV-5, and CryoSat-2 (all in LEO). These satellites were observed from the test bed and the Songino Observatory of the OWL network during 21 nights in 2014 and 2015. After estimating the orbit from systematically selected sets of observation points (20, 50, 100, and 150) for each pass, we compared the differences between the orbit estimates for each case and the Two-Line Element set (TLE) from the Joint Space Operations Center (JSpOC). We then determined the average difference and selected the optimal number of observation points by comparing the average values.

  4. The Reference Elevation Model of Antarctica (REMA): A High Resolution, Time-Stamped Digital Elevation Model for the Antarctic Ice Sheet

    NASA Astrophysics Data System (ADS)

    Howat, I.; Noh, M. J.; Porter, C. C.; Smith, B. E.; Morin, P. J.

    2017-12-01

    We are creating the Reference Elevation Model of Antarctica (REMA), a continuous, high resolution (2-8 m), high precision (accuracy better than 1 m) reference surface for a wide range of glaciological and geodetic applications. REMA will be constructed from stereo-photogrammetric Digital Surface Models (DSMs) extracted from pairs of submeter-resolution DigitalGlobe satellite imagery and vertically registered to precise elevations from near-coincident airborne LiDAR, ground-based GPS surveys and CryoSat-2 radar altimetry. Both a seamless mosaic and individual, time-stamped DSM strips, collected primarily between 2012 and 2016, will be distributed to enable change measurement. These data will be used for mapping bed topography from ice thickness, measuring ice thickness changes, constraining ice flow and geodynamic models, mapping glacial geomorphology, terrain corrections and filtering of remote sensing observations, and many other science tasks. It will also be critical for mapping ice traverse routes, landing sites and other field logistics planning. REMA will also provide a critical elevation benchmark for future satellite altimetry missions, including ICESat-2. Here we report on REMA production progress, initial accuracy assessment and data availability.

  5. Statistical Mechanics and the Climatology of the Arctic Sea Ice Thickness Distribution

    NASA Astrophysics Data System (ADS)

    Wettlaufer, John; Toppaladoddi, Srikanth

    We study the seasonal changes in the thickness distribution of Arctic sea ice, g(h), under climate forcing. Our analytical and numerical approach is based on a Fokker-Planck equation for g(h), in which the thermodynamic growth rates are determined using observed climatology. In particular, the Fokker-Planck equation is coupled to an observationally consistent thermodynamic model. We find that due to the combined effects of thermodynamics and mechanics, g(h) spreads during winter and contracts during summer. This behavior is in agreement with recent satellite observations from CryoSat-2. Because g(h) is a probability density function, we quantify all of the key moments (e.g., mean thickness, fraction of thin/thick ice, mean albedo, relaxation time scales) as greenhouse-gas radiative forcing, ΔF0, increases. The mean ice thickness decays exponentially with ΔF0, but much more slowly than in solely thermodynamic models. This exhibits the crucial role that ice mechanics plays in maintaining the ice cover, by redistributing thin ice into thick ice far more rapidly than thermal growth alone can. NASA Grant NNH13ZDA001N-CRYO and Swedish Research Council Grant No. 638-2013-9243.
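
    The abstract does not reproduce the governing equation. A generic drift-diffusion (Fokker-Planck) form consistent with the description, with the climatological thermodynamic growth rate f(h, t) as the drift term and an assumed constant mechanical-redistribution diffusivity D, would look like the following; both symbols are placeholders rather than the authors' exact coefficients.

    ```latex
    % Generic Fokker-Planck sketch for the sea-ice thickness distribution g(h,t):
    % f(h,t) = climatological thermodynamic growth rate (drift, assumed input)
    % D      = effective mechanical-redistribution diffusivity (assumed constant)
    \[
      \frac{\partial g(h,t)}{\partial t}
        = -\frac{\partial}{\partial h}\bigl[f(h,t)\,g(h,t)\bigr]
          + D\,\frac{\partial^{2} g(h,t)}{\partial h^{2}},
      \qquad \int_{0}^{\infty} g(h,t)\,\mathrm{d}h = 1 .
    \]
    ```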

  6. M2 Internal Tides and Their Observed Wavenumber Spectra from Satellite Altimetry*

    NASA Technical Reports Server (NTRS)

    Ray, R. D.; Zaron, E. D.

    2015-01-01

    A near-global chart of surface elevations associated with the stationary M2 internal tide is empirically constructed from multi-mission satellite altimeter data. An advantage of a strictly empirical mapping approach is that results are independent of assumptions about ocean wave dynamics and, in fact, can be used to test such assumptions. A disadvantage is that present-day altimeter coverage is only marginally adequate to support mapping such short-wavelength features. Moreover, predominantly north-south ground-track orientations and contamination from nontidal oceanographic variability can lead to deficiencies in mapped tides. Independent data from Cryosphere Satellite-2 (CryoSat-2) and other altimeters are used to test the solutions and show positive reduction in variance except in regions of large mesoscale variability. The tidal fields are subjected to two-dimensional wavenumber spectral analysis, which allows for the construction of an empirical map of modal wavelengths. Mode-1 wavelengths show good agreement with theoretical wavelengths calculated from the ocean's mean stratification, with a few localized exceptions (e.g., Tasman Sea). Mode-2 waves are detectable in much of the ocean, with wavelengths in reasonable agreement with theoretical expectations, but their spectral signatures grow too weak to map in some regions.
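
    A two-dimensional wavenumber spectral analysis like the one described can be sketched as follows. The gridded tidal-elevation field here is synthetic (a single mode-1-like plane wave plus noise), and the grid spacing and wavelength are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic gridded M2 internal-tide surface elevation (cm) on a 256 x 256 grid
    n, dx_km = 256, 7.5                        # grid size and spacing (km)
    x = np.arange(n) * dx_km
    X, Y = np.meshgrid(x, x)
    wavelength_km = 140.0                      # assumed mode-1-like wavelength
    eta = 1.5 * np.cos(2 * np.pi * (0.8 * X + 0.6 * Y) / wavelength_km)
    eta += rng.normal(0.0, 0.3, eta.shape)     # nontidal "noise"

    # 2-D wavenumber spectrum and the radial wavenumber of its peak
    spec = np.abs(np.fft.fftshift(np.fft.fft2(eta))) ** 2
    kx = np.fft.fftshift(np.fft.fftfreq(n, d=dx_km))   # cycles per km
    KX, KY = np.meshgrid(kx, kx)
    K = np.hypot(KX, KY)

    K_flat, spec_flat = K.ravel(), spec.ravel()
    k_peak = K_flat[np.argmax(np.where(K_flat > 0, spec_flat, 0.0))]
    print(f"estimated wavelength: {1.0 / k_peak:.0f} km")
    ```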

  7. The Inventory of High-School Students' Recent Life Experiences: A Decontaminated Measure of Adolescents' Hassles.

    ERIC Educational Resources Information Center

    Kohn, Paul M.; Milrose, Jill A.

    1993-01-01

    A decontaminated measure of exposures to hassles for adolescents, the Inventory of High-School Students' Recent Life Experiences (IHSSRLE), was developed and validated with 94 male and 82 female Canadian high school students. The IHSSRLE shows adequate internal consistency reliability and validity against the criterion of subjectively appraised…

  8. Pathways to Engineering: The Validation Experiences of Transfer Students

    ERIC Educational Resources Information Center

    Zhang, Yi; Ozuna, Taryn

    2015-01-01

    Community college engineering transfer students are a critical student population of engineering degree recipients and technical workforce in the United States. Focusing on this group of students, we adopted Rendón's (1994) validation theory to explore the students' experiences in community colleges prior to transferring to a four-year…

  9. Validity of Adult Retrospective Reports of Adverse Childhood Experiences: Review of the Evidence

    ERIC Educational Resources Information Center

    Hardt, Jochen; Rutter, Michael

    2004-01-01

    Background: Influential studies have cast doubt on the validity of retrospective reports by adults of their own adverse experiences in childhood. Accordingly, many researchers view retrospective reports with scepticism. Method: A computer-based search, supplemented by hand searches, was used to identify studies reported between 1980 and 2001 in…

  10. Validity And Practicality of Experiment Integrated Guided Inquiry-Based Module on Topic of Colloidal Chemistry for Senior High School Learning

    NASA Astrophysics Data System (ADS)

    Andromeda, A.; Lufri; Festiyed; Ellizar, E.; Iryani, I.; Guspatni, G.; Fitri, L.

    2018-04-01

    This Research & Development study aims to produce a valid and practical experiment-integrated, guided-inquiry-based module on the topic of colloidal chemistry. The 4D instructional design model was selected for this study. A limited trial of the product was conducted at SMAN 7 Padang. The instruments used were validity and practicality questionnaires. Validity and practicality data were analyzed using the Kappa moment. Analysis of the data shows that the Kappa moment for validity was 0.88, indicating a very high degree of validity. Kappa moments for practicality from students and teachers were 0.89 and 0.95, respectively, indicating a high degree of practicality. Analysis of the modules filled in by students shows that 91.37% of students could correctly answer the critical thinking, exercise, pre-lab, post-lab and worksheet questions in the module. These findings indicate that the experiment-integrated guided-inquiry-based module on colloidal chemistry is valid and practical for chemistry learning in senior high school.
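
    As an illustration of a kappa-type, chance-corrected agreement index, the sketch below computes standard Cohen's kappa between two hypothetical expert raters using scikit-learn; the "Kappa moment" statistic used in the study may be calculated differently, so this is only a generic stand-in.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical validity ratings (1 = adequate, 0 = not adequate) given by two
    # expert reviewers to ten components of the module
    rater_a = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]
    rater_b = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]

    kappa = cohen_kappa_score(rater_a, rater_b)
    print(f"Cohen's kappa = {kappa:.2f}")
    ```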

  11. A Complete Reporting of MCNP6 Validation Results for Electron Energy Deposition in Single-Layer Extended Media for Source Energies <= 1-MeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixon, David A.; Hughes, Henry Grady

    In this paper, we expand on previous validation work by Dixon and Hughes. That is, we present a more complete suite of validation results with respect to the well-known Lockwood energy deposition experiment. Lockwood et al. measured energy deposition in materials including beryllium, carbon, aluminum, iron, copper, molybdenum, tantalum, and uranium, for both single- and multi-layer 1-D geometries. Source configurations included mono-energetic, mono-directional electron beams with energies of 0.05-MeV, 0.1-MeV, 0.3-MeV, 0.5-MeV, and 1-MeV, in both normal and off-normal angles of incidence. These experiments are particularly valuable for validating electron transport codes, because they are closely represented by simulating pencil beams incident on 1-D semi-infinite slabs with and without material interfaces. Herein, we include total energy deposition and energy deposition profiles for the single-layer experiments reported by Lockwood et al. (a more complete multi-layer validation will follow in another report).

  12. Cloud computing and validation of expandable in silico livers

    PubMed Central

    2010-01-01

    Background In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling experiments to use more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster, we undertook the re-validation experiments using the Amazon EC2 cloud platform. So doing required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. Results The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstration of results equivalency from two different wet-labs. Conclusions The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters. The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing as well as provide straightforward tools for scaling simulations to encompass greater detail with no extra investment in hardware. PMID:21129207

  13. Psychometric Properties of Korean Version of the Second Victim Experience and Support Tool (K-SVEST).

    PubMed

    Kim, Eun-Mi; Kim, Sun-Aee; Lee, Ju-Ry; Burlison, Jonathan D; Oh, Eui Geum

    2018-02-13

    "Second victims" are defined as healthcare professionals whose wellness is influenced by adverse clinical events. The Second Victim Experience and Support Tool (SVEST) was used to measure the second-victim experience and quality of support resources. Although the reliability and validity of the original SVEST have been validated, those for the Korean tool have not been validated. The aim of the study was to evaluate the psychometric properties of the Korean version of the SVEST. The study included 305 clinical nurses as participants. The SVEST was translated into Korean via back translation. Content validity was assessed by seven experts, and test-retest reliability was evaluated by 30 clinicians. Internal consistency and construct validity were assessed via confirmatory factor analysis. The analyses were performed using SPSS 23.0 and STATA 13.0 software. The content validity index value demonstrated validity; item- and scale-level content validity index values were both 0.95. Test-retest reliability and internal consistency reliability were satisfactory: the intraclass consistent coefficient was 0.71, and Cronbach α values ranged from 0.59 to 0.87. The CFA showed a significantly good fit for an eight-factor structure (χ = 578.21, df = 303, comparative fit index = 0.92, Tucker-Lewis index = 0.90, root mean square error of approximation = 0.05). The K-SVEST demonstrated good psychometric properties and adequate validity and reliability. The results showed that the Korean version of SVEST demonstrated the extent of second victimhood and support resources in Korean healthcare workers and could aid in the development of support programs and evaluation of their effectiveness.

  14. The Environmental Reward Observation Scale (EROS): development, validity, and reliability.

    PubMed

    Armento, Maria E A; Hopko, Derek R

    2007-06-01

    Researchers acknowledge a strong association between the frequency and duration of environmental reward and affective mood states, particularly in relation to the etiology, assessment, and treatment of depression. Given behavioral theories that outline environmental reward as a strong mediator of affect and the unavailability of an efficient, reliable, and valid self-report measure of environmental reward, we developed the Environmental Reward Observation Scale (EROS) and examined its psychometric properties. In Experiment 1, exploratory factor analysis supported a unidimensional 10-item measure with strong internal consistency and test-retest reliability. When administered to a replication sample, confirmatory factor analysis suggested an excellent fit to the 1-factor model and convergent/discriminant validity data supported the construct validity of the EROS. In Experiment 2, further support for the convergent validity of the EROS was obtained via moderate correlations with the Pleasant Events Schedule (PES; MacPhillamy & Lewinsohn, 1976). In Experiment 3, hierarchical regression supported the ecological validity of the EROS toward predicting daily diary reports of time spent in highly rewarding behaviors and activities. Above and beyond variance accounted for by depressive symptoms (BDI), the EROS was associated with significant incremental variance in accounting for time spent in both low and high reward behaviors. The EROS may represent a brief, reliable and valid measure of environmental reward that may improve the psychological assessment of negative mood states such as clinical depression.

  15. Chemometric and biological validation of a capillary electrophoresis metabolomic experiment of Schistosoma mansoni infection in mice.

    PubMed

    Garcia-Perez, Isabel; Angulo, Santiago; Utzinger, Jürg; Holmes, Elaine; Legido-Quigley, Cristina; Barbas, Coral

    2010-07-01

    Metabonomic and metabolomic studies are increasingly utilized for biomarker identification in different fields, including the biology of infection. The confluence of improved analytical platforms and the availability of powerful multivariate analysis software have rendered the multiparameter profiles generated by these omics platforms a user-friendly alternative to established analysis methods, in which the quality and practice of a procedure are well defined. However, unlike traditional assays, validation methods for these new multivariate profiling tools have yet to be established. We propose a validation strategy for models obtained by CE fingerprinting of urine from mice infected with the blood fluke Schistosoma mansoni. We have analysed urine samples from two sets of mice infected in an inter-laboratory experiment, in which different infection methods and animal husbandry procedures were employed, in order to establish the core biological response to a S. mansoni infection. CE data were analysed using principal component analysis. Validation of the scores consisted of permutation scrambling (100 repetitions) and a manual validation method, using a third of the samples (not included in the model) as a test or prediction set. The validation yielded 100% specificity and 100% sensitivity, demonstrating the robustness of these models with respect to deciphering metabolic perturbations in the mouse due to a S. mansoni infection. A total of 20 metabolites across the two experiments were identified that significantly discriminated between S. mansoni-infected and noninfected control samples. Only one of these metabolites, allantoin, was identified as manifesting different behaviour in the two experiments. This study shows the reproducibility of CE-based metabolic profiling methods for disease characterization and screening and highlights the importance of much-needed validation strategies in the emerging field of metabolomics.
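
    A minimal sketch of the two validation devices described above, permutation scrambling of class labels and a held-out prediction set, is given below. It uses synthetic data and a simple nearest-centroid rule in PCA space, so it illustrates the idea rather than reproducing the authors' CE pipeline.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    n_samples, n_features = 40, 120                  # synthetic metabolite fingerprints
    labels = np.array([0] * 20 + [1] * 20)           # 0 = noninfected control, 1 = infected
    X = rng.normal(size=(n_samples, n_features))
    X[labels == 1, :10] += 1.5                       # a few discriminating features

    def holdout_accuracy(X, y, test_frac=1/3, seed=0):
        """Nearest-centroid classification in PCA space on a held-out prediction set."""
        mask_rng = np.random.default_rng(seed)
        test = mask_rng.random(len(y)) < test_frac
        scores = PCA(n_components=2).fit(X[~test]).transform(X)
        centroids = [scores[~test][y[~test] == c].mean(axis=0) for c in (0, 1)]
        pred = np.array([np.argmin([np.linalg.norm(s - c) for c in centroids])
                         for s in scores[test]])
        return (pred == y[test]).mean()

    observed = holdout_accuracy(X, labels)
    scrambled = [holdout_accuracy(X, rng.permutation(labels)) for _ in range(100)]
    print(f"observed accuracy {observed:.2f}, permutation mean {np.mean(scrambled):.2f}")
    ```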

  16. Father for the first time - development and validation of a questionnaire to assess fathers’ experiences of first childbirth (FTFQ)

    PubMed Central

    2012-01-01

    Background A father’s experience of the birth of his first child is important not only for his birth-giving partner but also for the father himself, his relationship with the mother and the newborn. No validated questionnaire assessing first-time fathers' experiences during childbirth is currently available. Hence, the aim of this study was to develop and validate an instrument to assess first-time fathers’ experiences of childbirth. Method Domains and items were initially derived from interviews with first-time fathers, and supplemented by a literature search and a focus group interview with midwives. The comprehensibility, comprehension and relevance of the items were evaluated by four paternity research experts, and a preliminary questionnaire was pilot tested in eight first-time fathers. A revised questionnaire was completed by 200 first-time fathers (response rate = 81%). Exploratory factor analysis using principal component analysis with varimax rotation was performed, and multitrait scaling analysis was used to test scaling assumptions. External validity was assessed by means of known-groups analysis. Results Factor analysis yielded four factors comprising 22 items and accounting for 48% of the variance. The domains found were Worry, Information, Emotional support and Acceptance. Multitrait analysis confirmed the convergent and discriminant validity of the domains; however, Cronbach’s alpha did not meet conventional reliability standards in two domains. The questionnaire was sensitive to differences between groups of fathers hypothesized to differ on important socio-demographic or clinical variables. Conclusions The questionnaire adequately measures important dimensions of first-time fathers’ childbirth experience and may be used to assess aspects of fathers’ experiences during childbirth. To obtain the FTFQ and permission for its use, please contact the corresponding author. PMID:22594834

  17. Father for the first time--development and validation of a questionnaire to assess fathers' experiences of first childbirth (FTFQ).

    PubMed

    Premberg, Åsa; Taft, Charles; Hellström, Anna-Lena; Berg, Marie

    2012-05-17

    A father's experience of the birth of his first child is important not only for his birth-giving partner but also for the father himself, his relationship with the mother and the newborn. No validated questionnaire assessing first-time fathers' experiences during childbirth is currently available. Hence, the aim of this study was to develop and validate an instrument to assess first-time fathers' experiences of childbirth. Domains and items were initially derived from interviews with first-time fathers, and supplemented by a literature search and a focus group interview with midwives. The comprehensibility, comprehension and relevance of the items were evaluated by four paternity research experts, and a preliminary questionnaire was pilot tested in eight first-time fathers. A revised questionnaire was completed by 200 first-time fathers (response rate = 81%). Exploratory factor analysis using principal component analysis with varimax rotation was performed, and multitrait scaling analysis was used to test scaling assumptions. External validity was assessed by means of known-groups analysis. Factor analysis yielded four factors comprising 22 items and accounting for 48% of the variance. The domains found were Worry, Information, Emotional support and Acceptance. Multitrait analysis confirmed the convergent and discriminant validity of the domains; however, Cronbach's alpha did not meet conventional reliability standards in two domains. The questionnaire was sensitive to differences between groups of fathers hypothesized to differ on important socio-demographic or clinical variables. The questionnaire adequately measures important dimensions of first-time fathers' childbirth experience and may be used to assess aspects of fathers' experiences during childbirth. To obtain the FTFQ and permission for its use, please contact the corresponding author.
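
    The extraction step reported above (principal component analysis followed by varimax rotation) can be sketched in a few lines. The code below runs on synthetic item responses and uses a plain implementation of the standard Kaiser varimax algorithm; it is illustrative only and is not the software used by the authors.

    ```python
    import numpy as np

    def varimax(loadings: np.ndarray, n_iter: int = 100, tol: float = 1e-6) -> np.ndarray:
        """Varimax rotation of a loading matrix (standard Kaiser algorithm)."""
        p, k = loadings.shape
        R = np.eye(k)
        var_old = 0.0
        for _ in range(n_iter):
            L = loadings @ R
            u, s, vt = np.linalg.svd(
                loadings.T @ (L**3 - L @ np.diag((L**2).sum(axis=0)) / p))
            R = u @ vt
            var_new = s.sum()
            if var_new - var_old < tol:
                break
            var_old = var_new
        return loadings @ R

    rng = np.random.default_rng(2)
    items = rng.normal(size=(200, 22))                   # 200 respondents x 22 items (synthetic)
    corr = np.corrcoef(items, rowvar=False)
    eigval, eigvec = np.linalg.eigh(corr)
    order = np.argsort(eigval)[::-1][:4]                 # keep four principal components
    loadings = eigvec[:, order] * np.sqrt(eigval[order])
    rotated = varimax(loadings)
    print("variance explained by four components:", round(eigval[order].sum() / eigval.sum(), 2))
    ```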

  18. Moving to Capture Children's Attention: Developing a Methodology for Measuring Visuomotor Attention.

    PubMed

    Hill, Liam J B; Coats, Rachel O; Mushtaq, Faisal; Williams, Justin H G; Aucott, Lorna S; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child's development. However, methodological limitations currently make large-scale assessment of children's attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of 'Visual Motor Attention' (VMA)-a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method's core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults' attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action).

  19. Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A

    2011-01-01

    The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software, developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system, provides unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of the IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.

  20. Validation of the Land-Surface Energy Budget and Planetary Boundary Layer for Several Intensive field Experiments

    NASA Technical Reports Server (NTRS)

    Bosilovich, Michael G.; Schubert, Siegfried; Molod, Andrea; Houser, Paul R.

    1999-01-01

    Land-surface processes in a data assimilation system influence the lower troposphere and must be properly represented. With the recent incorporation of the Mosaic Land-surface Model (LSM) into the GEOS Data Assimilation System (DAS), the detailed land-surface processes require strict validation. While global data sources can identify large-scale systematic biases at the monthly timescale, the diurnal cycle is difficult to validate. Moreover, global data sets rarely include variables such as evaporation, sensible heat and soil water. Intensive field experiments, on the other hand, can provide high temporal resolution energy budget and vertical profile data for sufficiently long periods, without global coverage. Here, we evaluate the GEOS DAS against several intensive field experiments. The field experiments are First ISLSCP Field Experiment (FIFE, Kansas, summer 1987), Cabauw (as used in PILPS, Netherlands, summer 1987), Atmospheric Radiation Measurement (ARM, Southern Great Plains, winter and summer 1998) and the Surface Heat Budget of the Arctic Ocean (SHEBA, Arctic ice sheet, winter and summer 1998). The sites provide complete surface energy budget data for periods of at least one year, and some periods of vertical profiles. This comparison provides a detailed validation of the Mosaic LSM within the GEOS DAS for a variety of climatologic and geographic conditions.

  1. Teachers' Perceptions and Undergraduate Students' Experience in E-Exam in Higher Institution in Nigeria

    ERIC Educational Resources Information Center

    Hamsatu, Pur; Yusufu, Gambo; Mohammed, Habib A.

    2016-01-01

    This study was conducted to explore teachers' perceptions of, and students' experiences with, e-examination at the University of Maiduguri. Questionnaires were distributed to 30 teachers and 50 students, and the 80 collated instruments were valid for data analysis, representing a response rate of 100%. The validity of the questionnaire was approved by some…

  2. A Validation Study of the Adolescent Dissociative Experiences Scale

    ERIC Educational Resources Information Center

    Keck Seeley, Susan. M.; Perosa, Sandra, L.; Perosa, Linda, M.

    2004-01-01

    Objective: The purpose of this study was to further the validation process of the Adolescent Dissociative Experiences Scale (A-DES). In this study, a 6-item Likert response format with descriptors was used when responding to the A-DES rather than the 11-item response format used in the original A-DES. Method: The internal reliability and construct…

  3. Students' and Teacher's Experiences of the Validity and Reliability of Assessment in a Bioscience Course

    ERIC Educational Resources Information Center

    Räisänen, Milla; Tuononen, Tarja; Postareff, Liisa; Hailikari, Telle; Virtanen, Viivi

    2016-01-01

    This case study explores the assessment of students' learning outcomes in a second-year lecture course in biosciences. The aim is to deeply explore the teacher's and the students' experiences of the validity and reliability of assessment and to compare those perspectives. The data were collected through stimulated recall interviews. The results…

  4. An Examination and Validation of an Adapted Youth Experience Scale for University Sport

    ERIC Educational Resources Information Center

    Rathwell, Scott; Young, Bradley W.

    2016-01-01

    Limited tools assess positive development through university sport. Such a tool was validated in this investigation using two independent samples of Canadian university athletes. In Study 1, 605 athletes completed 99 survey items drawn from the Youth Experience Scale (YES 2.0), and separate a priori measurement models were evaluated (i.e., 99…

  5. (In)validation in the Minority: The Experiences of Latino Students Enrolled in an HBCU

    ERIC Educational Resources Information Center

    Allen, Taryn Ozuna

    2016-01-01

    This qualitative, phenomenological study examined the academic and interpersonal validation experiences of four female and four male Latino students who were enrolled in their second- to fifth-year at an HBCU in Texas. Using interviews, campus observations, a questionnaire, and analytic memos, this study sought to understand the role of in- and…

  6. Validation of the thermal transport model used for ITER startup scenario predictions with DIII-D experimental data

    DOE PAGES

    Casper, T. A.; Meyer, W. H.; Jackson, G. L.; ...

    2010-12-08

    We are exploring characteristics of ITER startup scenarios in similarity experiments conducted on the DIII-D Tokamak. In these experiments, we have validated scenarios for the ITER current ramp up to full current and developed methods to control the plasma parameters to achieve stability. Predictive simulations of ITER startup using 2D free-boundary equilibrium and 1D transport codes rely on accurate estimates of the electron and ion temperature profiles that determine the electrical conductivity and pressure profiles during the current rise. Here we present results of validation studies that apply the transport model used by the ITER team to DIII-D discharge evolution and comparisons with data from our similarity experiments.

  7. DSMC Simulations of Hypersonic Flows and Comparison With Experiments

    NASA Technical Reports Server (NTRS)

    Moss, James N.; Bird, Graeme A.; Markelov, Gennady N.

    2004-01-01

    This paper presents computational results obtained with the direct simulation Monte Carlo (DSMC) method for several biconic test cases in which shock interactions and flow separation-reattachment are key features of the flow. Recent ground-based experiments have been performed for several biconic configurations, and surface heating rate and pressure measurements have been proposed for code validation studies. The present focus is to expand the current validation activities for a relatively new DSMC code called DS2V that Bird (second author) has developed. Comparisons with experiments and other computations help clarify the agreement currently achieved between computations and experiments and identify the range of measurement variability of the proposed validation data when benchmarked against the current computations. For the test cases with significant vibrational nonequilibrium, the effect of vibrational energy surface accommodation on heating and other quantities is demonstrated.

  8. Real-time remote scientific model validation

    NASA Technical Reports Server (NTRS)

    Frainier, Richard; Groleau, Nicolas

    1994-01-01

    This paper describes flight results from the use of a CLIPS-based validation facility to compare analyzed data from a space life sciences (SLS) experiment to an investigator's preflight model. The comparison, performed in real time, either confirms or refutes the model and its predictions. This result then becomes the basis for continuing or modifying the investigator's experiment protocol. Typically, neither the astronaut crew in Spacelab nor the ground-based investigator team is able to react to experiment data in real time. This facility, part of a larger science advisor system called Principal Investigator in a Box, was flown on the space shuttle in October 1993. The software system aided the conduct of a human vestibular physiology experiment and was able to outperform humans in the tasks of data integrity assurance, data analysis, and scientific model validation. Of the twelve preflight hypotheses associated with the investigator's model, seven were confirmed and five were rejected or compromised.

  9. The Precise Orbit and the Challenge of Long Term Stability

    NASA Technical Reports Server (NTRS)

    Lemoine, Frank G.; Cerri, Luca; Otten, Michiel; Bertiger, William; Zelensky, Nikita; Willis, Pascal

    2012-01-01

    The computation of a precise reference orbit is a fundamental component of the altimetric measurement. Since the dawn of the modern altimeter age, orbit accuracy has been determined by the quality of the GPS, SLR, and DORIS tracking systems, the fidelity of the measurement and force models, the choice of parameterization for the orbit solutions, and whether a dynamic or a reduced-dynamic strategy is used to calculate the orbits. At the start of the TOPEX mission, inaccuracies in the modeling of static gravity, dynamic ocean tides, and the nonconservative forces dominated the orbit error budget. Much of the error due to dynamic mismodeling can be compensated by reduced-dynamic tracking techniques, depending on the strength of the measurement system. In the last decade, the launch of the GRACE mission has eliminated the static gravity field as a concern, and the background force models and the terrestrial reference frame have been systematically refined. GPS systems have realized many improvements, including better modeling of the forces on the GPS spacecraft, large increases in the ground tracking network, and improved modeling of the GPS measurements. DORIS systems have achieved improvements through the use of new antennae, more stable monumentation, and satellite receivers that can track multiple beacons, as well as through improved modeling of the nonconservative forces. Many of these improvements have been applied in the new reprocessed time series of orbits produced for the ERS satellites, Envisat, TOPEX/Poseidon and the Jason satellites, as well as for the more recent CryoSat-2 and HY2A. We now face the challenge of maintaining a stable orbit reference for these altimetric satellites. Changes in the time-variable gravity field of the Earth, and how these are modelled, have been shown to affect the orbit evolution and the calibration of the altimetric data with tide gauges. The accuracy of the reference frame realizations, and their projection into the future, remains a source of error. Other sources of omission error include the geocenter, for which no consensus model is yet applied. Although progress has been made in nonconservative force modeling through the use of detailed satellite-specific models, radiation pressure modeling and atmospheric density modeling remain a potential source of orbit error. The longer-term influence of variations in the solar and terrestrial radiation fields over annual and solar cycles remains principally untested. The long-term variation in the optical and thermal properties of the space vehicle surfaces would also contribute to biases in the orbital frame if ignored. We review the status of altimetric precision orbit determination as exemplified by the recent computations undertaken by the different analysis centers for ERS, Envisat, TOPEX/Poseidon, Jason, CryoSat-2 and HY2A, and we provide a perspective on the challenges for future missions such as Jason-3, SENTINEL-3 and SWOT.

  10. Virtual experiments, physical validation: dental morphology at the intersection of experiment and theory

    PubMed Central

    Anderson, P. S. L.; Rayfield, E. J.

    2012-01-01

    Computational models such as finite-element analysis offer biologists a means of exploring the structural mechanics of biological systems that cannot be directly observed. Validated against experimental data, a model can be manipulated to perform virtual experiments, testing variables that are hard to control in physical experiments. The relationship between tooth form and the ability to break down prey is key to understanding the evolution of dentition. Recent experimental work has quantified how tooth shape promotes fracture in biological materials. We present a validated finite-element model derived from physical compression experiments. The model shows close agreement with strain patterns observed in photoelastic test materials and reaction forces measured during these experiments. We use the model to measure strain energy within the test material when different tooth shapes are used. Results show that notched blades deform materials for less strain energy cost than straight blades, giving insights into the energetic relationship between tooth form and prey materials. We identify a hypothetical ‘optimal’ blade angle that minimizes strain energy costs and test alternative prey materials via virtual experiments. Using experimental data and computational models offers an integrative approach to understand the mechanics of tooth morphology. PMID:22399789

  11. PSI-Center Validation Studies

    NASA Astrophysics Data System (ADS)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Sutherland, D. A.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2014-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with 3D extended MHD simulations using the NIMROD, HiFi, and PSI-TET codes. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), HBT-EP (Columbia), HIT-SI (U Wash-UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). The PSI-Center is exploring the application of validation metrics between experimental data and simulation results. Biorthogonal decomposition (BOD) is used to compare experiments with simulations. BOD separates data sets into spatial and temporal structures, giving greater weight to dominant structures. Several BOD metrics are being formulated with the goal of quantitative validation. Results from these simulation and validation studies, as well as an overview of the PSI-Center status, will be presented.
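
    Biorthogonal decomposition of a space-time data matrix is, in essence, a singular value decomposition into spatial modes (topos) and temporal modes (chronos) weighted by their energy. The sketch below applies it to a synthetic signal; it is a minimal illustration, not the PSI-Center codes or metrics.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    nx, nt = 64, 400
    x = np.linspace(0, 2 * np.pi, nx)
    t = np.linspace(0, 10, nt)
    # Synthetic space-time signal: two coherent structures plus noise.
    data = (np.outer(np.sin(x), np.cos(2 * np.pi * t))
            + 0.3 * np.outer(np.sin(2 * x), np.sin(4 * np.pi * t))
            + 0.05 * rng.normal(size=(nx, nt)))

    # Biorthogonal decomposition: spatial modes (topos), weights, temporal modes (chronos).
    topos, weights, chronos = np.linalg.svd(data, full_matrices=False)
    energy = weights**2 / (weights**2).sum()
    print("energy fraction of the two dominant structures:", energy[:2].round(3))
    ```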

  12. HBOI Underwater Imaging and Communication Research - Phase 1

    DTIC Science & Technology

    2012-04-19

    Validation of one-way pulse stretching radiative transfer code. The objective was to develop and validate time-resolved radiative transfer models that... The models were subjected to a series of validation experiments over 12.5 meter... about the theoretical basis of the model together with validation results can be found in Dalgleish et al. (2010). Forward scattering Mueller...

  13. Study design elements for rigorous quasi-experimental comparative effectiveness research.

    PubMed

    Maciejewski, Matthew L; Curtis, Lesley H; Dowd, Bryan

    2013-03-01

    Quasi-experiments are likely to be the workhorse study design used to generate evidence about the comparative effectiveness of alternative treatments, because of their feasibility, timeliness, affordability and external validity compared with randomized trials. In this review, we outline potential sources of discordance in results between quasi-experiments and experiments, review study design choices that can improve the internal validity of quasi-experiments, and outline innovative data linkage strategies that may be particularly useful in quasi-experimental comparative effectiveness research. There is an urgent need to resolve the debate about the evidentiary value of quasi-experiments since equal consideration of rigorous quasi-experiments will broaden the base of evidence that can be brought to bear in clinical decision-making and governmental policy-making.

  14. Ego-Dissolution and Psychedelics: Validation of the Ego-Dissolution Inventory (EDI)

    PubMed Central

    Nour, Matthew M.; Evans, Lisa; Nutt, David; Carhart-Harris, Robin L.

    2016-01-01

    Aims: The experience of a compromised sense of “self”, termed ego-dissolution, is a key feature of the psychedelic experience. This study aimed to validate the Ego-Dissolution Inventory (EDI), a new 8-item self-report scale designed to measure ego-dissolution. Additionally, we aimed to investigate the specificity of the relationship between psychedelics and ego-dissolution. Method: Sixteen items relating to altered ego-consciousness were included in an internet questionnaire; eight relating to the experience of ego-dissolution (comprising the EDI), and eight relating to the antithetical experience of increased self-assuredness, termed ego-inflation. Items were rated using a visual analog scale. Participants answered the questionnaire for experiences with classical psychedelic drugs, cocaine and/or alcohol. They also answered the seven questions from the Mystical Experiences Questionnaire (MEQ) relating to the experience of unity with one’s surroundings. Results: Six hundred and ninety-one participants completed the questionnaire, providing data for 1828 drug experiences (1043 psychedelics, 377 cocaine, 408 alcohol). Exploratory factor analysis demonstrated that the eight EDI items loaded exclusively onto a single common factor, which was orthogonal to a second factor comprised of the items relating to ego-inflation (rho = −0.110), demonstrating discriminant validity. The EDI correlated strongly with the MEQ-derived measure of unitive experience (rho = 0.735), demonstrating convergent validity. EDI internal consistency was excellent (Cronbach’s alpha 0.93). Three analyses confirmed the specificity of ego-dissolution for experiences occasioned by psychedelic drugs. Firstly, EDI score correlated with drug-dose for psychedelic drugs (rho = 0.371), but not for cocaine (rho = 0.115) or alcohol (rho = −0.055). Secondly, the linear regression line relating the subjective intensity of the experience to ego-dissolution was significantly steeper for psychedelics (unstandardized regression coefficient = 0.701) compared with cocaine (0.135) or alcohol (0.144). Ego-inflation, by contrast, was specifically associated with cocaine experiences. Finally, a binary Support Vector Machine classifier identified experiences occasioned by psychedelic drugs vs. cocaine or alcohol with over 85% accuracy using ratings of ego-dissolution and ego-inflation alone. Conclusion: Our results demonstrate the psychometric structure, internal consistency and construct validity of the EDI. Moreover, we demonstrate the close relationship between ego-dissolution and the psychedelic experience. The EDI will facilitate the study of the neuronal correlates of ego-dissolution, which is relevant for psychedelic-assisted psychotherapy and our understanding of psychosis. PMID:27378878
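
    The final classification step described above (separating psychedelic from non-psychedelic reports using ego-dissolution and ego-inflation ratings alone) can be sketched with a linear support vector machine. The example below uses synthetic ratings with assumed group means, so it illustrates the procedure only and does not reproduce the reported ~85% accuracy.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    n = 300
    is_psychedelic = rng.random(n) < 0.5
    # Synthetic visual-analog ratings: higher ego-dissolution for psychedelic reports,
    # higher ego-inflation otherwise (assumed group means, not study data).
    ego_dissolution = np.where(is_psychedelic, 70, 30) + rng.normal(0, 15, n)
    ego_inflation = np.where(is_psychedelic, 25, 55) + rng.normal(0, 15, n)
    X = np.column_stack([ego_dissolution, ego_inflation])
    y = is_psychedelic.astype(int)

    clf = SVC(kernel="linear")
    print("cross-validated accuracy:", round(cross_val_score(clf, X, y, cv=5).mean(), 2))
    ```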

  15. Validation of the updated ArthroS simulator: face and construct validity of a passive haptic virtual reality simulator with novel performance metrics.

    PubMed

    Garfjeld Roberts, Patrick; Guyver, Paul; Baldwin, Mathew; Akhtar, Kash; Alvand, Abtin; Price, Andrew J; Rees, Jonathan L

    2017-02-01

    To assess the construct and face validity of ArthroS, a passive haptic VR simulator. A secondary aim was to evaluate the novel performance metrics produced by this simulator. Two groups of 30 participants, each divided into novice, intermediate or expert based on arthroscopic experience, completed three separate tasks on either the knee or shoulder module of the simulator. Performance was recorded using 12 automatically generated performance metrics and video footage of the arthroscopic procedures. The videos were blindly assessed using a validated global rating scale (GRS). Participants completed a survey about the simulator's realism and training utility. This new simulator demonstrated construct validity of its tasks when evaluated against a GRS (p ≤ 0.003 in all cases). Regarding its automatically generated performance metrics, established outputs such as time taken (p ≤ 0.001) and instrument path length (p ≤ 0.007) also demonstrated good construct validity. However, two-thirds of the proposed 'novel metrics' the simulator reports could not distinguish participants based on arthroscopic experience. Face validity assessment rated the simulator as a realistic and useful tool for trainees, but the passive haptic feedback (a key feature of this simulator) was rated as less realistic. The ArthroS simulator has good task construct validity based on established objective outputs, but some of the novel performance metrics could not distinguish between levels of surgical experience. The passive haptic feedback of the simulator also needs improvement. If simulators could offer automated and validated performance feedback, this would facilitate improvements in the delivery of training by allowing trainees to practise and self-assess.

  16. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    1999-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include ASTER, CERES, MISR, MODIS and MOPITT. In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include vicarious calibration experiments to validate instrument calibration, instrument and cross-platform comparisons, routine collection of high-quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described, with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community. Detailed information about the EOS Terra Validation Program can be found on the EOS Validation Program homepage (http://ospso.gsfc.nasa.gov/validation/valpage.html).

  17. A user-targeted synthesis of the VALUE perfect predictor experiment

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Gutierrez, Jose; Kotlarski, Sven; Hertig, Elke; Wibig, Joanna; Rössler, Ole; Huth, Radan

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. We consider different aspects: (1) marginal aspects such as mean, variance and extremes; (2) temporal aspects such as spell-length characteristics; (3) spatial aspects such as the de-correlation length of precipitation extremes; and (4) multivariate aspects such as the interplay of temperature and precipitation or scale interactions. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur. Experiment 1 (perfect predictors): what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Experiment 2 (global climate model predictors): how well is regional climate represented overall, including errors inherited from the global climate models? Experiment 3 (pseudo reality): do methods fail in representing regional climate change? Here, we present a user-targeted synthesis of the results of the first VALUE experiment. In this experiment, downscaling methods are driven with ERA-Interim reanalysis data to eliminate global climate model errors, over the period 1979-2008. As reference data we use, depending on the question addressed, (1) observations from 86 meteorological stations distributed across Europe; (2) gridded observations at the corresponding 86 locations; or (3) gridded, spatially extended observations for selected European regions. With more than 40 contributing methods, this study is the most comprehensive downscaling inter-comparison project so far. The results clearly indicate that for several aspects the downscaling skill varies considerably between different methods. For specific purposes, some methods can therefore clearly be excluded.
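
    To make the index families above concrete, the sketch below computes one marginal index (mean bias) and one temporal index (mean wet-spell length) for a synthetic pair of observed and downscaled daily precipitation series. The threshold and distributions are assumptions for illustration and are not the VALUE index definitions.

    ```python
    import numpy as np

    def mean_bias(sim: np.ndarray, obs: np.ndarray) -> float:
        """Marginal aspect: bias of the simulated mean relative to observations."""
        return sim.mean() - obs.mean()

    def mean_wet_spell_length(precip: np.ndarray, wet_threshold: float = 1.0) -> float:
        """Temporal aspect: average length of consecutive wet-day runs."""
        wet = precip >= wet_threshold
        spells, run = [], 0
        for is_wet in wet:
            if is_wet:
                run += 1
            elif run:
                spells.append(run)
                run = 0
        if run:
            spells.append(run)
        return float(np.mean(spells)) if spells else 0.0

    rng = np.random.default_rng(5)
    obs = rng.gamma(shape=0.4, scale=6.0, size=10950)   # ~30 years of daily precipitation (synthetic)
    sim = rng.gamma(shape=0.5, scale=5.0, size=10950)   # a downscaled counterpart (synthetic)
    print("mean bias (mm/day):", round(mean_bias(sim, obs), 2))
    print("wet-spell length, obs vs sim:",
          round(mean_wet_spell_length(obs), 2), round(mean_wet_spell_length(sim), 2))
    ```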

  18. The Development and Validation of the Game User Experience Satisfaction Scale (GUESS).

    PubMed

    Phan, Mikki H; Keebler, Joseph R; Chaparro, Barbara S

    2016-12-01

    The aim of this study was to develop and psychometrically validate a new instrument that comprehensively measures video game satisfaction based on key factors. Playtesting is often conducted in the video game industry to help game developers build better games by providing insight into the players' attitudes and preferences. However, quality feedback is difficult to obtain from playtesting sessions without a quality gaming assessment tool. There is a need for a psychometrically validated and comprehensive gaming scale that is appropriate for playtesting and game evaluation purposes. The process of developing and validating this new scale followed current best practices of scale development and validation. As a result, a mixed-method design that consisted of item pool generation, expert review, questionnaire pilot study, exploratory factor analysis (N = 629), and confirmatory factor analysis (N = 729) was implemented. A new instrument measuring video game satisfaction, called the Game User Experience Satisfaction Scale (GUESS), with nine subscales emerged. The GUESS was demonstrated to have content validity, internal consistency, and convergent and discriminant validity. The GUESS was developed and validated based on the assessments of over 450 unique video game titles across many popular genres. Thus, it can be applied across many types of video games in the industry both as a way to assess what aspects of a game contribute to user satisfaction and as a tool to aid in debriefing users on their gaming experience. The GUESS can be administered to evaluate user satisfaction of different types of video games by a variety of users. © 2016, Human Factors and Ergonomics Society.

  19. Measuring Educators' Attitudes and Beliefs about Evaluation: Construct Validity and Reliability of the Teacher Evaluation Experience Scale

    ERIC Educational Resources Information Center

    Reddy, Linda A.; Dudek, Christopher M.; Kettler, Ryan J.; Kurz, Alexander; Peters, Stephanie

    2016-01-01

    This study presents the reliability and validity of the Teacher Evaluation Experience Scale--Teacher Form (TEES-T), a multidimensional measure of educators' attitudes and beliefs about teacher evaluation. Confirmatory factor analyses of data from 583 teachers were conducted on the TEES-T hypothesized five-factor model, as well as on alternative…

  20. Measuring Efficacy Sources: Development and Validation of the Sources of Teacher Efficacy Questionnaire (STEQ) for Chinese Teachers

    ERIC Educational Resources Information Center

    Hoi, Cathy Ka Weng; Zhou, Mingming; Teo, Timothy; Nie, Youyan

    2017-01-01

    The aim of the current study is to develop and validate an instrument to measure the four sources of teacher efficacy among Chinese primary school teachers. A 26-item Sources of Teacher Efficacy Questionnaire (STEQ) was proposed with four subscales: mastery experience, vicarious experience, social persuasion, and physiological arousal. The results…

  1. Evidences of Validity of a Scale for Mapping Professional as Defining Competences and Performance by Brazilian Tutors

    ERIC Educational Resources Information Center

    Coelho, Francisco Antonio, Jr.; Ferreira, Rodrigo Rezende; Paschoal, Tatiane; Faiad, Cristiane; Meneses, Paulo Murce

    2015-01-01

    The purpose of this study was twofold: to assess evidences of construct validity of the Brazilian Scale of Tutors Competences in the field of Open and Distance Learning and to examine if variables such as professional experience, perception of the student's learning performance and prior experience influence the development of technical and…

  2. Modeling the Effects of Argument Length and Validity on Inductive and Deductive Reasoning

    ERIC Educational Resources Information Center

    Rotello, Caren M.; Heit, Evan

    2009-01-01

    In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were…

  3. Test Takers' Beliefs and Experiences of a High-Stakes Computer-Based English Listening and Speaking Test

    ERIC Educational Resources Information Center

    Zhan, Ying; Wan, Zhi Hong

    2016-01-01

    Test takers' beliefs or experiences have been overlooked in most validation studies in language education. Meanwhile, a mutual exclusion has been observed in the literature, with little or no dialogue between validation studies and studies concerning the uses and consequences of testing. To help fill these research gaps, a group of Senior III…

  4. Development and Validation of an Assessment Instrument for Course Experience in a General Education Integrated Science Course

    ERIC Educational Resources Information Center

    Liu, Juhong Christie; St. John, Kristen; Courtier, Anna M. Bishop

    2017-01-01

    Identifying instruments and surveys to address geoscience education research (GER) questions is among the high-ranked needs in a 2016 survey of the GER community (St. John et al., 2016). The purpose of this study was to develop and validate a student-centered assessment instrument to measure course experience in a general education integrated…

  5. The Development of the Functional Literacy Experience Scale Based upon Ecological Theory (FLESBUET) and Validity-Reliability Study

    ERIC Educational Resources Information Center

    Özenç, Emine Gül; Dogan, M. Cihangir

    2014-01-01

    This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…

  6. Predeployment validation of fault-tolerant systems through software-implemented fault insertion

    NASA Technical Reports Server (NTRS)

    Czeck, Edward W.; Siewiorek, Daniel P.; Segall, Zary Z.

    1989-01-01

    The fault injection-based automated testing (FIAT) environment, which can be used to experimentally characterize and evaluate distributed real-time systems under fault-free and faulted conditions, is described. A survey of validation methodologies is presented, and the need for fault insertion within such methodologies is demonstrated. The origins and models of faults, and the motivation for the FIAT concept, are reviewed. FIAT employs a validation methodology which builds confidence in the system by first providing a baseline of fault-free performance data and then characterizing the behavior of the system with faults present. Fault insertion is accomplished through software and allows faults, or the manifestation of faults, to be inserted either by seeding faults into memory or by triggering error-detection mechanisms. FIAT is capable of emulating a variety of fault-tolerant strategies and architectures, can monitor system activity, and can automatically orchestrate experiments involving the insertion of faults. A common system interface provides ease of use and decreases experiment development and run time. Fault models chosen for experiments on FIAT have generated system responses that parallel those observed in real systems under faulty conditions. These capabilities are shown by two example experiments, each using a different fault-tolerance strategy.
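
    A toy illustration of software-implemented fault insertion is sketched below: a single bit is flipped in a memory buffer (seeding the fault) and a checksum-based error-detection mechanism is checked for coverage. This is a schematic analogue of the approach described, not the FIAT implementation.

    ```python
    import random
    import zlib

    def flip_bit(buf: bytearray, bit_index: int) -> None:
        """Seed a fault by flipping a single bit in a memory buffer."""
        buf[bit_index // 8] ^= 1 << (bit_index % 8)

    def detects_fault(buf: bytes, expected_crc: int) -> bool:
        """Error-detection mechanism: a CRC32 check over the buffer."""
        return zlib.crc32(buf) != expected_crc

    random.seed(0)
    data = bytearray(b"baseline fault-free payload" * 4)
    crc = zlib.crc32(bytes(data))

    detected, trials = 0, 1000
    for _ in range(trials):
        corrupted = bytearray(data)
        flip_bit(corrupted, random.randrange(len(corrupted) * 8))  # insert the fault
        detected += detects_fault(bytes(corrupted), crc)
    print(f"faults detected: {detected}/{trials}")  # CRC32 catches every single-bit flip
    ```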

  7. Convergent Validity of the Early Memory Index in Two Primary Care Samples.

    PubMed

    Porcerelli, John H; Cogan, Rosemary; Melchior, Katherine A; Jasinski, Matthew J; Richardson, Laura; Fowler, Shannon; Morris, Pierre; Murdoch, William

    2016-01-01

    Karliner, Westrich, Shedler, and Mayman (1996) developed the Early Memory Index (EMI) to assess mental health, narrative coherence, and traumatic experiences in reports of early memories. We assessed the convergent validity of EMI scales with data from 103 women from an urban primary care clinic (Study 1) and data from 48 women and 24 men from a suburban primary care clinic (Study 2). Patients provided early memory narratives and completed self-report measures of psychopathology, trauma, and health care utilization. In both studies, lower scores on the Mental Health scale and higher scores on the Traumatic Experiences scale were related to higher scores on measures of psychopathology and childhood trauma. Less consistent associations were found between the Mental Health and Traumatic Experiences scores and measures of health care utilization. The Narrative Coherence scale showed inconsistent relationships across measures in both samples. In analyses assessing the overall fit between hypothesized and actual correlations between EMI scores and measures of psychopathology, severity of trauma symptoms, and health care utilization, the Mental Health scale of the EMI demonstrated stronger convergent validity than the EMI Traumatic Experiences scale. The results provide support for the convergent validity of the Mental Health scale of the EMI.

  8. Panamanian women's experience of vaginal examination in labour: A questionnaire validation.

    PubMed

    Bonilla-Escobar, Francisco J; Ortega-Lenis, Delia; Rojas-Mirquez, Johanna C; Ortega-Loubon, Christian

    2016-05-01

    To validate a tool that allows healthcare providers to obtain accurate information regarding Panamanian women's thoughts and feelings about vaginal examination during labour, and that can be used in other Latin-American countries. This was a validation study based on a database from a cross-sectional study carried out in two tertiary care hospitals in Panama City, Panama. Women in the immediate postpartum period who had spontaneous labour onset and uncomplicated deliveries were included in the study from April to August 2008. Researchers used a survey designed by Lewin et al. that included 20 questions related to a patient's experience during a vaginal examination. Five constructs (factors) related to a patient's experience of vaginal examination during labour were identified: Approval (Cronbach's alpha = 0.72), Perception (0.67), Rejection (0.40), Consent (0.51), and Stress (0.20). The validity of the scale and its constructs, used to obtain information related to vaginal examination during labour, including patients' experiences with the examination and healthcare staff performance, was demonstrated. Utilisation of the scale will allow institutions to identify items that need improvement and to address these areas in order to promote the best care for patients in labour. Copyright © 2016 Elsevier Ltd. All rights reserved.
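
    The per-construct internal-consistency figures quoted above are Cronbach's alpha values; the sketch below shows the standard computation on synthetic item responses. The data, item count and construct name are assumptions for illustration.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars / total_var)

    rng = np.random.default_rng(6)
    latent = rng.normal(size=(250, 1))                         # shared construct, e.g. "Approval"
    responses = latent + rng.normal(scale=1.0, size=(250, 5))  # five items loading on it
    print(round(cronbach_alpha(responses), 2))
    ```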

  9. Assessing birth experience in fathers as an important aspect of clinical obstetrics: how applicable is Salmon's Item List for men?

    PubMed

    Gawlik, Stephanie; Müller, Mitho; Hoffmann, Lutz; Dienes, Aimée; Reck, Corinna

    2015-01-01

    Validated questionnaire assessment of fathers' experiences during childbirth is lacking in routine clinical practice. Salmon's Item List is a short, validated method used for the assessment of birth experience in mothers in both English- and German-speaking communities. With little to no validated data available for fathers, this pilot study aimed to assess the applicability of the German version of Salmon's Item List, including a multidimensional birth experience concept, in fathers. This was a longitudinal study; data were collected by questionnaires at a university hospital in Germany. The birth experiences of 102 fathers were assessed four to six weeks post partum using the German version of Salmon's Item List. Construct validity testing with exploratory factor analysis, using principal component analysis with varimax rotation, was performed to identify the dimensions of childbirth experiences. Internal consistency was also analysed. Factor analysis yielded a four-factor solution comprising 17 items that accounted for 54.5% of the variance. The main domain was 'fulfilment', and the secondary domains were 'emotional distress', 'physical discomfort' and 'emotional adaption'. For fulfilment, Cronbach's α met conventional reliability standards (0.87). Salmon's Item List is an appropriate instrument to assess birth experience in fathers in terms of fulfilment. Larger samples need to be examined in order to prove the stability of the factor structure before this can be extended to routine clinical assessment. A reduced version of Salmon's Item List may be useful as a screening tool for general assessment. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. The Sensed Presence Questionnaire (SenPQ): initial psychometric validation of a measure of the “Sensed Presence” experience

    PubMed Central

    Bell, Vaughan

    2017-01-01

    Background The experience of ‘sensed presence’—a feeling or sense that another entity, individual or being is present despite no clear sensory or perceptual evidence—is known to occur in the general population, appears more frequently in religious or spiritual contexts, and seems to be prominent in certain psychiatric or neurological conditions and may reflect specific functions of social cognition or body-image representation systems in the brain. Previous research has relied on ad-hoc measures of the experience and no specific psychometric scale to measure the experience exists to date. Methods Based on phenomenological description in the literature, we created the 16-item Sensed Presence Questionnaire (SenPQ). We recruited participants from (i) a general population sample, and; (ii) a sample including specific selection for religious affiliation, to complete the SenPQ and additional measures of well-being, schizotypy, social anxiety, social imagery, and spiritual experience. We completed an analysis to test internal reliability, the ability of the SenPQ to distinguish between religious and non-religious participants, and whether the SenPQ was specifically related to positive schizotypical experiences and social imagery. A factor analysis was also conducted to examine underlying latent variables. Results The SenPQ was found to be reliable and valid, with religious participants significantly endorsing more items than non-religious participants, and the scale showing a selective relationship with construct relevant measures. Principal components analysis indicates two potential underlying factors interpreted as reflecting ‘benign’ and ‘malign’ sensed presence experiences. Discussion The SenPQ appears to be a reliable and valid measure of sensed presence experience although further validation in neurological and psychiatric conditions is warranted. PMID:28367379

  11. Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.

    We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters, as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.
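
    A minimal, grid-based sketch of the two-phase Bayesian idea described above, calibrating a model parameter on one data set and then checking a predicted quantity of interest against held-out validation data, is given below. The linear model, noise level and parameter grid are assumptions for illustration, not the authors' framework.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    theta_true, noise = 2.0, 0.3

    def model(theta, x):
        """Toy phenomenological model: y = theta * x."""
        return theta * x

    # Calibration phase: posterior over theta on a coarse grid.
    x_cal = np.linspace(0.1, 1.0, 10)
    y_cal = model(theta_true, x_cal) + rng.normal(0, noise, x_cal.size)
    grid = np.linspace(0.0, 4.0, 801)
    log_post = np.array([stats.norm.logpdf(y_cal, model(t, x_cal), noise).sum() for t in grid])
    post = np.exp(log_post - log_post.max())
    post /= np.trapz(post, grid)

    # Validation phase: compare the predictive QoI with a held-out observation.
    x_val = 1.5
    qoi_pred_mean = np.trapz(model(grid, x_val) * post, grid)
    y_val = model(theta_true, x_val) + rng.normal(0, noise)
    print(f"predicted QoI {qoi_pred_mean:.2f}, validation observation {y_val:.2f}")
    ```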

  12. The 1-min Screening Test for Reading Problems in College Students: Psychometric Properties of the 1-min TIL.

    PubMed

    Fernandes, Tânia; Araújo, Susana; Sucena, Ana; Reis, Alexandra; Castro, São Luís

    2017-02-01

    Reading is a central cognitive domain, but little research has been devoted to standardized tests for adults. We thus examined the psychometric properties of the 1-min version of the Teste de Idade de Leitura (Reading Age Test; 1-min TIL), the Portuguese version of the Lobrot L3 test, in three experiments with college students: typical readers in Experiments 1A and 1B, dyslexic readers and chronological age controls in Experiment 2. In Experiment 1A, test-retest reliability and convergent validity were evaluated in 185 students. Reliability was >.70, and phonological decoding underpinned 1-min TIL. In Experiment 1B, internal consistency was assessed by presenting two 45-s versions of the test to 19 students, and performance in these versions was significantly associated (r = .78). In Experiment 2, the construct validity, criterion validity and clinical utility of the 1-min TIL were investigated. A multiple regression analysis corroborated construct validity; both phonological decoding and listening comprehension were reliable predictors of 1-min TIL scores. Logistic regression and receiver operating characteristic (ROC) analyses revealed the high accuracy of this test in distinguishing dyslexic from typical readers. Therefore, the 1-min TIL, which assesses reading comprehension and potential reading difficulties in college students, has the necessary psychometric properties to become a useful screening instrument in neuropsychological assessment and research. Copyright © 2017 John Wiley & Sons, Ltd.
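
    The discrimination analysis mentioned above (logistic regression plus an ROC curve separating dyslexic from typical readers by their 1-min TIL score) can be sketched as follows. The scores are synthetic and the group means are assumptions, so only the procedure, not the reported accuracy, is illustrated.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(8)
    # Synthetic 1-min reading scores: dyslexic readers (label 1) score lower on average.
    scores = np.concatenate([rng.normal(30, 5, 60), rng.normal(18, 5, 30)])
    group = np.concatenate([np.zeros(60), np.ones(30)])

    clf = LogisticRegression().fit(scores.reshape(-1, 1), group)
    prob_dyslexic = clf.predict_proba(scores.reshape(-1, 1))[:, 1]
    print("ROC AUC:", round(roc_auc_score(group, prob_dyslexic), 2))
    ```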

  13. Examining students' views about validity of experiments: From introductory to Ph.D. students

    NASA Astrophysics Data System (ADS)

    Hu, Dehui; Zwickl, Benjamin M.

    2018-06-01

    We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, upper-division physics labs, and physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.

  14. The Validity and Incremental Validity of Knowledge Tests, Low-Fidelity Simulations, and High-Fidelity Simulations for Predicting Job Performance in Advanced-Level High-Stakes Selection

    ERIC Educational Resources Information Center

    Lievens, Filip; Patterson, Fiona

    2011-01-01

    In high-stakes selection among candidates with considerable domain-specific knowledge and experience, investigations of whether high-fidelity simulations (assessment centers; ACs) have incremental validity over low-fidelity simulations (situational judgment tests; SJTs) are lacking. Therefore, this article integrates research on the validity of…

  15. Construction and Initial Validation of the Multiracial Experiences Measure (MEM)

    PubMed Central

    Yoo, Hyung Chol; Jackson, Kelly; Guevarra, Rudy P.; Miller, Matthew J.; Harrington, Blair

    2015-01-01

    This article describes the development and validation of the Multiracial Experiences Measure (MEM): a new measure that assesses uniquely racialized risks and resiliencies experienced by individuals of mixed racial heritage. Across two studies, there was evidence for the validation of the 25-item MEM with 5 subscales including Shifting Expressions, Perceived Racial Ambiguity, Creating Third Space, Multicultural Engagement, and Multiracial Discrimination. The 5-subscale structure of the MEM was supported by a combination of exploratory and confirmatory factor analyses. Evidence of criterion-related validity was partially supported with MEM subscales correlating with measures of racial diversity in one’s social network, color-blind racial attitude, psychological distress, and identity conflict. Evidence of discriminant validity was supported with MEM subscales not correlating with impression management. Implications for future research and suggestions for utilization of the MEM in clinical practice with multiracial adults are discussed. PMID:26460977

  16. Construction and initial validation of the Multiracial Experiences Measure (MEM).

    PubMed

    Yoo, Hyung Chol; Jackson, Kelly F; Guevarra, Rudy P; Miller, Matthew J; Harrington, Blair

    2016-03-01

    This article describes the development and validation of the Multiracial Experiences Measure (MEM): a new measure that assesses uniquely racialized risks and resiliencies experienced by individuals of mixed racial heritage. Across 2 studies, there was evidence for the validation of the 25-item MEM with 5 subscales including Shifting Expressions, Perceived Racial Ambiguity, Creating Third Space, Multicultural Engagement, and Multiracial Discrimination. The 5-subscale structure of the MEM was supported by a combination of exploratory and confirmatory factor analyses. Evidence of criterion-related validity was partially supported with MEM subscales correlating with measures of racial diversity in one's social network, color-blind racial attitude, psychological distress, and identity conflict. Evidence of discriminant validity was supported with MEM subscales not correlating with impression management. Implications for future research and suggestions for utilization of the MEM in clinical practice with multiracial adults are discussed. (c) 2016 APA, all rights reserved).

  17. Directed Design of Experiments for Validating Probability of Detection Capability of a Testing System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2012-01-01

    A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.
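
    As an illustration of the kind of processing the input data set undergoes, the sketch below bins hypothetical hit/miss observations by flaw size and reports the observed probability of detection per class; the binning rule, variable names, and example data are assumptions for illustration only, not the patented directed-DOE procedure.

      # Minimal sketch: observed POD per flaw-size class from hit/miss data
      # (illustrative only; not the directed-DOE algorithm described above).
      import numpy as np

      def pod_by_class(flaw_sizes, hits, class_width):
          """Bin hit/miss observations into flaw-size classes of a given width
          and return (class centre, observed POD, sample count) per class."""
          flaw_sizes = np.asarray(flaw_sizes, dtype=float)
          hits = np.asarray(hits, dtype=float)      # 1 = hit, 0 = miss
          edges = np.arange(flaw_sizes.min(), flaw_sizes.max() + class_width, class_width)
          result = []
          for lo, hi in zip(edges[:-1], edges[1:]):
              mask = (flaw_sizes >= lo) & (flaw_sizes < hi)
              if mask.any():
                  result.append(((lo + hi) / 2, hits[mask].mean(), int(mask.sum())))
          return result

      # Hypothetical data: detection becomes more reliable as flaw size grows.
      sizes = [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
      hits = [0, 0, 1, 0, 1, 1, 1, 1]
      print(pod_by_class(sizes, hits, class_width=0.2))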

  18. Observing System Simulation Experiments

    NASA Technical Reports Server (NTRS)

    Prive, Nikki

    2015-01-01

    This presentation gives an overview of Observing System Simulation Experiments (OSSEs). The components of an OSSE are described, along with a discussion of the process for validating, calibrating, and performing experiments.

  19. Moving to Capture Children’s Attention: Developing a Methodology for Measuring Visuomotor Attention

    PubMed Central

    Coats, Rachel O.; Mushtaq, Faisal; Williams, Justin H. G.; Aucott, Lorna S.; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child’s development. However, methodological limitations currently make large-scale assessment of children’s attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of ‘Visual Motor Attention’ (VMA)—a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method’s core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults’ attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action). PMID:27434198

  20. The Play Experience Scale: development and validation of a measure of play.

    PubMed

    Pavlas, Davin; Jentsch, Florian; Salas, Eduardo; Fiore, Stephen M; Sims, Valerie

    2012-04-01

    A measure of play experience in video games was developed through literature review and two empirical validation studies. Despite the considerable attention given to games in the behavioral sciences, play experience remains empirically underexamined. One reason for this gap is the absence of a scale that measures play experience. In Study 1, the initial Play Experience Scale (PES) was tested through an online validation that featured three different games (N = 203). In Study 2, a revised PES was assessed with a serious game in the laboratory (N = 77). Through principal component analysis of the Study 1 data, the initial 20-item PES was revised, resulting in the 16-item PES-16. Study 2 showed the PES-16 to be a robust instrument with the same patterns of correlations as in Study 1 via (a) internal consistency estimates, (b) correlations with established scales of motivation, (c) distributions of PES-16 scores in different game conditions, and (d) examination of the average variance extracted of the PES and the Intrinsic Motivation Scale. We suggest that the PES is appropriate for use in further validation studies. Additional examinations of the scale are required to determine its applicability to other contexts and its relationship with other constructs. The PES is potentially relevant to human factors undertakings involving video games, including basic research into play, games, and learning; prototype testing; and exploratory learning studies.

  1. The Dimensionality of Reasoning: Inductive and Deductive Inference can be Explained by a Single Process.

    PubMed

    Hayes, Brett K; Stephens, Rachel G; Ngo, Jeremy; Dunn, John C

    2018-02-01

    Three experiments examined the number of qualitatively different processing dimensions needed to account for inductive and deductive reasoning. In each study, participants were presented with arguments that varied in logical validity and consistency with background knowledge (believability), and evaluated them according to deductive criteria (whether the conclusion was necessarily true given the premises) or inductive criteria (whether the conclusion was plausible given the premises). We examined factors including working memory load (Experiments 1 and 2), individual working memory capacity (Experiments 1 and 2), and decision time (Experiment 3), which, according to dual-processing theories, modulate the contribution of heuristic and analytic processes to reasoning. A number of empirical dissociations were found. Argument validity affected deduction more than induction. Argument believability affected induction more than deduction. Lower working memory capacity reduced sensitivity to argument validity and increased sensitivity to argument believability, especially under induction instructions. Reduced decision time led to decreased sensitivity to argument validity. State-trace analyses of each experiment, however, found that only a single underlying dimension was required to explain patterns of inductive and deductive judgments. These results show that the dissociations, which have traditionally been seen as supporting dual-processing models of reasoning, are consistent with a single-process model that assumes a common evidentiary scale for induction and deduction. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. Validation results of satellite mock-up capturing experiment using nets

    NASA Astrophysics Data System (ADS)

    Medina, Alberto; Cercós, Lorenzo; Stefanescu, Raluca M.; Benvenuto, Riccardo; Pesce, Vincenzo; Marcon, Marco; Lavagna, Michèle; González, Iván; Rodríguez López, Nuria; Wormnes, Kjetil

    2017-05-01

    The PATENDER activity (Net parametric characterization and parabolic flight), funded by the European Space Agency (ESA) via its Clean Space initiative, aimed to validate a simulation tool for designing nets for capturing space debris. This validation was performed through a set of experiments under microgravity conditions in which a net was launched to capture and wrap a satellite mock-up. This paper presents the architecture of the thrown-net dynamics simulator together with the set-up of the deployment experiment and its trajectory reconstruction results on a parabolic flight (Novespace A-310, June 2015). The simulator has been implemented within the Blender framework in order to provide a highly configurable tool able to reproduce different scenarios for Active Debris Removal missions. The experiment was performed over thirty parabolas offering around 22 s of zero-g conditions. A flexible meshed fabric structure (the net), ejected from a container and propelled by corner masses (the bullets) arranged around its circumference, was launched at different initial velocities and launch angles using a dedicated pneumatic mechanism (representing the chaser satellite) against a target mock-up (the target satellite). High-speed motion cameras recorded the experiment, allowing 3D reconstruction of the net motion. The net knots were coloured to allow post-processing of the images using colour segmentation, stereo matching, and iterative closest point (ICP) matching for knot tracking. The final objective of the activity was the validation of the net deployment and wrapping simulator using images recorded during the parabolic flight. The high-resolution images acquired were post-processed to accurately determine the initial conditions and to generate the reference data (position and velocity of all knots of the net along its deployment and wrapping of the target mock-up) for the simulator validation. The simulator was configured according to the parabolic flight scenario and executed to generate the validation data. Both datasets were then compared according to different metrics to validate the PATENDER simulator.
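
    The knot-tracking step mentioned above relies on iterative closest point registration. A generic, self-contained rigid ICP sketch is shown below; it assumes roughly overlapping point clouds and purely rigid motion, and is a textbook illustration rather than the PATENDER image-processing pipeline.

      # Generic rigid ICP sketch (assumption: textbook algorithm, not the
      # actual PATENDER knot-tracking implementation).
      import numpy as np
      from scipy.spatial import cKDTree

      def best_rigid_transform(src, dst):
          """Kabsch: least-squares rotation R and translation t mapping src -> dst."""
          src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
          H = (src - src_c).T @ (dst - dst_c)
          U, _, Vt = np.linalg.svd(H)
          R = Vt.T @ U.T
          if np.linalg.det(R) < 0:          # avoid reflections
              Vt[-1, :] *= -1
              R = Vt.T @ U.T
          t = dst_c - R @ src_c
          return R, t

      def icp(source, target, n_iter=50, tol=1e-8):
          """Match each source point to its nearest target point and re-estimate
          the rigid transform until the mean error stops improving."""
          src = source.copy()
          tree = cKDTree(target)
          prev_err = np.inf
          for _ in range(n_iter):
              dist, idx = tree.query(src)
              R, t = best_rigid_transform(src, target[idx])
              src = src @ R.T + t
              err = dist.mean()
              if abs(prev_err - err) < tol:
                  break
              prev_err = err
          return src

      # Example: recover a small known rotation/translation of a random cloud.
      rng = np.random.default_rng(0)
      cloud = rng.normal(size=(200, 3))
      angle = 0.1
      R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                         [np.sin(angle),  np.cos(angle), 0],
                         [0, 0, 1]])
      moved = cloud @ R_true.T + np.array([0.05, -0.02, 0.03])
      aligned = icp(cloud, moved)
      print(np.abs(aligned - moved).max())   # alignment error (should be small)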

  3. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    PubMed

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Given the high stakes in the clinical setting, it is critical to quantify the effect of these assumptions on the CFD simulation results. Existing CFD validation approaches, however, do not quantify error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software star-ccm+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the star-ccm+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
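
    A minimal sketch of the final comparison step is given below, under the simplifying assumption that CFD and PIV velocities have already been interpolated onto matched points along the validation line; the numbers are hypothetical, and the full methodology additionally parses out numerical and experimental error sources before quoting a model error.

      # Minimal sketch of comparing CFD and PIV velocities sampled along a
      # common validation line (hypothetical data; not the paper's full method).
      import numpy as np

      def mean_relative_error(v_cfd, v_piv):
          """Average point-wise relative difference (%) between two velocity profiles."""
          v_cfd, v_piv = np.asarray(v_cfd), np.asarray(v_piv)
          return 100.0 * np.mean(np.abs(v_cfd - v_piv) / np.abs(v_piv))

      # Hypothetical velocity magnitudes (m/s) at matched points on the line.
      v_piv = np.array([0.12, 0.18, 0.25, 0.22, 0.15])
      v_cfd = np.array([0.13, 0.17, 0.26, 0.21, 0.16])
      print(f"mean relative error: {mean_relative_error(v_cfd, v_piv):.2f}%")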

  4. Disruption Tolerant Networking Flight Validation Experiment on NASA's EPOXI Mission

    NASA Technical Reports Server (NTRS)

    Wyatt, Jay; Burleigh, Scott; Jones, Ross; Torgerson, Leigh; Wissler, Steve

    2009-01-01

    In October and November of 2008, the Jet Propulsion Laboratory installed and tested essential elements of Delay/Disruption Tolerant Networking (DTN) technology on the Deep Impact spacecraft. This experiment, called Deep Impact Network Experiment (DINET), was performed in close cooperation with the EPOXI project which has responsibility for the spacecraft. During DINET some 300 images were transmitted from the JPL nodes to the spacecraft. Then they were automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. All transmitted bundles were successfully received, without corruption. The DINET experiment demonstrated DTN readiness for operational use in space missions. This activity was part of a larger NASA space DTN development program to mature DTN to flight readiness for a wide variety of mission types by the end of 2011. This paper describes the DTN protocols, the flight demo implementation, validation metrics which were created for the experiment, and validation results.

  5. Contrasting Predictions of the Extended Comparator Hypothesis and Acquisition-Focused Models of Learning Concerning Retrospective Revaluation

    PubMed Central

    McConnell, Bridget L.; Urushihara, Kouji; Miller, Ralph R.

    2009-01-01

    Three conditioned suppression experiments with rats investigated contrasting predictions made by the extended comparator hypothesis and acquisition-focused models of learning, specifically, modified SOP and the revised Rescorla-Wagner model, concerning retrospective revaluation. Two target cues (X and Y) were partially reinforced using a stimulus relative validity design (i.e., AX-Outcome/ BX-No outcome/ CY-Outcome/ DY-No outcome), and subsequently one of the companion cues for each target was extinguished in compound (BC-No outcome). In Experiment 1, which used spaced trials for relative validity training, greater suppression was observed to target cue Y for which the excitatory companion cue had been extinguished relative to target cue X for which the nonexcitatory companion cue had been extinguished. Experiment 2 replicated these results in a sensory preconditioning preparation. Experiment 3 massed the trials during relative validity training, and the opposite pattern of data was observed. The results are consistent with the predictions of the extended comparator hypothesis. Furthermore, this set of experiments is unique in being able to differentiate between these models without invoking higher-order comparator processes. PMID:20141324

  6. Examining the Internal Validity and Statistical Precision of the Comparative Interrupted Time Series Design by Comparison with a Randomized Experiment

    ERIC Educational Resources Information Center

    St.Clair, Travis; Cook, Thomas D.; Hallberg, Kelly

    2014-01-01

    Although evaluators often use an interrupted time series (ITS) design to test hypotheses about program effects, there are few empirical tests of the design's validity. We take a randomized experiment on an educational topic and compare its effects to those from a comparative ITS (CITS) design that uses the same treatment group as the experiment…

  7. A Performance Management Framework for Civil Engineering

    DTIC Science & Technology

    1990-09-01

    cultural change. A non-equivalent control group design was chosen to augment the case analysis. Figure 3.18 shows the form of the quasi-experiment. ... The non-equivalent control group design controls the following obstacles to internal validity: history, maturation, testing, and instrumentation. ... and Stanley, 1963:48,50) Table 7. Validity of Quasi-Experiment. The non-equivalent control group experimental design controls the following obstacles to ...

  8. Validation of the Italian version of the dissociative experience scale for adolescents and young adults.

    PubMed

    De Pasquale, Concetta; Sciacca, Federica; Hichy, Zira

    2016-01-01

    The Dissociative Experience Scale for adolescents (A-DES), a 30-item, multidimensional, self-administered questionnaire, was originally validated using a large sample of American young people. We report the linguistic validation process and the metric validity of the Italian version of the A-DES. A set of questionnaires was administered to a total of 633 participants from March 2015 to April 2016. The participants consisted of 282 boys and 351 girls aged between 18 and 24 years. The translation process consisted of two consecutive steps: forward-backward translation and acceptability testing. The psychometric testing was applied to Italian students recruited from public schools and universities in Sicily. Informed consent was obtained from all participants in the research. All individuals completed the A-DES, and reliability and validity were tested. The translated version was validated on a total of 633 Italian students. The reliability of the A-DES total score is .926. The scale is composed of 4 subscales: Dissociative amnesia, Absorption and imaginative involvement, Depersonalization and derealization, and Passive influence. The reliability of each subscale is .756 for dissociative amnesia, .659 for absorption and imaginative involvement, .850 for depersonalization and derealization, and .743 for passive influence. The Italian version of the A-DES constitutes a useful instrument to measure dissociative experiences in adolescents and young adults in Italy.

  9. Remote Patron Validation: Posting a Proxy Server at the Digital Doorway.

    ERIC Educational Resources Information Center

    Webster, Peter

    2002-01-01

    Discussion of remote access to library services focuses on proxy servers as a method for remote access, based on experiences at Saint Mary's University (Halifax). Topics include Internet protocol user validation; browser-directed proxies; server software proxies; vendor alternatives for validating remote users; and Internet security issues. (LRW)

  10. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  11. Construct validity of individual and summary performance metrics associated with a computer-based laparoscopic simulator.

    PubMed

    Rivard, Justin D; Vergis, Ashley S; Unger, Bertram J; Hardy, Krista M; Andrew, Chris G; Gillman, Lawrence M; Park, Jason

    2014-06-01

    Computer-based surgical simulators capture a multitude of metrics based on different aspects of performance, such as speed, accuracy, and movement efficiency. However, without rigorous assessment, it may be unclear whether all, some, or none of these metrics actually reflect technical skill, which can compromise educational efforts on these simulators. We assessed the construct validity of individual performance metrics on the LapVR simulator (Immersion Medical, San Jose, CA, USA) and used these data to create task-specific summary metrics. Medical students with no prior laparoscopic experience (novices, N = 12), junior surgical residents with some laparoscopic experience (intermediates, N = 12), and experienced surgeons (experts, N = 11) all completed three repetitions of four LapVR simulator tasks. The tasks included three basic skills (peg transfer, cutting, clipping) and one procedural skill (adhesiolysis). We selected 36 individual metrics on the four tasks that assessed six different aspects of performance, including speed, motion path length, respect for tissue, accuracy, task-specific errors, and successful task completion. Four of seven individual metrics assessed for peg transfer, six of ten metrics for cutting, four of nine metrics for clipping, and three of ten metrics for adhesiolysis discriminated between experience levels. Time and motion path length were significant on all four tasks. We used the validated individual metrics to create summary equations for each task, which successfully distinguished between the different experience levels. Educators should maintain some skepticism when reviewing the plethora of metrics captured by computer-based simulators, as some but not all are valid. We showed the construct validity of a limited number of individual metrics and developed summary metrics for the LapVR. The summary metrics provide a succinct way of assessing skill with a single metric for each task, but require further validation.
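
    The published summary equations are not reproduced here, but one common way to combine several validated metrics into a single task score is to z-score each metric against the whole sample, flip the sign of metrics where lower values indicate better performance, and average; the sketch below illustrates that generic approach with made-up numbers, not the LapVR summary equations.

      # Generic sketch of combining validated individual metrics into one
      # summary score per task (z-score and average; not the authors' equations).
      import numpy as np

      def summary_score(metrics, higher_is_better):
          """metrics: (n_trainees, n_metrics) array of validated metric values.
          higher_is_better: boolean per metric (False for time, path length, errors)."""
          metrics = np.asarray(metrics, dtype=float)
          z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0, ddof=1)
          signs = np.where(higher_is_better, 1.0, -1.0)
          return (z * signs).mean(axis=1)   # one summary value per trainee

      # Hypothetical peg-transfer metrics: completion time (s), path length (cm).
      data = np.array([[95.0, 310.0],    # novice
                       [70.0, 240.0],    # intermediate
                       [48.0, 180.0]])   # expert
      print(summary_score(data, higher_is_better=[False, False]))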

  12. Development and psychometric testing of the rural pregnancy experience scale (RPES).

    PubMed

    Kornelsen, Jude; Stoll, Kathrin; Grzybowski, Stefan

    2011-01-01

    Rural pregnant women who lack local access to maternity care due to their remote living circumstances may experience stress and anxiety related to pregnancy and parturition. The Rural Pregnancy Experience Scale (RPES) was designed to assess the unique worries and concerns reflective of the stress and anxiety of rural pregnant women related to pregnancy and parturition. The items of the scale were designed based on the results of a qualitative study of the experiences of pregnant rural women, thereby building a priori content validity into the measure. The relevancy content validity index (CVI) for this instrument was 1.0 and the clarity CVI was .91, as rated by maternity care specialists. A field test of the RPES with 187 pregnant rural women from British Columbia indicated that it had two factors: financial worries and worries/concerns about maternity care services, which were consistent with the conceptual base of the tool. Cronbach's alpha for the total RPES was .91; for the financial worries subscale and the worries/concerns about maternity care services subscale, alphas were .89 and .88, respectively. Construct validity was supported by significant correlations between the total scores of the RPES and the Depression Anxiety Stress Scales (DASS; r = .39, p < .01), and subscale scores on the RPES were significantly correlated and converged with the depression, anxiety, and stress subscales of the DASS, supporting convergent validity (correlations ranged between .20, p < .05, and .43, p < .01). Construct validity was also supported by findings that the level of access and availability of maternity care services were significantly associated with RPES scores. It was concluded that the RPES is a reliable and valid measure of worries and concerns reflective of rural pregnant women's stress and anxiety related to pregnancy and parturition.
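
    The internal-consistency figures quoted above are Cronbach's alpha values, which follow directly from an item-response matrix via the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The sketch below computes it for made-up Likert responses, not the RPES data.

      # Cronbach's alpha from a respondents-by-items matrix (standard formula,
      # illustrated with made-up responses; not the RPES data).
      import numpy as np

      def cronbach_alpha(item_scores):
          """item_scores: (n_respondents, n_items) matrix of item responses."""
          x = np.asarray(item_scores, dtype=float)
          k = x.shape[1]
          item_var = x.var(axis=0, ddof=1).sum()
          total_var = x.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1.0 - item_var / total_var)

      # Hypothetical 5-point Likert responses, 6 respondents x 4 items.
      responses = np.array([[4, 5, 4, 5],
                            [2, 2, 3, 2],
                            [5, 4, 5, 5],
                            [3, 3, 3, 4],
                            [1, 2, 1, 2],
                            [4, 4, 5, 4]])
      print(round(cronbach_alpha(responses), 2))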

  13. Improved understanding of geologic CO2 storage processes requires risk-driven field experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oldenburg, C.M.

    2011-06-01

    The need for risk-driven field experiments for CO2 geologic storage processes to complement ongoing pilot-scale demonstrations is discussed. These risk-driven field experiments would be aimed at understanding the circumstances under which things can go wrong with a CO2 capture and storage (CCS) project and cause it to fail, as distinguished from accomplishing this end using demonstration- and industrial-scale sites. Such risk-driven tests would complement risk-assessment efforts that have already been carried out by providing opportunities to validate risk models. In addition to experimenting with high-risk scenarios, these controlled field experiments could help validate monitoring approaches to improve performance assessment and guide development of mitigation strategies.

  14. The Faculty Self-Reported Assessment Survey (FRAS): differentiating faculty knowledge and experience in assessment.

    PubMed

    Hanauer, David I; Bauerle, Cynthia

    2015-01-01

    Science, technology, engineering, and mathematics education reform efforts have called for widespread adoption of evidence-based teaching in which faculty members attend to student outcomes through assessment practice. Awareness about the importance of assessment has illuminated the need to understand what faculty members know and how they engage with assessment knowledge and practice. The Faculty Self-Reported Assessment Survey (FRAS) is a new instrument for evaluating science faculty assessment knowledge and experience. Instrument validation was composed of two distinct studies: an empirical evaluation of the psychometric properties of the FRAS and a comparative known-groups validation to explore the ability of the FRAS to differentiate levels of faculty assessment experience. The FRAS was found to be highly reliable (α = 0.96). The dimensionality of the instrument enabled distinction of assessment knowledge into categories of program design, instrumentation, and validation. In the known-groups validation, the FRAS distinguished between faculty groups with differing levels of assessment experience. Faculty members with formal assessment experience self-reported higher levels of familiarity with assessment terms, higher frequencies of assessment activity, increased confidence in conducting assessment, and more positive attitudes toward assessment than faculty members who were novices in assessment. These results suggest that the FRAS can reliably and validly differentiate levels of expertise in faculty knowledge of assessment. © 2015 D. I. Hanauer and C. Bauerle. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  15. Development and evaluation of the EOL-ICDQ as a measure of experiences, attitudes and knowledge in end-of-life in patients living with an implantable cardioverter defibrillator.

    PubMed

    Thylén, Ingela; Wenemark, Marika; Fluur, Christina; Strömberg, Anna; Bolse, Kärstin; Årestedt, Kristofer

    2014-04-01

    Due to extended indications and resynchronization therapy, many implantable cardioverter defibrillator (ICD) recipients will experience progressive co-morbid conditions and will be more likely to die of causes other than cardiac death. It is therefore important to elucidate the ICD patients' preferences when nearing end-of-life. Instead of avoiding the subject of end-of-life, a validated questionnaire may be helpful to explore patients' experiences and attitudes about end-of-life concerns and to assess knowledge of the function of the ICD in end-of-life. Validated instruments assessing patients' perspective concerning end-of-life issues are scarce. The purpose of this study was to develop and evaluate respondent satisfaction and measurement properties of the 'Experiences, Attitudes and Knowledge of End-of-Life Issues in Implantable Cardioverter Defibrillator Patients' Questionnaire' (EOL-ICDQ). The instrument was tested for validity, respondent satisfaction, and for homogeneity and stability in the Swedish language. An English version of the EOL-ICDQ was validated, but has not yet been pilot tested. The final instrument contained three domains, which were clustered into 39 items measuring: experiences (10 items), attitudes (18 items), and knowledge (11 items) of end-of-life concerns in ICD patients. In addition, the questionnaire also contained items on socio-demographic background (six items) and ICD-specific background (eight items). The validity and reliability properties were considered sufficient. The EOL-ICDQ has the potential to be used in clinical practice and future research. Further studies are needed using this instrument in an Anglo-Saxon context with a sample of English-speaking ICD recipients.

  16. Scale for positive aspects of caregiving experience: development, reliability, and factor structure.

    PubMed

    Kate, N; Grover, S; Kulhara, P; Nehra, R

    2012-06-01

    OBJECTIVE. To develop an instrument (Scale for Positive Aspects of Caregiving Experience [SPACE]) that evaluates positive caregiving experience and assess its psychometric properties. METHODS. Available scales which assess some aspects of positive caregiving experience were reviewed and a 50-item questionnaire with a 5-point rating was constructed. In all, 203 primary caregivers of patients with severe mental disorders were asked to complete the questionnaire. Internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity were evaluated. Principal component factor analysis was run to assess the factorial validity of the scale. RESULTS. The scale developed as part of the study was found to have good internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity. Principal component factor analysis yielded a 4-factor structure, which also had good test-retest reliability and cross-language reliability. There was a strong correlation between the 4 factors obtained. CONCLUSION. The SPACE developed as part of this study has good psychometric properties.

  17. Validation of multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Siewiorek, D. P.; Segall, Z.; Kong, T.

    1982-01-01

    Experiments that can be used to validate fault free performance of multiprocessor systems in aerospace systems integrating flight controls and avionics are discussed. Engineering prototypes for two fault tolerant multiprocessors are tested.

  18. Factor Analysis of the Mystical Experience Questionnaire: A Study of Experiences Occasioned by the Hallucinogen Psilocybin

    PubMed Central

    Maclean, Katherine A.; Leoutsakos, Jeannie-Marie S.; Johnson, Matthew W.; Griffiths, Roland R.

    2012-01-01

    A large body of historical evidence describes the use of hallucinogenic compounds, such as psilocybin mushrooms, for religious purposes. But few scientific studies have attempted to measure or characterize hallucinogen-occasioned spiritual experiences. The present study examined the factor structure of the Mystical Experience Questionnaire (MEQ), a self-report measure that has been used to assess the effects of hallucinogens in laboratory studies. Participants (N=1602) completed the 43-item MEQ in reference to a mystical or profound experience they had had after ingesting psilocybin. Exploratory factor analysis of the MEQ retained 30 items and revealed a 4-factor structure covering the dimensions of classic mystical experience: unity, noetic quality, sacredness (F1); positive mood (F2); transcendence of time/space (F3); and ineffability (F4). MEQ factor scores showed good internal reliability and correlated with the Hood Mysticism Scale, indicating convergent validity. Participants who endorsed having had a mystical experience on psilocybin, compared to those who did not, had significantly higher factor scores, indicating construct validity. The 4-factor structure was confirmed in a second sample (N=440) and demonstrated superior fit compared to alternative models. The results provide initial evidence of the validity, reliability, and factor structure of a 30-item scale for measuring single, hallucinogen-occasioned mystical experiences, which may be a useful tool in the scientific study of mysticism. PMID:23316089

  19. A New CCI ECV Release (v2.0) to Accurately Measure the Sea Level Change from space (1993-2015)

    NASA Astrophysics Data System (ADS)

    Legeais, Jean-Francois; Benveniste, Jérôme

    2017-04-01

    Accurate monitoring of the sea level is required to better understand its variability and changes. Sea level is one of the Essential Climate Variables (ECV) selected in the frame of the ESA Climate Change Initiative (CCI) program. It aims at providing a long-term homogeneous and accurate sea level record. The needs and feedback of the climate research community have been collected so that the development of the products is adapted to the users. A first version of the sea level ECV product has been generated during phase I of the project (2011-2013). Within phase II (2014-2016), the 15 partner consortium has prepared the production of a new reprocessed homogeneous and accurate altimeter sea level record which is now available (see http://www.esa-sealevel-cci.org/products ). New level 2 altimeter standards developed and tested within the project as well as external contributions have been identified, processed and evaluated by comparison with a reference for different altimeter missions (TOPEX/Poseidon, Jason-1 & 2, ERS-1 & 2, Envisat, GFO, SARAL/AltiKa and CryoSat-2). The main evolutions are associated with the wet troposphere correction (based on the GPD+ algorithm including inter calibration with respect to external sensors) but also to the orbit solutions (POE-E and GFZ15), the ERA-Interim based atmospheric corrections and the FES2014 ocean tide model. A new pole tide solution is used and anomalies are referenced to the MSS DTU15. The presentation will focus on the main achievements of the ESA CCI Sea Level project and on the description of the new SL_cci ECV release covering 1993-2015. The major steps required to produce the reprocessed 23 year climate time series will be described. The impacts of the selected level 2 altimeter standards on the SL_cci ECV have been assessed on different spatial scales (global, regional, mesoscale) and temporal scales (long-term, inter-annual, periodic signals). A significant improvement is observed compared to the current v1.1, with the main impacts observed on the long-term evolution on decadal time scale, on global and regional scales, and for mesoscale signals. The results from product validation, carried out by several groups of the ocean and climate modeling community will be also presented.
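
    As a toy illustration of the long-term diagnostics mentioned above, a linear trend can be estimated from a monthly global-mean sea level anomaly series by least squares while co-estimating the annual cycle; the series below is synthetic, not the SL_cci record.

      # Sketch: fit a linear trend (mm/yr) to a synthetic monthly global-mean
      # sea level anomaly series with an annual cycle (not actual SL_cci data).
      import numpy as np

      years = 1993 + np.arange(0, 23 * 12) / 12.0          # monthly time axis
      rng = np.random.default_rng(1)
      sla_mm = (3.0 * (years - 1993)
                + 5.0 * np.sin(2 * np.pi * years)
                + rng.normal(scale=2.0, size=years.size))  # trend + seasonal + noise

      # Least-squares fit of offset, trend, and annual harmonic in one design matrix.
      design = np.column_stack([np.ones_like(years), years - 1993,
                                np.sin(2 * np.pi * years), np.cos(2 * np.pi * years)])
      coef, *_ = np.linalg.lstsq(design, sla_mm, rcond=None)
      print(f"estimated trend: {coef[1]:.2f} mm/yr")        # ~3 mm/yr by construction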

  20. The influence of cueing on attentional focus in perceptual decision making.

    PubMed

    Yang, Cheng-Ta; Little, Daniel R; Hsu, Ching-Chun

    2014-11-01

    Selective attention has been known to play an important role in decision making. In the present study, we combined a cueing paradigm with a redundant-target detection task to examine how attention affects the decision process when detecting the redundant targets. Cue validity was manipulated in two experiments. The results showed that when the cue was 50 % valid in one experiment, the participants adopted a parallel self-terminating processing strategy, indicative of a diffuse attentional focus on both target locations. When the cue was 100 % valid in the second experiment, all of the participants switched to a serial self-terminating processing strategy, which in our study indicated focused attention to a single target location. This study demonstrates the flexibility of the decision mechanism and highlights the importance of top-down control in selecting a decision strategy.

  1. SCALE TSUNAMI Analysis of Critical Experiments for Validation of 233U Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Don; Rearden, Bradley T

    2009-01-01

    Oak Ridge National Laboratory (ORNL) staff used the SCALE TSUNAMI tools to provide a demonstration evaluation of critical experiments considered for use in validation of current and anticipated operations involving 233U at the Radiochemical Development Facility (RDF). This work was reported in ORNL/TM-2008/196, issued in January 2009. This paper presents the analysis of two representative safety analysis models provided by RDF staff.

  2. Implicit attitudes towards homosexuality: reliability, validity, and controllability of the IAT.

    PubMed

    Banse, R; Seise, J; Zerbes, N

    2001-01-01

    Two experiments were conducted to investigate the psychometric properties of an Implicit Association Test (IAT; Greenwald, McGhee, & Schwartz, 1998) that was adapted to measure implicit attitudes towards homosexuality. In a first experiment, the validity of the Homosexuality-IAT was tested using a known-groups approach. Implicit and explicit attitudes were assessed in heterosexual and homosexual men and women (N = 101). The results provided compelling evidence for the convergent and discriminant validity of the Homosexuality-IAT as a measure of implicit attitudes. No evidence was found for two alternative explanations of IAT effects (familiarity with stimulus material and stereotype knowledge). The internal consistency of IAT scores was satisfactory (alphas > .80), but retest correlations were lower. In a second experiment (N = 79), it was shown that uninformed participants were able to fake positive explicit but not implicit attitudes. Discrepancies between implicit and explicit attitudes towards homosexuality could be partially accounted for by individual differences in the motivation to control prejudiced behavior, thus providing independent evidence for the validity of the implicit attitude measure. Neither explicit nor implicit attitudes could be changed by persuasive messages. The results of both experiments are interpreted as evidence for a single construct account of implicit and explicit attitudes towards homosexuality.

  3. Crazy like a fox. Validity and ethics of animal models of human psychiatric disease.

    PubMed

    Rollin, Michael D H; Rollin, Bernard E

    2014-04-01

    Animal models of human disease play a central role in modern biomedical science. Developing animal models for human mental illness presents unique practical and philosophical challenges. In this article we argue that (1) existing animal models of psychiatric disease are not valid, (2) attempts to model syndromes are undermined by current nosology, (3) models of symptoms are rife with circular logic and anthropomorphism, (4) any model must make unjustified assumptions about subjective experience, and (5) any model deemed valid would be inherently unethical, for if an animal adequately models human subjective experience, then there is no morally relevant difference between that animal and a human.

  4. [Formal sample size calculation and its limited validity in animal studies of medical basic research].

    PubMed

    Mayer, B; Muche, R

    2013-01-01

    Animal studies are highly relevant for basic medical research, although their use is publicly controversial. From a biometrical point of view, an optimal sample size should therefore be targeted for these projects. Statistical sample size calculation is usually the appropriate methodology in planning medical research projects. However, the required information is often not valid or only becomes available during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.
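
    For reference, the formal calculation under discussion typically reduces, for a two-group comparison of means, to the normal-approximation formula n = 2 (z_{1-alpha/2} + z_{1-beta})^2 / d^2 per group. The sketch below evaluates this textbook formula for an assumed standardized effect size; it is not a procedure taken from the article.

      # Standard normal-approximation sample size for a two-group mean comparison
      # (textbook formula; the article discusses when its inputs are not available).
      from scipy.stats import norm

      def n_per_group(effect_size, alpha=0.05, power=0.80):
          """n per group for a two-sided two-sample test with standardized effect d."""
          z_alpha = norm.ppf(1 - alpha / 2)
          z_beta = norm.ppf(power)
          return 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2

      # Large standardized effect (d = 1.0): about 16 animals per group by this
      # approximation (exact t-test formulas give slightly more).
      print(round(n_per_group(effect_size=1.0)))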

  5. An Approach for Validating Actinide and Fission Product Burnup Credit Criticality Safety Analyses: Criticality (k eff) Predictions

    DOE PAGES

    Scaglione, John M.; Mueller, Don E.; Wagner, John C.

    2014-12-01

    One of the most important remaining challenges associated with expanded implementation of burnup credit in the United States is the validation of depletion and criticality calculations used in the safety evaluation—in particular, the availability and use of applicable measured data to support validation, especially for fission products (FPs). Applicants and regulatory reviewers have been constrained by both a scarcity of data and a lack of a clear technical basis or approach for use of the data. This paper describes a validation approach for commercial spent nuclear fuel (SNF) criticality safety (k eff) evaluations based on best-available data and methods and applies the approach to representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The criticality validation approach utilizes not only available laboratory critical experiment (LCE) data from the International Handbook of Evaluated Criticality Safety Benchmark Experiments and the French Haut Taux de Combustion program to support validation of the principal actinides, but also calculated sensitivities, nuclear data uncertainties, and limited available FP LCE data to predict and verify individual biases for relevant minor actinides and FPs. The results demonstrate that (a) sufficient critical experiment data exist to adequately validate k eff calculations via conventional validation approaches for the primary actinides, (b) sensitivity-based critical experiment selection is more appropriate for generating accurate application model bias and uncertainty, and (c) calculated sensitivities and nuclear data uncertainties can be used for generating conservative estimates of bias for minor actinides and FPs. Results based on SCALE 6.1 and the ENDF/B-VII.0 cross-section libraries indicate that a conservative estimate of the bias for the minor actinides and FPs is 1.5% of their worth within the application model. Finally, this paper provides a detailed description of the approach and its technical bases, describes the application of the approach for representative pressurized water reactor and boiling water reactor safety analysis models, and provides reference bias results based on the prerelease SCALE 6.1 code package and ENDF/B-VII nuclear cross-section data.

  6. Validating presupposed versus focused text information.

    PubMed

    Singer, Murray; Solar, Kevin G; Spear, Jackie

    2017-04-01

    There is extensive evidence that readers continually validate discourse accuracy and congruence, but that they may also overlook conspicuous text contradictions. Validation may be thwarted when the inaccurate ideas are embedded sentence presuppositions. In four experiments, we examined readers' validation of presupposed ("given") versus new text information. Throughout, a critical concept, such as a truck versus a bus, was introduced early in a narrative. Later, a character stated or thought something about the truck, which therefore matched or mismatched its antecedent. Furthermore, truck was presented as either given or new information. Mismatch target reading times uniformly exceeded the matching ones by similar magnitudes for given and new concepts. We obtained this outcome using different grammatical constructions and with different antecedent-target distances. In Experiment 4, we examined only given critical ideas, but varied both their matching and the main verb's factivity (e.g., factive know vs. nonfactive think). The Match × Factivity interaction closely resembled that previously observed for new target information (Singer, 2006). Thus, readers can successfully validate given target information. Although contemporary theories tend to emphasize either deficient or successful validation, both types of theory can accommodate the discourse and reader variables that may regulate validation.

  7. Improving the quality of discrete-choice experiments in health: how can we assess validity and reliability?

    PubMed

    Janssen, Ellen M; Marshall, Deborah A; Hauber, A Brett; Bridges, John F P

    2017-12-01

    The recent endorsement of discrete-choice experiments (DCEs) and other stated-preference methods by regulatory and health technology assessment (HTA) agencies has placed a greater focus on demonstrating the validity and reliability of preference results. Areas covered: We present a practical overview of tests of validity and reliability that have been applied in the health DCE literature and explore other study qualities of DCEs. From the published literature, we identify a variety of methods to assess the validity and reliability of DCEs. We conceptualize these methods to create a conceptual model with four domains: measurement validity, measurement reliability, choice validity, and choice reliability. Each domain consists of three categories that can be assessed using one to four procedures (for a total of 24 tests). We present how these tests have been applied in the literature and direct readers to applications of these tests in the health DCE literature. Based on a stakeholder engagement exercise, we consider the importance of study characteristics beyond traditional concepts of validity and reliability. Expert commentary: We discuss study design considerations to assess the validity and reliability of a DCE, consider limitations to the current application of tests, and discuss future work to consider the quality of DCEs in healthcare.

  8. A systematic review of the reliability and validity of discrete choice experiments in valuing non-market environmental goods.

    PubMed

    Rakotonarivo, O Sarobidy; Schaafsma, Marije; Hockley, Neal

    2016-12-01

    While discrete choice experiments (DCEs) are increasingly used in the field of environmental valuation, they remain controversial because of their hypothetical nature and the contested reliability and validity of their results. We systematically reviewed evidence on the validity and reliability of environmental DCEs from the past thirteen years (Jan 2003-February 2016). 107 articles met our inclusion criteria. These studies provide limited and mixed evidence of the reliability and validity of DCE. Valuation results were susceptible to small changes in survey design in 45% of outcomes reporting reliability measures. DCE results were generally consistent with those of other stated preference techniques (convergent validity), but hypothetical bias was common. Evidence supporting theoretical validity (consistency with assumptions of rational choice theory) was limited. In content validity tests, 2-90% of respondents protested against a feature of the survey, and a considerable proportion found DCEs to be incomprehensible or inconsequential (17-40% and 10-62% respectively). DCE remains useful for non-market valuation, but its results should be used with caution. Given the sparse and inconclusive evidence base, we recommend that tests of reliability and validity are more routinely integrated into DCE studies and suggest how this might be achieved. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  9. Validation of the Vanderbilt Holistic Face Processing Test.

    PubMed

    Wang, Chao-Chih; Ross, David A; Gauthier, Isabel; Richler, Jennifer J

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the same construct as the composite task, which is a group-based measure at the center of the large literature on holistic face processing. In Experiment 1, we found a significant correlation between holistic processing measured in the VHPT-F and the composite task. Although this correlation was small, it was comparable to the correlation between holistic processing measured in the composite task with the same faces, but different target parts (top or bottom), which represents a reasonable upper limit for correlations between the composite task and another measure of holistic processing. These results confirm the validity of the VHPT-F by demonstrating shared variance with another measure of holistic processing based on the same operational definition. These results were replicated in Experiment 2, but only when the demographic profile of our sample matched that of Experiment 1.

  10. Validation of the Vanderbilt Holistic Face Processing Test

    PubMed Central

    Wang, Chao-Chih; Ross, David A.; Gauthier, Isabel; Richler, Jennifer J.

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the same construct as the composite task, which is a group-based measure at the center of the large literature on holistic face processing. In Experiment 1, we found a significant correlation between holistic processing measured in the VHPT-F and the composite task. Although this correlation was small, it was comparable to the correlation between holistic processing measured in the composite task with the same faces, but different target parts (top or bottom), which represents a reasonable upper limit for correlations between the composite task and another measure of holistic processing. These results confirm the validity of the VHPT-F by demonstrating shared variance with another measure of holistic processing based on the same operational definition. These results were replicated in Experiment 2, but only when the demographic profile of our sample matched that of Experiment 1. PMID:27933014

  11. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft, which will see an increase of electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of remaining life of electronic components is of key importance. DC-DC power converters are power electronics systems employed typically as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices, while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with focus on key components like electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, in particular the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.

  12. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    ERIC Educational Resources Information Center

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  13. Demonstrating Experimenter "Ineptitude" as a Means of Teaching Internal and External Validity

    ERIC Educational Resources Information Center

    Treadwell, Kimberli R.H.

    2008-01-01

    Internal and external validity are key concepts in understanding the scientific method and fostering critical thinking. This article describes a class demonstration of a "botched" experiment to teach validity to undergraduates. Psychology students (N = 75) completed assessments at the beginning of the semester, prior to and immediately following…

  14. Latinas/os in Community College Developmental Education: Increasing Moments of Academic and Interpersonal Validation

    ERIC Educational Resources Information Center

    Acevedo-Gil, Nancy; Solorzano, Daniel G.; Santos, Ryan E.

    2014-01-01

    This qualitative study examines the experiences of Latinas/os in community college English and math developmental education courses. Critical race theory in education and the theory of validation serve as guiding frameworks. The authors find that institutional agents provide academic validation by emphasizing high expectations, focusing on social…

  15. Latinas/os in Community College Developmental Education: Increasing Moments of Academic and Interpersonal Validation

    ERIC Educational Resources Information Center

    Acevedo-Gil, Nancy; Santos, Ryan E.; Alonso, LLuliana; Solorzano, Daniel G.

    2015-01-01

    This qualitative study examines the experiences of Latinas/os in community college English and math developmental education courses. Critical race theory in education and the theory of validation serve as guiding frameworks. The authors find that institutional agents provide academic validation by emphasizing high expectations, focusing on social…

  16. Rediscovery rate estimation for assessing the validation of significant findings in high-throughput studies.

    PubMed

    Ganna, Andrea; Lee, Donghwan; Ingelsson, Erik; Pawitan, Yudi

    2015-07-01

    It is common and advised practice in biomedical research to validate experimental or observational findings in a population different from the one where the findings were initially assessed. This practice increases the generalizability of the results and decreases the likelihood of reporting false-positive findings. Validation becomes critical when dealing with high-throughput experiments, where the large number of tests increases the chance to observe false-positive results. In this article, we review common approaches to determine statistical thresholds for validation and describe the factors influencing the proportion of significant findings from a 'training' sample that are replicated in a 'validation' sample. We refer to this proportion as rediscovery rate (RDR). In high-throughput studies, the RDR is a function of false-positive rate and power in both the training and validation samples. We illustrate the application of the RDR using simulated data and real data examples from metabolomics experiments. We further describe an online tool to calculate the RDR using t-statistics. We foresee two main applications. First, if the validation study has not yet been collected, the RDR can be used to decide the optimal combination between the proportion of findings taken to validation and the size of the validation study. Secondly, if a validation study has already been done, the RDR estimated using the training data can be compared with the observed RDR from the validation data; hence, the success of the validation study can be assessed. © The Author 2014. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
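
    The relationship between the RDR, the false-positive rate, and power can be made concrete with a small simulation: generate a training and a validation cohort, flag features that pass a discovery threshold in training, and compute the fraction that replicate in validation. The simulation below uses made-up effect sizes and thresholds and is not the authors' online calculator.

      # Simulated rediscovery rate: fraction of features significant in a training
      # cohort that are significant again in an independent validation cohort
      # (illustrative simulation, not the authors' online RDR tool).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      n_features, n_true = 1000, 100
      effect = np.zeros(n_features)
      effect[:n_true] = 0.6                     # true group difference for 10% of features

      def cohort_pvalues(n_per_group):
          """Two-sample t-test p-value per feature for one simulated cohort."""
          cases = rng.normal(loc=effect, size=(n_per_group, n_features))
          controls = rng.normal(size=(n_per_group, n_features))
          return stats.ttest_ind(cases, controls, axis=0).pvalue

      p_train = cohort_pvalues(50)
      p_valid = cohort_pvalues(50)

      alpha_train, alpha_valid = 1e-4, 0.05     # stricter threshold for discovery
      discovered = p_train < alpha_train
      rdr = np.mean(p_valid[discovered] < alpha_valid)
      print(f"{discovered.sum()} discoveries, observed RDR = {rdr:.2f}")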

  17. Development and validation of the BRIGHTLIGHT Survey, a patient-reported experience measure for young people with cancer.

    PubMed

    Taylor, Rachel M; Fern, Lorna A; Solanki, Anita; Hooker, Louise; Carluccio, Anna; Pye, Julia; Jeans, David; Frere-Smith, Tom; Gibson, Faith; Barber, Julie; Raine, Rosalind; Stark, Dan; Feltbower, Richard; Pearce, Susie; Whelan, Jeremy S

    2015-07-28

    Patient experience is increasingly used as an indicator of high quality care in addition to more traditional clinical end-points. Surveys are generally accepted as appropriate methodology to capture patient experience. No validated patient experience surveys exist specifically for adolescents and young adults (AYA) aged 13-24 years at diagnosis with cancer. This paper describes early work undertaken to develop and validate a descriptive patient experience survey for AYA with cancer that encompasses both their cancer experience and age-related issues. We aimed to develop, with young people, an experience survey meaningful and relevant to AYA to be used in a longitudinal cohort study (BRIGHTLIGHT), ensuring high levels of acceptability to maximise study retention. A three-stage approach was employed: Stage 1 involved developing a conceptual framework, conducting literature/Internet searches and establishing content validity of the survey; Stage 2 confirmed the acceptability of methods of administration and consisted of four focus groups involving 11 young people (14-25 years), three parents and two siblings; and Stage 3 established survey comprehension through telephone-administered cognitive interviews with a convenience sample of 23 young people aged 14-24 years. Stage 1: Two hundred and thirty-eight questions were developed from qualitative reports of young people's cancer and treatment-related experience. Stage 2: The focus groups identified three core themes: (i) issues directly affecting young people, e.g. impact of treatment-related fatigue on ability to complete survey; (ii) issues relevant to the actual survey, e.g. ability to answer questions anonymously; (iii) administration issues, e.g. confusing format in some supporting documents. Stage 3: Cognitive interviews indicated high levels of comprehension requiring minor survey amendments. Collaborating with young people with cancer has enabled a survey to be developed that is both meaningful to young people and examines patient experience and outcomes associated with specialist cancer care. Engagement of young people throughout the survey development has ensured the content appropriately reflects their experience and is easily understood. The BRIGHTLIGHT survey was developed for a specific research project but has the potential to be used as a TYA cancer survey to assess patient experience and the care they receive.

  18. The SCALE Verified, Archived Library of Inputs and Data - VALID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Rearden, Bradley T

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

  19. Inferring unknown boundary conditions of the Greenland Ice Sheet by assimilating ICESat-1 and IceBridge altimetry into the Ice Sheet System Model.

    NASA Astrophysics Data System (ADS)

    Larour, E. Y.; Khazendar, A.; Seroussi, H. L.; Schlegel, N.; Csatho, B. M.; Schenk, A. F.; Rignot, E. J.; Morlighem, M.

    2014-12-01

    Altimetry signals from missions such as ICESat-1, CryoSat and Envisat, as well as altimeters onboard Operation IceBridge, provide vital insights into processes such as surface mass balance, mass transport and ice-flow dynamics. Historically, however, ice-flow models have been focused on assimilating surface velocities from satellite-based radar observations, to infer properties such as basal friction or the position of the bedrock. Here, we leverage a new methodology based on automatic differentiation of the Ice Sheet System Model to assimilate surface altimetry data into a reconstruction of the past decade of ice flow in the North Greenland area. We infer corrections to boundary conditions such as basal friction and surface mass balance, as well as corrections to the ice hardness, to best match the observed altimetry record. We compare these corrections between glaciers such as Petermann Glacier, 79 North and Zachariae Isstrom. The altimetry signals exhibit very different patterns between East and West, which translate into very different signatures for the inverted boundary conditions. This study gives us greater insights into what differentiates different basins, both in terms of mass transport and ice-flow dynamics, and what could be the controlling mechanisms behind the very different evolutions of these basins.
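
    As a schematic of the kind of misfit functional minimized in such transient inversions (the weights and regularization term are illustrative, not ISSM's actual configuration), one can write

      J(\alpha,\dot{b},B) \;=\; \sum_{k}\int_{\Omega}\frac{\bigl[h^{\mathrm{model}}(\mathbf{x},t_k;\alpha,\dot{b},B)-h^{\mathrm{obs}}(\mathbf{x},t_k)\bigr]^{2}}{2\sigma_k^{2}}\,\mathrm{d}\Omega \;+\; \gamma\,\mathcal{R}(\alpha,\dot{b},B),

    where \alpha is the basal friction coefficient, \dot{b} the surface mass balance correction and B the ice hardness; gradients of J with respect to these fields, obtained by automatic differentiation, drive the descent towards the corrections that best match the altimetry record.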

  20. The International DORIS Service (IDS) - Recent Developments in Preparation for ITRF2013

    NASA Technical Reports Server (NTRS)

    Willis, Pascal; Lemoine, Frank G.; Moreaux, Guilhem; Soudarin, Laurent; Ferrage, Pascale; Ries, John; Otten, Michiel; Saunier, Jerome; Noll, Carey E.; Biancale, Richard

    2014-01-01

    The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French DORIS tracking system and to deliver scientific products, mostly related to the International Earth rotation and Reference systems Service (IERS). We first present some general background related to the DORIS system (current and planned satellites, current tracking network and expected evolution) and to the general IDS organization (Data Centers, Analysis Centers and the Combination Center). Then, we discuss some of the steps recently taken to prepare the IDS submission to ITRF2013 (combined weekly time series based on individual solutions from several Analysis Centers). In particular, recent results obtained from the Analysis Centers and the Combination Center show that improvements can still be made when updating physical models of some DORIS satellites, such as Envisat, Cryosat-2 or Jason-2. The DORIS contribution to ITRF2013 should also benefit from the larger number of ground observations collected by the last generation of DGXX receivers (the first instrument being onboard the Jason-2 satellite). In particular for polar motion, sub-milliarcsecond accuracy now seems to be achievable. Weekly station positioning internal consistency also seems to be improved with a larger DORIS constellation.

  1. Climatology of the Arctic Sea Ice Thickness Distribution as a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Toppaladoddi, S.; Wettlaufer, J. S.

    2016-12-01

    We study the seasonal changes in the thickness distribution of Arctic sea ice, g(h), under climate forcing. Our analytical and numerical approach is based on a Fokker-Planck equation for g(h) (Toppaladoddi & Wettlaufer Phys. Rev. Lett. 115, 148501, 2015), in which the thermodynamic growth rates are determined using observed climatology. In particular, the Fokker-Planck equation is coupled to the observationally consistent thermodynamic model of Eisenman & Wettlaufer (Proc. Natl. Acad. Sci. USA 106, pp. 28-32, 2009). We find that due to the combined effects of thermodynamics and mechanics, g(h) spreads during winter and contracts during summer. This behavior is in agreement with recent satellite observations from CryoSat-2 (Kwok & Cunningham, Phil. Trans. R. Soc. A 373, 20140157, 2015). Because g(h) is a probability density function, we quantify all of the key moments (e.g., mean thickness, fraction of thin/thick ice, mean albedo, relaxation time scales) as greenhouse-gas radiative forcing, ΔF0, increases. The mean ice thickness decays exponentially with ΔF0, but much more slowly than in solely thermodynamic models. This demonstrates the crucial role that ice mechanics plays in maintaining the ice cover, by redistributing thin ice to thick ice far more rapidly than thermal growth alone can.
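
    In generic form (the specific drift and diffusion terms are given in the cited Physical Review Letters paper; the coefficients here are placeholders), the governing Fokker-Planck equation for the thickness distribution reads

      \frac{\partial g(h,t)}{\partial t} \;=\; -\frac{\partial}{\partial h}\bigl[\phi(h,t)\,g(h,t)\bigr] \;+\; \frac{\partial^{2}}{\partial h^{2}}\bigl[D(h)\,g(h,t)\bigr], \qquad \int_{0}^{\infty} g(h,t)\,\mathrm{d}h = 1,

    with the drift \phi set by the climatological thermodynamic growth rate and the diffusivity D representing mechanical (ridging and rafting) redistribution; moments such as the mean thickness \bar{h}(t)=\int_{0}^{\infty} h\,g(h,t)\,\mathrm{d}h then follow directly from the normalized density.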

  2. Statistical Mechanics and the Climatology of the Arctic Sea Ice Thickness Distribution

    NASA Astrophysics Data System (ADS)

    Toppaladoddi, Srikanth; Wettlaufer, J. S.

    2017-05-01

    We study the seasonal changes in the thickness distribution of Arctic sea ice, g(h), under climate forcing. Our analytical and numerical approach is based on a Fokker-Planck equation for g(h) (Toppaladoddi and Wettlaufer in Phys Rev Lett 115(14):148501, 2015), in which the thermodynamic growth rates are determined using observed climatology. In particular, the Fokker-Planck equation is coupled to the observationally consistent thermodynamic model of Eisenman and Wettlaufer (Proc Natl Acad Sci USA 106:28-32, 2009). We find that due to the combined effects of thermodynamics and mechanics, g(h) spreads during winter and contracts during summer. This behavior is in agreement with recent satellite observations from CryoSat-2 (Kwok and Cunningham in Philos Trans R Soc A 373(2045):20140157, 2015). Because g(h) is a probability density function, we quantify all of the key moments (e.g., mean thickness, fraction of thin/thick ice, mean albedo, relaxation time scales) as greenhouse-gas radiative forcing, ΔF_0, increases. The mean ice thickness decays exponentially with ΔF_0, but much more slowly than in solely thermodynamic models. This demonstrates the crucial role that ice mechanics plays in maintaining the ice cover, by redistributing thin ice to thick ice far more rapidly than thermal growth alone can.

  3. The family experiences of in-hospital care questionnaire in severe traumatic brain injury (FECQ-TBI): a validation study.

    PubMed

    Anke, Audny; Manskow, Unn Sollid; Friborg, Oddgeir; Røe, Cecilie; Arntzen, Cathrine

    2016-11-28

    Family members are important for support and care of their close relative after severe traumas, and their experiences are vital health care quality indicators. The objective was to describe the development of the Family Experiences of in-hospital Care Questionnaire for family members of patients with severe Traumatic Brain Injury (FECQ-TBI), and to evaluate its psychometric properties and validity. The study had a Norwegian multicentre design in which 171 family members were invited to participate. The questionnaire developmental process included a literature review, use of an existing instrument (the parent experience of paediatric care questionnaire), a focus group with close family members, as well as expert group judgments. Items asking for family care experiences related to acute wards and rehabilitation were included. Several items of the paediatric care questionnaire were removed or the wording of the items was changed to comply with the present purpose. Questions covering experiences with the inpatient rehabilitation period, the discharge phase, the family experiences with hospital facilities, the transfer between departments and the economic needs of the family were added. The developed questionnaire was mailed to the participants. Exploratory factor analyses were used to examine scale structure, in addition to screening for data quality, and analyses of internal consistency and validity. The questionnaire was returned by 122 (71%) of the family members. Principal component analysis extracted six dimensions (eigenvalues > 1.0): acute organization and information (10 items), rehabilitation organization (13 items), rehabilitation information (6 items), discharge (4 items), hospital facilities-patients (4 items) and hospital facilities-family (2 items). Items related to the acute phase were comparable to items in the two dimensions of rehabilitation: organization and information. All six subscales had high Cronbach's alpha coefficients >0.80. The construct validity was confirmed. The FECQ-TBI assesses important aspects of in-hospital care in the acute and rehabilitation phases, as seen from a family perspective. The psychometric properties and the construct validity of the questionnaire were good, hence supporting the use of the FECQ-TBI to assess quality of care in rehabilitation departments.
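
    Since internal consistency is central to the reported scale validation, a minimal sketch of the Cronbach's alpha computation is given below; the item matrix and subscale grouping are hypothetical and only illustrate the formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score).

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents x n_items) array of item scores."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_variances = items.var(axis=0, ddof=1).sum()
          total_variance = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_variances / total_variance)

      # Hypothetical example: five respondents rating the four discharge items on a 1-5 scale.
      discharge_items = [[4, 4, 5, 4], [2, 3, 2, 2], [5, 5, 4, 5], [3, 3, 3, 4], [4, 5, 4, 4]]
      print(round(cronbach_alpha(discharge_items), 2))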

  4. Isotope Inversion Experiment evaluating the suitability of calibration in surrogate matrix for quantification via LC-MS/MS-Exemplary application for a steroid multi-method.

    PubMed

    Suhr, Anna Catharina; Vogeser, Michael; Grimm, Stefanie H

    2016-05-30

    For quotable quantitative analysis of endogenous analytes in complex biological samples by isotope dilution LC-MS/MS, the creation of appropriate calibrators is a challenge, since analyte-free authentic material is in general not available. Thus, surrogate matrices are often used to prepare calibrators and controls. However, currently employed validation protocols do not include specific experiments to verify the suitability of a surrogate matrix calibration for quantification of authentic matrix samples. The aim of the study was the development of a novel validation experiment to test whether surrogate matrix based calibrators enable correct quantification of authentic matrix samples. The key element of the novel validation experiment is the inversion of nonlabelled analytes and their stable isotope labelled (SIL) counterparts in respect to their functions, i.e. SIL compound is the analyte and nonlabelled substance is employed as internal standard. As a consequence, both surrogate and authentic matrix are analyte-free regarding SIL analytes, which allows a comparison of both matrices. We called this approach Isotope Inversion Experiment. As figure of merit we defined the accuracy of inverse quality controls in authentic matrix quantified by means of a surrogate matrix calibration curve. As a proof-of-concept application a LC-MS/MS assay addressing six corticosteroids (cortisol, cortisone, corticosterone, 11-deoxycortisol, 11-deoxycorticosterone, and 17-OH-progesterone) was chosen. The integration of the Isotope Inversion Experiment in the validation protocol for the steroid assay was successfully realized. The accuracy results of the inverse quality controls were all in all very satisfying. As a consequence the suitability of a surrogate matrix calibration for quantification of the targeted steroids in human serum as authentic matrix could be successfully demonstrated. The Isotope Inversion Experiment fills a gap in the validation process for LC-MS/MS assays quantifying endogenous analytes. We consider it a valuable and convenient tool to evaluate the correct quantification of authentic matrix samples based on a calibration curve in surrogate matrix. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. A design of experiments approach to validation sampling for logistic regression modeling with error-prone medical records.

    PubMed

    Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay

    2016-04-01

    Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables that represent clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error prone for a variety of reasons. Performing a manual chart review to validate data accuracy is time consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model will be fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher information based D-optimality criterion is used, and an algorithm for optimizing it is developed. The authors' method is tested in a simulation comparison that is based on a sudden cardiac arrest case study with 23 041 patients' records. This DSCVR approach, using the Fisher information based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that this DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
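
    The sketch below illustrates the D-optimality idea described above rather than the authors' DSCVR algorithm itself: candidate records are added greedily so that each new predictor row maximally increases the determinant of the logistic-regression Fisher information, using pilot coefficient estimates that are purely hypothetical.

      import numpy as np

      def greedy_d_optimal(X, beta_pilot, n_validate):
          """Greedily select rows of X (with intercept column) that maximize the
          determinant of the Fisher information sum_i w_i x_i x_i^T, w_i = p_i (1 - p_i)."""
          p = 1.0 / (1.0 + np.exp(-X @ beta_pilot))
          w = p * (1.0 - p)
          chosen = []
          info = 1e-8 * np.eye(X.shape[1])          # small ridge keeps the determinant finite early on
          for _ in range(n_validate):
              gains = [np.linalg.det(info + w[i] * np.outer(X[i], X[i]))
                       if i not in chosen else -np.inf
                       for i in range(len(X))]
              best = int(np.argmax(gains))
              chosen.append(best)
              info += w[best] * np.outer(X[best], X[best])
          return chosen

      rng = np.random.default_rng(0)
      X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])   # intercept + 2 predictors
      beta_pilot = np.array([-2.0, 0.8, -0.5])                          # hypothetical pilot estimates
      print(greedy_d_optimal(X, beta_pilot, n_validate=10))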

  6. EXCALIBUR-at-CALIBAN: a neutron transmission experiment for 238U(n,n' continuum γ) nuclear data validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, David; Leconte, Pierre; Destouches, Christophe

    2015-07-01

    Two recent papers justified a new experimental program to give a new basis for the validation of 238U nuclear data, namely neutron-induced inelastic scattering, and transport codes at neutron fission energies. The general idea is to perform a neutron transmission experiment through natural uranium material. As shown by Hans Bethe, neutron transmissions measured by dosimetric responses are linked to inelastic cross sections. This paper describes the principle and the results of such an experiment, called EXCALIBUR, performed recently (January and October 2014) at the CALIBAN reactor facility. (authors)

  7. A Framework for Understanding Experiments

    DTIC Science & Technology

    2008-06-01

    operations. Experiments that emphasize free play and uncertainty in scenarios reflect conditions found in existent operations and satisfy external...validity Requirement 4, the ability to relate results. Conversely, experiments emphasizing similar conditions with diminished free play across multiple

  8. Anger Assessment in Clinical and Nonclinical Populations: Further Validation of the State-Trait Anger Expression Inventory-2.

    PubMed

    Lievaart, Marien; Franken, Ingmar H A; Hovens, Johannes E

    2016-03-01

    The most commonly used instrument for measuring anger is the State-Trait Anger Expression Inventory-2 (STAXI-2; Spielberger, 1999). This study further examines the validity of the STAXI-2 and compares anger scores between several clinical and nonclinical samples. Reliability, concurrent, and construct validity were investigated in Dutch undergraduate students (N = 764), a general population sample (N = 1211), and psychiatric outpatients (N = 226). The results support the reliability and validity of the STAXI-2. Concurrent validity was strong, with meaningful correlations between the STAXI-2 scales and anger-related constructs in both clinical and nonclinical samples. Importantly, patients showed higher experience and expression of anger than the general population sample. Additionally, forensic outpatients with addiction problems reported higher Anger Expression-Out than general psychiatric outpatients. Our conclusion is that the STAXI-2 is a suitable instrument to measure both the experience and the expression of anger in both general and clinical populations. © 2016 Wiley Periodicals, Inc.

  9. The Experiences in Close Relationship Scale (ECR)-short form: reliability, validity, and factor structure.

    PubMed

    Wei, Meifen; Russell, Daniel W; Mallinckrodt, Brent; Vogel, David L

    2007-04-01

    We developed a 12-item, short form of the Experiences in Close Relationship Scale (ECR; Brennan, Clark, & Shaver, 1998) across 6 studies. In Study 1, we examined the reliability and factor structure of the measure. In Studies 2 and 3, we cross-validated the reliability, factor structure, and validity of the short form measure; whereas in Study 4, we examined test-retest reliability over a 1-month period. In Studies 5 and 6, we further assessed the reliability, factor structure, and validity of the short version of the ECR when administered as a stand-alone instrument. Confirmatory factor analyses indicated that 2 factors, labeled Anxiety and Avoidance, provided a good fit to the data after removing the influence of response sets. We found validity to be equivalent for the short and the original versions of the ECR across studies. Finally, the results were comparable when we embedded the short form within the original version of the ECR and when we administered it as a stand-alone measure.

  10. CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave/Boundary-Layer Interaction

    NASA Technical Reports Server (NTRS)

    Davis, David O.

    2015-01-01

    Experimental investigations of specific flow phenomena, e.g., Shock Wave Boundary-Layer Interactions (SWBLI), provide great insight into the flow behavior but often lack the necessary details to be useful as CFD validation experiments. Reasons include: (1) undefined boundary conditions and inconsistent results, (2) undocumented 3D effects (CL-only measurements), and (3) lack of uncertainty analysis. While there are a number of good subsonic experimental investigations that are sufficiently documented to be considered test cases for CFD and turbulence model validation, the number of supersonic and hypersonic cases is much smaller. This was highlighted by Settles and Dodson's [1] comprehensive review of available supersonic and hypersonic experimental studies. In all, several hundred studies were considered for their database. Of these, over a hundred were subjected to rigorous acceptance criteria. Based on their criteria, only 19 (12 supersonic, 7 hypersonic) were considered of sufficient quality to be used for validation purposes. Aeschliman and Oberkampf [2] recognized the need to develop a specific methodology for experimental studies intended specifically for validation purposes.

  11. Validation and Application of Pharmacokinetic Models for Interspecies Extrapolations in Toxicity Risk Assessments of Volatile Organics

    DTIC Science & Technology

    1989-07-21

    formulation of physiologically-based pharmacokinetic models. Adult male Sprague-Dawley rats and male beagle dogs will be administered equal doses... experiments in the dog. Physiologically-based pharmacokinetic models will be developed and validated for oral and inhalation exposures to halocarbons... of conducting experiments in dogs. The original physiologic model for the rat will be scaled up to predict halocarbon pharmacokinetics in the dog.

  12. Quality Control and Analysis of Microphysical Data Collected in TRMM Aircraft Validation Experiments

    NASA Technical Reports Server (NTRS)

    Heymsfield, Andrew J.

    2004-01-01

    This report summarizes our efforts on the funded project 'Quality Control and Analysis of Microphysical Data Collected in TRMM Airborne Validation Experiments', NASA NAG5-9663, Andrew Heymsfield, P. I. We begin this report by summarizing our activities in FY2000-FY2004. We then present some highlights of our work. The last part of the report lists the publications that have resulted from our funding through this grant.

  13. A Validation Framework for the Long Term Preservation of High Energy Physics Data

    NASA Astrophysics Data System (ADS)

    Ozerov, Dmitri; South, David M.

    2014-06-01

    The study group on data preservation in high energy physics, DPHEP, is moving to a new collaboration structure, which will focus on the implementation of preservation projects, such as those described in the group's large-scale report published in 2012. One such project is the development of a validation framework, which checks the compatibility of evolving computing environments and technologies with the experiments' software for as long as possible, with the aim of substantially extending the lifetime of the analysis software, and hence of the usability of the data. The framework is designed to automatically test and validate the software and data of an experiment against changes and upgrades to the computing environment, as well as changes to the experiment software itself. Technically, this is realised using a framework capable of hosting a number of virtual machine images, built with different configurations of operating systems and the relevant software, including any necessary external dependencies.
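
    A hypothetical sketch of the automated loop such a framework might run is shown below; the image tags, the vm-run command and the validate.sh script are invented placeholders, since the abstract does not specify these details.

      import itertools, json, shutil, subprocess

      # Hypothetical matrix of preserved environments and software revisions to revalidate.
      vm_images = ["slc5-gcc43", "slc6-gcc48", "centos7-gcc8"]     # invented image tags
      software_tags = ["analysis-v1.2", "analysis-v1.3"]           # invented experiment software tags

      def run_validation(image, tag):
          """Boot the image and rerun the reference analyses; 'vm-run' and
          './validate.sh' stand in for whatever the real framework provides."""
          if shutil.which("vm-run") is None:                       # placeholder command not installed
              return {"image": image, "tag": tag, "ok": None, "note": "vm-run not available"}
          proc = subprocess.run(["vm-run", "--image", image, "--", "./validate.sh", tag],
                                capture_output=True, text=True)
          return {"image": image, "tag": tag, "ok": proc.returncode == 0}

      results = [run_validation(img, tag) for img, tag in itertools.product(vm_images, software_tags)]
      print(json.dumps(results, indent=2))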

  14. Utilization of sounding rockets and balloons in the German Space Programme

    NASA Astrophysics Data System (ADS)

    Preu, Peter; Friker, Achim; Frings, Wolfgang; Püttmann, Norbert

    2005-08-01

    Sounding rockets and balloons are important tools of Germany's Space Programme. DLR manages these activities and promotes scientific experiments and validation programmes within (1) Space Science, (2) Earth Observation, (3) Microgravity Research and (4) Re-entry Technologies (SHEFEX). In Space Science, the present focus is on atmospheric research. In Earth Observation, balloon-borne measurements play a key role in the validation of atmospheric satellite sounders (ENVISAT). TEXUS and MAXUS sounding rockets are successfully used for short-duration microgravity experiments. The Sharp Edge Flight Experiment (SHEFEX) will deliver data from a hypersonic flight for the validation of a new Thermal Protection System (TPS), wind tunnel testing and numerical analysis of aerothermodynamics. By signing the Revised Esrange and Andøya Special Project (EASP) Agreement 2006-2010 in June 2004, Germany made an essential contribution to the long-term availability of the Scandinavian ranges for the European science community.

  15. Relations between inductive reasoning and deductive reasoning.

    PubMed

    Heit, Evan; Rotello, Caren M

    2010-05-01

    One of the most important open questions in reasoning research is how inductive reasoning and deductive reasoning are related. In an effort to address this question, we applied methods and concepts from memory research. We used 2 experiments to examine the effects of logical validity and premise-conclusion similarity on evaluation of arguments. Experiment 1 showed 2 dissociations: For a common set of arguments, deduction judgments were more affected by validity, and induction judgments were more affected by similarity. Moreover, Experiment 2 showed that fast deduction judgments were like induction judgments-in terms of being more influenced by similarity and less influenced by validity, compared with slow deduction judgments. These novel results pose challenges for a 1-process account of reasoning and are interpreted in terms of a 2-process account of reasoning, which was implemented as a multidimensional signal detection model and applied to receiver operating characteristic data. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  16. Logical fallacies in animal model research.

    PubMed

    Sjoberg, Espen A

    2017-02-15

    Animal models of human behavioural deficits involve conducting experiments on animals with the hope of gaining new knowledge that can be applied to humans. This paper aims to address risks, biases, and fallacies associated with drawing conclusions when conducting experiments on animals, with focus on animal models of mental illness. Researchers using animal models are susceptible to a fallacy known as false analogy, where inferences based on assumptions of similarities between animals and humans can potentially lead to an incorrect conclusion. There is also a risk of false positive results when evaluating the validity of a putative animal model, particularly if the experiment is not conducted double-blind. It is further argued that animal model experiments are reconstructions of human experiments, and not replications per se, because the animals cannot follow instructions. This leads to an experimental setup that is altered to accommodate the animals, and typically involves a smaller sample size than a human experiment. Researchers on animal models of human behaviour should increase focus on mechanistic validity in order to ensure that the underlying causal mechanisms driving the behaviour are the same, as relying on face validity makes the model susceptible to logical fallacies and a higher risk of Type 1 errors. We discuss measures to reduce bias and risk of making logical fallacies in animal research, and provide a guideline that researchers can follow to increase the rigour of their experiments.

  17. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  18. Learning to recognize rat social behavior: Novel dataset and cross-dataset application.

    PubMed

    Lorbach, Malte; Kyriakou, Elisavet I; Poppe, Ronald; van Dam, Elsbeth A; Noldus, Lucas P J J; Veltkamp, Remco C

    2018-04-15

    Social behavior is an important aspect of rodent models. Automated measuring tools that make use of video analysis and machine learning are an increasingly attractive alternative to manual annotation. Because machine learning-based methods need to be trained, it is important that they are validated using data from different experiment settings. To develop and validate automated measuring tools, there is a need for annotated rodent interaction datasets. Currently, the availability of such datasets is limited to two mouse datasets. We introduce the first, publicly available rat social interaction dataset, RatSI. We demonstrate the practical value of the novel dataset by using it as the training set for a rat interaction recognition method. We show that behavior variations induced by the experiment setting can lead to reduced performance, which illustrates the importance of cross-dataset validation. Consequently, we add a simple adaptation step to our method and improve the recognition performance. Most existing methods are trained and evaluated in one experimental setting, which limits the predictive power of the evaluation to that particular setting. We demonstrate that cross-dataset experiments provide more insight in the performance of classifiers. With our novel, public dataset we encourage the development and validation of automated recognition methods. We are convinced that cross-dataset validation enhances our understanding of rodent interactions and facilitates the development of more sophisticated recognition methods. Combining them with adaptation techniques may enable us to apply automated recognition methods to a variety of animals and experiment settings. Copyright © 2017 Elsevier B.V. All rights reserved.
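
    A minimal sketch of the cross-dataset protocol advocated above (train on one dataset, test on another, optionally adapt with a few labelled target examples) is given below; the synthetic feature arrays and the scikit-learn classifier are illustrative and are not the recognition method used in the paper.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import accuracy_score

      def cross_dataset_eval(X_train, y_train, X_test, y_test, n_adapt=0):
          """Train on dataset A and evaluate on dataset B; optionally 'adapt' by
          moving a small number of labelled B examples into the training set."""
          if n_adapt > 0:
              X_train = np.vstack([X_train, X_test[:n_adapt]])
              y_train = np.concatenate([y_train, y_test[:n_adapt]])
              X_test, y_test = X_test[n_adapt:], y_test[n_adapt:]
          clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
          return accuracy_score(y_test, clf.predict(X_test))

      # Illustrative synthetic "interaction feature" datasets with a distribution shift.
      rng = np.random.default_rng(1)
      X_a, y_a = rng.normal(0.0, 1.0, (500, 6)), rng.integers(0, 4, 500)
      X_b, y_b = rng.normal(0.5, 1.2, (300, 6)), rng.integers(0, 4, 300)
      print(cross_dataset_eval(X_a, y_a, X_b, y_b), cross_dataset_eval(X_a, y_a, X_b, y_b, n_adapt=50))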

  19. [Validation of a Japanese version of the Experience in Close Relationship-Relationship Structure].

    PubMed

    Komura, Kentaro; Murakami, Tatsuya; Toda, Koji

    2016-08-01

    The purpose of this study was to translate the Experience in Close Relationship-Relationship Structure (ECR-RS) and evaluate its validity. In study 1 (N = 982), evidence based on internal structure (factor structure, internal consistency, and correlation among sub-scales) and evidence based on relations to other variables (depression, reassurance seeking and self-esteem) were confirmed. In study 2 (N = 563), evidence based on internal structure was reconfirmed, and evidence based on relations to other variables (IWMS, RQ, and ECR-GO) was confirmed. In study 3 (N = 342), evidence based on internal structure (test-retest reliability) was confirmed. Based on these results, we concluded that the ECR-RS was valid for measuring adult attachment style.

  20. Use of integral experiments in support to the validation of JEFF-3.2 nuclear data evaluation

    NASA Astrophysics Data System (ADS)

    Leclaire, Nicolas; Cochet, Bertrand; Jinaphanh, Alexis; Haeck, Wim

    2017-09-01

    For many years now, IRSN has developed its own continuous-energy Monte Carlo capability, which allows testing various nuclear data libraries. To that end, a validation database of 1136 experiments was built from cases used for the validation of the APOLLO2-MORET 5 multigroup route of the CRISTAL V2.0 package. In this paper, the keff values obtained for more than 200 benchmarks using the JEFF-3.1.1 and JEFF-3.2 libraries are compared to benchmark keff values, and the main discrepancies are analyzed with respect to the neutron spectrum. Special attention is paid to benchmarks for which the results changed significantly between the two JEFF-3 versions.

  1. Cryosat-2 and Sentinel-3 tropospheric corrections: their evaluation over rivers and lakes

    NASA Astrophysics Data System (ADS)

    Fernandes, Joana; Lázaro, Clara; Vieira, Telmo; Restano, Marco; Ambrózio, Américo; Benveniste, Jérôme

    2017-04-01

    In the scope of the Sentinel-3 Hydrologic Altimetry PrototypE (SHAPE) project, errors that presently affect the tropospheric corrections, i.e. the dry and wet tropospheric corrections (DTC and WTC, respectively), given in satellite altimetry products are evaluated over inland water regions. These errors arise because both corrections, which are functions of altitude, are usually computed with respect to an incorrect altitude reference. Several regions of interest (ROI) where CryoSat-2 (CS-2) is operating in SAR/SAR-In modes were selected for this evaluation. In this study, results for the Danube River, the Amazon Basin, Lakes Vanern and Titicaca, and the Caspian Sea, using Level 1B CS-2 data, are shown. The DTC and WTC have been compared to those derived from the ECMWF operational model and computed at different altitude references: i) the ECMWF orography; ii) the ACE2 (Altimeter Corrected Elevations 2) and GWD-LR (Global Width Database for Large Rivers) global digital elevation models; iii) the mean lake level, derived from Envisat mission data, or the river profile derived in the scope of the SHAPE project by AlongTrack (ATK) using Jason-2 data. Whenever GNSS data are available in the ROI, a GNSS-derived WTC was also generated and used for comparison. Overall, results show that the tropospheric corrections present in CS-2 L1B products are provided at the level of the ECMWF orography, which can depart from the mean lake level or river profile by hundreds of metres. Therefore, the use of the model orography introduces errors in the corrections. To mitigate these errors, both DTC and WTC should be provided at the mean river profile/lake level. For example, for the Caspian Sea with a mean level of -27 m, the tropospheric corrections provided in CS-2 products were computed at mean sea level (zero level), therefore leading to a systematic error in the corrections. In case a mean lake level is not available, it can be easily determined from satellite altimetry. In the absence of a mean river profile, either of the aforementioned DEMs, which are better altimetric surfaces than the ECMWF orography, can be used. When using the model orography, systematic errors up to 3-5 cm are found in the DTC for most of the selected regions, which can induce significant errors in e.g. the determination of mean river profiles or lake level time series. For the Danube River, larger DTC errors up to 10 cm, due to terrain characteristics, can appear. For the WTC, with higher spatial variability, model errors of magnitude 1-3 cm are expected over inland waters. In the Danube region, the comparison of GNSS- and ECMWF-derived WTC has shown that the error in the WTC computed at orography level can be up to 3 cm. WTC errors of this magnitude have been found for all ROI. Although globally small, these errors are systematic and must be corrected prior to the generation of CS-2 Level 2 products. Once the corrections are computed at the mean river profile and mean lake level, the results show that they have an accuracy better than 1 cm. This analysis is currently being extended to S3 data and the first results are shown.
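
    A sketch of the altitude dependence at issue is given below, assuming the widely used Saastamoinen-type relation between the DTC and surface pressure together with a standard-atmosphere pressure reduction; this is not the SHAPE processing code, and the pressure, latitude and height values are illustrative.

      import math

      def dtc_saastamoinen(p_surface_hpa, lat_deg, height_m):
          """Dry tropospheric range correction (m, negative) from surface pressure."""
          phi = math.radians(lat_deg)
          return -2.2768e-3 * p_surface_hpa / (1.0 - 2.66e-3 * math.cos(2.0 * phi) - 2.8e-7 * height_m)

      def pressure_at_height(p_ref_hpa, h_ref_m, h_m, t_ref_k=288.15):
          """Reduce pressure from a reference height to a target height (ISA lapse rate 6.5 K/km)."""
          lapse, g, r_dry = 0.0065, 9.80665, 287.05
          t_target = t_ref_k - lapse * (h_m - h_ref_m)
          return p_ref_hpa * (t_target / t_ref_k) ** (g / (lapse * r_dry))

      # Example: DTC evaluated at the model orography (450 m) versus at the lake level (50 m).
      p_orography = 960.0                                      # hPa at 450 m (illustrative)
      p_lake = pressure_at_height(p_orography, 450.0, 50.0)
      print(dtc_saastamoinen(p_orography, 45.0, 450.0), dtc_saastamoinen(p_lake, 45.0, 50.0))

    For the 400 m height difference in this example the two corrections differ by roughly 10 cm, which is the order of magnitude of the terrain-induced DTC errors reported above for the Danube.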

  2. Development and validation of a Chinese music quality rating test.

    PubMed

    Cai, Yuexin; Zhao, Fei; Zheng, Yiqing

    2013-09-01

    The present study aims to develop and validate a Chinese music quality rating test (MQRT). In Experiment 1, 22 music pieces were initially selected and paired as a 'familiar music piece' and 'unfamiliar music piece' based on familiarities amongst the general public in the categories of classical music (6), Chinese folk music (8), and pop music (8). Following the selection criteria, one pair of music pieces from each music category was selected and used for the MQRT in Experiment 2. In Experiment 2, the MQRT was validated using these music pieces in the categories 'Pleasantness', 'Naturalness', 'Fullness', 'Roughness', and 'Sharpness'. Seventy-two adult participants and 30 normal-hearing listeners were recruited in Experiments 1 and 2, respectively. Significant differences between the familiar and unfamiliar music pieces were found in respect of pleasantness rating for folk and pop music pieces as well as in sharpness rating for pop music pieces. The comparison of music category effect on MQRT found significant differences in pleasantness, fullness, and sharpness ratings. The Chinese MQRT developed in the present study is an effective tool for assessing music quality.

  3. Development and validation of the crew-station system-integration research facility

    NASA Technical Reports Server (NTRS)

    Nedell, B.; Hardy, G.; Lichtenstein, T.; Leong, G.; Thompson, D.

    1986-01-01

    The various issues associated with the use of integrated flight management systems in aircraft were discussed. To address these issues a fixed base integrated flight research (IFR) simulation of a helicopter was developed to support experiments that contribute to the understanding of design criteria for rotorcraft cockpits incorporating advanced integrated flight management systems. A validation experiment was conducted that demonstrates the main features of the facility and the capability to conduct crew/system integration research.

  4. Extension and Validation of a Hybrid Particle-Finite Element Method for Hypervelocity Impact Simulation. Chapter 2

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.; Shivarama, Ravishankar

    2004-01-01

    The hybrid particle-finite element method of Fahrenthold and Horban, developed for the simulation of hypervelocity impact problems, has been extended to include new formulations of the particle-element kinematics, additional constitutive models, and an improved numerical implementation. The extended formulation has been validated in three dimensional simulations of published impact experiments. The test cases demonstrate good agreement with experiment, good parallel speedup, and numerical convergence of the simulation results.

  5. The Canadian Experiment for Freeze/Thaw in 2012 or 2013 CanEx-FT12 or FT13

    NASA Technical Reports Server (NTRS)

    Belair, Stephane; Bernier, Monique; Colliander, Andreas; Jackson, Thomas; McDonald, Kyle; Walker, Anne

    2011-01-01

    General objectives of the experiment are: pre-launch calibration/validation of Soil Moisture Active-Passive (SMAP) Freeze/Thaw products and retrieval algorithms, and rehearsal for SMAP post-launch validation. The basis of the radar freeze-thaw measurement is the large shift in dielectric constant and backscatter (dB) between predominantly frozen and thawed conditions. The dielectric constant of liquid water varies with frequency, whereas that of pure ice is constant.

  6. Rover-based visual target tracking validation and mission infusion

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Steele, Robert D.; Ansar, Adnan I.; Ali, Khaled; Nesnas, Issa

    2005-01-01

    The Mars Exploration Rovers (MER'03), Spirit and Opportunity, represent the state of the art in rover operations on Mars. This paper presents validation experiments of different visual tracking algorithms using the rover's navigation camera.

  7. Validation of a wireless modular monitoring system for structures

    NASA Astrophysics Data System (ADS)

    Lynch, Jerome P.; Law, Kincho H.; Kiremidjian, Anne S.; Carryer, John E.; Kenny, Thomas W.; Partridge, Aaron; Sundararajan, Arvind

    2002-06-01

    A wireless sensing unit for use in a Wireless Modular Monitoring System (WiMMS) has been designed and constructed. Drawing upon advanced technological developments in the areas of wireless communications, low-power microprocessors and micro-electro mechanical system (MEMS) sensing transducers, the wireless sensing unit represents a high-performance yet low-cost solution to monitoring the short-term and long-term performance of structures. A sophisticated reduced instruction set computer (RISC) microcontroller is placed at the core of the unit to accommodate on-board computations, measurement filtering and data interrogation algorithms. The functionality of the wireless sensing unit is validated through various experiments involving multiple sensing transducers interfaced to the sensing unit. In particular, MEMS-based accelerometers are used as the primary sensing transducer in this study's validation experiments. A five degree of freedom scaled test structure mounted upon a shaking table is employed for system validation.

  8. Analysis procedures and subjective flight results of a simulator validation and cue fidelity experiment

    NASA Technical Reports Server (NTRS)

    Carr, Peter C.; Mckissick, Burnell T.

    1988-01-01

    A joint experiment to investigate simulator validation and cue fidelity was conducted by the Dryden Flight Research Facility of NASA Ames Research Center (Ames-Dryden) and NASA Langley Research Center. The primary objective was to validate the use of a closed-loop pilot-vehicle mathematical model as an analytical tool for optimizing the tradeoff between simulator fidelity requirements and simulator cost. The validation process includes comparing model predictions with simulation and flight test results to evaluate various hypotheses for differences in motion and visual cues and information transfer. A group of five pilots flew air-to-air tracking maneuvers in the Langley differential maneuvering simulator and visual motion simulator and in an F-14 aircraft at Ames-Dryden. The simulators used motion and visual cueing devices including a g-seat, a helmet loader, wide field-of-view horizon, and a motion base platform.

  9. The New Millennium Program: Validating Advanced Technologies for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Minning, Charles P.; Luers, Philip

    1999-01-01

    This presentation reviews the activities of the New Millennium Program (NMP) in validating advanced technologies for space missions. The focus of these breakthrough technologies is to enable new capabilities to fulfill the science needs, while reducing costs of future missions. There is a broad spectrum of NMP partners, including government agencies, universities and private industry. DS-1 was launched on October 24, 1998. Amongst the technologies validated by the NMP on DS-1 are: a Low Power Electronics Experiment, the Power Activation and Switching Module, and Multi-Functional Structures. The first two of these technologies are operational and the data analysis is still ongoing. The third program is also operational, and its performance parameters have been verified. The second program, DS-2, was launched on January 3, 1999. It is expected to impact near Mars' southern polar region on 3 December 1999. The technologies used on this mission awaiting validation are an advanced microcontroller, a power microelectronics unit, an evolved water experiment and soil thermal conductivity experiment, Lithium-Thionyl Chloride batteries, the flexible cable interconnect, the aeroshell/entry system, and a compact telecom system. EO-1, on schedule for launch in December 1999, carries several technologies to be validated. Amongst these are: a Carbon-Carbon Radiator, an X-band Phased Array Antenna, a pulsed plasma thruster, a wideband advanced recorder processor, an atmospheric corrector, lightweight flexible solar arrays, the Advanced Land Imager and the Hyperion instrument.

  10. Evaluation of Fission Product Critical Experiments and Associated Biases for Burnup Credit Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Don; Rearden, Bradley T; Reed, Davis Allan

    2010-01-01

    One of the challenges associated with implementation of burnup credit is the validation of criticality calculations used in the safety evaluation; in particular the availability and use of applicable critical experiment data. The purpose of the validation is to quantify the relationship between reality and calculated results. Validation and determination of bias and bias uncertainty require the identification of sets of critical experiments that are similar to the criticality safety models. A principal challenge for crediting fission products (FP) in a burnup credit safety evaluation is the limited availability of relevant FP critical experiments for bias and bias uncertainty determination. This paper provides an evaluation of the available critical experiments that include FPs, along with bounding, burnup-dependent estimates of FP biases generated by combining energy-dependent sensitivity data for a typical burnup credit application with the nuclear data uncertainty information distributed with SCALE 6. A method for determining separate bias and bias uncertainty values for individual FPs and illustrative results is presented. Finally, a FP bias calculation method based on data adjustment techniques and reactivity sensitivity coefficients calculated with the SCALE sensitivity/uncertainty tools and some typical results is presented. Using the methods described in this paper, the cross-section bias for a representative high-capacity spent fuel cask associated with the ENDF/B-VII nuclear data for the 16 most important stable or near-stable FPs is predicted to be no greater than 2% of the total worth of the 16 FPs, or less than 0.13% Δk/k.

  11. Probing eukaryotic cell mechanics via mesoscopic simulations

    NASA Astrophysics Data System (ADS)

    Pivkin, Igor V.; Lykov, Kirill; Nematbakhsh, Yasaman; Shang, Menglin; Lim, Chwee Teck

    2017-11-01

    We developed a new mesoscopic particle-based eukaryotic cell model which takes into account the cell membrane, cytoskeleton and nucleus. Breast epithelial cells were used in our studies. To estimate the viscoelastic properties of cells and to calibrate the computational model, we performed micropipette aspiration experiments. The model was then validated using data from microfluidic experiments. Using the validated model, we probed contributions of sub-cellular components to whole-cell mechanics in micropipette aspiration and microfluidics experiments. We believe that the new model will allow us to study in silico numerous problems in the context of cell biomechanics in flows in complex domains, such as capillary networks and microfluidic devices.

  12. The Synthetic Experiment: E. B. Titchener's Cornell Psychological Laboratory and the Test of Introspective Analysis.

    PubMed

    Evans, Rand B

    2017-01-01

    Beginning in the early 1900s, a major thread of research was added to E. B. Titchener's Cornell laboratory: the synthetic experiment. Titchener and his graduate students used introspective analysis to reduce a perception, a complex experience, to its simple sensory constituents. To test the validity of that analysis, stimulus patterns were selected to reproduce the patterns of sensations found in the introspective analyses. If the original perception could be reconstructed in this way, the analysis was considered validated. This article reviews the development of the synthetic method in E. B. Titchener's laboratory at Cornell University and examines its impact on psychological research.

  13. The Validity and Precision of the Comparative Interrupted Time-Series Design: Three Within-Study Comparisons

    ERIC Educational Resources Information Center

    St. Clair, Travis; Hallberg, Kelly; Cook, Thomas D.

    2016-01-01

    We explore the conditions under which short, comparative interrupted time-series (CITS) designs represent valid alternatives to randomized experiments in educational evaluations. To do so, we conduct three within-study comparisons, each of which uses a unique data set to test the validity of the CITS design by comparing its causal estimates to…

  14. The Validity and Reliability of Rhythm Measurements in Automatically Scoring the English Rhythm Proficiency of Chinese EFL Learners

    ERIC Educational Resources Information Center

    Chen, Jin; Lin, Jianghao; Li, Xinguang

    2015-01-01

    This article aims to find out the validity of rhythm measurements to capture the rhythmic features of Chinese English. Besides, the reliability of the valid rhythm measurements applied in automatically scoring the English rhythm proficiency of Chinese EFL learners is also explored. Thus, two experiments were carried out. First, thirty students of…

  15. Freshwater fluxes into the subpolar North Atlantic from secular trends in Arctic land ice mass balance

    NASA Astrophysics Data System (ADS)

    Bamber, J. L.; Enderlin, E. M.; Howat, I. M.; Wouters, B.; van den Broeke, M.

    2015-12-01

    Freshwater fluxes (FWF) from river runoff and precipitation minus evaporation for the pan Arctic seas are relatively well documented and prescribed in ocean GCMs. Fluxes from Greenland and Arctic glaciers and ice caps on the other hand are generally ignored, despite their potential impacts on ocean circulation and marine biology and growing evidence for changes to the hydrography of parts of the subpolar North Atlantic. In a previous study we determined the FWF from Greenland for the period 1958-2010 using a combination of observations and regional climate modeling. Here, we update the analysis with data from new satellite observations to extend the record both in space and time. The new FWF estimates cover the period 1958-2014 and include the Canadian, Russian and Norwegian Arctic (Svalbard) in addition to the contributions from Greenland. We combine satellite altimetry (including CryoSat 2) with grounding line flux data, regional climate modeling of surface mass balance and gravimetry to produce consistent estimates of solid ice and liquid FWF into the Arctic and North Atlantic Oceans. The total cumulative FWF anomaly from land ice mass loss started to increase significantly in the mid 1990s and now exceeds 5000 km^3, a value that is about half of the Great Salinity Anomaly of the 1970s. The majority of the anomaly is entering two key areas of deep water overturning in the Labrador and Irminger Seas, at a rate that has been increasing steadily over the last ~20 years. Since the mid 2000s, however, the Canadian Arctic archipelago has been making a significant contribution to the FW anomaly entering Baffin Bay. Tracer experiments with eddy-permitting ocean GCMs suggest that the FW input from southern Greenland and the Canadian Arctic should accumulate in Baffin Bay with the potential to affect geostrophic circulation, stratification in the region and possibly the strength of the Atlantic Meridional Overturning Circulation. We also examine the trajectory of freshwater input in the form of icebergs, which does not have the same fate as liquid input.

  16. Operational support to collision avoidance activities by ESA's space debris office

    NASA Astrophysics Data System (ADS)

    Braun, V.; Flohrer, T.; Krag, H.; Merz, K.; Lemmens, S.; Bastida Virgili, B.; Funke, Q.

    2016-09-01

    The European Space Agency's (ESA) Space Debris Office provides a service to support operational collision avoidance activities. This support currently covers ESA's missions Cryosat-2, Sentinel-1A and -2A, the constellation of Swarm-A/B/C in low-Earth orbit (LEO), as well as missions of third-party customers. In this work, we describe the current collision avoidance process for ESA and third-party missions in LEO. We give an overview on the upgrades developed and implemented since the advent of conjunction summary messages (CSM)/conjunction data messages (CDM), addressing conjunction event detection, collision risk assessment, orbit determination, orbit and covariance propagation, process control, and data handling. We pay special attention to the effect of warning thresholds on the risk reduction and manoeuvre rates, as they are established through risk mitigation and analysis tools, such as ESA's Debris Risk Assessment and Mitigation Analysis (DRAMA) software suite. To handle the large number of CDMs and the associated risk analyses, a database-centric approach has been developed. All CDMs and risk analysis results are stored in a database. In this way, a temporary local "mini-catalogue" of objects close to our target spacecraft is obtained, which can be used, e.g., for manoeuvre screening and to update the risk analysis whenever a new ephemeris becomes available from the flight dynamics team. The database is also used as the backbone for a Web-based tool, which consists of the visualization component and a collaboration tool that facilitates the status monitoring and task allocation within the support team as well as communication with the control team. The visualization component further supports the information sharing by displaying target and chaser motion over time along with the involved uncertainties. The Web-based solution optimally meets the needs for a concise and easy-to-use way to obtain a situation picture in a very short time, and the support for third-party missions not operated from the European Space Operations Centre (ESOC). Finally, we provide statistics on the identified conjunction events, taking into account the known significant changes in the LEO orbital environment and share ESA's experience along with recent examples.
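
    As a highly simplified illustration of the database-centric screening described above (the schema, threshold and sample values are invented; the operational process uses the full CDM content and ESA's risk tools), consider:

      import sqlite3

      # Minimal illustrative schema: one row per conjunction data message (CDM) received.
      con = sqlite3.connect(":memory:")
      con.execute("""CREATE TABLE cdm (event_id TEXT, tca TEXT, miss_distance_m REAL,
                                       collision_probability REAL, received TEXT)""")
      con.executemany("INSERT INTO cdm VALUES (?,?,?,?,?)", [
          ("EVT-0042", "2016-07-01T03:12:00", 410.0, 3.2e-5, "2016-06-28T09:00:00"),
          ("EVT-0042", "2016-07-01T03:12:00", 350.0, 1.4e-4, "2016-06-29T21:00:00"),
      ])

      REACTION_THRESHOLD = 1e-4   # illustrative threshold for escalating an event

      # SQLite returns the bare columns from the row holding MAX(received),
      # i.e. the latest CDM per event, which is then compared against the threshold.
      rows = con.execute("""SELECT event_id, MAX(received), collision_probability
                            FROM cdm GROUP BY event_id""").fetchall()
      for event_id, latest, pc in rows:
          action = "escalate for manoeuvre analysis" if pc >= REACTION_THRESHOLD else "monitor"
          print(event_id, latest, pc, "->", action)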

  17. Altimetry, gravimetry, GPS and viscoelastic modeling data for the joint inversion for glacial isostatic adjustment in Antarctica (ESA STSE Project REGINA)

    NASA Astrophysics Data System (ADS)

    Sasgen, Ingo; Martín-Español, Alba; Horvath, Alexander; Klemann, Volker; Petrie, Elizabeth J.; Wouters, Bert; Horwath, Martin; Pail, Roland; Bamber, Jonathan L.; Clarke, Peter J.; Konrad, Hannes; Wilson, Terry; Drinkwater, Mark R.

    2018-03-01

    The poorly known correction for the ongoing deformation of the solid Earth caused by glacial isostatic adjustment (GIA) is a major uncertainty in determining the mass balance of the Antarctic ice sheet from measurements of satellite gravimetry and, to a lesser extent, satellite altimetry. In the past decade, much progress has been made in consistently modeling ice sheet and solid Earth interactions; however, forward-modeling solutions of GIA in Antarctica remain uncertain due to the sparsity of constraints on the ice sheet evolution, as well as the Earth's rheological properties. An alternative approach towards estimating GIA is the joint inversion of multiple satellite data - namely, satellite gravimetry, satellite altimetry and GPS, which reflect, with different sensitivities, trends in recent glacial changes and GIA. Crucial to the success of this approach is the accuracy of the space-geodetic data sets. Here, we present reprocessed rates of surface-ice elevation change (Envisat/Ice, Cloud, and land Elevation Satellite, ICESat; 2003-2009), gravity field change (Gravity Recovery and Climate Experiment, GRACE; 2003-2009) and bedrock uplift (GPS; 1995-2013). The data analysis is complemented by the forward modeling of viscoelastic response functions to disc load forcing, allowing us to relate GIA-induced surface displacements with gravity changes for different rheological parameters of the solid Earth. The data and modeling results presented here are available in the PANGAEA database (https://doi.org/10.1594/PANGAEA.875745). The data sets are the input streams for the joint inversion estimate of present-day ice-mass change and GIA, focusing on Antarctica. However, the methods, code and data provided in this paper can be used to solve other problems, such as volume balances of the Antarctic ice sheet, or can be applied to other geographical regions in the case of the viscoelastic response functions. This paper presents the first of two contributions summarizing the work carried out within a European Space Agency-funded study: Regional glacial isostatic adjustment and CryoSat elevation rate corrections in Antarctica (REGINA).

  18. Methodology and issues of integral experiments selection for nuclear data validation

    NASA Astrophysics Data System (ADS)

    Ivanova, Tatiana; Ivanov, Evgeny; Hill, Ian

    2017-09-01

    Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications [1]. Benchmarks are often taken from international handbooks [2, 3]. Depending on the application, IEs have different degrees of usefulness in validation, and the use of a single benchmark is usually not advised; indeed, it may lead to erroneous interpretations and results [1]. This work aims at quantifying the importance of the benchmarks used in application-dependent cross-section validation. The approach is based on the well-known Generalized Linear Least-Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections within a given energy interval. The statistical treatment results in a vector of weighting factors for the integral benchmarks; these factors characterize the value added by each benchmark to nuclear data validation for the given application. The methodology is illustrated with a single example: selecting benchmarks for 239Pu cross-section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files), established at the Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).
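
    As a rough illustration of how such weighting factors can emerge from a GLLS-type adjustment, the sketch below computes a gain matrix from a sensitivity matrix and the prior and experimental covariances, then normalizes the influence of each benchmark on one target cross-section group. This weight definition is one plausible choice for illustration and is not the exact formulation adopted by Subgroup 39.

```python
import numpy as np

def glls_benchmark_weights(S, M, V, target):
    """Sketch of a GLLS-type adjustment.

    S      : (n_bench, n_xs) sensitivity matrix (dC/C per dsigma/sigma)
    M      : (n_xs, n_xs)    prior cross-section covariance
    V      : (n_bench, n_bench) experimental + computational covariance
    target : index of the cross-section/energy group of interest
    Returns the gain matrix and a normalized weight per benchmark for
    the chosen target group (an illustrative weight definition).
    """
    # The gain matrix K maps benchmark discrepancies d = E - C onto
    # cross-section adjustments: delta_sigma = K @ d
    K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)
    # One plausible weighting: the absolute influence of each benchmark
    # on the target group, normalized to sum to one.
    influence = np.abs(K[target, :])
    return K, influence / influence.sum()
```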

  19. Semi-structured Interview Measure of Stigma (SIMS) in psychosis: Assessment of psychometric properties.

    PubMed

    Wood, Lisa; Burke, Eilish; Byrne, Rory; Enache, Gabriela; Morrison, Anthony P

    2016-10-01

    Stigma is a significant difficulty for people who experience psychosis. To date, no outcome measures have been developed to examine stigma exclusively in people with psychosis. The aim of this study was to develop and validate a semi-structured interview measure of stigma (SIMS) in psychosis. The SIMS is an eleven-item measure of stigma developed in consultation with service users who have experienced psychosis. 79 participants with experience of psychosis were recruited for the purposes of this study. They were administered the SIMS alongside a battery of other relevant outcome measures to examine reliability and validity. A one-factor solution was identified for the SIMS, encompassing all ten rateable items. The measure met all reliability and validity criteria and demonstrated good internal consistency, inter-rater reliability, test-retest reliability, criterion validity, construct validity and sensitivity to change, with no floor or ceiling effects. The SIMS is a reliable and valid measure of stigma in psychosis. It may be more engaging and acceptable than other stigma measures due to its semi-structured interview format. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  20. [Perception scales of validated food insecurity: the experience of the countries in Latin America and the Caribbean].

    PubMed

    Sperandio, Naiara; Morais, Dayane de Castro; Priore, Silvia Eloiza

    2018-02-01

    The scope of this systematic review was to compare the food insecurity scales validated and used in the countries of Latin America and the Caribbean, and to analyze the methods used in the validation studies. A search was conducted in the Lilacs, SciELO and Medline electronic databases. Publications were pre-selected by title and abstract, and subsequently by full-text reading. Of the 16,325 studies reviewed, 14 were selected. Twelve validated scales were identified, for the following countries: Venezuela, Brazil, Colombia, Bolivia, Ecuador, Costa Rica, Mexico, Haiti, the Dominican Republic, Argentina and Guatemala. In addition, there is the Latin American and Caribbean scale, whose scope is regional. The scales differed in the standard reference used, the number of questions and the diagnosis of insecurity. The methods used for internal validation were the calculation of Cronbach's alpha and the Rasch model; for external validation, the authors calculated associations and/or correlations with socioeconomic and food consumption variables. The successful experience of Latin America and the Caribbean in the development of national and regional scales can serve as an example for other countries that do not yet have this important indicator for measuring food insecurity.

  1. Replicating the Z iron opacity experiments on the NIF

    NASA Astrophysics Data System (ADS)

    Perry, T. S.; Heeter, R. F.; Opachich, Y. P.; Ross, P. W.; Kline, J. L.; Flippo, K. A.; Sherrill, M. E.; Dodd, E. S.; DeVolder, B. G.; Cardenas, T.; Archuleta, T. N.; Craxton, R. S.; Zhang, R.; McKenty, P. W.; Garcia, E. M.; Huffman, E. J.; King, J. A.; Ahmed, M. F.; Emig, J. A.; Ayers, S. L.; Barrios, M. A.; May, M. J.; Schneider, M. B.; Liedahl, D. A.; Wilson, B. G.; Urbatsch, T. J.; Iglesias, C. A.; Bailey, J. E.; Rochau, G. A.

    2017-06-01

    X-ray opacity is a crucial factor in all radiation-hydrodynamics calculations, yet it is one of the least validated material properties in the simulation codes. Recent opacity experiments at the Sandia Z-machine have shown discrepancies between theory and experiment of up to a factor of two, casting doubt on the validity of the opacity models. Therefore, a new experimental opacity platform is being developed on the National Ignition Facility (NIF), not only to verify the Z-machine experimental results but also to extend the experiments to other temperatures and densities. The first experiments will be directed towards measuring the opacity of iron at a temperature of ∼160 eV and an electron density of ∼7 × 10²¹ cm⁻³. Preliminary experiments on NIF have demonstrated the ability to create a sufficiently bright point backlighter using an imploding plastic capsule, as well as a hohlraum that can heat the opacity sample to the desired conditions. The first of these iron opacity experiments is expected to be performed in 2017.

  2. Inner experience in the scanner: can high fidelity apprehensions of inner experience be integrated with fMRI?

    PubMed Central

    Kühn, Simone; Fernyhough, Charles; Alderson-Day, Benjamin; Hurlburt, Russell T.

    2014-01-01

    To provide full accounts of human experience and behavior, research in cognitive neuroscience must be linked to inner experience, but introspective reports of inner experience have often been found to be unreliable. The present case study aimed at providing proof of principle that introspection using one method, descriptive experience sampling (DES), can be reliably integrated with fMRI. A participant was trained in the DES method, followed by nine sessions of sampling within an MRI scanner. During moments when the DES interview revealed ongoing inner speaking, fMRI data reliably showed activation in classic speech-processing areas, including the left inferior frontal gyrus. Further, the fMRI data validated the participant's DES observations of the experiential distinction between inner speaking and innerly hearing her own voice. These results highlight the precision and validity of the DES method as a technique for exploring inner experience and the utility of combining such methods with fMRI. PMID:25538649

  3. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology in simulation and modelling. Traditional model validation methods suffer from weak completeness, operate at a single scale, and depend on human experience. A multiple-scale validation method based on SDG (Signed Directed Graph) models and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. Multiple-scale validation is carried out by comparing these testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.
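
    The "positive inference" step can be pictured with a small sketch that propagates a signed deviation through a signed directed graph to obtain qualitative trends for downstream variables. The propagation rule below (sign multiplication along edges, first-reached sign kept) is a common simplification assumed here; the paper's exact inference and conflict-resolution rules are not reproduced.

```python
from collections import deque

def propagate(edges, source, deviation):
    """Qualitative (signed) propagation over a signed directed graph.

    edges     : dict mapping node -> list of (successor, sign), sign in {+1, -1}
    source    : node where a deviation is introduced
    deviation : +1 (increase) or -1 (decrease)
    Returns a dict of qualitative trends for nodes reachable from the source.
    """
    trends = {source: deviation}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nxt, sign in edges.get(node, []):
            if nxt not in trends:          # first-reached sign wins (simplification)
                trends[nxt] = trends[node] * sign
                queue.append(nxt)
    return trends

# Example: a fault raising the feed flow (+1) propagates through the graph.
graph = {"feed_flow": [("level", +1)], "level": [("outflow", +1), ("pressure", +1)]}
print(propagate(graph, "feed_flow", +1))
```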

  4. Validation of the Neonatal Satisfaction Survey (NSS-8) in six Norwegian neonatal intensive care units: a quantitative cross-sectional study.

    PubMed

    Hagen, Inger Hilde; Svindseth, Marit Følsvik; Nesset, Erik; Orner, Roderick; Iversen, Valentina Cabral

    2018-03-27

    For parents, the experience of having a new-born admitted to a neonatal intensive care unit (NICU) can be extremely distressing, and the subsequent risk of post-incident adjustment difficulties is increased for parents, siblings and affected families. Patient and next-of-kin satisfaction surveys provide key indicators of quality in health care, but methodically constructed and validated survey tools are in short supply, and parents' experiences of care in NICUs are under-researched. This paper reports a validation of the Neonatal Satisfaction Survey (NSS-8) in six Norwegian NICUs. Parents' survey returns were collected using the Neonatal Satisfaction Survey (NSS-13). Data quality and psychometric properties were systematically assessed using exploratory factor analysis and tests of internal consistency, reliability, construct validity, and convergent and discriminant validity. Each set of hospital returns was subjected to an attrition analysis before an overall satisfaction rate was calculated. The survey sample of 568 parents represents 45% of the total eligible population for the period of the study. Missing data accounted for 1.1% of all returns. The attrition analysis shows congruence between the sample and the total population. Exploratory factor analysis identified eight factors of concern to parents: "Care and Treatment", "Doctors", "Visits", "Information", "Facilities", "Parents' Anxiety", "Discharge" and "Sibling Visits". All factors showed satisfactory internal consistency and good reliability (Cronbach's alpha ranged from 0.70 to 0.94; for the whole 51-item scale, α = 0.95). Convergent validity, assessed using Spearman's rank correlations between the eight factors and a question measuring overall satisfaction, was significant for all factors, and discriminant validity was established for all factors. Overall satisfaction rates ranged from 86 to 90%, while satisfaction on each of the eight factors varied between 64 and 86%. The NSS-8 questionnaire is a valid and reliable scale for measuring parents' assessment of quality of care in the NICU. Statistical analysis confirms the instrument's capacity to gauge parents' experiences of the NICU. Further research is indicated to validate the survey questionnaire in other Nordic countries and beyond.
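
    Cronbach's alpha, the internal-consistency statistic reported above for each factor and for the full 51-item scale, can be computed directly from an item-by-respondent score matrix. The sketch below assumes complete cases and numeric item scores; variable names are illustrative.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    A minimal sketch of the internal-consistency statistic; 'items' is
    assumed to be a complete-case numeric array (no missing responses).
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```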

  5. Longitudinal Measurement Equivalence of Subjective Language Brokering Experiences Scale in Mexican American Adolescents

    PubMed Central

    Kim, Su Yeong; Hou, Yang; Shen, Yishan; Zhang, Minyu

    2016-01-01

    Objectives Language brokering occurs frequently in immigrant families and can have significant implications for the well-being of the family members involved. The present study aimed to develop and validate a measure that can be used to assess multiple dimensions of subjective language brokering experiences among Mexican American adolescents. Methods Participants were 557 adolescent language brokers (54.2% female; mean age at Wave 1 = 12.96 years, SD = 0.94) in Mexican American families. Results Using exploratory and confirmatory factor analyses, we were able to identify seven reliable subscales of language brokering: linguistic benefits, socio-emotional benefits, efficacy, positive parent-child relationships, parental dependence, negative feelings, and centrality. Tests of factorial invariance show that these subscales demonstrate, at minimum, partial strict invariance across time and across experiences of translating for mothers and fathers, and in most cases also across adolescent gender, nativity, and translation frequency. Thus, in general, the means of the subscales and the relations of the subscales with other variables can be compared across these different occasions and groups. Tests of criterion-related validity demonstrated that these subscales correlated, concurrently and longitudinally, with parental warmth and hostility, parent-child alienation, adolescent family obligation, depressive symptoms, resilience, and life meaning. Conclusions This reliable and valid subjective language brokering experiences scale will be helpful for gaining a better understanding of adolescents' language brokering experiences with their mothers and fathers, and how such experiences may influence their development. PMID:27362872

  6. Reliability, validity and sensitivity of a computerized visual analog scale measuring state anxiety.

    PubMed

    Abend, Rany; Dan, Orrie; Maoz, Keren; Raz, Sivan; Bar-Haim, Yair

    2014-12-01

    Assessment of state anxiety is frequently required in clinical and research settings, but its measurement using standard multi-item inventories entails practical challenges. Such inventories are increasingly complemented by paper-and-pencil, single-item visual analog scales measuring state anxiety (VAS-A), which allow rapid assessment of current anxiety states. Computerized versions of the VAS-A offer additional advantages, including facilitated and accurate data collection and analysis, and applicability to computer-based protocols. Here, we establish the psychometric properties of a computerized VAS-A. Experiment 1 assessed the reliability, convergent validity, and discriminant validity of the computerized VAS-A in a non-selected sample. Experiment 2 assessed its sensitivity to increases in state anxiety following social stress induction in participants with high levels of social anxiety. Experiment 1 demonstrated the computerized VAS-A's test-retest reliability (r = .44, p < .001); convergent validity with the State-Trait Anxiety Inventory's state subscale (STAI-State; r = .60, p < .001); and discriminant validity, as indicated by significantly lower correlations between the VAS-A and other psychological measures relative to the correlation between the VAS-A and the STAI-State. Experiment 2 demonstrated the VAS-A's sensitivity to changes in state anxiety via a significant pre- to during-stressor rise in VAS-A scores (F(1,48) = 25.13, p < .001). Limitations include the set-order administration of measures, the absence of a clinically anxious population, and gender-unbalanced samples. The adequate psychometric characteristics, combined with simple and rapid administration, make the computerized VAS-A a valuable self-rating tool for state anxiety. It may prove particularly useful in clinical and research settings where multi-item inventories are less applicable, including computer-based treatment and assessment protocols. The VAS-A is freely available: http://people.socsci.tau.ac.il/mu/anxietytrauma/visual-analog-scale/. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Radiative transfer model validations during the First ISLSCP Field Experiment

    NASA Technical Reports Server (NTRS)

    Frouin, Robert; Breon, Francois-Marie; Gautier, Catherine

    1990-01-01

    Two simple radiative transfer models, the 5S model based on Tanre et al. (1985, 1986) and the wide-band model of Morcrette (1984), are validated by comparing their outputs with concomitant radiosonde, aerosol turbidity, and radiation measurements and sky photographs obtained during the First ISLSCP Field Experiment. Results showed that the 5S model overestimated the short-wave irradiance by 13.2 W/sq m, whereas the Morcrette model underestimated the long-wave irradiance by 7.4 W/sq m.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchat, Thomas K.; Jernigan, Dann A.

    This report outlines a set of experiments and test data that provide radiation intensity data for the validation of models of the radiative transfer equation. The experiments were performed with lightly sooting liquid hydrocarbon fuels that yielded fully turbulent fires (2 m diameter). In addition, supplemental measurements of air flow and temperature, fuel temperature and burn rate, flame surface emissive power, wall heat, and flame height and width provide a complete set of boundary condition data needed for the validation of models used in fire simulations.

  9. Assessing positive and negative experiences: validation of a new measure of well-being in an Italian population.

    PubMed

    Corno, Giulia; Molinari, Guadalupe; Baños, Rosa Maria

    2016-01-01

    The aim of this study is to explore the psychometric properties of an affect scale, the Scale of Positive and Negative Experience (SPANE), in an Italian-speaking population. The results demonstrate that the Italian version of the SPANE has psychometric properties similar to those of the original and previous versions, with satisfactory reliability and factorial validity. The results of the confirmatory factor analysis support the expected two-factor structure, positive and negative feelings, which characterized the previous versions. As expected, measures of negative affect, anxiety, negative future expectancies, and depression correlated positively with the negative-experiences SPANE subscale and negatively with the positive-experiences SPANE subscale. The instrument provides clinically useful information about a person's overall emotional experience and serves as an indicator of well-being. Although further studies are required to confirm the psychometric characteristics of the scale, the Italian version of the SPANE is expected to improve theoretical and empirical research on the well-being of the Italian population.

  10. Work-related subjective experiences among community residents with schizophrenia or schizoaffective disorder.

    PubMed

    Waghorn, Geoff; Chant, David; King, Robert

    2005-04-01

    To anticipate the vocational assistance needs of people with schizophrenia or schizoaffective disorders, this study developed a self-report scale of subjective experiences of illness perceived to impact on employment functioning, as an alternative to a diagnostic perspective. A repeated-measures pilot study (n = 26 at time 1, n = 21 at time 2) of community residents with schizophrenia identified a set of work-related subjective experiences perceived to impact on employment functioning. Items with the best psychometric properties were applied in a 12-month longitudinal survey of urban residents with schizophrenia or schizoaffective disorder (n = 104, 94 and 94 at the three waves). Investigations of construct validity, factor structure, responsiveness, internal consistency, stability, and criterion validity produced favourable results. Work-related subjective experiences provide information about the intersection of the person, the disorder, and expectations of employment functioning, suggesting new opportunities for vocational professionals to explore and discuss individual assistance needs. Further psychometric investigations of test-retest reliability, discriminant and predictive validity, and research applications in supported employment and vocational rehabilitation are recommended. Subject to adequate psychometric properties, the new measure promises to facilitate exploring individuals' specific subjective experiences, how each is perceived to contribute to employment restrictions, and the corresponding implications for specialized treatment, vocational interventions and workplace accommodations.

  11. Design and Validation of an Augmented Reality System for Laparoscopic Surgery in a Real Environment

    PubMed Central

    López-Mir, F.; Naranjo, V.; Fuertes, J. J.; Alcañiz, M.; Bueno, J.; Pareja, E.

    2013-01-01

    Purpose. This work presents the protocol followed in the development and validation of an augmented reality system that was installed in an operating theatre to help surgeons with trocar placement during laparoscopic surgery. The purpose of this validation is to demonstrate the improvements that such a system can provide to the field of medicine, particularly surgery. Method. Two experiments that were noninvasive for both the patient and the surgeon were designed. In one of these experiments the augmented reality system was used; the other served as the control experiment, in which the system was not used. The operation selected for all cases was a cholecystectomy, owing to its low degree of complexity and few complications before, during, and after surgery. The technique used for trocar placement was the French technique, but the results can be extrapolated to any other technique and operation. Results and Conclusion. Four clinicians and ninety-six measurements obtained from twenty-four patients (randomly assigned to each experiment) were involved in these experiments. The final results show improvements in accuracy and variability of 33% and 63%, respectively, in comparison with traditional methods, demonstrating that the use of an augmented reality system offers advantages for trocar placement in laparoscopic surgery. PMID:24236293

  12. A Preliminary Assessment of the SURF Reactive Burn Model Implementation in FLAG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Carl Edward; McCombe, Ryan Patrick; Carver, Kyle

    Properly validated and calibrated reactive burn models (RBMs) can be useful engineering tools for assessing high-explosive performance and safety. Experiments with high explosives are expensive, so inexpensive RBM calculations are increasingly relied on for predictive analysis of performance and safety. This report discusses the validation of Menikoff and Shaw's SURF reactive burn model, which has recently been implemented in the FLAG code. The LANL Gapstick experiment is discussed, as is its utility in reactive burn model validation. Data obtained from pRad for the LT-63 series are also presented, along with FLAG simulations using SURF for both PBX 9501 and PBX 9502. Calibration parameters for both explosives are presented.

  13. Don't believe everything you hear: Routine validation of audiovisual information in children and adults.

    PubMed

    Piest, Benjamin A; Isberner, Maj-Britt; Richter, Tobias

    2018-04-05

    Previous research has shown that the validation of incoming information during language comprehension is a fast, efficient, and routine process (epistemic monitoring); to date, however, this research has focused on epistemic monitoring during reading. The present study extended this work by investigating epistemic monitoring of audiovisual information. In a Stroop-like paradigm, participants (Experiment 1: adults; Experiment 2: 10-year-old children) responded to the probe words "correct" and "false" by keypress after the presentation of auditory assertions that could be either true or false with respect to concurrently presented pictures. The results provide evidence for routine validation of audiovisual information and show a stronger and more stable interference effect for children than for adults.

  14. Large-scale experimental technology with remote sensing in land surface hydrology and meteorology

    NASA Technical Reports Server (NTRS)

    Brutsaert, Wilfried; Schmugge, Thomas J.; Sellers, Piers J.; Hall, Forrest G.

    1988-01-01

    Two field experiments to study atmospheric and land surface processes and their interactions are summarized. The Hydrologic-Atmospheric Pilot Experiment, which tested techniques for measuring evaporation, soil moisture storage, and runoff at scales of about 100 km, was conducted over a 100 X 100 km area in France from mid-1985 to early 1987. The first International Satellite Land Surface Climatology Program field experiment was conducted in 1987 to develop and use relationships between current satellite measurements and hydrologic, climatic, and biophysical variables at the earth's surface and to validate these relationships with ground truth. This experiment also validated surface parameterization methods for simulation models that describe surface processes from the scale of vegetation leaves up to scales appropriate to satellite remote sensing.

  15. Validation of a unique concept for a low-cost, lightweight space-deployable antenna structure

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Bilyeu, G. D.; Veal, G. R.

    1993-01-01

    An experiment conducted in the framework of a NASA In-Space Technology Experiments Program, based on a concept of inflatable deployable structures, is described. The concept utilizes very low inflation pressure to maintain the required geometry on orbit; because gravity-induced deflection of the structure precludes any meaningful ground-based demonstration of functional performance, the experiment is aimed at validating and characterizing the mechanical functional performance of a 14-m-diameter inflatable deployable reflector antenna structure in the orbital operational environment. Results of the experiment are expected to significantly reduce the user risk associated with large space-deployable antennas by demonstrating the functional performance of a concept that meets the criteria for low-cost, lightweight, and highly reliable space-deployable structures.

  16. Flight Experiments for Living With a Star Space Environment Testbed (LWS-SET): Relationship to Technology

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.; Barth, Janet L.; Brewer, Dana A.

    2003-01-01

    This viewgraph presentation provides information on flight validation experiments for technologies intended to determine solar effects. The experiments are designed to demonstrate tolerance to the variable solar environment. The technologies tested are microelectronics, photonics, materials, and sensors.

  17. Validity of clinical color vision tests for air traffic control specialists.

    DOT National Transportation Integrated Search

    1992-10-01

    An experiment on the relationship between aeromedical color vision screening test performance and performance on color-dependent tasks of Air Traffic Control Specialists was replicated to expand the data base supporting the job-related validity of th...

  18. Assessing the empirical validity of the "take-the-best" heuristic as a model of human probabilistic inference.

    PubMed

    Bröder, A

    2000-09-01

    The boundedly rational "Take-The-Best" heuristic (TTB) was proposed by G. Gigerenzer, U. Hoffrage, and H. Kleinbölting (1991) as a model of fast and frugal probabilistic inference. Although this simple lexicographic rule proved successful in computer simulations, direct empirical demonstrations of its adequacy as a psychological model have been lacking because of several methodological problems. This question was addressed in 4 experiments with a total of 210 participants. Whereas Experiment 1 showed that TTB is not valid as a universal hypothesis about probabilistic inferences, up to 28% of participants in Experiment 2 and 53% of participants in Experiment 3 were classified as TTB users. Experiment 4 revealed that investment costs for information seem to be a relevant factor leading participants to switch to a noncompensatory TTB strategy. The observed individual differences in strategy use imply the recommendation of an idiographic approach to decision-making research.
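
    For readers unfamiliar with the heuristic, the sketch below implements the lexicographic TTB rule for a paired comparison: cues are searched in order of decreasing validity, and the first cue that discriminates determines the choice. The treatment of unknown cue values and of validity ties varies across formulations and is an assumption here.

```python
import random

def take_the_best(cues_a, cues_b, validities):
    """Take-The-Best inference for a paired comparison.

    cues_a, cues_b : dicts mapping cue name -> 1 (positive), 0 (negative)
                     or None (unknown) for objects A and B
    validities     : dict mapping cue name -> cue validity in (0.5, 1]
    Returns 'A', 'B', or a random guess if no cue discriminates.
    Ties in validity are broken arbitrarily in this sketch.
    """
    # Search cues in order of decreasing validity (the "best" cue first).
    for cue in sorted(validities, key=validities.get, reverse=True):
        a, b = cues_a.get(cue), cues_b.get(cue)
        if a == 1 and b in (0, None):
            return "A"          # first discriminating cue decides
        if b == 1 and a in (0, None):
            return "B"
    return random.choice(["A", "B"])   # no cue discriminates: guess
```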

  19. Preliminary Results from the GPS-Reflections Mediterranean Balloon Experiment (GPSR-MEBEX)

    NASA Technical Reports Server (NTRS)

    Garrison, James L.; Ruffini, Giulio; Rius, Antonio; Cardellach, Estelle; Masters, Dallas; Armatys, Michael; Zavorotny, Valery; Bauer, Frank H. (Technical Monitor)

    2000-01-01

    An experiment to collect bistatically scattered GPS signals from a balloon at 37 km altitude has been conducted. This experiment represented the highest altitude to date that such signals were successfully recorded. The flight took place in August 1999 over the Mediterranean sea, between a launch in Sicily and recovery near Nerpio, a town in the Sierra de Segura, Albacete province of Huelva, Spain. Results from this experiment are presented, showing the waveform shape as compared to theoretical calculations. These results will be used to validate analytical models which form the basis of wind vector retrieval algorithms. These algorithms are already being validated from aircraft altitudes, but may be applied to data from future spaceborne GPS receivers. Surface wind data from radiosondes were used for comparison. This experiment was a cooperative project between NASA, the IEEC in Barcelona, and the University of Colorado at Boulder.

  20. Preliminary Results from the GPS-Reflections Mediterranean Balloon Experiment (GPSR MEBEX)

    NASA Technical Reports Server (NTRS)

    Garrison, James L.; Ruffini, Giulio; Rius, Antonio; Cardellach, Estelle; Masters, Dallas; Armathys, Michael; Zavorotny, Valery

    2000-01-01

    An experiment to collect bistatically scattered GPS signals from a balloon at 37 km altitude has been conducted. This experiment represented the highest altitude to date that such signals were successfully recorded. The flight took place in August 1999 over the Mediterranean sea, between a launch in Sicily and recovery near Nerpio, a town in the Sierra de Segura, Albacete province of Huelva, Spain. Results from this experiment are presented, showing the waveform shape as compared to theoretical calculations. These results will be used to validate analytical models which form the basis of wind vector retrieval algorithms. These algorithms are already being validated from aircraft altitudes, but may be applied to data from future spaceborne GPS receivers. Surface wind data from radiosondes were used for comparison. This experiment was a cooperative project between NASA, the IEEC in Barcelona, and the University of Colorado at Boulder.

  1. Using Transcripts To Validate Institutional Mission: The Role of the Community College in the Postsecondary Experience of a Generation. ASHE Annual Meeting Paper.

    ERIC Educational Resources Information Center

    Adelman, Clifford

    Information is presented on the use of transcripts to validate institutional mission, proposing that transcript archives can serve as grounds against which the validity of an institution's claimed mission with respect to its primary beneficiaries can be measured. This is done with a focus on the community college. The National Longitudinal Study…

  2. Fast Sampling Gas Chromatography (GC) System for Speciation in a Shock Tube

    DTIC Science & Technology

    2016-10-31

    [Record fragment: report front matter and table-of-contents residue.] The record concerns validation experiments for a fast-sampling gas chromatography (GC) sampling system used for speciation in a shock tube, including ethylene measurements; the surviving summary text notes that results were compared with cold-shock experiments and that both techniques capture similar ethylene decomposition rates for temperature-dependent shock experiments.

  3. Are awareness questionnaires valid? Investigating the use of posttest questionnaires for assessing awareness in implicit memory tests.

    PubMed

    Barnhardt, Terrence M; Geraci, Lisa

    2008-01-01

    Two experiments, one employing a perceptual implicit memory test and the other a conceptual implicit memory test, investigated the validity of posttest questionnaires for determining the incidence of awareness in implicit memory tests. In both experiments, a condition in which none of the studied words could be used as test responses (i.e., the none-studied condition) was compared with a standard implicit test condition. Results showed that reports of awareness on the posttest questionnaire were much less frequent in the none-studied condition than in the standard condition, especially after deep processing at study. In both experiments, 83% of the participants in the none-studied condition stated that they were unaware even though there were strong demands for claiming awareness. Although there was a small bias in the questionnaire (i.e., 17% of the participants in the none-studied condition stated that they were aware), overall there was strong support for the validity of awareness questionnaires.

  4. Digital Fly-By-Wire Flight Control Validation Experience

    NASA Technical Reports Server (NTRS)

    Szalai, K. J.; Jarvis, C. R.; Krier, G. E.; Megna, V. A.; Brock, L. D.; Odonnell, R. N.

    1978-01-01

    The experience gained in digital fly-by-wire technology through a flight test program conducted by the NASA Dryden Flight Research Center in an F-8C aircraft is described. The system requirements are outlined, along with the requirements for flight qualification. The system is described, including the hardware components, the aircraft installation, and the system operation, with emphasis on the flight qualification experience. The qualification process included theoretical validation of the basic design, laboratory testing of the hardware and software elements, systems-level testing, and flight testing. The most productive testing was performed on an iron bird aircraft, which used the actual electronic and hydraulic hardware together with a simulation of the F-8 characteristics to provide the flight environment. The iron bird was used for sensor and system redundancy management testing, failure modes and effects testing, and stress testing, in many cases with the pilot in the loop. The flight test program confirmed the quality of the validation process by achieving 50 flights without a known undetected failure and with no false alarms.

  5. EAQUATE: An International Experiment for Hyper-Spectral Atmospheric Sounding Validation

    NASA Technical Reports Server (NTRS)

    Taylor, J. P.; Smith, W.; Cuomo, V.; Larar, A.; Zhou, D.; Serio, C.; Maestri, T.; Rizzi, R.; Newman, S.; Antonelli, P.; et al.

    2008-01-01

    The international experiment called EAQUATE (European AQUA Thermodynamic Experiment) was held in September 2004 in Italy and the United Kingdom to demonstrate ground-based and airborne systems useful for validating hyperspectral satellite sounding observations. A range of flights over land and marine surfaces was conducted to coincide with overpasses of the AIRS instrument on the EOS Aqua platform. Direct radiance evaluation of AIRS against NAST-I and SHIS has shown excellent agreement. Comparisons of level 2 retrievals of temperature and water vapor from AIRS and NAST-I, validated against high-quality lidar and dropsonde data, show that the 1 K/1 km and 10%/1 km requirements for temperature and water vapor, respectively, are generally being met. The EAQUATE campaign has demonstrated the need for synergistic measurements from a range of observing systems for satellite cal/val and has paved the way for future cal/val activities in support of IASI on the European MetOp platform and CrIS on the US NPP/NPOESS platform.

  6. Development and validation of the FertiMed questionnaire assessing patients' experiences with hormonal fertility medication.

    PubMed

    Lankreijer, K; D'Hooghe, T; Sermeus, W; van Asseldonk, F P M; Repping, S; Dancet, E A F

    2016-08-01

    Can a valid and reliable questionnaire be developed to assess patients' experiences with all of the characteristics of hormonal fertility medication valued by them? The FertiMed questionnaire is a valid and reliable tool that assesses patients' experiences with all medication characteristics valued by them and that can be used for all hormonal fertility medications, irrespective of their route of administration. Hormonal fertility medications cause emotional strain and differ in their dosage regime and route of administration, although they often have comparable effectiveness. Medication experiences of former patients would be informative for medication choices. A recent literature review showed that there is no trustworthy tool to compare patients' experiences of medications with differing routes of administration, regarding all medication characteristics which patients value. The items of the new FertiMed questionnaire were generated by literature review and 23 patient interviews. In 2013, 411 IVF-patients were asked to retrospectively complete the FertiMed questionnaire to assess 1 out of the 8 different medications used for ovarian stimulation, induction of pituitary quiescence, ovulation triggering or luteal support. In total, 276 patients (on average 35 per medication) from 2 university fertility clinics (Belgium, the Netherlands) completed the FertiMed questionnaire (67% response rate). The FertiMed questionnaire questioned whether items were valued by patients and whether these items were experienced while using the assessed medication. Hence, the final outcome 'Experiences with Valued Aspects Scores' (EVAS) combined importance and experience ratings. The content and face validity, reliability, feasibility and discriminative potential of the FertiMed questionnaire were tested and changes were made accordingly. Patient interviews defined 51 items relevant to seven medication characteristics previously proved to be important to patients. Item analysis deleted 10 items. The combined results from the reliability and content validity analysis identified 10 characteristics instead of 7. The final FertiMed questionnaire was valid (Adapted Goodness of Fit Index = 0.95) and all but one characteristic ('ease of use: disturbance') could be assessed reliably (Cronbach's α > 0.60). The EVAS per characteristic differed between the medications inducing pituitary quiescence (P = 0.001). As all eight medications prescribed in the recruiting clinics were questioned, sample sizes per medication were rather small for presenting EVAS per medication and for testing the discriminative potential of the FertiMed questionnaire. The FertiMed questionnaire can be used for all hormonal fertility medications to assess in a valid and reliable way whether patients experience what they value regarding 10 medication characteristics (e.g. side effects and ease of use). Future randomized controlled trials (RCT) comparing medications could include the FertiMed questionnaire as a Patient Reported Experience Measure (PREM). Insights from these RCTs could be used to develop evidence-based decision aids aiming to facilitate shared physician-patient medication choices. Funding was received from the University of Leuven and Amsterdam University Medical Centre. E.A.F.D. holds a postdoctoral fellowship of the Research Foundation of Flanders. T.D. was appointed Vice-President and Head Global Medical Affairs Fertility at Merck (Darmstadt, Germany) on 1 October 2015. 
The reported project was initiated and finished before this date. The other authors had no conflicts of interest to declare. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Instrument Motion Metrics for Laparoscopic Skills Assessment in Virtual Reality and Augmented Reality.

    PubMed

    Fransson, Boel A; Chen, Chi-Ya; Noyes, Julie A; Ragle, Claude A

    2016-11-01

    To determine the construct and concurrent validity of instrument motion metrics for laparoscopic skills assessment in virtual reality and augmented reality simulators. Evaluation study. Veterinary students (novice, n = 14) and veterinarians (experienced, n = 11) with no or variable laparoscopic experience. Participants' minimally invasive surgery (MIS) experience was determined from hospital records of MIS procedures performed in the Teaching Hospital. Basic laparoscopic skills were assessed by 5 tasks using a physical box trainer. Each participant completed 2 tasks for assessment in each type of simulator (virtual reality: bowel handling and cutting; augmented reality: object positioning and a pericardial window model). Motion metrics such as instrument path length, angle or drift, and economy of motion were recorded for each simulator. None of the motion metrics in the virtual reality simulator correlated with experience or with the basic laparoscopic skills score. All metrics in augmented reality (time, instrument path, and economy of movement) were significantly correlated with experience, except for the hand dominance metric. The basic laparoscopic skills score was correlated with all performance metrics in augmented reality. The augmented reality motion metrics differed between American College of Veterinary Surgeons diplomates and residents, whereas the basic laparoscopic skills score and the virtual reality metrics did not. Our results provide construct and concurrent validity for motion analysis metrics in an augmented reality system, whereas the virtual reality system was validated only for the time score. © Copyright 2016 by The American College of Veterinary Surgeons.
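
    The motion metrics named above can be computed from tracked instrument-tip positions. The sketch below shows total path length and one common (assumed) definition of economy of motion; simulator vendors define these metrics in product-specific ways, so the formulas are illustrative only.

```python
import numpy as np

def path_length(tip_xyz):
    """Total instrument path length from an (n_samples, 3) array of
    tracked tip positions (same units as the tracker, e.g. mm)."""
    steps = np.diff(np.asarray(tip_xyz, dtype=float), axis=0)
    return np.linalg.norm(steps, axis=1).sum()

def economy_of_motion(tip_xyz):
    """One common (assumed) definition: ratio of the straight-line
    distance between start and end positions to the travelled path;
    values closer to 1 indicate more economical movement."""
    p = np.asarray(tip_xyz, dtype=float)
    direct = np.linalg.norm(p[-1] - p[0])
    total = path_length(p)
    return direct / total if total > 0 else 0.0
```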

  8. Development and Validation of a Scale Assessing Mental Health Clinicians' Experiences of Associative Stigma.

    PubMed

    Yanos, Philip T; Vayshenker, Beth; DeLuca, Joseph S; O'Connor, Lauren K

    2017-10-01

    Mental health professionals who work with people with serious mental illnesses are believed to experience associative stigma. Evidence suggests that associative stigma could play an important role in the erosion of empathy among professionals; however, no validated measure of the construct currently exists. This study examined the convergent and discriminant validity and factor structure of a new scale assessing the associative stigma experiences of clinicians working with people with serious mental illnesses. A total of 473 clinicians were recruited from professional associations in the United States and participated in an online study. Participants completed the Clinician Associative Stigma Scale (CASS) and measures of burnout, quality of care, expectations about recovery, and self-efficacy. Associative stigma experiences were commonly endorsed; eight items on the 18-item scale were endorsed as being experienced "sometimes" or "often" by over 50% of the sample. The new measure demonstrated a logical four-factor structure: "negative stereotypes about professional effectiveness," "discomfort with disclosure," "negative stereotypes about people with mental illness," and "stereotypes about professionals' mental health." The measure had good internal consistency. It was significantly related to measures of burnout and quality of care, but it was not related to measures of self-efficacy or expectations about recovery. Findings suggest that the CASS is internally consistent and shows evidence of convergent validity and that associative stigma is commonly experienced by mental health professionals who work with people with serious mental illnesses.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Amy B.; Stauffer, Philip H.; Reed, Donald T.

    The primary objective of the experimental effort described here is to aid in understanding the complex nature of liquid, vapor, and solid transport occurring around heated nuclear waste in bedded salt. In order to gain confidence in the predictive capability of numerical models, experimental validation must be performed to ensure that (a) hydrological and physiochemical parameters and (b) processes are correctly simulated. The experiments proposed here are designed to study aspects of the system that have not been satisfactorily quantified in prior work. In addition to exploring the complex coupled physical processes in support of numerical model validation, lessons learned from these experiments will facilitate preparations for larger-scale experiments that may utilize similar instrumentation techniques.

  10. Simulating Small-Scale Experiments of In-Tunnel Airblast Using STUN and ALE3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neuscamman, Stephanie; Glenn, Lewis; Schebler, Gregory

    2011-09-12

    This report details continuing validation efforts for the Sphere and Tunnel (STUN) and ALE3D codes. STUN has previously been validated for blast propagation through tunnels using several sets of experimental data with varying charge sizes and tunnel configurations, including the MARVEL nuclear-driven shock tube experiment (Glenn, 2001). The DHS-funded STUNTool version is compared to experimental data and to the LLNL ALE3D hydrocode. In this particular study, we compare the performance of the STUN and ALE3D codes in modeling an in-tunnel airblast against experimental results obtained by Lunderman and Ohrt (1997) in a series of small-scale high-explosive experiments.

  11. Temperature measurement reliability and validity with thermocouple extension leads or changing lead temperature.

    PubMed

    Jutte, Lisa S; Long, Blaine C; Knight, Kenneth L

    2010-01-01

    Thermocouple leads are often too short, necessitating the use of an extension lead. The purpose of this study was to determine whether temperature measures are influenced by extension-lead use or by changes in lead temperature. Descriptive laboratory study. Laboratory. Experiment 1: 10 IT-21 thermocouples and 5 extension leads. Experiment 2: 5 IT-21 and PT-6 thermocouples. In Experiment 1, temperature data were collected from 10 IT-21 thermocouples in a stable water bath with and without extension leads. In Experiment 2, temperature data were collected from 5 IT-21 and PT-6 thermocouples in a stable water bath before, during, and after ice-pack application to the extension leads. In Experiment 1, extension leads did not influence IT-21 validity (P = .45) or reliability (P = .10). In Experiment 2, post-application IT-21 temperatures were greater than pre-application and application measures (P < .05). Extension leads had no influence on temperature measures, but ice application to the leads may increase measurement error.

  12. A novel cell culture model as a tool for forensic biology experiments and validations.

    PubMed

    Feine, Ilan; Shpitzen, Moshe; Roth, Jonathan; Gafny, Ron

    2016-09-01

    To improve and advance DNA forensic casework investigation outcomes, extensive field and laboratory experiments are carried out in a broad range of relevant branches, such as touch and trace DNA, secondary DNA transfer and contamination confinement. Moreover, the development of new forensic tools, for example new sampling appliances, by commercial companies requires ongoing validation and assessment by forensic scientists. A frequent challenge in these kinds of experiments and validations is the lack of a stable, reproducible and flexible biological reference material. As a possible solution, we present here a cell culture model based on skin-derived human dermal fibroblasts. Cultured cells were harvested, quantified and dried on glass slides. These slides were used in adhesive tape-lifting experiments and tests of DNA crossover confinement by UV irradiation. The use of this model enabled a simple and concise comparison between four adhesive tapes, as well as a straightforward demonstration of the effect of UV irradiation intensities on DNA quantity and degradation. In conclusion, we believe this model has great potential to serve as an efficient research tool in forensic biology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  13. CFD Validation Studies for Hypersonic Flow Prediction

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2001-01-01

    A series of experiments to measure pressure and heating for code validation involving hypersonic, laminar, separated flows was conducted at the Calspan-University at Buffalo Research Center (CUBRC) in the Large Energy National Shock (LENS) tunnel. The experimental data serve as a focus for a code validation session but were not available to the authors until the conclusion of that session. The first set of experiments considered here involves Mach 9.5 and Mach 11.3 N2 flow over a hollow cylinder-flare with a 30 degree flare angle at several Reynolds numbers sustaining laminar, separated flow. Truncated and extended flare configurations are considered. The second set of experiments, at similar conditions, involves flow over a sharp double cone with a fore-cone angle of 25 degrees and an aft-cone angle of 55 degrees. Both sets of experiments involve 30 degree compressions. The location of the separation point in the numerical simulation is extremely sensitive to the level of grid refinement in the numerical predictions. The numerical simulations also show a significant influence of Reynolds number on the extent of separation. Flow unsteadiness was easily introduced into the double cone simulations using aggressive relaxation parameters that normally promote convergence.

  14. CFD Validation Studies for Hypersonic Flow Prediction

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2001-01-01

    A series of experiments to measure pressure and heating for code validation involving hypersonic, laminar, separated flows was conducted at the Calspan-University at Buffalo Research Center (CUBRC) in the Large Energy National Shock (LENS) tunnel. The experimental data serve as a focus for a code validation session but were not available to the authors until the conclusion of that session. The first set of experiments considered here involves Mach 9.5 and Mach 11.3 N2 flow over a hollow cylinder-flare with a 30 deg flare angle at several Reynolds numbers sustaining laminar, separated flow. Truncated and extended flare configurations are considered. The second set of experiments, at similar conditions, involves flow over a sharp double cone with a fore-cone angle of 25 deg and an aft-cone angle of 55 deg. Both sets of experiments involve 30 deg compressions. The location of the separation point in the numerical simulation is extremely sensitive to the level of grid refinement in the numerical predictions. The numerical simulations also show a significant influence of Reynolds number on the extent of separation. Flow unsteadiness was easily introduced into the double cone simulations using aggressive relaxation parameters that normally promote convergence.

  15. Robotic surgery training: construct validity of Global Evaluative Assessment of Robotic Skills (GEARS).

    PubMed

    Sánchez, Renata; Rodríguez, Omaira; Rosciano, José; Vegas, Liumariel; Bond, Verónica; Rojas, Aram; Sanchez-Ismayel, Alexis

    2016-09-01

    The objective of this study was to determine the ability of the GEARS (Global Evaluative Assessment of Robotic Skills) scale to differentiate individuals with different levels of experience in robotic surgery, as a fundamental validation. In this cross-sectional study, three groups of individuals with different levels of experience in robotic surgery (expert, intermediate, novice) were assessed with GEARS by two reviewers. The difference between groups was determined with the Mann-Whitney test, and the consistency between reviewers was studied with Kendall's W coefficient. Agreement between the reviewers of the GEARS scale was 0.96. The score was 29.8 ± 0.4 for experts, 24 ± 2.8 for intermediates and 16 ± 3 for novices, with statistically significant differences between all groups (p < 0.05). All parameters of the scale allowed discrimination between levels of experience, with the exception of the depth perception item. We conclude that the GEARS scale was able to differentiate between individuals with different levels of experience in robotic surgery and is therefore a validated and useful tool for evaluating surgeons in training.

  16. Intoxication-Related AmED (Alcohol Mixed with Energy Drink) Expectancies Scale: Initial Development and Validation

    PubMed Central

    Miller, Kathleen E.; Dermen, Kurt H.; Lucke, Joseph F.

    2017-01-01

    BACKGROUND Young adult use of alcohol mixed with energy drinks (AmEDs) has been linked with elevated risks for a constellation of problem behaviors. These risks may be conditioned by expectancies regarding the effects of caffeine in conjunction with alcohol consumption. The aim of this study was to describe the construction and psychometric evaluation of the Intoxication-Related AmED Expectancies Scale (AmED_EXPI), 15 self-report items measuring beliefs about how the experience of AmED intoxication differs from the experience of noncaffeinated alcohol (NCA) intoxication. METHODS Scale development and testing were conducted using data from a U.S. national sample of 3,105 adolescents and emerging adults aged 13–25. Exploratory and confirmatory factor analyses were conducted to evaluate the factor structure and establish factor invariance across gender, age, and prior experience with AmED use. Cross-sectional and longitudinal analyses examining correlates of AmED use were used to assess construct and predictive validity. RESULTS In confirmatory factor analyses, fit indices for the hypothesized four-factor structure (i.e., Intoxication Management [IM], Alertness [AL], Sociability [SO], and Jitters [JT]) revealed a moderately good fit to the data. Together, these factors accounted for 75.3% of total variance. The factor structure was stable across male/female, teen/young adult, and AmED experience/no experience subgroups. The resultant unit-weighted subscales showed strong internal consistency and satisfactory convergent validity. Baseline scores on the IM, SO, and JT subscales predicted changes in AmED use over a subsequent three-month period. CONCLUSIONS The AmED_EXPI appears to be a reliable and valid tool for measuring expectancies about the effects of caffeine during alcohol intoxication. PMID:28421613

  17. Reactivity loss validation of high burn-up PWR fuels with pile-oscillation experiments in MINERVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leconte, P.; Vaglio-Gaudard, C.; Eschbach, R.

    2012-07-01

    The ALIX experimental program relies on the experimental validation of the spent fuel inventory, by chemical analysis of samples irradiated in a PWR for 5 to 7 cycles, and also on the experimental validation of the spent fuel reactivity loss with burn-up, obtained by pile-oscillation measurements in the MINERVE reactor. These latter experiments provide an overall validation of both the fuel inventory and the nuclear data responsible for the reactivity loss. The program also offers unique experimental data for fuels with a burn-up reaching 85 GWd/t, as spent fuel in French PWRs has not exceeded 70 GWd/t to date. The analysis of these experiments is done in two steps with the APOLLO2/SHEM-MOC/CEA2005v4 package. In the first step, the fuel inventory of each sample is obtained by assembly calculations. The calculation route consists of the self-shielding of cross sections on the 281-energy-group SHEM mesh, followed by a flux calculation with the Method of Characteristics in a 2D-exact heterogeneous geometry of the assembly, and finally a depletion calculation by iterative resolution of the Bateman equations. In the second step, the fuel inventory is used in the analysis of pile-oscillation experiments in which the reactivity of the ALIX spent fuel samples is compared to the reactivity of fresh fuel samples. The comparison between experiment and calculation shows satisfactory results with the JEFF3.1.1 library, which predicts the reactivity loss within 2% for a burn-up of ~75 GWd/t and within 4% for a burn-up of ~85 GWd/t. (authors)

  18. Reliability and Validity of Gaze-Dependent Functional Vision Space: A Novel Metric Quantifying Visual Function in Infantile Nystagmus Syndrome.

    PubMed

    Roberts, Tawna L; Kester, Kristi N; Hertle, Richard W

    2018-04-01

    This study presents test-retest reliability of optotype visual acuity (OVA) across 60° of horizontal gaze position in patients with infantile nystagmus syndrome (INS). Also, the validity of the metric gaze-dependent functional vision space (GDFVS) is shown in patients with INS. In experiment 1, OVA was measured twice in seven horizontal gaze positions from 30° left to right in 10° steps in 20 subjects with INS and 14 without INS. Test-retest reliability was assessed using the intraclass correlation coefficient (ICC) in each gaze position. OVA area under the curve (AUC) was calculated with horizontal eye position on the x-axis and logMAR visual acuity on the y-axis, and then converted to GDFVS. In experiment 2, the validity of GDFVS was determined over 40° of horizontal gaze by applying the 95% limits of agreement from experiment 1 to pre- and post-treatment GDFVS values from 85 patients with INS. In experiment 1, test-retest reliability for OVA was high (ICC ≥ 0.88), as the test-retest difference was on average less than 0.1 logMAR in each gaze position. In experiment 2, as a group, INS subjects had a significant increase (P < 0.001) in the size of their GDFVS that exceeded the 95% limits of agreement found during test-retest. OVA is a reliable measure in INS patients across 60° of horizontal gaze position. GDFVS is a valid clinical method to quantify OVA as a function of eye position in INS patients. This method captures the dynamic nature of OVA in INS patients and may be a valuable measure to quantify visual function in patients with INS, particularly in quantifying change as part of clinical studies.
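
    The area-under-the-curve step described above can be sketched as follows. The logMAR values over the seven gaze positions are hypothetical, and the conversion from AUC to the GDFVS metric itself is not specified in this record, so only the AUC and a test-retest difference are shown.

    ```python
    import numpy as np

    gaze_deg = np.arange(-30, 31, 10)                            # x-axis: eye position [deg]
    logmar   = np.array([0.6, 0.45, 0.3, 0.2, 0.3, 0.5, 0.7])    # y-axis: acuity (hypothetical)

    auc = np.trapz(logmar, gaze_deg)      # trapezoidal integration over 60 deg of gaze
    print(f"OVA area under the curve: {auc:.1f} logMAR*deg")

    # A retest of the same subject could then feed 95% limits-of-agreement statistics
    retest = logmar + np.random.default_rng(1).normal(0.0, 0.05, logmar.size)
    diff = np.trapz(retest, gaze_deg) - auc
    print(f"test-retest AUC difference: {diff:.2f} logMAR*deg")
    ```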

  19. Intoxication-Related Alcohol Mixed with Energy Drink Expectancies Scale: Initial Development and Validation.

    PubMed

    Miller, Kathleen E; Dermen, Kurt H; Lucke, Joseph F

    2017-06-01

    Young adult use of alcohol mixed with energy drinks (AmEDs) has been linked with elevated risks of a constellation of problem behaviors. These risks may be conditioned by expectancies regarding the effects of caffeine in conjunction with alcohol consumption. The aim of this study was to describe the construction and psychometric evaluation of the Intoxication-Related AmED Expectancies Scale (AmED_EXPI), 15 self-report items measuring beliefs about how the experience of AmED intoxication differs from the experience of noncaffeinated alcohol (NCA) intoxication. Scale development and testing were conducted using data from a U.S. national sample of 3,105 adolescents and emerging adults aged 13 to 25. Exploratory and confirmatory factor analyses were conducted to evaluate the factor structure and establish factor invariance across gender, age, and prior experience with AmED use. Cross-sectional and longitudinal analyses examining correlates of AmED use were used to assess construct and predictive validity. In confirmatory factor analyses, fit indices for the hypothesized 4-factor structure (i.e., Intoxication Management [IM], Alertness [AL], Sociability [SO], and Jitters [JT]) revealed a moderately good fit to the data. Together, these factors accounted for 75.3% of total variance. The factor structure was stable across male/female, teen/young adult, and AmED experience/no experience subgroups. The resultant unit-weighted subscales showed strong internal consistency and satisfactory convergent validity. Baseline scores on the IM, SO, and JT subscales predicted changes in AmED use over a subsequent 3-month period. The AmED_EXPI appears to be a reliable and valid tool for measuring expectancies about the effects of caffeine during alcohol intoxication. Copyright © 2017 by the Research Society on Alcoholism.

  20. [Caregiver's health: adaptation and validation in a Spanish population of the Experience of Caregiving Inventory (ECI)].

    PubMed

    Crespo-Maraver, Mariacruz; Doval, Eduardo; Fernández-Castro, Jordi; Giménez-Salinas, Jordi; Prat, Gemma; Bonet, Pere

    2018-04-04

    To adapt and validate the Experience of Caregiving Inventory (ECI) in a Spanish population, providing empirical evidence of its internal consistency, internal structure and validity. Psychometric validation of the adapted version of the ECI. One hundred and seventy-two caregivers (69.2% women), mean age 57.51 years (range: 21-89), participated. Demographic and clinical data and standardized measures (ECI, suffering scale of the SCL-90-R, Zarit burden scale) were used. The two ECI scales of negative evaluation most related to serious mental disorders (disruptive behaviours [DB] and negative symptoms [NS]) and the two scales of positive appreciation (positive personal experiences [PPE] and good aspects of the relationship [GAR]) were analyzed. Exploratory structural equation modelling was used to analyze the internal structure. The relationship between the ECI scales and the SCL-90-R and Zarit scores was also studied. The four-factor model presented a good fit. Cronbach's alpha (DB: 0.873; NS: 0.825; PPE: 0.720; GAR: 0.578) showed higher homogeneity in the negative scales. The SCL-90-R scores correlated with the negative ECI scales, and none of the ECI scales correlated with the Zarit scale. The Spanish version of the ECI can be considered a valid, reliable, understandable and feasible self-report measure for administration in health and community contexts. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  1. Validation of an auditory startle response system using chemicals or parametric modulation as positive controls.

    PubMed

    Marable, Brian R; Maurissen, Jacques P J

    2004-01-01

    Neurotoxicity regulatory guidelines mandate that automated test systems be validated using chemicals. However, in some cases, chemicals may not necessarily be needed to prove test system validity. To examine this issue, two independent experiments were conducted to validate an automated auditory startle response (ASR) system. In Experiment 1, we used adult (PND 63) and weanling (PND 22) Sprague-Dawley rats (10/sex/dose) to determine the effect of either d-amphetamine (4.0 or 8.0 mg/kg) or clonidine (0.4 or 0.8 mg/kg) on the ASR peak amplitude (ASR PA). The startle response of each rat to a short burst of white noise (120 dB SPL) was recorded over 50 consecutive trials. The ASR PA was significantly decreased (by clonidine) and increased (by d-amphetamine) compared to controls in PND 63 rats. In PND 22 rats, the response to clonidine was similar to adults, but d-amphetamine effects were not significant. Neither drug affected the rate of the decrease in ASR PA over time (habituation). In Experiment 2, PND 31 Sprague-Dawley rats (8/sex) were presented with 150 trials consisting of either white noise bursts of variable intensity (70-120 dB SPL in 10 dB increments, presented in random order) or null (0 dB SPL) trials. Statistically significant sex- and intensity-dependent differences were detected in the ASR PA. These results suggest that in some cases, parametric modulation may be an alternative to using chemicals for test system validation.

  2. Performance-based comparison of neonatal intubation training outcomes: simulator and live animal.

    PubMed

    Andreatta, Pamela B; Klotz, Jessica J; Dooley-Hash, Suzanne L; Hauptman, Joe G; Biddinger, Bea; House, Joseph B

    2015-02-01

    The purpose of this article was to establish psychometric validity evidence for competency assessment instruments and to evaluate the impact of 2 forms of training on the abilities of clinicians to perform neonatal intubation. To inform the development of assessment instruments, we conducted comprehensive task analyses including each performance domain associated with neonatal intubation. Expert review confirmed content validity. Construct validity was established using the instruments to differentiate between the intubation performance abilities of practitioners (N = 294) with variable experience (novice through expert). Training outcomes were evaluated using a quasi-experimental design to evaluate performance differences between 294 subjects randomly assigned to 1 of 2 training groups. The training intervention followed American Heart Association Pediatric Advanced Life Support and Neonatal Resuscitation Program protocols with hands-on practice using either (1) live feline or (2) simulated feline models. Performance assessment data were captured before and directly following the training. All data were analyzed using analysis of variance with repeated measures and statistical significance set at P < .05. Content validity, reliability, and consistency evidence were established for each assessment instrument. Construct validity for each assessment instrument was supported by significantly higher scores for subjects with greater levels of experience, as compared with those with less experience (P = .000). Overall, subjects performed significantly better in each assessment domain, following the training intervention (P = .000). After controlling for experience level, there were no significant differences among the cognitive, performance, and self-efficacy outcomes between clinicians trained with live animal model or simulator model. Analysis of retention scores showed that simulator trained subjects had significantly higher performance scores after 18 weeks (P = .01) and 52 weeks (P = .001) and cognitive scores after 52 weeks (P = .001). The results of this study demonstrate the feasibility of using valid, reliable assessment instruments to assess clinician competency and self-efficacy in the performance of neonatal intubation. We demonstrated the relative equivalency of live animal and simulation-based models as tools to support acquisition of neonatal intubation skills. Retention of performance abilities was greater for subjects trained using the simulator, likely because it afforded greater opportunity for repeated practice. Outcomes in each assessment area were influenced by the previous intubation experience of participants. This suggests that neonatal intubation training programs could be tailored to the level of provider experience to make efficient use of time and educational resources. Future research focusing on the uses of assessment in the applied clinical environment, as well as identification of optimal training cycles for performance retention, is merited.

  3. Coping With Existential and Emotional Challenges: Development and Validation of the Self-Competence in Death Work Scale.

    PubMed

    Chan, Wallace Chi Ho; Tin, Agnes Fong; Wong, Karen Lok Yi

    2015-07-01

    Palliative care professionals often are confronted by death in their work. They may experience challenges to self, such as aroused emotions and queries about life's meaningfulness. Assessing their level of "self-competence" in coping with these challenges is crucial in understanding their needs in death work. This study aims to develop and validate the Self-Competence in Death Work Scale (SC-DWS). Development of this scale involved three steps: (1) items generated from a qualitative study with palliative care professionals, (2) expert panel review, and (3) pilot test. Analysis was conducted to explore the factor structure and examine the reliability and validity of the scale. Helping professionals involved in death work were recruited to complete questionnaires comprising the SC-DWS and other scales. A total of 151 participants were recruited. Both one-factor and two-factor structures were found. Emotional and existential coping were identified as subscales in the two-factor structure. Correlations of the whole scale and subscales with measures of death attitudes, meaning in life, burnout and depression provided evidence for the construct validity. Discriminative validity was supported by showing that participants with bereavement experience and longer experience in the profession and in death work possessed a significantly higher level of self-competence. Reliability analyses showed that the entire scale and subscales were internally consistent. The SC-DWS was found to be valid and reliable. This scale may facilitate helping professionals' understanding of their self-competence in death work, so appropriate professional support and training may be obtained. Copyright © 2015 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  4. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    2000-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and Earth's Radiant Energy System (CERES), Multi-Angle Imaging Spectroradiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS) and Measurements of Pollution in the Troposphere (MOPITT). In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community.

  5. MicroRNA Expression Profiling to Identify and Validate Reference Genes for the Relative Quantification of microRNA in Rectal Cancer.

    PubMed

    Eriksen, Anne Haahr Mellergaard; Andersen, Rikke Fredslund; Pallisgaard, Niels; Sørensen, Flemming Brandt; Jakobsen, Anders; Hansen, Torben Frøstrup

    2016-01-01

    MicroRNAs (miRNAs) play important roles in regulating biological processes at the post-transcriptional level. Deregulation of miRNAs has been observed in cancer, and miRNAs are being investigated as potential biomarkers regarding diagnosis, prognosis and prediction in cancer management. Real-time quantitative polymerase chain reaction (RT-qPCR) is commonly used when measuring miRNA expression. Appropriate normalisation of RT-qPCR data is important to ensure reliable results. The aim of the present study was to identify stably expressed miRNAs applicable as normaliser candidates in future studies of miRNA expression in rectal cancer. We performed high-throughput miRNA profiling (OpenArray®) on ten pairs of laser micro-dissected rectal cancer tissue and adjacent stroma. A global mean expression normalisation strategy was applied to identify the most stably expressed miRNAs for subsequent validation. In the first validation experiment, a panel of miRNAs was analysed on 25 pairs of micro-dissected rectal cancer tissue and adjacent stroma. Subsequently, the same miRNAs were analysed in 28 pairs of rectal cancer tissue and normal rectal mucosa. From the miRNA profiling experiment, miR-645, miR-193a-5p, miR-27a and let-7g were identified as stably expressed, both in malignant and stromal tissue. In addition, NormFinder confirmed high expression stability for the four miRNAs. In the RT-qPCR based validation experiments, no significant difference between tumour and stroma/normal rectal mucosa was detected for the mean of the normaliser candidates miR-27a, miR-193a-5p and let-7g (first validation P = 0.801, second validation P = 0.321). MiR-645 was excluded from the data analysis because it was undetected in 35 of 50 samples (first validation) and in 24 of 56 samples (second validation). A significant difference in the expression level of RNU6B was observed between tumour and adjacent stroma (first validation), and between tumour and normal rectal mucosa (second validation). We recommend the mean expression of miR-27a, miR-193a-5p and let-7g as the normalisation factor when performing miRNA expression analyses by RT-qPCR on rectal cancer tissue.
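
    A minimal sketch of the recommended normalisation, assuming the usual ΔCq approach: the mean Cq of miR-27a, miR-193a-5p and let-7g is used as the normalisation factor and the relative expression of a target miRNA is taken as 2^(-ΔCq). The Cq values and the target miRNA (miR-21) below are hypothetical.

    ```python
    import numpy as np

    sample_cq = {
        "miR-21":      24.1,   # target miRNA of interest (hypothetical)
        "miR-27a":     27.3,
        "miR-193a-5p": 29.0,
        "let-7g":      25.6,
    }

    reference = ["miR-27a", "miR-193a-5p", "let-7g"]
    norm_factor = np.mean([sample_cq[m] for m in reference])   # mean reference Cq

    delta_cq = sample_cq["miR-21"] - norm_factor               # delta-Cq vs normaliser
    rel_expression = 2.0 ** (-delta_cq)                        # 2^(-delta-Cq)
    print(f"normalisation factor (mean Cq): {norm_factor:.2f}")
    print(f"relative expression of miR-21:  {rel_expression:.2f}")
    ```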

  6. Development and Application of a Novel Rasch-Based Methodology for Evaluating Multi-Tiered Assessment Instruments: Validation and Utilization of an Undergraduate Diagnostic Test of the Water Cycle

    ERIC Educational Resources Information Center

    Romine, William L.; Schaffer, Dane L.; Barrow, Lloyd

    2015-01-01

    We describe the development and validation of a three-tiered diagnostic test of the water cycle (DTWC) and use it to evaluate the impact of prior learning experiences on undergraduates' misconceptions. While most approaches to instrument validation take a positivist perspective using singular criteria such as reliability and fit with a measurement…

  7. Increased importance of the documented development stage in process validation.

    PubMed

    Mohammed-Ziegler, Ildikó; Medgyesi, Ildikó

    2012-07-01

    Current trends in pharmaceutical quality assurance shifted when the Food and Drug Administration (FDA) of the USA published its new guidance on process validation in 2011. This guidance introduced the lifecycle approach to process validation. In this short communication, some typical changes in the practice of API production are addressed in the light of inspection experience. Some details are compared with the European regulations.

  8. Validating the AIRS Version 5 CO Retrieval with DACOM In Situ Measurements During INTEX-A and -B

    NASA Technical Reports Server (NTRS)

    McMillan, Wallace W.; Evans, Keith D.; Barnet, Christopher D.; Maddy, Eric; Sachse, Glen W.; Diskin, Glenn S.

    2011-01-01

    Herein we provide a description of the atmospheric infrared sounder (AIRS) version 5 (v5) carbon monoxide (CO) retrieval algorithm and its validation with the DACOM in situ measurements during the INTEX-A and -B campaigns. All standard and support products in the AIRS v5 CO retrieval algorithm are documented. Building on prior publications, we describe the convolution of in situ measurements with the AIRS v5 CO averaging kernel and first-guess CO profile as required for proper validation. Validation is accomplished through comparison of AIRS CO retrievals with convolved in situ CO profiles acquired during the NASA Intercontinental Chemical Transport Experiments (INTEX) in 2004 and 2006. From 143 profiles in the northern mid-latitudes during these two experiments, we find AIRS v5 CO retrievals are biased high by 6%-10% between 900 and 300 hPa with a root-mean-square error of 8%-12%. No significant differences were found between validation using spiral profiles coincident with AIRS overpasses and in-transit profiles under the satellite track but up to 13 h off in time. Similarly, no significant differences in validation results were found for ocean versus land, day versus night, or with respect to retrieved cloud top pressure or cloud fraction.
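
    The averaging-kernel convolution mentioned above generally takes the form x_conv = x_a + A (x_insitu - x_a), and the sketch below shows that generic linear relation. The AIRS v5 algorithm applies it on coarse trapezoid layers (commonly in log mixing-ratio space), so the profile, the kernel matrix and the retrieval values here are purely illustrative assumptions, not AIRS quantities.

    ```python
    import numpy as np

    n_levels = 5
    x_a      = np.full(n_levels, 100.0)                        # a priori CO profile [ppbv]
    x_insitu = np.array([150., 140., 120., 105., 95.])         # in situ (DACOM-like) profile
    A = 0.5 * np.eye(n_levels)                                 # averaging-kernel matrix (placeholder)

    # Smooth the in situ profile with the retrieval's vertical sensitivity
    x_conv = x_a + A @ (x_insitu - x_a)

    x_retrieved = np.array([130., 125., 112., 103., 98.])      # hypothetical satellite retrieval
    percent_bias = 100.0 * (x_retrieved - x_conv) / x_conv
    print("convolved in situ profile [ppbv]:", np.round(x_conv, 1))
    print("retrieval bias [%]:", np.round(percent_bias, 1))
    ```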

  9. Symbolic control of visual attention: semantic constraints on the spatial distribution of attention.

    PubMed

    Gibson, Bradley S; Scheutz, Matthias; Davis, Gregory J

    2009-02-01

    Humans routinely use spatial language to control the spatial distribution of attention. In so doing, spatial information may be communicated from one individual to another across opposing frames of reference, which in turn can lead to inconsistent mappings between symbols and directions (or locations). These inconsistencies may have important implications for the symbolic control of attention because they can be translated into differences in cue validity, a manipulation that is known to influence the focus of attention. This differential validity hypothesis was tested in Experiment 1 by comparing spatial word cues that were predicted to have high learned spatial validity ("above/below") and low learned spatial validity ("left/right"). Consistent with this prediction, when two measures of selective attention were used, the results indicated that attention was less focused in response to "left/right" cues than in response to "above/below" cues, even when the actual validity of each of the cues was equal. In addition, Experiment 2 predicted that spatial words such as "left/right" would have lower spatial validity than would other directional symbols that specify direction along the horizontal axis, such as "<--/-->" cues. The results were also consistent with this hypothesis. Altogether, the present findings demonstrate important semantic-based constraints on the spatial distribution of attention.

  10. HPLC-MS/MS method for dexmedetomidine quantification with Design of Experiments approach: application to pediatric pharmacokinetic study.

    PubMed

    Szerkus, Oliwia; Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bartosińska, Ewa; Bujak, Renata; Borsuk, Agnieszka; Bienert, Agnieszka; Bartkowska-Śniatkowska, Alicja; Warzybok, Justyna; Wiczling, Paweł; Nasal, Antoni; Kaliszan, Roman; Markuszewski, Michał Jan; Siluk, Danuta

    2017-02-01

    The purpose of this work was to develop and validate a rapid and robust LC-MS/MS method for the determination of dexmedetomidine (DEX) in plasma, suitable for analysis of a large number of samples. A systematic approach, Design of Experiments, was applied to optimize the ESI source parameters and to evaluate method robustness, yielding a rapid, stable and cost-effective assay. The method was validated according to US FDA guidelines. The LLOQ was determined at 5 pg/ml, and the assay was linear over the examined concentration range (5-2500 pg/ml; R^2 > 0.98). The accuracies and the intra- and interday precisions were within 15%. The stability data confirmed reliable behavior of DEX under the tested conditions. Application of the Design of Experiments approach allowed for fast and efficient analytical method development and validation, as well as for reduced usage of chemicals compared with conventional method optimization. The proposed method was applied to the determination of DEX pharmacokinetics in pediatric patients undergoing long-term sedation in the intensive care unit.
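
    A minimal sketch of the calibration-linearity check implied by the stated range and R^2 criterion, using an unweighted least-squares fit. Bioanalytical assays often use weighted (e.g. 1/x^2) regression instead, and the concentrations and peak-area ratios below are hypothetical.

    ```python
    import numpy as np

    conc = np.array([5, 25, 100, 500, 1000, 2500], dtype=float)   # calibration levels [pg/ml]
    resp = np.array([0.011, 0.052, 0.21, 1.02, 2.05, 5.10])       # analyte/IS area ratio (hypothetical)

    slope, intercept = np.polyfit(conc, resp, 1)
    pred = slope * conc + intercept
    r2 = 1.0 - np.sum((resp - pred) ** 2) / np.sum((resp - resp.mean()) ** 2)
    print(f"slope = {slope:.5f}, intercept = {intercept:.4f}, R^2 = {r2:.4f}")

    # Back-calculated accuracy at each level (typically required within +/-15%)
    accuracy = 100.0 * ((resp - intercept) / slope) / conc
    print("back-calculated accuracy [%]:", np.round(accuracy, 1))
    ```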

  11. Measurements of Pollution in the Troposphere (MOPITT) Validation Exercises During Summer 2004 Field Campaigns over North America

    NASA Technical Reports Server (NTRS)

    Emmons, L. K.; Pfister, G. G.; Edwards, D. P.; Gille, J. C.; Sachse, G.; Blake, D.; Wofsy, S.; Gerbig, C.; Matross, D.; Nedelec, P.

    2007-01-01

    Measurements of carbon monoxide (CO) made as part of three aircraft experiments during the summer of 2004 over North America have been used for the continued validation of the CO retrievals from the Measurements of Pollution in the Troposphere (MOPITT) instrument on board the Terra satellite. Vertical profiles measured during the NASA INTEX-A campaign, designed to be coincident with MOPITT overpasses, as well as measurements made during the COBRA-2004 and MOZAIC experiments, provided valuable validation comparisons. On average, the MOPITT CO retrievals are biased slightly high for these North American locations. While the mean bias differs between the different aircraft experiments (e.g., 7.0 ppbv for MOZAIC to 18.4 ppbv for COBRA at 700 hPa), the standard deviations are quite large, so the results for the three data sets can be considered consistent. On average, it is estimated that MOPITT is 7-14% high at 700 hPa and 0-3% high at 350 hPa. These results are consistent with the validation results for the Carr, Colorado, Harvard Forest, Massachusetts, and Poker Flats, Alaska, aircraft profiles for "phase 2" presented by Emmons et al. (2004) and are generally within the design criteria of 10% accuracy.

  12. Observations with the ROWS instrument during the Grand Banks calibration/validation experiments

    NASA Technical Reports Server (NTRS)

    Vandemark, D.; Chapron, B.

    1994-01-01

    As part of a global program to validate the ocean surface sensors on board ERS-1, a joint experiment on the Grand Banks of Newfoundland was carried out in Nov. 1991. The principal objective was to provide a field validation of ERS-1 Synthetic Aperture Radar (SAR) measurement of ocean surface structure. The NASA P-3 aircraft measurements made during this experiment provide independent measurements of the ocean surface along the validation swath. The Radar Ocean Wave Spectrometer (ROWS) is a radar sensor designed to measure the direction of the long-wave components using spectral analysis of the tilt-induced radar backscatter modulation. This technique differs greatly from SAR and thus provides a unique set of measurements for use in evaluating SAR performance. Also, an altimeter channel in the ROWS gives simultaneous information on the surface wave height and the radar mean square slope parameter. The sets of geophysical parameters (wind speed, significant wave height, directional spectrum) are used to study the SAR's ability to accurately measure ocean gravity waves. The known distortion imposed on the true directional spectrum by the SAR imaging mechanism is discussed in light of the direct comparisons between ERS-1 SAR, airborne Canadian Center for Remote Sensing (CCRS) SAR, and ROWS spectra and the use of the nonlinear ocean SAR transform.

  13. Individual Differences Methods for Randomized Experiments

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…

  14. Style preference survey: a report on the psychometric properties and a cross-validation experiment.

    PubMed

    Smith, Sherri L; Ricketts, Todd; McArdle, Rachel A; Chisolm, Theresa H; Alexander, Genevieve; Bratt, Gene

    2013-02-01

    Several self-report measures exist that target different aspects of outcomes for hearing aid use. Currently, no comprehensive questionnaire specifically assesses factors that may be important for differentiating outcomes pertaining to hearing aid style. The goal of this work was to develop the Style Preference Survey (SPS), a questionnaire aimed at outcomes associated with hearing aid style differences. Two experiments were conducted. After initial item development, Experiment 1 was conducted to refine the items and to determine its psychometric properties. Experiment 2 was designed to cross-validate the findings from the initial experiment. An observational design was used in both experiments. Participants who wore traditional, custom-fitted (TC) or open-canal (OC) style hearing aids from 3 mo to 3 yr completed the initial experiment. One-hundred and eighty-four binaural hearing aid users (120 of whom wore TC hearing aids and 64 of whom wore OC hearing aids) participated. A new sample of TC and OC users (n = 185) participated in the cross-validation experiment. Currently available self-report measures were reviewed to identify items that might differentiate between hearing aid styles, particularly preference for OC versus TC hearing aid styles. A total of 15 items were selected and modified from available self-report measures. An additional 55 items were developed through consensus of six audiologists for the initial version of the SPS. In the first experiment, the initial SPS version was mailed to 550 veterans who met the inclusion criteria. A total of 184 completed the SPS. Approximately three weeks later, a subset of participants (n = 83) completed the SPS a second time. Basic analyses were conducted to evaluate the psychometric properties of the SPS including subscale structure, internal consistency, test-retest reliability, and responsiveness. Based on the results of Experiment 1, the SPS was revised. A cross-validation experiment was then conducted using the revised version of the SPS to confirm the subscale structure, internal consistency, and responsiveness of the questionnaire in a new sample of participants. The final factor analysis led to the ultimate version of the SPS, which had a total of 35 items encompassing five subscales: (1) Feedback, (2) Occlusion/Own Voice Effects, (3) Localization, (4) Fit, Comfort, and Cosmetics, and (5) Ease of Use. The internal consistency of the total SPS (Cronbach's α = .92) and of the subscales (each Cronbach's α > .75) was high. Intraclass correlations (ICCs) showed that the test-retest reliability of the total SPS (ICC = .93) and of the subscales (each ICC > .80) also was high. TC hearing aid users had significantly poorer outcomes than OC hearing aid users on 4 of the 5 subscales, suggesting that the SPS largely is responsive to factors related to style-specific differences. The results suggest that the SPS has good psychometric properties and is a valid and reliable measure of outcomes related to style-specific, hearing aid preference. American Academy of Audiology.
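
    The internal-consistency statistic reported above (Cronbach's alpha) can be computed directly from an item-response matrix. The sketch below uses simulated responses for a hypothetical 7-item subscale, not the SPS data.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, k_items) matrix of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

    # Simulated responses: 200 respondents, 7 items driven by one common factor
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    responses = latent + rng.normal(scale=0.7, size=(200, 7))
    print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
    ```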

  15. Validating a Geographical Image Retrieval System.

    ERIC Educational Resources Information Center

    Zhu, Bin; Chen, Hsinchun

    2000-01-01

    Summarizes a prototype geographical image retrieval system that demonstrates how to integrate image processing and information analysis techniques to support large-scale content-based image retrieval. Describes an experiment to validate the performance of this image retrieval system against that of human subjects by examining similarity analysis…

  16. MODELS FOR SUBMARINE OUTFALL - VALIDATION AND PREDICTION UNCERTAINTIES

    EPA Science Inventory

    This address reports on some efforts to verify and validate dilution models, including those found in Visual Plumes. This is done in the context of problem experience: a range of problems, including different pollutants such as bacteria; scales, including near-field and far-field...

  17. Validation of simultaneous reverse optimization reconstruction algorithm in a practical circular subaperture stitching interferometer

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Dong; Liu, Yu; Liu, Jingxiao; Li, Jingsong; Yu, Benli

    2017-11-01

    We demonstrate the validity of the simultaneous reverse optimization reconstruction (SROR) algorithm in circular subaperture stitching interferometry (CSSI); the algorithm was previously proposed for non-null aspheric annular subaperture stitching interferometry (ASSI). The merits of the modified SROR algorithm in CSSI, such as automatic retrace error correction, no need for overlap and even tolerance of missed coverage, are analyzed in detail in simulations and experiments. Meanwhile, a practical CSSI system is proposed for this demonstration. An optical wedge is employed to deflect the incident beam for subaperture scanning by its rotation and shift, instead of a six-axis motion-control system. The reference path can also provide variable Zernike defocus for each subaperture test, which decreases the fringe density. Experiments validating the SROR algorithm in this CSSI system are implemented, with cross-validation by testing a paraboloidal mirror, a flat mirror and an astigmatic mirror. The method is an indispensable supplement to SROR application in general subaperture stitching interferometry.

  18. Testing the construct validity of willingness to pay valuations using objective information about risk and health benefit.

    PubMed

    Philips, Zoë; Whynes, David K; Avis, Mark

    2006-02-01

    This paper describes an experiment to test the construct validity of contingent valuation, by eliciting women's valuations for the NHS cervical cancer screening programme. It is known that, owing to low levels of knowledge of cancer and screening in the general population, women both over-estimate the risk of disease and the efficacy of screening. The study is constructed as a randomised experiment, in which one group is provided with accurate information about cervical cancer screening, whilst the other is not. The first hypothesis supporting construct validity, that controls who perceive greater benefits from screening will offer higher valuations, is substantiated. Both groups are then provided with objective information on an improvement to the screening programme, and are asked to value the improvement as an increment to their original valuations. The second hypothesis supporting construct validity, that controls who perceive the benefits of the programme to be high already will offer lower incremental valuations, is also substantiated. Copyright 2005 John Wiley & Sons, Ltd.

  19. The marketing implications of affective product design.

    PubMed

    Seva, Rosemary R; Duh, Henry Been-Lirn; Helander, Martin G

    2007-11-01

    Emotions are compelling human experiences and product designers can take advantage of this by conceptualizing emotion-engendering products that sell well in the market. This study hypothesized that product attributes influence users' emotions and that the relationship is moderated by the adherence of these product attributes to purchase criteria. It was further hypothesized that the emotional experience of the user influences purchase intention. A laboratory study was conducted to validate the hypotheses using mobile phones as test products. Sixty-two participants were asked to assess eight phones from a display of 10 phones and indicate their emotional experiences after assessment. Results suggest that some product attributes can cause intense emotional experience. The attributes relate to the phone's dimensions and the relationship between these dimensions. The study validated the notion of integrating affect in designing products that convey users' personalities.

  20. Statistical validation and an empirical model of hydrogen production enhancement found by utilizing passive flow disturbance in the steam-reformation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Paul A.; Liao, Chang-hsien

    2007-11-15

    A passive flow disturbance has been proven to enhance the conversion of fuel in a methanol-steam reformer. This study presents a statistical validation of the experiment based on a standard 2^k factorial experiment design and the resulting empirical model of the enhanced hydrogen producing process. A factorial experiment design was used to statistically analyze the effects and interactions of various input factors in the experiment. Three input factors, including the number of flow disturbers, catalyst size, and reactant flow rate, were investigated for their effects on the fuel conversion in the steam-reformation process. Based on the experimental results, an empirical model was developed and further evaluated with an uncertainty analysis and interior point data. (author)
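
    A minimal sketch of effect estimation for a 2^3 full-factorial design with the three factors named above. The coded design matrix is standard, but the fuel-conversion responses are hypothetical and the sketch omits the uncertainty analysis described in the record.

    ```python
    import itertools
    import numpy as np

    # Coded (-1/+1) levels for the 8 runs of a 2^3 full-factorial design
    X = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
    y = np.array([61, 72, 60, 70, 55, 69, 54, 68], dtype=float)   # fuel conversion [%], hypothetical

    factors = ["flow_disturbers", "catalyst_size", "reactant_flow_rate"]

    # Main effects: mean response at the high level minus mean at the low level
    for j, name in enumerate(factors):
        effect = y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
        print(f"main effect of {name}: {effect:+.2f}")

    # Two-factor interactions: contrast from the product of the coded columns
    for j, k in itertools.combinations(range(3), 2):
        contrast = X[:, j] * X[:, k]
        effect = y[contrast == 1].mean() - y[contrast == -1].mean()
        print(f"interaction {factors[j]} x {factors[k]}: {effect:+.2f}")
    ```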

  1. Relationship between a solar drying model of red pepper and the kinetics of pure water evaporation (1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Passamai, V.; Saravia, L.

    1997-05-01

    Drying of red pepper under solar radiation was investigated, and a simple model related to water evaporation was developed. Drying experiments at constant laboratory conditions were undertaken where solar radiation was simulated by a 1,000 W lamp. In this first part of the work, water evaporation under radiation is studied and laboratory experiments are presented with two objectives: to verify Penman's model of evaporation under radiation, and to validate the laboratory experiments. Modifying Penman's model of evaporation by introducing two drying conductances as a function of water content allows the development of a drying model under solar radiation. In the second part of this paper, the model is validated by applying it to red pepper open-air solar drying experiments.

  2. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    PubMed

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants are asked to read an interview transcript and identify whether program theory components (Activities and Outcomes) are discussed and to highlight the most relevant passage about that component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Soil Moisture Retrieval with Airborne PALS Instrument over Agricultural Areas in SMAPVEX16

    NASA Technical Reports Server (NTRS)

    Colliander, Andreas; Jackson, Thomas J.; Cosh, Mike; Misra, Sidharth; Bindlish, Rajat; Powers, Jarrett; McNairn, Heather; Bullock, P.; Berg, A.; Magagi, A.

    2017-01-01

    NASA's SMAP (Soil Moisture Active Passive) calibration and validation program revealed that the soil moisture products are experiencing difficulties in meeting the mission requirements in certain agricultural areas. Therefore, the mission organized airborne field experiments at two core validation sites to investigate these anomalies. The SMAP Validation Experiment 2016 included airborne observations with the PALS (Passive Active L-band Sensor) instrument and intensive ground sampling. The goal of the PALS measurements is to investigate the soil moisture retrieval algorithm formulation and parameterization under the varying (spatially and temporally) conditions of the agricultural domains and to obtain high-resolution soil moisture maps within the SMAP pixels. In this paper, the soil moisture retrieval using the PALS brightness temperature observations in SMAPVEX16 is presented.
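
    The record does not give the retrieval formulation, but L-band passive soil-moisture retrievals of this kind are commonly built on a zeroth-order tau-omega forward model that is inverted against the observed brightness temperature. The sketch below shows that general idea only; the linear soil-reflectivity relation, the vegetation parameters and the observed brightness temperature are placeholder assumptions, not the SMAP or PALS algorithm.

    ```python
    import numpy as np

    def tau_omega_tb(soil_moisture, t_soil=295.0, t_canopy=295.0,
                     tau=0.12, omega=0.05, theta_deg=40.0):
        """Brightness temperature [K] for one polarisation from a zeroth-order tau-omega model."""
        theta = np.radians(theta_deg)
        r_soil = 0.05 + 0.6 * soil_moisture            # crude placeholder reflectivity-moisture relation
        gamma = np.exp(-tau / np.cos(theta))           # vegetation transmissivity
        tb_soil   = t_soil * (1.0 - r_soil) * gamma
        tb_canopy = t_canopy * (1.0 - omega) * (1.0 - gamma) * (1.0 + r_soil * gamma)
        return tb_soil + tb_canopy

    # A retrieval inverts the forward model, e.g. by minimising |TB_obs - TB_model|
    sm_grid = np.linspace(0.02, 0.45, 200)
    tb_obs = 250.0                                     # hypothetical observation [K]
    sm_hat = sm_grid[np.argmin(np.abs(tau_omega_tb(sm_grid) - tb_obs))]
    print(f"retrieved soil moisture: {sm_hat:.3f} m3/m3")
    ```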

  4. Unforgiveness: Refining theory and measurement of an understudied construct.

    PubMed

    Stackhouse, Madelynn R D; Jones Ross, Rachel W; Boon, Susan D

    2018-01-01

    This research presents a multidimensional conceptualization of unforgiveness and the development and validation of the unforgiveness measure (UFM). The scale was developed based on a qualitative study of people's experiences of unforgiven interpersonal offences (Study 1). Three dimensions of unforgiveness emerged (Study 2): emotional-ruminative unforgiveness, cognitive-evaluative unforgiveness, and offender reconstrual. We supported the scale's factor structure, reliability, and validity (Study 3). We also established the convergent and discriminant validity of the UFM with measures of negative affect, rumination, forgiveness, cognitive reappraisal, and emotional suppression (Study 4). Together, our results suggest that the UFM can capture variability in victims' unforgiving experiences in the aftermath of interpersonal transgressions. Implications for understanding the construct of unforgiveness and directions for future research are discussed. © 2017 The British Psychological Society.

  5. Measuring engagement in nurses: the psychometric properties of the Persian version of Utrecht Work Engagement Scale

    PubMed Central

    Torabinia, Mansour; Mahmoudi, Sara; Dolatshahi, Mojtaba; Abyaz, Mohamad Reza

    2017-01-01

    Background: Considering the overall tendency in psychology, researchers in the field of work and organizational psychology have become progressively interested in employees' effective and optimistic experiences at work, such as work engagement. This study was conducted with 2 main purposes: assessing the psychometric properties of the Utrecht Work Engagement Scale (UWES), and finding any association between work engagement and burnout in nurses. Methods: The present methodological study was conducted in 2015 and included 248 females and 34 males with 6 months to 30 years of job experience. After the translation process, face and content validity were assessed by qualitative and quantitative methods. Moreover, the content validation ratio, scale-level content validity index and item-level content validity index were measured for this scale. Construct validity was determined by factor analysis. Internal consistency and stability reliability were also assessed. Factor analysis, test-retest, Cronbach's alpha, and association analysis were used as statistical methods. Results: Face and content validity were acceptable. Exploratory factor analysis suggested a new 3-factor model. In this new model, some items from the construct model of the original version were dislocated, while the same 17 items were retained. The new 3-factor model of the Persian version of the UWES was supported by divergent validity against the Copenhagen Burnout Inventory. Internal consistency reliability for the total scale and the subscales was 0.76 to 0.89. Results from the Pearson correlation test indicated a high degree of test-retest reliability (r = 0.89). The ICC was also 0.91. Engagement was negatively related to burnout and overtime per month, whereas it was positively related to age and job experience. Conclusion: The Persian 3-factor model of the Utrecht Work Engagement Scale is a valid and reliable instrument to measure work engagement in Iranian nurses as well as in other medical professionals. PMID:28955665

  6. Design and validation of instruments to measure knowledge.

    PubMed

    Elliott, T E; Regal, R R; Elliott, B A; Renier, C M

    2001-01-01

    Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.

  7. Replicating the Z iron opacity experiments on the NIF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perry, T. S.; Heeter, R. F.; Opachich, Y. P.

    Here, X-ray opacity is a crucial factor in all radiation-hydrodynamics calculations, yet it is one of the least validated of the material properties in the simulation codes. Recent opacity experiments at the Sandia Z-machine have shown up to factor-of-two discrepancies between theory and experiment, casting doubt on the validity of the opacity models. Therefore, a new experimental opacity platform is being developed on the National Ignition Facility (NIF), not only to verify the Z-machine experimental results but also to extend the experiments to other temperatures and densities. The first experiments will be directed towards measuring the opacity of iron at a temperature of ~160 eV and an electron density of ~7 x 10^21 cm^-3. Preliminary experiments on NIF have demonstrated the ability to create a sufficiently bright point backlighter using an imploding plastic capsule, and also a hohlraum that can heat the opacity sample to the desired conditions. The first of these iron opacity experiments is expected to be performed in 2017.

  8. Replicating the Z iron opacity experiments on the NIF

    DOE PAGES

    Perry, T. S.; Heeter, R. F.; Opachich, Y. P.; ...

    2017-05-12

    Here, X-ray opacity is a crucial factor in all radiation-hydrodynamics calculations, yet it is one of the least validated of the material properties in the simulation codes. Recent opacity experiments at the Sandia Z-machine have shown up to factor-of-two discrepancies between theory and experiment, casting doubt on the validity of the opacity models. Therefore, a new experimental opacity platform is being developed on the National Ignition Facility (NIF), not only to verify the Z-machine experimental results but also to extend the experiments to other temperatures and densities. The first experiments will be directed towards measuring the opacity of iron at a temperature of ~160 eV and an electron density of ~7 x 10^21 cm^-3. Preliminary experiments on NIF have demonstrated the ability to create a sufficiently bright point backlighter using an imploding plastic capsule, and also a hohlraum that can heat the opacity sample to the desired conditions. The first of these iron opacity experiments is expected to be performed in 2017.

  9. Twelve tips for blueprinting.

    PubMed

    Coderre, Sylvain; Woloschuk, Wayne; McLaughlin, Kevin

    2009-04-01

    Content validity is a requirement of every evaluation and is achieved when the evaluation content is congruent with the learning objectives and the learning experiences. Congruence between these three pillars of education can be facilitated by blueprinting. Here we describe an efficient process for creating a blueprint and explain how to use this tool to guide all aspects of course creation and evaluation. A well constructed blueprint is a valuable tool for medical educators. In addition to validating evaluation content, a blueprint can also be used to guide selection of curricular content and learning experiences.

  10. Supersonic Coaxial Jet Experiment for CFD Code Validation

    NASA Technical Reports Server (NTRS)

    Cutler, A. D.; Carty, A. A.; Doerner, S. E.; Diskin, G. S.; Drummond, J. P.

    1999-01-01

    A supersonic coaxial jet facility has been designed to provide experimental data suitable for the validation of CFD codes used to analyze high-speed propulsion flows. The center jet is of a light gas and the coflow jet is of air, and the mixing layer between them is compressible. Various methods have been employed in characterizing the jet flow field, including schlieren visualization, pitot, total temperature and gas sampling probe surveying, and RELIEF velocimetry. A Navier-Stokes code has been used to calculate the nozzle flow field and the results compared to the experiment.

  11. Mean dynamic topography over Peninsular Malaysian seas using multimission satellite altimetry

    NASA Astrophysics Data System (ADS)

    Abazu, Isaac Chidi; Din, Ami Hassan Md; Omar, Kamaludin Mohd

    2017-04-01

    The development of satellite altimeters (SALTs) has brought huge benefits, among which is the ability to more adequately sense ocean-surface topography. The Radar Altimeter Database System was used to capture and process ENVISAT, CryoSat-2, SARAL, Jason-1, and Jason-2 SALT data over the 5 years between 2011 and 2015. The time series of monthly multimission SALT data showed estimated sea level trends of 1.0, 2.4, 2.4, 3.6, and 12.0 mm/year at Gelang, Port Kelang, Kukup, Cendering, and Keling. The correlation analysis for the selected tide gauge stations produced satisfactory results, with R-squared values of 0.86, 0.89, 0.91, and 0.97 for Cendering, Sedili, Gelang, and Geting, respectively. The ITG-Grace2010s geoid model was used to compute the mean dynamic topography (MDT), which was plotted on a 0.25-deg grid for the Malacca Strait and South China Sea off Peninsular Malaysia, with the Keling, Port Kelang, Geting, Sedili, and Johor Bahru tide gauge stations having values determined by interpolation of 1.14, 1.19, 1.26, 1.88, and 2.91 m, respectively. With the MDT computed from the SALT data relative to Port Kelang, the north-south sea slope ranges between -0.64 and 0.29 m/50 km and between -0.01 and 0.52 m/50 km along the east and west coasts of Peninsular Malaysia, respectively.
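
    The core relation used above is simply MDT = time-mean sea surface height (from altimetry) minus geoid height (here the ITG-Grace2010s model), evaluated on a grid. The sketch below illustrates that relation with randomly generated placeholder fields rather than real altimetry or geoid data.

    ```python
    import numpy as np

    lats = np.arange(1.0, 7.25, 0.25)        # 0.25-degree grid over a study area (illustrative)
    lons = np.arange(99.0, 106.25, 0.25)
    rng = np.random.default_rng(42)

    # Both fields referenced to the same ellipsoid; values are placeholders [m]
    mean_ssh = 1.5 + 0.1 * rng.standard_normal((lats.size, lons.size))   # time-mean sea surface height
    geoid    = 0.3 + 0.1 * rng.standard_normal((lats.size, lons.size))   # geoid height

    mdt = mean_ssh - geoid
    print("MDT range over grid: %.2f to %.2f m" % (mdt.min(), mdt.max()))

    # A coastal sea-surface slope can then be taken as the MDT difference between
    # neighbouring grid points divided by their along-coast separation.
    ```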

  12. Precise orbit determination for the most recent altimeter missions: towards the 1 mm/y stability of the radial orbit error at regional scales

    NASA Astrophysics Data System (ADS)

    Couhert, Alexandre

    The reference Ocean Surface Topography Mission/Jason-2 satellite (CNES/NASA) has been in orbit for six years (since June 2008). It extends the continuous record of highly accurate sea surface height measurements begun in 1992 by the Topex/Poseidon mission and continued in 2001 by the Jason-1 mission. The complementary missions CryoSat-2 (ESA), HY-2A (CNSA) and SARAL/AltiKa (CNES/ISRO), with lower altitudes and higher inclinations, were launched in April 2010, August 2011 and February 2013, respectively. Although these last three satellites fly in different orbits, they contribute to the altimeter constellation while enhancing the global coverage. The CNES Precision Orbit Determination (POD) Group delivers precise and homogeneous orbit solutions for these independent altimeter missions. The focus of this talk will be on the long-term stability of the orbit time series for mean sea level applications on a regional scale. We discuss various issues related to the assessment of radial orbit error trends, in particular orbit errors dependent on the tracking technique, the reference frame accuracy and stability, and the modeling of the temporal variations of the geopotential. Strategies are then explored to meet a 1 mm/y radial orbit stability over decadal periods at regional scales, and the challenge of evaluating such an improvement is discussed.

  13. Vertical land motion along the coast of Louisiana: Integrating satellite altimetry, tide gauge and GPS

    NASA Astrophysics Data System (ADS)

    Dixon, T. H.; A Karegar, M.; Uebbing, B.; Kusche, J.; Fenoglio-Marc, L.

    2017-12-01

    Coastal Louisiana is experiencing the highest rate of relative sea-level rise in North America due to the combination of sea-level rise and subsidence of the deltaic plain. The land subsidence in this region is studied using various techniques, with continuous GPS sites providing high temporal resolution. Here, we use high-resolution tide-gauge data and advanced processing of satellite altimetry to derive vertical displacement time series at NOAA tide-gauge stations along the coast (Figure 1). We apply state-of-the-art retracking techniques to process raw altimetry data, allowing high accuracy on range measurements close to the coast. Data from Jason-1, -2 and -3, Envisat, SARAL and CryoSat-2 are used, corrected for the solid Earth tide, pole tide and tidal ocean loading, using background models consistent with the GPS processing technique. We reprocess the available GPS data using precise point positioning and estimate the rate uncertainty accounting for correlated noise. The displacement time series are derived by directly subtracting tide-gauge data from the altimetry sea-level anomaly data. The quality of the derived displacement rates is evaluated at Grand Isle, Amerada Pass and Shell Beach, where GPS data are available adjacent to the tide gauges. We use this technique to infer vertical displacement at tide gauges in New Orleans (New Canal Station), Port Fourchon and Southwest Pass along the coastline.
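
    The differencing approach described above can be sketched with synthetic monthly series: the trend of (altimetric sea-level anomaly minus tide-gauge sea level) approximates vertical land motion at the gauge, because the altimeter senses geocentric sea level while the gauge measures sea level relative to the land. All values below are synthetic.

    ```python
    import numpy as np

    months = np.arange(120)                     # 10 years of monthly samples
    t_yr = months / 12.0
    rng = np.random.default_rng(7)

    sea_level_true = 3.0 * t_yr + 20.0 * np.sin(2 * np.pi * t_yr)     # geocentric sea level [mm]
    land_motion    = -8.0 * t_yr                                      # subsidence of the land [mm]

    altimetry  = sea_level_true + rng.normal(0, 5, months.size)                 # geocentric
    tide_gauge = sea_level_true - land_motion + rng.normal(0, 5, months.size)   # relative to land

    vlm_series = altimetry - tide_gauge         # ~ vertical land motion + noise
    rate = np.polyfit(t_yr, vlm_series, 1)[0]
    print(f"estimated vertical land motion: {rate:.1f} mm/yr (truth: -8.0 mm/yr)")
    ```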

  14. New Techniques for Radar Altimetry of Sea Ice and the Polar Oceans

    NASA Astrophysics Data System (ADS)

    Armitage, T. W. K.; Kwok, R.; Egido, A.; Smith, W. H. F.; Cullen, R.

    2017-12-01

    Satellite radar altimetry has proven to be a valuable tool for remote sensing of the polar oceans, with techniques for estimating sea ice thickness and sea surface height in the ice-covered ocean advancing to the point of becoming routine, if not operational, products. Here, we explore new techniques in radar altimetry of the polar oceans and the sea ice cover. First, we present results from fully-focused SAR (FFSAR) altimetry; by accounting for the phase evolution of scatterers in the scene, the FFSAR technique applies an inter-burst coherent integration, potentially over the entire duration that a scatterer remains in the altimeter footprint, which can narrow the effective along-track resolution to just 0.5 m. We discuss the improvement of using interleaved operation over burst-mode operation for applying FFSAR processing to data acquired by future missions, such as a potential CryoSat follow-on. Second, we present simulated sea ice retrievals from the Ka-band Radar Interferometer (KaRIn), the instrument that will be launched on the Surface Water and Ocean Topography (SWOT) mission in 2021, which is capable of producing swath images of surface elevation. These techniques offer the opportunity to advance our understanding of the physics of the ice-covered oceans, plus new insight into how we interpret more conventional radar altimetry data in these regions.

  15. The Challenging Experience Questionnaire: Characterization of challenging experiences with psilocybin mushrooms.

    PubMed

    Barrett, Frederick S; Bradstreet, Matthew P; Leoutsakos, Jeannie-Marie S; Johnson, Matthew W; Griffiths, Roland R

    2016-12-01

    Acute adverse psychological reactions to classic hallucinogens ("bad trips" or "challenging experiences"), while usually benign with proper screening, preparation, and support in controlled settings, remain a safety concern in uncontrolled settings (such as illicit use contexts). Anecdotal and case reports suggest potential adverse acute symptoms including affective (panic, depressed mood), cognitive (confusion, feelings of losing sanity), and somatic (nausea, heart palpitation) symptoms. Responses to items from several hallucinogen-sensitive questionnaires (Hallucinogen Rating Scale, the States of Consciousness Questionnaire, and the Five-Dimensional Altered States of Consciousness questionnaire) in an Internet survey of challenging experiences with the classic hallucinogen psilocybin were used to construct and validate a Challenging Experience Questionnaire. The stand-alone Challenging Experience Questionnaire was then validated in a separate sample. Seven Challenging Experience Questionnaire factors (grief, fear, death, insanity, isolation, physical distress, and paranoia) provide a phenomenological profile of challenging aspects of experiences with psilocybin. Factor scores were associated with difficulty, meaningfulness, spiritual significance, and change in well-being attributed to the challenging experiences. The factor structure did not differ based on gender or prior struggle with anxiety or depression. The Challenging Experience Questionnaire provides a basis for future investigation of predictors and outcomes of challenging experiences with classic hallucinogens. © The Author(s) 2016.

  16. The Question of Education Science: "Experiment"ism Versus "Experimental"ism

    ERIC Educational Resources Information Center

    Howe, Kenneth R.

    2005-01-01

    The ascendant view in the current debate about education science -- experimentism -- is a reassertion of the randomized experiment as the methodological gold standard. Advocates of this view have ignored, not answered, long-standing criticisms of the randomized experiment: its frequent impracticality, its lack of external validity, its confinement…

  17. Identifying Attrition Risk Based on the First Year Experience

    ERIC Educational Resources Information Center

    Naylor, Ryan; Baik, Chi; Arkoudis, Sophia

    2018-01-01

    Using data collected from a recent national survey of Australian first-year students, this paper defines and validates four scales--belonging, feeling supported, intellectual engagement and workload stress--to measure the student experience of university. These scales provide insights into the university experience for both groups and individual…

  18. Review and assessment of turbulence models for hypersonic flows

    NASA Astrophysics Data System (ADS)

    Roy, Christopher J.; Blottner, Frederick G.

    2006-10-01

    Accurate aerodynamic prediction is critical for the design and optimization of hypersonic vehicles. Turbulence modeling remains a major source of uncertainty in the computational prediction of aerodynamic forces and heating for these systems. The first goal of this article is to update the previous comprehensive review of hypersonic shock/turbulent boundary-layer interaction experiments published in 1991 by Settles and Dodson (Hypersonic shock/boundary-layer interaction database. NASA CR 177577, 1991). In their review, Settles and Dodson developed a methodology for assessing experiments appropriate for turbulence model validation and critically surveyed the existing hypersonic experiments. We limit the scope of our current effort by considering only two-dimensional (2D)/axisymmetric flows in the hypersonic flow regime where calorically perfect gas models are appropriate. We extend the prior database of recommended hypersonic experiments (on four 2D and two 3D shock-interaction geometries) by adding three new geometries. The first two geometries, the flat plate/cylinder and the sharp cone, are canonical, zero-pressure gradient flows which are amenable to theory-based correlations, and these correlations are discussed in detail. The third geometry added is the 2D shock impinging on a turbulent flat plate boundary layer. The current 2D hypersonic database for shock-interaction flows thus consists of nine experiments on five different geometries. The second goal of this study is to review and assess the validation usage of various turbulence models on the existing experimental database. Here we limit the scope to one- and two-equation turbulence models where integration to the wall is used (i.e., we omit studies involving wall functions). A methodology for validating turbulence models is given, followed by an extensive evaluation of the turbulence models on the current hypersonic experimental database. A total of 18 one- and two-equation turbulence models are reviewed, and results of turbulence model assessments for the six models that have been extensively applied to the hypersonic validation database are compiled and presented in graphical form. While some of the turbulence models do provide reasonable predictions for the surface pressure, the predictions for surface heat flux are generally poor, and often in error by a factor of four or more. In the vast majority of the turbulence model validation studies we review, the authors fail to adequately address the numerical accuracy of the simulations (i.e., discretization and iterative error) and the sensitivities of the model predictions to freestream turbulence quantities or near-wall y+ mesh spacing. We recommend new hypersonic experiments be conducted which (1) measure not only surface quantities but also mean and fluctuating quantities in the interaction region and (2) provide careful estimates of both random experimental uncertainties and correlated bias errors for the measured quantities and freestream conditions. For the turbulence models, we recommend that a wide-range of turbulence models (including newer models) be re-examined on the current hypersonic experimental database, including the more recent experiments. Any future turbulence model validation efforts should carefully assess the numerical accuracy and model sensitivities. In addition, model corrections (e.g., compressibility corrections) should be carefully examined for their effects on a standard, low-speed validation database. 
Finally, as new experiments or direct numerical simulation data become available with information on mean and fluctuating quantities, they should be used to improve the turbulence models and thus increase their predictive capability.
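
    The review's call to report discretization error can be made concrete with a Richardson-extrapolation-based grid convergence index (GCI). This is a generic sketch with invented heat-flux values on three systematically refined grids, not data from the cited studies.

        # Sketch: observed order of accuracy and grid convergence index (GCI)
        # from three systematically refined grids (Richardson extrapolation).
        import math

        def gci_fine(f_fine, f_medium, f_coarse, r=2.0, safety=1.25):
            """Return (observed order p, GCI on the finest grid, in percent).

            f_fine, f_medium, f_coarse : solution values on fine/medium/coarse grids
            r : constant grid refinement ratio
            """
            p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
            rel_err = abs((f_medium - f_fine) / f_fine)
            return p, 100.0 * safety * rel_err / (r**p - 1.0)

        # Invented peak heat-flux values (kW/m^2) on fine, medium, coarse grids.
        p, gci = gci_fine(f_fine=152.0, f_medium=149.0, f_coarse=143.0)
        print(f"observed order ~ {p:.2f}, GCI(fine) ~ {gci:.2f}% of the fine-grid value")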

  19. Assessing movement quality in persons with severe mental illness - Reliability and validity of the Body Awareness Scale Movement Quality and Experience.

    PubMed

    Hedlund, Lena; Gyllensten, Amanda Lundvik; Waldegren, Tomas; Hansson, Lars

    2016-05-01

    Motor disturbances and disturbed self-recognition are common features that affect mobility in persons with schizophrenia spectrum disorder and bipolar disorder. Physiotherapists in Scandinavia assess and treat movement difficulties in persons with severe mental illness. The Body Awareness Scale Movement Quality and Experience (BAS MQ-E) is a new and shortened version of the commonly used Body Awareness Scale-Health (BAS-H). The purpose of this study was to investigate the inter-rater reliability and the concurrent validity of BAS MQ-E in persons with severe mental illness. The concurrent validity was examined by investigating the relationships between neurological soft signs, alexithymia, fatigue, anxiety, and mastery. Sixty-two persons with severe mental illness participated in the study. The results showed satisfactory inter-rater reliability (n = 53) and concurrent validity (n = 62) with neurological soft signs, especially cognitive and perceptual-based signs. Concurrent validity was also found with physical fatigue and aspects of alexithymia. The scores of BAS MQ-E were in general higher for persons with schizophrenia than for persons with other diagnoses within the schizophrenia spectrum disorders and bipolar disorder. The clinical implications are presented in the discussion.

  20. The development and validation of three videos designed to psychologically prepare patients for coronary bypass surgery.

    PubMed

    Mahler, H I; Kulik, J A

    1995-02-01

    The purpose of this study was to demonstrate the validation of videotape interventions that were designed to prepare patients for coronary artery bypass graft (CABG) surgery. First, three videotapes were developed. Two of the tapes featured the experiences of three actual CABG patients and were constructed to present either an optimistic portrayal of the recovery period (mastery tape) or a portrayal designed to inoculate patients against potential problems (coping tape). The third videotape contained the more general nurse scenes and narration used in the other two tapes, but did not include the experiences of particular patients. We then conducted a study to establish the convergent and discriminant validity of the three tapes. That is, we sought to demonstrate both that the tapes did differ along the mastery-coping dimension, and that they did not differ in other respects (such as in the degree of information provided or the perceived credibility of the narrator). The validation study, conducted with 42 males who had previously undergone CABG, demonstrated that the intended equivalences and differences between the tapes were achieved. The importance of establishing the validity of health-related interventions is discussed.

  1. Pre-launch Optical Characteristics of the Oculus-ASR Nanosatellite for Attitude and Shape Recognition Experiments

    DTIC Science & Technology

    2011-12-02

    construction and validation of predictive computer models such as those used in Time-domain Analysis Simulation for Advanced Tracking (TASAT), a...characterization data, successful construction and validation of predictive computer models was accomplished. And an investigation in pose determination from...

  2. Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2

    NASA Technical Reports Server (NTRS)

    Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)

    1998-01-01

    The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.

  3. Comparative assessment of three standardized robotic surgery training methods.

    PubMed

    Hung, Andrew J; Jayaratna, Isuru S; Teruya, Kara; Desai, Mihir M; Gill, Inderbir S; Goh, Alvin C

    2013-10-01

    To evaluate three standardized robotic surgery training methods, inanimate, virtual reality and in vivo, for their construct validity. To explore the concept of cross-method validity, where the relative performance of each method is compared. Robotic surgical skills were prospectively assessed in 49 participating surgeons who were classified as follows: 'novice/trainee': urology residents, previous experience <30 cases (n = 38) and 'experts': faculty surgeons, previous experience ≥30 cases (n = 11). Three standardized, validated training methods were used: (i) structured inanimate tasks; (ii) virtual reality exercises on the da Vinci Skills Simulator (Intuitive Surgical, Sunnyvale, CA, USA); and (iii) a standardized robotic surgical task in a live porcine model with performance graded by the Global Evaluative Assessment of Robotic Skills (GEARS) tool. A Kruskal-Wallis test was used to evaluate performance differences between novices and experts (construct validity). Spearman's correlation coefficient (ρ) was used to measure the association of performance across inanimate, simulation and in vivo methods (cross-method validity). Novice and expert surgeons had previously performed a median (range) of 0 (0-20) and 300 (30-2000) robotic cases, respectively (P < 0.001). Construct validity: experts consistently outperformed residents with all three methods (P < 0.001). Cross-method validity: overall performance of inanimate tasks significantly correlated with virtual reality robotic performance (ρ = -0.7, P < 0.001) and in vivo robotic performance based on GEARS (ρ = -0.8, P < 0.0001). Virtual reality performance and in vivo tissue performance were also found to be strongly correlated (ρ = 0.6, P < 0.001). We propose the novel concept of cross-method validity, which may provide a method of evaluating the relative value of various forms of skills education and assessment. We externally confirmed the construct validity of each featured training tool. © 2013 BJU International.
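
    A minimal sketch of the two statistical checks named in the abstract, using scipy; the score arrays are placeholders, not the study's data.

        # Sketch: construct validity (novice vs. expert) and cross-method validity
        # (correlation across training methods), as described in the abstract.
        # The scores below are placeholders, not the study's data.
        from scipy.stats import kruskal, spearmanr

        novice_gears = [14, 15, 13, 16, 14, 15, 12, 17]
        expert_gears = [22, 24, 23, 21, 25]

        # Construct validity: do experts score differently from novices?
        h_stat, p_construct = kruskal(novice_gears, expert_gears)

        # Cross-method validity: does inanimate-task performance track in vivo scores?
        inanimate_scores = [10, 12, 9, 14, 11, 13, 8, 15, 20, 22, 19, 23, 21]
        in_vivo_gears    = [13, 15, 12, 16, 14, 15, 11, 17, 22, 24, 21, 25, 23]
        rho, p_cross = spearmanr(inanimate_scores, in_vivo_gears)

        print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_construct:.3f}")
        print(f"Spearman rho = {rho:.2f}, p = {p_cross:.4f}")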

  4. Establishing high resolution melting analysis: method validation and evaluation for c-RET proto-oncogene mutation screening.

    PubMed

    Benej, Martin; Bendlova, Bela; Vaclavikova, Eliska; Poturnajova, Martina

    2011-10-06

    Reliable and effective primary screening of mutation carriers is the key condition for common diagnostic use. The objective of this study is to validate high resolution melting (HRM) analysis as a method for routine primary mutation screening and to accomplish its optimization, evaluation and validation. Due to their heterozygous nature, germline point mutations of the c-RET proto-oncogene, associated with multiple endocrine neoplasia type 2 (MEN2), are suitable for HRM analysis. Early identification of mutation carriers has a major impact on patients' survival due to early onset of medullary thyroid carcinoma (MTC) and resistance to conventional therapy. The authors performed a series of validation assays according to International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) guidelines for validation of analytical procedures, along with appropriate design and optimization experiments. After validated evaluation of HRM, the method was utilized for primary screening of 28 pathogenic c-RET mutations distributed among nine exons of the c-RET gene. Validation experiments confirmed the repeatability, robustness, accuracy and reproducibility of HRM. All c-RET gene pathogenic variants were detected with no occurrence of false-positive/false-negative results. The data provide basic information about the design, establishment and validation of HRM for primary screening of genetic variants in order to distinguish heterozygous point mutation carriers among wild-type sequence carriers. HRM analysis is a powerful and reliable tool for rapid and cost-effective primary screening, e.g., of c-RET gene germline and/or sporadic mutations, and can be used as a potential first-line diagnostic tool.

  5. Numerical Investigation of the Performance of a Supersonic Combustion Chamber and Comparison with Experiments

    NASA Astrophysics Data System (ADS)

    Banica, M. C.; Chun, J.; Scheuermann, T.; Weigand, B.; Wolfersdorf, J. v.

    2009-01-01

    Scramjet-powered vehicles can decrease costs for access to space, but substantial obstacles still exist in their realization. For example, experiments in the relevant Mach number regime are difficult to perform and flight testing is expensive. Therefore, numerical methods are often employed for system layout, but they require validation against experimental data. Here, we validate the commercial code CFD++ against experimental results for hydrogen combustion in the supersonic combustion facility of the Institute of Aerospace Thermodynamics (ITLR) at the Universität Stuttgart. Fuel is injected through a lobed strut injector, which provides rapid mixing. Our numerical data show reasonable agreement with the experiments. We further investigate the effects of varying equivalence ratios on several important performance parameters.

  6. Can training improve the quality of inferences made by raters in competency modeling? A quasi-experiment.

    PubMed

    Lievens, Filip; Sanchez, Juan I

    2007-05-01

    A quasi-experiment was conducted to investigate the effects of frame-of-reference training on the quality of competency modeling ratings made by consultants. Human resources consultants from a large consulting firm were randomly assigned to either a training or a control condition. The discriminant validity, interrater reliability, and accuracy of the competency ratings were significantly higher in the training group than in the control group. Further, the discriminant validity and interrater reliability of competency inferences were highest among an additional group of trained consultants who also had competency modeling experience. Together, these results suggest that procedural interventions such as rater training can significantly enhance the quality of competency modeling. 2007 APA, all rights reserved

  7. Development and initial validation of primary care provider mental illness management and team-based care self-efficacy scales.

    PubMed

    Loeb, Danielle F; Crane, Lori A; Leister, Erin; Bayliss, Elizabeth A; Ludman, Evette; Binswanger, Ingrid A; Kline, Danielle M; Smith, Meredith; deGruy, Frank V; Nease, Donald E; Dickinson, L Miriam

    To develop and validate self-efficacy scales for primary care provider (PCP) mental illness management and team-based care participation. We developed three self-efficacy scales: team-based care (TBC), mental illness management (MIM), and chronic medical illness (CMI). We developed the scales using Bandura's Social Cognitive Theory as a guide. The survey instrument included items from previously validated scales on team-based care and mental illness management. We administered a mail survey to 900 randomly selected Colorado physicians. We conducted exploratory principal factor analysis with oblique rotation. We constructed self-efficacy scales and calculated standardized Cronbach's alpha coefficients to test internal consistency. We calculated correlation coefficients between the MIM and TBC scales and previously validated measures related to each scale to evaluate convergent validity. We tested correlations between the TBC and the measures expected to correlate with the MIM scale and vice versa to evaluate discriminant validity. PCPs (n=402, response rate=49%) from diverse practice settings completed surveys. Items grouped into factors as expected. Cronbach's alphas were 0.94, 0.88, and 0.83 for the TBC, MIM, and CMI scales respectively. In convergent validity testing, the TBC scale was correlated as predicted with scales assessing communication strategies, attitudes toward teams, and other teamwork indicators (r=0.25 to 0.40, all statistically significant). Likewise, the MIM scale was significantly correlated with several items about knowledge and experience managing mental illness (r=0.24 to 0.41, all statistically significant). As expected in discriminant validity testing, the TBC scale had only very weak correlations with the mental illness knowledge and experience managing mental illness items (r=0.03 to 0.12). Likewise, the MIM scale was only weakly correlated with measures of team-based care (r=0.09 to 0.17). This validation study of the MIM and TBC self-efficacy scales showed high internal validity and good construct validity. Copyright © 2016 Elsevier Inc. All rights reserved.
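
    A short sketch of the internal-consistency statistic reported above, in its raw (covariance-based) form rather than the standardized coefficient mentioned in the abstract; the item-response matrix is a placeholder.

        # Sketch: Cronbach's alpha for a set of scale items.
        # Rows = respondents, columns = items; the matrix below is a placeholder.
        import numpy as np

        def cronbach_alpha(items):
            """items : (n_respondents, n_items) array of item scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

        responses = np.array([
            [4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
            [2, 3, 2, 3], [4, 4, 5, 4], [3, 4, 3, 3],
        ])
        print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")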

  8. Improvement of Experiment Planning as an Important Precondition for the Quality of Educational Research

    ERIC Educational Resources Information Center

    Rutkiene, Ausra; Tereseviciene, Margarita

    2010-01-01

    The article presents the stages of experiment planning that are necessary to ensure its validity and reliability. The research data reveal that doctoral students of Educational Research approach the planning of the experiment as the planning of the whole dissertation research; and the experiment as a research method is often confused…

  9. Validating An Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments

    NASA Astrophysics Data System (ADS)

    Catanzarite, Joseph; Burke, Christopher J.; Li, Jie; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    The Kepler Mission is developing an Analytic Completeness Model (ACM) to estimate detection completeness contours as a function of exoplanet radius and period for each target star. Accurate completeness contours are necessary for robust estimation of exoplanet occurrence rates. The main components of the ACM for a target star are: detection efficiency as a function of SNR, the window function (WF) and the one-sigma depth function (OSDF) (Ref. Burke et al. 2015). The WF captures the falloff in transit detection probability at long periods that is determined by the observation window (the duration over which the target star has been observed). The OSDF is the transit depth (in parts per million) that yields SNR of unity for the full transit train. It is a function of period, and accounts for the time-varying properties of the noise and for missing or deweighted data. We are performing flux-level transit injection (FLTI) experiments on selected Kepler target stars with the goal of refining and validating the ACM. “Flux-level” injection machinery inserts exoplanet transit signatures directly into the flux time series, as opposed to “pixel-level” injection, which inserts transit signatures into the individual pixels using the pixel response function. See Jie Li's poster: ID #2493668, “Flux-level transit injection experiments with the NASA Pleiades Supercomputer” for details, including performance statistics. Since FLTI is affordable for only a small subset of the Kepler targets, the ACM is designed to apply to most Kepler target stars. We validate this model using “deep” FLTI experiments, with ~500,000 injection realizations on each of a small number of targets and “shallow” FLTI experiments with ~2000 injection realizations on each of many targets. From the results of these experiments, we identify anomalous targets, model their behavior and refine the ACM accordingly. In this presentation, we discuss progress in validating and refining the ACM, and we compare our detection efficiency curves with those derived from the associated pixel-level transit injection experiments. Kepler was selected as the 10th mission of the Discovery Program. Funding for this mission is provided by NASA, Science Mission Directorate.
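
    A sketch of how the three ACM ingredients named above combine into a completeness estimate. The way they are combined follows the abstract's own definitions (OSDF gives the depth with SNR = 1, so SNR = depth/OSDF), but the three ingredient functions in the code are toy placeholders, not Kepler pipeline products.

        # Sketch: combining the ACM ingredients named in the abstract.
        # SNR = depth / OSDF(period); completeness = P_det(SNR) * WF(period).
        # The three ingredient functions below are toy placeholders, NOT Kepler products.
        import math

        def osdf_ppm(period_days):            # placeholder one-sigma depth function
            return 10.0 * math.sqrt(period_days)

        def window_function(period_days, obs_span_days=1459.0):   # placeholder WF
            return max(0.0, min(1.0, 3.0 * (obs_span_days / period_days - 1.0)))

        def detection_efficiency(snr):        # placeholder ramp; real curve is fit per star
            return max(0.0, min(1.0, (snr - 6.0) / 6.0))

        def completeness(period_days, depth_ppm):
            snr = depth_ppm / osdf_ppm(period_days)
            return detection_efficiency(snr) * window_function(period_days)

        for period, depth in [(10.0, 500.0), (100.0, 1000.0), (400.0, 2000.0)]:
            print(f"P = {period:5.0f} d, depth = {depth:6.0f} ppm "
                  f"-> completeness ~ {completeness(period, depth):.2f}")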

  10. Reducing Threats to Validity by Design in a Nonrandomized Experiment of a School-Wide Prevention Model

    ERIC Educational Resources Information Center

    Sørlie, Mari-Anne; Ogden, Terje

    2014-01-01

    This paper reviews literature on the rationale, challenges, and recommendations for choosing a nonequivalent comparison (NEC) group design when evaluating intervention effects. After reviewing frequently addressed threats to validity, the paper describes recommendations for strengthening the research design and how the recommendations were…

  11. Culture Training: Validation Evidence for the Culture Assimilator.

    ERIC Educational Resources Information Center

    Mitchell, Terence R.; And Others

    The culture assimilator, a programed self-instructional approach to culture training, is described and a series of laboratory experiments and field studies validating the culture assimilator are reviewed. These studies show that the culture assimilator is an effective method of decreasing some of the stress experienced when one works with people…

  12. 29 CFR 1607.5 - General standards for validity studies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... experience on the job. J. Interim use of selection procedures. Users may continue the use of a selection... studies. A. Acceptable types of validity studies. For the purposes of satisfying these guidelines, users... which has an adverse impact and which selection procedure has an adverse impact, each user should...

  13. 29 CFR 1607.5 - General standards for validity studies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... experience on the job. J. Interim use of selection procedures. Users may continue the use of a selection... studies. A. Acceptable types of validity studies. For the purposes of satisfying these guidelines, users... which has an adverse impact and which selection procedure has an adverse impact, each user should...

  14. 29 CFR 1607.5 - General standards for validity studies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... experience on the job. J. Interim use of selection procedures. Users may continue the use of a selection... studies. A. Acceptable types of validity studies. For the purposes of satisfying these guidelines, users... which has an adverse impact and which selection procedure has an adverse impact, each user should...

  15. Students' Self-Evaluation and Reflection (Part 1): "Measurement"

    ERIC Educational Resources Information Center

    Cambra-Fierro, Jesus; Cambra-Berdun, Jesus

    2007-01-01

    Purpose: The objective of the paper is the development and validation of scales to assess reflective learning. Design/methodology/approach: The research is based on a literature review plus in-classroom experience. For the scale validation process, exploratory and confirmatory analyses were conducted, following proposals made by Anderson and…

  16. Examining the Cultural Validity of a College Student Engagement Survey for Latinos

    ERIC Educational Resources Information Center

    Hernandez, Ebelia; Mobley, Michael; Coryell, Gayle; Yu, En-Hui; Martinez, Gladys

    2013-01-01

    Using critical race theory and quantitative criticalist stance, this study examines the construct validity of an engagement survey, "Student Experiences in the Research University" (SERU) for Latino college students through exploratory factor analysis. Results support the principal seven-factor SERU model. However subfactors exhibited…

  17. Reflexive Orienting in Response to Short- and Long-Duration Gaze Cues in Young, Young-Old, and Old-Old Adults

    PubMed Central

    Gayzur, Nora D.; Langley, Linda K.; Kelland, Chris; Wyman, Sara V.; Saville, Alyson L.; Ciernia, Annie T.; Padmanabhan, Ganesh

    2013-01-01

    Shifting visual focus based on the perceived gaze direction of another person is one form of joint attention. The present study investigated if this socially-relevant form of orienting is reflexive and whether it is influenced by age. Green and Woldorff (2012) argued that rapid cueing effects (faster responses to validly-cued targets than to invalidly-cued targets) were limited to conditions in which a cue overlapped in time with a target. They attributed slower responses following invalid cues to the time needed to resolve incongruent spatial information provided by the concurrently-presented cue and target. The present study examined orienting responses of young (18-31 years), young-old (60-74 years), and old-old adults (75-91 years) following uninformative central gaze cues that overlapped in time with the target (Experiment 1) or that were removed prior to target presentation (Experiment 2). When the cue and target overlapped, all three groups localized validly-cued targets faster than invalidly-cued targets, and validity effects emerged earlier for the two younger groups (at 100 ms post cue onset) than for the old-old group (at 300 ms post cue onset). With a short duration cue (Experiment 2), validity effects developed rapidly (by 100 ms) for all three groups, suggesting that validity effects resulted from reflexive orienting based on gaze cue information rather than from cue-target conflict. Thus, although old-old adults may be slow to disengage from persistent gaze cues, attention continues to be reflexively guided by gaze cues late in life. PMID:24170377

  18. Design of experiments in medical physics: Application to the AAA beam model validation.

    PubMed

    Dufreneix, S; Legrand, C; Di Bartolo, C; Bremaud, M; Mesgouez, J; Tiplica, T; Autret, D

    2017-09-01

    The purpose of this study is to evaluate the usefulness of the design of experiments in the analysis of multiparametric problems related to quality assurance in radiotherapy. The main motivation is to use this statistical method to optimize the quality assurance processes in the validation of beam models. Considering the Varian Eclipse system, eight parameters with several levels were selected: energy, MLC, depth, X, Y1 and Y2 jaw dimensions, wedge and wedge jaw. A Taguchi table was used to define 72 validation tests. Measurements were conducted in water using a CC04 on a TrueBeam STx, a TrueBeam Tx, a Trilogy and a 2300IX accelerator matched by the vendor. Dose was computed using the AAA algorithm. The same raw data were used for all accelerators during the beam modelling. The mean difference between computed and measured doses was 0.1±0.5% for all beams and all accelerators, with a maximum difference of 2.4% (under the 3% tolerance level). For all beams, the measured doses were within 0.6% for all accelerators. The energy was found to be an influencing parameter, but the deviations observed were smaller than 1% and not considered clinically significant. Designs of experiments can help define the optimal measurement set to validate a beam model. The proposed method can be used to identify the prognostic factors of dose accuracy. The beam models were validated for the 4 accelerators, which were found dosimetrically equivalent even though the accelerator characteristics differ. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
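
    A small sketch of how main effects can be pulled out of a Taguchi-style test matrix like the one described above; the factor levels and dose deviations below are invented to show the bookkeeping, not the paper's measurements.

        # Sketch: main-effect analysis of a Taguchi-style validation matrix.
        # Each run records its factor levels and the computed-minus-measured dose
        # deviation (%).  Values are invented for illustration.
        from collections import defaultdict
        from statistics import mean

        runs = [
            {"energy": "6X",  "wedge": "open", "depth_cm": 5,  "dev_pct": 0.2},
            {"energy": "6X",  "wedge": "60W",  "depth_cm": 10, "dev_pct": 0.6},
            {"energy": "15X", "wedge": "open", "depth_cm": 10, "dev_pct": -0.1},
            {"energy": "15X", "wedge": "60W",  "depth_cm": 5,  "dev_pct": 0.4},
        ]

        def main_effect(runs, factor):
            """Mean deviation per level of `factor`; a large spread between level
            means flags the factor as influential."""
            by_level = defaultdict(list)
            for run in runs:
                by_level[run[factor]].append(run["dev_pct"])
            return {level: mean(vals) for level, vals in by_level.items()}

        for factor in ("energy", "wedge"):
            print(factor, main_effect(runs, factor))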

  19. Assessing quality of maternity care in Hungary: expert validation and testing of the mother-centered prenatal care (MCPC) survey instrument.

    PubMed

    Rubashkin, Nicholas; Szebik, Imre; Baji, Petra; Szántó, Zsuzsa; Susánszky, Éva; Vedam, Saraswathi

    2017-11-16

    Instruments to assess quality of maternity care in Central and Eastern European (CEE) region are scarce, despite reports of poor doctor-patient communication, non-evidence-based care, and informal cash payments. We validated and tested an online questionnaire to study maternity care experiences among Hungarian women. Following literature review, we collated validated items and scales from two previous English-language surveys and adapted them to the Hungarian context. An expert panel assessed items for clarity and relevance on a 4-point ordinal scale. We calculated item-level Content Validation Index (CVI) scores. We designed 9 new items concerning informal cash payments, as well as 7 new "model of care" categories based on mode of payment. The final questionnaire (N = 111 items) was tested in two samples of Hungarian women, representative (N = 600) and convenience (N = 657). We conducted bivariate analysis and thematic analysis of open-ended responses. Experts rated pre-existing English-language items as clear and relevant to Hungarian women's maternity care experiences with an average CVI for included questions of 0.97. Significant differences emerged across the model of care categories in terms of informal payments, informed consent practices, and women's perceptions of autonomy. Thematic analysis (N = 1015) of women's responses identified 13 priority areas of the maternity care experience, 9 of which were addressed by the questionnaire. We developed and validated a comprehensive questionnaire that can be used to evaluate respectful maternity care, evidence-based practice, and informal cash payments in CEE region and beyond.
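
    A minimal sketch of the item-level Content Validation Index calculation referenced above (the share of experts rating an item 3 or 4 on the 4-point relevance scale); the panel ratings and item names are placeholders.

        # Sketch: item-level Content Validity Index (I-CVI).
        # I-CVI = fraction of expert panellists rating the item 3 or 4
        # on the 4-point relevance scale.  Ratings below are placeholders.
        def item_cvi(ratings, relevant_levels=(3, 4)):
            return sum(r in relevant_levels for r in ratings) / len(ratings)

        panel_ratings = {
            "item_informed_consent": [4, 4, 3, 4, 4, 3],
            "item_informal_payment": [4, 3, 4, 4, 2, 4],
        }
        for item, ratings in panel_ratings.items():
            print(f"{item}: I-CVI = {item_cvi(ratings):.2f}")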

  20. A self-report measure of legal and administrative aggression within intimate relationships.

    PubMed

    Hines, Denise A; Douglas, Emily M; Berger, Joshua L

    2015-01-01

    Although experts agree that intimate partner violence (IPV) is a multidimensional phenomenon comprising both physical and non-physical acts, there is no measure of legal and administrative (LA) forms of IPV. LA aggression occurs when one partner manipulates the legal and other administrative systems to the detriment of his/her partner. Our measure was developed using the qualitative literature on male IPV victims' experiences. We tested the reliability and validity of our LA aggression measure on two samples of men: 611 men who sustained IPV and sought help, and 1,601 men in a population-based sample. Construct validity of the victimization scale was supported through factor analyses, correlations with other forms of IPV victimization, and comparisons of the rates of LA aggression between the two samples; reliability was established through Cronbach's alpha. Evidence for the validity and reliability of the perpetration scale was mixed and therefore needs further analyses and revisions before we can recommend its use in empirical work. There is initial support for the victimization scale as a valid and reliable measure of LA aggression victimization among men, but work is needed using women's victimization experiences to establish the reliability and validity of this measure for women. An LA aggression measure should also be developed using LGBTQ victims' experiences, and for couples who are well into the divorce and child custody legal process. Legal personnel and practitioners should be educated on this form of IPV so that they can appropriately work with clients who have been victimized or perpetrate LA aggression. © 2014 Wiley Periodicals, Inc.

  1. Inter-Disciplinary Collaboration in Support of the Post-Standby TREAT Mission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark; Baker, Benjamin; Ortensi, Javier

    Although analysis methods have advanced significantly in the last two decades, high fidelity multi-physics methods for reactor systems have been under development for only a few years and are not presently mature nor deployed. Furthermore, very few methods provide the ability to simulate rapid transients in three dimensions. Data for validation of advanced time-dependent multi-physics is sparse; at TREAT, historical data were not collected for the purpose of validating three-dimensional methods, let alone multi-physics simulations. Existing data continues to be collected to attempt to simulate the behavior of experiments and calibration transients, but it will be insufficient for the complete validation of analysis methods used for TREAT transient simulations. Hence, a 2018 restart will most likely occur without the direct application of advanced modeling and simulation methods. At present, the current INL modeling and simulation team plans to work with TREAT operations staff in performing reactor simulations with MAMMOTH, in parallel with the software packages currently being used in preparation for core restart (e.g., MCNP5, RELAP5, ABAQUS). The TREAT team has also requested specific measurements to be performed during startup testing, currently scheduled to run from February to August of 2018. These startup measurements will be crucial in validating the new analysis methods in preparation for ultimate application for TREAT operations and experiment design. This document describes the collaboration between modeling and simulation staff and the restart, operations, instrumentation and experiment development teams to be able to effectively interact and achieve successful validation work during restart testing.

  2. Rape Survivors' Experiences of the Silent Protest: Implications for Promoting Healing and Resilience.

    PubMed

    Padmanabhanunni, Anita; Edwards, David

    2016-05-01

    This article examines the experiences of nine rape survivors who participated in the Silent Protest, an annual protest march at Rhodes University that aims to highlight the sexual abuse of women, validate the harm done, and foster solidarity among survivors. Participants responded to a semi-structured interview focusing on the context of their rape and its impact, and their experiences of participation in the Protest. In the first phase of data analysis, synoptic case narratives were written. In the second, themes from participants' experience were identified using interpretative phenomenological analysis. In the third, the data were examined in light of questions around the extent to which participation contributed to healing. Participants reported experiences of validation and empowerment, but the majority were suffering from posttraumatic stress disorder. In some cases, participation had exacerbated self-blame and avoidant coping. Recommendations are made about the provision of psychoeducation and counseling at such events. © The Author(s) 2015.

  3. External Standards or Standard Addition? Selecting and Validating a Method of Standardization

    NASA Astrophysics Data System (ADS)

    Harvey, David T.

    2002-05-01

    A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and obtain a practical experience in the difference between performing an external standardization and a standard addition.
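
    To make the standardization choice concrete, here is a short sketch of the standard-addition calculation (x-intercept extrapolation, neglecting dilution corrections) and a spike-recovery check; the signals and concentrations are invented, classroom-style numbers.

        # Sketch: standard addition (x-intercept extrapolation) and spike recovery.
        # Signals and concentrations are invented, classroom-style numbers.
        import numpy as np

        # Standard addition: equal sample aliquots spiked with increasing analyte.
        added_conc = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # mg/L added
        signal     = np.array([0.21, 0.30, 0.39, 0.47, 0.57])  # instrument response

        slope, intercept = np.polyfit(added_conc, signal, 1)
        c_sample = intercept / slope       # |x-intercept| = analyte conc. in the aliquot
        print(f"standard addition result: {c_sample:.2f} mg/L")

        # Spike recovery: analyze the sample with and without a known spike.
        c_unspiked, c_spiked, c_added = 2.31, 4.27, 2.00        # mg/L
        recovery = 100.0 * (c_spiked - c_unspiked) / c_added
        print(f"spike recovery: {recovery:.0f}%")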

  4. Multi-Evaporator Miniature Loop Heat Pipe for Small Spacecraft Thermal Control

    NASA Technical Reports Server (NTRS)

    Ku, Jentung; Ottenstein, Laura; Douglas, Donya

    2008-01-01

    This paper presents the development of the Thermal Loop experiment under NASA's New Millennium Program Space Technology 8 (ST8) Project. The Thermal Loop experiment was originally planned for validating in space an advanced heat transport system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers. Details of the thermal loop concept, technical advances and benefits, Level 1 requirements and the technology validation approach are described. An MLHP breadboard has been built and tested in the laboratory and thermal vacuum environments, and has demonstrated excellent performance that met or exceeded the design requirements. The MLHP retains all features of state-of-the-art loop heat pipes and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. In addition, an analytical model has been developed to simulate the steady state and transient operation of the MHLP, and the model predictions agreed very well with experimental results. A protoflight MLHP has been built and is being tested in a thermal vacuum chamber to validate its performance and technical readiness for a flight experiment.

  5. Continued Development and Validation of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2015-11-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks; determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and provide an intermediate step between theory and future experiments. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (~ 36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. Results from verification of the PSI-TET extended MHD model using the GEM magnetic reconnection challenge will also be presented along with investigation of injector configurations for future SIHI experiments using Taylor state equilibrium calculations. Work supported by DoE.

  6. Zig-zag tape influence in NREL Phase VI wind turbine

    NASA Astrophysics Data System (ADS)

    Gomez-Iradi, Sugoi; Munduate, Xabier

    2014-06-01

    A two-bladed, 10-metre-diameter wind turbine was tested in the 24.4 m × 36.6 m NASA-Ames wind tunnel (Phase VI). These experiments have been used extensively for validation of CFD and other engineering tools. The free-transition case (S) has been, and remains, the most widely used for validation; it consists of a 3° pitch case at a rotational speed of 72 rpm in an upwind configuration, with and without yaw misalignment. However, there is another, less visited case (M) in which an identical configuration was tested but with the inclusion of a zig-zag tape; this was called the fixed-transition sequence. This paper shows the differences between the free- and fixed-transition cases, the latter being more appropriate for comparison with fully turbulent simulations. Steady k-ω SST fully turbulent computations performed with the WMB CFD method are compared with the experiments, showing better predictions in the attached-flow region when compared against the fixed-transition experiments. This work demonstrates the utility of the M case (fixed transition) and shows its differences with respect to the S case (free transition) for validation purposes.

  7. Anti-Collision Function Design and Performances of the CNES Formation Flying Experiment on the PRISMA Mission

    NASA Technical Reports Server (NTRS)

    Cayeux, P.; Raballand, F.; Borde, J.; Berges, J.-C.; Meyssignac, B.

    2007-01-01

    Within the framework of a partnership agreement, EADS ASTRIUM has worked since June 2006 for the CNES formation flying experiment on the PRISMA mission. EADS ASTRIUM is responsible for the anti-collision function. This responsibility covers the design and the development of the function as a Matlab/Simulink library, as well as its functional validation and performance assessment. PRISMA is a technology in-orbit testbed mission from the Swedish National Space Board, mainly devoted to formation flying demonstration. PRISMA is made of two micro-satellites that will be launched in 2009 on a quasi-circular SSO at about 700 km of altitude. The CNES FFIORD experiment embedded on PRISMA aims at flight validating an FFRF sensor designed for formation control, and assessing its performances, in preparation to future formation flying missions such as Simbol X; FFIORD aims as well at validating various typical autonomous rendezvous and formation guidance and control algorithms. This paper presents the principles of the collision avoidance function developed by EADS ASTRIUM for FFIORD; three kinds of maneuvers were implemented and are presented in this paper with their performances.

  8. Validation and Continued Development of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2016-10-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.

  9. A verification and validation effort for high explosives at Los Alamos National Lab (u)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scovel, Christina A; Menikoff, Ralph S

    2009-01-01

    We have started a project to verify and validate ASC codes used to simulate detonation waves in high explosives. Since there are no non-trivial analytic solutions, we are going to compare simulated results with experimental data that cover a wide range of explosive phenomena. The intent is to compare both different codes and different high explosives (HE) models. The first step is to test the products equation of state used for the HE models. For this purpose, the cylinder test, flyer plate and plate-push experiments are being used. These experiments sample different regimes in thermodynamic phase space: the CJ isentrope for the cylinder tests, the isentrope behind an overdriven detonation wave for the flyer plate experiment, and expansion following a reflected CJ detonation for the plate-push experiment, which is sensitive to the Gruneisen coefficient. The results of our findings for PBX 9501 are presented here.

  10. Introspection of subjective feelings is sensitive and specific.

    PubMed

    Questienne, Laurence; van Dijck, Jean-Philippe; Gevers, Wim

    2018-02-01

    Contrary to behaviorist ideas, recent studies suggest that introspection can be accurate and reliable. However, an unresolved question is whether people are able to report specific aspects of their phenomenal experience, or whether they report more general nonspecific experiences. To address this question, we investigated the sensitivity and validity of our introspection for different types of conflict. Taking advantage of the congruency sequence effect, we dissociated response conflict while keeping visual conflict unchanged in a Stroop and in a priming task. Participants were subsequently asked to report on either their experience of the urge to err or on their feeling of visual conflict. Depending on the focus of the introspection, subjective reports specifically followed either the response conflict or the visual conflict. These results demonstrate that our introspective reports can be sensitive and that we are able to dissociate specific aspects of our phenomenal experiences in a valid manner. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  11. Further cross-cultural validation of the theory of mental self-government.

    PubMed

    Zhang, L F

    1999-03-01

    This study was designed to achieve two objectives. The 1st was to investigate the cross-cultural validity of the Thinking Styles Inventory (TSI; R. J. Sternberg & R. K. Wagner, 1992), which is based on the theory of mental self-government (R. J. Sternberg, 1988, 1990, 1997). The 2nd was to examine the relationships between thinking styles as assessed by the TSI and a number of student characteristics, including age, gender, college class level, work experience, and travel experience. One hundred fifty-one students from the University of Hong Kong participated in the study. Results indicated that the thinking styles evaluated by the TSI could be identified among the participants. Moreover, there were significant relationships between certain thinking styles, especially creativity-relevant styles and 3 student characteristics: age, work experience, and travel experience. Implications of these findings for teaching and learning in and outside the classroom are discussed.

  12. Development and examination of the psychometric properties of the Learning Experience Scale in nursing.

    PubMed

    Takase, Miyuki; Imai, Takiko; Uemura, Chizuru

    2016-06-01

    This paper examines the psychometric properties of the Learning Experience Scale. A survey method was used to collect data from a total of 502 nurses. Data were analyzed by factor analysis and the known-groups technique to examine the construct validity of the scale. In addition, internal consistency was evaluated by Cronbach's alpha, and stability was examined by test-retest correlation. Factor analysis showed that the Learning Experience Scale consisted of five factors: learning from practice, others, training, feedback, and reflection. The scale also had the power to discriminate between nurses with high and low levels of nursing competence. The internal consistency and the stability of the scale were also acceptable. The Learning Experience Scale is a valid and reliable instrument, and helps organizations to effectively design learning interventions for nurses. © 2015 Wiley Publishing Asia Pty Ltd.

  13. Traverse Planning Experiments for Future Planetary Surface Exploration

    NASA Technical Reports Server (NTRS)

    Hoffman, Stephen J.; Voels, Stephen A.; Mueller, Robert P.; Lee, Pascal C.

    2012-01-01

    The purpose of the investigation is to evaluate methodology and data requirements for a remotely-assisted robotic traverse of an extraterrestrial planetary surface to support a human exploration program, assess opportunities for in-transit science operations, and validate landing site survey and selection techniques during a planetary surface exploration mission analog demonstration at Haughton Crater on Devon Island, Nunavut, Canada. Additionally, the investigation aims to 1) identify the quality of remote observation data sets (i.e., surface imagery from orbit) required for effective pre-traverse route planning and determine whether surface-level data (i.e., onboard robotic imagery or other sensor data) are required for a successful traverse, and whether additional surface-level data can improve traverse efficiency or probability of success (TRPF Experiment); 2) evaluate the feasibility of, and techniques for, conducting opportunistic science investigations during this type of traverse (OSP Experiment); and 3) assess the utility of a remotely-assisted robotic vehicle for a landing site validation survey (LSV Experiment).

  14. An Approach for Validating Actinide and Fission Product Burnup Credit Criticality Safety Analyses--Criticality (keff) Predictions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scaglione, John M; Mueller, Don; Wagner, John C

    2011-01-01

    One of the most significant remaining challenges associated with expanded implementation of burnup credit in the United States is the validation of depletion and criticality calculations used in the safety evaluation - in particular, the availability and use of applicable measured data to support validation, especially for fission products. Applicants and regulatory reviewers have been constrained by both a scarcity of data and a lack of clear technical basis or approach for use of the data. U.S. Nuclear Regulatory Commission (NRC) staff have noted that the rationale for restricting their Interim Staff Guidance on burnup credit (ISG-8) to actinide-only is based largely on the lack of clear, definitive experiments that can be used to estimate the bias and uncertainty for computational analyses associated with using burnup credit. To address the issue of validation, the NRC initiated a project with the Oak Ridge National Laboratory to (1) develop and establish a technically sound validation approach (both depletion and criticality) for commercial spent nuclear fuel (SNF) criticality safety evaluations based on best-available data and methods and (2) apply the approach for representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The purpose of this paper is to describe the criticality (keff) validation approach, and resulting observations and recommendations. Validation of the isotopic composition (depletion) calculations is addressed in a companion paper at this conference. For criticality validation, the approach is to utilize (1) available laboratory critical experiment (LCE) data from the International Handbook of Evaluated Criticality Safety Benchmark Experiments and the French Haut Taux de Combustion (HTC) program to support validation of the principal actinides and (2) calculated sensitivities, nuclear data uncertainties, and the limited available fission product LCE data to predict and verify individual biases for relevant minor actinides and fission products. This paper (1) provides a detailed description of the approach and its technical bases, (2) describes the application of the approach for representative pressurized water reactor and boiling water reactor safety analysis models to demonstrate its usage and applicability, (3) provides reference bias results based on the prerelease SCALE 6.1 code package and ENDF/B-VII nuclear cross-section data, and (4) provides recommendations for application of the results and methods to other code and data packages.
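
    A greatly simplified sketch of how a bias and its scatter can be pulled from a set of benchmark criticality calculations; the k-effective values are invented, and the actual approach described above additionally uses sensitivity/uncertainty data, weighting, and trending analyses.

        # Greatly simplified sketch: bias and scatter from benchmark LCE calculations.
        # bias = mean(k_calc - k_benchmark); the k values below are invented, and a
        # production analysis would add trending, weighting, and S/U-based terms.
        import numpy as np

        k_benchmark = np.array([1.0000, 0.9998, 1.0002, 1.0001, 0.9999])
        k_calc      = np.array([0.9981, 0.9975, 0.9990, 0.9978, 0.9984])

        diff = k_calc - k_benchmark
        bias = diff.mean()                       # negative bias -> code underpredicts k
        bias_sigma = diff.std(ddof=1)            # scatter about the mean bias

        print(f"bias = {bias*1e5:+.0f} pcm, scatter = {bias_sigma*1e5:.0f} pcm")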

  15. Design and validation of the Health Professionals' Attitudes Toward the Homeless Inventory (HPATHI).

    PubMed

    Buck, David S; Monteiro, F Marconi; Kneuper, Suzanne; Rochon, Donna; Clark, Dana L; Melillo, Allegra; Volk, Robert J

    2005-01-10

    Recent literature has called for humanistic care of patients and for medical schools to begin incorporating humanism into medical education. To assess the attitudes of health-care professionals toward homeless patients and to demonstrate how those attitudes might impact optimal care, we developed and validated a new survey instrument, the Health Professional Attitudes Toward the Homeless Inventory (HPATHI). An instrument that measures providers' attitudes toward the homeless could offer meaningful information for the design and implementation of educational activities that foster more compassionate homeless health care. Our intention was to describe the process of designing and validating the new instrument and to discuss the usefulness of the instrument for assessing the impact of educational experiences that involve working directly with the homeless on the attitudes, interest, and confidence of medical students and other health-care professionals. The study consisted of three phases: identifying items for the instrument; pilot testing the initial instrument with a group of 72 third-year medical students; and modifying and administering the instrument in its revised form to 160 health-care professionals and third-year medical students. The instrument was analyzed for reliability and validity throughout the process. A 19-item version of the HPATHI had good internal consistency with a Cronbach's alpha of 0.88 and a test-retest reliability coefficient of 0.69. The HPATHI showed good concurrent validity, and respondents with more than one year of experience with homeless patients scored significantly higher than did those with less experience. Factor analysis yielded three subscales: Personal Advocacy, Social Advocacy, and Cynicism. The HPATHI demonstrated strong reliability for the total scale and satisfactory test-retest reliability. Extreme group comparisons suggested that experience with the homeless rather than medical training itself could affect health-care professionals' attitudes toward the homeless. This could have implications for the evaluation of medical school curricula.

  16. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  17. Strain gauge validation experiments for the Sandia 34-meter VAWT (Vertical Axis Wind Turbine) test bed

    NASA Astrophysics Data System (ADS)

    Sutherland, Herbert J.

    1988-08-01

    Sandia National Laboratories has erected a research-oriented, 34-meter-diameter Darrieus vertical axis wind turbine near Bushland, Texas. This machine, designated the Sandia 34-m VAWT Test Bed, is equipped with a large array of strain gauges that have been placed at critical positions about the blades. This manuscript details a series of four-point bend experiments that were conducted to validate the output of the blade strain gauge circuits. The output of a particular gauge circuit is validated by comparing its output to equivalent gauge circuits (in this stress state) and to theoretical predictions. With only a few exceptions, the difference between measured and predicted strain values for a gauge circuit was found to be of the order of the estimated repeatability for the measurement system.
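
    A small sketch of the kind of theoretical prediction a four-point bend gauge reading can be compared against, assuming simple beam theory; the section properties, loads, and gauge reading are placeholders, not Test Bed blade values.

        # Sketch: predicted surface strain for a four-point bend (simple beam theory).
        # Between the inner load points the bending moment is constant: M = F * a,
        # and the outer-fiber strain is eps = M * c / (E * I).
        # Section properties and loads below are placeholders, not blade values.
        def predicted_strain(load_per_roller_n, moment_arm_m, c_m, e_pa, i_m4):
            moment = load_per_roller_n * moment_arm_m          # N*m, constant between rollers
            return moment * c_m / (e_pa * i_m4)                # dimensionless strain

        eps = predicted_strain(load_per_roller_n=2.0e3, moment_arm_m=0.5,
                               c_m=0.05, e_pa=69e9, i_m4=4.0e-6)
        measured = 175e-6                                       # placeholder gauge reading
        print(f"predicted {eps*1e6:.0f} microstrain vs measured {measured*1e6:.0f} microstrain "
              f"({100*(measured-eps)/eps:+.1f}% difference)")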

  18. Experimental validation of predicted cancer genes using FRET

    NASA Astrophysics Data System (ADS)

    Guala, Dimitri; Bernhem, Kristoffer; Ait Blal, Hammou; Jans, Daniel; Lundberg, Emma; Brismar, Hjalmar; Sonnhammer, Erik L. L.

    2018-07-01

    Huge amounts of data are generated in genome wide experiments, designed to investigate diseases with complex genetic causes. Follow up of all potential leads produced by such experiments is currently cost prohibitive and time consuming. Gene prioritization tools alleviate these constraints by directing further experimental efforts towards the most promising candidate targets. Recently a gene prioritization tool called MaxLink was shown to outperform other widely used state-of-the-art prioritization tools in a large scale in silico benchmark. An experimental validation of predictions made by MaxLink has however been lacking. In this study we used Fluorescence Resonance Energy Transfer, an established experimental technique for detection of protein-protein interactions, to validate potential cancer genes predicted by MaxLink. Our results provide confidence in the use of MaxLink for selection of new targets in the battle with polygenic diseases.

  19. Validation of space-based polarization measurements by use of a single-scattering approximation, with application to the global ozone monitoring experiment.

    PubMed

    Aben, Ilse; Tanzi, Cristina P; Hartmann, Wouter; Stam, Daphne M; Stammes, Piet

    2003-06-20

    A method is presented for in-flight validation of space-based polarization measurements based on approximation of the direction of polarization of scattered sunlight by the Rayleigh single-scattering value. This approximation is verified by simulations of radiative transfer calculations for various atmospheric conditions. The simulations show locations along an orbit where the scattering geometries are such that the intensities of the parallel and orthogonal polarization components of the light are equal, regardless of the observed atmosphere and surface. The method can be applied to any space-based instrument that measures the polarization of reflected solar light. We successfully applied the method to validate the Global Ozone Monitoring Experiment (GOME) polarization measurements. The error in the GOME's three broadband polarization measurements appears to be approximately 1%.
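
    A short sketch of the geometric fact the method exploits: for Rayleigh single scattering the degree of polarization depends on scattering angle, but the two detector components become equal whenever the polarization direction sits at 45 degrees to the detector axes, independent of the scene. Angles and intensities below are illustrative only.

        # Sketch: Rayleigh single-scattering polarization and the special geometries
        # where the parallel and orthogonal detector components are equal.
        import math

        def rayleigh_dolp(scattering_angle_deg):
            """Degree of linear polarization for Rayleigh single scattering
            (depolarization neglected)."""
            c2 = math.cos(math.radians(scattering_angle_deg)) ** 2
            return (1.0 - c2) / (1.0 + c2)

        def detector_components(total_i, dolp, chi_deg):
            """Split intensity into the detector's parallel/orthogonal components for
            a polarization direction at angle chi to the parallel axis."""
            chi = math.radians(chi_deg)
            unpol, pol = total_i * (1.0 - dolp), total_i * dolp
            return unpol / 2 + pol * math.cos(chi) ** 2, unpol / 2 + pol * math.sin(chi) ** 2

        p = rayleigh_dolp(90.0)          # maximal polarization at 90 deg scattering
        for chi in (0.0, 45.0, 90.0):
            i_par, i_perp = detector_components(1.0, p, chi)
            print(f"chi = {chi:4.1f} deg: I_par = {i_par:.3f}, I_perp = {i_perp:.3f}")
        # At chi = 45 deg the two components are equal for any degree of polarization,
        # which is what makes those orbit locations usable as validation points.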

  20. Fun and games in reviewing neonatal emergency care.

    PubMed

    Gordon, D W; Brown, H N

    1995-04-01

    To develop a game-based review instrument for use by newborn caregivers in preparing for emergency situations. One hundred and one test questions covering pathophysiology, resuscitation, and medications were developed. The questions then underwent expert and peer review, psychometric testing for content validity and test-retest reliability, and a game trial. The needs of adult learners are different from those of other learners. The gaming format uses knowledge gained through experience and provides an avenue for validating knowledge and sharing experiences. This format has been found effective for review and reinforcement of facts. Twelve nurses participated in a trial game and completed a written evaluation using a Likert scale. The Neonatal Emergency Trivia Game is an effective tool for reviewing material related to neonatal emergency care decisions. Additional testing with a larger group would strengthen validity and reliability data.
